Have you tried installing the latest drivers, as described in Adobe Photoshop Lightroom Help | Lightroom Graphics Processor Acceleration (GPU) Troubleshooting & FAQ?
I also invite you to read this thread, GPU notes for Lightroom CC (2015): remember that the GPU improves DEVELOP module speed, but it can add overhead when loading pictures into memory.
Each PC/OS/GPU/driver combination can produce different results, so it is better to test the GPU setting and decide whether to keep it on or switch it off.
Just as an example, my PC has an Intel HD 4000 (officially unsupported), yet it works like a charm with GPU acceleration.
Thanks for the answer, mcucinat.
The graphics adapter I bought is among those explicitly recommended by Adobe, and I am certainly using the latest driver. I chose Nvidia because I had read about issues with LR GPU acceleration and the latest version of AMD Catalyst.
I have actually read both articles previously. The key point of the second link: "Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1." For me this is simply evidence of bad programming and bad software architecture. Image-to-image switching is THE operation everybody uses. One of Lightroom's key advantages is that it is optimised for working with large quantities of pictures; if that is not your case, Lightroom may be the wrong product and there are better alternatives. So if image-to-image switching actually gets slower (and it does), that is such a big drawback that it makes GPU acceleration absurd. My suggestion was pretty straightforward: if the folks at Adobe can't program it properly, the simplest solution would be (to offer an option) to turn off GPU acceleration for image switching and just take its benefit for the Develop sliders.
In my situation, my system is currently far better off without GPU acceleration, which means I have thrown 400 bucks out of the window thanks to Adobe's marketing promises about acceleration.
Look, I am not even willing to discuss allegations of marketing b***it or "bad programming": first because I am an Adobe employee, and second because this kind of discussion leads nowhere.
What I can repeat is: each OS/GPU/driver/hardware combination can behave very differently, because there are so many variables.
MY Intel 4000 is not even supposed to be supported, yet I see a performance improvement of about 150-200% with the GPU on, and when I switch to the secondary GPU (Nvidia Quadro K2000M) it gets even better.
I have the catalog on an SSD and the files on a Synology NAS linked via Wi-Fi 802.11g, and everything is smooth (my DNGs are from a Nikon D610, so about 25 MB each).
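As a sanity check on that 802.11g link, here is a rough transfer-time estimate. The effective throughput is my own assumption, not a measured figure; the point is that uncached reads over such a link would be slow, so the smoothness presumably comes from Lightroom's previews and Camera Raw cache rather than from the wire speed.

```python
# Back-of-the-envelope transfer time for a 25 MB DNG over 802.11g.
# 802.11g is nominally 54 Mbit/s, but real-world throughput is far
# lower; ~20 Mbit/s is my assumption for this sketch.
file_mb = 25                 # DNG size mentioned above
effective_mbit_s = 20        # assumed real-world 802.11g throughput
effective_mb_s = effective_mbit_s / 8
seconds = file_mb / effective_mb_s
print(f"~{seconds:.0f} s per uncached file")  # roughly 10 s
```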
And just to offer a different opinion on LR 6/CC + a GTX 970, a quick Google search turns up Need Graphic Card Recommendations for Lightroom CC: PC Talk Forum: Digital Photography Review;
quoting a user verbatim:
I use a Nvidia GTX 750 (non ti) this works exceptionally well with LR 6.
This means there is something different between your setup and his that leads to the performance degradation.
"My suggestion was pretty straightforward - if the guys at Adobe can't program it properly, the simplest solution would be (to make an option) to turn off GPU acceleration for image switching and just to take its profit for the developer sliders."
You can't merely switch it off for changing the image. The image *must* be loaded into the GPU, which is where the delay comes from. Even if you could, think what would happen: sure the image would change quickly, but then you'd still have to wait before you could process it.
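To make that overhead concrete, here is a back-of-the-envelope sketch of the per-image upload cost alone. Every figure here is my own assumption for illustration, not a number from Adobe:

```python
# Illustrative estimate of the cost of pushing a full-resolution image
# to the GPU on each image switch. All figures are assumptions for the
# sake of the sketch, not Adobe's numbers.
megapixels = 24          # e.g. a 24 MP raw file
bytes_per_pixel = 8      # assuming 16-bit RGBA working pixels
texture_mb = megapixels * bytes_per_pixel          # 192 MB
pcie_mb_s = 8000         # assumed realistic PCIe 3.0 x16 throughput
upload_ms = texture_mb / pcie_mb_s * 1000
print(f"~{upload_ms:.0f} ms just for the PCIe transfer")
```

And that transfer is only a lower bound: raw decode, demosaicing, and preparing the texture for zooming all add to the switch time, which is why the overhead is hard to hide entirely.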
At the moment the GPU acceleration in LR is very much in its infancy: it's first-generation stuff, and we are all pretty much beta testers.
Some people are lucky: they love it and have no problems.
Some people are unlucky: it causes nothing but problems.
Some people, like myself, are sitting on the fence: we are having mixed experiences.
I don't think you've thrown away $400: at some point (I live in optimism) all the issues will be fixed and LR will run sweetly on your system. You just need to wait. Not being funny, but you only need to cast a casual eye over any photography forum to see that there are issues with GPU acceleration, and now isn't the time to buy the latest and greatest card.
I've personally held off buying a new card to replace my ageing one until I see evidence that there will be zero chance of any issues.
Look, I did not intend to offend either you or your employer. I am just frustrated about how the GPU feature works on MY system and that the new, recommended graphics adapter did not bring any performance improvement.
I guess the other user you are quoting had a bit more luck with his GTX 750 (you mixed it up with my GTX 970), or maybe he is using the software in a different way.
"Combination of OS/GPU/Driver/Hardware can have so much different behavior" - surely one can hardly argue with that, and it is a great excuse for any possible performance issue.
I am not alone with this criticism, by the way - a prominent German magazine has published its own tests and benchmarks with and without GPU acceleration for Lightroom 6, and its results and conclusions were nothing to be proud of.
(The article is here, unfortunately in German: Nachgemessen: GPU-Beschleunigung in Lightroom 6 und Lightroom CC | heise Foto.)
Certainly I cannot completely eliminate the possibility that something in my setup is making things worse, and I would be glad to try to improve it. However, I have no clue where to start, because there is actually nothing to configure in Lightroom for GPU acceleration; I have the latest driver, which works and gets recognized, and I could see no useful settings or tweaks there either. If you could share some hints about which settings might improve GPU acceleration performance, I would be more than happy to try them out. It could also be that an Nvidia Quadro would have been a better match for my fast CPU, but unfortunately I cannot invest a four-digit sum just to try it out without being certain.
@F.McLion Thanks for the links, but unfortunately I could not find any information there on my topic - my adapter is not a "Titan" and it is actually working. I am just not satisfied with the performance, which is, all in all, poorer than without GPU acceleration.
I have the same problem, using a 6-month-old 27" iMac with an AMD Radeon R9 M290X 2048 MB. I had black bars randomly appearing on images, and the only way to get rid of them was to do a safe boot. This is just annoying, so I had to turn GPU acceleration off.
Could you send me the System Info dump, so I can have a look at your configuration?
Also, when you are scrolling between photos (you are in the Develop module, right? not in Library), can you look at the Resource Monitor and see whether any of your CPU cores is pegged at 100% during the image switch?
Then, as F.McLion said, is Lightroom listed in your Nvidia panel (this applies regardless of the model)?
As an example, here are my settings:
I have sent you a PM with a system dump and a short screencast with and without GPU.
I have already done what F.McLion said and have added Lightroom manually. However, GPU acceleration was available before that as well, and adding LR to the 3D settings did not bring any visible change. (After I messed around with the global 3D settings, though, some other software such as Skype began to crash, but that might just be a coincidence.)
I have also played around with the 3D settings under "Specify the settings for this program", but could not see any changes either. Is there anything there worth setting that might affect Lightroom's display or performance?