My PC is getting old. In the next few months I'd like to build a new system. Currently I run 1080p on a Samsung 52" display. I've seen how nice images look up close on 4K displays, and with 4K coming down in price I think that should be my goal.
First, this needs to be somewhat on the economy side. Not a $1000 CPU.
Big mistake. If you are going to run a 4K display, you need lots of CPU horsepower.
At present, Lightroom does not use a GPU, so all rendering of your images must be done by the CPU, and for a 4K display, that means getting a powerful CPU.
In this situation, I think the idea of "bang for the buck" ought to be replaced with the idea of getting the most powerful CPU you can afford. Yes, I would recommend that top-of-the-line $1000 CPU, whatever that is ...
Strange that LR does not use the GPU (Mercury Engine); maybe when 6.0 comes out. The small increase in clock speed from a midrange CPU to the Extreme CPUs is not warranted unless you are a gamer. I would not overclock, so an unlocked CPU would be wasted money. Now that the Mac has a 5K display and many Android phones have Quad HD displays, 4K will become the new standard. It makes a big difference when looking at an image up close.
> Strange LR does not use the GPU (Mercury Engine). Maybe when 6.0 comes out.
Not strange at all. LR's architecture is different from Photoshop's, and the designers have stated that they get more speed this way than by using the GPU. I have no idea whether Lightroom 6 will use the GPU or not.
> The small increase in clock speed from a midrange to the Extreme CPUs is not warranted unless you are a gamer.
The small increase in clock speed from a midrange to the Extreme CPUs is not warranted unless you are a gamer or you are using Lightroom on a 4K monitor.
It just shows a lack of knowledge to equate the graphics-intensive work of rendering a 3D game at 100+ frames per second on an overclocked, water-cooled CPU with the relatively light processing power needed to render a single RAW's develop settings, or even to export a 24-megapixel file to a JPEG or TIFF. I'm pretty sure the ability to handle a 4K screen resolution vs. 1080p has more to do with the cores on a GPU than on a CPU.
> to the relatively light processing power needed to render a single RAW's develop settings
Actually, the raw processing pipeline is computationally surprisingly complex, more so than rendering 3D games. The reason it is not done wholly on the GPU by ACR and Lightroom is that there is very little speedup, and even some speed loss under certain conditions, according to the Adobe folks. The only raw converter that supposedly does raw rendering on the GPU is Apple's Aperture, and it is no speed demon even on the highest-end machines Apple sells. On their Mac Pro, which has dual video cards precisely for GPU computation, Lightroom feels a lot faster than Aperture!

GPUs are good at operations that can be done massively in parallel, in low precision, and with few conditionals. 3D rendering is a typical example, as are video effect rendering, fluid dynamics calculations, etc. For some reason, raw demosaicing and processing is not one of those. I don't believe you'll see GPU raw processing very soon, especially since camera megapixel counts keep going up at the same pace as GPUs get faster.
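To make the "massively parallel, few conditionals" point concrete, here is a toy sketch of one early stage of raw processing: bilinear interpolation of the green channel from an RGGB Bayer mosaic. This is my own illustration, not Adobe's or anyone's actual pipeline; real raw converters stack many more stages (noise reduction, highlight recovery, lens corrections) on top of this, and it's those serial, conditional stages that make the full pipeline less GPU-friendly than it looks.

```python
# Toy illustration (not any real converter's code): bilinear interpolation
# of the green channel from an RGGB Bayer mosaic. Each output pixel depends
# only on a few neighbours -- classic data-parallel, branch-light work.

def interpolate_green(bayer):
    """bayer: 2D list of sensor values laid out in an RGGB pattern.
    Returns a full-resolution green channel."""
    h, w = len(bayer), len(bayer[0])
    green = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 1:        # green photosites in RGGB
                green[y][x] = bayer[y][x]
            else:                        # average the in-bounds green neighbours
                neighbours = [bayer[ny][nx]
                              for ny, nx in ((y - 1, x), (y + 1, x),
                                             (y, x - 1), (y, x + 1))
                              if 0 <= ny < h and 0 <= nx < w]
                green[y][x] = sum(neighbours) / len(neighbours)
    return green

mosaic = [[10, 20, 10, 20],
          [30, 40, 30, 40],
          [10, 20, 10, 20],
          [30, 40, 30, 40]]
# The blue site at (1, 1) gets the average of its four green neighbours:
print(interpolate_green(mosaic)[1][1])  # -> 25.0
```

Every pixel here could be computed independently on a GPU thread; the catch, as the Adobe folks note, is everything that comes after this step.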
P.S. It is not true that Lightroom does not use the GPU at all. It just doesn't use it for raw file processing. It does use it for displaying images, scaling images to the screen resolution, etc.
Wow, my knowledge must be out of date. Apparently Phase One added OpenCL support in Capture One v8, and it's supposed to speed up raw rendering by quite a bit (see the Capture One Pro features page). Maybe I should check it out again.
Actually, dj_paige is correct: Lightroom uses the CPU even for rendering the UI and scaling images to the screen, which makes no sense whatsoever. You can test this yourself by opening Task Manager (on Windows) and quickly hiding and showing the sidebars and filmstrip (I think the shortcut is Shift+Tab). As far as I can tell, Jao vdL, Lightroom doesn't even use the GPU for displaying or scaling images.
More evidence that this is all done by the CPU (not that I'm happy about it; my performance on a dual 1200p/1440p setup is horrendous on a 2500K at 4.5 GHz, even when LR isn't on the second monitor) is that Adobe recommends a more powerful CPU for higher-resolution displays:
> Options that can help increase performance include:
> - 64-bit, multiple-core processor (for best performance, up to six cores). The extra power is especially important if you use multiple or high-resolution monitors, which require more power.
And technically, the only reason RAW processing can get away with relatively light processing power is that, unlike games, the files don't need to be rendered in real time. Personally, I'd rather throw heavy processing power at my RAW files so they finish sooner.
I also just managed to crash LR simply by switching quickly between the Develop and Library modules. There are so many things that should work differently, and I can't say why Adobe hasn't improved performance on high-end systems.