GPU CUDA Acceleration for rendering image previews, exporting images, and playing video in Lightroom 4.
I'd like to see Lightroom 4 make use of GPU processing, similar to the Mercury Playback engine in Premiere Pro CS5.5.
GPU acceleration should be available for ALL CUDA-enabled GPUs.
I absolutely love GPU acceleration and the Mercury Playback engine in Adobe Premiere Pro. It really helps to speed up real-time previewing of high-resolution footage.
I am sure that GPU acceleration would speed up any professional's workflow.
OpenCL/OpenGL or CUDA would be so great for Lightroom. I'd like to see it in several places:
- Survey mode: the animation of images still stutters compared to what Bridge does in the preview(?) window.
- Image display: compared to Photoshop, LR's image display still does not zoom in or pan images smoothly. It's stuttering on my PC (Win7 64-bit, Q6600 @ 2.8 GHz, GTX 470, 8 GB RAM). And of course all the nice features like animated zoom, flick panning, bird's-eye view, etc. are missing!
- Actual image processing, or at least the preview: in Capture One, which uses OpenCL, every slider movement for exposure (or any other adjustment) is translated into a really smooth screen redraw. No stuttering at all. If I move an adjustment slider in Lightroom 4 or earlier versions, screen redraw is not really fast and smooth, and it gets even worse if you activate the secondary monitor view (for example, Live mode while zoomed in). By the way, image panning gets very bad once you have made several local adjustments: if you are then zoomed in and pan around, you get slow tiling of the screen redraw.
Will we get any of this GPU acceleration?
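To see why every slider move is expensive, here is a toy sketch (Python, purely illustrative; Lightroom's actual pipeline is not public, and `apply_exposure` is a made-up name): each slider change re-applies the adjustment to every pixel of the preview, which is exactly the kind of independent per-pixel work a GPU handles well.

```python
# Illustrative sketch only: a toy "exposure slider" applied per pixel.
# Real raw pipelines are far more complex than a single gain stage.

def apply_exposure(pixels, stops):
    """Scale linear pixel values by 2**stops, clamped to [0.0, 1.0]."""
    gain = 2.0 ** stops
    return [min(1.0, p * gain) for p in pixels]

# Every slider move forces a full re-render of the preview from source data.
preview = [0.1, 0.25, 0.5, 0.8]
for stops in (0.0, 1.0, 2.0):            # user drags the slider
    rendered = apply_exposure(preview, stops)
```

Each pixel is computed independently of its neighbours, so on a GPU the loop body could run for millions of pixels in parallel instead of one CPU pass per slider tick.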
How about me and my 2GB HD-695X-CNFC? That's an overclocked version of AMD's 6950.
I just spent a bunch of money to build a system that could overcome the sluggishness of Lr 3.x. Please tell me that I didn't waste my money on the wrong GPU.
Hopefully, it will be fast enough to overcome that snag in any case. If Win7 64-bit, an i7 2600K, 16 GB of high-performance/low-latency SDRAM, and an overclocked 2GB AMD 6950 GPU aren't fast enough to give me smooth slider redraws, then I'm seriously going to have to reconsider my use of Lr.
I understand the request for GPU acceleration, but to be honest and to set expectations: Lr has a very long and complex image-rendering pipeline, and getting it to run on the GPU would take a lot of work.
I've just got this homebuilt cookin'. Still have some configuring to do. Took me about two weeks to get my 16 GB of SDRAM tuned to the lowest latency possible, 8-8-7-24-2N. The two sets of 8 GB weren't matched from the factory. My limited tests with JPEGs so far show a great deal of improvement in slider/screen redraws over a top-of-the-line desktop from four years ago.
This problem was my biggest complaint. It made the use of Lr not worth it to me. I could live with other performance problems.
Haven't tried the Lr 4 beta yet, or high volumes of raw files. Hopefully by the end of next week.
Okay MadManChan, so there won't be GPU acceleration. What a pity. But if it is too much work to get the rendering pipeline onto the GPU (like Capture One did?), it would (maybe) have helped to do only the screen redraw on the GPU, to get at least animated zoom and panning as smooth as in Photoshop. Or GPU for Survey mode, to make it as smooth as in Bridge. For me as a user, one of the most important things about a program is how it "feels" on screen. If something stutters, it feels bad and is no fun to use. Capture One with OpenCL really IS fun in this regard.
The same applies to the UI. It's a shame that we still cannot drag panels to a secondary monitor, which has been requested by many people over the years. I want to see and have access to all adjustments at once, without having to scroll through numerous panels or open and close them. I really love LR, but I hate the UI in terms of speed and flexibility.
By the way, if you look around in the forums, there are a lot of complaints about slider responsiveness in LR, and there always have been, since LR 1. You might think that we get faster and faster CPUs over the years. It seems this does not help, because your code is getting more complex as well. This is good for quality, of course, but speed doesn't get better even with better CPUs.
Let's all just hope that there will be GPU acceleration. I have my bachelor's in computer science, so I know that these guys are under a lot of stress.
Here in Hawaii, we have a lot of business in video and photography through weddings. People from all over the world come here to get married.
Having MPE/GPU acceleration in Premiere Pro is great, as it has helped my company go through less stress when we are doing same-day edits for the bride and groom.
Less time is spent waiting for the computer to render or export a movie.
GPU acceleration would be great for Lightroom, especially when multitasking and scrubbing through movies.
It would allow my employees to be more productive, as they could export one photo set and then move on to editing another.
I have reaped the benefits of MPE and GPU acceleration in PP; that's why I am hoping it could be in Lightroom as well!
If not in Lightroom 4, let's hope they are able to implement it in future iterations such as LR 4.1, 4.2, etc.
Today I had an interesting experience: I worked on an old PC with a single-core Pentium 4 at 2.4 GHz, running Capture One 2.9. That's really old stuff. But guess what: no matter what you did in that old software, everything could be seen in real time on screen. Moving any slider (okay, there aren't too many) changed the image in real time. Scrolling through pictures: no lag or sluggishness like in Lightroom since version 1. That supports my theory that software like Lightroom doesn't get faster just because we get faster CPUs over the years; the software gets more CPU-hungry as well. We get many more possibilities and much better quality, which is all very welcome. But if this comes at the cost of the on-screen experience getting slow, then engineers have to look for other ways to speed software up. We see various software getting GPU acceleration, including from Adobe. And Lightroom is one of the programs that does "feel" slow, be it scrolling or adjusting sliders. Why is it still not getting GPU support, even in version 4? If Capture One could do it, why not Adobe?
I can imagine that, MadMan. But aren't other software vendors (Capture One, again) faced with the same problem? Could we, as a starting point, at least get GPU-accelerated screen redraw like PS CS4 had? Just to have smooth zooming and panning?
That would not be that complex, would it?
Different software packages optimize for different things. They have some things we currently don't have; we have some things they don't have. GPU-accelerated screen redraw in Ps was a significant effort and required many months of development and testing. Even if we did that, it wouldn't really help the user experience, because our pipeline is different from the one in Ps (in Ps the pixels are effectively baked; in ours, we do all our rendering on the fly since the editing is parametric -- our bottleneck is NOT the display path!).
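The baked-versus-parametric distinction above can be made concrete with a minimal sketch (Python, hypothetical names, not Adobe code): a baked editor rewrites the pixels once per edit and afterwards only has to display them, while a parametric editor stores just the edit list and must re-run the whole pipeline from the untouched source for every preview.

```python
# Minimal illustration of the two editing models (hypothetical names).

def run_pipeline(source, edits):
    """Parametric model: re-render from source, applying every edit in order."""
    out = list(source)
    for edit in edits:
        out = [edit(p) for p in out]
    return out

source = [0.2, 0.4, 0.6]

# Baked model (Ps-like): the edit rewrites the stored pixels once.
baked = [min(1.0, p * 2.0) for p in source]

# Parametric model (Lr-like): only the parameters are stored...
edits = [lambda p: min(1.0, p * 2.0)]
# ...so each preview re-renders everything; the source stays untouched.
preview = run_pipeline(source, edits)
```

With ten edits stacked up, the baked model still displays one pre-computed buffer, while the parametric model re-runs all ten stages per redraw, which is why speeding up only the display path would not remove the bottleneck.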
I use an iMac running OS X 10.6.8 with an NVIDIA GeForce 8800 GS with 512 MB GDDR3, a Bluetooth Apple trackpad as a mouse, and a Wacom Intuos 3 tablet. In the Library module with still images, a finger swipe on the trackpad moves rapidly between images without lag or stutter; swipe does not work in the Develop module (I am not sure if it is meant to). However, in the Library module the filmstrip thumbnail highlighting always lags by about a second. With pinch/zoom, the gesture takes a few seconds to be recognised; after that, the zoom follows the finger movements without lag.
Really? CUDA? So everyone has to have one brand of graphics card? No thanks.
OpenCL. Fair enough.
I use a 2012 Mac mini and have no problems with the speed of the display update with 21 MP files. Maybe movies are different? But given the other photography features on request, I'd go with those first.
Have you developed CUDA and OpenCL code? Given the difference in difficulty (read: cost) of the development tools, I would expect accelerating parts of Lightroom using CUDA to be much cheaper. We may have a choice between CUDA -- with a restricted user base -- or nothing.
It will also be interesting to see what platforms OpenACC will support, and how well. That should make GPU acceleration much easier and cheaper.
OpenACC is a new standard for programming accelerators, announced at Supercomputing '11, and due to be out some time in the first half of 2012.
Please consider enabling some kind of GPU acceleration.
As a professional event photographer shooting thousands of images per week, speed is absolutely essential. Cutting even a second or two off of rendering times is a HUGE deal for high-volume, daily-use professionals like myself.
While I sincerely appreciate all of the new features that Adobe rolls into each release, and I can understand the need to be competitive with similar products on feature set, I'm begging Adobe not to forget that for professionals like myself, speed is the most important feature. Specifically, I never want to have to sit in front of my computer and wait on the program.
There are certain tasks I'm willing to wait for if I can batch them and walk away to handle other business (though speed increases here are also essential). But I never want to see an hourglass or spinner in my way of advancing through images.
I understand that there are myriad hardware configurations that Adobe has to support, but I think it's safe to assume that a professional would be willing to spend some money on a video card (even if it has to be of a particular type) if it would significantly speed up rendering time, particularly for those of us who have already 'maxed out' our editing machines with all the high-end CPU power, RAM, and SSDs/RAID possible, and are STILL seeing hourglasses/spinners.
Here's how I would prioritize the 'pain points' of speed based on my workflow:
1. Checking focus at 1:1 (100%) MUST be instant. Waiting even 1 or 2 seconds for a photo to render at 100% just so that I can check focus on faces to decide if the photo is a keeper adds up when multiplied over thousands of images. I know that Library mode uses a different rendering sequence than Develop mode, but regardless of mode, when I advance from one image to the next at 100%, rendering should be instant. I already batch-render 1:1 previews before I sit down to pick rejects and keepers, but I'll often need to change white balance on the next 30 photos, for instance, and as I advance through them, I have to wait while they render. It has been mentioned that the image processing pipeline is long and complex. Then simplify it. Or at least give us the option to trade something for increased speed.
I think a great idea that would make a really compelling argument for facial recognition in Lightroom is the ability to automatically check focus on faces. I'd be willing to pay untold sums for this feature alone, as it would easily save me hundreds of hours per month. In fact, as a wedding/event/portrait photographer, I basically spend my whole week checking focus on faces as I weed out the good from the bad.
If Adobe combined facial recognition with focus/blur checking (similar to the Photoshop Elements Auto Analysis feature) specifically on those faces, had an option to pre-tag those photos, and then pre-rendered just a small 1:1 area centered on the face(s), it would be HUGE.
Then I could simply sit down and very quickly scan through to double-check that the algorithm did its job. Checking focus just around the detected faces would cut down on analysis time and on false positives due to 'legitimate blur' from bokeh and panning.
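One way such a focus check could work is sketched below (a hedged illustration only; Lightroom has no such feature, face detection is not shown, and `laplacian_variance`/`looks_sharp` are hypothetical names). It uses the classic variance-of-Laplacian sharpness heuristic applied to a detected face crop: a sharp crop has strong edge responses and therefore high variance, while a blurry one does not.

```python
# Hypothetical sketch of the focus check only (face detection not shown).
# Sharpness heuristic: variance of the Laplacian over the face crop.
# Low variance ~ few strong edges ~ likely out of focus.

def laplacian_variance(gray):
    """gray: 2D list of luminance values for a face crop."""
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbour discrete Laplacian at (x, y)
            lap = (gray[y - 1][x] + gray[y + 1][x] + gray[y][x - 1]
                   + gray[y][x + 1] - 4 * gray[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def looks_sharp(face_crop, threshold=10.0):
    # threshold is a guess; in practice it would be tuned per camera/lens
    return laplacian_variance(face_crop) >= threshold

sharp_crop = [[0, 0, 100, 100]] * 4    # hard edge -> strong Laplacian response
blurry_crop = [[50, 50, 50, 50]] * 4   # flat region -> zero response
```

Pre-tagging would then be a matter of running `looks_sharp` on each detected face and flagging the photo when every face fails the threshold, leaving the photographer to review only the borderline cases.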
2. When I drag a slider (or hit a key) that changes any aspect of the photo (especially the 'Basics'), the result should be instantaneously rendered. No waiting, no hesitation. Again, I think the GPU could help here.
3. Improved performance when making lots of local adjustments. I don't know if this is a RAM thing or if the GPU could help here, but I know that this is a common complaint. If the GPU can help crunch through the processing it takes to handle lots of spot retouches and airbrushing, it'll make portrait retouchers very happy.
4. Export Speed. This is number 4 on my list because at least I can walk away and get a beverage or something, but when you're under the gun and the client is waiting on the images, increased export speed is appreciated. I think GPU acceleration could help here, assuming that you've already got fast SSDs and there are no disk/RAM bottlenecks.
5. Rendering 1:1 previews. The sooner I can get to work on a wedding or portrait shoot, the better. If the GPU can render in the background or help me get started quicker, all the better.
6. Video Rendering. Premiere has the Mercury Playback engine; Lightroom should too, now that it edits/exports video clips. GPU acceleration has already been proven to help in this regard. 'Nuff said.
Great list. This affects my bottom line, and I'd count myself as one of the professionals who'd be happy to purchase the "right" video card to get GPU benefits. Right now my cards support OpenCL, but I'd gladly switch for the added horsepower.
I totally agree that we want to see 100% to evaluate focus. What I would like to see is the preview in the import module, using the embedded JPEG, sped up. I find that as I move through the images the caching does not keep up, and the delay from image to image increases. This is not the case using Photo Mechanic reading images from a CompactFlash card.