Only Adobe insiders can answer this. It appears they have chosen to allow only their own filters to access CUDA processing. Many third-party plug-ins DO or CAN use the GPU (you can tell because some of them have a "Use GPU" checkbox: many BCC effects, Looks (in its preferences), etc.); they're just being hobbled by the application. Many of the same filters ARE using the GPU in Ae.
I have put in my feature request. Have you?
Now I have, though this is more of a Red Giant issue: they have to add support for Premiere's acceleration, whatever they call it.
MB Looks does run on the GPU, but using OpenGL, which is an entirely different API.
I read a while ago on an Adobe blog that, regarding Premiere, Adobe breaks its own rules regarding the SDK. Third-party developers have to use it to the letter, whilst Adobe doesn't! This limitation restricts functionality, as the SDK does not allow full access.
For some reason, the AE SDK plays by different rules, which is why there are so many third-party plug-ins for it, all of them more tightly integrated.
Hmm... maybe an Adobe employee would be willing to explain the reasoning behind this.
I use quite a few third-party plug-ins, but I tend to stay away from them in Premiere because they slow playback down to a crawl.
I already told you why Looks isn't "accelerated". Red Giant has coded the plug-in to work with several different programs, including PP, AE, FCP, Avid, Vegas and possibly more. It has to work consistently across all host applications using both nVidia and ATI cards. Add to that the fact that CUDA acceleration happened long after Looks was around, and you begin to understand.
Looks renders using OpenGL. If you want Looks (or any third party plug-in) to run using CUDA inside of PP, you have to get that third party to program it that way.
You may have told us, Jim. That doesn't mean you're necessarily right, now does it!
From the Red Giant web site...
Paul – This is from Stu Maschwitz’s blog, and may answer your questions:
“Both Adobe Premiere Pro and Final Cut Pro bypass their own plug-in SDKs for their native 3-way color correctors. They use window configurations and graphics drawing routines that third-party developers don’t have access to. On some systems this can make UI interaction for third-party effects with Custom UIs slow. In the case of Premiere Pro, the slowness can be bad. Real bad.
Have you noticed that Premiere’s own 3-way color corrector has never been ported to After Effects? This is one consequence of the Premiere team’s choice not to use their own plug-in SDK. Another is that third parties cannot provide a fluid custom UI experience within Premiere Pro.
After Effects, on the other hand, “eats its own dog food,” and has no effects that don’t use the public SDK. This means that third parties can create excellent user experiences within After Effects. The benefits to us users are obvious — just look at all the amazing plug-ins available for After Effects.
Premiere Pro and After Effects actually share the same plug-in SDK. This is amazingly cool, because it means that, for example, you can start a project in Premiere, use Colorista II all you want, and then move the project to After Effects, keeping all your settings. But despite this shared architecture, plug-ins like Colorista II sing in After Effects and bog down in Premiere.
Red Giant is committed to working with Adobe to resolve this situation. We love Premiere Pro and feel that it and Colorista were born for each other. The playback performance is amazing. We’ve done the best we can with what we have. If you try Colorista II in Premiere and find the performance lacking, please consider contacting Adobe and asking them to improve the performance of Custom UI plug-ins written to their own SDK.”
I think that Stu Maschwitz was referring to problems with the Colorista 2 Interface in the quoted post. The GUI controls used to be very laggy when used with PPro. They are better now, but not ideal. The OpenGL vs CUDA issue is a different topic, relating to actual rendering and encoding.
Different subject, I'll grant you, but my understanding is that Stu was talking about the general SDK implementation, and this may remain the underlying reason for Looks 2 not having tighter CUDA integration.