We're moving a program to HD, so in the next couple months I need to redesign a huge number of graphics and lower thirds.
We've switched from FCP to Premiere. Right now, if you edit in Premiere CS4-CS5.5 and export using "Maximum Render Quality", everything gets blended in linear 1.0 gamma. This makes every graphic with transparency I've ever made look terribly wrong. Until now, I've been exporting without Maximum Render Quality to get the right result.
So here's my question: Do you believe the future is linear? Are you checking "Linearize color space" for all your AE graphics now?
If I invest a bundle of time in making everything in linear gamma, will I be future-proofing these graphics, or will Adobe most likely abandon the linear push and support high-quality 1.8 gamma again?
What are you doing/betting on?
My TV set has gamma bias. My computer monitor has gamma bias. Every single display out there has gamma bias, and so does pretty much every recording format. So why in the world should I do everything in linear just because someone tells me to? True, there are hard mathematical reasons for it in the under-the-hood processing, but there's no need to over-obsess. That's Stu Maschwitz's and other people's hobby... I for one couldn't care less. As long as my stuff ends up being shown as H.264 or other compressed formats played from hard-disk recorders at trade shows, played back from computers during corporate presentations, or aired on TV, all I care about is linear blending giving me greater creative options for some effects, but not much beyond that. It's really not that important unless you work in digital cinema.

In fact, many times it's just an excuse for software vendors to put more responsibility on their users, along the lines of "If it looks crap, you didn't linearize it," while they make a mess of the gamma in their own plug-ins. It's the same as with "linear workflows" in 3D programs: why should I suddenly start making my textures and lighting linear when over the years I've learned to tweak everything with gamma to make it look the way I want? If you get my meaning: linear is overrated and overhyped. It's an option that may solve some specific issues, but it doesn't fix everything, and if you have established a suitable workflow without it, I see no reason to change it just for being hip or, in some people's eyes, "academically correct"... *boohoo*
Thanks Mylenium, you raise some good points that speak to why I and others have been dragging our feet on this.
But who cares if we don't like it or need it? I think the real problem/question is: Do we have to? Is Adobe going to force us to go linear? Should I be making everything linear starting today?
Even today, if I want to use a non-1.0 workflow, Premiere forces me to turn off CUDA and to uncheck Max Render Quality on output. That means if I want to do a little Ken Burns effect on a photo or resize a video, I'm going to get a cheap, consumer-y looking shimmer thanks to the lack of anti-aliasing without Max Render Quality. Any animation or resizing has to be done in AE.
Will it be even harder to use non-1.0 lower thirds in the next version of Premiere? Will CS6 use linear blending in all export configurations? If so, it means I either need to change how I produce all of these alpha graphics in AE, or we need to change to a different NLE.
(Obviously I'm only talking about graphics w/transparency, like lower thirds and bugs. You can make fully opaque graphics in whatever gamma you want, since Premiere won't be blending them.)
I think those are issues specific to Premiere, and more or less compatibility issues between its older non-GPU basis, the new stuff, and whether your GPU meets the requirements. Per se there is nothing that speaks against processing gamma-encoded footage even in GPU mode. In fact I believe there are native gamma functions directly in any of the APIs, be it OpenGL, CgFX, CUDA, or what have you; they just cost you a few more processing cycles. In your case the problem is probably more that "Render at Maximum Quality" inflates all 8-bit/16-bit data in a not-so-smart way, rather than this being a failure of all 32-bit processing per se. The rest you will surely find out next week when Adobe makes some big noise at NAB, and I'm sure Todd will give you all the answers you desire (or some docs and videos will become available on the website)...
You're right that it's specific to Premiere, but the problem doesn't necessarily have anything to do with the GPU (although it may have been designed with GPU processing in mind). It has to do with Adobe's choice to focus on linear blending.
You can set the Mercury Playback engine to "software only", turn on Max Render Quality and still have the problem. In fact, you can even go back to PP CS4 and have the same problem. I don't believe CS5 uses the GPU during the export process anyway.
The problem is this: 50% opacity with Max Render Quality on looks nowhere near the same as 50% opacity with Max Render Quality off. (When it's off, the result is identical to the way the graphic looked when it was designed inside AE.) Semitransparent areas behave very differently: bright semitransparencies become much more opaque, and dark semitransparencies become much less opaque. This means text set on dark semitransparent colors becomes washed out and illegible over video, while subtle glints and shines suddenly become bright and bloated.
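To see why the two modes disagree so badly, here's a minimal sketch of the blending math. It assumes a pure 2.2 gamma curve (not Premiere's exact transfer function) and ignores premultiplied alpha, so the numbers are illustrative, not an exact model of either app:

```python
# Compositing a 50%-opaque layer over a background, two ways.
# Values are normalized 0.0-1.0; GAMMA = 2.2 is an assumption,
# not the precise sRGB curve.
GAMMA = 2.2

def blend_gamma_space(fg, bg, alpha):
    """Blend directly on gamma-encoded values (the legacy look)."""
    return alpha * fg + (1 - alpha) * bg

def blend_linear_space(fg, bg, alpha):
    """Decode to linear light, blend, re-encode (linearized blending)."""
    lin = alpha * fg ** GAMMA + (1 - alpha) * bg ** GAMMA
    return lin ** (1 / GAMMA)

# 50% white title over black video:
print(blend_gamma_space(1.0, 0.0, 0.5))   # 0.5  -> mid gray, as designed
print(blend_linear_space(1.0, 0.0, 0.5))  # ~0.73 -> reads far more opaque

# 50% black plate over white video:
print(blend_gamma_space(0.0, 1.0, 0.5))   # 0.5  -> mid gray
print(blend_linear_space(0.0, 1.0, 0.5))  # ~0.73 -> plate looks washed out
```

The asymmetry in the second pair is exactly the complaint above: the dark semitransparent plate ends up much brighter (less opaque-looking) in linear blending, while bright elements over dark video gain apparent opacity.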
(My card, by the way, is the NVIDIA Quadro 4000. Areas that are fully opaque are always fine, because I keep my AE-Premiere workflow color managed.)