
So is everyone working in linear (1.0) gamma these days?

Apr 10, 2012 6:48 AM

We're moving a program to HD, so in the next couple months I need to redesign a huge number of graphics and lower thirds.

 

We've switched from FCP to Premiere. Right now, if you edit in Premiere CS4-CS5.5 and export using "Maximum Render Quality", everything gets blended in linear 1.0 gamma. This makes every graphic with transparency I've ever made look terribly wrong. Until now, I've been rendering with low quality to achieve the right result.
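To illustrate why linear blending changes the look of semi-transparent graphics, here is a toy sketch (plain Python, with an assumed simple gamma of 2.2 as a stand-in for the sRGB curve - not Adobe's actual pipeline) comparing the legacy gamma-space blend with a linearized blend:

```python
GAMMA = 2.2  # assumed display gamma; sRGB is close to this

def to_linear(v):
    """Decode a gamma-encoded value (0..1) to linear light."""
    return v ** GAMMA

def to_gamma(v):
    """Encode linear light back to the display's gamma curve."""
    return v ** (1.0 / GAMMA)

def blend_gamma_space(fg, bg, alpha):
    # Legacy behavior: blend the gamma-encoded values directly.
    return fg * alpha + bg * (1.0 - alpha)

def blend_linear_space(fg, bg, alpha):
    # Linearized behavior: decode, blend in linear light, re-encode.
    lin = to_linear(fg) * alpha + to_linear(bg) * (1.0 - alpha)
    return to_gamma(lin)

fg, bg, alpha = 1.0, 0.0, 0.5  # white over black at 50% opacity
print(blend_gamma_space(fg, bg, alpha))   # 0.5
print(blend_linear_space(fg, bg, alpha))  # ~0.73, visibly lighter
```

The same 50% white-over-black composite comes out noticeably lighter when blended in linear light, which is exactly the shift that makes graphics designed under gamma-space blending look wrong after a linearized render.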

 

So here's my question: Do you believe the future is linear? Are you checking "Linearize color space" for all your AE graphics now?

 

If I invest a bundle of time in making everything in linear gamma, will I be future-proofing these graphics, or will Adobe most likely abandon the linear push and support high-quality 1.8 gamma again?

 

What are you doing/betting on?

 
Replies
  • Apr 10, 2012 8:30 AM   in reply to Clint Porter

    My TV set has gamma bias. My computer monitor has gamma bias. Every display out there has gamma bias, and so does pretty much every recording format. So why in the world should I do everything in linear just because someone tells me to? True, there are hard mathematical reasons in the under-the-hood processing, but there is no need to over-obsess. That's Stu Maschwitz's and other people's hobby... I for one couldn't care less. As long as my stuff ends up being shown as H.264 or other compressed formats played from hard-disk recorders at trade shows, played back from computers during corporate presentations, or aired on TV, all I care about is linear blending giving me greater creative options for some effects, but not much beyond that.

    It's really not that important as long as you don't work for digital cinema. In fact, many times it's just an excuse for software vendors to put more responsibility on their users, along the lines of "if it looks like crap, you didn't linearize it," while they make a mess of the gamma in their own plug-ins. It's the same as "linear workflows" in 3D programs: why should I suddenly start caring about making my textures and lighting linear when over the years I have learned to tweak everything with gamma to make it look the way I want? If you get my meaning: linear is overrated and overhyped. It's an option that may solve some specific issues, but it doesn't fix everything, and if you have established a suitable workflow without it, I see no reason to change it just to be hip or, in some people's eyes, "academically correct"... *boohoo*

     

    Mylenium

     
  • Apr 10, 2012 10:10 AM   in reply to Clint Porter

    I think those are issues specific to Premiere - more or less compatibility issues between its older non-GPU basis, the new stuff, and how your GPU may or may not meet some requirements. Per se, there is nothing that speaks against processing gamma-encoded footage even in GPU mode. In fact, I believe there are native gamma functions directly in any of the APIs, be it OpenGL, CgFX, CUDA or what have you; they just cost you a few more processing cycles. In your case the problem is probably more that "Maximum Render Quality" inflates all 8-bit/16-bit data in a not-so-smart way, rather than this being a failure of 32-bit processing per se.

    The rest you will surely find out next week when Adobe makes some big noise at NAB, and I'm sure Todd will then give you all the answers you desire (or some docs and videos will become available on the website)...
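For reference, the transfer functions such GPU helpers implement amount to little more than a pow() with a small linear segment near black. A sketch of the standard sRGB encode/decode pair (per IEC 61966-2-1; plain Python here purely for illustration, not any particular API):

```python
def srgb_to_linear(v):
    """Decode an sRGB-encoded value (0..1) to linear light.

    Linear segment near black, 2.4-exponent power curve elsewhere.
    """
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Encode linear light (0..1) back to the sRGB curve."""
    if v <= 0.0031308:
        return v * 12.92
    return 1.055 * v ** (1.0 / 2.4) - 0.055

print(srgb_to_linear(0.5))  # ~0.214: mid-gray on screen is dark in linear light
```

Each call is a couple of multiplies plus one pow() - the "few more processing cycles" in question - which is why supporting gamma-encoded footage on the GPU is cheap.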

     

    Mylenium

     
