5 Replies Latest reply on Dec 16, 2013 2:26 PM by D Fosse

    cd/m2 vs gamma for Youtube and digital cinema

    funkytwig Level 1

      Hi, I have an X-Rite i1 Display Pro and a Dell UltraSharp U2410.  I wish to calibrate it for grading for YouTube and Digital Cinema.  The i1Profiler software uses cd/m2, and I have also used the dispcalGUI software with the i1, which uses gamma. 

       

      My first question is: what cd/m2 should I use for grading for YouTube, and what cd/m2 for Digital Cinema? 

       

      I did try calibrating with the dispcalGUI software using the Rec.709 setting.  Apparently this calibrates to a gamma of 2.2.  This calibrates the monitor significantly brighter, and when I say significantly I mean massively brighter than even 160 cd/m2!

       

      So my second question is: what is the relationship between cd/m2 and gamma?

       

      I gather the dispcalGUI software is supposed to be better (i.e. more accurate) than the X-Rite i1Profiler software, but it seems to set everything up far too bright, so it seems useless.

       

      Ben

        • 1. Re: cd/m2 vs gamma for Youtube and digital cinema
          funkytwig Level 1

          120 is what you want.  The software will give you a gamma curve of 2.4 with Rec.709.  This is OK, but in normal room viewing you probably want 2.2.  You can get this by selecting custom rather than Rec.709, although I think this will give you the full gamut your monitor can deliver.

          • 2. Re: cd/m2 vs gamma for Youtube and digital cinema
            D Fosse Adobe Community Professional & MVP

            Cd/m² is the measurement unit for white point luminance. Gamma is the tone response curve from black to white. Rec 709 is a TV standard with little relevance for a home computer system.
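            To make that distinction concrete, here's a minimal sketch (assuming a simple power-law response; real display transfer curves are more complex): cd/m² fixes the luminance of white, while gamma fixes how luminance rises between black and that white.

```python
# Illustration only: simple power-law model of a display's response.
# white_cdm2 is the calibrated white point; gamma shapes the curve.
def luminance(v, white_cdm2=120.0, gamma=2.2):
    """Luminance in cd/m2 for a normalized input value v (0..1)."""
    return white_cdm2 * v ** gamma

print(luminance(1.0))             # white is 120 cd/m2 regardless of gamma
print(luminance(0.5, gamma=2.2))  # mid-gray at gamma 2.2: ~26 cd/m2
print(luminance(0.5, gamma=2.4))  # mid-gray at gamma 2.4: ~23 cd/m2 (darker midtones)
```

            Changing gamma moves the midtones; changing cd/m² moves the whole curve up or down.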

             

            Set your U2410 to custom mode, and calibrate to 2.2 and 120 cd/m². That's it, don't complicate it.

             

            The U2410 is natively very bright (around 400 cd/m²). To get to 120, turn the monitor's OSD brightness way down to reach the target. This is much better than letting the calibration LUT reduce brightness: you may get banding because of the lower bit depth in the video card, since knocking 400 down to 120 in the LUT reduces 8 bits to effectively 6 or so.
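            The arithmetic behind that bit-depth loss can be sketched like this (a simplified model; the 400 cd/m² figure and the linear-scaling assumption are approximations):

```python
import math

# Simplified model: scaling white from ~400 down to 120 cd/m2 in the
# video card LUT leaves only a fraction of the 8-bit code values in use.
native_white = 400.0   # cd/m2, approximate native white (assumption)
target_white = 120.0   # cd/m2, calibration target

scale = target_white / native_white        # 0.3
usable_levels = round(256 * scale)         # ~77 of the 256 levels remain
effective_bits = math.log2(256 * scale)    # ~6.3 bits

print(f"{usable_levels} of 256 levels, about {effective_bits:.1f} effective bits")
```

            Roughly 6.3 effective bits is where the "6 or so" above comes from, and why doing the reduction in the monitor's own backlight is preferable.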

             

            There is probably a sort of "pre-calibration" function in the i1 software to let you get as close as possible to the targets prior to the actual calibration.

            • 3. Re: cd/m2 vs gamma for Youtube and digital cinema
              D Fosse Adobe Community Professional & MVP

              Oh, I almost forgot the most important thing: the U2410 is of course a wide gamut monitor, which requires a fully color managed environment from source to monitor to work as intended. A YouTube video watched in a web browser probably doesn't qualify, whatever the browser configuration.

               

              So then you have to "dumb down" the monitor and use the sRGB mode to limit the gamut, instead of the custom mode. Everything else above still applies.

              • 4. Re: cd/m2 vs gamma for Youtube and digital cinema
                funkytwig Level 1

                Sorry, I know it was a while ago, but I have just got a broadcast TV job so am revisiting my setup.  If I use sRGB there is a real limit to what I can set on the monitor (in terms of RGB); it is effectively a preset.

                 

                The Rec.709 setting in dispcalGUI is designed for the correct gamut, and Rec.709 is THE broadcast standard, so not using it seems a bit nuts.  I can set a cd/m2 of 120.

                 

                Ben

                • 5. Re: cd/m2 vs gamma for Youtube and digital cinema
                  D Fosse Adobe Community Professional & MVP

                  First of all you need to establish whether you have a fully color managed pipeline from source to monitor. That is absolutely essential with a wide gamut monitor such as the Dell U2410.

                   

                  There is an important distinction here between calibrating and profiling a display. Calibration is a basic modification of the display itself, setting white point luminance and temperature, gamma and neutral color balance. Calibration affects everything globally, but is not part of the color management chain (except indirectly).

                   

                  Profiling the display introduces color management. The profile is a complete and detailed description of the display in its calibrated state. Included in that description is the precise position, in three-dimensional color space, of the three primary colors.

                   

                  In a color managed process, the source profile is converted on the fly to the destination profile, in this case the monitor profile. All the source colors are remapped into the destination (monitor) color space, according to the profile description.

                   

                  This has two important implications relating to your question. One, it doesn't matter what gamma the monitor is calibrated to - it's remapped from source gamma to monitor gamma whatever it is. Monitor gamma matters only in a non-color managed situation, because then there's no such remapping. You just want a gamma target that makes the monitor behave at its best, which is usually native.
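                  That remapping can be sketched as follows (pure power-law curves assumed; the real sRGB/Rec.709 transfer functions also have a linear toe segment, ignored here for clarity):

```python
# Sketch of the gamma remapping that color management performs.
def remap(v, source_gamma=2.4, monitor_gamma=2.2):
    """Remap a normalized code value from source encoding to monitor encoding."""
    linear = v ** source_gamma              # decode source to linear light
    return linear ** (1.0 / monitor_gamma)  # re-encode for the monitor

# Mid-gray shifts slightly; without this remapping, a 2.4-gamma master
# shown on an unmanaged 2.2-gamma display comes out too light in the mids.
print(remap(0.5))   # ~0.47
```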

                   

                  Two, only the monitor profile can account for gamut differences. Again the operative word is remapping. Calibration alone cannot define a gamut. The only way to do that without full color management is to set it in the monitor itself, if it has that option.

                   

                  In this context Rec.709 is equivalent to sRGB. The primaries and the white point are the same. Gamma (tone response curve) is different, 2.4 vs 2.2, but that is irrelevant in a color managed workflow. And with that monitor, you absolutely need a fully color managed environment for the gamut remapping, since Rec.709 has sRGB primaries. The monitor OTOH has Adobe RGB primaries (or close to it) - unless you use the sRGB preset on the monitor.
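                  The gamut side can be illustrated numerically using the published sRGB and Adobe RGB (1998) conversion matrices (linear light, D65 white point; this is only a sketch of the remapping a CMM does):

```python
# sRGB linear RGB -> XYZ (IEC 61966-2-1, D65)
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
# XYZ -> Adobe RGB (1998) linear RGB (Adobe RGB specification, D65)
XYZ_TO_ADOBE = [
    [ 2.04159, -0.56501, -0.34473],
    [-0.96924,  1.87597,  0.04156],
    [ 0.01344, -0.11836,  1.01517],
]

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Pure sRGB/Rec.709 red, in linear light:
srgb_red = [1.0, 0.0, 0.0]
adobe_red = mat_vec(XYZ_TO_ADOBE, mat_vec(SRGB_TO_XYZ, srgb_red))
print(adobe_red)  # ~[0.715, 0, 0]: sRGB red needs only ~72% of the Adobe RGB
                  # red primary, so sending it unmanaged to a wide gamut
                  # monitor oversaturates it.
```

                  That ~0.715 is exactly the error you see when sRGB/Rec.709 material hits Adobe RGB primaries without remapping: every saturated color gets pushed to the wider primary.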

                   

                  With a standard gamut monitor you could get away without color management (and then gamma would make a difference). It wouldn't be strictly correct, but it would be close, perhaps close enough.