2 Replies Latest reply on Mar 4, 2010 6:18 PM by p_d_f

    Luminance Intensity: A Basic Question


      I am a rookie in the field of color management. As a photographer, my ultimate goal is good print matching. But for now I need to calibrate & profile my new NEC P221W display "properly". As I understand it (& please correct me if I am wrong), "proper" display calibration will ultimately be judged against my actual prints.


      For calibration, I am using the SpectraView II puck & software. The SV-II "Photo Editing" target uses the following values: white point = D65, gamma = 2.2 & intensity = 140 cd/m2. From what I have read, many veterans of color management recommend an intensity setting of 90-100 cd/m2.


      A couple of quick questions:


      (1)  Is it reasonable to assume that NEC sets the intensity target of 140 cd/m2 based upon the characteristics of the NEC display?

      (2)  Without being able to verify the effects on print matching (I haven't purchased a suitable printer yet), what visible effects would lowering the intensity have on the displayed image? For example, the display seems excessively bright.


      Thank you for your help.

        • 1. Re: Luminance Intensity: A Basic Question
          John Danek Level 4

          I agree that your prints will determine whether your monitor is properly profiled.  Your numbers look a little higher than what I would recommend.  I always get in hot water when I discuss what I use for numbers.  I am still using a LaCie Electron Blue IV, which is nothing like your NEC.  However, I tend to use D50, gamma @ 1.8.  It may come down to some experimentation on your part.  If you had a dedicated output device for your prints, then you could calibrate everything end to end.  As it is now, you'll have to get the prints back and see what it takes to get the monitor to match.  I've seen some luminance numbers recommended around 110 - 120.  Your current setup looks like defaults for video.

          • 2. Re: Luminance Intensity: A Basic Question
            p_d_f Level 2

            The best luminance for your screen is determined by how much ambient light you have. CRT displays could only hit 85 or 90 cd/m2 and still have a decent lifespan. LCD displays are brighter by nature, and often their luminance can't be set low enough to avoid looking too bright in the darkened room traditionally used for CRTs. Third-party calibration software packages can sometimes get around these limitations, but barring that, the only way to compensate is to raise the ambient room lighting to a point where the perceived brightness on screen is correct.


            Most monitors have a native gamma somewhere in the 2.2 to 2.4 range. Using gamma 2.2 has a couple of advantages. One is that it's probably very close to the native hardware gamma and will require less compensation in the video card lookup tables; two, most non-color-managed applications and user interface elements are designed to look best at gamma 2.2. Color temperature settings of around 6500K give a very close visual match to daylight even though numerically they might not match. Gamma 1.8 is a legacy Macintosh prepress setting and is pretty much only used by older prepress people who haven't fully adapted to modern methods. There is no reason to use 1.8 for anything anymore, and plenty of reasons why it can be detrimental.
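            To illustrate the "less compensation in the video card lookup tables" point above, here is a minimal sketch of how calibration software builds a 1-D LUT that remaps a display's native gamma to a target gamma. The function name, the 2.4 native gamma, and the 256-entry table size are my own illustrative assumptions, not anything specific to SpectraView II:

            ```python
            # Sketch: build a video-card gamma LUT (assumed names/values, not SpectraView's).
            # The display applies its native gamma to whatever the LUT outputs, so to get an
            # overall response of v ** target_gamma we need lut(v) = v ** (target / native).
            def build_gamma_lut(native_gamma=2.4, target_gamma=2.2, size=256):
                exponent = target_gamma / native_gamma
                return [
                    round(((i / (size - 1)) ** exponent) * (size - 1))
                    for i in range(size)
                ]

            lut = build_gamma_lut()
            # Endpoints are preserved (black stays black, white stays white);
            # the closer native and target gamma are, the nearer the LUT is to
            # identity, which means fewer quantization losses in the LUT.
            ```

            Note how a target of 2.2 on a native-2.4 panel gives an exponent of ~0.92, a nearly straight curve, whereas targeting 1.8 would push the exponent to 0.75 and force much heavier (and more banding-prone) correction.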