
    Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?

    michael_____

      I've asked this in a couple of other places online as I try to wrap my head around color management, but the answer continues to elude me. That, or I've had it explained and I just didn't comprehend. So I continue. My confusion is this: everywhere, it seems, experts and gurus and teachers and generally good, kind people of knowledge claim the benefits (in most instances, though not all) of working in AdobeRGB and ProPhoto RGB. And yet nobody seems to mention that the majority of people - including presumably many of those championing the wider gamut color spaces - are working on standard gamut displays. To my mind, this is a huge oversight.

      What it means is, at best, those working this way are seeing nothing different than photos edited/output in sRGB, because [fortunately] the photos they took didn't include colors that exceeded sRGB's real estate. But at worst, they're editing blind, and probably messing up their work. That landscape they shot with all those lush greens that sRGB can't handle? Well, if they're working in AdobeRGB on a standard gamut display, they can't see those greens either. So, as I understand it, the color-managed software is going to algorithmically rein in that wild green and bring it down to sRGB's turf (this, I believe, is where relative colorimetric and perceptual rendering intents come into play), and give them the best approximation within the display's gamut capabilities.

      But now this person is editing thinking they're in AdobeRGB, thinking that green is AdobeRGB's green, but it's not. So any changes they make to this image, they're making to an image that's displaying to their eyes as sRGB, even if the color space is, technically, AdobeRGB. They save and output this image as an AdobeRGB file, unaware that they altered it while seeing inaccurate color. The person who opens this file on a wide gamut monitor, in the appropriate (wide gamut) color space, is now going to see this image "accurately" for the first time. Only it was edited by someone who hadn't seen it accurately. So who knows what it looks like. And if the person who edited it is there, they'd be like, "wait, that's not what I sent you!"

       

      Am I wrong? I feel like I'm in the Twilight Zone. I shoot everything RAW, and someday I would love to see these photos opened up in a nice, big color space. And since they're RAW, I will, and probably not too far in the future. But right now I export everything to sRGB, because - internet standards aside - I don't know anybody I'd share my photos with who has a wide gamut monitor. I mean, as far as I know, most standard gamut monitors can't even display 100% of sRGB! I just bought a really nice QHD display marketed toward design and photography professionals, and I don't think it's 100%. I thought of getting the wide gamut version, but was advised to stay away because so much of my day-to-day usage would be with things that didn't utilize those gamuts, and generally speaking, my colors would be off. So I went with the standard gamut, like 99% of everybody else.

       

      So what should I do? As it is, I have my Photoshop color space set to sRGB. I just read that Lightroom by default uses ProPhoto in the Develop module, and AdobeRGB in the Library (for previews and such).

       

      Thanks for any help!

       

      Michael

        • 1. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
          bob frost Level 3

          Using wider color spaces in LR or PS future-proofs your edited images. My current monitor already shows 98% of AdobeRGB, and good inkjet printers can print colors way outside sRGB, so the wider spaces pay off now. Of course, if you are sending images to the web, or projecting them, etc, you convert to sRGB.

           

          Bob Frost

          • 2. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
            D Fosse Adobe Community Professional & MVP

            You're on the right track, sort of, but this gamut clipping on a standard display is not nearly as bad as it seems. The point is that the colors it can reproduce are reproduced accurately. Only those outside its gamut are clipped to the gamut boundary, and those can no longer be distinguished from one another.

             

            As Bob says, the main advantage of these really large spaces like ProPhoto is headroom. Once a color is clipped, it's gone. So this means you can work on a file without worrying about clipping until it's output time. Then, of course, you'll have to squeeze it into the output profile. But you haven't lost anything up to that point.

             

            But yes, display gamut is something you should always have in mind, e.g. for soft proofing. On a standard gamut display what you see on-screen is already soft proofed to sRGB, and anything bigger than that is out of gamut and soft proofing it is a waste of time. Which is why a wide gamut monitor is well worth the expense.

            • 3. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
              ssprengel Adobe Community Professional & MVP

              You're essentially asking why anyone would use a larger working space than the monitor's color space.

               

              I am one of those people: I use 16-bit ProPhoto RGB as my RGB working space in Photoshop even though I have a standard gamut monitor.

               

              There are plusses and minuses to small-vs-large gamut color workspace.

               

              The thing I take to be most important is that every time I click Ok or Done on a Filter or Adjustment panel, PS will truncate the colors to the workspace. 

               

              If the workspace is small then the more edits I do the more colors I lose so the end result will have only colors that were in-gamut at every step.

               

              If the workspace is large then, rather than PS truncating to the small working space at every step, I can do a softproof against the output device's color profile at the end and decide what to do with the out-of-gamut colors. And, as the future-proofing argument above goes, I will be saving my document in the large colorspace, so if a wider-gamut device is produced in the future my document's colors can expand into that gamut rather than being limited by whatever monitor I was using at the time.

               

              As a thought experiment, consider a photo with blue sky as well as the reds, yellows, oranges and browns of fall foliage. Suppose I increase the contrast and saturation as an initial step, then shift the color balance toward red as a subsequent step and generate some output, then shift the color balance back toward blue and generate some more output - for one output I want warmer colors and for the other I want cooler colors. My initial saturation increase probably pushed the blues beyond the gamut, and my subsequent warming of the color balance would have brought them back into gamut. If I had truncated them after increasing the saturation, by using a small workspace, then I'd have lost some of the blue range when I warmed the photo later. I'd rather keep my blues as I go along and only worry about what is out of gamut at the end.

               

              It is my understanding that when colors are out-of-workspace-gamut they are just truncated to the nearest in-gamut color.  This can lead to abrupt edges in a color gradient--the sky for example.  I'd rather those edges only be visible on my monitor and not baked into the document at each step.
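              If it helps to see it with numbers, here's a rough NumPy sketch of that idea. The values and the saturation formula are invented purely for illustration (this is not what ACR or PS actually compute), but they show how a mid-chain clip flattens a sky gradient in a way a later edit can't undo:

              import numpy as np

              # Three slightly different "sky blues", as linear RGB in [0, 1] relative to a
              # small working space (think sRGB-sized gamut). Values are made up.
              blues = np.array([[0.10, 0.20, 0.90],
                                [0.10, 0.20, 0.95],
                                [0.10, 0.20, 1.00]])

              def saturate(rgb, amount):
                  # Crude stand-in for a saturation slider: push channels away from each pixel's gray value.
                  gray = rgb.mean(axis=-1, keepdims=True)
                  return gray + (rgb - gray) * amount

              boosted = saturate(blues, 1.6)          # step 1: saturation up; all three blues exceed 1.0
              small_space = np.clip(boosted, 0, 1)    # small working space: clipped to the gamut edge now
              wide_space  = boosted                   # large working space: out-of-range values are kept

              # step 2: a later edit pulls saturation back down
              print(saturate(small_space, 0.7))   # the clipped version: the blues are now nearly identical (gradation gone)
              print(saturate(wide_space, 0.7))    # same as applying the net edit (1.6 * 0.7) directly - the gradation survives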

              • 4. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                Simon G E Garrett Level 2

                Lightroom doesn't simply default to those colour spaces: there's no choice.  Lightroom uses a number of different colour spaces:

                • Develop module uses ProPhoto RGB primaries with a linear gamma (tone response curve, TRC) for most purposes, but uses ProPhoto RGB with sRGB's TRC for histograms and RGB readouts (there's a small sketch of that curve just after this list).
                • Library module uses Adobe RGB - as the previews are stored as 8-bit JPEGs, and ProPhoto RGB can show visible banding in 8 bits.
                • Web module (AFAIK) uses the sRGB colour space - so that it matches most web users' monitors.
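                For anyone wondering what "sRGB's TRC" means in practice, here is the standard published sRGB encoding curve, written out in Python purely as an illustration (it's the public formula, not anything taken from Adobe's code):

                def srgb_encode(linear):
                    # Standard sRGB transfer curve: linear light in [0, 1] -> encoded value in [0, 1].
                    if linear <= 0.0031308:
                        return 12.92 * linear
                    return 1.055 * linear ** (1 / 2.4) - 0.055

                # Linear 0.18 (roughly middle grey) encodes to about 0.46, which is why histograms
                # and RGB readouts computed with a linear TRC would look much darker than the
                # sRGB-TRC numbers Lightroom actually shows.
                print(srgb_encode(0.18))   # ~0.461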

                 

                There's really no disadvantage in using a wide colour space as a working space.  It's a bit like saying "my thermometer scale runs from 0 to 100C, but internally on my computer I store temperature in numbers that go from -1000 to +1000."  Sure, I can display only temperatures in the range 0 to 100, but so what if my computer can store bigger numbers?  

                 

                I don't think it's true to say that something nasty happens if the software can handle colours outside the range of the monitor.  Better that colours outside the range of the monitor are handled gracefully than simply hard-clipped at the boundary.

                 

                An example: suppose you increase the saturation on an image.  Then later you decide that you've gone too far, and reduce the saturation.  After increasing the saturation, you may have taken some colours outside sRGB.  If you work in sRGB, you will have clipped those colours at the sRGB boundary.  When you reduce the saturation again, you will get distorted colour.  If the colour had been allowed to go beyond sRGB, then when the saturation is reduced, colours won't be distorted.
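                To put a number on "distorted": because the three channels are clipped independently at the gamut boundary, the ratios between them - and therefore the hue - change. A toy sketch with invented values (not Photoshop's actual maths):

                import numpy as np

                orange = np.array([0.95, 0.55, 0.10])   # a saturated orange, linear RGB in a small space

                def saturate(rgb, amount):
                    gray = rgb.mean()
                    return gray + (rgb - gray) * amount

                boosted = saturate(orange, 1.8)         # red overshoots 1.0 and blue goes below 0.0
                clipped = np.clip(boosted, 0, 1)        # each channel clipped independently

                print(saturate(boosted, 1 / 1.8))       # undoing the edit recovers the original orange
                print(saturate(clipped, 1 / 1.8))       # ~[0.79, 0.54, 0.23]: duller, and the hue has shifted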

                 

                The logic in your first paragraph is, I think, the wrong way round.  If I understand you correctly, you're suggesting a wide working space can't make it better and may make it worse.  I would say that a wide working space can't make it worse, and can make it better!  Colours outside the sRGB gamut aren't going to be displayed on most monitors, which have very approximately sRGB gamut.  But it doesn't make it better to process the colour in sRGB.  At best it makes no difference, at worst it results in distortion from uncontrolled gamut clipping.

                 

                And forcing everything to sRGB loses future proofing.  We're likely to see 4K driving wider gamuts in monitors (see https://en.wikipedia.org/wiki/Ultra-high-definition_television).  Cameras have colour gamuts significantly wider than sRGB; why not keep that wider gamut colour information for the day when we have wider gamut displays?

                 

                PS - Lightroom uses ProPhoto RGB in the Develop module, as I said, so I use ProPhoto RGB as the working space in Photoshop. I can't see any reason to use any other working space, but would be interested if anyone can explain something I'm missing.

                • 5. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                  Simon G E Garrett Level 2

                  ssprengel wrote:

                   

                  There are plusses and minuses to small-vs-large gamut color workspace.

                   

                   

                  One possible disadvantage of a large gamut working space is that, if one works in only 8-bit, there could be more banding in a very wide colour space such as ProPhoto RGB. This is because the step size between adjacent colours in 8 bits might be large enough to be visible. Provided one works in 16-bit, I can't think of any disadvantage to the widest possible working space - which effectively means ProPhoto RGB.
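                  One way to see the step-size point with numbers. The factor of one half below is a deliberately invented assumption (that the same physical colours occupy only about half as much of the encoding range in the wide space); the real relationship is more complicated, but the effect is the same:

                  import numpy as np

                  # A subtle ramp of 200 greens that fits comfortably inside sRGB,
                  # expressed as a fraction of the sRGB green axis.
                  ramp_srgb = np.linspace(0.40, 0.48, 200)

                  # Illustrative assumption: in a much wider space the same colours
                  # occupy only about half as much of the encoding range.
                  ramp_wide = ramp_srgb * 0.5

                  print(np.unique(np.round(ramp_srgb * 255)).size)    # 21 distinct 8-bit levels in sRGB
                  print(np.unique(np.round(ramp_wide * 255)).size)    # 11 distinct 8-bit levels in the wide space: coarser steps, more visible banding
                  print(np.unique(np.round(ramp_wide * 65535)).size)  # all 200 samples stay distinct in 16-bit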

                   

                  But maybe I'm missing something?   I'm always open to being shown the errors of my views!

                  • 6. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                    ssprengel Adobe Community Professional & MVP

                    There is always a problem using an 8-bit colorspace, so I didn’t include that in my analysis.   The gaps are just wider in a wider colorspace, but you’ll get banding in any case, depending on the amount of adjustment.

                    • 7. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                      thedigitaldog MVP & Adobe Community Professional

                      michael_____ wrote:

                       

                      And yet nobody seems to mention that the majority of people - including presumably many of those championing the wider gamut color spaces - are working on standard gamut displays. And to my mind, this is a huge oversight.

                      The weak link here is the display. So you have to ask yourself: do you funnel color into the gamut of the display, losing color gamut you could utilize for output to a print, OR work within the limitations of the display gamut and keep the data in a wider gamut for output to print? Only you can answer that. But I'd suggest that if you output your raw data to a modern printer, say a good photo inkjet, you'll want to stick with ProPhoto RGB, which is the gamut of the processing color space used internally within the ACR/LR engine. IOW, using any color space smaller than ProPhoto's gamut will potentially clip colors. To prove this to yourself, using your own printer:

                       

                      The benefits of wide gamut working spaces on printed output:

                      This three part, 32 minute video covers why a wide gamut RGB working space like ProPhoto RGB can produce superior quality output to print.

                       

                      Part 1 discusses how the supplied Gamut Test File was created and shows two prints output to an Epson 3880 using ProPhoto RGB and sRGB, and how the deficiencies of the sRGB gamut affect final output quality. Part 1 discusses what to look for on your own prints in terms of better color output. It also covers Photoshop's Assign Profile command and how mishandled wide gamut spaces produce dull or oversaturated colors due to user error.

                       

                      Part 2 goes into detail about how to print two versions of the properly converted Gamut Test File in Photoshop, using Photoshop's Print command to correctly set up the test files for output. It covers the Convert to Profile command for preparing test files for output to a lab.

                       

                      Part 3 goes into color theory and illustrates why a wide gamut space produces not only more vibrant and saturated color but also better detail and color separation compared to a small gamut working space like sRGB.

                       

                      High Resolution Video: http://digitaldog.net/files/WideGamutPrintVideo.mov

                      Low Resolution (YouTube): https://www.youtube.com/watch?v=vLlr7wpAZKs&feature=youtu.be

                      • 8. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                        michael_____ Level 1

                        Okay - school is definitely in session. Much for me to chew on here... (I haven't figured out how to embed quotes from multiple messages so it's copy and paste)


                        Simon G E Garrett Apr 28, 2015 5:49 PM

                        If I understand you correctly, you're suggesting a wide working space can't make it better and may make it worse. 


                        ssprengel Apr 28, 2015 4:59 PM

                        You're essentially asking why anyone would use a larger working space than the monitor's color space.


                        Yes. Thank you for putting it simply.


                        D Fosse Apr 28, 2015 3:36 PM

                        You're on the right track, sort of,

                         

                        I'm trying!


                        bob frost Apr 28, 2015 2:28 PM

                        Using wider color spaces in LR or PS future-proofs your edited images.


                        D Fosse Apr 28, 2015 3:36 PM

                        As Bob says, the main advantage of these really large spaces like ProPhoto is headroom. Once a color is clipped, it's gone.


                        Simon G E Garrett Apr 28, 2015 5:49 PM

                        And forcing everything to sRGB loses future proofing. 


                        I thought I was future-proofed because of the files, not the color space. I thought it was the RAW/DNG that made everything happy family, because there's no color space/gamut restriction there. And I'm editing non-destructively (aren't I?).


                        D Fosse Apr 28, 2015 3:36 PM

                        So this means you can work on a file without worrying about clipping until it's output time.

                         

                        Right. (I'm losing my train of thought now) I think I've been misunderstanding what's really going on in LR and PS when editing - where and when these destructions/potential badnesses might occur. For instance, I kind of thought that if there's going to be clipping, it won't be until I assign the color space for output. And in the meantime, while working on my sRGB display, it might be better to use sRGB as a workspace, because that way I'm seeing the images 1:1 with what my display can reproduce. But I think I see now how that doesn't really make sense. I guess I would be handicapping myself if, say, I wanted to output in Adobe RGB for someone who used a wide gamut display, from my sRGB color space. (unless I changed to ARGB before editing? ugh, my brain is melting again)

                         

                        ssprengel Apr 28, 2015 4:59 PM

                        If the workspace is small then the more edits I do the more colors I lose

                         

                        The image is being limited right off the bat by having my color space sRGB? I guess that's what I wasn't factoring. Or realizing.


                        Simon G E Garrett Apr 28, 2015 5:49 PM

                        The logic in your first paragraph is, I think, the wrong way round...I would say that a wide working space can't make it worse, and can make it better!


                        I think I see. It's looking like having a working space / color space (I know they're different!) of sRGB kind of defeats the purpose of shooting RAW?


                        Thanks all for taking the time, and sharing your wisdom. And articulating plainly your thoughts on what is (for me) a very complex confluence of math, science and theory. And for the video references (thedigitaldog). More homework. I'm getting there.

                        • 9. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                          ssprengel Adobe Community Professional & MVP

                          If your PS RGB working space is sRGB, then all colors are limited to that after each operation, including when you open an image from raw. However, how much raw benefit you lose depends on how many of your adjustments you do in the ACR plug-in and how many you do in PS, and on what color space the ACR plug-in's workflow options are set to - that setting limits the data on the way into PS even if PS's working space is larger, I think.

                           

                          My normal settings are:  ACR plug-in 16-bit ProPhotoRGB, PS RGB Workspace:  ProPhotoRGB and I convert any 8-bit documents to 16-bit before doing any adjustments.  I save my PS intermediate or final master copy of my work as a 16-bit TIF still in the ProPhotoRGB, and only when I'm ready to share the image do I convert to sRGB then 8-bits, in that order, then do File / Save As: Format=JPG.
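                          One note on the "in that order" part: doing the color-space conversion while the data is still high precision, and quantizing to 8 bits only once at the very end, avoids an extra rounding pass. A rough sketch - the gamma curve below is just a stand-in for the real ProPhoto-to-sRGB conversion, which needs proper ICC math:

                          import numpy as np

                          vals = np.linspace(0.0, 1.0, 200)        # stand-in for high-precision ProPhoto-ish data in [0, 1]

                          def to_output_space(x):
                              # Stand-in for the real profile conversion; any nonlinear remap shows the ordering effect.
                              return np.clip(x, 0, 1) ** (1 / 2.2)

                          # Recommended order: convert in high precision, then quantize to 8 bits once at the end.
                          good = np.round(to_output_space(vals) * 255).astype(np.uint8)

                          # Reversed order: quantize to 8 bits first, then convert - each value carries its
                          # rounding error through the conversion, and the shadows suffer the most.
                          early = np.round(vals * 255) / 255
                          bad = np.round(to_output_space(early) * 255).astype(np.uint8)

                          print(np.abs(good.astype(int) - bad.astype(int)).max())   # a nonzero difference, worst in the shadows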

                          • 10. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                            Simon G E Garrett Level 2

                            A few responses to Michael, and take it that everything I say is prefixed by "IMHO" - I'm always willing to be corrected. 

                             

                            "I thought I was future-proofed because of the files, not the color space. I thought it was the RAW/DNG that made everything happy family, because there's no color space/gamut restriction there. And I'm editing non-destructively (aren't I?). "

                             

                            Yes, and to expand on that a bit. The raw files have the colour gamut of the camera sensor, which will certainly be larger than sRGB, and probably larger than Adobe RGB. Lightroom editing is non-destructive, which is to say the original image data isn't altered. Lightroom instead keeps a list of the edits you do, and applies them on the fly when you view the image. However, editing is done by LR in its working space (not the camera sensor's). "Working (colour) space" is just the term for the colour space used internally by some piece of software. In general, the requirements for a working space are that it should be at least as wide as the range of colours in the image (preferably wider, to allow for intermediate calculations that might go wider), and the resolution should be such that the steps aren't visible. There's no such thing as too wide or too high a resolution, IMHO. Using ProPhoto RGB in 16-bit resolution is ideal, IMHO.

                             

                            In the case of LR, it uses ProPhoto RGB, which is almost certainly a wider colour space than the camera sensor's, so you don't lose any colours.  If you edit in Photoshop (or another pixel editor), then Lightroom renders the image into Photoshop's working space.  Again, if you use ProPhoto RGB for Photoshop's working space, then no colours are lost.  If you use sRGB, then all colours outside sRGB are permanently lost in the copy of the image that Photoshop is working on.

                             

                            By using Lightroom, or using a wide working space when using any other editor, in my view you're keeping your options open.  Only when the image is output - to a monitor or printer, say - the colours are limited to the gamut of that device.  And when you edit the image, obviously you can't "see" colours outside the monitor's colour gamut, but that's better than throwing them away.  You might want them later.

                             

                            I can think of one disadvantage of ProPhoto RGB: I think you really need to use 16-bit with ProPhoto RGB.  With Adobe RGB or sRGB then 8-bit jpegs are fine for many purposes.  With 8 bits, there are 256 possible values.  If you use those 8 bits to cover a wider range of colours, then the difference between two adjacent values - between 100 and 101, say - is a larger difference in colour.  With ProPhoto RGB in 8-bits there is a chance that this is visible, so a smooth colour wedge might look like a staircase.  Hence ProPhoto RGB files might need to be kept as 16-bit TIFs, which of course are much, much bigger than 8-bit jpegs. 

                             

                            For me, the big benefit of Lightroom is that the raw files are kept as the masters, which are much smaller than tifs (they aren't generally demosaiced) but retain all the original information from the sensor. 

                            • 11. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                              michael_____ Level 1

                              Okay. Going bigger is better, do so when you can (in 16-bit). Darn, those TIFs are big though. So, ideally, one really doesn't want to take the picture to Photoshop until one has to, right? Because as long as it's in LR, it's going to be a comparatively small file (a dozen or two MBs vs say 150 as a TIF). And doesn't LR's develop module use the same 'engine' or something, as ACR plug-in? So if your adjustments are basic, able to be done in either LR Develop, or PS ACR, all things being equal, choose to stay in LR?

                               

                              ssprengel Apr 28, 2015 9:40 PM

                              PS RGB Workspace:  ProPhotoRGB and I convert any 8-bit documents to 16-bit before doing any adjustments.

                               

                              Why does one convert 8-bit pics to 16-bit? Not sure if this is an apt comparison, but it seems to me that that's kind of like upscaling, in video. Which I've always taken to mean adding redundant information to a file so that it 'fits' the larger canvas, but to no material improvement. In the case of video, I think I'd rather watch a 1080p movie on an HD (1080) screen (here I go again with my pixel-to-pixel prejudice), than watch a 1080p movie on a 4K TV, upscaled. But I'm ready to be wrong here, too. Maybe there would be no discernible difference? Maybe even though the source material were 1080p, I could still sit closer to the 4K TV, because of the smaller and more densely packed array of pixels. Or maybe I only get that benefit when it's a 4K picture on a 4K screen? Anyway, this is probably a different can of worms. I'm assuming that in the case of photo editing, converting from 8 to 16-bit allows one more room to work before bad things start to happen?

                               

                              I'm recent to Lightroom and still in the process of organizing from Aperture. Being forced to "this is your life" through all the years (I don't recommend!), I realize probably all of my pictures older than 7 years ago are jpeg, and probably low-fi at that. I'm wondering how I should handle them, if and when I do. I'm noting your settings, ssprengel.

                               

                              ssprengel Apr 28, 2015 9:40 PM

                              I save my PS intermediate or final master copy of my work as a 16-bit TIF still in the ProPhotoRGB, and only when I'm ready to share the image do I convert to sRGB then 8-bits, in that order, then do File / Save As: Format=JPG.


                               

                              Part of the same question, I guess - why convert back to 8-bits? Is it for the recipient?  Do some machines not read 16-bit? Something else?

                               

                              For those of you working in these larger color spaces but not with a wide gamut display, I'd love to know if there are any reasons you've chosen not to use one. Because I guess my biggest concern in all of this has been tied to what we're potentially losing by not seeing the breadth of the color space we work in represented while making value adjustments to our images. Based on what several have said here, it seems that the instances when our displays are unable to represent something as intended are infrequent, and when they do arise, they're usually not extreme.

                               

                              Simon G E Garrett Apr 29, 2015 4:57 AM

                              With 8 bits, there are 256 possible values.  If you use those 8 bits to cover a wider range of colours, then the difference between two adjacent values - between 100 and 101, say - is a larger difference in colour.  With ProPhoto RGB in 8-bits there is a chance that this is visible, so a smooth colour wedge might look like a staircase.  Hence ProPhoto RGB files might need to be kept as 16-bit TIFs, which of course are much, much bigger than 8-bit jpegs.

                               

                              Over the course of my 'studies' I came across a side-by-side comparison of either two color spaces and how they handled value gradations, or 8-bit vs 16-bit in the same color space. One was a very smooth gradient, and the other was more like a series of columns, or as you say, a staircase. Maybe it was comparing sRGB with AdobeRGB, both as 8-bit. And how they handled the same "section" of value change. They're both working with 256 choices, right? So there might be some instances where, in 8-bit, the (numerically) same segment of values is smoother in sRGB than in AdobeRGB, no? Because of the example Simon illustrated above?

                               

                              Oh, also -- in my Lumix LX100 the options for color space are sRGB or AdobeRGB. Am I correct to say that when I'm shooting RAW, these are irrelevant or ignored? I know there are instances (certain camera effects) where the camera forces the shot as a jpeg, and usually in that instance I believe it will be forced sRGB.

                               

                              Thanks again. I think it's time to change some settings..

                              • 12. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                                thedigitaldog MVP & Adobe Community Professional

                                michael_____ wrote:

                                 

                                Okay. Going bigger is better, do so when you can (in 16-bit). Darn, those TIFs are big though. So, ideally, one really doesn't want to take the picture to Photoshop until one has to, right?

                                Yes! And once you do, you're done in LR's Develop module. You've got your 16-bit wide gamut master for further work there. If you send sRGB, you're stuck with that data from now on. You can use LR to catalog the new master you'll presumably edit in Photoshop using its tool set. It's high bit, full gamut data that you can downsize and convert to sRGB when sRGB is useful (pretty much posting to the net and mobile devices). You can print the master in LR of course, build slide shows and web galleries, and even use LR to convert the data to sRGB for that web work. So the idea is: do all the work in Develop you can. Render the pixels for further editing in PS if needed. Catalog that master for creating iterations from within LR or PS proper.

                                 

                                Consider you need to do 3 hours of Photoshop work. If you render full resolution, high bit, wide gamut data and edit that for three hours, you can do anything you want with that data from now on. If you start with sRGB at 8 bits per color and do the three hours of work in PS, you've painted yourself and your data into a corner. You can go smaller in terms of pixel dimensions and color gamut. You can't go the other way.

                                • 13. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                                  michael_____ Level 1

                                  Thanks!

                                   

                                  Why can't Photoshop edit our RAW/DNGs like Lightroom does? They're both using a wide gamut color space. Why do files have to become behemoths in PS?

                                  • 14. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                                    thedigitaldog MVP & Adobe Community Professional

                                    michael_____ wrote:

                                    Why can't Photoshop edit our RAW/DNGs like Lightroom does? They're both using a wide gamut color space. Why do files have to become behemoths in PS?

                                    Photoshop only works on rendered pixels. Adobe Camera Raw can work on raw data like LR. One is a pixel editor (PS), the other two are raw converters that produce rendered pixels FOR Photoshop.

                                    • 16. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                                      ssprengel Adobe Community Professional & MVP

                                      >> Why convert from 8-bit to 16-bit?

                                       

                                      This is about keeping all the data when I am adjusting colors - making room for fractional adjustments to be preserved.

                                       

                                      For example, an 8-bit image has values from 0 to 255 for the Red, Green and Blue channels. If a small saturation adjustment moves the red channel from 120 to 126.25, then in 8 bits the value would be truncated to 126. If I am using 16-bit, then each number from 0-255 effectively has another fractional part of 0-255, so I can keep my 126.25 value. This is a simplistic example, because things don't start out as whole numbers and then add fractions to those whole numbers when you convert to 16-bits, but the idea is similar: more digits of precision allow more "in-between" numbers to represent the colors more precisely, instead of chopping them off to the low precision of 8 bits after each OK or Done on a Filter or Adjustment panel.
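                                      A quick check of that arithmetic, treating 16-bit as 8-bit values with 256 fractional sub-steps each (the same simplification as above, scaled by 257, the usual 8-to-16-bit expansion):

                                      import numpy as np

                                      gain = 126.25 / 120                            # the small adjustment from the example above

                                      v8  = np.uint8(np.round(120 * gain))           # 8-bit: 126, the .25 is gone for good
                                      v16 = np.uint16(np.round(120 * 257 * gain))    # 16-bit: 32446 keeps the fraction

                                      print(v8, v16 / 257)                           # 126  126.249... on the 0-255 scale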

                                       

                                      16-bit files should only be twice as big as 8-bit files. Raw files are so small because they have only one color per pixel, usually consisting of 12 or 14 bits, and LR and ACR are both in charge of making up the other two colors (out of red, green or blue) for each pixel location using a process commonly referred to as demosaicing.

                                       

                                      So besides going "big" by using ProPhoto RGB as the working space, I also go "deep" with the working pixel numbers.

                                      • 17. Re: Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?
                                        Jao vdL Adobe Community Professional & MVP

                                        I can offer up an alternative way of saying what folks have already said here. Working in a wide colorspace is all about avoiding clipping at any stage of the processing chain. Imagine the camera Raw/Lightroom processing chain as a series of steps where the data gets modified one way or another. If at any stage the data goes outside of the possible range (even if in a subsequent stage it would be pulled back!) you have a big problem. Imagine for example a vibrance slider setting of +50 and a saturation slider setting of -50. You could imagine in the vibrance step some colors flying out of sRGB and therefore clipping to the max that sRGB can go to. Then the -50 saturation step would pull it back, but all gradation in the saturated color will be lost. If you worked in a wider color space, you would get back a good image. So in any processing chain, you need to have sufficient headroom that at any stage in the chain you don't run out of range. This is true even if your starting and ending point do not need that headroom.

                                         

                                        Similarly, the 16- vs 8-bit question is all about avoiding compounding errors in the processing chain. In most cases it is certainly possible to have a visually indistinguishable version of the end product of the processing in 8 bits; however, when you are doing multiple processing steps to your image data, you can easily generate artifacts at intermediate steps. Imagine a subtly gradated blue sky where the tint changes from blue to cyan or some such. Now if your slider settings at one stage desaturate all colors (because you want to decrease overall saturation), in an 8-bit processing chain this could leave the entire sky spanning only a few code values. You would not notice much at this point; it would just look more grayish than before. Then, when you use the HSL sliders to dial the blue and cyan saturation back up because that's what you creatively wanted, you would suddenly be left with terrible posterization, even though intuitively your sky should look like it did originally. This would not happen in a higher-bit workflow, since you have so many more levels to represent the image data. You can always do better of course - you could use 32-bit floating point, for example - but for images 16-bit is a good tradeoff.
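                                        Here is that sky scenario as a toy numeric sketch - the values and the saturation stand-in are made up, but the mechanism is the one described above:

                                        import numpy as np

                                        sky = np.linspace(0.55, 0.70, 300)       # a subtle gradient in one channel of a blue sky

                                        def scale_sat(x, amount):
                                            # Crude stand-in for a saturation move: scale values around a mid-point.
                                            return 0.5 + (x - 0.5) * amount

                                        def process(x, quantize):
                                            x = quantize(scale_sat(x, 0.05))     # step 1: strong global desaturation
                                            x = quantize(scale_sat(x, 20.0))     # step 2: HSL-style move brings the saturation back up
                                            return x

                                        q8  = lambda x: np.round(x * 255) / 255        # 8-bit processing chain
                                        q16 = lambda x: np.round(x * 65535) / 65535    # 16-bit processing chain

                                        print(np.unique(process(sky, q8)).size)    # only 3 distinct levels left: visible posterization
                                        print(np.unique(process(sky, q16)).size)   # all 300 levels survive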

                                         

                                        P.S. This is the exact same reason why audio processing in professional settings happens at a high sample rate and bit depth, such as 96 kHz at 24 bits, while audio delivery happens at 44.1 kHz and 16 bits (i.e. CD format). Nobody can hear the difference between 44.1 kHz at 16 bits and 96 kHz at 24 bits (proven over and over in actual testing), but many people can pick up on artifacts introduced by the processing if it all happened at the lower sample rate and bit depth and there were some modest amplification and equalization steps in there that could produce clipping or loss of resolution at any stage, even if nobody can hear outside the nominal data range. Same thing as with sRGB at 8 bits vs ProPhoto RGB at 16 bits.

                                         

                                        Last, even though most displays are close to sRGB, none is exactly sRGB. If you calibrate any screen well, you will see that the primaries deviate subtly from sRGB. You can introduce annoying and visible color shifts by having your data limited to sRGB instead of a wider space because of these subtle differences. A good example is the sRGB blue/purple problem.