Lightroom has color management - most image viewers don't.
But the Windows Photo Viewer (at least on Windows 7) is color managed, so it should display the same as LR.
And you're not seeing the oversaturated colors that are typical of viewers without color management, so this is puzzling.
Possibly an issue with your monitor profile - I don't know ...
You could try to re-calibrate your monitor.
This problem is really mystifying me so I ran some controlled experiments to try to figure out what's going on. I used Matlab to create a 899x899 pixel test image and saved it as both a BMP and TIF. Here's generally what it looks like:
The code I used to produce it is available upon request, but I don't want to make everyone's eyes glaze over in this post.
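For anyone curious, here is a rough sketch of this kind of test image in Python (NumPy + Pillow standing in for the Matlab I actually used; the exact pattern here is just illustrative, not the one in my test):

```python
import numpy as np
from PIL import Image

SIZE = 899  # matches the 899x899 test image described above

# Illustrative pattern: channel ramps so that many distinct 8-bit RGB
# values appear across the image (my actual pattern differed).
y, x = np.mgrid[0:SIZE, 0:SIZE]
img = np.zeros((SIZE, SIZE, 3), dtype=np.uint8)
img[..., 0] = (x * 255 // (SIZE - 1)).astype(np.uint8)              # red: horizontal ramp
img[..., 1] = (y * 255 // (SIZE - 1)).astype(np.uint8)              # green: vertical ramp
img[..., 2] = ((x + y) * 255 // (2 * (SIZE - 1))).astype(np.uint8)  # blue: diagonal ramp

# Save the identical pixel data in both formats (both are lossless here)
Image.fromarray(img).save("raw.bmp")
Image.fromarray(img).save("raw.tif")
```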
* Loaded raw.bmp and raw.tif into a number of different programs
* Took screenshots of the image being displayed
* Saved those screenshots to BMP
* Loaded the screenshots into Matlab (via imread)
* Performed a pixelwise comparison of raw.bmp and the screenshot (raw.tif exactly matches raw.bmp; confirmed by loading them both in Matlab and doing the same comparison)
Note that each pixel, in the raw BMP, raw TIF, and screenshot BMP, consists of three 8-bit numbers because that's how image data is sent to most video cards, including mine -- in other words, that's a raw expression of how the image is being displayed on the monitor from the video card's perspective.
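The comparison step above can be sketched like this (again a Python/NumPy stand-in for the Matlab I actually used; file names are placeholders, and the screenshot is assumed to be already cropped to exactly the image region):

```python
import numpy as np
from PIL import Image

def compare_to_original(original_path, screenshot_path, map_path=None):
    """Pixelwise-compare two same-sized images; optionally write a
    white-on-black map of every pixel where any 8-bit channel differs
    (white = any difference regardless of size, black = identical)."""
    a = np.array(Image.open(original_path).convert("RGB"))
    b = np.array(Image.open(screenshot_path).convert("RGB"))
    assert a.shape == b.shape, "crop the screenshot to the image region first"
    mask = (a != b).any(axis=-1)          # True where the pixels differ
    if map_path is not None:
        Image.fromarray(np.where(mask, 255, 0).astype(np.uint8)).save(map_path)
    return int(mask.sum())                # number of differing pixels

# e.g. compare_to_original("raw.bmp", "lr_screenshot_crop.bmp", "diffmap.bmp")
```

The same routine also produces the white/black difference maps shown further down.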
The following viewers produced screenshots that were pixelwise identical to the source image for all pixels in the image:
* Windows Photo Viewer (bmp & tif)
* Picasa Photo Viewer (bmp & tif)
* Windows Photo Gallery (bmp & tif)
* Windows Paint (bmp & tif)
* Chrome (bmp only; tif not supported)
* Adobe Photoshop Elements 11.0 (bmp & tif)
Only Lightroom produces a screenshot with differing pixel values (tested with the TIF only, because Lightroom won't import BMPs). In the pictures below, white indicates that the screenshot pixel in the Library (left) or Develop (right) preview was different from the original:
Additionally, this is the difference map (white=any difference regardless of size, black=identical) of importing raw.tif into Lightroom, then exporting it as a TIFF:
Most of this exported image is identical to the original. The white pixels are probably where there was rounding error/loss of precision when converting to and from a different color space. All of this is very consistent with the idea that Lightroom is performing some kind of adjustment to the images when displaying them on my screen because it thinks my monitor needs to be corrected. However, it is the only viewer on my computer that exhibits this behavior, and I can't figure out how to disable it.
One final note is that when I start Photoshop Elements, I get this message:
However, I get the same results (screenshot pixels always exactly identical) regardless of which option I choose.
So again, how do I stop Lightroom from applying whatever correction it's applying to my on-screen images?
I'm not sure whether this has been suggested elsewhere on this thread as well, but as it happens I had already done that before the second post:
Per Berntsen in reply #1: "Possibly an issue with your monitor profile - I don't know ... You could try to re-calibrate your monitor."
There must be a logical explanation for what's causing the issue, since the vast majority of users aren't experiencing this problem. If you can upload one of the pictures to Dropbox or another file-sharing site, we can try to determine what's happening. Export one of the affected files to DNG file format, which will embed your LR Develop settings. Thank you!
I think I've resolved this issue for myself, and I've been trying to figure out exactly what happened so I could close out this thread publicly, but unfortunately I don't really know what was going on. As best I can tell, the issue really was with Windows color management, and it was addressed by adding and using an sRGB IEC61966-2.1 ICC profile (as per trshaner's lightroomqueen.com link). I can switch between the old profile and that profile, and programs take whatever profile is active when they are started. So, I can:
- Select the old profile
- Open the photo with Windows Photo Viewer
- See that it looks bad
- Select the IEC profile
- Open the photo again with Windows Photo Viewer (spawns new instance)
- See that it looks good in the new instance
- Switch to the previous instance of Windows Photo Viewer and see that the photo still looks bad
- Scroll through pictures in previous instance and then back to the original picture; observe that it still looks bad
This is very replicable, so I feel pretty confident in saying that most programs take the Windows system color profile selected when they are first started and use that profile to display all content for their entire lifecycle. The one exception to this rule is Lightroom: it seems to be immune to these Windows system color profiles. I'm not sure how, but it always displays the image correctly regardless of system color profile selected. This seems a bit odd since I've never performed any calibration in Lightroom, but this is how it seems to work in any case. So basically, Lightroom was the only sane program on my computer that I tested.
I can't explain how I got the results I did above. I haven't redone the full set of tests (that was pretty time consuming), but the spot checks I do now yield different results -- the problematic picture as viewed in other programs looks very similar to it as viewed in Lightroom. I did a lot of things in between like restarting my computer, but I don't see how that can account for the difference because I observe color profiles taking effect now without restarting the computer -- just restarting the programs. So, I think I'm going to leave that a mystery and simply be happy that it's solved for me.
As best I can tell, the issue really was with Windows color management and was addressed by adding and using an sRGB IEC61966-2.1 ICC profile (as per trshaner's lightroomqueen.com link).
LR and apparently the Windows Photo Viewer require an ICC Version 2 monitor profile. My guess is that the original monitor profile you were using is ICC Version 4, which is incompatible with both applications. Substituting the sRGB color profile (or Adobe RGB for a wide-gamut display) is only a temporary solution, since it isn't a proper monitor profile. The best solution is to purchase or borrow a good monitor calibrator and adjust your display to 100-120 cd/m2 luminance, 6500 K color temperature, and 2.2 gamma. This is the only way to achieve accurate color and print output that matches what you see on the display.
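As a small aside on what the 2.2 gamma figure means numerically (a generic illustration of gamma encoding, not anything specific to a particular calibrator or profile):

```python
# A display gamma of 2.2 means the monitor maps a normalized input
# signal v in [0, 1] to linear light roughly as v ** 2.2, so content
# is encoded with the inverse curve, v ** (1 / 2.2).

def encode_gamma(linear, gamma=2.2):
    """Encode a linear-light value in [0, 1] for a gamma-2.2 display."""
    return linear ** (1.0 / gamma)

def decode_gamma(encoded, gamma=2.2):
    """Linear light the display emits for an encoded signal value."""
    return encoded ** gamma

mid = decode_gamma(0.5)  # a 50% signal yields only about 22% of full luminance
```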
You can also use the website below to try to "visually" calibrate your monitor using the sRGB profile. At the very least it will get you closer to the proper settings.