My wife is an aspiring fine arts photographer/editor, currently working on a 2008 MacBook, which I know is not the best thing to be editing photos on. I have read some of the forums that talk about wide gamut and sRGB gamut but, frankly, they exceed my technical knowledge. I would like to get her a monitor that she can hook to her MacBook that won't break the bank but gives her a higher quality image to work on - one more similar to what professionals will review her images on once they are submitted. I recently got her a Wacom 4 tablet and she is using PS6 and LR4 as her main editing suites, if that matters. Her intention is to submit these images as potential book covers, which means they will eventually be printed. Any suggestions?
Look for a monitor whose response characteristics don't change as you view it from different angles (in other words, it shouldn't get lighter or darker as you move your head up and down). That would generally be an IPS-type display.
With monitors I've noticed that, generally speaking, you get what you pay for. Try not to skimp too much.
This is a good site for reading reviews:
I'm personally most fond of Dell monitors (e.g., their Ultrasharp line), though I admit to being a little unsure how PC monitors work with Mac systems (I'm a PC person). I think the days have passed where there are Mac-only and PC-only monitors, but please don't take my word for it.
You will want to get a monitor that is significantly bigger than the MacBook's screen; a 24" display would be a nice size bump-up. Since the end use is "printed" book covers, that means CMYK offset output. The color gamut of CMYK 4-color printing can be accurately "soft-proofed" on a standard sRGB gamut display inside Photoshop or Lightroom 4. A wide gamut monitor is only beneficial when processing images for 6-color or higher printer output, and even then a standard sRGB display will work well.
One of the most important tools for accurate display color and luminance rendering is a hardware monitor calibrator. If your wife is currently using one, that's great. If not, then your first order of priority should be to purchase one and calibrate the MacBook's display. When you add an external display, the hardware calibrator will be needed for it as well; otherwise it's pretty much a waste of time.
You should be able to find numerous standard gamut displays that will work well with the 2008 MacBook's DVI display connector. Highest supported native resolution is 1920x1200 for the MacBook and 2560x1600 for the MacBook Pro.
Here are some links that may be helpful:
Noel and trshaner, Thank you both for your comments.
Trshaner- great point on making sure it is calibrated and something I do not think is happening at this point. That will certainly be the first purchase. Also appreciate you drawing the distinction between the sRGB and wide gamut displays- while I still don't know the technical differences between the two I can eliminate the wide gamut display for now.
Great links as well.
For a good photo editing monitor that won't break the bank, try the Dell U2412m. I am not a fan of Dell but purchased this monitor a while back when I was on a tight budget after reading some good reviews. I ultimately upgraded to a much more expensive monitor but went right back to the Dell. I use it with my Mac and it works great.
Dells MUST be calibrated because their delta E is way off out of the box.
The OP has already been advised to purchase and use a hardware monitor calibrator on the MacBook's display and any external display(s). Almost all monitors exhibit high delta E out of the box. Wide gamut monitors aren't any better, except for those with "factory calibrated" modes (Adobe RGB, sRGB).
The problem with "factory calibration" is that within a few days or weeks of use the calibration will drift and the delta E will increase. In addition, all factory calibrated monitors are set to luminance levels that are way too high for photo editing use (>150 cd/m2). The only way around this is to purchase a hardware monitor calibrator and use it on a schedule, so there's very little benefit in buying a monitor that's "spot-on" right out of the box.
That said do you have specific monitors you can recommend to the OP?
The Dell U2413 has some nice improvements over the U2410, making it a good wide gamut monitor choice:
The U2412M also has very good performance, making it a good sRGB monitor choice:
Both of these displays use LED backlighting, which should provide better long term calibration stability compared to CCFL.
For the monitor calibrator, the Datacolor Spyder4Elite seems to have the fewest issues on Mac OS X platforms, and it also works with wide gamut displays. Here's a good review and comparison to its closest competitor, the X-Rite i1 Display Pro:
For best calibration results use the 'Advanced' mode with 6500K, 2.2 Gamma, and 120 cd/m2 settings.
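To make those targets a bit more concrete, here's a quick Python sketch (the numbers are just the calibration targets above - 2.2 gamma, 120 cd/m2 white - nothing monitor-specific) of the luminance each 8-bit pixel value should produce once the display is calibrated:

```python
# Calibration targets from the settings above (assumed, not measured values)
WHITE_CD_M2 = 120.0  # target white luminance in cd/m2
GAMMA = 2.2          # target gamma

def target_luminance(code):
    """Luminance (cd/m2) an 8-bit pixel value should produce on a calibrated display."""
    return WHITE_CD_M2 * (code / 255) ** GAMMA

print(round(target_luminance(255), 1))  # 120.0 - white hits the target
print(round(target_luminance(128), 1))  # mid-gray is ~26 cd/m2, not 60: gamma is nonlinear
print(round(target_luminance(0), 1))    # 0.0
```

The point of the 2.2 curve is that equal steps in pixel value correspond to roughly equal perceptual steps in brightness, not equal steps in luminance.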
What intrigues me about some of the newest designs is the ability to calibrate the display at different positions on the screen.
This could yield an absolutely flat, well-calibrated result all across the screen, corners and center alike.
What Lundberg and others may not realize is that for years, because of uneven lighting, the delta-E from one position to another on the display differs far more than the difference of any one point from ideal. These new designs promise to correct that.
Reading the two reviews I posted it appears both the U2413 and U2412M incorporate 'Uniformity Correction' circuitry. Unfortunately it can only be enabled in the factory preset modes, which is not what you want when doing your own hardware calibration. Is this what you are referring to, or something else?
See the 'Conclusion' section at the end of these links:
"One area which disappointed us again was the uniformity correction. We had seen from our U2713H tests that it didn't seem to work at all, but had been more impressed with its implementation on the U2913WM since. With the U2413 the function did work, but it was practically unusable since it wasn't available in any of the main modes you'd want to use it, and even when you did find one where you could, it would lock the brightness at a setting which was too high and uncomfortable for any prolonged use. Along with support for more hardware colorimeter tools for LUT calibration, this is an area we'd like to see Dell address if they can."
Yes, that's what I'm referring to. I imagine that there will be a few designs where they don't get it quite right, then the even newer designs will correct the early problems. I have high hopes for the U3013 in development.
With a wide gamut monitor unless you're running a full 10-bit display signal path this isn't advisable, and if you're using Lightroom it only supports 8-bit color. You already run the risk of banding due to spreading the wider Adobe RGB gamut over an 8-bit signal and double-profiling will make it worse.
In the case of the Dell U2413 it has an internal 14-bit LUT, which can only be programmed using the custom calibration mode and software provided by Dell. Academically using the Dell factory calibrated Adobe RGB internal LUT mode AND external calibration of the graphic card's LUT just doesn't make sense, does it? Read the 'Hardware Calibration' section here for more details:
Sorry Noel, it looks like we just had two cross-postings (double-cross?). I agree 100% that the monitor manufacturers should include an internal LUT in all wide gamut displays (and maybe sRGB) that allows custom programming AND use of features such as 'Uniformity Correction.' (Ref. quoted text from TFT Central site above).
Once OLED technology has matured all of this should become academic. Displays can then be designed to use an internal GPU and programmable LUT to automatically control the luminosity uniformity and color accuracy at the pixel level. Am I a dreamer or what!
trshaner: what relevance does your link to the U2412M have to the discussion of the U2413?
Here's what the OP said in his last response:
"Also appreciate you drawing the distinction between the sRGB and wide gamut displays- while I still don't know the technical differences between the two I can eliminate the wide gamut display for now."
The U2412M is an sRGB gamut monitor, so it would be of interest to the OP.
And why not adjust the monitor's LUT and the card LUT?
Here's what I said:
"With a wide gamut monitor unless you're running a full 10-bit display signal path this isn't advisable, and if you're using Lightroom it only supports 8-bit color. You already run the risk of banding due to spreading the wider Adobe RGB gamut over an 8-bit signal and double-profiling will make it worse."
The primary purpose of using the monitor's internal LUT is to minimize the possibility of banding, since the monitor profile is created at a higher bit depth (i.e. 14 bits for the U2413) than the display signal (i.e. 8-bit/color or 10-bit/color). I never said you couldn't do it, only that it's not advisable unless you have a full 10-bit/color display data path. On Mac OS X platforms you are limited to 8-bit/color; only Windows platforms support 10-bit/color.
Yes, absolutely. +1, trshaner!
It's about maximizing the quality of the displayed image.
If you can put a clean gradient on the screen and see all 256 shades (ideally 1024 if you have the technology), displayed with proper gamma and a good brightness for editing, as well as having colors that are accurate, then you've got your system working as well as it can.
Most folks probably don't notice two-level jumps in gradients, but once you're sensitive to it, loss of precision somewhere in the signal path will bother you.
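Here's a rough Python sketch of why the LUT bit depth matters. The correction curve is made up purely for illustration, but it shows the mechanism: a calibration curve applied in an 8-bit graphics-card LUT collapses some of the 256 shades (visible as banding in a gradient), while the same curve computed at 14-bit precision inside the monitor keeps all 256 input codes distinguishable:

```python
import numpy as np

levels = np.arange(256)  # all 8-bit gray codes in a clean gradient

# Hypothetical calibration correction (a small gamma tweak); the exact curve
# doesn't matter, only the precision it is applied at.
def correct(x, max_code):
    return np.round((x / max_code) ** (2.2 / 2.4) * max_code).astype(int)

# 8-bit path: correction quantized in an 8-bit graphics-card LUT,
# so some neighboring codes round to the same output value.
out_8bit = correct(levels, 255)

# 14-bit path: the same 8-bit inputs scaled into a 14-bit monitor LUT,
# where consecutive codes stay ~64 steps apart after correction.
out_14bit = correct(levels * 64 + 32, 16383)

print(len(np.unique(out_8bit)) < 256)    # True: shades collapsed -> banding
print(len(np.unique(out_14bit)) == 256)  # True: every shade survives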
My two (older) Dell Ultrasharp monitors right now give me a good, clean 8 bit sRGB color, and I believe I'm ready to feed the U3013 10 bit color when it comes out (empty mini DisplayPort socket at the ready). That's going to be a really cool setup. Not cheap, though...
I assume when you say "empty mini DisplayPort socket at the ready" you are on a Mac OS X platform. Unfortunately Mountain Lion OS X v10.8 only supports 8 bit/color. Apple does a good job of not mentioning this in any of its literature, including technical documents.
If you are on Mac OS X and know how to get 30 bit/color in PS please let us know. I'm not trying to bash Apple, just stating the current capability (i.e. 8 bit/color only).
According to Diglloyd, as of 02/01/2013 this is still the case – OS X only supports 8 bit/color:
This is all the more reason for Mac users to only purchase "wide-gamut" monitors that use a programmable internal LUT with high bit-depth (14-16 bits) for creating and applying the monitor profile. Do not double-profile and only use manufacturer approved software and hardware puck to calibrate the internal LUT on monitors like the Dell U2413.
Using a "plain vanilla" sRGB monitor like the Dell U2412M avoids all of these issues, but then what fun is that!
Maybe I learned something material here. Photoshop supports 10 bit but only Win can use it now? Could OS 10.9 support it if Apple chose to do so?
In regard to factory cal, why does Dell do it with high brightness and commercial cal soft/hardware does it with comfortable brightness? Is the factory trying for gamut or Delta E? I'm betting on gamut.
Photoshop supports 10 bit but only Win can use it now? Could OS 10.9 support it if Apple chose to do so?
That is correct. Apple only needs to provide updated drivers that support Open GL 10 bit/color. Some have suggested that Apple hasn't done this because there are no Apple monitors that support 10 bit/color, at least not yet. Some have also suggested that Apple has neglected the pro graphics market to focus more heavily on the larger consumer and mobile devices market.
Regardless of Apple's marketing decisions, you will still get good imaging with 8 bit/color on wide gamut monitors that have a properly calibrated internal LUT with high bit depth (14-16 bit/color). The primary issue is visible banding in smooth gradient areas, which is eliminated or reduced when using a 14-16 bit/color LUT in the monitor with no other external profiling.
That is helpful, and I think the latter conclusion is correct. Apple doesn't care about desktop users any more. The prime example is the new scroll bar, which is an ergonomic tragedy.
Googling around a bit I found that the reason factory cal is done at high brightness with LCDs is, as I expected, to preserve the wide gamut as advertised. This raises the concern that if you reduce the brightness from 140-200 cd/m2 to maybe 80, how much gamut is lost? If aRGB is 100% of NTSC at 140 cd and sRGB is 75% of NTSC, what is the gamut of a wide gamut LCD at 70 cd (half of the cal)? That is a bit too dim, but some folks use 80 with what they think is a good result. If the gamut decreases even somewhat less than linearly, you're not getting what you pay for. Hm. Do you ever?
Googling around a bit I found that the reason factory cal is done at high brightness with LCDs is as I expected, to preserve wide gamut as advertised.
I suspect this is more a "marketing" decision rather than a calibration issue. One of the biggest selling points for monitors and TVs has always been to make the screen appear "brighter" and more "colorful," even if it isn't accurate. This in fact is what "gamers" want in a display more so than low delta-E and display marketeers want to capture the broadest market possible.
This raises the concern that if you reduce the brightness from 140-200 cd/m2 to maybe 80, how much gamut is lost? If aRGB is 100% of NTSC at 140 cd and sRGB is 75% of NTSC, what is the gamut of a wide gamut LCD at 70 cd (half of the cal)? This is a bit too dim, but some folks use 80 with what they think is a good result.
TFT Central does all of their hardware calibration tests at 120 cd/m2 with very high color accuracy and gamut performance. Because of the marketing issue mentioned above, panel manufacturers use backlight technology designed for the highest possible luminance (>300 cd/m2). The downside is that this makes it difficult to adjust some displays below 120 cd/m2 without causing visible flicker and reduced image clarity. If interested, this article explains the issue in more detail:
This is also discussed in the Dell U2413 review at TFT Central along with the consequences of using a lower luminance level (i.e. 80-100 cd/m2).
TFT Central was able to lower the U2413 luminance to 114cd/m2 (20% Brightness setting) without any flicker issues. Screen flicker is highly "subjective" and some people seem to be bothered by it more than others, including myself. Fortunately you can use higher luminance with photo editing applications by adjusting the ambient room lighting and print lighting. Here's a good article on the subject:
Since the "brightness" control is actually a duty cycle control, it appears that gamut will not be affected, barring some peculiar human eye response thing.
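A quick numeric sanity check of that claim (the XYZ numbers below are hypothetical, not measured from any monitor): chromaticity is a ratio of tristimulus values, so scaling all of them uniformly with the duty cycle cancels out, and the gamut triangle's corners don't move:

```python
# PWM dimming scales luminance (Y) but every XYZ component equally,
# so the (x, y) chromaticity of each primary - and hence the gamut - is unchanged.

def chromaticity(X, Y, Z):
    """CIE 1931 (x, y) chromaticity coordinates from XYZ tristimulus values."""
    s = X + Y + Z
    return (X / s, Y / s)

red_primary = (41.2, 21.3, 1.9)  # hypothetical XYZ of a red primary at 100% duty cycle
duty = 0.5                        # 50% brightness: every component halves

full = chromaticity(*red_primary)
dimmed = chromaticity(*(c * duty for c in red_primary))

print(all(abs(a - b) < 1e-12 for a, b in zip(full, dimmed)))  # True
```

This only holds for duty-cycle (PWM) dimming; dimming methods that shift the backlight's spectrum could in principle move the primaries.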
It is unfortunate that the hardware cal of the 14-bit LUT can only use one particular i1 colorimeter, but from their results I really don't see any real advantage to doing it anyway. I'd rather have a spectrophotometer and be able to cal the printer as well. The software cal used an i1 device with LaCie software, which is about twice as expensive, so I don't see the economics of their solution applying to the average user.
It all comes down to marketing decisions, which in this case were between Dell and X-Rite. In return Dell probably got a better deal from X-Rite on the software development. NEC Spectraview is a similar situation also with X-Rite calibrator exclusivity.
It looks like the newer NEC Spectraview II software does allow using Datacolor calibrators including Spyder 3 & 4 models:
The software can be purchased separately for about $100 and then you can use any calibrator on the list above.