What would you require that functionality for? 16 bpc and 32 bpc already far exceed those specs, and even though they are dithered down to be displayed on your LDR devices, that should be more than enough to work with. Additionally, 10/12/14 bpc support would only make sense if you worked with a file format that also stores that info natively, and it would require a whole lot of other magic in the color profile engine, the print engine, filter handling and so on. So by all means, this is much more complex than just shuffling around a few bits and bytes in the display routines, which in turn means it's not coming any time soon. Definitely not for CS5, and most likely not for CS6, either. You have to understand that any software vendor has to weigh the development effort against how many users would actually benefit from such a feature, and currently that is probably only a fraction of a percent, so it is not a must-have feature. This may change as wider-gamut devices become more affordable and thus more widely used, and as OSs start supporting them, but for the time being it is something most people have no need for.
Hi Mylenium, thanks for your comment!
What would you require that functionality for?
Like I said, I'm building a new system with Windows 7 (which supports 30-bit and 48-bit color depths). The monitor I plan to buy also supports 10 bpc via DisplayPort. Now I have to choose between a regular graphics card and a more expensive ATI FirePro. If PS and/or the current file formats don't support 10 bpc OUTPUT yet (and won't in the near future), I do not need to buy the expensive graphics processor and will probably buy another monitor.
There was a plugin from Matrox that enabled 10 bpc in PS some years ago. Like I said, there is already a 16 bit per channel mode in PS; I wondered about Photoshop's ability to display a higher bit depth. For years PS has been able to create HDR images, although no one can actually see the tonal values.
That leaves the question: Why would one need to buy a 10bpc monitor? Why do workstation graphics support 10bpc? Are they only useful in CAD, animations, video and rendering programs?
It seems we were talking at crossed purposes (thanks, Zeno Bokor), so is your comment still valid?
I'd also like to see 10-bit display output in CS5. I have the impression that 10-bit displays will be affordable soon. Some years ago a wide gamut display was extremely expensive; now you can get very affordable wide gamut displays. The same will happen with 10-bit displays. There is the HP DreamColor and also two new Eizos for around 2000.-. Not cheap, but not totally out of reach. It would not surprise me if you could get a 10-bit display for 1000.- in one or two years from now. At that time many people will be using PS CS5, so it should support 10-bit display output.
I once had a small program that simulated 10-bit output via temporal dithering (or so). I tested many pictures. Some of the pics showed (very little) banding in certain areas, both in PS and in the other program when 10-bit dithering was disabled. But you never knew whether the banding was in the file or because 8-bit output was the limiting factor. When I enabled the 10-bit output, you could see that in some of the files the banding went away; in other files it did not. So it was clear what the reason for the banding was: sometimes the files, sometimes the 8-bit display output.
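The trick behind such a program can be sketched in a few lines. This is only my guess at how temporal dithering of this kind works (a minimal sketch, not that tool's actual code): each 10-bit level is approximated by alternating between the two nearest 8-bit levels, so the time-averaged intensity matches the 10-bit value.

```python
# Hypothetical sketch of temporal dithering: approximate a 10-bit level
# on an 8-bit display by alternating between the two nearest 8-bit
# levels so the time-averaged intensity matches the 10-bit value.
def dither_frames(level10, n_frames=64):
    lo = level10 // 4          # nearest 8-bit level at or below
    frac = level10 % 4         # sub-8-bit remainder, 0..3
    frames, err = [], 0
    for _ in range(n_frames):
        err += frac
        if err >= 4:           # carry: show the level above this frame
            err -= 4
            frames.append(lo + 1)
        else:
            frames.append(lo)
    return frames

frames = dither_frames(514)            # sits between 8-bit levels 128 and 129
print(sum(frames) / len(frames) * 4)   # time average recovers 514.0
```

The eye integrates the alternating frames, which is why the simulated 10-bit mode could make banding disappear for files where the 8-bit output path was the limiting factor.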
So if you want to judge your files on a very critical level 10bit output would really help.
I said, "A file format called Kodak CINEON is 10 bit."
I reviewed it... the 3 channels together compose 10 bits in CINEON.
Message was edited by: Gustavo Del Vechio
Photoshop can't support greater than 8 bit/channel output yet.
We tried, and the APIs still had problems. We're working with the vendors to resolve those problems. (and trying to find good displays to test with that *really* output 10 bits or more per channel).
Chris, please, can you describe the current state of Adobe's approach to implementing 30-bit color depth viewing in Photoshop CS5? I am writing a little article about the whole 30-bit viewing situation and it would be good to have the right information. Thanks in advance.
In addition to Mikrotom's request, I would like to know what would be needed to display 10 bit IF CS5 (or later) were able to do it.
What OS, what graphics card (only Quadro, or also a "consumer" card), what monitor? Would this Eizo do it:
According to specs it can display 10bit, although only by dithering.
This is really easy nowadays. Windows 7 has native support for 30-bit color depth, nVidia produces a complete line of Quadro FX cards with 30-bit color depth (ATI does too, with FirePro cards), the drivers are working well, DisplayPort can transfer 30-bit content to displays, and HP/EIZO/others have made a lot of LCDs with native 30-bit support. The one and only missing piece of the chain is a SW application capable of utilizing the new API functions and producing true 30-bit per channel output...
Since the ability to detect color differences varies greatly between humans, anything above 24 bit is a waste of time. Unless a woman is a tetrachromat.
It would not make much sense to have 10/12/14-bit functionality for monitor purposes if the other output devices didn't have it. Banding, for instance: how much aggravation would it cause to see banding in a print that isn't present on screen? I would want to be forewarned.
It would be interesting if klsteven could print, on a 16-bit printer, examples of images that show banding in 8 bit but not in 10 bit. I believe Canon has 16-bit printers.
Banding is my #1 concern these days. Aggressive moves with Shadow/Highlight followed by aggressive moves in Black and White produce serious banding (as well as other noise components, but the banding is a deal breaker). I could, of course, test it myself and see whether a 16-bit file which shows banding on screen also shows it in print from the 16-bit printer. It probably will.
This is a common mistake. You definitely need much more than 256 color steps in each channel. Creative photography in B&W usually produces a lot of banding in the smooth transitions between blacks, greys and whites...
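To put a rough number on it (illustrative figures: a full-screen smooth gradient on a 1920-pixel-wide display, with no dithering assumed):

```python
# Illustrative arithmetic: a smooth black-to-white gradient across a
# 1920-pixel-wide screen. Band width = screen width / number of levels.
width_px = 1920
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits} bpc: {levels} levels, each band ~{width_px / levels:.2f} px wide")
```

At 8 bpc each tonal step occupies about 7.5 pixels, which is wide enough to read as a visible band; at 10 bpc the steps shrink below 2 pixels and blend together.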
Well, I wouldn't call it a mistake!
I've explored it sufficiently that I have tentatively ruled out equipment problems, so it comes down to limits of processes.
I first encountered it years ago with a scanned image of a mountain reflected in a mirrored surface of still water. The banding in the sky's reflection blew me away.
It's at least one of my legacy images that cannot be adequately handled in digital form. Curiously, returning to that site at the appropriate time with a digital camera did not show banding until I made the conversions. Then it was the sky that banded!
I have some workarounds but they do not always work.
If the reviews can be trusted for the Eizo. I did not look in the Eizo forums for the real story.
I did check out the new Dell U2410, just for the heck of it, and the reviewers love it and say it has no problems. However, in the Dell forums there are all sorts of dithering problems and Photoshop problems reported in a 7-page thread about the ICM profile Dell has shipped. A fix is in the works. So everything a person reads on the net is not exactly fuzzy kittens.
Test, test test!
Ha, you love it don't cha.
It can't be tested anyway, because there is no application capable of displaying 30-bit content... It is not just Adobe that is a little behind the hardware. Even ACDSee and all the other viewers do not display content using real 30-bit output :-(
List of current LCDs with true 30-bit display abilities (input, inner calculations and panel):
Hewlett Packard - LP2480zx DreamColor (24″, S-IPS, 1920 × 1200, 100% Adobe RGB, 12-bit LUT, DP, LED)
Eizo - SX2262W (22″, PVA, 1920 × 1200, 95% Adobe RGB, 12-bit LUT, 16-bit, DP, CCFL)
Eizo - SX2462W (24″, H-IPS, 1920 × 1200, 98% Adobe RGB, 12-bit LUT, 16-bit, DP, CCFL)
Eizo - CG243W (24″, H-IPS, 1920 × 1200, 98% Adobe RGB, 12-bit LUT, 16-bit, DP, CCFL)
Eizo - CG232W (22,5″, S-IPS, 1920 × 1200, 97% Adobe RGB, 12-bit LUT, 16-bit, HD-SDI, CCFL)
Dell - UltraSharp U2711 (27″, IPS, 2560 × 1440, 98% Adobe RGB, 12-bit LUT, 16-bit, DP, CCFL)
What would you require that functionality for? 16pc and 32bpc already far exceed those specs and even though they are dithered down to be displayed on your LDR devices, that should be more than enough to work with.
Opinions vary on how many levels the human visual system can distinguish. 8 bpc may be sufficient to prevent banding, but barely so. However, 10 bpc on the system and software side would allow an extra two bits for calibration. Some high end monitors do have 12 bit LUTs for this purpose, but does it make sense to have two LUTs?
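The "extra two bits for calibration" point can be illustrated with a toy LUT (gamma 0.9 here is an arbitrary example correction, not a real calibration curve):

```python
# Toy example: applying a gamma tweak as an 8-bit -> N-bit lookup table.
# With 8-bit output the correction merges some adjacent input levels;
# with 10-bit output precision all 256 input levels stay distinct.
def calibrated_levels(in_bits, out_bits, gamma=0.9):
    in_max = 2 ** in_bits - 1
    out_max = 2 ** out_bits - 1
    lut = [round((v / in_max) ** gamma * out_max) for v in range(in_max + 1)]
    return len(set(lut))           # distinct output levels that survive

print(calibrated_levels(8, 8))     # fewer than 256: levels lost to the correction
print(calibrated_levels(8, 10))    # 256: nothing lost
```

That loss of levels in the 8-bit case is exactly why monitors put the calibration LUT at 10-12 bits internally even when the input is 8 bits.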
For me, a two-LUT monitor would be a waste of money for what I do.
Love testing is an oxymoron. Ask any test engineer!
As for untestable, it's a matter of perspective. And ingenuity.
So, what is the situation in CS5? Does anybody know yet? Still no 10-bit per channel color output?
Nope, still no support
Actually, the 10 bit/channel display path is working quite well in CS5 - on cards and displays that support it.
Again, we've been working with the manufacturers for a while to get it working...
Chris, that's great news. I guess you should make that public so that users know this is a CS5 feature. Do you know whether the new Fermi cards will output 10 bit, or will Quadro cards be required? And what about ATI?
I don't have all the details on hand. We should be documenting the requirements for 10 bit display when we get closer to shipping.
I still couldn't find anything about 10-bit output in the system requirements for PS CS5. Where will I be able to find details?
Well great. I just bought a NEC P221 because I didn't see any need to spend fancy money on hardware that couldn't live up to its potential. While the NEC has a 10 bit internal LUT, it has only DVI connections and so is doomed to receive 8 bit input forever.
Still, it's the nicest monitor I've had that wasn't a CRT.
Now, to recap. Win7 has high bit capability and CS5 apparently has it under the hood somewhere. So, THIS generation of hardware and software finally gets us there?
We haven't documented it yet - we're still working on it. (yes, there are more complications than you could imagine)
That PDF is very interesting. Did you try it with this card and CS5?
Do you also have a link to something similar from nvidia?
I'm wondering why this feature is only available on the pro cards. I guess it's not expensive.
Some day I guess it will be standard on all cards.
I asked Adobe support about 10 bit/colour channel support and the Nvidia Quadro FX1800, and they referred me to the After Effects OpenGL supported card document.
It does not mention 10 bit/colour channel at all. Why can't we get a straight answer, like "no", "yes", or "we are still working on it", rather than pretend answers?
Hello, I am new to the forum in terms of posting. I usually only read the responses, but 10-bit support piqued my interest to the point that I felt like posting. I apologize if the question is stupid, but I have had this lingering question for a long time.
My understanding is that to get 10-bit color a lot of things need to come together:
The underlying OS has to support it
The video card needs to support it, and you need to use DisplayPort
The monitor has to have a 10-bit LUT (or, in the case of the new NEC PA24W, a 3-D LUT) and support wide gamut aRGB
The Video Driver has to support it so that software can be written against it
Finally, Adobe has to write PS software, maybe even specific to a card/driver, to display the information correctly
Then, so that all this work is not lost, I assume you need a 16-bit printer driver as well to complete the workflow (which I believe Epson has, at least on the Mac, and I know Vista/Win7 does support)
So, given the above, and given that many tutorials even prior to CS5 say "work in aRGB or ProPhoto", how does one do that if all we have is 8-bit sRGB monitors? Can we possibly see the difference?
I am just trying to understand why tutorials and books tell us to use wide gamut color spaces when we all have sRGB monitors, PS up to this point only supports sRGB output, and we only just now have the possibility of getting 10-bit wide gamut support.
I am obviously missing something or don't understand what is happening under the hood.
Hopefully my question is not stupid and someone more knowledgeable can explain.
Read what I already said.
Yes, "we're still working on it".
emaini, here's how I understand it:
An 8-bit monitor can display a certain number of steps between colours. Until a few years ago most monitors were only able to display the small sRGB colour space. Depending on how good your monitor is, you may or may not see colour banding with those. But when you have a wide gamut 8-bit monitor, the steps between colours stay the same but get spread out over the wider gamut. The spaces get larger (the transitions between colours get rougher) and banding/tone separation is more likely to appear. With a higher bit display you increase the number of steps and get rid of banding.
In other words: the "saturation" or intensity of the colours stays the same, because it's bound to your monitor's gamut, but with 10 bit you get more steps (= more colours) in between.
Quick comment regarding ProPhoto: no monitor is able to display such a huge colour space. You use it only to preserve colours, to make sure you don't "run out of space" (= lose colours) while working on your pictures. That's also why you use 16 bit in Photoshop. You can't see it directly, but it gives you more steps between tones and thus helps avoid posterisation.
Hope this helped!
Chris, thanks for keeping us up to date, much appreciated!
I can confirm that PS CS5 still does not display pictures in 10-bit per channel format. It's a shame. Once again SW is far behind HW...
Which graphics card, OS, monitor and monitor connection (DisplayPort?) did you use? And how could you tell which bit depth is displayed?
on this page you can download a .psd file to test 10-bit display:
I'm confused… according to this site it should even work in CS4?!
Tested it on an nVidia Quadro FX1800 + EIZO SX2462W connected via DisplayPort + Windows 7 Professional 64-bit + current drivers and patches.
I simply drew a grayscale gradient and can clearly see steps (banding) between the colors. When I switch the file's mode to 8 bits per channel, the picture remains the same.
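That gradient test can be mimicked in software (a rough analogue, assuming plain rounding quantization with no dithering added by the driver):

```python
# Quantize a smooth 1920-pixel gradient to 8 and 10 bits and measure
# how many distinct levels survive and how wide the visible bands are.
width = 1920
ramp = [x / (width - 1) for x in range(width)]   # smooth 0..1 gradient

results = {}
for bits in (8, 10):
    q = [round(v * (2 ** bits - 1)) for v in ramp]
    longest = run = 1
    for a, b in zip(q, q[1:]):                   # longest run of equal pixels
        run = run + 1 if a == b else 1
        longest = max(longest, run)
    results[bits] = (len(set(q)), longest)
    print(f"{bits}-bit: {len(set(q))} levels, widest band {longest} px")
```

With 8-bit output the gradient collapses to 256 levels in bands about 8 pixels wide, which is what shows up as visible steps; with a true 10-bit path the bands drop to about 2 pixels and disappear.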