I would assume that option would be grayed out for anyone still using older monitors. I like the idea of them taking the next step, though. I wonder how that would affect video? The bandwidth requirement would be much greater, which would probably require a faster video card. Interesting idea, though.
It seems you need to be in the FirePro line of cards; the Radeons do not support it - and apparently not even all the FirePros support it.
I haven't (yet) done a lot of research on this, mainly because finding out whether a particular card supports 10-bit color or not is like pulling teeth. They just don't say. It seems to be regarded as an "oh, by the way" feature.
Most of these cards are expensive ($3000 expensive), but I did come across one at $600 that specifically lists 10-bit support in the specs: http://www.bhphotovideo.com/c/product/1014975-REG/amd_100_505561_firepro_v7750_graphics_card.html
The FirePro V4900 seems to be the entry level offering with specific "full 30-bit display pipeline" support. Trouble is I can't find anyone selling these things, so I can't find prices: http://www.amd.com/en-us/products/graphics/workstation/firepro-3d/4900#
OK, I've found a couple of Norwegian outlets for FirePro V4900 cards. The price isn't bad; about the same as a 256GB SSD.
Now I got interested, and it just made my short list of things to consider (I had assumed these cards were a bit too expensive to be worthwhile). However, not one of these outlets specifically mentions 10-bit capability in this card. All I have to go on is the AMD website, although they should know... but I need to make sure there aren't different editions of the card.
I would assume that option would be grayed out for anyone still using older monitors.
The way it was described to me was that the system setup has to be using HDMI or DisplayPort in order to enable this.
It seems you need to be in the FirePro line of cards
That's been the thinking up to now. But I caught wind that AMD may be going to open up 10-bit support for the high-end Radeon cards, which ARE actually sold as having "10 bit" capability in the hardware but have had it disabled up to now. Note the "Radeon" moniker in the driver screen grab shown.
If in fact this turns out to be the case, and the next driver release will enable the capability, you can be sure I'll be testing it and reporting back on how it works.
There is strong competition between the big-name makers, so if such latent capability exists in the less expensive nVidia cards as well, maybe nVidia will take notice of AMD's move.
Edit: By the way, someone could test this now with the current driver beta, but I'm in a situation where I can't afford to destabilize my system right now, so I'm waiting for the release.
Ah, interesting. I'll keep an eye on this then (although I suspect I'll need a new card anyway, the 5000 series that I have is by no stretch high end).
I can confirm this is not true; that setting is the preferred bit depth option for the monitor, not an instruction to your card to run at that bit depth. I have PS CC 2014 and an Eizo CG277 via DisplayPort running two AMD 7970s (for gaming) with that beta driver, and I cannot get 10-bit, tested with a 16-bit ramp in PS. On a previous system I ran an Eizo CG223W on a FirePro V4800 and 10-bit was achievable; I upgraded to a Wacom 24HD and lost the 10-bit support, so I scrapped the V4800 and got the 7970s.
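In case anyone wants to reproduce the ramp test with a file rather than Photoshop's gradient tool, here's a rough sketch of how I'd generate a 16-bit test ramp (this assumes Python with numpy and tifffile installed; the filename and dimensions are just placeholders):

# Build a 16-bit grayscale ramp to check for banding.
import numpy as np
import tifffile

WIDTH, HEIGHT = 4096, 512   # wide enough that the steps stay easy to see

# Linear ramp across the full 16-bit range, repeated down the rows.
row = np.linspace(0, 65535, WIDTH).astype(np.uint16)
ramp = np.tile(row, (HEIGHT, 1))

tifffile.imwrite("gray_ramp_16bit.tif", ramp)

Open the TIFF in Photoshop at 16 bits/channel and 100% zoom: with a working 30-bit pipeline the ramp should look smooth, while an 8-bit pipeline shows roughly 256 distinct bands.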
I am going to attempt shortly to run a Radeon 7970 and a FirePro V4800 together on the same screen. I will have DP outputs from both cards into the one screen and will toggle between the two, hoping to get the 10-bit functionality of the V4800 and the gaming capability of the 7970. I have been advised that it is theoretically possible, so it will be interesting to see if it works.
Well THAT's not good news.
This document seems to imply that if it can be selected in the Catalyst Control Center it will work...
Figure 5 demonstrates how the user can enable the 10-bit display support in the OpenGL driver by simply checking the relevant checkbox in the Catalyst™ Control Center (CCC).
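Out of curiosity, here's a rough sketch of how an application can request a 10-bits-per-channel surface and then ask the driver what it actually got (this assumes Python with the glfw and PyOpenGL bindings, which is just my choice of tooling; whether the request is honored depends on the card, the driver, and that same CCC setting):

# Request a 10-bpc framebuffer and report what the driver granted.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

glfw.init()
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10-bit test", None, None)
glfw.make_context_current(window)

bits = [int(glGetIntegerv(b)) for b in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS)]
print("R/G/B bits:", bits)   # [10, 10, 10] if the driver granted 30-bit color

glfw.terminate()

If the checkbox is off (or the card refuses), the query typically comes back as 8/8/8 even though the window hints asked for 10.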
By the way, the 7000-series ATI cards were sold with (usually somewhat understated) blurbs claiming 30 bit color capability. Many of the blurbs have mysteriously disappeared from web sites over time. I know the hardware is capable as people have reported being able to enable smooth gradients with modified FirePro drivers. I wonder if it was a last minute marketing decision at ATI to remove support.
Please keep us posted on what you find. When AMD actually releases the "late 2014" drivers I will try them myself and report back here.
When you do, try doing a screen grab - That's the major problem I have with my FirePro. PrtScr and the Snipping Tool make the colors all psychedelic, like a bad LUT.
Have you ever tried IrfanView for making screen grabs? That's what I use. It's entirely possible it will screw up similarly. I also use a remote access tool called RAdmin to collaborate, so I am imagining situations where the driver would enable 30-bit color in Photoshop but I just won't be able to use it for practical reasons.
Tonight I installed the newly released Catalyst 14.9.
It went in smoothly and it came up defaulted to the 10 bits per channel setting:
The bad news is that smooth gradients as displayed by Photoshop still show 256 steps in them. Ah well, maybe ATI's color depth setting means something different than I thought.
I'll keep poking around; maybe there's still hope.
The Passmark PerformanceTest benchmark doesn't show any significant speed difference from Catalyst 14.4 drivers, by the way. If I take on any new problems I'll probably just uninstall 14.9 and drop back. 14.4 was working flawlessly.
Have you tried doing a screen capture? That's the big issue I have.
Do you still have CS6 installed? Try looking at the gradients there.
Screen grabs look perfect. No psychedelic colors.
Gray gradients still show 256 steps in Photoshop CS6. In all versions of Photoshop I've tested with this setup I've had 30 Bit Color selected.
I'm clearly missing a capability or maybe a step in reconfiguration. The ATI driver shows 10 bpc but doesn't work any differently whatsoever than the prior version.
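For reference, here's the quick arithmetic behind what "256 steps" should look like on screen (plain Python; the 2560-pixel display width is just an assumption for illustration):

# How wide should each band of a full-range gray ramp be?
ramp_width_px = 2560          # assume the ramp spans a 2560-pixel-wide display

for bits in (8, 10):
    levels = 2 ** bits        # 256 levels at 8-bit, 1024 at 10-bit
    print(f"{bits}-bit: {levels} levels, ~{ramp_width_px / levels:.1f} px per band")

# 8-bit:  256 levels, ~10.0 px per band  (visible steps)
# 10-bit: 1024 levels, ~2.5 px per band  (effectively smooth)

So if the 30-bit path were really active, the bands in a full-width ramp should be about a quarter as wide as what I'm still seeing.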
The 14.9 driver works fine with Photoshop by the way. I worked with Photoshop CC 2014 for half the day and had zero problems.
Part of the problem seems to be that CC doesn't support 10-bit with ATI cards... Using my V3800, I get good 10-bit ramps in CS5.1, but not in CC 2014 (even with the 10-bit option enabled in CC). I wish Adobe would fix this...
I get good ramps in CS6, but not in CC, original or 2014. The psychedelic colors are only with screen grabs; any copy and paste from an app, like Windows Paint, works fine.
Could be worse, I suppose - I could have a Retina display . . .
[Edited later, when I'm actually at my computer]
I just checked, and I do NOT get a smooth ramp in CS6. It was smooth in CS4, which I lost a while after my last rebuild and haven't felt I needed since. I do need to deactivate it, though.
Actually, I'm really thinking of turning off ten-bit. I don't really see any difference in the work I do, and I currently don't have a good enough printer to merit it, especially since AMD's latest driver makes it even more noticeable.
But I paid for a ten-bit card and a ten-bit monitor. It's the principle of the thing.