I think the GTX 1000 series still limits 10-bit output to DirectX, so it may not work for 10-bit in Photoshop or other Adobe software that uses OpenGL/OpenCL. Maybe someone who has the hardware to confirm/test will also comment.
That question should probably be asked in the Photoshop forum. How would one test to see if it really works?
The Photoshop forum probably would have more folks who could answer. How to test: with the hardware (GPU + true 10-bit monitor), one way would be a basic grayscale gradient. 8-bit would show banding, 10-bit would show none. A quick way to generate such a test ramp is sketched below.
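Not an official test, just a minimal sketch of that idea in Python, assuming NumPy and Pillow are installed (the image size and file name are arbitrary placeholders I picked for illustration):

```python
# Minimal sketch: generate a 16-bit grayscale ramp for banding tests.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 3840, 400  # arbitrary; a wider ramp makes banding easier to judge

# Full black-to-white ramp across the width, stored as 16-bit so
# Photoshop can feed more than 8 bits per channel to the display.
ramp = np.linspace(0, 65535, WIDTH).astype(np.uint16)
gradient = np.tile(ramp, (HEIGHT, 1))

# "I;16" is Pillow's 16-bit unsigned grayscale mode; PNG preserves the depth.
Image.fromarray(gradient, mode="I;16").save("gradient_16bit.png")
```

The reasoning: at 8 bits a 3840-pixel-wide ramp only has 256 distinct gray levels, so each band is roughly 15 px wide and easy to spot; at 10 bits there are 1024 levels (bands of roughly 4 px), which should look smooth at normal viewing distance. Open the file as a 16-bit document in Photoshop with 30-bit display enabled, view it full screen, and compare.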
I will try there, because I have two choices: buy the new GTX 1060 for 200 euros, or the old Quadro K620 for 100 euros (second-hand).
I won't use this workstation for gaming; I built it just to use Photoshop and Capture One.
At the moment I can't spend more.
I was wondering because the "slow" MacBook 12" (2016 edition) that I own supports 10-bit mode in Photoshop with its Intel 515.
Of course the screen isn't a true 10-bit monitor, but the difference compared to a MacBook Pro 15" (with an NVIDIA GT 650M) is really big and unexpected!
I'm not a Mac person, but I heard Apple made the move to 10-bit. So maybe they have a custom driver for the Intel HD 515 in the 2016 MacBook that enables 10-bit. The older MacBook Pro with the NVIDIA 650M was probably made before Apple moved to 10-bit.
The Quadro K620 is slow, so even if it's required for 10-bit, you might not be happy with it. It would just depend on how much performance you need for Photoshop and Capture One...
Curious as well... all the new 10xx cards supposedly support 10-bit. I have a prosumer monitor that supports dithered 10-bit (the ASUS PB279Q, an IPS panel)...