
Apparent bit depth problem and bit depth change bug

New Here, Oct 03, 2017

I am using a newly installed version of PS CC on a new Windows laptop. While editing images, I noticed that in several of them, clouds and/or sky areas show banding. The banding is present in unedited RAW files viewed in PS (it is not present in LR). I adjusted settings in the Preferences dialog to see if any of them would take care of the issue, but none of those changes made any difference. So I converted the bit depth from 8 bit to 32 bit (and later, after changing the import option to 16 bit, I did the same with the 16-bit file) and the banding went away. When I tried to revert back to 16 bit or 8 bit, instead of changing the bit depth, PS opened the HDR dialog. This is a repeatable problem.

Based on the inaccurate rendering of the images (the introduced banding), combined with the HDR dialog popping up when I try to reduce the bit depth from 32 bit to something lower, I think there is a bug in PS.

Community Expert, Oct 03, 2017

Photoshop’s 32 bpc mode is intended for HDR, so even if no explicit HDR effects are wanted, you still have to “tone map” the data from 32 bpc down to a lower bit depth. This is by design, and it is not the same as switching between 8 and 16 bpc. You would likely need to use a setting such as the one in the following screenshot:

tonemap.png
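
For intuition, here is a rough Python/numpy sketch of why that conversion needs a tone-mapping step at all. The operator below is a simple global Reinhard-style curve chosen purely for illustration; it is not the algorithm Photoshop's HDR Toning dialog actually uses.

import numpy as np

# Hypothetical 32 bpc (floating point) pixel data: HDR values can exceed 1.0,
# so a plain rescale to 8 or 16 bits would either clip highlights or crush the range.
hdr = np.random.rand(4, 4, 3).astype(np.float32) * 8.0

# Illustrative global tone map (Reinhard-style): compress the unbounded
# HDR range into [0, 1) before quantizing.
ldr = hdr / (1.0 + hdr)

# Only after tone mapping can the data be quantized to a lower bit depth.
img_8bit = np.clip(np.round(ldr * 255), 0, 255).astype(np.uint8)
img_16bit = np.clip(np.round(ldr * 65535), 0, 65535).astype(np.uint16)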

Community Expert, Oct 03, 2017

There's no bug. In an 8-bit file there are 256 discrete levels per channel, from black to white.

A standard display system also has 8-bit color depth, with its own 256 discrete levels - except that laptop displays are often really 6-bit panels with temporal dithering to simulate the last two bits.

In a shallow gradient, both of the above will result in visible banding - and the combined effect is that much more pronounced.

In a 16-bit file, any banding you see is in your display system.

Lightroom uses output dithering, so you won't see any banding there. I don't know why Lightroom does and Photoshop doesn't. You normally don't see banding in photographs, because there's always just enough noise to break it up. Synthetic gradients show it much more.
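
A small Python/numpy sketch of both points - how few levels a shallow gradient gets in 8 bit, and what output dithering does to the band edges. The gradient range and noise amplitude are made up for illustration; this is not Lightroom's actual pipeline.

import numpy as np

# A shallow gradient covering only 10% of the tonal range across 1024 pixels.
grad = np.linspace(0.45, 0.55, 1024)

# Straight 8-bit quantization: the ramp collapses onto roughly 26 of the
# 256 available levels, so each band is about 40 pixels wide and clearly visible.
quantized = np.round(grad * 255).astype(np.uint8)
print(len(np.unique(quantized)))    # ~26 distinct levels

# Output dithering: add sub-level noise before rounding so the hard band
# edges are broken up, even though the number of levels barely changes.
dither = (np.random.rand(grad.size) - 0.5) / 255
dithered = np.round((grad + dither) * 255).astype(np.uint8)
print(len(np.unique(dithered)))     # similar level count, but no hard band edges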

Mentor, Oct 05, 2017

https://forums.adobe.com/people/D+Fosse wrote:

In a 16-bit file, any banding you see is in your display system.

Unfortunately, that is not so in Photoshop. Photoshop's 16 bpc mode (which is actually 15 bpc) will use an 8 bpc image pyramid when the user zooms the view out past a certain threshold (around 67% and lower). The banding can be quite apparent in this case, and it is solely the result of the ancient legacy 16 bpc code written by Chris Cox. I think he did that at the time to prevent performance issues with the slow hardware that was in use decades ago.

How bad the (visual) banding will be depends a bit on the particular image - but it is definitely a Photoshop-specific issue. It is very confusing for anyone not aware of it, and it can lead to unnecessary dithering actions on the part of the Photoshop operator.

Photoshop is the only 16 bpc capable image editor in the world afflicted with such a limitation. I wish the developers would re-code Photoshop's 16 bpc mode and bring it up to modern standards. That includes implementing true 16 bpc instead of 15 bpc, which is problematic in specific cases: with full-range 16 bpc images it will cut off the range - without warning the user.

Lightroom, however, has a properly implemented 16 bpc mode. And no silly 8-bit view pyramids there 🙂
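
To make the range point concrete, here is a small Python/numpy sketch of what mapping full-range 16-bit values onto a 0..32768 scale (the "15 bpc" representation described above) costs in precision. The exact scaling Photoshop uses internally may differ; this only illustrates the principle.

import numpy as np

# Every possible full-range 16-bit value per channel.
full16 = np.arange(65536, dtype=np.int64)

# Map onto a 0..32768 scale ("15 bits + 1") and back again.
ps15 = np.round(full16 * 32768 / 65535).astype(np.int64)
back16 = np.round(ps15 * 65535 / 32768).astype(np.int64)

print(len(np.unique(ps15)))                  # 32769 representable values, not 65536
print(int(np.abs(back16 - full16).max()))    # round-trip error of up to 1 level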

Community Expert, Oct 05, 2017

Well, yes, that's technically true, and you can see it with a 10-bit capable monitor/video card.

But the primary reason you see banding on screen is still the 8 bit video signal path, further degraded by an 8 bit panel that might not even be true 8 bit, but actually 6 bit plus dithering (all TN panels and some cheaper VA and IPS panels).

The 8-bit preview is freshly calculated from the 16-bit data every time, so it won't add to the display banding you already see. It's not cumulative, as it will be in a true 8-bit file.

So, again, you will always have banding on screen, unless you use a 10-bit capable display system, or dithering is applied on output as in Lightroom. The question is whether the banding becomes so irregular that you notice it.
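
A rough Python/numpy sketch of the panel point: the same 8-bit signal shown on a 6-bit panel has only a quarter of the levels, so each band is four times as wide. The temporal dithering that real 6-bit panels use to hide this is not modelled here.

import numpy as np

# An 8-bit ramp as it would leave the video card: 256 levels across a screen width.
signal_8bit = np.round(np.linspace(0.0, 1.0, 1920) * 255).astype(np.uint8)

# A 6-bit panel can only reproduce 64 of those levels, so neighbouring 8-bit
# codes collapse onto the same displayed value and the bands get 4x wider.
panel_6bit = (signal_8bit >> 2) << 2
print(len(np.unique(signal_8bit)))   # 256
print(len(np.unique(panel_6bit)))    # 64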

Community Expert, Oct 06, 2017

Actually there is one situation where the 8 bit previews in Photoshop have a more significant impact: adjustment layers. If you have several, with severe adjustments, you'll get a cumulative effect as one is stacked on top of another.

This banding disappears when the layer is committed, but a many-layered working PSD may show this cumulative banding on screen.

Apart from that, I have come to think that maybe the "dithering" in Lightroom is based on a misunderstanding. I picked this up from a couple of lengthy threads in the Lightroom forum, where some people mistakenly believed Lightroom has 10-bit support (it doesn't).

What are you really looking at in Lightroom? Photographs, that's what. Photographs always have enough noise to conceal any banding; you don't need any additional dithering, and you'll never see banding in Lightroom. Synthetic gradients in Photoshop are another kettle of fish altogether.
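
For what it's worth, a Python/numpy sketch of the stacked-preview effect described above, using two deliberately harsh made-up adjustments with an 8-bit rounding after each one as a stand-in for the on-screen preview. The flattened result, computed at full precision and converted to 8 bits once, keeps far more levels. This only illustrates the rounding arithmetic, not Photoshop's actual compositing pipeline.

import numpy as np

# A shallow, high-precision gradient standing in for 16-bit image data.
grad = np.linspace(0.47, 0.53, 1024)

def stretch(x):
    # A made-up strong contrast boost (3x around mid-grey), standing in
    # for one severe adjustment layer.
    return np.clip((x - 0.5) * 3.0 + 0.5, 0.0, 1.0)

def to8(x):
    # Simulate the 8-bit rounding of an on-screen preview.
    return np.round(x * 255) / 255

# Preview of two stacked layers: each layer's 8-bit preview feeds the next,
# so the rounding accumulates.
preview = to8(stretch(to8(stretch(to8(grad)))))
print(len(np.unique(np.round(preview * 255))))    # ~16 values: bands ~9 levels apart

# Committed/flattened result: both adjustments applied to the high-precision
# data, converted to 8 bits only once at the end.
committed = to8(stretch(stretch(grad)))
print(len(np.unique(np.round(committed * 255))))  # ~138 values: smooth on screen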

New Here, Oct 05, 2017

Thanks to both of you for the responses. I learned about the HDR dialog from another person after I posted my question, and your responses regarding it are the same as what that person told me.

OK - I did not realize that the laptop display is only 6-bit. That is kind of surprising, since it is supposed to be a very good display for photography. It would certainly explain why I have not seen the banding on my desktop monitor (which is a high-quality monitor) but am seeing it on the laptop. I really appreciate this info - now I can stop trying to fix the problem.

I too find it interesting that LR and PS use different methods.  It would seem that they would be consistent, but apparently not.
