One of the advantages of using 16 bit mode is that it will help you eliminate banding in the sky. Generally speaking, I use 16 bit mode until I'm ready to create a "final" copy for other uses. Then I will use one of several methods to create that 8 bit mode copy.
It all depends on the final purpose of the image (web vs. print, for instance) and how much image manipulation you think you're going to need.
I almost never do anything for the web—other than posting an image to illustrate a point here in the forums—so my images remain in 16 bit.
How is image quality affected by an 8-bit image with 256 levels of brightness vs. a 12-bit image with 4,096 levels? Is there a reason not to want the extra levels of brightness? Does this affect how colors are represented, or just their brightness levels?
Bit rate is not the proper term in this context; what I think you wanted to ask about is bit depth. Bit rate is the number of bits per second at which the image is read out. A slow readout (low bit rate) lowers read noise and so enhances image quality; in scientific imaging systems the bit rate can be adjusted, but with most regular cameras it is fixed. As for bit depth, one needs a sufficient number of levels that the difference between adjacent levels is not apparent to the eye; otherwise the steps become visible as posterization. The number of required levels varies with the gamma of the image, and levels are lost when going from a linear raw image to a gamma 2.2 image. See Bruce Lindbloom's levels calculator. When rendered to a 2.2 gamma space, a 12-bit raw file goes from 4096 levels to only 249 levels.
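That 249 figure can be reproduced with a short script. This is a sketch under simplifying assumptions (a pure power-law 2.2 curve and round-to-nearest 8-bit encoding; Lindbloom's calculator may differ in detail):

```python
# Sketch: count the distinct 8-bit codes a 12-bit linear signal occupies
# after a simple power-law gamma-2.2 encode. (Assumes a pure power curve
# and round-to-nearest; a real pipeline may differ slightly.)
GAMMA = 2.2
BITS_IN, BITS_OUT = 12, 8
max_in = 2 ** BITS_IN - 1     # 4095
max_out = 2 ** BITS_OUT - 1   # 255

codes = {round((i / max_in) ** (1 / GAMMA) * max_out) for i in range(max_in + 1)}
print(len(codes))  # 249 -- a handful of deep-shadow codes are never reached
```

The lost codes all sit in the deep shadows, where gamma encoding is steepest, so the very first linear steps leapfrog several output codes.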
If you don't edit images, 8 bits is usually sufficient. With a wide-gamut space such as ProPhotoRGB, the distance between adjacent color levels is greater, and posterization of colors may result if the image is edited to any extent. This posterization is often most evident in smooth areas such as sky. With an sRGB image (average gamma of 2.2), 8 bits per channel is usually sufficient if you don't do extensive editing. With ProPhotoRGB, or when extensive editing is contemplated, 16 bpc is a good choice.
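The way editing eats levels at 8 bits can be illustrated with a toy round trip (a hypothetical brighten-then-darken edit, not any particular editor's actual pipeline):

```python
# Toy illustration (not any real editor's pipeline): brighten with a power
# curve, quantize to 8 bits, then apply the exact inverse curve and quantize
# again. The rounding in between permanently merges some levels, leaving
# gaps (a "comb" histogram) that can show up as banding in smooth gradients.
def quantize8(v):
    """Round a 0..1 value to the nearest 8-bit level."""
    return round(v * 255) / 255

levels = [i / 255 for i in range(256)]                              # all 8-bit levels
edited = [quantize8(quantize8(v ** 0.8) ** (1 / 0.8)) for v in levels]
print(len(set(edited)))  # fewer than the original 256 distinct levels survive
```

At 16 bits per channel the same round trip loses essentially nothing, because the quantization steps are 256 times finer than the damage each edit can do.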
If you are interested in the subject: in a recent comparison of RGB color spaces for raw photography, I measured the losses that occur in different tonal ranges (shadows, highlights, etc.). You can find the details in the graphs in this article, which sooner or later I will have to translate into English.
Marco N. wrote:
...this article that sooner or later I will have to translate into English.
I followed your link, right-clicked on the page, and chose "Translate with Bing". It did a darned good job of it!