In most cases the benefits will outweigh the negatives. In landscape work (which I do), pushing the exposure to the right and then taking it down considerably also allows one to use the Fill Light control at a strong setting and get much more shadow detail with less noise. That alone is a good enough reason to use it.
Have you recorded compressed raw data?
Added another question: the difference image does not indicate which one is brighter. Do you mind posting both images on their own?
What were the noise characterizations of the two images? Plots on a curve won't tell you which image will end up looking "better" from the standpoint of usable image detail vs. noise.
>What were the noise characterizations of the two images? Plots on a curve won't tell you which image will end up looking "better" from the standpoint of usable image detail vs. noise.
Yes, there is 1 f/stop less dynamic range and 1.4 times more luminance noise with the reduced exposure and 1-stop push, as one would expect: shot noise follows Poisson statistics, so on average halving the exposure increases the relative noise by a factor of sqrt(2). However, in the deep shadows, where read noise comes into play, the noise doubles.
IMHO a gain of 1 stop in DR and half the noise in the deeper shadows is worth a change in the TRC. The latter could be corrected with a curve, but I was a bit surprised that the TRCs were different.
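The noise figures quoted here are easy to reproduce in a toy simulation. This is a hedged sketch with illustrative numbers (the 5 e- read noise and the signal levels are assumptions, not measurements from any particular camera): Poisson shot noise plus Gaussian read noise, comparing a full ETTR exposure with a 1-stop-under exposure pushed 1 stop in software.

```python
import numpy as np

rng = np.random.default_rng(0)
read_noise_e = 5.0       # assumed read noise, electrons RMS
n = 200_000              # samples per simulated patch

def capture(signal_e):
    """One uniform patch: Poisson shot noise plus Gaussian read noise (electrons)."""
    return rng.poisson(signal_e, n) + rng.normal(0.0, read_noise_e, n)

ratios = {}
for signal in (10_000, 4):                # midtone vs deep shadow
    full = capture(signal)                # ETTR exposure
    pushed = 2.0 * capture(signal / 2)    # 1 stop under, pushed 1 stop in software
    ratios[signal] = pushed.std() / full.std()
    print(f"signal {signal:>6} e-: pushed/full noise ratio = {ratios[signal]:.2f}")
```

In the shot-noise-limited midtone the ratio comes out near sqrt(2) ≈ 1.41; in the read-noise-dominated deep shadow it approaches 2, matching the post.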
Well, we all know that linear captures aren't REALLY, REALLY linear...they're just "sort of" linear.
>Well, we all know that linear captures aren't REALLY, REALLY linear...they're just "sort of" linear.
That depends on what the definition of REALLY LINEAR is. Here is a plot of the normally exposed image shown previously, converted with DCRaw as a linear raw file. It looks pretty much linear to me. The non-linear portion in the shadows is most likely due to flare light, not a defect in the sensor or software. Anyway, I'm not sure what point you were trying to make.
Interesting confirmation of some of my comments a year ago. I have always said that ETTR is not a replacement for proper exposure metering. I was attacked then, and probably will be again. But
Unless you can alter the laws of physics, digital capture will never be as linear as some have suggested. In addition, the dynamic range of the scene has a giant impact on the histogram. Most high end DSLRs have a larger dynamic range than paper or film. This has been proven with many color target measurements. Changing exposure to compensate for contrast limitations in the scene is like measuring with a micrometer and cutting with a chain saw.
Then, there is the issue of RGB gamut in a real color scene. I can show many examples of clipping in the camera or in ACR aRGB images that completely disappears simply by changing to a ppRGB color space with no exposure adjustments anywhere. In other words, the right hand side of the histogram can be dominated by color clipping that has nothing to do with exposure at all. This is even more pronounced if the camera display is sRGB.
The histogram is just one of many new tools enabled by digital technology. It should be used along with the camera preview to verify that you did capture the scene you expected. That assumes you know what you want from the scene, artistically. It should not be used as a crutch to avoid exposure metering. When in doubt, bracket.
Mathematical analysis and plots of gray scale wedges tell you very little about the proper exposure for a sunrise or a portrait. Fundamentals and practice will.
Cheers, Rags :-)
>Unless you can alter the laws of physics, digital capture will never be as linear as some have suggested.
Actually, I work with sensors similar to those in DSLRs for physics experiments, and I can assure you that they are very linear as long as you are above the noise floor and below saturation. As Bill shows above, the sensors in DSLRs are the same: almost perfectly linear. The reason is that they effectively count the number of photons hitting each photosite.
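For readers who want to verify this themselves, a minimal sketch of the usual linearity check: fit a straight line to raw values from a gray wedge and inspect the fit quality. The gain, pedestal, and noise figures below are synthetic stand-ins, not data from a real sensor.

```python
import numpy as np

rng = np.random.default_rng(0)
exposure = 2.0 ** np.arange(-8, 1, dtype=float)         # 9 wedge steps, 1 EV apart
dn = 4000.0 * exposure + 12.0 + rng.normal(0, 2.0, 9)   # raw DN: gain, pedestal, noise

slope, intercept = np.polyfit(exposure, dn, 1)          # least-squares line fit
pred = slope * exposure + intercept
r2 = 1.0 - np.sum((dn - pred) ** 2) / np.sum((dn - dn.mean()) ** 2)
print(f"slope = {slope:.1f} DN, pedestal = {intercept:.1f} DN, R^2 = {r2:.6f}")
```

With real wedge data, an R^2 indistinguishable from 1 above the noise floor and below clipping is what "very linear" means in practice.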
>Then, there is the issue of RGB gamut in a real color scene. I can show many examples of clipping in the camera or in ACR aRGB images that completely disappears simply by changing to a ppRGB color space with no exposure adjustments anywhere. In other words, the right hand side of the histogram can be dominated by color clipping that has nothing to do with exposure at all. This is even more pronounced if the camera display is sRGB.
This is normal, as cameras can capture a much wider gamut than you can describe in aRGB. Only ppRGB is wide enough. Unfortunately, the histogram displays on most DSLRs can only be seen in sRGB or aRGB, so you will misjudge whether there is actual overexposure. Even if you could see it in ppRGB, you would not be able to judge correct expose-to-the-right from the histogram, as the histogram is influenced by the white balance setting. What you would need is a histogram of the actual raw sensor data, preferably modified by a tone (gamma) curve, to judge whether your exposure to the right is good. On the current generation of DSLRs you cannot really do this.
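What such a raw histogram would involve can be sketched. A minimal illustration on a synthetic RGGB mosaic (a real file could be loaded with a library such as rawpy instead; the white level and the mosaic contents here are assumptions for the demonstration): split the CFA into its four planes and check each for clipping before any white balance or color rendering touches the data.

```python
import numpy as np

def cfa_channels(mosaic):
    """Split an RGGB Bayer mosaic into its four CFA planes."""
    return {"R":  mosaic[0::2, 0::2], "G1": mosaic[0::2, 1::2],
            "G2": mosaic[1::2, 0::2], "B":  mosaic[1::2, 1::2]}

def clipped_fraction(plane, white_level=4095):
    """Fraction of sensels at or above the assumed white level."""
    return float(np.mean(plane >= white_level))

# synthetic mosaic: green pushed hot, as in a typical daylight shot
rng = np.random.default_rng(1)
mosaic = rng.integers(0, 4096, (400, 600))
mosaic[0::2, 1::2] = np.minimum(mosaic[0::2, 1::2] * 2, 4095)  # drive G1 into clipping

channels = cfa_channels(mosaic)
for name, plane in channels.items():
    print(name, f"max={plane.max()}", f"clipped={clipped_fraction(plane):.1%}")
```

Per-channel raw statistics like these reveal exactly the green-first clipping the post describes, which no sRGB/aRGB preview histogram can show reliably.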
Thank you for that insight, Jao.
The ETTR Myth
ETTR is short for expose to the right. Some folks have promoted it as a replacement for traditional exposure metering. The premise is that you can validate camera metering simply by reading the histogram in the camera's preview window.
Unfortunately, it is based on some basic misunderstandings about digital photographic technology. The first misunderstanding is the premise that each bit level in a digitally encoded image represents an exposure stop. The second misunderstanding is the premise that all digital cameras capture light in a perfectly linear fashion. The third misunderstanding is the premise that the histogram represents the raw image data captured by the camera. I will briefly address each of these.
Any correlation between exposure stops and digital bit levels can only be accidental at best. The total exposure range in a scene or an image is correctly known as the dynamic range. The dynamic range of digital cameras is wider than most folks assume and usually equal to or better than film or paper. It can be defined in terms of tone density, decibels, or exposure stops. It is a function of the optics and sensor electronics in the camera. In the few cases where an accurate range is provided by the vendors, it varies from 8 to 12 f/stops.
The image data is converted from analog measurements by the analog/digital (A/D) circuits early in the capture. This can wind up as an 8-bit, 12-bit, 14-bit, or even 16-bit digital value depending on the camera and its user settings. It is simply a number that has been digitized. Any correlation between bits and exposure levels is pure speculation, end of subject.
Second, the digital capture of light is not strictly linear. It is true that the silicon sensor itself will capture light in a very linear fashion. But this ignores reciprocity at the toe and heel of the extremes, the quantum efficiency of the substrate, and most importantly it ignores the optical filters in front of the sensor. If the color filter array were linear it would be impossible to reconstruct colors. And these are not the only optical filters in your camera. Then, the A/D circuits have gain controls based on the current ISO setting. And some A/D circuits perform some pre-processing based on the illuminant color temperature (white balance) and limited noise reduction based on the ISO setting. The point is that there are many steps in the pipeline that can introduce non-linearity.
Finally, the image in the preview window has been color rendered and resampled down to a small size. This is the data shown in the histogram. The camera can capture colors from across the spectrum, but the rendered image is limited to the gamut of an RGB color space. So, in addition to exposure clipping, the histogram will include gamut clipping. This is also true for the blinking highlight and shadow tools. This might imply an exposure problem where none exists. There is no practical way to map all the data in a raw image into a histogram that you could use effectively in the preview window.
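The gamut-clipping point can be illustrated numerically. A minimal sketch using the standard XYZ-to-linear-sRGB matrix (the specific XYZ color below is my own illustrative choice): a saturated cyan with midtone luminance (Y = 0.5) lands outside sRGB, so the histogram would show clipping even though nothing is overexposed.

```python
import numpy as np

# standard IEC 61966-2-1 XYZ -> linear sRGB matrix
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

xyz = np.array([0.20, 0.50, 0.60])   # saturated cyan, Y = 0.5 (a midtone)
rgb = XYZ_TO_SRGB @ xyz
print("linear sRGB:", np.round(rgb, 3))

out_of_gamut = bool((rgb < 0).any() or (rgb > 1).any())
print("out of sRGB gamut:", out_of_gamut)
```

The red component comes out negative: a color problem, not an exposure problem, exactly as the paragraph above argues.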
If you capture an image of a gray scale chart that fits within the dynamic range of the camera, at the right exposure, you can create a linear graph of the raw data. But if you underexpose or overexpose this same image, the graph will not be linear and it is unlikely that software will be able to restore true linearity. End of subject.
If you typically shoot JPG format, the histogram will accurately represent the image data, but clipping can still come from either gamut or exposure limits. If you typically shoot RAW format, the camera's histogram is only an approximation of what the final rendered image might look like. There is a significant amount of latitude provided by the RAW image editor, which is probably why you are shooting RAW in the first place.
So, in closing, I am not saying that histograms are bad. They are part of a wonderful toolkit of digital image processing tools. I am saying ETTR is not a replacement for exposure metering. If you understand what the tone and color range of the scene is, you can evaluate the histogram much better. And if you master traditional photographic metering, you will capture it more accurately more often.
I hope this clears up my previous statements on this subject. And I hope it explains why I think ETTR and linear capture are based more on technical theology than on technical fact.
Cheers, Rags :-)
>And I hope it explains why I think ETTR and linear capture are based more on technical theology than on technical fact.
If the scene you are shooting falls within the dynamic range of your sensor (or is even lower than the sensor's), you are a fool not to ETTR... if you are shooting a scene that is at or above the dynamic range of your sensor, you, the user, must decide how to expose for the scene based on the importance of the elements in your shot. If shadows are more important than highlights, expose for the shadows and the heck with the highlights. And vice versa...
What most people fail to grasp is that, unlike film, a linear digital capture has a _LOT_ more highlight headroom than can be seen on the LCD or in the camera's histogram. Camera Raw in particular can take advantage of a LOT of highlight detail even beyond the first clipped channel, and between highlight recovery and artful use of the curves, that highlight detail can be teased out of a linear capture as long as all three channels don't get clipped.
The biggest issue is whether the scene is within or beyond the dynamic range of the sensor...but, again, if it's within the range, it's foolish not to ETTR, because you can always darken down the shadows of a light (but unclipped) image more easily than you can lighten the shadows of an underexposure.
>most importantly it ignores the optical filters in front of the sensor. If the color filter array were linear it would be impossible to reconstruct colors
1. The color filter array is neither linear nor non-linear as such. I suppose the subject is the color filter over the individual sensel, the transmission of which is always linear. (I guess it is possible to bundle light to such density, for example with a laser, that the filter becomes non-linear, but that is out of scope here.)
The transmission curves of the filters overlap, and that is what makes it possible to reconstruct the original colors. The spectral response of the filters is usually a secret of the manufacturers.
Here is a set of spectral responses, not from the filters over the sensels but from B+W, which publishes the characteristics of its filters:
Back to digital cameras: even rays of a single wavelength will be transmitted by at least two, and mostly by all three, filters, to different degrees.
I made a test with a red laser light (no idea of the wavelength), with the Canon 20D. The yield of the "green" pixels was about 25% of that of the "red" pixels, and the blue was perhaps 5%.
If the transmission of the filters were not linear, it would be impossible to reconstruct the original colors.
>some A/D circuits perform some pre-processing based on the illuminant color temperature (white balance)
I have never heard of such a camera, but that is certainly not an argument for or against anything. I wonder if there is any actual example of such a camera.
Note that this behaviour would make proper adjustment of the white balance in raw processing impossible.
>If you capture an image of a gray scale chart that fits within the dynamic range of the camera, at the right exposure, you can create a linear graph of the raw data. But if you underexpose or overexpose this same image, the graph will not be linear and it is unlikely that software will be able to restore true linearity
I wonder if anyone would expect linear raw data beyond the point of clipping.
>If you typically shoot RAW format, the cameras histogram is only an approximation of what the final rendered image might look like
I record raw files exclusively. I was sick and tired of incorrect exposures based on the in-camera histogram. I shot bracketed, sometimes 6 shots, and still did not capture everything in my effort to capture as much light as possible (aka ETTR).
Now, my new camera, the Canon 40D, displays a color histogram as well. This is, of course, based on the generated JPEG image, which is based on the setting parameters: contrast, saturation, sharpening, color tone, and most importantly, white balance.
So I shot a few hundred images (isn't digital photography great? It costs nothing besides your time) testing the characteristics of my camera. I figured out the meaning of "contrast" (no, it is not trivial) and found a WB setting, temperature plus tone adjustment. All these together make the camera software create an ugly image, the histogram of which comes very close to the raw histogram. They are still not identical, but the only thing that interests me is how close the highest channel (usually the green) is to clipping.
It works fabulously, except that the thumbnails and previews are quite useless, and of course a white shot is necessary every time I start shooting.
>Camera Raw in particular can take advantage of a LOT of highlight detail even beyond the first clipped channel and between highlight recovery and artful use of the curves, that highlight detail can be teased of of a linear capture as long as all three channels don't get clipped
Jeff, can you demonstrate this on actual examples? The demonstration should include the raw data as well.
>If the transmission of the filters were not linear, it would be impossible to reconstruct the original colors.
Well, some time ago Thomas Knoll posted that the filters are not perfectly linear. If they were, he stated, it would be possible to reconstruct the colors perfectly, even if the wavelengths of the filters were different from those used in the CIE color matching model.
>Jeff, can you demonstrate this on actual examples? The demonstration should include the raw data as well.
I'm not Jeff, but I happen to have some data I can post showing 0.8 EV of highlight recovery in some shots I took with the Nikon D200 using uncompressed NEFs in order to use the Fors script to calibrate with ACR. Exposure was with daylight.
On top is a properly exposed Color Checker, the middle is overexposed by about 0.8EV and the bottom is after highlight recovery in ACR. I adjusted the white and black squares in the recovered image to match those in the properly exposed shot.
You can check the colors with the eyedropper, but it is easier to use Imatest Colorcheck. Here are the results. They are a pretty good match. Please refer to the Imatest web site if you need help interpreting the graphs.
Here is a histogram of the raw file converted by DCRaw. The green channel is nearly completely blown, but the other two channels are intact, permitting good recovery.
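The mechanism being exploited here can be shown in toy form. This is only a sketch of the idea behind channel-based highlight recovery, not ACR's actual algorithm: where green is clipped but red is not, estimate the true green from the green/red ratio measured in nearby unclipped pixels (synthetic data; the constant-hue assumption is mine).

```python
import numpy as np

WHITE = 4095

def recover_green(green, red):
    """Replace clipped green values with red scaled by the unclipped G/R ratio."""
    clipped = green >= WHITE
    ratio = np.median(green[~clipped] / red[~clipped])   # local color assumption
    est = green.astype(float)
    est[clipped] = red[clipped] * ratio
    return est

# synthetic highlight gradient: the true green is 1.5x red, then clips at WHITE
red = np.linspace(1000.0, 3500.0, 50)
true_green = 1.5 * red
green = np.minimum(true_green, WHITE)

recovered = recover_green(green, red)
err = float(np.max(np.abs(recovered - true_green) / true_green))
print(f"max relative error after recovery: {err:.3%}")
```

On this idealized gradient the recovery is essentially exact; on real scenes it degrades as the local-hue assumption weakens, which is why recovery beyond the first clipped channel still takes "artful use of the curves".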
In the context of this thread, we are talking about the linearity of the transmission of any given wavelength. This means that if the intensity of the captured light changes by x percent, then the intensity of the transmitted light also changes by x percent, no matter what the wavelength.
The spectral response of a filter is usually not characterized as linear or non-linear. I don't think there is any filter which could be characterized as "linear" in that respect, except over a small segment of the spectrum. Look at the B+W chart I posted above; the infrared and the black-and-white filters are the closest to linearity, and they are still far from truly linear.
Do you mind posting the raw images as well?
The second link (ISO100_007) is dead.
This is an excellent case: without the recovery action, i.e. with a plain linear reduction, the white square would appear light magenta.
However, I would prefer this as an option, not automatic, for it has a downside as well.
I posted in a new thread how incorrectly ACR treats the clipping/non-linearity points of the Canon 40D. The same problem exists with the Nikon D200 and D2X as well, to a lesser degree (I have analyzed only ISO 100 images from these cameras): the white level is set at 3880. The green becomes non-linear from about 3910, so that difference is negligible. However, the red and blue appear to be linear up to 4095.
One of the consequences is that when the actual pixel values fall between the white level and the factual clipping/non-linearity point, they will be interpreted as clipped. When correcting with a negative exposure adjustment, the color is off, due to the automatic "recovery".
This is difficult to demonstrate with your camera, for the difference between 3880 and 4095 is only a tiny fraction of a stop, but I can show it easily with my camera, as the difference between 13600 and 16383 is large enough to "shoot into". Those areas will not only be marked as "clipped", but their colors will be slightly off.
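The headroom figures quoted here can be checked with one line of arithmetic per camera; a quick sketch expressing both gaps in stops (the helper function name is mine):

```python
import math

def headroom_stops(clip_dn, white_dn):
    """EV gap between the factual clipping point and the assumed white level."""
    return math.log2(clip_dn / white_dn)

d200 = headroom_stops(4095, 3880)     # Nikon D200 case
canon = headroom_stops(16383, 13600)  # Canon 40D case
print(f"D200: {d200:.3f} stop, 40D: {canon:.3f} stop")
```

The D200 gap is indeed a tiny fraction of a stop (about 0.08 EV), while the 40D gap of roughly 0.27 EV is wide enough to deliberately expose into.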
Just thought I would remind you all that in-line images are a contravention of the forum rules. Folk using News Readers can't easily see them and the rest of us find the forum layout smashed. So, please use links rather than in-line images. Next time I might just remove them, so be warned!
Sorry, Ian. I did not know about that rule. Inline images are common on other forums such as DPReview and the Luminous Landscape, but your points are well taken and the rules are the rules.
I didn't know what you were doing with the raw image. You might want to look at this NEF of a Stouffer step wedge which has clipping in all three channels and other combinations.
I took a look at the file in Iris and noted that the green clips at 4075, the red at 3989, and the blue at 4022. In ACR with a linear TRC, step 8 clips in the green at 32755. With a DCRaw conversion it clips at 32579. Please perform your analysis and give some feedback.
I uploaded a layered TIF with about two dozen screenshots documenting the behaviour:
I always selected one color only, up to a specified pixel value. The displayed color's intensity is fixed in this mode, i.e. a red dot means that the red pixel falls within (or outside, but here always within) the given value range.
1. The red clips exactly at 4095. Whether it is linear up to that point is another issue.
2. The blue clips at 4055. I come back to that.
3. The green starts behaving strangely from 3981. I don't know if it is linear up to that point; neither the Stouffer nor the Macbeth is a suitable scene to determine that.
Anyway, the green is certainly far from linear from 3981. Look at the pixel counts at the left edge: there is no green with 3983 (the counter at 3982 is the same as at 3983). Then blocks of every fourth column take a certain value. Half of the greens are covered at 3989. Then almost nothing happens up to 4021, and then blocks of the other half of the greens start coming in. (Note that the increase of the green intensity is due to the addition of more pixels, not to a change of pixel intensity. Look at the image at 200% to see the individual pixel columns.)
The green goes up to 4025; only 321 pixels have values of 4026 or 4027.
Back to the blue: I still have the Macbeth images with different exposure biases. I took the one with +2 EV. There the blue goes up to 4095, but the change from 4094 to 4095 is too abrupt. However, from 4094 downwards it seems to be linear (I captured 4094, 4084 and 4074, but it goes down in single increments).
The situation is the very same with red.
So, based on the Macbeth shot, I would say that blue and red are OK up to 4094. However, I find it really strange that the blue behaves so differently in the Stouffer shot and in the Macbeth shot.
Btw, several Canon cameras also have different clipping points in the two "kinds" of green (the one beside the red and the one beside the blue). Perhaps this is due to slightly different filters over the rows.
>I would say based on the Macbeth shot, that blue and red are ok up to 4094
I slept on this, and now I think that what I analyzed is not enough to make this claim. A shot with a gradual transition up into "absolute clipping" would be necessary to judge that.
>Btw, several Canon cameras also have different clipping points in the two "kinds" of green (the one beside the red and the one beside the blue). Perhaps this is due to slightly different filters over the rows.
This explanation is likely. I used the Iris split_cfa command to separate the raw channels in a test shot of a Stouffer wedge, and used the ImageJ histogram to analyze the green channels in areas of clipping.
In a selected clipped area, green1 clipped at 4022 in 4079 pixels, at 4080 in 4080 pixels and at 4024 in 1 pixel. In green2, clipping was at 3985 in 688 pixels, at 3986 in 1570 pixels, at 3987 in 1724 pixels, at 3988 in 874 pixels, and at 3989 in 20 pixels.
Up to clipping, the signal was linear in both green1 and green2, sampled in 0.3 EV intervals.
>I used the Iris split_cfa command
It might be that there is slight difference in the spectrum of the filters on the two green channels and the camera is correcting for it before it writes the RAW. The difference is very minor though.
>Up to clipping, the signal was linear in both green1 and green2, sampled in 0.3 EV intervals.
Of course they are! It is very hard to make these sensors non-linear. There is nothing special going on in the filters or in the silicon; over the range where the sensor operates it is very much linear. You have to go to much higher light intensities before the quantum efficiency drops, and at low light intensities you have noise, which makes the response appear non-linear, but you are away from both of those regimes.

Filters hardly ever behave non-linearly (i.e. their transmittance spectrum changing with light intensity) at these extremely moderate intensities. In fact it would be really hard and special to make them do that. At high laser intensities you can certainly get bleaching effects or other non-linear processes, but not at the very low intensity a filter on a Bayer array sees.

The differences shown in the original post are almost certainly due to the way ACR treats the images, not to some weird physics in the camera. One could check this if we had a way of writing synthetic RAW files: compare two files whose only difference is that all values in one are multiplied by two (one stop of exposure). Physically this is what you do when you expose two images one stop apart in the camera, but it would be good to eliminate any variable of how the camera preprocesses the RAW data.
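The synthetic-RAW experiment proposed here can be sketched in a few lines. This is only the data-generation half (a real test would wrap the arrays into an actual raw container, e.g. with a tool such as Adobe's DNG SDK, which is an assumption beyond this sketch), but it shows how to guarantee the exact one-stop relationship:

```python
import numpy as np

WHITE = 4095
rng = np.random.default_rng(42)

# "one stop under" mosaic, kept far enough below WHITE that doubling cannot clip
base = rng.integers(200, 1800, (400, 600)).astype(np.uint16)
pushed = (base.astype(np.uint32) * 2).astype(np.uint16)  # exactly one stop brighter

exact = bool(np.all(pushed.astype(np.int64) == 2 * base.astype(np.int64)))
print("exact 2x relationship:", exact, "| pushed max:", int(pushed.max()), "<=", WHITE)
```

Feeding both mosaics to the same converter would isolate the converter's adaptive behavior, since by construction no camera preprocessing can differ between them.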
>Of course they are! It is very hard to make these sensors non-linear. There is nothing special going on in the filters nor in the silicon. Over the range where the sensor operates it is very much linear.
Yes, that appears to be true, statements by some to the contrary notwithstanding.
>Up to clipping, the signal was linear in both green1 and green2, sampled in 0.3 EV intervals
Sampling at 0.3 EV intervals does not say much about the point of saturation/non-linearity at the right end, because that is a huge numerical range: measured from 4095 backwards, the last interval starts at about 3250.
>Over the range where the sensor operates it is very much linear
Obviously there is a mix-up in the usage of the term "linearity". The relevant factor in the given context is the transmissivity of the filters at any given wavelength, which is, of course, linear. Some statements above unknowingly refer to the spectral response of the filters; there is not much sense in characterizing that aspect as "linear" or "non-linear", because there is no such thing as a filter with a linear spectral response, not even clear glass.
I think there is a bit of misunderstanding and overreacting to what I said. My comments are meant to be helpful and constructive, not critical.
At least some of the TRC tests seem to confirm at least some of my observations.
Some time ago, I did some similar testing with my own cameras and DCRAW. I had to modify it to meet some of my unique objectives. I discovered that the raw data was being manipulated in some of the camera model specific modules. One of these was my Kodak SLR/n. The manipulation was related to a Kodak feature, Extended Range Imaging or ERI. Today, there are other manufacturers touting similar functionality.
Recently, I modified my calibration scripts to support some new targets. One of these was the ISO IT8 or Kodak Q60 target. You might find these interesting because they have 12 tone steps for each primary color. Anyway, I encountered some difficulty in the highlight colors. It turned out that the measurements (spectrophotometer) were the source of the problem. The cameras were recording the highlight colors more accurately than the measurement tools. Last week I verified this with a rental unit and my own measurements. The bottom line is that these tools are designed for and calibrated to diffuse reflecting paper white. Your camera also records direct transmitted light. These target values are available on the Kodak and Wolf Faust sites if someone insists on proving me wrong.
Recently we had a well-known photographer as the speaker for our local camera club. He started his show by noting that the images we were about to see were from a recent trip to Central America. He consciously left his flash at home and used ETTR exclusively for metering. IMHO, many images were underexposed, with lots of distracting motion blur. Many of the colors seemed oversaturated and garish, at least to me. To top it all off, neither the slide show software nor the projector was color managed, and all the images were in ProPhoto RGB. The term abstract art was used frequently throughout the session.
I'm neither an artist nor an expert, but I do firmly believe that I have a solid foundation of technology knowledge. I would be delighted to discuss rational ways to use histograms. But you've got a long way to go to convince me that they should replace basic metering.
I believe we can all learn from each other. If I didn't, I wouldn't have read or entered this thread. I look forward to hearing more about your experiments and results. I expect the results will be explainable rationally. I don't expect anomalies to be attributable to hardware or software defects.
Cheers, Rags :-)
> Many of the colors seemed oversaturated and garish, at least to me. To top it all off, neither the slide show software nor the projector was color managed, and all the images were in ProPhoto RGB. The term abstract art was used frequently throughout the session.
hahaha. That sounds absolutely horrible! Talk about not having your ducks in a row. Of course, this says nothing about the validity of ETTR, but more about the photographer having the wrong priorities and dabbling in an esoteric detail while not getting the whole picture (pun intended) right. Amazing that there is still so much non-color-managed software around. I use Keynote on my Mac for every presentation I give and it gives outstanding results, whatever the color space of my images. Shamefully, PowerPoint is not color managed even on Macs, so I don't use it for anything. Obviously one cannot calibrate every projector, but usually the default calibration is good enough. These projectors are often slightly too blue, but in a dark room nobody will notice.
>But you've got a long way to go to convince me that they should replace basic metering.
In my testing, it always turns out that traditional metering gives better overall results than ETTR. My strategy usually is basic metering, which I correct slightly if I see a lot of flickering highlights in the camera preview. I have found the histogram method unreliable, maybe because the color space conversion in the camera messes things up quite a bit, and I end up underexposing images and increasing the noise because of ETTR.
Due to the Bayer array's uneven RGB filter weighting, I have no clue what constitutes correct or overexposure. I can only rely on the RAW converter's histogram as my guide, and on testing how any highlight recovery moves will perform.
I calibrated my incident light meter to my camera sensor by recording a Gretag Macbeth ColorChecker, rendering it in ACR, then reading the 1/3-stop bracketed exposures to locate an exposure as close to 255 as possible, plus +1/3 and +2/3 over 255 (over for intentional highlight recovery). To make my life easier I override the handheld incident meter's internal calibration adjustment control so that the sensor is given the correct photon dosage.
It is easy for me to remember that 100 ISO = 1/3200 sec at F2.8 for 255 in ACR (consistently 1/10th over). I can quickly work out other apertures and shutter speed combinations on the fly from here. Far more reliable than matrix metering, and slightly less fiddling in the RAW converter during processing.
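The on-the-fly arithmetic described here is just the inverse-square relation between f-number and exposure time. A minimal sketch, starting from the anchor exposure quoted above (the function name is mine):

```python
def equivalent_shutter(t_base, f_base, f_new):
    """Same total exposure: shutter time scales with the square of the f-number ratio."""
    return t_base * (f_new / f_base) ** 2

t = 1 / 3200                      # anchor: ISO 100, 1/3200 s at f/2.8
for f_new in (4.0, 5.6, 8.0):
    t_new = equivalent_shutter(t, 2.8, f_new)
    print(f"f/{f_new}: 1/{round(1 / t_new)} s")
```

Exact f-number ratios give exact shutter values (f/2.8 to f/5.6 is exactly two stops, so 1/3200 becomes 1/800); the usual marked stops are nominal roundings of these.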
However, if the subject's contrast range is lower than normal, the brightest highlight could be a 3/4 tone and the incident meter will fall short of ETTR; in that case a reflective spot reading from the brightest highlight with a +2 1/3 exposure adjustment works best for 255 placement, or an additional +1/3 or +2/3 over 255 for my highlight recovery routine.
I obtain a better result with the Canon 1D series cameras if I overexpose 255 by +1/3 or +2/3, recover the highlights, then reduce the midtones, as this gives me a cleaner image. Maybe I should class this as my normal exposure, and not overexposed, as the two weaker Bayer-induced channels (red and blue) are invariably underexposed relative to the green channel.
Although I use Canon cameras, I have been impressed with some of the Nikon D3 sample images, in particular Joe McNally's high-contrast fireman test image, shot at ISO 200, as it holds up to very strong positive exposure moves in ACR and barely breaks up, better than my 1D MkIII and 1Ds MkII cameras. Time will tell if Nikon has now provided better highlight headroom.
Maybe camera sensor technology such as the D3's will slightly alter exposure methods, negating the requirement to intentionally overexpose beyond 255 and then rely on clever highlight-recovery mathematics for cleaner images.
>maybe I should class this as my normal exposure, and not overexposed
I think this is a pithy statement...
In general the technologists tend to get into the theory and mathematics and then kinda forget the fact that a step wedge ain't a real photo...
First and foremost people need to know HOW their cameras and sensors are biased at various ISOs and then learn to expose for those biases...a factor in those biases is whether the scene is within or beyond the range of the sensor. Then, the person behind the lens must make a judgement regarding the aesthetics of the image and whether highlights, shadows or the range of tonalities is the most important aspect...and all of this must be done, in effect, in the time leading up to a fraction of a second exposure...
Personally, my first and foremost concern is getting SOMETHING on the capture. Then, if I've got the time, I'll bracket so I've got a range of options. If I'm in the studio lighting, I can exert absolute control over the range in the scene, but if it's the good old sun (or variants therefrom), I just aim to get something close, because most of the time I'm really not even sure if that particular shot is worth any special effort...
As a professional sports snapper I don't record GM ColorCheckers for a living (or for fun), but I prefer some form of exposure accuracy. I don't have the luxury of controlling artificial lighting, as the studio photographer does, and weather conditions in the UK are testing for photographers at the best of times - especially for picture sequences where the exposures must be identical between frames. That's why we spent two weeks every year photographing some of the world's best golfers for golf instruction articles, in your country, in the sunshine state.
Once I have calibrated the light meter to my sensor, a one-off task, to my visual preferences, I'm done until I change sensors in two or three years' time, so I am a 5-minute, once-in-three-years technologist. I concede that my ACR calibration scripting routine takes 35 minutes, but I can put my feet up and have a coffee during this technical procedure.
I seem to recall you stated that you exposed large format film to within 1/10 of a stop.
On reading Jeff Schewe's excellent update of Real World Camera Raw, I learned that the behavior that I reported, a different TRC (tonal response curve) with ETTR is expected with ACR 4.x, which now adapts to the image content (see p. 31). Previous versions of ACR did not have this behavior.
ETTR gives the highest dynamic range, and in my example I lost 1 stop of DR with "normal exposure". This may explain the different TRCs.