I know this is a frequent topic, but it just doesn't click for me.
Why can I choose a target gamma when calibrating my display? Doesn't this make softproofing impossible?
Dot gain etc. leads to unambiguous TRCs for output profiles. An IT-8 target plus reference value tables creates unambiguous TRCs for scanner profiles. Why doesn't my colorimeter + profiling software create a "correct" curve for my display?
The display is one device and the output another. In the CMS, the two can have dissimilar TRCs or gamma; that's not a problem. It's expected.
The display can have a native TRC, and as long as the software and colorimeter characterize it correctly, that's all we need to worry about (in ICC-aware applications! That's an important distinction, and a reason why a non-native TRC is attractive to some users.)
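To see why the calibration target doesn't matter in an ICC-aware path, here's a toy numeric sketch (plain Python, simple power-law display model, no real CMS or ICC machinery; `cms_encode` and `display_output` are made-up names for illustration):

```python
# Toy model: an ICC-aware pipeline knows the display's TRC and
# compensates for it, so the on-screen luminance is the same no
# matter which gamma the display was calibrated to.

def display_output(signal, gamma):
    """Luminance the monitor produces for a given signal (power-law model)."""
    return signal ** gamma

def cms_encode(target_luminance, gamma):
    """Signal an ICC-aware app sends to hit a target luminance on that display."""
    return target_luminance ** (1.0 / gamma)

target = 0.218  # some desired linear luminance (a typical mid-gray)

for g in (1.0, 1.8, 2.2):
    signal = cms_encode(target, g)
    print(g, round(display_output(signal, g), 3))  # same luminance every time
```

The encode step is just the inverse of the display's curve, so the two cancel; that cancellation is what the profile makes possible.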
But wouldn't that mean that there should be no visual difference between two settings? I usually use Gamma 2.2 but just calibrated my monitor with a target Gamma of 1.0 — it is much, much brighter now.
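The visual difference comes from everything that is *not* ICC-aware: there, pixel values go to the screen uncompensated, so the calibration target alone decides the luminance. A quick arithmetic sketch of a mid-gray pixel under the two calibrations:

```python
# Un-managed content: the same pixel value (0.5) is sent straight to
# the display, so the calibration gamma alone sets the luminance.
value = 0.5

lum_22 = value ** 2.2   # roughly 0.218, the familiar dim mid-gray
lum_10 = value ** 1.0   # exactly 0.5, more than twice as bright

print(lum_22, lum_10)
```

That factor-of-two-plus jump in midtone luminance is why the desktop looks much brighter after a gamma 1.0 calibration, even though ICC-aware applications would render identically under both.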