I want to hash this out because I think I am right. I preface this with the fact that I use an NEC PA27 for CC & grade along with an Io 4K (overkill for me).
For video: color is somewhat a phony science in this way. The vast majority of monitors people watch video on ARE NOT CALIBRATED. So your video graded on a $20k wide-gamut monitor will still have a soft yellow cast on Dell monitors and Sony HDTVs. It will appear lighter and slightly cyan on an LG, or have a magenta hue on the HP DreamColor.
If I CC & grade on an LG 4K or 1080 TV, then it looks good on ALL LG TVs, provided the user leaves the TV on its defaults. But no one leaves their TV on default, and someone watching on an LG in Game Mode will see something different than someone watching in Cinema mode.
Maybe all movie theaters are calibrated to show exactly the same color values, but I really doubt that too. I do not do many movies. This doesn't mean the colorist or the freelance Swiss-army-knife type can't benefit from having one decently calibrated monitor with the correct output; they can, but it is only a benefit to the post team. Once it's out for mass consumption, there are tens of thousands of tweaks people make to their TVs or monitors until the picture quality looks good to them.
I think that if you are doing a sitcom, commercial, ENG, music video, or CGI, you can be just fine with a well-reviewed wide-gamut monitor between $750 and $2,500 USD. An I/O box can be had from $295 to $2,000, and you have a nice system that will give you PRO-quality grading that will still be affected by those 10,000+ user settings on the average viewer's TV or monitor.
So, I put this argument on the table and I feel like it's sound. Thoughts?