Your understanding is mostly correct; a few notes:
- We only upsample to 4:4:4 when we need to process the data in that form. If you have 4:2:2 data at the start, are exporting to a format with similar requirements, and have nothing in between, we will generally keep the data as 4:2:2.
Thanks for clearing this up. I have a few follow-up questions:
1) Would you clarify the conversion process a bit? Do the YUV-to-RGB and RGB-to-YUV conversions happen only during export transcoding? What happens during playback from the timeline or from rendered sequence segments? On-the-fly conversion to RGB?
2) If a non-YUV effect or transition is used anywhere in a sequence, is the entire sequence converted to RGB during transcoding?
3) What is lost during the conversion from YUV to RGB and back to YUV?
4) I believe all video effects work in the RGB color space. Correct? Or at least their controls are RGB-oriented, in that none refer to Cb or Cr. So, what value does the YCbCr Parade scope provide? Would you explain a color correction process that uses the YCbCr scope?
1) Generally the same color processing occurs during playback as during export. Where possible we process and preserve YUV data all the way to display, where we convert on the fly.
2) Only the track items containing non-YUV effects will be converted to RGB, and only for those effects.
3) Information can be lost in two ways. First, some YUV values will map to negative RGB values or RGB values greater than 255, and we clamp to those limits. Second, a given YUV pixel value may map to a fractional RGB pixel value, at which point we have to round to the nearest integer. 32-bit floating-point RGB does not have either of these problems.
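To make the two loss mechanisms concrete, here is a minimal sketch using textbook BT.601 full-range math (not Premiere's exact pipeline; the function names are illustrative):

```python
# Illustrative sketch of the two 8-bit loss mechanisms: clamping and rounding.
# Uses textbook BT.601 coefficients, chroma centered at 0.

def yuv_to_rgb(y, cb, cr):
    # Inverse BT.601 matrix
    r = y + 1.402 * cr
    g = y - 0.344 * cb - 0.714 * cr
    b = y + 1.772 * cb
    return r, g, b

def rgb_to_yuv(r, g, b):
    # Forward BT.601 matrix
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, 0.564 * (b - y), 0.713 * (r - y)

def clamp8(x):
    # 8-bit storage forces both losses: out-of-range values are clamped,
    # and fractional values are rounded
    return max(0, min(255, round(x)))

# A legal YUV triple whose red lands well above 255 in RGB
rgb = yuv_to_rgb(235, -112, 112)
clamped = tuple(clamp8(c) for c in rgb)
print(rgb)      # red channel is ~392, far outside 0..255
print(clamped)  # (255, 194, 37): the excess is clamped away

# Converting back no longer recovers the original luma of 235
print(rgb_to_yuv(*clamped)[0])
```

In 32-bit float RGB the intermediate values above would simply stay out of range (and fractional) until final output, so the round trip would be essentially lossless.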
4) Many effects work directly in YUV. If you are looking for an example of chroma showing up in an effect's UI, the Video Limiter would be one. How we present controls for an effect is not always directly correlated with how it is rendered. Using the scopes to monitor YUV values has several purposes. Most output formats are still YUV, so you will be monitoring what your output will be. Also, depending on your target, ensuring broadcast-legal values may be important. Further, if you are trying to adjust something like the brightness of a scene, the luma channel may be the most important value to measure.
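As an illustration of the broadcast-legal point, a trivial check on 8-bit Y'CbCr samples might look like this (studio-range limits per BT.601/BT.709, assuming chroma is stored with the conventional +128 offset; the function name is made up for this sketch):

```python
# Sketch of a broadcast-legal range check for 8-bit Y'CbCr samples.
# Studio range: luma 16..235, chroma 16..240 (BT.601/BT.709).

def is_broadcast_legal(y, cb, cr):
    return 16 <= y <= 235 and 16 <= cb <= 240 and 16 <= cr <= 240

print(is_broadcast_legal(120, 128, 128))  # True
print(is_broadcast_legal(250, 128, 128))  # False: superwhite luma
```

A limiter effect, or a glance at the YCbCr Parade, is doing essentially this kind of range check across every pixel.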
I'm looking for information on Premiere's "style" of RGB-to-YUV conversion.
In the Title panel, in the Color Picker, I select RGB = 191 191 0 and get YUV 161 -84 13.
See attached screen capture.
The YUV values reported in the color picker dialog are converted from the RGB values using the BT.601 matrix coefficients, with the appropriate scaling for footroom and headroom, with one important caveat: for some reason, the traditional offset is applied only to the luma channel and not the chroma channels. With all this said, when I hand-calculate the YUV values from your RGB values above I get:
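The hand calculation described above can be sketched as follows (a minimal Python sketch of the stated method: BT.601 coefficients, video-range scaling, 16 offset on luma only; the function name is illustrative):

```python
# BT.601 RGB-to-YUV as described above: video-range scaling
# (219 luma steps, 224 chroma steps) with the +16 offset applied
# only to luma, and no +128 offset on chroma.

def rgb_to_yuv_bt601(r, g, b):
    # Full-range luma from the BT.601 luma coefficients
    y_full = 0.299 * r + 0.587 * g + 0.114 * b
    # Scale into the 219-step video range and add the 16 footroom offset
    y = 16 + (219 / 255) * y_full
    # Scaled color differences, left centered at 0 (no +128 offset)
    cb = (224 / 255) * 0.564 * (b - y_full)
    cr = (224 / 255) * 0.713 * (r - y_full)
    return round(y), round(cb), round(cr)

print(rgb_to_yuv_bt601(191, 191, 0))  # (161, -84, 14)
```

This lands one code value off on Cr versus the picker's reported 13, likely down to coefficient precision or rounding differences.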
So close enough I guess.