What operating system? What version of After Effects?
The issue that you're referring to is caused by the codec, not by the After Effects application. If you want a different behavior, you can choose a different codec.
Sounds like it may be a footage interpretation issue. If you create a black solid in AE, and render that black solid to the DV codec, do you get 0,0,0 RGB? I do.
I have seen people post inaccurate statements to the effect that YUV codecs such as DV are always forced to a 16-235 range. This is not accurate, because other programs (Sony Vegas) do not force the issue in this way. The codec itself permits levels from 0 to 255; the upper and lower ranges are left unused by convention only.
That's nonsense. The 16-235 RGB-equivalent ranges are specified by the ITU standard based on the underlying YCbCr/YUV color-space transform, so there is effectively never a greater range in the native DV data. Whether or not codecs choose to expand or normalize that range for previewing on RGB devices is another matter, but, and I guess that's the whole point, outside of dedicated editing apps such as Vegas or Premiere that behavior would not make much sense in terms of creating predictable output. You would want to sample the "true" colors and base your color corrections and effects on those, rather than operate on the expanded ranges and risk creating invalid colors that would only get clamped or compressed on output.
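For what it's worth, the studio-range quantization the ITU standard defines can be sketched in a few lines of Python (the function names are my own and purely illustrative):

```python
# ITU-R BT.601 studio-range quantization for 8-bit luma:
# nominal black = code 16, nominal white = code 235.
# Analog luma in [0.0, 1.0] is quantized as Y = 16 + round(219 * Y_analog).

def encode_luma(y_analog: float) -> int:
    """Quantize analog luma [0.0..1.0] to the BT.601 8-bit studio range."""
    return 16 + round(219 * y_analog)

def decode_luma(code: int) -> float:
    """Map an 8-bit studio-range code back to analog luma (may exceed [0, 1])."""
    return (code - 16) / 219

print(encode_luma(0.0))  # 16  -> nominal black
print(encode_luma(1.0))  # 235 -> nominal white
# Codes 1-15 and 236-254 still exist in the bitstream ("super-blacks" and
# "super-whites"), but they lie outside the nominal range the standard defines.
print(decode_luma(254))  # ~1.087 -> a legal super-white excursion
```

So the 0-255 code space exists, but the standard itself pins nominal black and white to 16 and 235; anything outside that is an excursion, not extra usable range.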
I'm using AE CS5 on Windows 7.
I did try the test with a black solid... and also with colorbars coming from a TIF image.
The levels in my composition get mangled when rendering an AVI file with DV codec.
The levels in my composition are preserved correctly when rendering a Quicktime file with the animation codec.
So it's not a footage interpretation issue. The issue is definitely that 0-255 levels in the composition are being squashed to 16-235 when rendering to AVI with DV codec.
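In other words, the shift I'm measuring is consistent with a straight full-range-to-studio-range remap, something like this (my own sketch of the arithmetic, not AE's actual code):

```python
def squash_to_studio(v: int) -> int:
    """Compress a full-range value (0-255) into the 16-235 studio range."""
    return 16 + round(v * 219 / 255)

print(squash_to_studio(0), squash_to_studio(255))  # 16 235
print(squash_to_studio(128))                       # 126 -- even midtones shift
```

Black that was 0 in the comp comes out at 16, white that was 255 comes out at 235, exactly what I'm seeing on the rendered AVI.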
If I do the same test with Sony Vegas, the input and output levels are equal.
By the way, if I perform the same test in AE, outputting to Quicktime with DV25 codec, then the levels also shift from the original, but not as much. Black level stays constant, but midtones shift upward slightly, as if the gamma were ever so slightly off. Turning "Preserve RGB" on or off makes no difference.
So, in summary, the problem is with After Effects on Windows, incorrectly squashing 0-255 levels to 16-235 when rendering to AVI DV.
There is a similar problem when rendering to Quicktime DV25, but it looks to be a gamma issue.
All of this is very difficult to test, because there is no waveform monitor in After Effects, and the histogram only exists in the Levels effect. It would be really nice to have some way of monitoring input, output, and rendering levels in AE. I don't know how it's possible to do color-critical work in this application. Certainly, I won't be using the DV codec, either the Microsoft or the Apple version, in AE. Ever.
But yeah, it sure would be nice to get to the bottom of this.
Although not ideal, the waveform monitor in the bundled Color Finesse plugin is certainly available to you.
I don't have access to a Windows machine at the moment, but here's a black-white ramp rendered to Animation codec, then re-imported and measured using Color Finesse:
Note the Min and Max values at full 0 and 255 levels.
Now, here's the same situation, but rendered as a DV Quicktime file first:
No change. Whatever is happening to your file appears to be a Windows-related issue.
Of course, if you're seeking a genuinely colour-accurate workflow, you won't ever use DV. It's a heavily compressed codec that simply doesn't stand up to the rigours of professional colour work.
Thanks for the info about Color Finesse, it's helpful to have 'scopes inside AE. Now if they would only make it possible to see the levels at any time, while working on anything. How about it, Adobe: is there some compelling reason why we *wouldn't* need to monitor levels in a compositing application?
Anyway, of course DV color is rubbish, 4:1:1 YUV and all that. But my source material is DV, and I was hoping to use that codec throughout the workflow to avoid transcoding problems.
But this is just horrible. "We always force your AVI DV renders to comply with 16-235 by mangling your already perfectly clean levels." Great. And it's definitely a problem in After Effects, and definitely not a problem in Vegas. Now, Vegas uses its own Sony DV codec, so maybe that's the difference. It could be all Microsoft's fault. But then again, rendering out of AE using the Apple DV codec also has issues, they're just not as bad.
Bottom line, DV output from AE appears to be broken.
This might be helpful
Native DV, like any properly-recorded digital component format, doesn't have setup. Setup exists only in the analog world and has no relationship to DV as an encoding scheme or as a tape format. DV looks like it has setup (or not) because the decks that DV is recorded in / played from may add setup (or not).
All 601-conforming digital formats record nominal black at a luma level of 16, and nominal white at a luma level of 235 (in a 0-255 range, using 8 bits: there are 10-bit versions, too, like D-5 and DigiBeta, where the range is 64-940, but the DV formats are all 8-bit formats so we'll stick with 16-235 for this discussion).
When played over FireWire or over SDI or over SDTI, that's what you get: blacks at 16, and whites at 235. When you interchange files digitally, whether as DV-format stream files or QuickTime or AVI with the appropriate DV codec, black is 16 and white is 235. The same numbers hold, by the way, if your computer file holds 601-format uncompressed data, or DVCPRO50 data, or HDCAM, or DVCPROHD, as long as it's stored as "YUV" (really YCrCb) and not transcoded to RGB (wherein a whole range of gain/offset problems can occur, and even a gamma change if you're using Final Cut Pro; see white clip for some of the gory details).
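A quick sanity check on those numbers: the 10-bit studio range is just the 8-bit range with two more bits of precision, i.e. the same code values shifted left by two bits.

```python
# Each extra bit of precision doubles the code values, so the 10-bit
# studio range (64-940) is the 8-bit range (16-235) shifted left two bits.
black_8, white_8 = 16, 235
black_10, white_10 = black_8 << 2, white_8 << 2
print(black_10, white_10)  # 64 940
```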
It is correct that DV gets scaled from 16-235 -> 0-255 (thereby clipping super-blacks and super-whites) when converted to RGB – and Adobe apps behave appropriately according to this convention.
An application such as Premiere, which handles YUV natively in certain filters, is able to deal with this expanded range (which can allow you to recover lost highlights, for example).
An RGB-only application like After Effects cannot do this, however. I really wish Adobe would implement a feature that allows this kind of "expanded" interpretation when working with footage but, alas, they do not.
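To illustrate, the expansion (and the clipping it implies) looks roughly like this in Python (my own sketch of the convention, not Adobe's actual implementation):

```python
def expand_to_full(code: int) -> int:
    """Expand a BT.601 studio-range luma code (16-235 nominal) to full-range
    0-255 RGB, clipping any super-black / super-white excursions."""
    v = round((code - 16) * 255 / 219)
    return max(0, min(255, v))

print(expand_to_full(16), expand_to_full(235))  # 0 255
print(expand_to_full(250))  # 255 -- highlight detail above code 235 is lost
print(expand_to_full(8))    # 0   -- super-black detail is lost
```

That clipping is exactly why an RGB-only pipeline throws away recoverable highlight detail, and why decoding with an expanded interpretation (as below) can get it back.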
Here's what I do when I need to use this expanded range in AE. You'll need some tools (all freeware – but this won't work for Mac):
Install Cedocida, Lagarith and VirtualDub. Note that you don't really need to "install" VirtualDub; just unzip the files and place them in a folder called "VirtualDub", such as C:\Program Files (x86)\VirtualDub
Here are the steps to set up the process:
- Launch VirtualDub.exe
- Go to the Video menu, choose Compression and select the Cedocida DV codec; then click the Configure button.
- In the configuration options look at the Output settings (lower half of the dialog) and change the setting for YUV to RGB conversion from its default to [0..255] -> [0..255].
- Also, be sure to un-check the box in Output that is labelled YV12, as it may cause the video to lose color resolution.
To process the file in After Effects, first save it out as a lossless intermediary file like this:
- Open your DV AVI file in VirtualDub
- Go to the Video menu, choose Compression and select the Lagarith Lossless Codec.
- Choose Save as AVI… from the File menu and save it to a new file
- Open this Lagarith-compressed RGB file in After Effects and use tools such as Color Finesse to tweak it to perfection.
VirtualDub allows you to save settings files and to create batches very easily if you need to do a lot of these conversions.
Some DV codecs automatically make this conversion, others do not. What specific DV codec are you using? BTW, I never render to DV codecs from AE unless the file is going directly back to tape. Any further processing of a DV codec file will degrade the color information and introduce compression artifacts. There's no way around this except to bring in your DV original, then render to a lossless, preferably 10-bit-minimum codec, then do the rest of your editing or processing on the new file.
Actually, to my knowledge, After Effects (and other Adobe apps) always use their own internal DV decoder – so you don't really have any choice. This is why I suggested using an external app, like VirtualDub, that will use a VfW DV codec (like Cedocida) to decode the video.
Sorry to re-animate a dead thread, but I have this exact problem. Dan Isaac's recommendation for Cedocida / Lagarith / VirtualDub seems to cure it, but I still have some questions.
1. Is Cedocida lossy?
The return in highlight detail is significant on some of my overexposed footage. So, even if Cedocida is lossy, I'd use it anyway. I'm just curious.
2. What purpose is served by a Lagarith intermediate step?
If Cedocida converts YUV to RGB, couldn't I just use the Cedocida-decoded .avi in AE to do color corrections and then output to some lossless codec through AE? I ask because going from a Cedocida file to a Lagarith file results in additional shifts in levels. I'm worried that I'm getting generation loss by creating a lossless Lagarith file, and wonder if I'm better off staying in DV format until the very end.
Any relevant comments would be appreciated.