Did you actually find the scopes in Premiere?
If so... what was the problem with them?
He may not have found them, but I can identify the problem: they are one of the worst-implemented features of PP. They don't have an "always on" position in the PP layout, and they only work when the video is stopped, not playing. Just a good implementation of a waveform/vectorscope that actually works in realtime would be one of the greatest "new" features for CS4.
YOU ARE WRONG. I don't know where you got that idea, but it works RT for me while playing.
Works when playing for me, too. Both in 2.0 and CS3.
It will play in realtime if you turn on one of the scopes inside the program monitor, so that's all that's playing back. However, if you try to gang a reference monitor displaying a scope to the composite program output, then only the composite output plays back--the scope freezes. When you pause the playback, the scope changes to reflect the current frame. This is limiting if you're using your computer monitor to do any kind of color adjustments, but in real professional practice, you'd have some sort of accurate video monitor attached to your edit system, and you'd use that to gauge your adjustments along with the scopes.
I've always just switched back and forth between composite and scopes myself. Never tried both at the same time.
Ryan, Colin... Out of curiosity, what is your intended output format (DV, DVD, Blu-Ray) ?
If it is DVD or Blu-Ray, are you encoding it in Premiere or Encore, or are you using a 3rd party encoder?
Output varies across media. Sometimes I'm working in DV for TVCs; other times I'm working in 720p for DVD; still other times I'm working in 1080i/p for web. I know, it seems a little backward, but most of my DV clients are "now" whereas most of my HD clients are "now and later".
DVD encoding is generally done using the AME from Premiere, though I'll occasionally spit out an intermediate AVI and go to Encore. Web encoding roughs are also done with the AME (only H.264, single-pass, CBR for quick encodes), whereas "final" versions are sent via DebugMode FrameServer to Sorenson Squeeze for Flash and encoded to VP6 or H.264. Might eventually invest in the full version of Squeeze and do my MPEG-2 encoding there as well.
OK, thanks for the info. Sorry I've got no more "cohesive" answer for you, as such, but here are some things to think about regarding PPro and video levels.
1.) When editing HDV, PPro incorrectly converts RGB<->YUV as Rec.601. This can be seen on the scopes within Premiere. So -- scopes or not -- maintaining accurate colors can be difficult in HDV projects. DV projects with DV media have no issue here.
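To see why that 601-vs-709 mismatch matters, here's a rough sketch of the coefficient difference (my own illustration, not Premiere's actual code; it only looks at the luma equation and ignores the chroma terms and scaling):

```python
# Luma coefficients differ between the two standards:
KR601, KG601, KB601 = 0.299, 0.587, 0.114      # Rec.601 (SD / DV)
KR709, KG709, KB709 = 0.2126, 0.7152, 0.0722   # Rec.709 (HD / HDV)

def luma(r, g, b, kr, kg, kb):
    """Y' (0.0-1.0) from normalized R'G'B' for a given set of coefficients."""
    return kr * r + kg * g + kb * b

# A pure red patch (R=1, G=0, B=0) lands at noticeably different luma:
y601 = luma(1.0, 0.0, 0.0, KR601, KG601, KB601)  # 0.299
y709 = luma(1.0, 0.0, 0.0, KR709, KG709, KB709)  # 0.2126

print(round(y601, 4), round(y709, 4))
```

So if HDV (nominally Rec.709) material is converted with Rec.601 math, saturated colors shift in both brightness and hue, which is exactly what the scopes inside Premiere end up showing.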
2.) When output via AME, all YUV sources (DV, HDV, etc.) are converted to RGB. The same is true for DebugMode when set to RGB. Since your MPEG2 (or H.264) encoder must then convert these back to YUV, the transformation is done in such a way that luma levels are essentially clipped to the legal range of 16-235. You may still want to know what your luma levels are, but they will never be out-of-range when you output from Premiere in RGB.
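Here's a simplified sketch of why the RGB trip clips super-white luma (my own gray-only toy model, using the standard 219-step studio-to-full-range scaling; real conversions involve all three channels):

```python
def studio_y_to_rgb_gray(y):
    """Map studio-range luma (16-235) of a neutral gray to full-range RGB (0-255)."""
    v = round((y - 16) * 255 / 219)
    return max(0, min(255, v))      # RGB has nowhere to store out-of-range values

def rgb_gray_to_studio_y(v):
    """Inverse: full-range gray value back to studio-range luma."""
    return round(v * 219 / 255 + 16)

# Super-white luma (e.g. 250 from an overexposed camera shot):
y_in = 250
rgb = studio_y_to_rgb_gray(y_in)    # maps above 255, so it clips to 255
y_out = rgb_gray_to_studio_y(rgb)   # comes back as 235 -- the excursion is gone
print(rgb, y_out)
```

Legal values (16-235) survive the round trip intact; anything outside that range gets squashed to the nearest legal limit, which is why Premiere's RGB output can't produce out-of-range luma.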
3.) Considering point #2 above, it is unlikely that chroma will ever exceed the "legal" range of 240. I am not certain of this, but I believe that AME (and most encoders) will map RGB -> YUV as "clamped" to legal chroma ranges (as with the luma).
4.) Since output to MPEG2, etc. from Premiere will implicitly have a YUV->RGB->YUV conversion, the values you see in PPro's built-in scopes are not necessarily the values that will appear in your final output. The scopes may give you an idea of how much will be clipped, but little more.
5.) You can get a more realistic and accurate picture using DebugMode FrameServer in YUY2 mode and opening the "signpost.avi" in AviSynth. There is the Vscope plugin for AviSynth that will show you the YUV values on output. Assuming that your encoder (such as ProCoder, CCE, or HC Encoder, to name a few) can natively handle YUV output via AviSynth, these scopes will allow you to see the values that will ultimately be encoded. This YUY2 output may contain illegal values, however. Fortunately, AviSynth provides a simple way of clipping these without performing a YUV->RGB->YUV conversion. The command Limiter(16, 235, 16, 240) will clip your values into legal ranges.
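For anyone unfamiliar with what Limiter actually does, it's just a per-sample clamp on the native YUV data, with no colorspace conversion. Here's a pure-Python illustration of the same idea (my own sketch using plain lists; a real YUY2 frame is of course a packed byte buffer):

```python
# Mimics AviSynth's Limiter(16, 235, 16, 240): clamp luma to 16-235
# and chroma to 16-240, sample by sample, without touching colorspace.
def limiter(y_samples, u_samples, v_samples,
            min_luma=16, max_luma=235, min_chroma=16, max_chroma=240):
    clamp = lambda x, lo, hi: max(lo, min(hi, x))
    y = [clamp(s, min_luma, max_luma) for s in y_samples]
    u = [clamp(s, min_chroma, max_chroma) for s in u_samples]
    v = [clamp(s, min_chroma, max_chroma) for s in v_samples]
    return y, u, v

# Super-black (12), super-white (250) luma and hot chroma get pulled into range:
print(limiter([12, 128, 250], [245], [10]))
# ([16, 128, 235], [240], [16])
```

The key difference from the RGB round trip in point #2 is that legal values are passed through completely untouched; only the illegal excursions are clipped.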
So, there's my 2 cents on the subject. Take it as you please.
Thanks for correcting me, AND THANKS FOR HELPING FIGURE OUT HOW TO DO IT. :)
Thanks for the tip. I hadn't tried that. However, it seems a bit weak since you can't see both at the same time. I do have a professional NTSC monitor on my system, but I am just used to having scopes running realtime along with my monitor and computer monitor. Also, when I move to HD soon, I probably won't spring for a high-end pro monitor for color and will depend more on the scopes at that point.
I started my career in an old-school, full-on edit suite, so I learned to depend on scopes, and I miss them in PP.
I just tried using the program monitor for scopes and monitoring the video on my NTSC monitor. I could live with that setup, but I don't have full control when editing in the timeline. For instance, I can start playing forward with "L", but I have no control after that: J and K do nothing, and neither does the space bar. I have to mouse-click in the timeline to get it to stop. Am I the only one?