I'm familiar with VirtualDub, but never thought of it as a deinterlacing program. Thanks for the tip.
Thanks for the tip, Jeff. I used VirtualDub with the Smart Deinterlacer plugin and it worked better than ProCoder.
Glad I could help. :)
Dan Isaacs first pointed me in that direction.
I have to do some tests between VirtualDub and the latest release of Sorenson Squeeze (v5), which has improved its deinterlacing significantly.
Free is better than $600, but since I already have a Squeeze license...
I apologize for continuing this thread, but if you could help me out I would appreciate it.
Somewhere I heard that it is best to deinterlace the HDV footage that I capture, so I have done that using Cineform's capture module, which simultaneously batch captures and deinterlaces, winding up with an AVI.
I assumed this meant that the 60 fps of half-pictures in the HDV were somehow combined so that it came out as 30 fps of full frames, or 30p. Is that what happens, and does it indeed give better pictures?
Actually, I edit the AVI with a 30p preset in Premiere, then export as an .avi which is progressive and encode to Blu-ray using MPEG2.
Actually, I also burn a widescreen SD folder, using MPEG2.
I was hoping that by keeping things progressive, which I naively take to mean that each frame is a full frame, I would reduce motion lines. Then I thought you said that the DVD player had to be a special type to play this progressive footage.
Do I understand the progressive/interlaced thing and how I should be doing it, or should I not worry about interlacing?
I hope this makes sense and I would appreciate your thoughts.
Thanks in advance,
>I assumed this meant that the 60 fps of half-pictures in the HDV
Just a small technical clarification here. 1080i is actually 30 frames per second; it is 60 fields per second. Listing fields seems to confuse people. I do wish camera makers and such would stick to listing only frames, to be more consistent.
So, you start with 1080i/30, and after deinterlacing, you end up with 1080p/30. Same frame rate of 30 for both, only the i and p have changed.
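To make that concrete, here is a minimal Python sketch of the "weave" idea (mine, not anything from Cineform or VirtualDub): two 540-line fields interleave into one 1080-line frame, so 60 fields per second pair up into 30 frames per second and the frame rate never changes.

```python
import numpy as np

# A minimal sketch of "weave" deinterlacing, assuming an HDV-sized
# 1440x1080 picture. Two 540-line fields interleave into one
# 1080-line frame, so 60 fields/s become 30 frames/s.
HEIGHT, WIDTH = 1080, 1440

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave a top field (even lines) and a bottom field (odd lines)."""
    frame = np.empty((HEIGHT, WIDTH), dtype=top_field.dtype)
    frame[0::2] = top_field     # even rows come from the top field
    frame[1::2] = bottom_field  # odd rows come from the bottom field
    return frame

top = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)
bottom = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)
assert weave(top, bottom).shape == (HEIGHT, WIDTH)  # one full 1080p frame
```

(Real deinterlacers like Smart Deinterlacer do much more than a blind weave, since the two fields are 1/60 s apart and moving objects comb; this just shows why the i and p change while the 30 stays.)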
Finally, if your end goal is Blu-ray, the only 1080 progressive frame rate in the specification is 24 fps, so it wouldn't make any sense to deinterlace your 1080i material if it was shot at 30 fps, as it would have to be reinterlaced for Blu-ray, and that may cause unnecessary artifacts.
The short answer is: if you shot in 1080i/30, edit that way. Don't deinterlace.
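For reference, here is a small Python lookup of the original Blu-ray video formats as I remember them. The table is my own simplification, so check the actual BD spec before relying on it; the point it illustrates is that at 1080 lines, progressive is only legal at film rates.

```python
# Simplified Blu-ray video format table (my recollection of the
# original spec -- verify before relying on it). Frame rates are
# listed per (lines, scan type).
LEGAL_BD_FORMATS = {
    (1080, "i"): [25, 29.97],             # i.e. 50 / 59.94 fields per second
    (1080, "p"): [23.976, 24],            # the only progressive 1080 rates
    (720,  "p"): [23.976, 24, 50, 59.94],
    (480,  "i"): [29.97],
    (576,  "i"): [25],
}

def is_legal(lines: int, scan: str, fps: float) -> bool:
    return fps in LEGAL_BD_FORMATS.get((lines, scan), [])

print(is_legal(1080, "p", 29.97))  # False -- why deinterlaced HDV won't fly
print(is_legal(1080, "i", 29.97))  # True  -- keep 1080i/30 as shot
print(is_legal(1080, "p", 24))     # True  -- the film-rate exception
```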
I apologize for being "mathematically challenged", but with your help I may be starting to get it. If you have time, I would appreciate it if you could help me with a couple of other questions, which may appear stupid.
1. I will be using an HDTV with an upconverting Blu-ray player, which I assume winds up changing the stream from the disc to progressive anyway, so why not try for a progressive output for Encore to encode?
2. I'm still not sure what Encore does when it encodes the Cineform progressive .avi that Premiere puts out. I select the MPEG2 choice in the Blu-ray project settings, and the footage looks good on my TV.
Do you think Encore is encoding a progressive stream?
Sorry, but I've looked at a bunch of Wikipedia articles and can't figure it out.
Thanks in advance.
>why not try for a progressive output for Encore to encode?
Because 1080p/30 is not a legal stream for Blu-ray. Let the player/TV do the work.
>Do you think Encore is encoding a progressive stream?
At 1080 and 30 fps, no. That's not part of the Blu-ray spec; 1080 progressive can only be 24 fps. What Encore does under the hood to make the progressive stream usable I don't know, as I'm not a programmer. My vague guess is that it'll just throw out one field and make it 1080i/30, which is the legal spec.
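For what it's worth, the other common trick, and this is purely my illustration rather than a claim about what Encore actually does, is to segment each progressive frame into its two fields, so nothing is thrown away: the stream is flagged 1080i/30 but carries all the picture data. A sketch in Python:

```python
import numpy as np

# Hypothetical illustration only -- not a description of Encore's
# internals. Splitting a progressive frame into its two fields
# ("progressive segmented frame") yields a spec-legal interlaced
# stream with no picture data lost.

def split_into_fields(frame: np.ndarray):
    """Return (top_field, bottom_field) from one progressive frame."""
    return frame[0::2], frame[1::2]  # even rows, odd rows

frame = np.random.randint(0, 256, (1080, 1440), dtype=np.uint8)
top, bottom = split_into_fields(frame)
assert top.shape == bottom.shape == (540, 1440)
```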
Neil Wilkes and I just had a discussion about this, and I agree wholeheartedly with his observations.
I find it really hard to accept that interlaced *anything* could be considered "high definition". 720p - yes. 1080p - of course. 1080i? Questionable at best.
My upscaling receiver can upscale SD sources to HD resolutions. It has a 1080i option. I use 720p. I think it looks better, despite the mathematical difference in horizontal resolution.
1080i shown on a set that can handle the interlacing is sharper than 720p. But to be honest, your TV set needs to be 60" or more to be able to see the difference. I did my research on 65" sets.
This does not apply to PC screens. Just properly set up HDTVs with decent Blu-ray players.
But the set needs to be able to write the material to the screen like old CRTs did. Otherwise you are deinterlacing and that causes problems as you point out.
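A quick back-of-the-envelope on the pixel counts behind that 1080i-vs-720p comparison (my arithmetic, assuming a set that weaves the two fields cleanly):

```python
# Pixel counts behind the 1080i-vs-720p sharpness debate.
full_1080 = 1920 * 1080   # a woven 1080i frame: 2,073,600 pixels
field_1080 = 1920 * 540   # but only half arrives per field: 1,036,800
full_720 = 1280 * 720     # a 720p frame: 921,600 pixels

print(full_1080 / full_720)   # 2.25  -- if the set handles interlace well
print(field_1080 / full_720)  # ~1.13 -- if you only count one field
```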
Personally, I feel the only HD format, both for broadcast and Blu-ray, should have been 1850 x 1000 at 48 progressive frames per second. A perfect match for (most) films, which really are the standard against which all else is judged.
Would it not be time to get rid of the ludicrous NTSC ("never the same color") encoding, forget about the hype of 24p and the limited IRE range of NTSC, and just switch to PAL standards with 25p, 50p, or even 100p with full IRE range, with TV sets mostly capable of 100 Hz display? It would make life so much easier, reduce the number of questions here significantly, and give you all much more time to do what you want: editing and producing.
Use one standard (PAL), forget about 24p, switch to 25p or a multiple of it, and get rid of the confusion of all these different standards. In terms of resolution I do not disagree, Jim. Just about the 24p hang-up you seem to have, as do a lot of NTSC users. Your other thread about the film look reflects that.
What does this mean for the average amateur like me? I've exported a Cineform AVI from Premiere CS3 which was not interlaced and burned it out as a Blu-ray folder in MPEG2.
Is that the best way to go? When I use GSpot on the files I find in the "STREAM" folder of the BDMV folder, there is nothing about the frame rate. I guess I would just like to find out what I'm doing, even though it looks good playing from a PS3 into a 65" DLP TV.
If it looks fine, why mess with it?
> 1080i shown on a set that can handle the interlacing is sharper than 720p
As sales of the Movie Looks plug-in have demonstrated, sharper is not necessarily "better".
On my 57" Mitsubishi Diamond DLP, my opinion is that 720p is better (smoother, more solid, more pleasing). There may be more edge detail with 1080i, but I find that to be objectionable with all program material except sports.
I guess what I don't understand is what you are referring to when you say 720p or 1080i. Are those the resolutions of the stream that comes out of Premiere, which has to be encoded to MPEG2 or H.264, or are they the resolutions of the clips that have been encoded for use in Encore in the Blu-ray folder?
Life is grand in PAL Land
>On my 57" Mitsubishi Diamond DLP
Is it a 720 or 1080 set? That certainly makes a difference. And at that size, there is no real point to a 1080 set. The pixels are small enough at 720, no need to go smaller.
And by sharper, I meant that we used calibration cards to determine what could be seen, and not seen.
It's a full 1920x1080, 1080p DLP set. So I'm not imagining things. But this is definitely a subjective measurement. And I still say sharper is not always better. :)
>And at that size, there is no real point to a 1080 set.
Actually, there is. Blu-ray players that can output 24p at 1080p *require* that the set be able to display 1080p. A 720p set takes away that viewing option. And I can tell you, having watched just a very few BRD movies at that frame rate, you don't want to lose that option. ;)
>I guess I don't understand is what you are referring to when you say 720P or 1080i?
In this case, I was referring to the audio/video receiver upscaling a source stream from its native size (which is something less than 720p or 1080i). I use it to scale my standard-def sources like cable, laserdisc and my DVDR.
My Blu-ray player is set to upscale any disc that it can play to 1080p, independent of what size the source video is. It is most useful for standard-def DVDs.
This has nothing to do with producing content, only with viewing it.
Thanks. My PlayStation 3 also upscales, but are there upscaling options (say, 1080i vs. 1080p) on these players? I'll check mine if I can find the controls.
I'm trying to figure out a workflow for Blu-ray authoring. Right now I capture HDV footage using Cineform, and the HDV is automatically converted to a 1440x1080 AVI which is deinterlaced.
Then I use a Premiere Pro CS3 Cineform progressive project to edit, and then export out as a progressive .avi.
So it's imported into Encore, and now what happens? If I use the Encore automatic Blu-ray rendering, it creates an MPEG2, but is it a 24p 1920x1080 "stream"?
Or does it make any difference what's on the disc if the player is doing all the conversion?
Sorry to ramble, but any thoughts concerning the above Blu-ray workflow, or the Blu-ray disc contents, would be greatly appreciated.
If the PAR of the 1440x1080 video is 1.33, then the video will fill a 1920x1080 HD screen, but the video itself will not have 1920 pixels along the horizontal dimension.
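To put numbers on that (my arithmetic):

```python
# HDV stores 1440x1080 with a 4:3 ("1.33") pixel aspect ratio, so the
# picture displays at the full 1920 width even though only 1440
# pixels per line actually exist in the file.
storage_width = 1440
par = 4 / 3
display_width = storage_width * par
print(display_width)  # 1920.0 -- fills a 1920x1080 screen
```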
24p refers to the frame rate and the fact that it is progressive. Unless you recorded in 24 fps or converted to 24 fps during export, the video clip will have whatever frame rate was recorded or exported.
As with all video sources, produce at the highest resolution and quality that you can and then export for your target audience. Since your target is Blu-ray, it's easy - export at the highest resolution and quality that you can. Your audience is responsible for how it will be viewed, and ultimately, how it will look on their system.
More than that general advice I cannot give. Right now, I'm only watching Blu-ray, not encoding or authoring it. Sorry.
Thanks. I really appreciate your expertise.