I'm using Premiere Pro CS5.5 on a PC running Windows 7, 2.80GHz Intel Core i7 CPU, 6GB RAM, 64-bit OS.
I am working on an interview I shot with 2 Canon DSLRs, a 500D and a Rebel T3i, using audio captured with an M-Audio Microtrack II. The audio was edited down in Premiere, then rendered out and handed over to a third party to be cleaned up in Pro Tools. The length of the edited interview is just short of 4 minutes.
When I preview in my Premiere timeline the audio is synced, but when I render out the clip the audio drifts: it starts already out of sync by a frame or two, and by the end it's clearly off by a lot. I'm not sure how to proceed, since the audio looks more or less perfect in my timeline.
I'm really hoping somebody has some suggestions, as this project's deadline is tomorrow. I'm supposed to be making only minor fixes at this point, but suddenly I'm dealing with an issue that has totally broken nearly half my video.
What are your project/sequence settings, the formats you're exporting to, and the frame rates of the EOS footage? This type of thing can sometimes happen if the audio sample rate is being changed at export, but the obvious question to begin with is which track is actually correct - is the audio running slow or the video running fast? How do the lengths of the exported footage compare to your sequence?
I'm definitely working with a mess of a project. Frame rates are actually one of the things I'm looking into, and there are a number of discrepancies in settings between the raw footage (30 fps from the 500D and 29.97 from the Rebel), the sequence (29.97 fps), and the export settings (59.94 fps). Unfortunately, in the overall project I'm working with video from a number of sources, so I'm starting with a number of different frame rates (in other sequences I have captured video game footage that is 60fps too). I'm still going to try to match them up as best I can and hope it helps. I'm just worried that the destination for these rendered clips is a pre-existing After Effects project where all the comps are 59.94 fps; if I try to change those and that causes problems, I could be looking at a pretty huge setback right before my deadline.
To add a little more detail: the problem seems to be getting progressively worse. In early renders there was no sign of an issue; then yesterday during a review we noticed it was slightly off, so I thought I'd fixed it, did a render last night, and when I checked it this morning it was as I described in the first post. I have at least 3 or 4 early renders where the problem was either not there or so slight it's hard to notice. The really weird thing I've discovered: if I put the rendered clip back in Premiere it syncs up, but if I play it in QuickTime or use it in After Effects (the next step in my workflow) it doesn't. Guess that could still have to do with frame rates. We'll see.
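For what it's worth, a frame-rate mismatch produces exactly this kind of linearly growing offset. Here is a rough, Premiere-agnostic sketch of the arithmetic (it assumes the "29.9" rates in this thread mean NTSC 29.97, which is only my reading of the numbers):

```python
# Back-of-the-envelope drift check: if footage shot at one rate is
# conformed or played back at a slightly different rate, the timing
# error grows linearly with runtime.

def drift_seconds(true_rate, assumed_rate, runtime_s):
    """Seconds of accumulated offset after runtime_s of playback."""
    return runtime_s * (true_rate / assumed_rate - 1)

runtime = 4 * 60  # the ~4-minute interview

# 30 fps footage conformed to a 29.97 fps sequence:
d = drift_seconds(30, 29.97, runtime)
print(f"{d:.2f} s drift, ~{d * 29.97:.0f} frames at 29.97 fps")
# -> 0.24 s drift, ~7 frames at 29.97 fps
```

That's noticeable by the end of a 4-minute clip, but small near the start - which is one way to tell a frame-rate problem apart from a sample-rate problem.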
Thanks for the tip.
And, wow. I've just now noticed that the file I got back from the audio guy is 41.1 kHz instead of 48 kHz, which is what I've been working with. I'm thinking this might be my main issue (fingers crossed).
Well, converting the audio clip to 48 kHz and dropping it back in the project didn't help. I guess next I'll export at 41.1 before I rule that out completely.
Update: I've now also tried exporting my existing sequence at 41.1 kHz, and creating a new sequence at 41.1 kHz and exporting at the exact sequence settings, with no change in results.
Update: I've now tried creating yet another sequence, this time at 30 fps (the frame rate of most of the video in the sequence) with an audio sample rate of 41.1 kHz, then rendered a QuickTime H.264 at 100% quality, 30 fps, 41.1 kHz. The render I got was slightly better, but the audio was still off from how it appears in my timeline. The clip is the exact length of my sequence down to the frame (5:15:29), and again, if I put it back in Premiere it plays perfectly, but in QuickTime or AE it drifts.
in other sequences I have captured video game footage that is 60fps too
Sorry, that is not possible. Capturing only works over FireWire from a tape-based camera, and that is never 60 fps. Please get your terms and phrasing correct to help us understand you in order to help you. 41.1 kHz is a very weird format. 44.1 is possible but not advisable. 48 kHz is the regular format. NEVER use QuickTime if you can avoid it. It causes serious gamma shifts and reverts your 64-bit environment to 32-bit.
Capturing only works over FireWire from a tape-based camera
I'm sorry, but that is not true. I use a Blackmagic Intensity to capture game footage straight from an Xbox 360 console via HDMI. The footage I get is 60 fps (the frame rate of the Xbox 360). Totally possible, and in fact pretty standard for people capturing footage from video games. Why in the world would I ever want to capture video game footage from a tape-based camera?
You are correct about the audio sample rate - I mistyped it the first time and then my brain continued to type it that way. The offending clip was 44.1 kHz, but I've since gotten a 48 kHz clip from the audio guy and the problem persists.
I use a Blackmagic Intensity to capture game footage straight from an Xbox 360 console via HDMI.
That is not capturing but ingesting. Confusing these terms can lead to misunderstanding - like telling your car mechanic "when I drove here" versus "when I swam here."
Again, in my industry that is very commonly referred to as "capturing," and the devices we use to do it are referred to as "capture cards." I'm sorry you are not familiar with the specific area of video production that I work in, but that does not make it wrong. I've been working in video production for nearly a decade, I've been specifically working in the games industry for over 3 years, and every time I've ever discussed capturing video game footage with my co-workers, video producers from other studios, our publishing partners, or even the third-party production houses we sometimes use to make trailers, "capturing" is exactly what we called it, and "captures" is what we call the resulting footage. Your failed attempt at a metaphor is both insulting and blatantly wrong. Do you have any useful advice, or are you simply trolling here?
Cannibal Shogun is perfectly correct, the term 'capturing' is standard terminology for the process of recording computer screen images using an application other than the one drawing the screen data (Adobe use it for that exact meaning, it's why Captivate is called Captivate and not Ingestivate). The word has a slightly different meaning to those working in tape, but the definition of 'capture' in Harm's glossary still applies to what Cannibal Shogun is doing with his XBox, as the console is behaving as the camcorder or DV deck, sending picture frames to a device which encodes them into digital video.
...and if people don't start acting polite and talking about the topic in question rather than attacking each other, I'm locking the thread.
What application did you use to resample the audio? Have you tried any other codec export besides QuickTime H.264? Any time you have audio drift that gets worse as the clip progresses, it's almost always a sample-rate issue - resampling between 44.1K and 48K. Those two sample rates are the only ones close enough that the issue starts as a frame or two and gets worse each second. Do you only have the one audio file that you had to resample in the sequence you're exporting? Did you by chance select Non Drop Frame in your project sequence or export settings?
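To put rough numbers on the 44.1K/48K point: if audio at one rate ends up interpreted at the other without a proper resample, the offset grows linearly with runtime. This is illustrative arithmetic only, not anything Premiere-specific (the 4-minute runtime is taken from the original post):

```python
# Seconds of A/V offset after runtime_s when audio recorded at
# true_sr is played back as though it were assumed_sr.

def audio_drift(true_sr, assumed_sr, runtime_s):
    return runtime_s * (true_sr / assumed_sr - 1)

runtime = 4 * 60  # the ~4-minute interview

# 48 kHz audio treated as 44.1 kHz runs long:
print(round(audio_drift(48000, 44100, runtime), 1))  # -> 21.2 (seconds by the end)
print(round(audio_drift(48000, 44100, 1), 3))        # -> 0.088 (seconds after 1 s)
```

That second figure is a couple of frames at ~30 fps after only one second of playback, which matches the "starts as a frame or two and gets worse" pattern described above.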