Just a quick one. I am re-reading the BBC spec sheet as I feel I need to get to grips with it 100%, as we are about to go into shooting proper, and I still don't really know what timecode is all about!!
One of the sections mentions that electronically generated moving graphics and effects (such as rollers, DVE moves, wipes, fades and dissolves) must be generated and added as interlaced to avoid unacceptable judder.
OK, I get wipes, fades and dissolves, but what are rollers and DVE moves? And since we are shooting and editing in 25p, is this still necessary?
Sorry, here's another one I don't quite understand:
Field dominance - Cuts in material must always happen on frame boundaries (ie, between field 2 and 1)
Since we are shooting and editing in 1080p25, can we edit without worry, and then seamlessly export to the required 1080i/25?
You'll have a hard job cutting on the wrong field with 25P material, so you can ignore that one. In fact, it is hard on any NLE to cut on the 'wrong' field. (Sometimes material comes in from a studio where the vision mixer is set to switch on 'either' field - where 50% of the cuts are wrong - but that hasn't happened to me for a few years now.)
A roller is an end credit roller, a DVE move is really any sort of 'Digital Video Effect' (e.g. slides, pushes, zooms etc).
If you render with fields enabled (Upper first) then you'll get field based motion.
You can see the difference if you export a small animation - one with fields off, one with fields on.
Reimport and put in a 50fps composition... step frame by frame and you'll see animation every frame on the 'fields' rendered, but only every other frame for the 'frame' rendered default.
(I think 50fps comp is only way of stepping by fields in AE?)
But then are you really 'editing' in AE?
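The difference between frame-rendered and field-rendered motion described above can be sketched numerically. This is a toy illustration of the sampling cadence, not AE's actual internals, and the function names are mine:

```python
# Toy sketch: position of an object moving at constant speed, as seen
# when stepping field by field (e.g. in a 50 fps comp).

def frame_rendered_positions(num_fields):
    # Frame-based render: both fields of a frame come from ONE time sample,
    # so the position only changes every other field step.
    return [field // 2 for field in range(num_fields)]

def field_rendered_positions(num_fields):
    # Field-based render (upper field first): each field gets its own time
    # sample, so the position advances on EVERY field step.
    return [field for field in range(num_fields)]

# Six field steps (three frames) of each:
# frame-rendered: [0, 0, 1, 1, 2, 2]  -> motion every other step
# field-rendered: [0, 1, 2, 3, 4, 5]  -> motion every step
```

Stepping through both renders in a 50 fps comp, the frame-rendered clip holds each position for two steps while the field-rendered one moves on every step, which is exactly what the export test shows.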
Thanks for the reply. Great news on the cuts, not so great with the rest (but that's probably because I just don't quite understand it yet!)
Since we film progressive, and add effects in AE to that footage (mostly post zooms and pans using 4K images in a 1920x1080 HD comp) before we save and import each project into a Premiere sequence for editing, is this still going to be an issue? I tried the test you suggested, but, probably due to inexperience, I can't see a difference.
I took 1 second of footage, made it 3D, pushed it back in Z-space, and keyframed it over the second so that it zooms back to fill the view screen. I took this to Premiere and exported through Media Encoder as 2 MXF OP1a files, one progressive, the other as upper field first. When I place these two renders back in a 50 fps AE comp and play frame by frame, each comp does the same thing, i.e. zoom on one frame, no movement on the second, zoom on the third, etc.
Am I making an issue out of nothing here? Can I blissfully shoot progressive, rough cut in Premiere, link to AE and retime, key, add effects etc. to my heart's content, then take each project back to the Premiere timeline for final editing and export as MXF OP1a upper field first as requested by the BBC? And not worry about weird judder?
As an aside, I am reading through the 41-page document and am finding a few things I don't understand. I will post all the questions in this discussion, if that's OK?
Changing the title of the discussion from "Just a quick one...." to "And then...."
I see that a video line-up with colour bars of type EBU 100% or 75% is requested. I have read a few updates on Creative COW done by Dave, and I am kinda getting it, but what do we use in AE or Premiere to generate the bars?
Is the Harding Flash Pattern Analyser Algorithm something we can purchase and run ourselves, or do we farm this out once the project is complete, just before actual delivery?
The FAQ on this site has two sections that will help you avoid "judder" or flicker. I also wrote an article on judder that you'll find here.
You must be aware of and check for these problems in every stage of production. You must also watch for judder with camera moves. Shooting at 25P you will also run into problems with Critical Panning speeds that cause the same problem with video and film. Since a lot of your work involves miniatures you'll have to be especially careful about this.
The only way to really check your footage for these problems is to view the footage on a broadcast monitor. Refresh rates on computer displays can hide problems or show problems that don't exist in a broadcast situation. It's also important that you watch your finished footage from a proper viewing distance. Too close or too far away and your eyes can start to play tricks on you.
If you get everything to work and your picture and sound are great then prepping for compression for final delivery is just a matter of using the right settings in your compressor. Every network has their own standards. If your production master is perfect simply loading up a standard and approved preset for delivery is all that it takes to perfectly match the delivery standards.
I hope this helps. I hope you realize how lucky you are to have a long term paying gig that gives you the chance to ask these questions.
I'm an editor, mainly Avid, not an AE guru so apologies if my explanations are a bit off. Your exports are frame based animations.
The render settings I adjusted were in the AE render queue (i.e. AE needs to know it is rendering fields). I exported from AE then reimported into AE and added to a 50 fps comp in AE.
Pr I don't have installed currently, but if that can jog by fields that saves the 50fps comp workaround.
I think you can probably work in 25P to your heart's content without an issue; after all, anything film-originated is 25P (including the end rollers). Almost everything that I've seen that has gone through AE is rendered as frames.
If you have an end roller I would ensure that has 50 fields per second movement though.
Your retiming might well be a little funky if you were to render fields as it wouldn't match the temporal movement of other aspects of your scene (which would be frame based if not retimed).
Harding machines are expensive... many facilities will do a full tech check for a price, including Harding. There are also online services you can upload files to. Factor in some time to make fixes if it fails for any reason.
In Pr I think it's 'File - New - HD Bars & Tone' to generate colour bars - I suspect they are US spec rather than EBU though.
http://www.belle-nuit.com/test-chart is a good source of bars for use within your project to keep an eye on levels, as it is full range (0-255 in 8-bit) and you can see any shifts that occur between apps. I don't think the Beeb will accept it on final output, though; they'll need 16-235 bars and -18dB tone matching their specs.
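For reference, the full-range vs 16-235 distinction is just a linear remap of code values. A minimal sketch of the 8-bit luma mapping follows; the function name is mine, not from any broadcast library:

```python
def full_to_legal(code, bit_depth=8):
    """Remap a full-range code value (0..255 in 8-bit) into legal/video
    range luma (16..235 in 8-bit), the range broadcast-spec bars sit in."""
    black = 16 << (bit_depth - 8)       # legal black level
    white = 235 << (bit_depth - 8)      # legal white level
    full_max = (1 << bit_depth) - 1     # 255 for 8-bit, 1023 for 10-bit
    return round(black + code * (white - black) / full_max)

# full_to_legal(0)   -> 16   (full-range black maps to legal black)
# full_to_legal(255) -> 235  (full-range white maps to legal white)
```

This is why full-range test bars are handy for spotting shifts between apps but won't match a delivery spec that expects levels already sitting in 16-235.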
Always great to read a response from you, sir. I will check out the FAQ right now, and your link as well. I guess asking another question before reading them might be counterproductive, as the answer may be there, but I'll do it anyway - it's late in the week, and I feel like being a rebel!
Do I understand you correctly that if I complete the project with progressive footage, add all my effects etc. to the footage as is, and end up with a perfect production master - no visible artefacts of any kind - I can then run this through Media Encoder and get a perfectly acceptable interlaced, broadcast-ready clip? (I am aware of the many other issues - sound, colour safe etc. - so this question is directed only to the section in the BBC specs that states all effects must be done on interlaced footage.)
And I most absolutely do sir, I count my blessings daily!
It has been just a little over a year now; we have still not started shooting, but I am able to learn daily, and earn a salary doing so. It really is a dream come true. Now I just need to get my ducks in a row (or my fields in a phase...) to ensure the project is not immediately kicked out when we present it to the broadcasters. As mentioned right at the start (and Trevor, if you don't mind, I'll be picking your editing brain later for this), I don't even understand timecode yet.
Sigh - but what a ride this has been so far....
Thank you all
Thanks. I will get the bars for now. I know for a fact that there is still tons we need to do before we are ready to present to investors, like getting an actual broadcast monitor in the office - not to mention properly colour-corrected screens. We are shooting with Red, so I am currently doing a workflow breakdown keeping to native Red files, and will go into the colour correction stage much later. For now, as long as the files are intact and backed up, I know we can come back to them when we need to.
I need to think of investing in a colour correction program. At the moment, I use RedCineX - it's all we have.
I was wondering about the frame-based output. It would make sense to me that a discrete frame that looks and acts correct should be able to be separated into two fields that look and behave correctly. BUT, that is an uneducated assumption. I am off to read Rick's links, and hit the bars... um, the colour bars, that is!
I read the documents, and they are great, but I am still wondering: if we shoot progressive, and the footage looks great - no flicker/judder/twitter or other - and we then take it to AE and apply digital effects, zooming and panning within our 4K space, once this is then exported as interlaced footage for delivery, is it possible that merely changing from progressive to interlaced could introduce judders etc.?
If you're working in 4K and delivering in a smaller broadcast format (HD, SD), I'd rather worry about the downscaling quality, as this can introduce serious flicker issues. As for interlaced, you are going to lose some detail but are on the safe side with fast transitions. If you are not obligated to deliver interlaced and your transitions, VFX, etc. are guaranteed judder-free, I would render progressive for some more detail.
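The flicker risk in downscaling comes from aliasing: fine detail lands on different sample grids from frame to frame, so it blinks in and out. A toy sketch of one image column (pure illustration, not any real scaler; names are mine):

```python
# A thin one-pixel-wide detail drifting by one line between two frames.

def drop_lines(column):
    # Naive 2:1 vertical downscale: keep only the even lines.
    return column[0::2]

def average_lines(column):
    # Filtered 2:1 downscale: average each pair of lines (crude low-pass).
    return [(column[i] + column[i + 1]) / 2 for i in range(0, len(column), 2)]

# A bright line on line 2 of frame A, drifted to line 3 in frame B:
frame_a = [0, 0, 255, 0, 0, 0]
frame_b = [0, 0, 0, 255, 0, 0]

# Naive scaling: drop_lines keeps the line in frame A ([0, 255, 0]) but
# loses it entirely in frame B ([0, 0, 0]) -> the detail flickers.
# Filtered scaling: both frames keep the line at half intensity -> stable.
```

This is why a downscale with a decent filter (or a supersampled render) is so much calmer than a straight resize when the source has fine detail.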
I need to think of investing in a colour correction program. At the moment, I use RedCineX - it's all we have.
Do not underestimate Color Finesse, which is included with After Effects. If you manage to read the Color Correction Handbook, you'll find that Color Finesse is a nice start for putting your brand-new theoretical knowledge into practice.
And I think SpeedGrade is also at your disposal. Check out, for example, some new features introduced in the upcoming SpeedGrade CC:
A year with CS6, and I never even knew I had SpeedGrade till today! That's how focused I've been on AE and Premiere - and I still know so very little. This is a great journey. As soon as I process the hundreds of BBC spec document pages, I'll be looking into SpeedGrade some more!
Most of our 4K footage won't be downscaled, but rather placed within the 1080 viewing window to fake some post pans and zooms. If the full 4K frame is required, I will make it a 3D layer and push it back in Z-space.
At least that is the plan - does that sound like something that makes sense?
Well it depends how your editing software handles the downscaling. In case you run into issues I can recommend this workflow: http://www.bellunevideo.com/tutdetail.php?tutid=12
It's for HD to SD conversion but can be easily applied to any other format.
If the full 4K frame is required, I will make it a 3D Layer and push it back in Z-Space.
That's still scaling the shot. You're re-sampling the images. AE does a better job scaling down than scaling up, but sometimes, just like in Photoshop, you will want to add a bit of sharpening to the layer at this point. Most of the time nothing is required, and over-sharpening can lead to other problems when the shot is played back at full frame rate.
Once again, this points out the importance of a broadcast monitor for final quality judgements. Nothing can beat a well setup production environment. Even though I do most of my work on a MacBook Pro Retina, my critical judgements for critical projects are always checked for final quality on a fully calibrated broadcast quality monitor in a color neutral room from an optimal viewing distance.
I will raise the issue at next week's meeting to start pricing out a production monitor and reviewing system.
Have a great weekend, all - 2nd day spent reading through the BBC technical specs and the 140/150-page DPP documents. There are a lot of questions answered on the DPP site, for those interested:
Some good articles too.
Just a quick re-check....
Rick, where you said
"If you get everything to work and your picture and sound are great then prepping for compression for final delivery is just a matter of using the right settings in your compressor. Every network has their own standards. If your production master is perfect simply loading up a standard and approved preset for delivery is all that it takes to perfectly match the delivery standards."
Does that mean that if the project is filmed with progressive footage, and effects are added to this footage as normal in AE, we can still deliver an interlaced end product without any issues?
All SD broadcast material sent over cable or broadcast is interlaced. It has to be. If field one and field two are identical then you have "progressive" footage because you have pairs of identical fields. If field one and field 2 are different then you have interlaced footage.
HD and digital content is a different matter. Depending on the network you can send out or broadcast all kinds of different formats.
That's what I thought. I did a lot of reading up on progressive and interlaced when I started last year, and I like to think I kinda have the hang of it, but this part of the BBC tech specs is what is confusing me:
"All material delivered for UK HD TV transmission must be:
1920 x 1080 pixels in an aspect ratio of 16:9
25 frames per second (50 fields) interlaced
-now known as 1080i/25
colour sub-sampled at a ratio of 4:2:2"
So we need to deliver interlaced, but this is the part I am trying to figure out:
"2.1.2 Post - production
Electronically generated moving graphics and effects (such as rollers, DVE moves, wipes, fades and dissolves) must be generated and added as interlaced to prevent unacceptable judder"
I can't seem to duplicate what Trevor mentioned earlier in the thread, and so I am trying to determine 100% whether we can continue to shoot progressive, keep the footage progressive the whole way through the workflow (adding mouths, liquify effects, space effects etc.) and then, once 100% complete, render it out as interlaced for delivery. I am so worried that we go through the whole process of completing the project, and then when we deliver, it gets turned back for something we could have prevented at the start.
I feel I have finally conquered some of the After Effects and Premiere mysteries, but now the workflow and technical issues are cropping up, and I need to provide the answers before we get going.
If your underlying content is 25 distinct images per second (i.e. frame-based motion) and you are modifying parts of the frame (e.g. mouth movements), then trying to do that with 50 distinct images per second (i.e. field-based motion) would never match. This animation must be progressive if the underlying plates are progressive.
If you have transitions between scenes that are (for example) 'push' effects where one scene moves off and is replaced by one that moves on then, strictly speaking, that movement should be field based. I'm sure you could get a dispensation for it not being field based, provided you ensured the movement was not too 'juddery' (subjective).
If you have an end credit roller then that should be field based. Again you might get a dispensation if you thought you needed it artistically.
Best to do some tests and get some clearance from your commissioning body for anything you are unsure of (which will be much more definitive than the Internet's guesswork).
Or do a deal with a facility or post-production supervisor to act as your technical guides... They have many years experience of this (which is why it is more expensive than dry-hiring equipment).
Thanks for the reply. I am going to shoot a few close-ups today and then record a test vocal. I'll add the voices, as well as some face movements. I'll do this on the R3D footage, and once complete, I'll render out a few options and check what's what. While we wait for our broadcast monitor, as Rick suggested, I think this is the easiest technical test I can do.
Thanks to everyone for all the replies. I guess I should have just taken the time to do these tests first, but I needed to get my mind around what exactly I was testing for - thanks to all for helping me clear this up.
I really hope that in the near future you can all see our project on TV, and see what it is you're helping me with!
I did some tests and here are the results:
1) A close-up of a character's eye on the Scarlet.
2) Pushed character from side to side
3) RedCineX to Editorial 4K version
4) Rendered out editorial through AE as Progressive (as is)
5) Rendered out editorial through AE as Interlaced (as is)
6) Added Liquify effect to both Progressive and interlaced output ( in its own comp) and re-rendered (to Progressive and Interlaced)
7) Rendered out Progressive comp as interlaced (after effects had been added)
I have uploaded the 5 files to Dropbox and am happy to say, since I have never had a use for Dropbox before, I have no idea how to use it!
Anyway, if anyone is interested in seeing the results and knows how to share Dropbox folders, let me know and I'll give you access.
To explain, though - progressive and interlaced renders with no effects look pretty much the same, as the movement was minimal. Liquify added to progressive looks great (as expected), and interlaced has the slight combing effect common to interlaced footage. This combing does seem to be a bit more pronounced when the effect is added to the progressive footage and THEN exported as interlaced.
So there you have it. I need to decide whether the effect is so much more pronounced as to cause viewing displeasure; if it is, I'll have to add a few steps to our workflow to add these effects to intermediate interlaced renders.
What fun - now where did I put that "Do everything in one click" button again....
I went one step further now and took some macro footage of the character's eye. Nice and big with the Scarlet. I took the footage to AE and rendered out editorials. I took a small section of the clip where the eye moves from left to right, and set the comp up. I then replaced the QuickTime footage with the original Red files, did some colour correction and rendered out new masters as QuickTime lossless animation at 1920x1080 - one as progressive, one as interlaced. I imported the renders and made two new comps from them, added Liquify and did some madness. I rendered them out again, doing the progressive one twice, once normal and once as interlaced.
What I am testing here, is the outcome of adding effects to Progressive footage, and then rendering out as interlaced VS adding the effects to footage already converted to interlaced.
There is a difference, but very very small. In fact, the interlaced "combing" effect seems to be slightly less pronounced when adding the effect to the progressive footage, and only THEN rendering an interlaced file.
Whew, this is fun, but at this point, before we double-check on a broadcast monitor, I am happy to say we can continue the entire project from sourcing to final step in progressive, and then render out an interlaced version for delivery. I am relatively confident that the effect should not be disturbing to view.
Oh yes, I can add these renders to Dropbox too if anyone wants to see...
Interlacing lessens the chance of judder but increases the chance of dancing edges, especially in type for things like credit rolls. If you render your footage progressive and then re-render it interlaced, you have pairs of identical fields, so it looks exactly the same as it would if it were not interlaced. You change nothing about the way the shot looks. If you render a composition with movement as interlaced, then you change the look of the motion.
I would choose one. Interlaced footage will look smoother than progressive with camera movement. Progressive will have more pronounced motion blur, but if you are filming stop motion then the motion blur is a moot point, because you'll either have to fake it with something like Force Motion Blur or live without it. The point is that taking already progressive footage and re-encoding it as interlaced will not change the look of the footage at all.
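The identical-field-pairs point can be sketched as a weave: if both fields of a frame are split from the same progressive image, weaving them back reproduces that image exactly; only when the two fields sample different instants does the motion (and the combing) change. A toy model with a 6-line "image" per instant (illustration only, names are mine):

```python
def weave(upper, lower):
    # Interlace two 3-line fields into one 6-line frame (upper field first).
    frame = []
    for up_line, low_line in zip(upper, lower):
        frame += [up_line, low_line]
    return frame

def image_at(t):
    # The scene at instant t: line values shift with time to mimic motion.
    return [t * 10 + line for line in range(6)]

# "Progressive re-rendered as interlaced": both fields split from ONE image.
prog = weave(image_at(0)[0::2], image_at(0)[1::2])
# prog == image_at(0): nothing about the picture has changed.

# True field-based render: the lower field samples a LATER instant, so the
# woven frame mixes two times -> visible combing wherever there is movement.
true_interlaced = weave(image_at(0)[0::2], image_at(1)[1::2])
```

So re-wrapping a finished progressive master as interlaced is lossless to the look; the cadence of the motion only changes if the fields are actually rendered at different instants.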
I understand that re-rendering progressive to interlaced would not change anything; what I did was add the Liquify effect before rendering. I made the character's eye swirl as it moved across the screen. The original footage was progressive; I immediately rendered a copy out as interlaced, added the same effect to both sets of footage, then re-rendered them, and of course rendered the progressive one out as interlaced with the newly added swirly eyes.
So there was a change to the footage layer before each render.
The whole conversation really came about because of the BBC tech specs sentence about adding DVEs and rollers etc. as interlaced. It kinda threw me a bit, but I think I have it now. I am starting to digest all the info you, Fuzzy and Trevor have given me. It is starting to make a bit more sense.
Once again, I am truly grateful to all of you who share your knowledge so freely. Once the show is a hit and we have bought Hollywood, I'll get the creative team to make puppets for each of you and do a tribute show! INTERLACED and PROGRESSIVE!!
Sorry to dredge up an old thread, but I was wondering: you mentioned SpeedGrade, and I have confirmed that I do have it as part of the suite we use. Still being very new to colour work, is there a preferred intermediate render output for colour work?
Our final delivery must be MXF OP1a, but that format is not compatible with SpeedGrade. Should I try an image sequence or a video wrapper instead? Is there even a difference, or is this also a "whatever fits your budget" type answer - in other words, file size being the decider?
1. As far as I understand, the BBC doesn't currently accept file delivery: they're still testing with selective acceptance and plan to make a special announcement about delivery by file. So, make sure you'll be allowed to deliver by file from the very beginning. Otherwise you'll have to start your delivery on HDCAM SR tape.
2. Currently you have to either save a look and apply it in AE, or render an intermediate of your choice out of SpeedGrade. With the Lumetri Deep Color Engine in the upcoming CC, the 'SpeedGrade <---> PrPro' workflow will definitely become easier.
As for the preferred intermediate codec, it's, as always, about finding the optimal 'quality / file size / render time' ratio.
Thanks for the info. I am aware about the file delivery to the BBC not being in use yet; I am hoping that they are ready according to their projected timeline, which should be early 2014. We will probably only be ready around then as well.
Since there seems to be no dynamic link between AE and SpeedGrade, I am going to test sending the project from Premiere to SpeedGrade and see what it does. Currently I start in Premiere, replace with an AE comp, and now will just go back to Premiere to do the colour work.
I'll let you know how it works out.
Since your AE comps are complex compositing work, you need to render an intermediate in either case, 'cos it's unwise to rely on Dynamic Link under these circumstances, and you can't send dynamically linked comps to SpeedGrade from either AE or PrPro. Hence, you have to render DIs.
I also humbly remind you about Color Finesse: it's a nice colour grading tool, and it may turn out that Color Finesse is all you currently need...
I have played around with Color Finesse, and as weird as it seems (even though SpeedGrade has so many more complex functions), I seem to be getting better results from SpeedGrade, quicker than from Color Finesse.
I will still play with both, as I have about three weeks before I need to have the initial colour workflow figured out.
Thanks again for your advice - it is truly appreciated.
I'll probably stick to an image sequence for colour correction - I just need to remind myself which format has the best colour bit depth (I can't remember offhand if it was TGA or PNG, but it's time to go home now, so I'll check in the morning!)
Thanks again, chat later