
Major quality disparity between in-project & export

Explorer, Feb 20, 2017


Hey there,

I'm currently editing a music video and exporting some teasers for it (hopefully I can get an answer before I have to export the final thing), and I've noticed a major difference in quality between what I see in the Program Monitor in-project and what I'm getting in the export.

At first, I exported a very low quality version (reduced target bitrate) to send over the phone, and now I went to export a high-quality version. This is where I'm getting stuck.

At first I thought it was a matter of checking off "render at maximum depth" and "use maximum render quality" so I did that, raised the target bitrate, and exported. (.MP4). I was surprised to see I was getting a really nasty image in the beginning (and throughout) when the video fades in from black. I tried exporting another version, this time as an .MOV, checking off all the settings I thought were necessary for higher quality - kept getting the same quality result. I decided to go drastic - I went for the "Uncompressed YUV 8-bit" option which yielded a 5 GB file size, but with an equally bad quality video that looked just about the same as the very first low-quality export.

The issue here seems to be banding in the shadows. In the project I'm seeing perfect, smooth gradients, but every export gives me nasty banding. I don't want a "fix banding" workaround, because in the project there is no banding for me to fix.
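To see why a fade from black bands so easily, here's a minimal pure-Python sketch (nothing Premiere-specific; the 0.0-0.05 range is just an illustrative "very dark gradient") of how few code values an 8-bit file can spend on near-black shadows compared with 10-bit:

```python
# Sketch: why smooth shadow gradients band after 8-bit quantization.
# A ramp from 0.00 to 0.05 (very dark, like a fade from black) only
# has a handful of representable 8-bit levels, so a smooth gradient
# collapses into visible steps ("bands").

def quantize(value, bits):
    """Round a 0.0-1.0 value to the nearest representable code value."""
    levels = (1 << bits) - 1
    return round(value * levels)

def distinct_levels(bits, lo=0.0, hi=0.05, samples=1000):
    """Count how many distinct code values a dark ramp survives as."""
    return len({quantize(lo + (hi - lo) * i / samples, bits)
                for i in range(samples + 1)})

print(distinct_levels(8))   # 14 distinct steps across the whole fade
print(distinct_levels(10))  # 52 distinct steps: far smoother
```

The same smooth gradient that the program renders internally at high precision gets crushed onto a few output levels once it's written to an 8-bit file, which is where the stair-stepping comes from.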

I'm really confused here and it's becoming really frustrating. How do I fix this?

For reference, I still don't know much about transcoding (I just drag my footage into my bins in Premiere and start editing), so maybe that has something to do with it? Although I've never had this issue before. Most of the video was shot with a 5D Mark IV, along with a Mark III, at 25 fps. I can provide any other information that might be necessary.

I've attached a photo comparison of what I'm seeing in the project and the export (in this case it's the 5 GB uncompressed one). It's a bit hard to tell in the still screen grabs, and it's a lot more pronounced in motion, but you can still see what I mean.

quality-disparity.jpg

This was mid-intro fade, which is why it's so dark. You can also notice it on the column above the left circle, between the windows.

991 views

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.

1 Correct answer

Explorer, Feb 23, 2017 (quoted in full below)
LEGEND, Feb 20, 2017

You should not rely on the built-in Program Monitor to judge the signal. That is not its purpose; it's there for viewing content only.

To properly judge the look of the video, you need to get it off the computer and onto a calibrated TV from a hardware device.  You can do this using a proper I/O device from AJA or Blackmagic, or you can do this via exporting to BD or thumb drive and taking it to a TV.

Having said that, if I recall correctly there is a banding issue that even working in 10 bit won't solve.  Adobe is aware of the issue and working on a fix.

LEGEND, Feb 20, 2017

To properly judge the look of the video, you need to get it off the computer and onto a calibrated TV from a hardware device.  You can do this using a proper I/O device from AJA or Blackmagic, or you can do this via exporting to BD or thumb drive and taking it to a TV.

"Properly"? Bad advice, and totally incorrect. Not sure why you persist with this line, Jim!

true tone echo.jpg

1. Do not use a TV set.

2. Do not export files for evaluation of chroma, gamma, or luma (grading, CC, or looks). Why would one not use the media closer to the source, at the original resolution and compression?

3. Get a decent monitor (or monitors) and set them up with your computer. Your computer and card have setups and can be calibrated. TVs don't and can't.

4. An I/O device is optional if you have additional reference monitors or broadcast-quality monitors and need the connections. SDI, for example.

Monitors...monitors...monitors...NO TVs!

steve_workstation.jpg

LEGEND, Feb 20, 2017

Don't know who the pic is of ... but using the Tangent Elements panel is sure an improvement in Lumetri ... let alone Resolve, as he's doing.

Neil

LEGEND, Feb 20, 2017

1. Not everyone has the wherewithal to set up the ideal scene for this.  If all you have is a TV set in the other room, you're better off calibrating that and viewing an exported file than you are trying to use the Program Monitor.

2.  Computer monitors that can be properly calibrated aren't inexpensive.  They're not always an option.  But even when they are, you still have the OS, GPU and software to worry about, so I don't consider this a viable option.

3. I'd argue the I/O card is better thought of as required for a "proper" setup. It guarantees the OS, GPU driver, and even software will not interfere with the signal, thus 'coloring' the look of the video. But again, this option is not always feasible.

It's a matter of degree, here.  From best to worst I'd rank the options as:

1. I/O card connected to a broadcast monitor.

2. I/O card connected to an OLED or plasma TV.

3. I/O card connected to a computer monitor capable of proper calibration.

4. I/O card connected to an LED or LCD TV.

5. Calibrated TV viewing an exported file.

LEGEND, Feb 20, 2017

If all you have is a TV set in the other room, you're better off calibrating that

How?

2.  Computer monitors that can be properly calibrated aren't inexpensive.

Not so.

They're not always an option.

Why?

It is the basic tool of a digital editor. Without a monitor, the software cannot be worked with. Video is a visual medium, after all: shot, edited, mastered, and broadcast all with the same digital building blocks.

But even when they are, you still have the OS, GPU and software to worry about, so I don't consider this a viable option.

Well, it's viable to every other FX, compositing, editing, and color suite, as well as every broadcast QC operation, I know of.

Guess we are all wrong... except it's a proven post-production workflow in a video pipeline that has checks and balances.

LEGEND, Feb 21, 2017

I use the Disney WOW disc for calibration. Before that, I used the Spears & Munsil disc. They both offer good test patterns as well as instructions on how to use them for TV calibration.

Good computer monitors that will properly calibrate and are decently accurate aren't generally the cheapest options out there. Allowing for the likelihood that a good many of the folks posting here in the forums aren't using such a monitor, I offer the advice not to use them for the task of quality control.

The guys you know really don't use any I/O devices for monitoring?  They all use the GPU only?

LEGEND, Feb 21, 2017

The guys you know really don't use any I/O devices for monitoring?  They all use the GPU only?

Everyone I know uses both. None use TVs, apart from maybe client-space preview (playback).

LEGEND, Feb 21, 2017

Everyone I know uses both.

I'm confused by that.  If a system has an I/O card, why would one not use it?  Why disconnect the external monitor from the I/O card and reconnect it to the GPU?

LEGEND, Feb 21, 2017

I'm confused by that.  If a system has an I/O card, why would one not use it?  Why disconnect the external monitor from the I/O card and reconnect it to the GPU?

In my case, I have two monitors on the GPU card and a BC (broadcast) monitor on the BM Intensity I/O.

I generally only run the BC monitor for CC/grade and QC, because it has its own luma waveform monitor function.

All monitors calibrated and matching. All in sync.

Very standard, uncomplicated setup.

LEGEND, Feb 22, 2017

I generally only run the BC Monitor for CC / Grade

And that's precisely what I'm suggesting here.  Soooo...where's the disagreement?

LEGEND, Feb 22, 2017

I can confidently fully grade and QC a project without the BC monitor if I wish to, because all my monitors match.

The reason for the BC monitor is it can confirm luma levels because it has its own luma waveform internally as a function.

Soooo...where's the disagreement?

My disagreement with you is that you regularly advise "do not trust the computer monitors because they are not intended for quality checking". (words to that effect)

The extension of what you say is that it is not possible to use them for grading etc and users "should be exporting a file to check their work" on a TV.

LEGEND, Feb 22, 2017

It's not perfect advice.  And for those like yourself who know what they're doing and have the right equipment, it's safely ignored.

But if you haven't noticed lately, the forums aren't filled with as many folks like you as they once were.

On the other hand, following my advice will not be detrimental in any way, and it likely will solve issues.

It's a bit like suggesting to use the Media Browser for all importing.  There are times when importing works just fine without Media browser, but there are also times when it won't.  Using the Media Browser, it will always work.

Similarly, if you remove the OS, GPU driver and software from the equation and watch on a calibrated monitor/TV, you will always see the signal as accurately as possible.

Explorer, Feb 23, 2017

Thank you shooternz (and everyone else who's responded).

I actually use a 27" 5k iMac with a second 1920x1080 Benq Monitor (not intended for anything visual, just media management & such). Would never use a TV - very bad for just about everything

I feel kind of stupid for having asked this question, because I sort of realized what the problem was (or at least part of it). I hadn't realized that my sequence was 1920x1080 and all the clips inside were 720p, so naturally the quality was going to be bad regardless. And the Lumetri color effects on top of that were creating the banding because of the low resolution.
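For what it's worth, that interaction makes sense numerically: once footage is quantized to 8-bit, a strong shadow lift (like a Lumetri exposure or curve adjustment) spreads adjacent code values further apart, turning invisible one-step transitions into visible bands. A tiny pure-Python sketch (the gain value is made up purely for illustration):

```python
# Hypothetical sketch: a shadow lift applied AFTER 8-bit quantization
# widens the gaps between neighbouring code values. Steps of 1 (invisible)
# become steps of 3 (potentially visible banding).

def lift_shadows(code, gain=3.0):
    """Apply a simple linear gain to an 8-bit code value, clipped to 255."""
    return min(255, round(code * gain))

dark_steps = [0, 1, 2, 3, 4]                  # adjacent 8-bit shadow levels
print([lift_shadows(c) for c in dark_steps])  # [0, 3, 6, 9, 12]: gaps of 3
```

Grading before the footage is quantized down (or working at higher bit depth) avoids this, because the lift happens while the in-between values still exist.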

LEGEND, Feb 23, 2017

oh ... yea, that sure explains the situation. Ouch.

Been there ...

Neil

LEGEND, Feb 20, 2017

Jim may well have answered this ...

As to Max Bit Depth/Max Render quality options ... normally, those are of no use.

Here's a bit on Max Bit Depth ...

You may benefit from this option in the following situations:

  • Your source media has a higher bit depth than the format you are outputting to
  • Your sequence contains heavy compositing or lots of layered effects (particularly 32-bit color effects)
  • Your sequence contains very high contrast or very low contrast images (for example subtle gradients)

And on Max Render Quality ...

This is a high-quality resize operation that should be used when outputting to a different frame size from your sequence. It can reduce aliasing (jagged edges) when resizing images but is of no use when outputting to the same frame size. This operation significantly increases render times so only use it when resizing.
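To illustrate the resize point (this is a generic sketch, not Premiere's actual resampler): a nearest-neighbour resize just duplicates or drops samples, so hard transitions stay hard and edges go jagged, while even simple linear interpolation creates the in-between values that soften edges when scaling between frame sizes.

```python
# Generic 1-D illustration of cheap vs. higher-quality resampling.
# (Not the algorithm Premiere uses; Max Render Quality's resampler
# is more sophisticated, but the principle is the same.)

def nearest(src, n):
    """Nearest-neighbour resample of a 1-D signal to n samples."""
    m = len(src)
    return [src[min(m - 1, int(i * m / n))] for i in range(n)]

def bilinear(src, n):
    """Linear-interpolation resample of a 1-D signal to n samples."""
    m = len(src)
    out = []
    for i in range(n):
        pos = i * (m - 1) / (n - 1)        # position in source coordinates
        lo = int(pos)
        hi = min(m - 1, lo + 1)
        t = pos - lo                        # blend factor between neighbours
        out.append(src[lo] * (1 - t) + src[hi] * t)
    return out

edge = [0, 0, 255, 255]      # a hard black-to-white edge
print(nearest(edge, 8))      # duplicated samples only: still a hard step
print(bilinear(edge, 8))     # intermediate values smooth the transition
```

That smoothing is the "reduce aliasing" effect described above, and it's also why the option buys nothing when input and output frame sizes already match: with no resampling, there are no new sample positions to interpolate.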

These, and most other export choices, are explained on this page better than almost anywhere else I've seen ...

Understanding render options in Adobe Premiere Pro - Blog - Digital Rebellion

Neil
