I have a GTX 980M GPU. I'm working with 10bit 4K footage and need to export a 10bit master of the completed edit.
A colleague tells me that this GPU can only display 8bit, therefore it can't export 10bit in Premiere using CUDA, and that I can only export 10bit by choosing 'Software Only'.
I told him that if the 980M is 8bit (I'm not disputing this), that would only affect the monitoring bit depth and would in no way limit the ability to export 10bit.
My understanding is that the GPU has no impact on image quality when exporting, it only adds more processing power by teaming up with the main processor to achieve a faster export.
So which of us is correct?
Cheers.
There is a BIG difference between the video output a GPU sends to a display via its device connection and the processing the card does alongside the CPU ... and in PrPro, all color calculations are done in a 32-bit floating-point processing chain.
Period.
So ... you're good to go with that GPU.
Neil
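Neil's point can be sketched numerically. The snippet below is purely illustrative (it is not Premiere Pro's actual internals): it shows why 10-bit code values survive a round trip through a 32-bit float pipeline, while forcing an 8-bit intermediate collapses them.

```python
# Sketch: why a 32-bit float processing chain preserves 10-bit code values.
# Illustrative only -- not Premiere Pro's actual implementation.
import numpy as np

# A ramp of all 1024 possible 10-bit code values, normalized to 0..1.
codes_10bit = np.arange(1024)
normalized = codes_10bit.astype(np.float32) / 1023.0   # 32-bit float pipeline

# Apply a typical color operation (a 0.9x exposure trim) and undo it.
processed = (normalized * 0.9) / 0.9

# Requantize back to 10 bits: every original code value survives.
back_to_10bit = np.round(processed * 1023.0).astype(np.int64)
print(np.array_equal(back_to_10bit, codes_10bit))   # True

# By contrast, forcing an 8-bit intermediate collapses 1024 levels to 256.
as_8bit = np.round(normalized * 255.0) / 255.0
back_from_8bit = np.round(as_8bit * 1023.0).astype(np.int64)
print(len(np.unique(back_from_8bit)))               # 256 distinct values remain
```

The monitoring path (what the GPU sends to the screen) can be 8-bit without affecting this: quantization for display happens after the float math, on a separate branch from the export.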
Thanks Neil.
if the 980M is 8bit, that would only affect the monitoring bit depth and would in no way limit the ability to export 10bit.
That is correct.
My understanding is that the GPU has no impact on image quality when exporting,
That is...not quite correct. GPU processing always uses the highest-quality processing available. That's not always true in Software Only mode.
Thanks Jim.
Could you elaborate on "GPU processing uses the highest quality processing possible"? I'm not sure what "highest quality processing" actually means. Are you talking about encoding?
"That's not always true in Software mode."
So are you saying that Software Mode could potentially be superior or inferior to CUDA in reference to encode quality?
If you're actually referring to processing quality in terms of speed/efficiency, that would be a different conversation to the one about encode quality, no?
Forgive my confusion.
Cheers.
The GPU uses the best scaling algorithms and 32-bit floating-point color at all times. Software-only results may or may not be as good, depending on your project and export settings.
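The scaling half of that claim can be sketched too. The toy example below (again illustrative, not Premiere's code) compares nearest-neighbour sampling with a simple box (area-average) filter on a fine stripe pattern, showing how a cheap scaler can alias away detail that a better one preserves:

```python
# Sketch: why the choice of scaling algorithm matters for quality.
# Nearest-neighbour vs. box (area-average) downscaling -- illustrative only.
import numpy as np

# A 1-D "image": alternating black/white stripes, one sample wide.
src = np.tile([0.0, 1.0], 512)       # 1024 samples, mean brightness 0.5

factor = 4

# Nearest neighbour: keep every 4th sample. With stripes this fine it
# lands on only the black samples -- the pattern aliases to solid black.
nearest = src[::factor]

# Box filter: average each group of 4 samples before decimating.
box = src.reshape(-1, factor).mean(axis=1)

print(nearest.mean())   # 0.0 -- detail and brightness destroyed
print(box.mean())       # 0.5 -- average brightness preserved
```

A 4K-to-1080p export does exactly this kind of resampling in two dimensions, which is why the render path's scaler choice shows up in the final encode.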
CUDA, OpenCL, Mercury Playback Engine, and Adobe Premiere Pro | Creative Cloud blog by Adobe
Ahh, I get what you're saying now.
Thanks.