Graphics Card Selection for Premiere and After Effects (DirectX vs CUDA Cores)

Community Beginner, Sep 14, 2017

I'm in the process of buying a video card for my computer to aid in video editing, and I wanted to know whether Premiere Pro (PR) and After Effects (AE) support DirectX, or whether the card has to be a CUDA card to be supported. Please advise.

Views: 19.1K

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.

1 Correct answer
LEGEND, Sep 16, 2017 (full reply below)
Community Expert, Sep 14, 2017

Moved to Hardware forum.

Community Beginner, Sep 14, 2017

Thanks!

LEGEND, Sep 16, 2017

To support Windows you need DirectX, and both NVIDIA and AMD cards support DirectX.

To get GPU acceleration in Adobe Premiere, the card must support either CUDA or OpenCL. NVIDIA cards with CUDA perform better than AMD cards with OpenCL. I just received PPBM results from the new 16-core AMD Threadripper CPU system tested with one of the latest cards from each vendor. The GTX 1080 Ti was by far the best over the new AMD Radeon Vega Frontier Edition, and from the scoring on these tests I can tell you that you do not need a GTX 1080 Ti to equal the results of the Vega card; it could safely be done with a GTX 1070 or maybe even a GTX 1060 6GB card.

Explorer, Nov 28, 2017

Hey Bill, I frequently have to edit RED Epic-W 8K raw footage. I just upgraded to the 16-core Threadripper and am using a 1080 Ti. When I render, I generally have GPU-heavy effects like Magic Bullet Looks/Cosmo II, which appear to saturate the 1080 Ti (100% GPU usage during renders) while my CPU stays at 50% or under. This suggests my poor 1080 Ti can't keep up with the CPU, so I wanted to ask whether upgrading to the Radeon Vega Frontier Edition (because it has 16GB of VRAM) would help in this case.
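For readings like these, `nvidia-smi` (installed with the NVIDIA driver) can log GPU utilization and VRAM use while a render runs. A minimal sketch; the sample output line at the bottom is an illustrative assumption:

```python
import subprocess

def parse_utilization(csv_line):
    """Parse one line of `nvidia-smi --query-gpu=... --format=csv,noheader`
    output, e.g. "100 %, 9216 MiB" -> (100, 9216)."""
    util, mem = (field.strip() for field in csv_line.split(","))
    return int(util.rstrip(" %")), int(mem.rstrip(" MiB"))

def sample_gpu():
    """Query the first GPU's compute utilization and VRAM in use.
    Call this once a second during a render to see whether the GPU,
    rather than the CPU, is the saturated resource."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        text=True)
    return parse_utilization(out.splitlines()[0])

# Parsing a captured sample line (no GPU needed for this part):
print(parse_utilization("100 %, 9216 MiB"))  # (100, 9216)
```

Watching both this and Task Manager's CPU graph during the same render makes the bottleneck question concrete rather than a guess.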

LEGEND, Nov 28, 2017

That Radeon Vega Frontier Edition won't cut it because of Adobe's historically poorer performance in OpenCL versus CUDA: The GPU utilization will still be pegged to 100%, while the CPU utilization will be no higher than what it currently is with the GTX 1080 Ti. Also, Adobe's CUDA renderer supports a few features that are not currently supported in the OpenCL renderer.

As a result, your situation is one of the very few that absolutely needs at least a second GTX 1080 Ti (though not necessarily SLI-linked).

If, on the other hand, the GPU memory (VRAM) usage is pegged at 100% and then suddenly crashes down to its normal idle percentage in the middle of a GPU-accelerated render (at which point the MPE renderer silently switches from GPU-accelerated mode to software-only mode, with no notification whatsoever), then a card with more VRAM might help.
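Whether VRAM is actually the limit can be roughed out: Premiere's GPU renderer works on frames in 32-bit-float RGBA, roughly 16 bytes per pixel, so per-frame footprints add up quickly at 8K. A back-of-envelope sketch, with the frame sizes assumed for illustration:

```python
def frame_bytes(width, height, bytes_per_pixel=16):
    """Rough VRAM footprint of one frame buffer; Premiere's GPU
    renderer works in 32-bit-float RGBA, i.e. 16 bytes per pixel."""
    return width * height * bytes_per_pixel

# Frame sizes assumed for illustration; check your actual footage:
red_8k = frame_bytes(8192, 4320)
uhd_4k = frame_bytes(3840, 2160)
print(f"8K frame: {red_8k / 2**20:.0f} MiB")   # 540 MiB
print(f"4K frame: {uhd_4k / 2**20:.0f} MiB")   # 127 MiB
```

A render typically keeps several frames plus effect intermediates resident at once, which is why 8K timelines can exhaust even an 11GB card while 4K work rarely does.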

Engaged, Feb 09, 2018

Does Premiere's CUDA acceleration make use of multiple graphics cards?

LEGEND, Feb 09, 2018

Yes. Look at the GPU page of my Premiere Pro BenchMark (PPBM) website for dual-GPU card scoring. Both cards should run on the same driver.

Explorer, Aug 29, 2018

Bill, I read your benchmark webpage and liked your conclusions. It's been quite a few months since the last update, and the Bitcoin craziness seems to have cooled down. What would your latest conclusions be? I'm planning to go as inexpensive as possible. What would you say about:

AMD 1700 (maybe X if price is close enough)

1070 6GB

EVO 960 m.2 256GB boot, software

EVO 960 sata 1TB file

Do you see a big difference between 16 and 32GB of memory?

LEGEND, Feb 09, 2018

Yes, if all of the GPUs used in the CUDA MPE acceleration are of the same GPU generation. Identical GPUs are preferred.

That's one thing. If you use multiple GPUs, you may run into another problem: The CPU. Depending on its performance level, it may not allow the GPU to become anywhere close to fully utilized.
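The CPU ceiling can be put in numbers with Amdahl's law: if only a fraction p of the render work is GPU-accelerated, no number of GPUs can push the overall speedup past 1/(1 − p). A sketch, where the 90% GPU fraction is an illustrative assumption, not a measured figure:

```python
def amdahl_speedup(p, n):
    """Overall speedup when a fraction p of the work is accelerated
    by a factor n (e.g. n GPUs); the rest stays serial on the CPU."""
    return 1.0 / ((1.0 - p) + p / n)

# Illustrative: assume 90% of the render is GPU-acceleratable work.
one_gpu = amdahl_speedup(0.9, 10)    # one GPU, ~10x on its share
two_gpus = amdahl_speedup(0.9, 20)   # doubling GPU throughput
print(f"one GPU:  {one_gpu:.1f}x")                    # 5.3x
print(f"two GPUs: {two_gpus:.1f}x (cap: {1/(1-0.9):.0f}x)")  # 6.9x (cap: 10x)
```

Doubling the GPU hardware buys well under double the throughput once the CPU-bound share dominates, which is exactly the bottleneck described above.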

Explorer, Feb 10, 2018

It seems that for most users we have reached the point where an increase in GPU specs does not necessarily translate into rendering performance. Does anyone know of a comprehensive list of cards that shows, relative to each other, the expected increase, so one can make an educated choice on a card that fits their budget?

The only list I have found so far is a CUDA-core count list, but by the sounds of it that might not be the best indicator:

Nvidia GPUs sorted by CUDA cores · GitHub

Explorer, Feb 10, 2018

Here is a more comprehensive list.

Looks like most of the 780 Ti, 980 Ti, etc. have good bandwidth, speed, and CUDA scores:

https://www.studio1productions.com/Articles/NVidia-GPU-Chart.htm
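The bandwidth column in charts like that one follows directly from two published specs: bandwidth (GB/s) = effective per-pin data rate (Gbps) × bus width (bits) / 8. A sketch, with the card specs taken from commonly published numbers as assumptions:

```python
def mem_bandwidth_gbps(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s from the effective per-pin
    data rate (Gbps) and the memory bus width (bits)."""
    return data_rate_gbps * bus_width_bits / 8

# Commonly published specs, assumed here for illustration:
gtx_1080_ti = mem_bandwidth_gbps(11.0, 352)  # GDDR5X, 352-bit bus
gtx_1060 = mem_bandwidth_gbps(8.0, 192)      # GDDR5, 192-bit bus
print(f"GTX 1080 Ti: {gtx_1080_ti:.0f} GB/s")  # 484 GB/s
print(f"GTX 1060:    {gtx_1060:.0f} GB/s")     # 192 GB/s
```

This is why two cards with similar CUDA-core counts can land far apart in the chart: the bus width and memory type move the bandwidth figure by a factor of two or more.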

LEGEND, Feb 10, 2018

CUDA cores are not the only thing; memory clock and memory bandwidth are also significant, as you can see from my GPU testing using Premiere Pro.

It does not help, and certainly adds a lot of confusion, that different vendors and different programs have at least three different ways of specifying memory clock speed.
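For GDDR5, those three conventions are usually the command clock, the double-rate write clock, and the quadruple-rate effective data rate, so the same memory can be quoted three different ways. A sketch, using the GTX 1060's commonly published "8 Gbps" GDDR5 as an assumed example:

```python
def gddr5_clocks(command_clock_mhz):
    """The same GDDR5 memory, quoted three ways: the command clock,
    the write clock (2x), and the effective data rate (4x)."""
    return {
        "command clock (MHz)": command_clock_mhz,
        "write clock (MHz)": command_clock_mhz * 2,
        "effective data rate (MT/s)": command_clock_mhz * 4,
    }

# GTX 1060's GDDR5, commonly quoted as "8 Gbps" (assumed spec):
print(gddr5_clocks(2002))
```

One monitoring tool reporting "2002 MHz" and a spec sheet saying "8008 MHz effective" are describing the same memory, which is the source of much of the confusion.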

Explorer, Feb 15, 2018

After some research (I may be getting this wrong), it seems to me that sometimes a higher clock speed can negate a difference in bandwidth and CUDA cores. A classic example is the 780 Ti versus the 1060 SC OC: the 780 Ti has many more CUDA cores and higher bandwidth than the 1060, but the 1060 SC OC runs at a much higher clock speed.
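The cores-versus-clock tradeoff can be roughed out as peak FP32 throughput: cores × boost clock × 2 FLOPs per clock (one fused multiply-add). A sketch, with boost clocks taken from commonly published specs as assumptions; note that raw TFLOPS alone do not predict Premiere performance, which also depends on architecture efficiency and memory bandwidth:

```python
def peak_tflops(cuda_cores, boost_clock_ghz):
    """Rough peak FP32 throughput: each CUDA core retires one
    fused multiply-add (2 floating-point ops) per clock."""
    return cuda_cores * boost_clock_ghz * 2 / 1000

# Commonly published specs, assumed here for illustration:
gtx_780_ti = peak_tflops(2880, 0.928)  # many cores, lower clock
gtx_1060 = peak_tflops(1280, 1.708)    # fewer cores, higher clock
print(f"GTX 780 Ti: {gtx_780_ti:.2f} TFLOPS")  # 5.35
print(f"GTX 1060:   {gtx_1060:.2f} TFLOPS")    # 4.37
```

On this crude metric the 780 Ti still comes out ahead, so where the 1060 wins in practice it is down to the newer architecture doing more useful work per theoretical FLOP, not the clock speed alone.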

I would love to see how the 1070 and 1070ti perform.

In terms of price to performance, it looks like the 1060 SC OC and 780 Ti hit the sweet spot, unless there are other cards you feel would be even better?

LEGEND, Feb 15, 2018

I cannot afford any new cards.  As a matter of fact I am getting rid of some.

When you look at the small percentage improvement: on my test, CPU-only is ~250 seconds, and the same timeline with GPU acceleration runs in the 13 to 24 second range. You are only talking about an improvement of (250 − 13)/250 ≈ 94.8% versus (250 − 24)/250 = 90.4%. So by spending 2 to 4 times as much, you get about a 5% performance improvement, and only if you are using a lot of GPU-accelerated effects and features.
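Restated, those improvement figures are the share of the CPU-only render time that the GPU eliminates. A quick sketch of the same arithmetic:

```python
def time_saved_pct(baseline_s, accelerated_s):
    """Percentage of the baseline render time eliminated."""
    return (baseline_s - accelerated_s) / baseline_s * 100

cpu_only = 250.0          # seconds, CPU-only render (PPBM test)
fast_gpu, slow_gpu = 13.0, 24.0
print(f"fast GPU saves {time_saved_pct(cpu_only, fast_gpu):.1f}%")  # 94.8%
print(f"slow GPU saves {time_saved_pct(cpu_only, slow_gpu):.1f}%")  # 90.4%
# As raw speedups that is 250/13 ≈ 19x versus 250/24 ≈ 10x, yet the
# two cards are only ~4.4 percentage points apart in time saved.
```

The diminishing returns are built into the metric: once most of the baseline time is already gone, a much faster card can only shave the sliver that remains.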

My PPBM testing covers a very wide range of NVIDIA architectures with many different memory types. You cannot compare memory speeds between cards; that specification was put in to show the amount of overclocking headroom available. For instance, you cannot CUDA overclock the GTX 900 series.

Explorer, Feb 15, 2018

I usually shoot video in 2K or 4K and downscale to 2K or 1080p with colour correction etc., so my card is used all the time.
The CPU usage hovers around 60%–70% with the GPU usage at 90%–98%.

I am currently looking to upgrade from the 750ti and searching for the best path.

Also, Bill, thank you very much for all the work you do here and all the testing. I really appreciate it, and I'm sure lots of others do too.

LEGEND, Feb 15, 2018

Samtastico,

Your GPU could use some improvement, judging by the CPU versus GPU utilisation results. Unfortunately, the GPU that would give a worthwhile improvement over your current GTX 750 Ti, the GTX 1060 6GB, now costs at least double what it should due to the current cryptocurrency mining craze. I was fortunate enough to purchase one back in spring 2017 for $250 USD plus tax; most places now charge $500 to $600 USD for one. And performance-wise, any of the "affordable" current GPUs would be a substantial downgrade from your GTX 750 Ti, while the GTX 1050 (the current GPU closest in performance to the GTX 750 Ti) would, in your situation, cost way too much money for a marginal performance increase at best.

Hence, I am recommending that you stick with your current GTX 750 Ti until the prices come down to sanity levels.
