
Enable CUDA?

Community Beginner, May 02, 2010

Found this on the cinema5d.com forum:

How to make Premiere CS5 work with GTX 295

Post by marvguitar on 01 May 2010 22:38

I figured out how to activate CUDA acceleration without a GTX 285 or Quadro... I'm pretty sure it should work with other 200-series GPUs. Note that I'm using 2 monitors, and there's an extra tweak needed to use CUDA seamlessly with 2 monitors.

Here are the steps:

Step 1. Go to the Premiere CS5 installation folder.
Step 2. Find the file "GPUSniffer.exe" and run it in a command prompt (cmd.exe). You should see something like this:
-----------------------------------------------------------------------------------------------------------------------------------
Device: 00000000001D4208 has video RAM(MB): 896
Device: 00000000001D4208 has video RAM(MB): 896
Vendor string: NVIDIA Corporation
Renderer string: GeForce GTX 295/PCI/SSE2
Version string: 3.0.0

OpenGL version as determined by Extensionator...
OpenGL Version 2.0
Supports shaders!
Supports BGRA -> BGRA Shader
Supports VUYA Shader -> BGRA
Supports UYVY/YUYV ->BGRA Shader
Supports YUV 4:2:0 -> BGRA Shader
Testing for CUDA support...
Found 2 devices supporting CUDA.
CUDA Device # 0 properties -
CUDA device details:
Name: GeForce GTX 295 Compute capability: 1.3
Total Video Memory: 877MB
CUDA Device # 1 properties -
CUDA device details:
Name: GeForce GTX 295 Compute capability: 1.3
Total Video Memory: 877MB
CUDA Device # 0 not choosen because it did not match the named list of cards
Completed shader test!
Internal return value: 7
---------------------------------------------------------------------------------------------------------------------------------------

Near the end it says the CUDA device was not chosen because it's not in the named list of cards. That's fine. Let's add it.

Step 3. Find the file "cuda_supported_cards.txt" and edit it to add your card (take the name from the line "Name: GeForce GTX 295 Compute capability: 1.3").

So in my case the name to add is: GeForce GTX 295

Step 4. Save that file and we're almost ready.

Step 5. Go to your Nvidia driver control panel (I'm using the latest, 197.45). Under "Manage 3D Settings", click "Add", browse to your Premiere CS5 install directory, and select the executable "Adobe Premiere Pro.exe".

Step 6. In the field "multi-display/mixed-GPU acceleration", switch from "multiple display performance mode" to "compatibility performance mode".

Step 7. That's it. Start Premiere, go to Project Settings / General, and activate CUDA.
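Steps 2 and 3 above can be sketched as a short script. This is a rough illustration only, not anything from Adobe: the regex, the idea of automating the edit, and the assumption that the name is exactly what GPUSniffer prints are all mine, and cuda_supported_cards.txt sits in a Program Files directory, so you may need admin rights to modify it.

```python
import re

def extract_card_name(sniffer_output: str) -> str:
    """Pull the GPU name out of GPUSniffer's CUDA device details."""
    match = re.search(r"Name:\s*(.+?)\s+Compute capability", sniffer_output)
    if not match:
        raise ValueError("no CUDA device name found in GPUSniffer output")
    return match.group(1)

def add_card(card_name: str, supported_cards_path: str) -> bool:
    """Append card_name to cuda_supported_cards.txt if it is not
    already listed. Returns True if the file was modified."""
    with open(supported_cards_path, "r+", encoding="ascii") as f:
        cards = [line.strip() for line in f if line.strip()]
        if card_name in cards:
            return False
        f.write(card_name + "\n")
        return True

# Example using the output quoted in this post:
sample = "CUDA device details:\nName: GeForce GTX 295 Compute capability: 1.3\n"
print(extract_card_name(sample))  # GeForce GTX 295
```

The exact-match detail matters: as a later reply notes, whatever name GPUSniffer prints is the string that must appear in the text file, spaces and all.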

Hope this helps

Views: 146.9K

Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.

1 Correct answer

Community Beginner, May 03, 2010

Note that this will only work with cards that have 765MB or more of RAM.


Enthusiast, May 03, 2010


Well, that didn't take long.


Community Beginner, May 03, 2010


OK -- now that the GTX 295 works -- can you make the GTX 285 do real time on more than 3 layers?

Community Beginner, May 03, 2010


Wow! Talk about cracking the code.

Many people will be arriving on your doorstep to hug & kiss you for this. (What's your address?)

- Is there a special way to write the name of the card when you do this? (i.e. GTX-295 vs GTX295 vs GTX 295... my cards are Quadro FX 1800)

- Have you noticed any performance boost in doing this with a non-supported card?

Explorer, May 03, 2010


sync2rhythm wrote:

Wow! Talking about Cracking the Code.

Many people will be arriving on your doorstep to hug & kiss you for  this. (what's your address?)

- Is there a special way to write the name of the Card when you do this? (ie: GTX-295 vs GTX295 vs GTX 295... my cards are Quadro FX1800)

- Have you noticed any performance boost in doing this with a non-supported card?

Whatever name GPUSniffer.exe shows for the card, you must enter exactly that name.

I have a GTX 275, and when I monitor CPU and GPU use with an independent real-time program (Everest), CPU usage goes down during previewing with CUDA enabled. Keep in mind also that the MPE does little to nothing to improve decoding and encoding speeds, just rendering. The specific codecs that are accessed control those critical factors.

On my quad-core, RAID 0 setup, having CUDA enabled usually makes the difference between stuttering and real-time playback, but not for every timeline. Whether I want to risk doing so on an important project, though, is another issue.

LEGEND, May 03, 2010


When someone tries this successfully on the Fermi cards, please post here! As the GTX 285 and 295 are now obsolete, I hate buying even a used card, and I will never buy another Quadro card.

New Here, May 03, 2010


I posted yesterday that the GTX 470 seems to work. The jury is still out on stability and functionality, though. Take a look at my other posts for details.

UPDATE: There are rendering errors with the Fermi / GTX 400 series from what I understand. Sorry guys... not ready for prime time, not production ready yet.

Community Beginner, May 04, 2010


cts51911 wrote:

I posted yesterday that GTX 470 seems to work.  Jury is still out on the stability and functionality though.  Take a look at my other posts for details.

UPDATE:  There are rendering errors with the Fermi / GTX 400 series from what I understand.  Sorry guys...Not ready for prime time yet.  Not production ready yet.

Even if so, keep it on for editing and turn it off for final rendering, no? It might still be a huge help even as-is.

Maybe it is just a bug in the current Nvidia Fermi driver and will be fixed in the next driver release?

Yeah, it does seem fairly disappointing that the hardware still seems to be untouched for codec decoding; video-card decoding assistance makes a ridiculous impact. It might make it harder to also use CUDA, but I'd almost bet, at least for h.264 video, that shifting decoding to the hardware and doing the CUDA work back on the CPU would be faster on average. I also wonder whether it might be possible to use both at once, at the very least with some buffering tricks, but I'm getting into wild speculation now. Straight h.264 still drives the CPUs like wild. At least the Mercury engine is now just barely fast enough in software to handle it on a reasonably fast machine, so whatever the actual story, it is quite a step up from CS4, and on a reasonably fast machine it will do pretty well unless you go crazy with it.

Community Beginner, May 04, 2010


cts51911 wrote:

I posted yesterday that GTX 470 seems to work.  Jury is still out on the stability and functionality though.  Take a look at my other posts for details.

UPDATE:  There are rendering errors with the Fermi / GTX 400 series from what I understand.  Sorry guys...Not ready for prime time yet.  Not production ready yet.


Can you explain more about the rendering errors? Were they hangs/crashes or image-quality problems?

Did you experience these yourself?

New Here, May 04, 2010


My experience, so far, with the GPU hack + GTX 470...

First, I have to say up front that nothing I say here is scientific or precise. I've had random moments throughout the last few days when I had time to get on the computer and see if something works.

I've been doing random tests to see if I'm seeing any benefit at all... that has been my main focus. In other words, does it even work? As of right now, I'm sure I'm seeing GPU acceleration. There is a distinct difference between MPE software-only and GPU modes.

Regarding the errors, I have seen some strange pixelization and image shift in the preview window with the basic 3D plugin. I have not done enough to draw any conclusions. I can't even speak to final rendering quality, since I have not done much.

Bottom line: Fermi requires updates (point releases) to work properly. It is a totally different GPU than the 200 series. Expect some strange stuff if you dare to use the hack. I seriously doubt I'm seeing the full potential of Fermi with the hack, in terms of either the computational power or the quality that is possible. Still, what I am seeing so far is pretty amazing (accelerated effects, smooth preview playback).

Seeing the instability helps me understand why Adobe chose just a few cards. The last thing you would want to use is an uncertified card if you are doing production work. This is a strong argument for selecting a few cards versus "all CUDA cards". Imagine it "kind of working"? People would be pissed. This is new territory for Adobe. It was smart to err on the side of caution.

I'll go out on a limb here and say that the GPUs that are very similar to the GTX 285 will work much better than Fermi (using the hack), because the GPU design is the same. My understanding, and I could be wrong, is that the GTX 260/275 is the same GPU as the GTX 285, minus some stream processors and ROPs, and with a narrower memory bus. So, if the GTX 285 is limited to 3 streams, then what are these cards capable of? Are you just going to bottleneck the system? This is probably why they chose the GTX 285 as the mainstream option: it has just enough power, and the models below it lack the power to show a clear benefit.

3 more hard drives and another 6GB of memory are arriving today to relieve the I/O bottleneck I have. I hope to have more details at a later date.

Engaged, May 04, 2010


So far so good with the Boss's new Sager notebook. He ordered it with CS5 and MPE in mind. The notebook version of the Nvidia GTX 285 is the GTX 285M. Just did the dance outlined above. We'll see how it goes. For our other two i7 920 editing systems, we just ordered a pair of Nvidia GTX 285s. We are currently working on a project that has a dimension of 2560 x 540.

In CS4, without MPE (we're waiting for the 2 cards from Newegg.com), playback of the 2560 x 540 sequence from the timeline was choppy. The same sequence in CS5 plays back smoothly at full resolution without rendering and without GPU acceleration! Can't wait to install the GTX 285 MPE cards on our two best systems.

The Adobe CS5 team should be proud.

New Here, May 04, 2010


Just noticed that the 3-layer limit includes my GTX 470.

Someone here mentioned it is tied to the Geforce card drivers.

Funny thing is that hitting the limit just turns the timeline red and doesn't affect preview quality... the previews still seem to be smooth.

Sadly, I'm realizing that the last bottleneck I have is my i7 930 CPU.

LEGEND, May 04, 2010


Simply overclock!

New Here, May 04, 2010


Hi Bill,

Looking forward to your PPBM5.

Yes... I see on your site you are at 4.2 GHz.

LEGEND, May 05, 2010


Yes, I have successfully pushed it to 4.2 GHz, but I am not going to run normally at that clock rate with my $1000 CPU. Since it is still an experimental bench-top system, I am able to make very quick changes.

New Here, Oct 26, 2010


Chuck A. McIntyre wrote:

................................ The notebook version of the Nvidia GTX 285 is the GTX 285M.  Just did the dance outlined above.  We'll see how it goes.

The Adobe CS5 team should be proud.

Hello Chuck, how did your setup "dance" for the GTX 285M work out?

I am picking up an MSI GT660R laptop, which uses the Nvidia GTX 285M, and I was wondering whether or not it would accelerate the Mercury Playback Engine. Has your fix worked effectively?

The Adobe CS5 team should be proud.

I am not certain that the CS5 team should be proud. Perhaps if they got off their backsides and qualified a few laptop Nvidia 'M' cards, I would be more appreciative of their efforts. After 3 months of begging and pleading with them to help, I finally bit the bullet with the MSI GT660R and now have to hope that the fix works.

After putting in my purchase order for the MSI GT660R, Dell finally came around with the XPS 17, which has NVIDIA GeForce GT 445M 3GB graphics with Optimus. Impressive stats on the system, and they indicate that it is Adobe certified? If so, it is a first for laptop video-card certifications. I cannot find the certification on the Adobe site or the Nvidia site; I will have to ask them where it is stated.

.................. Sigh!

Guest, May 05, 2010


Can anyone help me enable CUDA support for a Quadro FX 4600? When I run GPUSniffer.exe, the results are:

Device : 00000000002C4168 has video RAM(MB) : 768

Vendor string :     NVIDIA Corporation

Renderer string :  Quadro FX 4600/PCI/SSE2

Version string:     3.0.0

OpenGL version as determined by Extensionator...

OpenGL Version 3.0

Supports shaders!
Supports BGRA -> BGRA Shader
Supports VUYA Shader -> BGRA
Supports UYVY/YUYV ->BGRA Shader
Supports YUV 4:2:0 -> BGRA Shader
Testing for CUDA support...

     Found 1 devices supporting CUDA.

     CUDA Device # 0 properties -

     CUDA Device details:

          Name: Quadro FX 4600          Compute capability: 1.0

          Total Video Memory: 739MB

     CUDA driver version: 3000

CUDA Device # 0 not choosen because CUDA version 1.0 is not supported.

Completed shader test!

Internal return value: 7

Guest, May 05, 2010


It's saying that the GPU was not chosen because its CUDA compute capability (1.0) is not supported, not because it isn't on the list.

Do you have the latest drivers?

Does that card support CUDA?

New Here, May 05, 2010


The latest driver at Nvidia.com has CUDA 3.0 built in, from what I understand.

197.59 is the latest driver.

New Here, May 05, 2010


I think your video memory is too low. 739MB available is not enough?

Guest, May 05, 2010


Maybe...

Adobe Employee, May 05, 2010


Yep - 768 megs of card memory is the minimum.
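Combining that minimum with the compute-capability requirement seen in the Quadro FX 4600 output above, the two gates reported in this thread can be sketched as a check over GPUSniffer's output. This is only my reconstruction from the posts here, not Adobe's actual internal logic, and the thresholds are assumptions drawn from the thread (note GPUSniffer reports slightly less than the card's nominal memory, e.g. 877MB on an 896MB-per-GPU GTX 295).

```python
import re

# Gating rules as reported in this thread (assumptions, not from Adobe
# documentation): at least 768MB of video memory, and CUDA compute
# capability 1.3 or higher.
MIN_VRAM_MB = 768
MIN_COMPUTE_CAPABILITY = (1, 3)

def cuda_device_eligible(sniffer_output: str) -> bool:
    """Apply both gates to one CUDA device block of GPUSniffer output."""
    cap = re.search(r"Compute capability:\s*(\d+)\.(\d+)", sniffer_output)
    mem = re.search(r"Total Video Memory:\s*(\d+)MB", sniffer_output)
    if not cap or not mem:
        return False
    capability = (int(cap.group(1)), int(cap.group(2)))
    return capability >= MIN_COMPUTE_CAPABILITY and int(mem.group(1)) >= MIN_VRAM_MB

# The GTX 295 output quoted earlier passes both gates; the Quadro
# FX 4600 output fails both (capability 1.0, 739MB reported).
gtx295 = "Name: GeForce GTX 295 Compute capability: 1.3\nTotal Video Memory: 877MB\n"
fx4600 = "Name: Quadro FX 4600 Compute capability: 1.0\nTotal Video Memory: 739MB\n"
print(cuda_device_eligible(gtx295), cuda_device_eligible(fx4600))  # True False
```

This also explains why adding the FX 4600 to cuda_supported_cards.txt cannot help: the name list is only the last of the checks, and this card never gets that far.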

Guest, May 05, 2010


One more thing, if you can help me out:

For Premiere, is there any performance difference between Quadro and GeForce? From what I found on the net, they are basically identical in hardware, with just a driver difference for professional applications.

New Here, Jan 08, 2014


I have a GTX 550 Ti with 729MB reported by GPUSniffer, and it's accepted for CUDA support by PPro CS5.5. It certainly accelerates playback of SD sequences.
