jahe08164

CS6 WMV export freezes with CUDA enabled - Z800 with FX3800 & GTX285

Jul 24, 2012 2:38 AM

Tags: #error #cuda #premiere #export #cs6 #wmv

Hi guys,

 

I have a huge problem with my HP Z800: exporting WMV from a timeline with clips from a Canon 550D (MP4) and a RED Epic (R3D) freezes.

 

The FX3800 is the GUI card and the GTX285 is the CUDA-only card. If I swap the cards, I have the same problem. If I disable CUDA, the export works fine. With CUDA enabled, the export hangs at a different percentage each time.

 

But I need CUDA for my work... how can I fix this problem?

 
Replies
  • Jul 24, 2012 4:11 AM   in reply to jahe08164

    Try with one of the video cards removed. Do you still have the problem?

     
  • Jul 24, 2012 8:37 AM   in reply to jahe08164

    Premiere does not work well with two cards unless they use the exact same driver... do your two cards use the same nVidia driver? (There is a quick way to check, shown below the links.)

     

    2 cards and 3 monitors http://forums.adobe.com/thread/875252

    -and http://forums.adobe.com/thread/876675

     

    Dual Card SLI http://forums.adobe.com/thread/872941

    -problem http://forums.adobe.com/thread/872103
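 
    A quick way to compare driver versions (an assumption on my part that your NVIDIA package includes the nvidia-smi tool; recent drivers install it under C:\Program Files\NVIDIA Corporation\NVSMI) is to run it from a command prompt:

        cd "C:\Program Files\NVIDIA Corporation\NVSMI"
        nvidia-smi

    The header of the report shows the installed driver version and lists each GPU it services. Failing that, Device Manager (right-click each card > Properties > Driver tab) shows the driver version per card.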

     
  • Jul 24, 2012 9:22 AM   in reply to jahe08164
    The FX3800 is the GUI card and the GTX285 is the CUDA-only card

     

    That's confusing.  It was my understanding that CUDA would only work on the primary GUI card, with the Tesla being the only exception.

     
  • Jul 25, 2012 4:19 PM   in reply to jahe08164

    Just how do you set which card to use for CUDA?  I've only ever had one card installed.

     
  • Jul 25, 2012 4:42 PM   in reply to Jim Simon

    Jim Simon wrote:

     

    The FX3800 is the GUI card and the GTX285 is the CUDA-only card

     

    That's confusing.  It was my understanding that CUDA would only work on the primary GUI card, with the Tesla being the only exception.

     

    That was my understanding as well, but our main colour grading machine has a GUI card for display and a GTX285 in slot 2 for GPU processing (the ideal configuration for our grading software), and when we run Premiere on it, CUDA works just fine.

     
  • Jul 25, 2012 6:35 PM   in reply to SimonHy

    I am running a GTX670 4GB as the primary card (i.e. with the monitors connected) and my old Quadro FX3800 in the other PCIe slot.

     

    Premiere is fine with it, but I know of no way to tell PPro that one card is for the GUI and the other is for CUDA.

     

    DaVinci Resolve uses the two cards: one for the GUI and one for processing.

     
  • Jul 25, 2012 9:38 PM   in reply to shooternz

    Update:

     

    I have now installed a new GTX560 1GB as the primary card (i.e. with the monitors connected) and the GTX670 4GB in the other PCIe slot.

     

    Works fine in Premiere and DaVinci Resolve; both play nicely together.

     

    Installing two cards was no issue at all, apart from a bit of slot and cable juggling to accommodate 2 x 2-slot cards.

    ...and of course the CUDA hack for both of these.

     

    FWIW: The GTX560 was one of Harm's two recommendations. The 560 was the best choice and good advice (CUDA cores and power demand considered).

     
  • Jul 25, 2012 10:13 PM   in reply to shooternz

    shooternz wrote:

     

    I have now installed a new GTX560 1GB as the primary card (i.e. with the monitors connected) and the GTX670 4GB in the other PCIe slot. [...]

     

    Is Premiere using CUDA on both cards?

     

    I would have thought the better thing to do would be to add only the GTX670 to the hack list, to force it to use the more powerful card.

     
  • Jul 25, 2012 10:21 PM   in reply to SimonHy

    I would think that Premiere is only using the primary card.

     

    I don't think Premiere can take advantage of two cards or pick and choose between cards (no SLI or anything, AFAIK).

     

    The primary card (GTX560) is in slot 2 and has the monitors connected.

     

    The GTX670 is in slot 5 and has no monitors connected at all (it can run four). I don't know what would happen if I connected one or some, and I have no need to try.

     

    I also have a BM DeckLink driving an SDI monitor.

     

    BTW: What color app are you using, Simon?

     
  • Jul 25, 2012 10:28 PM   in reply to jahe08164

    Coming back to the O.P.:

     

    I now believe that my new combination of 2x GTX cards is more responsive than when I had a GTX and a Quadro installed.

     

    It may be some hardware/driver compatibility issue. It may be the slot order.

     

    I don't think you can set one card up for GUI and one for video (as I said to Simon).

     
  • Jul 25, 2012 10:32 PM   in reply to SimonHy

    I would have thought the better thing to do would be to add only the GTX670 to the hack list, to force it to use the more powerful card.

     

    I've thought about this since, and what would happen is that when you open a project, Premiere would advise you that you can only run in software-only (non-CUDA) mode.

     

    Don't ask me how I know this.

     
  • Jul 25, 2012 10:35 PM   in reply to shooternz

    shooternz wrote:

     

    I don't think Premiere can take advantage of two cards or pick and choose between cards (no SLI or anything, AFAIK).

     

     

    Yeah, that's why I was surprised to see my system was successfully using my GTX285 when there are no monitors connected to it. My best guess (because I can't find much detailed info about graphics card configurations for Premiere) is that this works because the GTX285 is an approved card and my GUI card isn't, so Premiere happily does what it supposedly shouldn't and uses a CUDA card with no monitors attached for processing. If this is true, you could add your 670 to the hack list and not add the 560, and you'd be processing with the 4 GB card.
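 
    One way to check which cards Premiere actually detects (a sketch, assuming the stock CS6 layout, where Adobe's GPUSniffer.exe sits in the Premiere Pro program folder) is to run the sniffer from a command prompt:

        cd "C:\Program Files\Adobe\Adobe Premiere Pro CS6"
        GPUSniffer.exe

    It prints each CUDA device it finds, with its memory and compute capability, so you can see whether the monitor-less card is the one being picked up.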

     

    I use DaVinci Resolve for colour grading.

     
  • Jul 25, 2012 11:40 PM   in reply to SimonHy

    Interesting... I will try that and see what happens. It's an easy test with no downside to trying it (i.e. edit the CUDA txt file).
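 
    (For anyone following along: the "CUDA txt file" is the whitelist Premiere reads at startup. A sketch of the edit, assuming the default CS6 install path: open, as administrator,

        C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt

    and add each card's name on its own line, exactly as Windows reports it, e.g.:

        GeForce GTX 560
        GeForce GTX 670

    Then restart Premiere and check that Project Settings > General > Renderer offers "Mercury Playback Engine GPU Acceleration".)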

     

    It would be so cool if it worked out, because currently most of my work is in Premiere, yet I am running two power-hungry, heat-generating cards without an obvious benefit except in DaVinci Resolve.

     

    I'll let you know what I find out (tomorrow).

     
  • Jul 26, 2012 8:54 AM   in reply to jahe08164

    Take a look at the pictures. You can choose which card to use for CUDA.

     

    I don't read that language, but I'm not sure that will apply to PP. Like Shooter said back in post 8, I don't think PP gives you the choice of which card to use for CUDA, meaning it will only use the primary card. (At least, that's the info we get from Adobe staff.)

     
  • Jul 26, 2012 9:01 AM   in reply to jahe08164

    Change the setting to 'Globale Einstellung' ('Global Setting' in German; the multiple display performance mode).

     
  • Jul 26, 2012 1:37 PM   in reply to Jim Simon

    @SimonHy et al.

     

    Well, well, well... here is something a bit different. CUDA card news.

     

    I have two cards installed:

     

    the GTX560 1GB as the primary card (i.e. with the monitors connected) and the GTX670 4GB in the other PCIe slot.

     

    ...and I tested what happens when one "unhacks" the primary card, i.e. removes the primary card's name from the CUDA txt file.

     

     

    Guess what... Premiere now uses the second card for hardware GPU acceleration.

     

     

    This is perfect for my situation, and I am now working "uncompromised" between Premiere and DaVinci Resolve.

     

    I.e. maximum CUDA cores (1344) for Premiere and DaVinci processing, as well as maximum memory for DaVinci... and the GTX560 is effectively just running the GUIs.
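 
    (In whitelist terms, assuming the same cuda_supported_cards.txt file described above, the working configuration is just the single line

        GeForce GTX 670

    with the GTX560's entry removed, so Premiere falls back to the only listed CUDA device even though the monitors hang off the other card.)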

     
  • Jul 26, 2012 2:41 PM   in reply to shooternz

    I wonder if Adobe engineering is aware of this.

     
  • Jul 26, 2012 2:56 PM   in reply to Jim Simon

    I wonder if Adobe engineering is aware of this

     

    Don't tell them or let them know about it!

     

    They will mess with it and break it.

     

    It is pretty cool, though.

     

     

    Not sure how it benefits anyone else apart from those of us using DaVinci Resolve as well as Premiere.

     

    I am curious to see what happens if I connect another monitor to the GTX670, but I have no need to... so I will avoid the temptation.

     
