There are a couple of bottlenecks in your setup as you currently describe it.
I'm assuming you have a 10-bit monitor for CC (something like an HP DreamColor); otherwise there is no use for any Quadro at all. Apart from the Quadro model, which I'll come back to later, you have three video cards plus the BM card. You did not say what platform they will be installed on, and what happens when all these cards are installed depends on the mobo used. One thing is for sure: there is no mobo in existence that has 52 PCIe lanes, so your video cards will not run at 16x speed, but only at 8x or slower. Running a video card at 8x instead of 16x carries a performance penalty of around 10-15%. And if you have this in mind for an 1155 platform, the video cards will run at 4x at most, which will be very noticeable. Whichever way you turn it, with this setup and the intention to do 4K CC work, you will not have the PCIe lanes left for a dedicated RAID controller, and with that material you will want a dedicated RAID controller.
Why two GTX 580s when they are not used for output, or is that only for Resolve?
Assuming you have a platform with 40 PCIe lanes available, and despite budget restrictions, I would go for two Quadro 4000+ cards. Forget about the 600, which is way too underpowered; forget about the GTX 580s, since they lack 10-bit output; and skip the Maximus/Tesla solution, since it is too expensive. This assumes that Resolve can use both Quadro cards, since, as you stated, PR can only use one.
Using this approach, you can run both Quadros at 16x and the BM card at 4x and still have 4 PCIe lanes left for a RAID controller at 4x. However, if you do not have a 10-bit monitor, I would suggest using two GTX 580s instead of two Quadros.
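To make the lane arithmetic in the posts above concrete, here is a minimal sketch of both budgets (the per-card lane counts are the ones discussed above; the 40-lane platform is the assumption stated above):

```python
# PCIe lane budget for the two configurations discussed above.
# A full-speed GPU wants x16; the Blackmagic card and a dedicated
# RAID controller each run at x4.

PLATFORM_LANES = 40  # assumed: a platform with 40 PCIe lanes available

# Original plan: three video cards at x16 plus the BM card at x4.
original = 3 * 16 + 4
print(original)  # 52 lanes wanted -- more than the platform provides,
                 # so the GPUs get demoted to x8 or slower

# Suggested plan: two Quadros at x16, BM card at x4, RAID controller at x4.
suggested = 2 * 16 + 4 + 4
print(suggested)                     # 40
print(suggested <= PLATFORM_LANES)   # True -- fits exactly
```

The point of the sketch is just that the original plan oversubscribes the lanes by 12, while dropping one GPU frees exactly enough for the RAID controller.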
Sorry for not elaborating. It's a PC. I was hoping someone had a quick way around Adobe using the main card for acceleration.
My system plan is as follows:
Asus Z9PE-D8 WS (72 lanes of PCIe)
2x Intel Xeon E5-2620 (or higher)
Nvidia or ATI card to work with a 10-bit color monitor
2x Nvidia CUDA cards to do the heavy lifting in Resolve (using one for Adobe)
4x 500GB WD Caviar Black in RAID 0 on the onboard Marvell 6.0 Gb/s controller (I understand the danger of not using a dedicated controller)
Blackmagic DeckLink 3D (to drive a large-format SDI display in Resolve, since SpeedGrade only uses Nvidia SDI cards)
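On the RAID 0 danger acknowledged above: a stripe set is lost if any single drive fails, so the risk multiplies with drive count. A rough sketch, assuming independent drive failures and a purely illustrative 5% annual failure rate per drive (that rate is my assumption, not a WD spec):

```python
# RAID 0 across n drives loses ALL data if ANY one drive fails.
p = 0.05  # assumed (hypothetical) annual failure probability per drive
n = 4     # four drives striped together, as in the plan above

# The array survives the year only if every drive survives.
survive = (1 - p) ** n
array_fail = 1 - survive
print(round(array_fail, 4))  # 0.1855 -- nearly 4x the single-drive risk
```

Whatever the real per-drive rate is, the array's failure probability is roughly n times it for small rates, which is why the poster's caution about a non-dedicated, non-redundant setup is warranted.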
I don't think any bottlenecks exist in my system; if there are, please do tell me. I'm not a professional by any means, but I'm pretty confident.
I'm thinking of using an ATI card as my main card, seeing as almost all of them have 10-bit color, even in their gaming cards, plus 4K monitor support and no CUDA. The theory is that CS6 will see there is no CUDA support on the main card and move focus to a card that does have it (my 580). Or, if worse comes to worst, I could put a KVM on two of the graphics cards, switching inputs and restarting the computer.
Resolve will take as many CUDA GPUs as I throw at it.
I would definitely not recommend the ATI-with-Nvidia card setup. Adobe detects hardware MPE acceleration off the primary monitor. It does not move down to the next card if the current card does not support the hardware acceleration; if that were the case, Nvidia would not have had to develop Maximus.

I have configured and shipped a system with requirements similar to what you are looking at. That system had a Quadro card for display/output and a GeForce for hardware MPE. You had to select in the Nvidia control panel which card would handle CUDA to get it to work, and even then it was sketchy. The configuration was further complicated when the Quadro drivers no longer included the GeForce drivers as well; installing both would often corrupt the Quadro drivers. However, if you can get past the Quadro and GeForce driver issues, you can get the configuration you are looking for.

I would avoid the KVM at all costs. Over the last couple of years I have seen way too many KVMs fail and take out very expensive video cards, because there is zero resistance/protection on the RAMDAC side and KVMs have their own power source. If the KVM goes, chances are your video card is going to take the hit as well. I just can't recommend it at this point.
Thank you. Very helpful! I was under the impression the ATI card would work from this vid http://www.youtube.com/watch?v=koE5r0Kg1Rc . The secondary Quadro card is recognized in Premiere, and the people in the video claim Premiere is smart enough. It's strange how little configuration there is. You're right about the KVM being a bad idea.
If only SpeedGrade would accept output through a Blackmagic DeckLink. Then I wouldn't have to worry about a 30-bit compatible Quadro, and I could just put in 3x GTX 580s and the world would be dandy. But I digress.
I've looked everywhere for evidence of being able to select the CUDA card for MPE in the Nvidia control panel, but I've heard so many mixed reports. Could you take a screenshot of the window or describe the process? I've been told on multiple forums that MPE acceleration would only be done on the main GPU for Adobe.
CS6 may be smart enough to sort the GPUs, but CS5.5 was not, and that video is on a Mac, not a PC. The main problem you run into using both ATI and Nvidia cards in the same system is resource sharing, with RAM-shadowing conflicts; since the drivers are completely different, that cannot be managed well by the OS. I would highly recommend against it, but that is ultimately up to you. You also risk running into DirectX or OpenGL issues when one manufacturer changes versions before the other. This creates possible failure points I would not recommend in a production environment.
Go into your Nvidia control panel and look for the CUDA GPUs listing. Then uncheck the Quadro card and leave the GeForce card checked. That is how it's done. Sometimes the Nvidia control panel won't save that change and re-checks the Quadro card; just go back in and do it again. Eventually I got the setting to hold.
Adobe detects the video card from the primary monitor by default. If you move your primary monitor from one card to another before you launch the Adobe app, it will use the new card; if you move the primary monitor back after you close it, you can obviously still have your 10-bit card driving the display. Adobe does not care which slot the card is in or which card you consider primary; it only cares about the card Windows lists as handling the primary monitor. That is another way to do it, but then you have to have a monitor connected to your GeForce card.