5 Replies Latest reply on Feb 12, 2015 2:28 AM by RoninEdits

    Mixing Workstation and Consumer GPUs

    Michael Rochefort Level 1

      It was recommended that I move my previous thread here, so here I am!

      I need to design a couple of computers for a film class I've created at my school. One of the things I'm looking at is the camera I want to use with the class, and it can record up to 12-bit video. Now, I understand that viewing this color depth requires a 10-bit chain: OS, GPU, application, and monitor (yes, you can "see" it in 8-bit, but that's not the question here). Windows 7/8 supports it, Adobe CC supports it, and there are plenty of 10-bit monitors on the market. Here's where my real question comes in:


      It's quite apparent that Adobe has a tight partnership with NVIDIA, using CUDA in their applications and recommending NVIDIA cards on their site. To output 10-bit, though, a Quadro card is required, and a Quadro with decent memory and CUDA core count can run $2K or more, which is money I just don't have for a card. So I'm wondering: is it possible to use an AMD FirePro card (such as the W5100 or W7100) as the primary monitor-output card to complete the 10-bit chain, and then add a secondary consumer card, such as a GTX 780 running headless, for the Mercury Playback Engine and CUDA computation? That pair would cost me about $800-900 total. Is this kind of setup impossible due to the drivers/architectures of the cards, or is it a feasible pairing?


      Extra info: The class is designed to cover editing, compositing, color grading, and 3D work (either Modo/Mari or Maya/Mudbox). I've also drafted a build of about $3K based on the idea above.


      Thanks in advance for the help!