My graphics card, a Quadro 4000, just bit the dust, so I need to replace it. My PC has 24 GB RAM, all programs run on a 256 GB solid state drive, my scratch disk is solid state, and I have 12 TB of hard drive on board. I am an editor for an independent Hollywood producer. I work in all HD media and will be working with 4K media in the near future, using multiple Adobe programs: AE, SG, PrePro, Photoshop, Illustrator, Audition. I would like to stay with Quadro for my graphics card, but I don't know if the added expense of Maximus will give me the best bang for the buck. Can I function at the high end with the Quadro 5000?
The much-touted Maximus solution, at least as promoted by Adobe, is an utter waste of money. It requires a very expensive Quadro card plus a Tesla C2075 card that is slower than a simple, two-generations-old GTX 470, for the simple reason that it lacks the CUDA cores and memory bandwidth to be any faster. We have several Maximus solutions in the current PPBM5 benchmark, and despite prices of up to € 6,000 for a Quadro 6000 plus a Tesla C2075, these setups are easily outperformed by many GTX 470/480/570/580/670 and 680 cards that cost only a fraction of that.
BFTB-wise, the 670/4GB is the much better option: it is significantly faster than even the Quadro 6000 plus the Tesla C2075, at less than 10% of the price.
If you want a pro card, wait for the K20, expected to appear in Q4.
Thanks for responding... but I don't understand an aspect of what I think you are stating. I have been on the NVIDIA site and know from their tech info that the Quadro 4000, 5000, and 6000 have CUDA, which I need for high-end, real-time processing, rendering, and performance between Adobe CS6 programs. However, I am not working in 3D and I am not using any other software like CAD. Is it that you just don't like NVIDIA, or that you find GeForce functions better with the new performance requirements of CS6? It sure looks like Adobe has teamed up with NVIDIA for some reason... if not performance, then what?
All the GTX cards I mentioned have CUDA cores and sufficient VRAM to allow the use of hardware-accelerated MPE.
However, the Tesla C2075 has only 448 CUDA cores and a memory bandwidth of around 150 GB/s. The GTX 580 has 512 CUDA cores and a memory bandwidth of 192 GB/s, so it is way faster, and it does not cost € 2,300, only € 350. The same argument applies to the latest Kepler range (670/680).
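To put those numbers in bang-for-the-buck terms, here is a back-of-the-envelope sketch using only the specs and street prices quoted in this thread (the prices are the approximate figures mentioned above, not official list prices):

```python
# Rough price/performance comparison using the figures quoted in this thread.
cards = {
    "Tesla C2075": {"cuda_cores": 448, "bandwidth_gbs": 150, "price_eur": 2300},
    "GTX 580":     {"cuda_cores": 512, "bandwidth_gbs": 192, "price_eur": 350},
}

for name, c in cards.items():
    # CUDA cores per euro is a crude but telling bang-for-the-buck metric.
    cores_per_eur = c["cuda_cores"] / c["price_eur"]
    print(f"{name}: {cores_per_eur:.2f} CUDA cores per euro, "
          f"{c['bandwidth_gbs']} GB/s bandwidth")
```

By this crude metric the GTX 580 delivers roughly seven times as many CUDA cores per euro as the Tesla, while also having more raw bandwidth.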
The problem is that nVidia keeps the prices of Quadro/Tesla cards artificially high without delivering any added value over a simple Fermi/Kepler card at a fraction of the price.
Put it another way: when you have to commute 30 miles each day in the vicinity of LA, with all the traffic jams, you can use a Maybach (€ 700,000) or a simple Lexus RX450H (€ 75,000) to get you from A to B. Which do you think is more economical?
OK, so within the GeForce options... would the GTX 680 give me better/faster performance than the GTX 580? And even though these cards are not listed as preferred on the Adobe site for use with the CS6 programs (Premiere, SpeedGrade, After Effects), you are saying that the GTX 580 and GTX 680 will do the job very well, right?
We have some new information from a couple of users. Apparently, you can get CUDA to run on a secondary card, even if it's a "lowly" GTX rather than a Tesla card.
So theoretically, you could pick up a new Quadro 4000 as the main GUI card (the only real advantage of which is its 10-bit output) and then use something like a GTX 580 to handle the CUDA processing. The trick is to remove the Quadro 4000 from the supported-cards list and add the GTX model you end up with.
Now I'm really confused... I am not a gamer. I am an editor of feature-length HD 1920 x 1080 movies and plan on moving into 4K media. I'm also planning to use SpeedGrade along with After Effects in the editing process. The card I'm trying to replace is a Quadro 4000. Are you saying the GTX 690 does not have the 10-bit output that is necessary for my editing applications? And are you also saying that if I purchase and install the GTX 690, I will have trouble getting my Adobe software to recognize the card? Or are you saying I will need two video cards to do the job? Lastly, is the GTX 690 overkill for my purposes, and if so, which card would be best for my needs and my future 4K editing?
Keep in mind that to use 10-bit output, you also need a 10-bit monitor such as the HP DreamColor (€ 2,100) or better, like the Eizo Colorgraphic (€ 3,500). A simple Dell UltraSharp or Apple Cinema monitor will not do.
If you already use 10-bit monitors like the models above - you did not say so - then I would suggest the Quadro 6000 plus the Tesla C2075 and two Colorgraphic monitors. That does not fall in the best-BFTB category: at around € 13,000 just for the video cards and monitors, it costs about the same as six to seven complete and very decent PCs of the kind many people here use.
If you haven't spent huge amounts of money on very expensive 10-bit monitors, you are best off with a single GTX 670/4GB card for less than € 500.
If you are planning on using SpeedGrade, you will need a card that accommodates its requirements - specifically, SDI output.
The rest of the CS6 suite will recognise and use the cards suggested by Harm and Jim.
A high number of CUDA cores is an advantage.
The GTX 670 4GB is an excellent choice, with 1344 CUDA cores.
All advice from myself - and, I am sure, from Jim and Harm - assumes the rest of your system is up to a certain (high) spec.
As for 4K... you may need a Red Rocket card as well, or a workflow that works around the demands of this footage.
I am getting amazing performance from my new setup with 4K files in a 1080p sequence. I don't plan on using SpeedGrade for color correction, so I added the GTX 570 to the system. I view at 1/2 resolution and have gotten real-time playback on a six-way composited, animated split screen. Using Source Settings, I can also do a real-time one-light grade of the R3D files.
HP Z820 (2 x 2687W CPUs)
64 GB RAM
Dulce ProDQg2 16 TB RAID 5
That might be a PrecisionColor model, not the simple UltraSharp, and by the way, Dell does not even mention 10-bit color depth on their site - maybe because it is not really 10-bit. IPS has nothing to do with color depth.
I am about to purchase a video card. Based on all the discussion here on the forum, I am considering either the GTX 670 or the GTX 680. I will be using the total suite, including SpeedGrade. Which card will function better (faster)? I have satisfied the SDI output requirement in a different setup. Please advise.