
GeForce GTX Titan or Quadro K4000?

New Here, Dec 02, 2013
Hello all -

I'm hoping to get a little help/clarification on which card would better suit my needs, and appreciate any/all help. I'm probably at an intermediate level of understanding when it comes to the technical aspects of computers, so some of the finer points escape me.

In short, I'm looking at two graphics cards to purchase as an upgrade: the GeForce GTX Titan http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan/specifications or the Quadro K4000 http://www.nvidia.com/object/quadro-desktop-gpus.html .

Currently, my setup is as follows:

Intel Motherboard Dual Socket Xeon S5520SC

Intel Xeon CPU Fan Heatsink STS100C

2 x 2.26GHz Intel "Nehalem" Xeon Quad Core [8MB]

12GB 1333MHz DDR3 Triple Channel SDRAM (6 x 2GB)

1TB High Speed Hard Drive [64MB Cache, 7200RPM]

StormDrive Dual Layer CD/DVD Writer

850W Silent Power Supply

Windows 7 Pro [64-bit]

NVIDIA Quadro FX 3800 Workstation Graphics Accelerator [1GB]

PCI 3 Port FireWire [TI Chipset]

(additional) 1TB High Performance Drive [64MB Cache, SATA 3, 6 Gb/s]

I use my workstation for projects and not gaming, and I use CS5 and Toon Boom Harmony as my two tools for editing/creation. With CS5, it's my understanding that the Quadro cards are preferred, as they enable some useful functions (such as Mercury Playback) and handle After Effects better for 3D work. Conversely, Toon Boom Harmony suggests a GeForce card over a Quadro, as it works with their software more "inherently" than a Quadro does. Granted, both programs would work well with either type of card, but is one a better fit to run both? And given my setup listed above, would the Titan card be massive overkill / be bottlenecked by slower components? Additionally, can anyone testify to a head-to-head matchup between these two cards?

Thanks.

Views: 15.2K
Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Guru, Dec 02, 2013
New Here, Dec 02, 2013
cc_merchant --

Thanks for your response. I ran across that page earlier today in my searches. It led me to my tag-on question: "Given my setup listed above, would the Titan card be massive overkill / be bottlenecked by slower components?"

The article helps me in the abstract, but not in the concrete, case-specific details of my setup, unfortunately. Basically, would jumping up in graphics card be only partially beneficial due to other parts that may restrict it? For example, will I only get maybe 30% of the improvement from a great new GPU because my processors may not be able to fully utilize the card?

Guru, Dec 02, 2013
With the 'old' Xeons, only 12 GB of memory @ 1333 MHz, and the limited disk setup, a Titan would be serious overkill IMO. A GTX 760 would be more than enough, even once you start upgrading your system to a dual E5-2697 v2 with 64 GB. A K4000 is only sensible if you use 10-bit monitors.

You would probably profit more from upgrading memory from 6 x 2 GB to 6 x 4 GB.

New Here, Dec 02, 2013
This is where things get a little fuzzy for me: processors. I imagine my "Nehalem" Xeons are discontinued at this point, but they were still part of the Intel i7 family, right? Would dual E5-2697s be a downgrade from that (though still an upgrade because they're newer)?

When I'm using Toon Boom or CS5, is the lagging I sometimes experience a result of RAM, then? I was under the impression that my 12 GB of RAM (regardless of 2 or 4 GB per stick) would be more than enough for editing and such...

New Here, Dec 02, 2013
Oh. Also, would the Quadro K4000 not improve performance at all, or only marginally?

Guru, Dec 03, 2013
Would dual E5-2697s be a downgrade from that?

If you go from two quad-cores @ 2.26 GHz, DDR3-1333 and 8 MB L3 cache to two 12-cores @ 2.7 GHz, DDR3-1866 and 30 MB L3 cache and call that a downgrade, you must be kidding.

New Here, Dec 03, 2013
I'm admittedly not completely versed in the technical aspects of computer hardware such as processors, so I assure you I was not kidding but rather searching for genuine answers.

Contributor, Dec 03, 2013
Axeminister,

It would be a huge upgrade in processor power. As for the GPU, the more expensive Quadros only make sense if:

  • you need 10-bit color depth, and
  • you have 10-bit monitors to visibly reproduce that level of output (only through DisplayPort output/cables AFAIK, not through HDMI, DVI, or VGA signals).

Mercury Playback works on all NVIDIA cards with more than 1 GB of dedicated VRAM, both Quadro and GTX, even if they are not officially supported by Adobe. There are quite a few tutorials on the web on how to point the Adobe apps at your NVIDIA graphics card.

If you do not need 10-bit color depth, or do not have 10-bit monitors, then the extra money on the Quadro line is simply lost, and you can stay with the non-Quadro NVIDIA line, because then only the CUDA cores and the amount of VRAM matter as far as I know; but I am also quite new to CC and computer hardware. Really, though, studying the articles on Harm's PPBM website / Tweakers page answers practically all of these questions.
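The "tutorials on the web" alluded to above generally describe one widely shared CS5-era workaround: adding the card's name to Premiere Pro's cuda_supported_cards.txt whitelist. A rough, hypothetical sketch using a temporary stand-in directory so it can be tried safely; on a real system the file lives in the Premiere Pro CS5 install folder, the stock contents vary by update level, and the edit is unsupported by Adobe (back the file up first):

```python
# Stand-in for the Premiere Pro CS5 install directory; on Windows it would
# be something like r"C:\Program Files\Adobe\Adobe Premiere Pro CS5".
# Only the file name is real; the stock contents below are illustrative.
import pathlib
import shutil
import tempfile

premiere_dir = pathlib.Path(tempfile.mkdtemp())
whitelist = premiere_dir / "cuda_supported_cards.txt"

# Simulate a stock whitelist shipped with the application.
whitelist.write_text("GeForce GTX 285\nQuadro FX 3800\n")

# Back the file up, then append the card name exactly as the driver
# reports it; as I recall, a mismatched name means MPE still falls back
# to software-only rendering.
shutil.copy(whitelist, whitelist.with_suffix(".txt.bak"))
with whitelist.open("a") as f:
    f.write("GeForce GTX TITAN\n")

print(whitelist.read_text().splitlines()[-1])  # GeForce GTX TITAN
```

Later Premiere versions dropped the whitelist entirely, so this applies only to the CS5/CS6 generation discussed in this thread.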

Greetz SebastiaanFP

New Here, Dec 03, 2013
Thanks, sebastiaanfp. I appreciate the help. If you had a suggestion (and a moment of time), which dual processors would you suggest to match up well with a GeForce GTX 770? Quick searches put that GPU around the $400 price range, which would leave me looking at maybe $600-750 for a pair of processors.

thanks again.

Guru, Dec 03, 2013
I would not suggest upgrading the CPUs unless you can also upgrade the board. Any of the current or even E5 v1 Xeons will require a new board. You would also want to add RAM. If you want to upgrade the current system, I would suggest the video card and more RAM. You really don't have enough RAM for GPU acceleration with that many threads in play.


Eric

ADK

New Here, Dec 03, 2013
Ahhh. Good to know. The hardware/tech side of computers eludes me a bit. The other person in this thread mentioned jumping up to six 4 GB sticks of RAM. Would that sufficiently supply the new card? Additionally, would just adding more 2 GB sticks be helpful, or would it still cause mini bottlenecks?

New Here, Dec 03, 2013
And if I go with the six 4 GB sticks, would you have a specific suggestion on what kind? Do they still have to be 1333 MHz DDR3 triple-channel SDRAM to comply with my system?

Thanks so much.

Contributor, Dec 03, 2013
AXE,

I am not qualified to answer the CPU part, but it seems to be adequately answered already by Eric and cc_merchant: if you want new CPUs, you would basically be starting a new workstation build.

AFAIK about RAM vs. CPU cores: if I remember correctly from a recent online course, you want at least 3 GB of RAM per CPU core to use it to its optimum. Better to Google this, though; it shouldn't be too difficult to track down, but there is some optimum regarding processor cores vs. RAM for video editing performance. So I reckon 2 quad-cores = 2 x 4 cores x 3 GB of RAM, meaning an upgrade to 24 GB of RAM would improve your system, together with a more powerful NVIDIA GPU with more CUDA cores and more VRAM. That would be the maximum you can do with your present system; if you want extra/new CPUs, you are better off with a new workstation build or purchase.
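As a quick sanity check of that rule of thumb (the 3 GB-per-core figure is the course's heuristic quoted above, not an official Adobe spec), the arithmetic can be sketched as:

```python
# Rough RAM-sizing heuristic from the post above: ~3 GB per CPU core.
GB_PER_CORE = 3  # rule-of-thumb value, not an official figure

def recommended_ram_gb(sockets: int, cores_per_socket: int,
                       gb_per_core: int = GB_PER_CORE) -> int:
    """Suggested total system RAM (GB) for a given physical core count."""
    return sockets * cores_per_socket * gb_per_core

# The poster's dual quad-core Nehalem Xeons: 2 x 4 cores x 3 GB = 24 GB.
print(recommended_ram_gb(2, 4))  # 24
```

By the same heuristic, the dual 12-core E5-2697 v2 build mentioned earlier would want on the order of 72 GB, which is why that upgrade path really means a new workstation.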

And if you are not using a 10-bit color workflow, then you can save yourself a lot of money by leaving the Quadro line alone (10-bit color being AFAIK their only extra benefit over the GTX line) and going with the GTX line.

Guru, Dec 03, 2013
I would highly suggest the six 4 GB sticks. Don't get more 2 GB sticks. The board may not handle 1600 MHz RAM, so I would check before you get DDR3-1600. Don't even bother looking at 1866 or higher with that system. 24 GB would be far better than what you have with the 16 threads currently in play, and it will be enough for either of the video cards I suggested.

Eric

ADK

New Here, Dec 03, 2013
I'll stick with the six 4 GB sticks at 1333 then. Looking at the GTX 770, I've noticed they have a 2 GB model and a 4 GB model. Would both be compatible with my system? And would you suggest EVGA or PNY? Sorry for the battery of questions; this is just getting into the deeper waters of computer knowledge and I'm trying to tread water.

For example, would this one work?

http://www.amazon.com/EVGA-Classified-Dual-Link-Graphics-04G-P4-3778-KR/dp/B00DBPU8B2/ref=sr_1_2?s=p...

New Here, Dec 03, 2013
Guru, Dec 03, 2013
Both the 2 GB and the 4 GB will work. It really just comes down to warranty and failure rate. I would highly suggest the EVGA and leave it at that.

Those Crucial modules are dual-channel, not tri-channel. I would suggest looking for a kit of 6 or 3.

Eric

ADK

Contributor, Dec 03, 2013
Axe,

This is copy/pasted straight from Harm's Tweakers page:

"As a rough guide, how much VRAM should be installed on the video card, think along these lines:

  • For SD material, 1 GB is enough
  • For HD material, go for 1 - 2 GB
  • For 3K or 4K, go for 2 - 4 GB
  • For 5K+, go for 4 - 6 GB"

http://ppbm7.com/index.php/tweakers-page/83-balanced-systems/94-balanced-systems
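For what it's worth, the quoted guideline reduces to a simple lookup; a small sketch, with footage labels and GB ranges taken directly from the quote above:

```python
# VRAM ranges (min, max) in GB from the Tweakers-page guideline above.
VRAM_GUIDE = {
    "SD":    (1, 1),
    "HD":    (1, 2),
    "3K/4K": (2, 4),
    "5K+":   (4, 6),
}

def suggested_vram_gb(footage: str) -> tuple:
    """Return the (min, max) suggested VRAM range for a footage class."""
    return VRAM_GUIDE[footage]

# A 4 GB GTX 770 covers everything up to and including the 3K/4K range.
print(suggested_vram_gb("3K/4K"))  # (2, 4)
```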

Just spend some time there and I think you can find many answers. I find Harm is also very willing to answer anybody asking serious questions. You can post comments on his site there. He answered my questions too, and I am really at beginner level.

If the price difference isn't too big (less than $100 or so?), I would stay on the safe side and buy the 4 GB card, just so you know it will definitely not be the bottleneck in your system. Or you can continue your research, but time is also money...

I find for myself that there is basically no way around giving this some study time; the alternative is to go to professionals and pay the extra money for a custom-built video editing workstation. But that is actually great, because understanding your PC system is really interesting!

Explorer, May 04, 2014
You're getting a lot of misinformation regarding your initial question, so here's the correct answer.

Quadro cards are not for gaming; they are for high-level content creation, especially 3D applications like Maya and 3ds Max, CAD software, and 2D applications like Autodesk Flame ($150k software) and The Foundry's Nuke ($8k). Quadro cards are much more stable thanks to custom proprietary drivers, and far more graphics programs support Quadro rather than gamer cards. Window refresh and especially interaction, 3D content creation/modeling control, and much higher poly counts are possible with Quadro. Quadro cards will run 24/7 rendering and GeForce will not. Games require single floating point, while visual effects, scientific calculations, and 3D rendering require double floating-point precision for look and accuracy, not to mention anti-aliasing quality. Stereoscopic creation is only possible with Quadro. Quadro has 10-bit color accuracy for professional-level visual effects and color grading. Some Adobe software designed for OpenGL will run faster and more stably with Quadro; Adobe light rays, for example, are about 400% quicker than on any GeForce card, but those results are "special case" scenarios. Yes, OpenGL is still far superior, but OpenCL is catching up, so by 2016 who knows. The new Quadro K series has four simultaneous display outputs and does 4K at 10-bit.

If you do not need to run any of the above software, and especially if you do not do any 3D, then you do not need a Quadro. There are still other caveats, like DirectX being best for Max, etc., but it would take much more writing to list all the points. Suffice to say: choose your software first and let that dictate your GPU. There are simply many advantages to a Quadro card, and you cannot see them from reading charts and reviews, as most reviewers cannot measure the Quadro's advantages without first purchasing the high-end 2D and 3D software to test with (and that's not happening). It's that simple. Best of luck.

Oh yes, NVIDIA purposely turns off hardware on their GeForce cards via the PCB and drivers, to the Quadro's advantage. That's capitalism at work. Also, do not judge a card by the number of CUDA cores; this used to be possible, but no more. For example, the new Kepler architecture runs at half the frequency of Fermi, so it takes basically twice the CUDA cores to equal the older Fermi cards. That's what they call creative accounting!

Guru, May 05, 2014
Actually, the GeForce cards have better cooling and run much cooler than the Quadro cards on average, especially the higher-end GeForce cards. The load games put on GeForce cards is on average higher than the load Adobe puts on them most of the time. The GeForce cards handle long-form rendering fine, with temps averaging 40 to 65 C, including 24/7; many of our clients have been doing that since CS5 released. Max, like many other 3D applications, is moving to DirectX, so the OpenGL acceleration specific to the Quadro drivers is far less utilized, and GeForce cards are better than Quadro cards for DirectX. Quadro cards do give you 10-bit color preview in Adobe, but I/O hardware such as Blackmagic and AJA, coupled with a GeForce card, is often the better solution for that.

OpenGL is an API and OpenCL is GPU acceleration; they are completely different and pointless to compare. CUDA is what you compare to OpenCL right now. GPU acceleration performance with regard to the video cards is tied to the raw specs: more cores and more memory bandwidth decide the performance you will see. The higher-end GeForce cards have far better specs than the Quadros, other than the K6000, which is extremely expensive; you could get two Titan Black edition cards for half the price, and they would perform far better. Many RED users use Titan cards, and Blackmagic recommends them for DaVinci because the performance is the best at a lower cost. GeForce cards in general are a far better buy than Quadro cards for GPU acceleration. We have also seen a lower failure rate with the higher-end GeForce cards (GTX 760 and above) than with the Quadro K4000 series and above; the Quadros just don't have correct fan profiles, which lets them heat up too much, especially over long periods.

NVIDIA also has not artificially limited the GeForce cards to protect the Quadro cards. The GeForce cards are vastly outperforming the Quadro cards currently, so if they had limited them, they failed badly. The GPU clock speed and memory clock speed were likely reduced because of the number of cores added to the latest GPUs. That is the main benefit of parallel processing: more cores equals better performance, even at a far lower clock speed. x86 CPU processing is completely different, which is why clock speed has far more impact there. Right now the current CPUs are just not pushing the higher-end GeForce cards to their limit; it takes dual Xeons to push a single Titan, due to how GPU acceleration works. The performance ceiling on the 700-series cards is far higher than on the 500 series.
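To put rough numbers on the "raw specs" point for the two cards from the original question: the figures below are NVIDIA's published numbers as I recall them (treat them as approximate and verify against the product pages), and core counts are only a crude proxy for MPE performance, not a benchmark:

```python
# Approximate published specs (from memory; verify before relying on them)
# for the two cards discussed in this thread.
cards = {
    "GeForce GTX Titan": {"cuda_cores": 2688, "bandwidth_gb_s": 288.4},
    "Quadro K4000":      {"cuda_cores": 768,  "bandwidth_gb_s": 134.0},
}

for name, spec in cards.items():
    print(f"{name}: {spec['cuda_cores']} CUDA cores, "
          f"{spec['bandwidth_gb_s']} GB/s memory bandwidth")

# Both are Kepler-generation parts, so core counts compare directly here
# (unlike the Kepler-vs-Fermi comparison cautioned about earlier).
ratio = (cards["GeForce GTX Titan"]["cuda_cores"]
         / cards["Quadro K4000"]["cuda_cores"])
print(f"Core-count ratio: {ratio:.1f}x in the Titan's favor")  # 3.5x
```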

Eric

ADK

LEGEND, May 05, 2014
If anyone in the world has experience to back up their opinions it is Eric.

New Here, Dec 05, 2014
Your post is really interesting to me.

You mention a setup with a GeForce and a Blackmagic to obtain 10-bit color, but does it also work in Photoshop, or only via the Blackmagic software/preview?

Have you tested it? I'd prefer to buy a Blackmagic device instead of a Quadro.

thanks a lot

best regards

sam

Guru, Dec 05, 2014
New Here, Dec 06, 2014
Hi Eric,

Thanks for your reply; I've read the whole thread, but I can't actually find an answer.

I know from bmd that: "The HDLink can accept 4:4:4 and 4:2:2 10 bit SDI video and the Ultrastudio Express can send out 10bit colour precision. If you were in a video grading workflow, then this would work fine with for example, our software DaVinci Resolve. However with something like Photoshop, there is not a live retouching feature that you can use, only a final render out that would be seen on your monitor."

but you stated that: "Quadro cards do give you 10 bit color preview in Adobe but I/O such as Blackmagic and Aja are often better solutions for that coupled with a Geforce card. "


Were you referring to software other than Photoshop?

Since I would need a BMD device only for that [which one is not completely clear to me by now, but I'll dig into it later if need be], I assumed I couldn't use it, given the answer I got from the BMD staff, because a 10-bit live output isn't available inside Photoshop [while with a Quadro or a FirePro it is].

thanks a lot for your help,

sam
