> I am looking to get a new rig, either a Mac Pro or a Skulltrail-based Windows machine, with my primary use being video editing and DVD authoring. Both rigs have similar parts: DDR2 FB-DIMMs, dual processors, SLI boards. Both will accept the Quadro FX 5600. I can swing this super-expensive card, which is "fully supported" by Adobe, but is it worth the extra change?
> I figured After Effects involves the most 3D-intensive rendering in the whole Creative Suite. I know that most video workstations are equipped with these cards, or their lowlier counterparts. But I need to hear from someone who knows, who works with these programs, and who might have some first-hand knowledge of how these cards compare.
> My original idea was a 9800GX2, which is a dual-GPU card. It isn't available for the Mac, only for the Windows rig. It fully supports HD playback, and with two GPUs it has better basic specs than the Quadro FX 5600.
> Nvidia also makes the Quadro FX 4600x2, which is based on the same hardware as the 9800GX2, only with more serious rendering apps in mind. I suppose I could use that card in the Windows rig, if I could find it.
> The Quadro FX 5600 is supposed to have a really large frame buffer, and I can see how that would impact performance. But is that the real difference between the two cards? Or are the architectures of the two cards so different, in ways that programs like After Effects can use? I really only care about video production, and I need to know whether this card makes a difference.
> Can anyone help me here?
Since After Effects can use lots of extra RAM for rendering (including RAM previews), you might consider just loading up on memory if AE is the main reason you are considering the Quadro card.
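To put that advice in perspective, here is a rough back-of-the-envelope sketch (my own numbers, not Adobe's) of how much RAM-preview time each gigabyte buys you. It assumes AE caches uncompressed 8-bit-per-channel RGBA frames and that all the quoted RAM is actually available to the preview cache, which in practice it is not:

```python
# Rough RAM-preview capacity estimate for After Effects.
# Assumptions (mine, not Adobe's): uncompressed 8-bpc RGBA frames,
# and all of the quoted RAM available to the preview cache.

def preview_seconds(ram_gb, width=1920, height=1080,
                    bytes_per_pixel=4, fps=29.97):
    """Seconds of RAM preview that fit in ram_gb gibibytes."""
    frame_bytes = width * height * bytes_per_pixel   # one uncompressed frame
    frames = (ram_gb * 1024**3) / frame_bytes
    return frames / fps

# One 1080p 8-bpc frame is ~8.3 MB, so each GB holds only ~129 frames,
# i.e. roughly 4.3 seconds of 29.97 fps preview.
print(round(preview_seconds(1), 1))    # 1 GB of cache
print(round(preview_seconds(16), 1))   # 16 GB, if AE could use it all
```

The point being: at HD resolutions, memory disappears fast, which is why extra RAM tends to pay off more visibly than a pricier GPU.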
A Quadro is WAY overkill for AE, IMHO. Even the Nvidia 9000 series is overkill, since AE (and pretty much everything else except some games) doesn't support SLI.
Rather than spending the extra cash on a Quadro card, I'd recommend getting faster quad-core CPUs and more RAM.
As the others mentioned, if you are not doing any super-heavy 3D work somewhere, the benefits of a more expensive card are not that great. Most accelerated effects and functions will still work even on lower-end models. However, as hardware acceleration becomes more widely adopted, you will find more and more effects that use it. This is already true for tools such as several Magic Bullet plugins, as well as Boris FX and others. Also note that hardware acceleration is used for footage decoding and playback, both in Premiere and AE, so depending on your requirements it may make sense to get a slightly more powerful card.
Thanks everyone, this has helped.
But, now, as always, my questions have changed somewhat.
Actually, I swore that the 9800GX2 was fully supported, but it isn't. According to this site, these are the supported cards:
ATI Radeon X1950 PCI-Express Full Support
ATI Radeon X1900 PCI-Express Full Support
ATI X1900 PCI-Express Full Support Mac only
NVIDIA Quadro FX 4500 PCI-Express Full Support Mac only
NVIDIA GeForce 7300 PCI-Express Full Support Mac only
NVIDIA Quadro FX 5500/5600 PCI-Express Full Support
NVIDIA Quadro FX 4500 PCI-Express Full Support
NVIDIA Quadro FX 3500 PCI-Express Full Support
NVIDIA Quadro FX 1500 PCI-Express Full Support
NVIDIA GeForce 8800 PCI-Express Full Support
NVIDIA GeForce 7800 PCI-Express Full Support
Really, the only cards I am interested in from this list are the 8800 vs. the Quadro FX 5600. This makes the choice easier, because the 8800 series does not accelerate HD playback (which makes me wonder what the full definition of "fully supported" is), at least according to their web info. The only other thing that is noticeable about these cards is that they have all been out for a while. Nothing brand new is listed.
So, is the Quadro FX 5600 worth it? The huge price of the Quadro line has to be, in some way, marketing aimed at the big corporations buying workstations, since your average consumer wouldn't pay that much. But do the big corps get their money's worth? Or should I get a non-supported card and keep my fingers crossed for months that eventually it will be "fully supported" too? I figure that "fully supported" means the software "recognizes" the card and offloads to it whichever features it can, which would otherwise be handled by the CPU. This is just a guess. But surely, based on this list, CS3 does not utilize every one of these cards equally, and a breakdown of exactly which features are enhanced by each card in each program in the suite would be eminently helpful. I doubt any user has this info!
The trick is this: I want to 'future-proof' my system. I want to buy the 'big enchilada' right now and save my future money for CS4, plugins, OS upgrades (like the upcoming Vista replacement), etc. And as Mylenium pointed out, certain plugins might come to count heavily on GPU hardware acceleration.
Like the others have suggested, I am going to get something like 16 GB of RAM for a Mac Pro, or 8 GB for a Windows machine.
>the only cards I am interested from this list are the 8800 VS the
Mmh, well, I could point you to several threads on this forum that go on about the joys of 8800 cards causing issues with AE, but that would kill your enthusiasm entirely... As for the rest: surely the price of Quadros seems excessive, but the matter is more complex. Every Quadro I ever had shipped with higher memory clocks, better voltage and temperature management, and much, much quieter cooling units compared to the equivalent GeForce models. Plus, some models have memory configurations that you simply cannot get on normal GeForce cards.
Depending on where you buy your stuff, you also get extended warranty periods and additional support. I, for instance, can have the GeForce 5500 in my current workstation replaced overnight, free of charge, if it ever goes boom... The question really is: do you want that kind of "safety", and will you pay for it? If you don't, then a normal GeForce will do just fine. It's not as if they explode two days after purchase or something. ;-)
Regarding the future-proofing aspect: you should not make any assumptions about this. The only thing that is certain is that in half a year there will be better processors and better graphics cards, and then even the most expensive, current top-of-the-line model will no longer be so top. ;-) Therefore, base all your buying decisions on your current needs, plus the timeframe you can predict in terms of what additional software you intend to buy in a year or so.
In the greater scheme of things, there will not be that many changes in the foreseeable future. All current hardware supports OpenGL 2.1. OpenGL 3 has been released as a final draft, but it will take time before hardware and software emerge that support it. Because vendors want to sell software, they will compromise and settle for the lowest common denominator, which even in two years will still be OpenGL 2.1. So no worries. You will still be able to use all your current software in two or three years, just as you will find new software that is still built around these more conservative specs. Remember: this is not the gaming industry, which expects you to buy a new PC whenever they release a new game that may require the latest DirectX for all the eye candy. ;-)
What about 64-bit video rendering? Surely having loads of memory will not be a bad investment, because this should happen soon. Avid claims it will be here at the end of June this year.
How do you feel about 64-bit, and what hardware changes do we need besides more memory?
I think 64-bit is the most misunderstood concept out there, and currently everyone throws it around without really recognizing what little practical benefit it has. Avid may claim whatever they want, but all they are doing is making their 64-bit codecs native, which will neither affect people's workflows nor, for the most part, how those codecs use system resources. A minor benefit may be better parallelization, but let's be honest: does anyone with an average daily workload ever notice a 20 to 30% performance gain that saves perhaps 10 seconds on a 5-minute clip, on a system that already uses hardware acceleration? And the promised 1000% speedups for transcodes and whatnot have yet to be seen...
Ultimately, all those 64-bit evangelists have to ask themselves the hard question: is there really any additional end-user value? From where I sit: no. There is absolutely no advantage in re-compiling an existing app to 64-bit just as a marketing ploy, as long as it is not otherwise adapted. It doesn't mean anything that in theory you have infinite memory if your particle system still cannot do more than one million particles due to other program-internal limits. 3D programs have failed to deliver on this promise for years, and I don't see it happening for other programs, either. What good does it do me if I could theoretically do half-hour RAM previews, when it still takes forever to render them? Is there really any benefit to Photoshop going 64-bit when all you get out of it is being able to open two humongously large files instead of just one?
Programming for multicore 64-bit systems requires a new way of thinking, and IMHO this has not yet happened in programmers' minds - we still have mostly conventional apps that need to jump through hoops or provide very little additional functionality. So far, the only real advantages are all based around the same thing: because on a 64-bit system you have access to more RAM, each application can have its own independent chunk without interference from other programs, and thus it runs more stably. For the same reason, internal functions that operate on large memory blocks work better because there is less fragmentation. That, and of course being able to move beyond 4 GB if needed.
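For what it's worth, that 4 GB figure falls straight out of the pointer width: a 32-bit pointer can only distinguish 2^32 byte addresses. A quick sketch of the arithmetic (illustrative only, ignoring per-OS reservations):

```python
# Why 32-bit apps hit a wall around 4 GB: an N-bit pointer can
# address at most 2**N distinct bytes.

def addressable_gib(pointer_bits):
    """Maximum directly addressable memory, in GiB."""
    return 2**pointer_bits / 1024**3

print(addressable_gib(32))   # 4.0 GiB -- the classic 32-bit ceiling
print(addressable_gib(64))   # 17179869184.0 GiB, i.e. 16 EiB in theory

# In practice a 32-bit Windows process sees even less (roughly 2-3 GB),
# because the OS reserves part of the address space for itself.
```

So the 64-bit win really is "more addressable RAM per process", exactly as described above; everything else has to come from the application actually being redesigned to use it.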
I appreciate your insightful and well thought out reply. You have really helped me a lot. I feel like you ought to be getting a fee :) Really, really, really thank you...
I've read that the Windows 64-bit version of Photoshop will be arriving before the Mac 64-bit version...
And sure, it was a demo, but rendering 1,000 characters (1,000 different animations, 1,000 different textures) looked amazing on 64-bit XSI.
It would be nice if Apple would release a 64-bit version of QuickTime.
Everybody is saying that Quadros are kind of useless for AE. BUT if you look at what the Quadro CX was capable of, even if the accelerated effects were limited, it looked like it was worth it.
Quadro series video presentation:
http://www.nvidia.co.uk/object/adobe_AftereffectsCS5_uk.html (the Camera Zoom and the 3D Animation Rendering sections)
Well, if the Quadro enhances the viewport as well as the video shows, then hell, for me it's worth it.
The real point, as the author of this topic started with: if somebody who owns a Quadro could tell us whether it really improves things as much as in the videos above, that would be a lot of help.