Khuramshakeel

Dilemma: GTX 570, 580 or Quadro 4000?

Feb 15, 2011 4:22 PM

Hey guys, apologies if I'm in the wrong place for this; it's my first ever post online! I have a slight dilemma: I have just recently ordered a new setup, but I'm confused as to which graphics card to go for. I will mainly use the system for HD video editing (Sony HVR-Z7) with Premiere Pro CS5 and After Effects. From what I have read, the GTX range is more than capable of accelerating certain effects in Premiere Pro, but will the Quadro be better?

 

My other main use of the PC is that I would like to hook it up to my SIM2 Lumis Host projector via HDMI or DVI. Since the Quadro has 10-bit video, would it reproduce a noticeably better image than the GTX range, or is that limited to the DisplayPort output? Is 10-bit video sent through all ports, even HDMI/DVI? I know that my projector has 10-bit video processing. I would really appreciate some guidance on this, as I'm wanting to place an order for the card ASAP.

 

(Money is not an issue with regards to those cards)

 

Many Thanks guys.

 
Replies
  • Feb 22, 2011 10:56 AM   in reply to Khuramshakeel

    The Quadro 4000 supports 10-bit color, so it should be capable of driving the SIM2 Lumis Host projector via either a DP-to-HDMI or DVI-to-HDMI connection. While everything should work in theory, I have to say that we haven't tested this specific combination as far as I'm aware. If you're looking to use the projector to play back 10-bit video output, you'll also need to verify that the playback application supports 10-bit output formats (Premiere Pro offers 10-bit, but you'd have to check anything else you plan to use for playback).

     

    Sean

     

    Sean Kilbride

    Technical Marketing Manager, NVIDIA

     
  • Feb 22, 2011 11:25 AM   in reply to Khuramshakeel

    The Z7 only records in 4:2:0 color space, so unless you want to pad the color data with zeroes, the end result will never be better than the input. Garbage in, garbage out. Converting to 4:2:2 does not add information; it only protects you somewhat from deterioration while editing.

     
  • Feb 22, 2011 3:11 PM   in reply to Khuramshakeel

    Khuramshakeel,

     

    The Quadro board isn't really going to offer any additional improvements in terms of speed, etc. The Quadro 4000 is a single-slot solution that only requires a single 6-pin power connector, and at 142 W it draws considerably less power than either of the GTX cards. If available slots or power is a concern, then that's something to consider, as is the fact that neither the GTX 570 nor the GTX 580 is currently an officially Adobe-supported solution. (Although it seems that many users have been using them successfully since they became available.)

     

    The Quadro will allow you to output a 10-bit color format, but it doesn't sound like you are working with any formats that are really going to take advantage of that capability. If the source format doesn't already contain 30-36 bits of color data, then outputting a 30-bit image will not provide additional quality, since the additional color data just doesn't exist.

     

    Sean

     

    Sean Kilbride

    Technical Marketing Manager, NVIDIA

     
  • Feb 22, 2011 4:18 PM   in reply to Khuramshakeel

    Sean explained it quite nicely. If your PSU can handle the extra power and you have the physical space for a two-slot card, my preference would be the better-specced 570/580. It gives you a better price/performance ratio than the Quadro 4000, at least for PR/AE. For 3D work, like Autodesk applications, the Quadro's drivers may be beneficial, but for regular NLE work there is no advantage to the more expensive Quadro over the more capable GTX 570/580.

     
  • May 3, 2011 11:34 PM   in reply to RacerX2oo3

    Sean, good evening,

    I see this thread is a bit older; however, since you are a resident expert, perhaps you can help me solve a dilemma.

    I need to buy a new laptop and wish to fully leverage the NVIDIA GPU along with the i7 Sandy Bridge.

    What I have tried to find out is which GPU allows full functionality of Premiere Pro CS5.5 in a laptop configuration, so far to ambiguous avail.

    I travel with a portable solution and use two eSATA drives, one active and the other a spare for cache and so on, for on-assignment, on-site content rushes, color edits, etc.

    I went to Richard Harrington's seminar in March and saw him switch to a Dell for real-time demos of the Mercury engine, though he ran PowerPoint on a Mac.

    Given the fast pace of the industry, and that Intel has (finally) birthed the second-generation i7, my question is as follows, and if you feel compelled to respond I'd appreciate it.

    Which series will be able to manage CS5 Premiere Pro processes at 100%, or run efficiently, on a laptop, off the shelf or customized? I'm prepared for either, based upon the direction from all y'all.

    Should I buy the 460M through 485M, or the 520M through 555M, or is there a requisite Quadro for laptops in the mix?

    I do not need gaming; I do need full functionality of the Mercury features within the CS5 Premiere Pro suite.

    I'm using content/edits from Nikon's top two cameras, the D3S and D7000, in H.264/MPEG-4 or .mov, combined with still NEF images using CNX2 or LR3.

    My DAW runs either Samplitude or Studio One, depending on the files from the archive, and .wav is the audio format from the Zoom H4N on location.

     

    Thanks in advance for your attention to this query.

     

    Rob Manning

     
  • May 4, 2011 7:24 AM   in reply to Khuramshakeel

    There is a new GTX 570 card that now has a DisplayPort output, which should output 10-bit color in Adobe. I still need to verify whether 5.5 updated the Adobe media player to include the HDMI 1.3 or 1.4 standards for 10-bit color via HDMI.

     

    Eric

    ADK

     
  • May 4, 2011 9:24 AM   in reply to ECBowen

    Eric,

     

    There have been many GeForce boards that have offered DisplayPort connections as an option, and they have supported 10-bit color output to a full-screen (true full screen, not a maximized window) DirectX surface. However, due to the way that most applications use traditional Windows API functions to create the application UI and viewport display, this is generally not a very useful option. In order to get 10-bit color functionality over DisplayPort within Premiere, you are still going to need a Quadro card.

     

    Sean

     
  • May 4, 2011 9:33 AM   in reply to RacerX2oo3

    Why would that matter, when the color support is based on the connectivity standard supported, the player software, and the output device? Whether the UI supports 10-bit is not really important if you want 10-bit color preview.

     

    Eric

    ADK

     
  • May 4, 2011 9:41 AM   in reply to D3S user

    Rob,

     

    You've got a lot of variables in the mix there, and trying to decide on the best setup is particularly challenging on a laptop, as laptops generally are not as configurable as desktop systems. Additionally, because laptops offer limited end-user upgrades once configured, you generally want to make sure you are selecting the right components when you initially configure one. I can't give you the ideal one-size-fits-all solution; however, I can recommend what I would do if I were in your shoes. The system that is currently sitting in my laptop bag is the HP 8740w with the Quadro 5000M GPU. So far this system has been able to perform all the editing tasks I've thrown its way. Depending on your purchase timeframe, if it were me, I would seriously take a look at a notebook system that uses the new Quadro 5010M. Are there other solutions out there that will perform similarly? I would guess the answer is likely yes, but from personal experience this is what I would recommend. I'm sure others can provide their experiences with other system setups.

     

    Sean

     

    NVIDIA, Technical Marketing

     
  • May 4, 2011 9:50 AM   in reply to ECBowen

    You're correct that the UI is unimportant if you want 10-bit preview. What I'm saying is that all the information I've received indicates that the methods Premiere uses to draw video surfaces are not compatible with the GeForce requirement that 10-bit color use a full-screen DirectX surface. Even an application that did use the correct full-screen DirectX output would likely still be drawing a UI on one of the outputs (assuming you are previewing on one display and editing on the other), and mixing 10-bit full-screen DirectX output with 8-bit windowed output for the application UI is a non-trivial coding task; I'm not aware of any application that works this way.

     

    Sean

     
  • May 4, 2011 10:04 AM   in reply to RacerX2oo3

    Isn't the DirectShow output of the player handling the full-screen output? How can displays function independently if the driver and video card cannot segregate the outputs by channel? How do I/O cards output 10-bit via HDMI from within Adobe, while the UI screen is still in 8-bit mode, if the player is not independent of the UI?

     

    Eric

    ADK

     
  • May 4, 2011 10:25 AM   in reply to ECBowen

    Eric,

     

    Can you provide a workflow description of what you're trying to accomplish, so that I can understand whether we are talking about the same thing? My understanding is that what you are suggesting, 10-bit previews over DisplayPort on a GTX 570, is not possible (from within Premiere). However, if I have a better understanding of exactly what it is you are proposing, I'll get the definitive answer from the engineering team. You can either post your desired workflow here, or send me a PM with the info and I'll get the answer straight from the guys who design this stuff.

     

    Sean, NVIDIA Technical Marketing

     
  • May 4, 2011 10:35 AM   in reply to RacerX2oo3

    The information is more for clients and for recommended configurations based on a standard 10-bit color codec workflow. In other words, if they are using CineForm's codec in Adobe, can they not get full-screen 10-bit color preview from the DisplayPort of any video card, since it's the DisplayPort standard that is supported for 10-bit color preview in Adobe's media player right now, provided they have a 10-bit color display?

     

    Eric

    ADK

     
  • May 4, 2011 1:08 PM   in reply to ECBowen

    BTW, in case you need a reference, this is from Nvidia.com:

     

    http://www.nvidia.com/docs/IO/102043/GTX-570-Web-Datasheet-Final.pdf

     

    Page 3

     

    Advanced Display Functionality

    • Two pipelines for dual independent display

    • Two dual-link DVI outputs for digital flat panel display resolutions up to 2560×1600

    • Dual integrated 400 MHz RAMDACs for analog display resolutions up to and including 2048×1536 at 85 Hz

    • HDMI 1.4a support including GPU accelerated Blu-ray 3D support, x.v.Color, HDMI Deep Color, and 7.1 digital surround sound. See www.nvidia.com/3dtv for more details.

    • Displayport 1.1a support

    • HDCP support up to 2560×1600 resolution on all digital outputs

    • 10-bit internal display processing, including support for 10-bit scanout

    • Underscan/overscan compensation and hardware scaling

     

    In case you need a reference for what Deep Color is in the HDMI standard, BTW:

     

    http://www.hdmi.org/learningcenter/faq.aspx

     

     

    HDMI 1.3:

    • Higher speed: HDMI 1.3 increases its single-link bandwidth to 340 MHz (10.2 Gbps) to support the demands of future HD display devices, such as higher resolutions, Deep Color and high frame rates. In addition, built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds.
    • Deep Color: HDMI 1.3 supports 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths, up from the 8-bit depths in previous versions of the HDMI specification, for stunning rendering of over one billion colors in unprecedented detail.
    • Broader color space: HDMI 1.3 adds support for “x.v.Color™” (which is the consumer name describing the IEC 61966-2-4 xvYCC color standard), which removes current color space limitations and enables the display of any color viewable by the human eye.
    • New mini connector: With small portable devices such as HD camcorders and still cameras demanding seamless connectivity to HDTVs, HDMI 1.3 offers a new, smaller form factor connector option.
    • Lip Sync: Because consumer electronics devices are using increasingly complex digital signal processing to enhance the clarity and detail of the content, synchronization of video and audio in user devices has become a greater challenge and could potentially require complex end-user adjustments. HDMI 1.3 incorporates automatic audio synching capabilities that allows devices to perform this synchronization automatically with total accuracy.
    • New HD lossless audio formats: In addition to HDMI’s current ability to support high-bandwidth uncompressed digital audio and all currently-available compressed formats (such as Dolby® Digital and DTS®), HDMI 1.3 adds additional support for new lossless compressed digital audio formats Dolby TrueHD and DTS-HD Master Audio™.

     

    Eric

    ADK

     
  • May 4, 2011 1:12 PM   in reply to ECBowen

    Eric,

     

    I was hoping that a workflow example would make sure I'm answering the right question for you. I'll make a couple of assumptions based on the information you provided; let me know if this differs from your expectations.

    It sounds like your question is whether an end user working with 10-bit content can expect to preview that content at its full bit depth using the DisplayPort connector on the graphics card.

    When you say preview, I'm assuming that:

    • You mean going out to full-screen playback on a monitor via Playback Settings >> External Device >> "OutputMonitor"
    • The output monitor is a 30-bit capable display, such as an HP DreamColor monitor, connected via DisplayPort.
    • The input source is 10-bit CineForm (4:2:2) (or a similar 10-bit source).
    • Premiere Pro CS5/5.5 is active on the primary output.
    • The preview shown on the 30-bit output monitor in the scenario above mirrors the output to the Program Monitor in Premiere.

     

    In the workflow here, 10-bit output would not be possible using the GTX 570 or any other Geforce card because of the limitations I mentioned above. 

     

    Premiere Pro uses a 10-bit OpenGL pixel format to handle 10-bit-per-component output. The application queries the device for 10-bit support; if that query succeeds, the application creates a 10-bit OpenGL pixel format and draws out to it. When asked for a 10-bit OpenGL pixel format, any GeForce board will fail the query, and Premiere will fall back to a standard 8-bit pixel format for the video drawing surface.
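
    To give a rough idea of what that query looks like at the API level, here is a minimal sketch of an application asking the driver for a 10-bits-per-component pixel format through the standard WGL_ARB_pixel_format extension. This is only an illustration of the technique, not Adobe's actual code; the helper name and the surrounding plumbing (obtaining wglChoosePixelFormatARB via wglGetProcAddress from a dummy context, window creation, error handling) are assumed.

        // Sketch: request a 30-bit (10 bits per R/G/B component) OpenGL pixel format.
        // If the board/driver does not expose such a format, the query returns zero
        // formats and the caller falls back to a standard 8-bpc format.
        #include <windows.h>
        #include <GL/gl.h>
        #include "wglext.h"   // WGL_*_ARB tokens and PFNWGLCHOOSEPIXELFORMATARBPROC

        int Choose30BitPixelFormat(HDC hdc, PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
        {
            const int attribs[] = {
                WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
                WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
                WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
                WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
                WGL_RED_BITS_ARB,       10,   // 10 bits per color component
                WGL_GREEN_BITS_ARB,     10,
                WGL_BLUE_BITS_ARB,      10,
                WGL_ALPHA_BITS_ARB,     2,    // 2-bit alpha completes the 32-bit pixel
                0
            };

            int  pixelFormat = 0;
            UINT numFormats  = 0;
            if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &pixelFormat, &numFormats) ||
                numFormats == 0) {
                return 0;   // no 30-bit format available: use a standard 8-bpc format instead
            }
            return pixelFormat;   // non-zero index can then be applied with SetPixelFormat()
        }

    On a Quadro the query returns a matching format; on a GeForce it comes back empty, which is the failure Premiere sees.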

     

    While 10-bit color is part of the DisplayPort specification, this does not mean that implementing a DP connection automatically provides 10-bit functionality. Many different requirements must be met: the application must specifically request a 30-bit color pixel format and implement its use; the output device must be capable of receiving 30-bit data over the DP connection and correctly implement an appropriate color space; and the graphics hardware must be capable of supporting the 30-bit pixel format request at both the hardware and driver level.

    Hopefully this addresses the use cases your customers are interested in. The GeForce cards definitely support 10-bit color processing and output (and have for some time now); it's simply that the methods they use do not apply to the way 30-bit color support is implemented in professional applications like Premiere Pro. If you're interested in the details, including some of the challenges of implementing 30-bit color support, you might find this useful: http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf

     

    Sean

     

    NVIDIA, Technical Marketing

     
  • May 4, 2011 1:44 PM   in reply to RacerX2oo3

    That is basically the workflow I was asking about. So, since I listed above how the DisplayPort on the GeForce 570 clearly supports 10-bit color, how is it then that the GeForce board fails the query?

    Number 2: If one closes the preview window on the UI screen and just uses the full-screen output for preview, will that change the result of the query from Adobe?

    Number 3: How is it then that I/O devices like the Matrox, AJA, and Blackmagic are outputting 10-bit via HDMI in that same process, yet the video card cannot, when it supports 10-bit color output and can output full screen to an independent display?

    BTW, page 19 of the link you listed says the following, which explains the independent display and why it SHOULD work:

     

    Multi-Display Configurations

    In the case when a single GPU drives multiple displays, as long as there is one 30-bit compatible display active, 30-bit color is automatically enabled on all displays. At scan out, the driver automatically downgrades the 30-bit output to 24-bit for regular displays.

     

    Eric

    ADK

     
  • May 4, 2011 2:22 PM   in reply to ECBowen

    That is basically the workflow I was asking about. So, since I listed above how the DisplayPort on the GeForce 570 clearly supports 10-bit color, how is it then that the GeForce board fails the query?

     

    As I mentioned in my original post, 10-bit output on the GeForce boards requires a 10-bit full-screen DirectX surface. Premiere is asking for a 10-bit OpenGL pixel format, which is completely different from what the GeForce boards are capable of supporting.
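
    For contrast, the kind of 10-bit path the GeForce boards do expose looks roughly like the sketch below: an exclusive full-screen Direct3D 9 device with a 10:10:10:2 back buffer. This is a hypothetical illustration of that mechanism, not Premiere's or the driver's code; the helper function, window handle, and mode values are assumed.

        // Sketch: create an exclusive full-screen D3D9 device with a 10-bit back buffer.
        // D3DFMT_A2R10G10B10 is only valid as a back-buffer format in full-screen mode,
        // which is exactly the limitation discussed in this thread.
        #include <d3d9.h>
        #pragma comment(lib, "d3d9.lib")

        IDirect3DDevice9* CreateFullscreen10BitDevice(HWND hwnd, UINT width, UINT height)
        {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return NULL;

            D3DPRESENT_PARAMETERS pp = {};
            pp.Windowed         = FALSE;                  // exclusive full screen only
            pp.BackBufferWidth  = width;
            pp.BackBufferHeight = height;
            pp.BackBufferFormat = D3DFMT_A2R10G10B10;     // 10 bits per color, 2-bit alpha
            pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
            pp.hDeviceWindow    = hwnd;

            IDirect3DDevice9* device = NULL;
            HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                           D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                           &pp, &device);
            d3d->Release();
            return SUCCEEDED(hr) ? device : NULL;         // NULL if the mode/format is rejected
        }

    Premiere never takes this path; it asks for the OpenGL pixel format described earlier, which is why the two never meet on a GeForce.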

     

     

    Number 2: If one closes the preview window on the UI screen and just uses the full-screen output for preview, will that change the result of the query from Adobe?

    Wouldn't make any difference, because of the root issue mentioned above. I merely used that example to be sure that we were using the same meaning of "preview."

    Number 3: How is it then that I/O devices like the Matrox, AJA, and Blackmagic are outputting 10-bit via HDMI in that same process, yet the video card cannot, when it supports 10-bit color output and can output full screen to an independent display?

    Unfortunately, I'm not familiar with the specifics of how I/O cards process video for output.

     

    Sean

     

  • May 4, 2011 2:53 PM   in reply to RacerX2oo3

    Well, that gives me the info to ask the I/O device manufacturers whether they use OpenGL or DirectX. Thanks for the information.

     

    Eric

    ADK

     
  • May 4, 2011 3:19 PM   in reply to ECBowen

    Eric,

     

    Thanks for the detailed questions. Although I do not game per se, the technical aspects you've pointed out have been interesting to follow.

    In my case, I'm certain that at some point the protocols will mesh for games and edits, but so far it appears that the industry supports GPUs that lean in one direction or the other, with still no common format similar to the MIDI standard from way back in the stone age, when manufacturers actually conferred and agreed.

     

    I did make notes.

     

    Thanks again,

     

    Rob

     
  • May 4, 2011 3:38 PM   in reply to D3S user

    Well, OpenGL and Quadros go way back, and I don't expect that to change regardless of what OpenGL versions the GeForce cards support. The API being the limitation, though, was not what I expected, but at least now that is official. That is definitely nice to know, since it did not make sense before.

     

    Eric

    ADK

     
  • May 4, 2011 3:42 PM   in reply to RacerX2oo3

    Sean,

     

    I spent all morning rooting around the web, reading about specs and so on, and following with some interest the exchanges between you and deep-gamer Eric.

    Do you know, or have, an ETA on the 2000M and 5010M reaching folks like Puget?

    They have responded indicating they may have the Quadro 2000M soon. However, at this juncture, perhaps my vague understanding of the Mercury capabilities in Adobe Premiere Pro CS5/5.5 is exaggerated by misread web-posted specs and comments. Many are too fractious to quote here, but paraphrased: the 460M is a dumbed-down this or that, more for prosumer gamers, lacks full CUDA access versus older designs, and so on.

    So, back to your expertise: would it be fair to assume that the 460M, or better still the 485M, with an i7 Sandy Bridge chip, is enough firepower for building layered HD content and broadcast-quality products that render into the various formats, from media phones, DVD, and web to an HDTV television signal?

    Or would it be a better bet to wait until GPUs such as the Quadro 2000M and/or 5010M are actually shipping, and would that buy me much beyond some nano speed increase that I'd not be able to effectively notice, making it overkill considering what the 485M can already do?

    Sorry to seem confused still, but I am gathering to make a buying decision, so thanks to all for the help!

     

    Rob

     
  • May 4, 2011 4:18 PM   in reply to D3S user

    My questions were not based on gaming but on our clients interested in 10-bit color workflows that include preview. Because 10-bit color preview requires so many factors to implement, including source, player, driver, transmission protocol, and preview device, it is something many of my clients have no idea about. If I am to advise them correctly on what they need, then that info is critical, especially for support, since many editors are trying to make the move to 10-bit as well, particularly with R3D material.

    Your question depends on the codec(s) of the material you will be dealing with and how much work you do in AE or other compositing programs. If you use standard codecs like AVCHD, H.264 (including DSLR), XDCAM, and other 8-bit color codecs, and your AE workload is not significant, then the Quadro mobile cards will not increase your performance. Regardless of what you read, the GeForce cards are not limited with CUDA. Delivery would only come into play if you need to deliver 10-bit material. If your AE or PS workload is significant, then the Quadro M would be the way to go, since Premiere will not show any performance increase with those cards.

     

    Eric

    ADK

  • May 4, 2011 4:45 PM   in reply to D3S user

    Rob,

     

    I have to be honest that questions like this are tricky to answer. It's difficult to judge the overall performance of a system based on one or two individual components. Additionally, the entire issue of certified graphics solutions versus non-certified comes into play as well. While I understand that there are many people using non-supported GPUs who are perfectly happy with the results, recommending a non-supported graphics product is not something that I'm comfortable doing. On the desktop side that isn't as big a deal, since with the arrival of 5.5 there is now a broad range of supported solutions to choose from; on the notebook side of things, however, those choices are more limited. Given the available supported solutions, I can say that I've personally tested solutions using the 5000M and been very happy with the results (HP 8740w). The 2000M will be a very usable solution for less demanding timelines, but for effects-heavy timelines using multiple MPE effects (note this includes scaling, blending, etc.), I would recommend a system based around the 4000M at a minimum.

    Some of the questions about which GPU to choose are questions that only you can answer. Choosing one of the Quadro mobile cards will likely mean that you're going to be purchasing a system from Dell, HP, or Lenovo, since I'm not aware of any custom laptop shops that offer Quadro mobile graphics as an option. If you decide to go with a GeForce mobile GPU, I'd recommend buying from a company with a good reputation for support in the area you plan to use your notebook. ADK has a good reputation around these forums because they provide exactly that type of support; this is evidenced by the interest representatives like Eric take in understanding the complexities of aspects like 10-bit color support. Keep that dedication in mind if you choose to purchase a custom notebook solution.

     

    Sean

     

    NVIDIA, Technical Marketing

     
  • May 4, 2011 4:58 PM   in reply to ECBowen

    Eric,

     

    I saw a deep-gaming forum with an ADK moniker and an Eric associated with it. Forgive the misstep; no intention to disparage.

    I very much appreciate your advice on the GPUs, however.

    Being versed/trained in creative work, writing, composing, etc., and pro recording DAR/DAW, but video... at 8-bit, well, you can see that even with technical chops, I certainly can always learn more.

    The CS5 Premiere Pro software is now my learning curve of choice, so thanks to all for the occasional newbie questions being tossed out for all y'all to engage with (or not).

     

    Regards,

     

    Rob

     
  • May 4, 2011 6:09 PM   in reply to RacerX2oo3

    Sean,

     

    I will only be buying an NVIDIA GPU; no compromise on gear.

    A lifetime's worth of guitars, amps, recording platforms, mics, and effects has more recently led to a significant investment in heavyweight Nikon tools, for the simple reason, admittedly, that the best tools allow for expected results, even when technology flies by at warp speed.

    So the ONLY reason I tagged you folks last night was that I needed an informed, contemporaneous, in-touch opinion from hands-on folks like you and Eric.

    Sooner or later a buying decision needs to be made, so that the new gear manifests in the studio/locker/kit as a real-time acquisition, ready for tasking.

    Sure, in six months the new bells and whistles will be here; however, what we folks over in Nikon land (http://www.nikonians.org) like to call NAS (Nikon Acquisition Syndrome) can cause one to wait, and...?

    The six months' worth of shooting that we could be doing with what is available simply slips away.

    You indicate that existing processor(s), managing throughput, functioning at deep levels, and performing as expected, has so far been the best illustration in answer to my initial post, so thanks!

    I will research further now and start from scratch to better mirror your most recent advice and, to a greater degree, implement a version of the systems you have suggested.

     

    Many thanks,

     

    Rob

     
  • May 12, 2011 10:37 PM   in reply to RacerX2oo3

    Sean,

    I've been plowing through options, many of them, including the warp-speed HP laptop that you suggested.

    "...that there are many people using non-supported GPU's..(NOT) comfortable doing.  On the desktop side that isn't as big a deal since with the arrival of 5.5..."

    The issue now is budget, of course, what else is new, so my nephew and I are considering standalone solutions, which we could build ourselves (he's IT certified, I've built DAWs, etc.), or buying a bare-bones system from Tiger Direct and so on.

    The issue for me remains whether I can adequately construct and render H.264 content using After Effects and the other suite-driven sandboxes in the CS5.5 Premiere Pro playground.

    We attended Rich Harrington's set here in Los Angeles in March; my nephew just informed me that the Dell he used was indeed a custom build. Rich e-mailed me and said to stick with the approved GPUs listed on Adobe's site when I asked him about all this, oh, and to have fun as well.

    My insistence on a laptop is perhaps ill conceived, but sometimes, out for a week in a place like Death Valley, loading content at the end of the day, well… you get the picture, I suppose. Keepers or reshoot, and so on.

    What seems abundant in commercially available brands, for the most part, are the GTX 460s off the shelf, or as high as the GTX 485s on build-to-suit, paired with i7s at some level; however, the high-end laptops for graphics rather than gaming include both older and newer cards (inventory on hand), some in the 1000M series.

    Given this site's evaluations, http://www.notebookcheck.net/Comparison-of-Laptop-Graphics-Cards.130.0.html, the order of scrutiny had the Class One category with the GTX 400 series at the top and the Quadro 5010M and 4000M a bit further down the list.

    Question then, for actual use with CS5.5: is there an advantage in using the GTX M's, or are the 4000M and 5010M more suitable as a platform element?

    I did contact Puget and ADK about possibilities regarding the solutions, BTW, and so far have received great advice.

    Pending a response to the question above, for now my needs warrant a look at the QFX4000M to prevent bottlenecks as content evolves, rather than hasten buyer's remorse because the assemblage outstripped the GPU/CPU/RAM combination, which is where I am now.

    Thanks in advance for your answer,

    Rob

     
  • May 12, 2011 10:42 PM   in reply to RacerX2oo3

    That would be the Quadro 4000M, not QFX, which does not exist.

     

    RM

     