11 Replies Latest reply on Sep 15, 2010 9:25 AM by mikeklar

    Harm Millaard You're Correct

    mikeklar Level 1

      The NVIDIA Quadro FX 3800 is slower.

      Compared to cheaper gaming cards, it dropped the Windows Experience Index scores for Graphics and Gaming Graphics from 7.4 to 6.9.

      It's a good thing I don't play games, but it is a bit of a shock to pay at least four times more for a card and see a noticeably lower score.  However, when scrubbing it's a delight.

      Having said that, I do have a couple of questions which hopefully you or someone here can clear up for me:

       

      - When loading the latest card driver, it assigned the monitors in reverse order compared with the basic driver that installed itself right after the card was fitted: the monitor plugged into the DVI-I port (identified as #1) is shown as monitor #2, and the one plugged into the DisplayPort (identified as #2) is now monitor #1.  The DisplayPort cable's other end is a DVI-D connector, and the DVI-I port is connected with a DVI-I cable (this monitor is also DVI-I compatible).  Have you encountered or heard of this?  It is not a serious issue, and of course it is solvable by configuring which display one wants to use; it is just a concern, since I've never encountered this before.

       

      - This brings me to my second question: is it better to use both DisplayPorts rather than the setup I've described?

       

      Thanks and Cheers,

      Michael

        • 1. Re: Harm Millaard You're Correct
          shooternz Level 6

          Compared to cheaper gaming cards, it dropped the Windows Experience Index scores for Graphics and Gaming Graphics from 7.4 to 6.9.

          It's a good thing I don't play games, but it is a bit of a shock to pay at least four times more for a card and see a noticeably lower score.

           

           

          What were the "cheaper gaming cards" you are comparing to?  I.e., are they MPE / CUDA capable?

           

          Does the Windows Performance Index measure speed?  If so, how much time does a 0.5-point difference represent in an MPE-enabled NLE?

           

          What will you do with the time you save?

          • 2. Re: Harm Millaard You're Correct
            ECBowen Most Valuable Participant

             We get this question all of the time. The video card's BIOS identifies and IDs the primary monitor at POST. The ID number has no bearing on functionality. Windows 7 just treats the displays as virtual space and does not care which monitor number is primary or how each subsequent monitor is labeled. The user's configuration alone decides how the monitors are arranged and in what order. The only advantage to using DisplayPort for both is the 10-bit color option, if you have a display and codec that support it.

             

            Eric

            ADK

            • 3. Re: Harm Millaard You're Correct
              mikeklar Level 1

              Thank you Eric

              I'll connect both monitors to the DisplayPorts.

               One monitor is capable of 10-bit color; it's an Eizo CG211.  I use it mostly for color-correcting artwork for printing.

               However, there already seems to be a visible improvement in the gradation of colors with the FX 3800 using the DVI-I connector on the card.

               I'll let you know whether this improves when it is connected via DisplayPort later today.

              Cheers,

              Michael

              • 4. Re: Harm Millaard You're Correct
                mikeklar Level 1

                 I have a number of cards; the one that was replaced is the ATI Radeon HD 5770, and it is not CUDA capable.

                 Regarding the Windows performance measure, I'm assuming it measures speed and overall performance.  The scale goes from 1.0 through 7.9, so a 0.5 drop is roughly 7% of the usable range.
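As a rough arithmetic check of that percentage (a sketch only; it assumes the Windows 7 Experience Index subscores run from 1.0 to 7.9, and it says nothing about real-world editing or rendering speed):

```python
# Quick sanity check of the score drop, assuming Windows 7
# Experience Index subscores range from 1.0 to 7.9.
old_score, new_score = 7.4, 6.9
scale_min, scale_max = 1.0, 7.9

drop = old_score - new_score                    # absolute drop in points
relative_drop = drop / (scale_max - scale_min)  # fraction of the usable range

print(f"{drop:.1f} points = {relative_drop:.1%} of the 1.0-7.9 range")
# -> 0.5 points = 7.2% of the 1.0-7.9 range
```

Since the WEI is a synthetic, capped benchmark, this fraction is only a ballpark figure, not a measure of actual job time.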

                 What will I do with the time I lose?  Have less coffee.

                Cheers

                 Michael

                • 5. Re: Harm Millaard You're Correct
                  mikeklar Level 1

                   Sometimes it pays to buy more than one needs.

                   In this case it's an adapter cable from DisplayPort to DVI-I.  These are not always readily available in my area, and I ended up with two because I was concerned about delivery from one supplier.  Of course, having done so, they both arrived at the same time...

                   

                   As indicated earlier, I have now connected the main monitor to port #2 and the secondary monitor to port #3 on the card, both of which are DisplayPorts.

                   

                   Is there a visible difference?  YES, better than I expected!  Color gradation from light to dark shows no banding, which was not the case with the previous graphics cards.

                   

                   Although not important, the monitor sequence is now #1 and #2 for the main and secondary monitors.

                   One more point: when I had the DVI-I cable connected to port #1, the signal received by the monitor was in analog format; now it is in digital format, but without HDCP... Interesting...

                  Cheers,

                  Michael

                  • 6. Re: Harm Millaard You're Correct
                    ECBowen Most Valuable Participant

                    That is because DisplayPort is a digital output, while DVI-I can function in digital or analog mode. Nice news on the quality, and an interesting observation. I have heard that about 10-bit color displays, but it sounds like you have an 8-bit and a 10-bit display.

                     

                    Eric

                    ADK

                    • 7. Re: Harm Millaard You're Correct
                      shooternz Level 6

                      I have a number of cards; the one that was replaced is the ATI Radeon HD 5770, and it is not CUDA capable.

                      Regarding the Windows performance measure, I'm assuming it measures speed and overall performance.  The scale goes from 1.0 through 7.9, so a 0.5 drop is roughly 7% of the usable range.

                      What will I do with the time I lose?  Have less coffee.

                       

                       

                      The CUDA & MPE realtime advantage far outweighs a roughly 7% assumed hardware performance loss in terms of actual time saved on the job.

                       

                      It's even good for your health: no time to drink coffee, beer, etc. while rendering.

                      • 8. Re: Harm Millaard You're Correct
                        mikeklar Level 1

                        Yes, the 10-bit (with 12-bit hardware acceleration) monitor is an Eizo CG211, acquired about three and one-half years ago.  The 8-bit monitor is a little more than five years old.  Both have the same resolution, 1600x1200.  Considering what I paid for the Eizo, you could buy a top-end off-the-shelf computer for the same price.  The Eizo was acquired primarily for pre-press work, and as I'm deciding on building a newer workstation (this one is almost four years old), it may be time to consider another monitor more suited to video work.

                        Thanks again for your input regarding DisplayPort being a digital-only output.  Frankly, I was under the impression that DVI-I was capable of both analog and digital; by that I mean that the graphics card would output either or both.  Anyway, that may explain why all of the Quadro FX cards seem to come with a DVI-I to VGA adapter.

                        Cheers,

                        Michael

                        • 9. Re: Harm Millaard You're Correct
                          mikeklar Level 1

                          Good for my health, OK, but no beer?  Now that's reason enough not to buy that card.

                           

                          I appreciate your comment regarding the CUDA and MPE realtime advantage and tend to agree.

                          Cheers,

                          Michael

                          • 10. Re: Harm Millaard You're Correct
                            ECBowen Most Valuable Participant

                            DVI-I will handle either digital or analog, but not both at the same time, and which one depends on the source. In this scenario the source is digital, so DVI-I will be digital. If you were coming from a source other than DisplayPort, it might be analog.

                             

                            Eric

                            ADK

                            • 11. Re: Harm Millaard You're Correct
                              mikeklar Level 1

                              My apology, I wasn't very clear.

                              My reference was to the DVI-I port on the graphics card.  It appears that it is only capable of providing an analog output, and for digital output one must use the DisplayPorts.

                              Your comments have been very helpful and are appreciated.

                              Cheers

                              Michael