Well, it finally happened, my primary monitor bit the dust this morning. :( I've really become spoiled by the larger size (21") monitor and would like to replace it with a similar size. Problem is, as always for me, I need to keep the pricing as low as possible, while still getting something reasonable.
So, to that end, do you guys have any suggestions on low priced, decent monitors? Most are wide screen these days, aren't they?
I'm also going to have to get used to using an LCD screen, aren't I? The one that just died was a CRT.
What is the difference between the regular Dell monitors and the UltraSharp? I was looking at their specs, and from what I can see, some of the other ones have "faster" specs ... for whatever that is worth.
In my browsing, I did see an LG at a similar price point to the UltraSharp, for an equivalent sized monitor.
LG monitor info here
In reading up on some of this, the DVI cable (if purchased) would allow for a cleaner picture, but one can still use the supplied VGA cable, correct? Therefore it's not *required* to have the DVI cable, right?
I'm looking more this morning and I see these two monitors, but I am not sure what the difference between them is. One has wide color gamut (do I want/need this?) and the other does not. I am wondering if one is the newer model of the other?
The cable is important, as the bandwidth requirements for proper resolution are much greater than with VGA. With cheap cables you will see, for instance in an Excel spreadsheet, ghosting of the narrow lines and smearing of the edges of the type. This happens because timing errors between the color information and the luminance signal cause skewing.
The quality of the electronics to which the cable connects is vitally important as well.
The upshot of all this is that you may see artifacts and resolution problems that are due not to the image, but to the connectivity. So if sharpening yields little improvement, the problem may in fact be systemic. (Based on a 1920x1080 screen resolution.)
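To put a rough number on the bandwidth claim above, here is a back-of-the-envelope sketch of the pixel clock needed for 1920x1080 at 60 Hz. The blanking totals (2200x1125) are the standard CEA/CTA-861 timing; exact figures vary by timing standard, so treat this as an illustration, not a spec quote.

```python
# Rough pixel-clock estimate for 1920x1080 @ 60 Hz.
# Totals include horizontal/vertical blanking per the common
# CEA-861 timing (2200 x 1125); other timings differ slightly.
h_total, v_total, refresh_hz = 2200, 1125, 60
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(f"Pixel clock: {pixel_clock_mhz:.1f} MHz")  # → 148.5 MHz

# Single-link DVI tops out at a 165 MHz pixel clock, so
# 1920x1080 @ 60 Hz fits, but without a lot of headroom,
# which is why cable and termination quality start to matter.
assert pixel_clock_mhz <= 165
```

That is roughly 1.5 Gbit/s on each of the three TMDS color channels, a long way from what a VGA cable was originally asked to carry.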
Don't go beyond 10' in length. The signal losses mount quickly past that; 10' is the ideal maximum.
This is courtesy of a friend who is chief engineer for a company that makes the display setups for the likes of Circuit City, etc., who have to have clean, properly set up connections to each and every monitor or TV. When they go beyond 10' per set, differences are introduced which compromise the comparison test. The cables they use are first class.
So, if you are going to spend hundreds of dollars for the monitor, go ahead and spend the extra bucks for the high end Belkin.
While cable length can cause losses and signal distortion, I hardly think that sort of restriction makes sense in the real world.
Personally, I think cable companies make far too much of the need for their expensive products. The proof of the pudding, or in this case the picture, is in the user's experience, not what somebody measured in a lab.
I would say, try the monitor with the cable it came with, and only if you are not satisfied with what you see, get a more expensive cable.
I always have to laugh at those HiFi enthusiasts with their $10,000 valve amplifiers and half-inch diameter speaker cables. As if ...
Okay, so if I am going to use the DVI cable, what kind of card do I need in my system to take advantage of it?
The D-Sub cable is just another way of saying VGA, btw.
I picked up the LG yesterday and even bought the DVI cable, but since I don't have the proper port on my system, I can't make use of it right now.
Also, since this is a widescreen monitor, I notice that things are more "stretched" than they were on the CRT. Is there any way I can keep the accurate aspect ratio of the images? What I am seeing right now is that 100x100 pixels does not display the same way on the widescreen as it does on a standard screen. I don't know what can be done about this, if anything.
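The stretching usually happens when a 4:3 desktop resolution is scaled to fill the whole widescreen panel. A quick sketch of the arithmetic (the resolutions here are hypothetical examples, not your actual settings):

```python
# How much a square gets stretched horizontally when a desktop
# resolution is scaled to fill a panel of a different shape.
def pixel_stretch(desk_w, desk_h, panel_w, panel_h):
    # ratio of the horizontal scale factor to the vertical one;
    # 1.0 means squares stay square
    return (panel_w / desk_w) / (panel_h / desk_h)

# Example: a 1024x768 (4:3) desktop scaled onto a 1680x1050
# (16:10) panel stretches everything 20% wider than tall.
s = pixel_stretch(1024, 768, 1680, 1050)
print(f"horizontal stretch factor: {s:.2f}")  # → 1.20
```

The usual fix is to run the desktop at the panel's native resolution, so each image pixel maps to exactly one screen pixel; many monitors also have an aspect-ratio or 4:3 scaling option in their menu that letterboxes instead of stretching.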
Unfortunately, the audio debacle spilled over to the digital world.
JJ, here's the story technically.
At the frequencies necessary for proper digital communication, we resort to transmission line theory, which, for audio, made no sense at all given the cable lengths.
Transmission lines require termination in a load which matches the characteristic impedance of the line, typically on the order of 50 ohms unbalanced and up to 100 ohms balanced. If you think 10 feet is inconsequential, consider that the Intel Core 2 Duos have what is called a Front Side Bus, on the order of a few inches, which is rigorously controlled as to line impedance and termination. It is directly involved in the marvelous performance of those devices, matching and exceeding the AMD parts they compete with, and AMD doesn't have a front side bus! If the FSB were not controlled, we would all be either running AMD or running behind.
Think of it this way. The pulse which corresponds to, say, a "one" is like a golf ball in a steady stream of golf balls heading for a target. If the target is somewhat soft and thin, the ball gets through and lands in the space waiting for a "one" or a "zero". Now consider a hard target. The ball hits it, bounces back, and uh oh, it's heading back to the source. But wait, there's more! Another ball is heading for the target, and the returning ball hits it; in this case, it's a zero ball. The two cancel, and a pixel location gets the wrong info.
I have worked with transmission lines for years, both RF and digital. At Intel, I was responsible for measurements involving FSB performance on CPU to MCH (Northbridge) connections.
Since I have no recent experience with the practical notions of DVI cabling, I went to my source, who detailed it for me, far more than I have written here.
Anyway, take it for what it's worth. You can do as JJ suggests and try before you buy. And buy from a dealer that allows returns, should you decide that, for you, it makes no difference.
One more thing: The electronics has to terminate the cable correctly. If it doesn't, no amount of cable upgrade will improve performance!