
VGA, DVI, HDMI...

^+1

That essentially covers the connectors. Keep in mind that other components on the graphics card will also affect the quality of video output. For more on output quality and on upgrading a video card for home-theater or other non-gaming purposes, see:

[removed]
 
Wrong, just wrong.
VGA was fine the whole time it was in use, at resolutions including 2560x1600 @ 200 Hz, and with cable lengths way past the spec (I've seen conference-room projectors with 25+ m cables).

Now it's 2011, everybody says VGA sucks, and so it is officially dead and you will have to go the DVI/HDMI way.

As this is really more a question of future-proofing, I would suggest HDMI, as it's the newest (and shittiest) of them all.

Yes, HDMI is bad because low-quality HDMI cables can't handle anything, whereas low-quality VGA cables were never a problem.

Also, HDMI is standard on all TVs and is their preferred input, so it's probably the best choice in terms of compatibility.
 

So VGA and DVI offer similar performance?
 
VGA is for analog CRT monitors. The image in the GPU RAM is digital. Video cards had a RAMDAC chip to convert the signal to analog for the CRT.

LCD monitors need a digital signal. DVI is better because no conversion is necessary, resulting in a clearer signal.
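The two paths can be sketched in a toy model (plain Python, invented names, deliberately oversimplified): an 8-bit pixel value survives a digital link bit-for-bit, while the VGA path's DAC → cable → ADC round trip can land on a neighbouring value.

```python
import random

def vga_roundtrip(pixels, noise_lsb=1.5):
    """Toy VGA path: 8-bit pixel -> RAMDAC (analog) -> noisy cable
    -> the LCD's ADC samples it back to an 8-bit value.
    Noise is expressed in LSB units for simplicity."""
    out = []
    for p in pixels:
        analog = p + random.uniform(-noise_lsb, noise_lsb)  # cable noise
        out.append(max(0, min(255, round(analog))))         # monitor's ADC
    return out

def dvi_link(pixels):
    """Digital link: the 8-bit values arrive untouched."""
    return list(pixels)

src = [0, 64, 128, 200, 255]
recovered = vga_roundtrip(src)
# recovered values land near, but not necessarily exactly on, the
# originals; dvi_link(src) == src always holds.
```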
 

That is just theory. It should be true, but in practice it isn't.

And yes, it is absurd to convert digital to analog only to convert it back to digital afterwards, but hey, it works and it's cheap.

That still does not change the fact that, in reality, HDMI is much more subject to noise than VGA is, and VGA has always been more than enough even for specs far beyond those supported by current HDMI revisions.

All in all, let us forget this and focus on the facts: nobody has VGA anymore, nobody builds for VGA anymore, we're all doomed and we will have to use that ugly failspec called HDMI. All hail our new connector overlords!
 
VGA is not used very often anymore, as evidenced by modern video cards shipping without VGA ports. DVI and HDMI are the same thing, except that HDMI carries sound as well as video. I have never had a problem with noise using DVI or HDMI.
 
^^ I have never seen noise on a DVI- or HDMI-connected display from a PC. Has anyone else?

I read a fuckton of reviews, and I have read of people having issues with DVI (very rarely).

Also, I have heard many times, both from reviews and from friends, that their HDMI cable had noise and they had to buy a better one.

Besides, I have never seen noise on VGA, nor have I heard about it.

Which means that both can have noise, but HDMI's weakness is still bigger: even though it's recent, there are already many more reports of "faulty HDMI cable" and the like.
 
I asked about the quality of VGA and DVI because I bought a monitor from Dell (ST2320L) that didn't come with a DVI cable. Wondering if I should order one from DealExtreme or not.

Does it increase quality? And does DVI quality vary between different cables?

Thank you.
 

Noise on a digital spec like HDMI/DVI/DisplayPort kind of makes no sense. In virtually all cases, it should either work or not work. This is why people laugh at the sales of 'Monster' HDMI cables, as the zeros and ones being transmitted will get no benefit from being put over a more expensive cable, something completely logical on an analog setup like VGA.
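That works-or-doesn't behaviour can be illustrated with a toy two-level link (hypothetical numbers, not a model of any real cable): as long as the received voltage stays on the right side of the decision threshold, a "cheap" lossy cable delivers exactly the same bits as a "premium" one.

```python
import random

def send_bits(bits, attenuation, noise_sigma):
    """Transmit bits as two voltage levels over a lossy, noisy link;
    the receiver simply thresholds at the midpoint."""
    received = []
    for b in bits:
        v = (1.0 if b else 0.0) * attenuation + random.gauss(0, noise_sigma)
        received.append(1 if v > attenuation / 2 else 0)
    return received

random.seed(0)
bits = [random.randint(0, 1) for _ in range(10_000)]
cheap   = send_bits(bits, attenuation=0.7, noise_sigma=0.03)
premium = send_bits(bits, attenuation=0.9, noise_sigma=0.01)
# Both cables recover every single bit: the zeros and ones carry no
# extra quality for the premium cable to improve on.
```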

I, like many here, am in the somewhat dreary IT field. I deal with thousands of desktops and servers annually, and during the past few years, none of those thousands of units has had a display problem related to noise over a digital cable. I have seen a handful of cables that either weren't up to spec (single-link DVI not capable of a large display's native resolution) or were defective.

Before digital displays became the standard, I could reasonably estimate that 10-15% of the VGA displays I saw were poorly implemented, whether through low-quality cables, low-quality analog circuitry on the video card itself, or just a poor-quality CRT. Usually a combination of the first two. This isn't really a failing of the VGA standard, but of cheap implementations of it. Anyone who had a cheap TNT or GeForce card back in the day from one of the lower-quality manufacturers can attest that 2D image quality at 1280x1024 and above could be pretty spotty.

It's one of the reasons Matrox video cards were prized by people who worked with digital imaging and desktop publishing: not only the 2D performance, but the image clarity. Some of the higher-end CRT displays even had the large breakout multi-coax connectors in addition to a standard VGA connection, to enable the best possible signal at resolutions like 1600x1200 and beyond (Sony FW900 FTW!).

In short, I saw many, MANY more issues with VGA to VGA than I've ever seen with digital to digital LCD setups.

That's not to say that I don't miss how easily CRTs could bounce between dozens of resolutions and refresh rates, but the analog VGA standard itself was pushed about as far as it could logically go once 1:1-pixel digital tech became widespread. Analog into a digital display is never quite perfect, as the DAC-to-ADC round trip has to approximate the resulting value for each pixel.
 

It won't necessarily increase quality, but you will get the maximum possible quality from your particular setup should you use DVI from your video card to the DVI input on your display. I find that most of the time for lower resolutions, using an analog VGA cable to an LCD display isn't all that terrible.

For the cheap guarantee of best quality, I'd recommend it.
 
Besides, I have never seen noise on VGA, neither have I heard about it.
Except I'm certain that you have seen noise on VGA and just haven't noticed it, because it's much more subtle. But it's there.
Which means that both can have noise, but still HDMI's weakness is bigger, because even though it's recent, there already are many more reports of "faulty HDMI cable" and such.
What you are trying to say is, cheap cable == bad picture? Well, duh. That's true in the VGA world as well.
 
DVI and HDMI are both digital, so the quality will be identical (except that HDMI also carries sound). Generally you'll find HDMI on HDTVs (because they need sound) and DVI on computer monitors (I really don't understand the point of HDMI ports on a computer monitor).

The cable doesn't matter either; it's digital.
 

It's useful for people who like to connect their game consoles to their computer monitors.
 
Analog signals degrade more or less linearly; digital signals work perfectly until they suddenly don't work at all. For a typical PC cable length, analog is not going to come out ahead. In large boardrooms with projectors, analog may still make some sense due to its greater cable length, but digital extensions/repeaters solve that problem as well.
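A quick simulation of that cliff (a toy model with invented numbers, not real link parameters): sweep the noise level and compare the mean pixel error of an analog link against the bit error rate of a thresholded digital link carrying the same data.

```python
import random

def analog_error(noise_lsb, trials=2000):
    """Mean absolute error of 8-bit values sent over a noisy analog link."""
    rng = random.Random(42)
    total = 0
    for _ in range(trials):
        p = rng.randint(0, 255)
        seen = max(0, min(255, round(p + rng.gauss(0, noise_lsb))))
        total += abs(seen - p)
    return total / trials

def digital_error(noise_lsb, trials=2000):
    """Bit error rate when the same link carries thresholded 0/1 symbols
    (the symbol swing is the full 255-LSB scale, so noise is relatively tiny)."""
    rng = random.Random(42)
    bad = 0
    for _ in range(trials):
        b = rng.randint(0, 1)
        v = b + rng.gauss(0, noise_lsb / 255)
        bad += (1 if v > 0.5 else 0) != b
    return bad / trials

# Analog error grows smoothly with noise; digital stays at exactly zero
# until the noise approaches the decision threshold, then falls off a cliff.
for noise in (5, 40, 200):
    print(noise, analog_error(noise), digital_error(noise))
```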
 
Anecdotally, CRTs look best with VGA and LCDs look best with DVI/HDMI. I've never used a CRT with a digital input, but each of the few times I've tried an analog input on an LCD it's looked like garbage in comparison to digital (everything is fuzzier, and the interference is visible).
 

Thanks!
 
I'd be screwed if the video card companies didn't include a VGA adapter with the new cards...My aging CRT just won't connect to the other connectors, no matter how hard I try to force the fit!! 😛
 