Difference between DVI and VGA?

moogle077

Member
Mar 18, 2003
87
0
0
Could someone explain what the major differences between VGA and DVI are? I know that DVI is digital and people say it's better, but I don't know why. I don't know if I would need it or not (I don't play many computer games anymore, maybe a bit of War3 or something). I usually use the computer for web editing, image manipulation, and I'll begin with Flash creation soon. <~~ of course there's the usual text doc stuff, which really doesn't matter at all.
 

BoberFett

Lifer
Oct 9, 1999
37,562
9
81
Computers are digital; they produce a digital image. To output to a VGA monitor, the digital data has to be turned into an analog signal. During the conversion from digital to analog, there's a loss of clarity. When it gets to the monitor, that loss of clarity can turn into things like false colors, ghosting and so forth, depending on the quality of the DAC (digital-to-analog converter), the quality of the cable, etc. When using the DVI port, the data is digital all the way. What gets displayed on the screen is exactly what the video card produced.
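
To picture that round trip, here's a rough Python sketch, not how any real hardware works: 8-bit pixel values become an analog voltage of roughly 0-0.7 V (the standard VGA video level), pick up a little noise along the cable, and a flat panel's VGA input digitizes them again. The 2 mV noise figure is made up purely for illustration.

import random

# Rough model of the VGA round trip: DAC on the video card, noisy cable,
# ADC in the monitor. All figures are illustrative.
random.seed(0)
FULL_SCALE = 0.7                     # approximate VGA video level, volts
NOISE = 0.002                        # assumed cable/DAC noise, volts

pixels = [random.randrange(256) for _ in range(100_000)]   # original 8-bit values

errors = 0
for p in pixels:
    analog = p / 255 * FULL_SCALE                  # video card's DAC
    analog += random.gauss(0, NOISE)               # losses and interference on the way
    recovered = round(analog / FULL_SCALE * 255)   # monitor's ADC
    recovered = min(255, max(0, recovered))
    if recovered != p:
        errors += 1

print(f"{errors} of {len(pixels)} pixels came back slightly wrong")

With a DVI link there is no DAC/ADC pair in the path, so this particular source of error simply isn't there.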
 

moogle077

Member
Mar 18, 2003
87
0
0
So, by having a DVI output/input ... I would be eliminating the ghosting effect that people complain about so much?
 

moogle077

Member
Mar 18, 2003
87
0
0
Wow ...

dang, that just eliminated the samsung 170MP I was looking at. I was gonna buy that today or tomorrow too o_O
 

moogle077

Member
Mar 18, 2003
87
0
0
geez, the 171MP has

Response time: 30 ms

Input signals: Analog RGB, CVBS, S-Video, Component/HDTV, TV (antenna/cable); Video Level: Analog; Analog: 0.7VP-P; Sync Type: Separate H/V, Composite H/V, SOG
Input connector/cable: 15-pin D-sub, S-Video, RCA connector, Component x 2, HDTV, TV antenna/cable


All that and no DVI? :( It also costs like $100 more than the 170MP
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Originally posted by: BoberFett
Computers are digital; they produce a digital image. To output to a VGA monitor, the digital data has to be turned into an analog signal. During the conversion from digital to analog, there's a loss of clarity. When it gets to the monitor, that loss of clarity can turn into things like false colors, ghosting and so forth, depending on the quality of the DAC (digital-to-analog converter), the quality of the cable, etc. When using the DVI port, the data is digital all the way. What gets displayed on the screen is exactly what the video card produced.

The chief advantage of DVI is improved image stability. The DVI interface provides a pixel clock, whereas with the analog interface the clock used to sample the analog video has to be derived from the horizontal sync.
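
To put rough numbers on that, here's a back-of-the-envelope Python sketch using the standard VESA timing for 1280x1024 at 60 Hz (the figures are just an example, not tied to any particular monitor):

# The monitor can measure horizontal sync, but it has to assume the total
# line width in order to reconstruct the pixel clock.
h_sync_khz = 63.98         # horizontal sync frequency for 1280x1024@60Hz
pixels_per_line = 1688     # total pixels per line, visible plus blanking

pixel_clock_mhz = h_sync_khz * pixels_per_line / 1000
print(f"reconstructed pixel clock: {pixel_clock_mhz:.1f} MHz")   # ~108 MHz

# If the assumed line width or the PLL phase is off, the monitor samples the
# analog signal in the wrong places. Over DVI that 108 MHz clock is carried
# on the link itself, so there is nothing to reconstruct.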

Even though it is possible to have pixel jitter with the analog interface, it is extremely rare, so if you're seeing noise, odds are something is wrong.

The only real advantage of a digital interface may be some reduction in "ghosts," which are usually caused by impedance problems at the connectors. This does not mean that digital connections are immune to impedance mismatches. Generally speaking, they are more sensitive, because they run at very high frequencies.

The "you shouldn't be converting to analog only to convert back to digital" argument doesn't really make a whole lot of sense - it ignores the fact that there's yet another conversion back to analog that goes on within the LCD panel!

Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to analog in order to achieve the 16M colors. If an LCD were purely digital, only two colors, black and white, would be achievable. To generate the 16M colors, each red, green and blue cell must be capable of stepping through 256 shades, and that is an analog function. In fact, most LCDs maintain the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).
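
For what it's worth, the arithmetic behind the "16M" figure (assuming 8 bits per channel), as a quick Python check:

shades_per_channel = 2 ** 8            # 256 analog levels per red/green/blue cell
total_colors = shades_per_channel ** 3
print(f"{total_colors:,}")             # 16,777,216 -- the "16M" colors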

Analog and digital are simply two different ways of encoding information onto an electrical signal - neither is inherently "better" or "worse" than the other.

Whether you will see a difference depends on many factors. For example, some video cards have filter circuits on the analog video output, which will degrade the image to some extent. All else being equal, it is similar to the Coke / Pepsi challenge: some people will say the image looks better on one than the other, but I attribute some of this to the placebo effect. Many find it difficult to tell the difference in a blind test where they do not know which interface is being used. The problem is that what may be an appreciable difference to me may not be one for you.

The analog interface is getting a bad rap due to poorly designed video cards, IMHO. In my studies it is virtually impossible to tell the difference between a properly designed analog interface and DVI.

I'm not arguing with those who notice a big difference; what they don't realize is that not all analog video interfaces are the same. Many poorly designed (IMHO) video cards use RF filters on the analog video lines to reduce RF emissions. There are many ways to reduce RF emissions without these RF circuits hanging on the video lines. These circuits add capacitance to the video signals, increasing the rise and fall times of the signal. These slower rise and fall times create soft edges, especially on text or on sharp black-to-white and white-to-black transitions. Thus the DVI connection will look better.
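
As a rough illustration of why that capacitance matters, here's a Python sketch; the resistor and capacitor values are invented for illustration, not taken from any particular card:

R = 75                       # ohms, typical video line impedance
C = 22e-12                   # farads, assumed added filter capacitance

rise_time_ns = 2.2 * R * C * 1e9       # 10%-90% rise time of a first-order RC filter
pixel_period_ns = 1e9 / 108e6          # one pixel at a 108 MHz pixel clock

print(f"rise time ~{rise_time_ns:.1f} ns vs {pixel_period_ns:.1f} ns per pixel")
# When the rise time becomes a large fraction of the pixel period, sharp
# black-to-white transitions smear into the next pixel -- soft edges on text.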

It is not the fact that you are using the analog interface; it is that the design of the analog interface you have may be a poor one compared to others with much better designs. To make the general statement that DVI is better is very short-sighted.
 

BoberFett

Lifer
Oct 9, 1999
37,562
9
81
Gosharkss

An analog signal can be produced that is as clean as a digital one. But from a cost perspective, I doubt most home users want to spend the extra it would cost for quality analog when they could just use digital.
 

moogle077

Member
Mar 18, 2003
87
0
0
So ...

in theory, if I used a high-end GF4 or a Radeon 9700 Pro ...

an analog signal would not show too much quality loss compared to a DVI signal?
 

lowinor

Junior Member
Mar 20, 2003
21
0
0
Well, I have a Sony SDM-X72 that I use as a second monitor on my main workstation.

One of the main draws for me is that it has multiple inputs; I have it hooked up to my main workstation via DVI, and to a secondary system via VGA. (I like this setup, as I've also got a KVM switch -- always have my main work on my 21" CRT, and can switch video on the flat panel independently of keyboard/mouse control on the KVM).

When used on a DVI link compared to a VGA link, the image is crisper. There is some static noticeable if you look really close when VGA is used, but none on the DVI link. The VGA link also has pretty noticeable color bleed when displaying bright magenta on a dark background, but I don't know how much of that is just a poor analog filter on the monitor. Either way, the color bleed is *zero* on the DVI link. No ghosting, etc. Very, very solid image.
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Originally posted by: BoberFett
Gosharkss

An analog signal can be produced that is as clean as a digital one. But from a cost perspective, I doubt most home users want to spend the extra it would cost for quality analog when they could just use digital.

It's good you brought up cost. The majority of computer users out there still do not have a VC with DVI output. Yes, I know most of you gamers update your VC every six months; however, you are the minority compared to the millions of computers used in corporate environments. In that case they would be required to update the VC, and in some cases the operating system, in order to take advantage of DVI. Thus DVI may be the more expensive route for the majority of the computer systems out there today.