Originally posted by: BoberFett
Computers are digital, they produce a digital image. To output to a VGA monitor the digital data has to be turned into an analog signal. During conversion from digital to analog, there's a loss of clarity. When it gets to the monitor that loss of clarity can turn into things like false colors, ghosting and so forth depending on the quality of the DAC (digital to analog converter), the quality of the cable, etc. When using the DVI port, the data is digital all the way. What gets displayed on the screen is exactly what the video card produced.
The chief advantage of DVI is improved image stability. The DVI interface carries a pixel clock, whereas with the analog interface the clock used to sample the analog video has to be derived from the horizontal sync.
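To make the clock point concrete, here is a quick back-of-the-envelope sketch in plain Python. The timing numbers are the standard VESA figures for 1280x1024 @ 60 Hz and are used purely as an example; nothing below is specific to any particular card or monitor.

# Sketch: how a monitor's ADC must reconstruct the pixel clock from the
# horizontal sync when driven over the analog (VGA) interface.
# Assumed example timing: VESA 1280x1024 @ 60 Hz.

h_active = 1280          # visible pixels per line
h_total  = 1688          # total pixels per line, including blanking
v_total  = 1066          # total lines per frame, including blanking
refresh  = 60.02         # frames per second

hsync_freq  = refresh * v_total          # ~63.98 kHz horizontal rate
pixel_clock = hsync_freq * h_total       # ~108 MHz sampling clock

print(f"hsync: {hsync_freq/1e3:.2f} kHz, pixel clock: {pixel_clock/1e6:.1f} MHz")

# Over DVI that ~108 MHz clock is sent on its own pair, so every sample lands
# on a pixel. Over VGA the monitor's PLL must multiply the ~64 kHz hsync by
# 1688; any phase error in that multiplication shows up as pixel jitter.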
Even though it is possible to have pixel jitter on the analog interface, it is extremely rare, so if you're seeing noise, odds are something is wrong.
The only real advantage of a digital interface may be some reduction in "ghosts," which are usually caused by impedance problems with the connectors.
This does not mean that digital connections are immune to impedance mismatches. Generally speaking they are more sensitive, because they run at much higher frequencies.
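As a rough, illustrative comparison of the frequencies involved: single-link TMDS encodes 8 data bits into 10 transmitted bits per channel, so the serial bit rate is ten times the pixel clock. The 108 MHz figure below is just the example clock from above.

# Why a digital DVI link is, if anything, more sensitive to impedance
# problems: the TMDS bit period is roughly a tenth of the analog pixel period.

pixel_clock_hz = 108e6                  # example: 1280x1024 @ 60 Hz
tmds_bit_rate  = pixel_clock_hz * 10    # TMDS sends 10 bits per pixel per channel
bit_period_ns  = 1e9 / tmds_bit_rate    # ~0.93 ns per bit

analog_pixel_period_ns = 1e9 / pixel_clock_hz   # ~9.3 ns per pixel

print(f"TMDS bit period: {bit_period_ns:.2f} ns "
      f"vs analog pixel period: {analog_pixel_period_ns:.2f} ns")

# A reflection from a connector discontinuity that only slightly blurs a
# ~9 ns analog pixel can land squarely inside a ~1 ns TMDS bit and cause a
# decode error, which is why cable and connector quality still matter on DVI.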
The "you shouldn't be converting to analog only to convert back to digital" argument doesn't really make a whole lot of sense - it ignores the fact that there's yet another conversion back to analog that goes on within the LCD panel!
Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to analog in order to achieve the 16M colors. If an LCD were purely digital, only two colors, black and white, would be achievable. In order to generate the 16M colors, each red, green and blue cell must be capable of stepping through 256 shades, and that is an analog function. In fact, most LCDs maintain the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).
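To put a number on that, here is where the "16M colors" figure comes from (trivial Python, purely illustrative):

# 256 analog drive levels per subpixel, three subpixels per pixel:
levels_per_channel = 2 ** 8             # 256 shades for each R, G, B cell
total_colors = levels_per_channel ** 3
print(f"{total_colors:,}")              # 16,777,216 -> the familiar "16M colors"

# Those 256 intermediate levels are produced by driving each cell with an
# analog voltage, which is the conversion step described above.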
Analog and digital are simply two different ways of encoding information onto an electrical signal - neither is inherently "better" or "worse" than the other.
Whether you will see a difference depends on many factors. For example, some video cards have filter circuits on the analog video output, which will degrade the image to some extent. All else being equal, it is similar to the Coke / Pepsi challenge. Some people will say the image looks better on one than the other; however, I attribute some of this to the placebo effect. Many find it difficult to tell in a blind test, where they do not know which interface is being used. The problem is that what may be an appreciable difference to me may not be to you.
The analog interface is getting a bad rap due to poorly designed video cards, IMHO. In my studies it is virtually impossible to tell the difference between a properly designed analog interface and DVI.
I'm not arguing with those who notice a big difference; however, what they don't realize is that not all analog video interfaces are the same. Many (IMHO poorly designed) video cards use RF filters on the analog video lines to reduce RF emissions. There are many ways to reduce RF emissions without these circuits hanging on the video lines. These circuits add capacitance to the video signals, increasing the rise and fall times of the signal. The slower rise and fall times create soft edges, especially on text or on sharp black-to-white and white-to-black transitions. Thus the DVI connection will look better.
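A back-of-the-envelope look at what that added filter capacitance does to an edge. The component values here are assumed for illustration only; real cards vary.

# 10%-90% rise time of a single-pole RC: t_r ~ 2.2 * R * C

R_source = 75.0        # ohms -- analog RGB video is a 75-ohm system
C_filter = 100e-12     # 100 pF of added filter capacitance (assumed value)

t_rise_ns = 2.2 * R_source * C_filter * 1e9    # ~16.5 ns

pixel_period_ns = 1e9 / 108e6                  # ~9.3 ns at a 108 MHz pixel clock

print(f"edge rise time ~{t_rise_ns:.1f} ns vs pixel period ~{pixel_period_ns:.1f} ns")

# When the edge takes longer than one pixel period to settle, a sharp
# black-to-white transition smears across neighboring pixels -- the soft
# text described above. A design without the filter keeps the fast edges.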
It is not the fact that you are using the analog interface; it is that the particular analog interface you have may be a poor design compared to others with much better ones. To make the general statement that DVI is better is very short-sighted.