The DVI interface uses a digital protocol in which the desired brightness of each pixel is transmitted as binary data. When the display is driven at its native resolution, it reads each number and applies that brightness to the appropriate pixel. In this way, each pixel in the output buffer of the source device corresponds directly to one pixel in the display device, whereas with an analog signal the appearance of each pixel may be affected by its adjacent pixels as well as by electrical noise and other forms of analog distortion.
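The one-to-one correspondence can be pictured as a plain copy of numeric brightness values from the source's output buffer to the panel's pixel array. The following sketch illustrates only that mapping, not the actual TMDS link-level encoding DVI uses; the names `framebuffer`, `panel`, and `transmit_digital` are hypothetical.

```python
# Illustrative sketch: digital transmission preserves each pixel value exactly.
# Real DVI serializes the data over TMDS links, but the end-to-end mapping
# from source buffer to panel pixel is still one-to-one.

# Source device: 8-bit brightness values for a tiny grayscale frame.
framebuffer = [
    [0, 64, 128, 255],
    [255, 192, 32, 0],
]

def transmit_digital(frame):
    """Send each brightness value as a number; the receiver applies it
    directly to the corresponding pixel, independent of its neighbours."""
    return [[value for value in row] for row in frame]  # exact copy

panel = transmit_digital(framebuffer)
assert panel == framebuffer  # every pixel arrives unchanged
```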
Previous standards such as analog VGA were designed for CRT-based devices and thus did not use discrete-time display addressing. As the analog source transmits each horizontal line of the image, it varies its output voltage to represent the desired brightness. In a CRT, this voltage directly varies the intensity of the scanning beam as it moves across the screen.
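A rough way to picture the analog scheme is as a digital-to-analog conversion of each scanline: brightness values become a voltage that varies as the line is scanned. The sketch below assumes a nominal 0 to 0.7 V swing, which is typical for analog RGB video; the function name `scanline_voltage` and the idealized sample-and-hold behaviour are illustrative only.

```python
# Sketch of an analog scanline: brightness values become a voltage that
# varies as the beam sweeps across the line.
# Assumes a nominal 0-0.7 V video swing (typical for analog RGB).

V_MAX = 0.7  # full-white voltage, in volts

def scanline_voltage(pixels, t):
    """Return the output voltage at time t (0 <= t < 1) along one line,
    holding each pixel's level for its share of the line time."""
    index = min(int(t * len(pixels)), len(pixels) - 1)
    return pixels[index] / 255 * V_MAX

line = [0, 64, 128, 255]            # 8-bit brightness values
print(scanline_voltage(line, 0.6))  # ~0.35 V, inside the third pixel
```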
However, when a digital display (such as an LCD) is driven with an analog signal (such as VGA), the display has an array of discrete pixels and a single brightness value must be chosen for each one. The decoder does this by sampling the voltage of the input signal at regular intervals. When the source is also a digital device (such as a computer), this can lead to distortion if the samples are not taken at the center of each pixel period, and the image can also suffer from crosstalk.
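The effect of sampling phase can be illustrated by re-sampling such a waveform: samples taken at the center of each pixel period recover the original values, while samples taken near the transitions between pixels pick up intermediate voltages. The sketch below is a simplified model, not the behaviour of any particular decoder; the linear ramp stands in for the limited bandwidth of a real analog link, and all names are illustrative.

```python
# Sketch: recovering pixel values by sampling an analog scanline.
# A linear ramp between pixel levels stands in for the limited bandwidth
# of a real analog link.

def analog_level(pixels, t, transition=0.3):
    """Voltage-like level (0-1) at position t (in pixel units) along a line,
    with a linear transition occupying `transition` of each pixel period."""
    i = int(t)
    frac = t - i
    current = pixels[min(i, len(pixels) - 1)] / 255
    nxt = pixels[min(i + 1, len(pixels) - 1)] / 255
    if frac < 1 - transition:
        return current
    # Inside the transition region: ramp toward the next pixel's level.
    ramp = (frac - (1 - transition)) / transition
    return current + (nxt - current) * ramp

def sample_line(pixels, phase):
    """Sample once per pixel period at offset `phase` (0 = start, 0.5 = center)."""
    return [round(analog_level(pixels, i + phase) * 255) for i in range(len(pixels))]

line = [0, 255, 0, 255]
print(sample_line(line, 0.5))   # centered samples recover [0, 255, 0, 255]
print(sample_line(line, 0.95))  # samples near the transitions are distorted
```

Running the sketch shows that a small shift in sampling phase turns a crisp alternating pattern into intermediate gray levels, which is why analog inputs on digital displays typically need a phase (or "fine") adjustment to look sharp.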