<< analog will only spit out approximations. With DVI theres no blurry edges, and every pixel is perfectly distinguishable from the other. >>
Not certain what BigDee means by this; the interface used has no effect on edge quality or on distinguishing one pixel from another. The panel itself and the resolution you run at have a bigger effect on these than the interface.
In theory, digital (DVI), like BNC connectors on a CRT, will provide a better picture. In practice, however, if you have a good video card I find it very difficult to tell the difference in a blind test.
The myth that DVI does not need to be converted is just that, a myth. On an analog system, the RAMDAC (the chip that generates the video signal on the VC) has been integrated into the graphics controller chip for years now. Adding DVI means adding a DVI transmitter chip to the VC and a DVI receiver chip in the monitor.
To transmit the digital data from the VC in true digital form, the graphics chip would need parallel digital outputs and the video cable would need a separate wire for each bit. The cable would then have to contain more than 27 wires; you can imagine how thick it would be. Instead, DVI converts the parallel data into a small number of digital serial channels. Depending on the link used (single-link or dual-link), the number of serial channels varies. The serial bit stream is then converted back on the monitor side, so you could argue that DVI actually increases the number of times the signal is processed.
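To make the parallel-to-serial idea concrete, here is a minimal Python sketch (my own illustration, not anything out of the DVI spec): it splits one 24-bit pixel into three 8-bit serial streams, one per channel. Real TMDS also applies a transition-minimized 10-bit encoding on each channel, which is left out here.

# Simplified sketch of DVI's parallel-to-serial conversion (single-link case).
# Real TMDS also applies transition-minimized 8b/10b-style encoding per channel;
# this only shows the basic idea of splitting a 24-bit pixel across three
# serial channels instead of running 24+ parallel wires down the cable.

def serialize_pixel(r, g, b):
    """Turn one 24-bit RGB pixel into three 8-bit serial streams (one per channel)."""
    channels = {"red": r, "green": g, "blue": b}
    streams = {}
    for name, value in channels.items():
        # Shift the 8-bit value out one bit at a time, MSB first.
        streams[name] = [(value >> bit) & 1 for bit in range(7, -1, -1)]
    return streams

# Example: one pixel's worth of data
print(serialize_pixel(255, 128, 64))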
Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to analog in order to achieve the 16M colors. If an LCD were purely digital, only two colors, black and white, would be achievable. To generate the 16M colors, each red, green and blue cell must be capable of stepping through 256 shades, which is an analog function. In fact, most LCDs maintain the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).
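For what it's worth, the 16M figure falls straight out of the 256-shades-per-cell arithmetic:

# The "16M colors" figure comes from 256 shades per subpixel:
shades_per_subpixel = 256          # 8 bits of drive level for each of R, G, B
colors = shades_per_subpixel ** 3  # 256 * 256 * 256
print(colors)                      # 16777216 -- the familiar "16M colors"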
Most of today's implementations of DVI are rather limited. DVI driver chips top out at 1600 x 1200 at 60 Hz; keep this in mind if you plan on upgrading. LCDs do not suffer from flicker, so the 60 Hz is not a big deal unless you are playing games and want higher FPS. Faster DVI transmitter and receiver chips are being developed, but upgrading to them means replacing both your video card and your monitor. With an analog connection you can upgrade either one without having to replace the other.
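A rough back-of-the-envelope check shows why that 1600 x 1200 at 60 Hz ceiling exists. The ~40% blanking overhead and the 165 MHz single-link limit below are my own assumed figures (typical of VESA timings and single-link DVI transmitters), not numbers quoted anywhere above:

# Rough sketch of why 1600x1200 @ 60 Hz sits right at the edge of what a
# single-link DVI transmitter can handle.  The 1.4 blanking overhead factor
# and the 165 MHz limit are assumptions based on typical timings.

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.4):
    # Total pixels per frame include the horizontal/vertical blanking intervals.
    return width * height * refresh_hz * blanking_overhead / 1e6

clock = required_pixel_clock_mhz(1600, 1200, 60)
print(f"~{clock:.0f} MHz needed vs ~165 MHz single-link DVI limit")
# -> ~161 MHz needed vs ~165 MHz single-link DVI limit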
Also, if you use DVI-I or DVI-D, I do not recommend hot swapping (unplugging) the monitor. Turn the computer and the monitor off before unplugging it.