If you have an LCD flat panel that has a DVI input, then you should definitely go that route.
Otherwise, the RAMDAC (digital-to-analog converter) in your video card takes a DIGITAL frame at your desktop's resolution (say, 1024 by 768), converts it to an analog scan signal, and sends it over the D-sub VGA cable; your LCD panel then has to convert this analog signal BACK into a digital signal to display it on the panel.
In the process of converting the signal from digital -> analog -> digital, some signal fidelity is lost.
If you use a DVI connector, the RAMDAC conversion does not occur: the LCD panel receives the EXACT same digital representation that your video card generates. So it is technically superior in this respect.
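To make the fidelity argument concrete, here is a toy simulation of the two paths. The noise level on the "analog" leg is a made-up illustrative number, not a measured value, and the 8-bits-per-channel quantization is just an assumption; the point is only that the D-sub round trip can alter pixel values while the all-digital path cannot.

```python
import random

random.seed(0)

# Toy model of the VGA path: an 8-bit pixel value is converted to an
# "analog" voltage, picks up a little noise on the cable, and is then
# re-quantized by the panel's ADC. The noise amplitude is illustrative.
def vga_roundtrip(pixel, noise=0.004):
    voltage = pixel / 255.0                   # DAC: digital -> analog
    voltage += random.uniform(-noise, noise)  # cable noise / interference
    voltage = min(max(voltage, 0.0), 1.0)     # clamp to valid range
    return round(voltage * 255)               # panel ADC: analog -> digital

# DVI path: the digital value is passed through untouched.
def dvi_roundtrip(pixel):
    return pixel

pixels = [random.randrange(256) for _ in range(10_000)]
vga_errors = sum(1 for p in pixels if vga_roundtrip(p) != p)
dvi_errors = sum(1 for p in pixels if dvi_roundtrip(p) != p)
print(f"VGA: {vga_errors} of {len(pixels)} pixels altered")
print(f"DVI: {dvi_errors} of {len(pixels)} pixels altered")
```

With any nonzero noise, a sizable fraction of the VGA pixels land on a neighboring quantization level, while the DVI path reproduces every pixel exactly.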
Also, in my experience, using D-sub input on flat panel displays leads to "signal jitter": the resolution is sharp, but the entire picture on the flat panel "jumps" around a bit. It's not really noticeable unless you look at the edges of the display. Most high-quality LCD panels have signal-processing A->D converters that can correct for this jitter, but I have seen it nonetheless. DVI connectors do not have this problem.
The ONLY disadvantage to using DVI cables is usually their cost. Anyone disagree?