Ignore Maximum BS, but anyway...
CRTs are analog, so the traditional way of getting a signal to them was to have digital-to-analog converters (RAMDACs) on the video card, which take the digital image data in the video card's memory and convert it into an analog signal that refreshes the screen 60+ times per second.
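If you want a feel for how fast that analog signal has to run, here's a rough back-of-envelope sketch; the 25% blanking overhead is just an assumed ballpark figure, real CRT timings vary:

```python
# Rough estimate of the pixel clock a RAMDAC needs to repaint the screen
# 60+ times per second.  The 25% blanking overhead is an assumption for
# illustration only; actual video timings differ per mode.

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock in MHz for a given display mode."""
    active_pixels = width * height
    total_pixels = active_pixels * (1 + blanking_overhead)  # add blanking intervals
    return total_pixels * refresh_hz / 1e6

for w, h, hz in [(1024, 768, 85), (1600, 1200, 60), (1920, 1440, 75)]:
    print(f"{w}x{h} @ {hz} Hz -> ~{approx_pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")
```

That's an analog waveform wiggling at well over 100 MHz for the higher modes, which is exactly where cheap DACs and cables start to smear the picture.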
LCDs are digital, so if your video card only has analog outputs (or your LCD is a cheap and crappy model that only has an analog input), then you end up with unnecessary D/A and A/D conversions, which can and will degrade the image quality noticeably (unless you have really high-end equipment with excellent 2D quality, like the Matrox Parhelia). A DVI interface keeps the signal in digital form through the entire chain from computer to monitor, providing optimal quality.
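To make the extra-conversion point concrete, here's a toy sketch of one D/A-then-A/D round trip; the noise level is a made-up assumption purely for demonstration, not a measurement of any real card, cable, or monitor:

```python
# Toy illustration of why an extra D/A -> A/D hop hurts: re-quantizing a
# noisy analog signal shifts pixel values.
import random

random.seed(1)

ANALOG_NOISE = 0.6  # assumed noise, in units of one 8-bit code step (illustrative only)

def da_then_ad(value):
    """Send an 8-bit pixel value through a simulated analog hop and redigitize it."""
    analog = value + random.gauss(0, ANALOG_NOISE)   # D/A output plus cable/ADC noise
    return min(255, max(0, round(analog)))           # A/D: clamp and re-quantize

ramp = list(range(256))                              # a simple grayscale ramp
after = [da_then_ad(v) for v in ramp]
changed = sum(1 for a, b in zip(ramp, after) if a != b)
print(f"{changed} of 256 pixel values came back different after one D/A -> A/D pass")
```

A pure DVI chain never leaves the digital domain, so there's nothing to re-quantize and the pixels arrive exactly as they sit in video memory.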
Dual DVI (two DVI ports on one video card) allows the use of two LCD monitors with digital connections, which looks great on the desk and is a great productivity booster for certain kinds of work, but is unfortunately not so easy on the wallet.