in my experience, the reason is video bandwidth.
you have something like 165mhz of bandwidth on an analog connection. when you get analog interference it cuts into the edges of that, and you effectively have less. analog cables even have special ferrite rings (chokes) that sit at the ends of the cable to help with this; they're built into most cables, though you can also buy them separately.
17 and 19" lcds are only 1280x1024 so thats a 1.3 megapixel image x some number of hz per second. something like that. a 1600x1200 display lik ea 20" 4:3 lcd is 2 megapixels, a 1680x1050 is like 1.8 megapixels (20" wide).
you start pushing into the limits of the analog cable's bandwidth, and any interference will visibly degrade the picture. that's what i've read, anyway.
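to put rough numbers on that, here's a quick back-of-envelope python sketch (the ~1.25 blanking overhead factor is just my assumption, real vesa timings differ a bit):

# rough pixel clock estimate for an analog mode: pixels per frame times
# refresh rate, padded by a blanking-overhead factor (assumed ~1.25 here,
# not an exact GTF/CVT timing calculation)
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1280, 1024), (1600, 1200), (1680, 1050)]:
    print(f"{w}x{h} @ 60 hz needs roughly {approx_pixel_clock_mhz(w, h, 60):.0f} mhz")

even at just 60 hz the bigger panels land well over 100 mhz, so there's a lot less headroom before interference starts eating into the signal.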
the big difference with dvi shows up on displays that need more video bandwidth at the connector. a lot of older analog video cards also had really bad ramdacs and capacitance issues in the output filtering that caused the same problem even on smaller displays (the geforce 2 cap desoldering mod, if anyone remembers), but it's not very common anymore except on really cheap video cards.
try an analog lcd on really cheap i810-era integrated video and it will look awful. same with some of the cheaper i915g/i945g based boards now. then try it with dvi (i actually have an i915g/i945g dvi expansion card and have tried this) and it will look perfect, because digital connections aren't susceptible to this kind of interference.
another thing to try is running a monitor through a cheap kvm switch, which is basically a bandwidth-robbing connector/extension, and it will look worse. it's just like attaching more tvs to a shared analog cable line.