I've read a fuckton of reviews, and only very rarely have I seen people report issues with DVI.
I've also heard many times, both in reviews and from friends, about HDMI cables that had noise and had to be replaced with better ones.
On the other hand, I have never seen noise on VGA, nor have I ever heard of it.
Which suggests that both can have noise, but HDMI's weakness is the bigger one: even though the standard is recent, there are already far more reports of "faulty HDMI cables" and the like.
Noise on a digital spec like HDMI/DVI/DisplayPort kind of makes no sense. In virtually all cases it either works or it doesn't; a marginal cable shows up as sparkles or dropouts, not as a gradually degraded picture. This is why people laugh at the sale of 'Monster' HDMI cables: the zeros and ones being transmitted get no benefit from a more expensive cable, whereas on an analog setup like VGA, paying for a better cable is completely logical.
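To make the "works or it doesn't" point concrete, here's a toy sketch in Python (made-up helper names, nothing like real TMDS encoding) comparing how the same channel noise lands on an analog voltage level versus on thresholded bits:

```python
import random

def send_analog(value, sigma):
    """Analog link: channel noise adds directly to the picture level."""
    return value + random.gauss(0, sigma)

def send_digital(value, sigma, bits=8):
    """Digital link: each bit travels as a 0.0/1.0 level and is thresholded
    at the receiver, so small noise is rejected completely; only noise big
    enough to cross the 0.5 threshold flips a bit (the 'digital cliff')."""
    out = 0
    for i in range(bits):
        bit = (value >> i) & 1
        received = bit + random.gauss(0, sigma)
        out |= (1 if received > 0.5 else 0) << i
    return out

random.seed(1)
pixel = 200  # an 8-bit brightness value
for sigma in (0.02, 0.10, 0.30):
    analog = [round(send_analog(pixel, sigma * 255)) for _ in range(5)]
    digital = [send_digital(pixel, sigma) for _ in range(5)]
    print(f"noise sigma={sigma}: analog={analog} digital={digital}")
```

At low noise the analog values are already visibly off while the digital ones come through bit-exact; the digital link only starts failing, abruptly, once the noise gets large enough to flip bits, which matches the "sparkles or nothing" behavior of a genuinely bad HDMI cable.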
I, like many here, am in the somewhat dreary IT field. I deal with thousands of desktops and servers annually, and during the past few years none of those thousands of units has had a display problem related to noise over a digital cable. I have seen a handful of cables that either weren't up to spec (a single-link DVI cable that couldn't drive a large display's native resolution) or were simply defective.

Before digital displays became the standard, I'd estimate that 10-15% of the VGA displays I saw were poorly implemented, whether through low-quality cables, low-quality analog circuitry on the video card itself, or just a poor-quality CRT. Usually it was a combination of the first two. That isn't really a failing of the VGA standard so much as cheap implementations of it. Anyone who had an old cheap TNT or GeForce card from one of the lower-quality manufacturers back in the day can attest that 2D image quality at 1280x1024 and above could be pretty spotty.

It's one of the reasons Matrox video cards were prized by people working in digital imaging and desktop publishing: not just for the 2D performance, but for the image clarity. Some of the higher-end CRT displays even had BNC breakout connectors in addition to a standard VGA connector, to carry the cleanest possible signal at resolutions like 1600x1200 and beyond (Sony FW900 FTW!).
In short, I saw many, MANY more issues with VGA-to-VGA setups than I've ever seen with digital-to-digital LCD setups.
That's not to say I don't miss how easily CRTs could bounce between dozens of resolutions and refresh rates, but the analog VGA standard itself had been pushed about as far as it could logically go once 1:1-pixel digital tech became widespread. Analog into a digital display is never quite perfect, because the DAC-to-ADC round trip forces the display to approximate the resulting value of each pixel.
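For a concrete feel of why that approximation can never be perfect, here's a toy sketch (plain Python, assuming nothing about real VGA timing) of what happens when the display's ADC samples slightly out of phase with the card's pixel clock:

```python
# Toy model of the DAC -> ADC round trip: the video card's DAC emits one
# voltage per pixel clock, and the LCD's ADC must sample that waveform at
# exactly the right instants. A phase error mixes each pixel with its
# neighbor, modeled here as simple linear interpolation.
pixels = [0, 0, 255, 255, 0, 0, 255, 0]  # sharp single-pixel test pattern

def resample(line, phase):
    """Sample the 'analog' waveform `phase` pixel-clocks late (0.0..1.0)."""
    out = []
    for i in range(len(line)):
        nxt = line[min(i + 1, len(line) - 1)]
        out.append(round((1 - phase) * line[i] + phase * nxt))
    return out

for phase in (0.0, 0.15, 0.5):
    print(f"phase error {phase:0.2f}: {resample(pixels, phase)}")
```

A phase error of 0.0 reproduces the pattern exactly; any error smears the edges, and a half-pixel error turns single-pixel detail into mush, even though no bits were "lost" anywhere along the way. That soft, slightly-out-of-focus look is exactly what VGA into an LCD gives you when the auto-adjust doesn't nail the clock and phase.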