
Mingon, I've handled a LOT more than 8 nVidia cards in my time and I pay close attention to a lot of online communities and review sites too.

It is important to remember I'm talking about image quality, not 3D performance. Most people call it 2D quality, but because it is the quality of the signal (not to mention the DAC, CAD and all manner of other things too) being sent to the monitor, it affects anything the card displays, from BIOS screens through Windows to games, which is why I call it image quality. Obviously many members of the general public are unlikely to tell much, if any, difference between cards in this regard, but that doesn't mean the difference isn't there, especially as many people use 15" monitors! And it does depend A LOT on the monitor or other final output device(s). Ultimately it comes down to what you personally find acceptable and can notice; as an example, many people find a 70Hz refresh rate more than fine, while others need 100Hz to prevent headaches.

GF2 cards were notorious for poor image quality. That has nothing to do with 3D ability, build quality, ports, AA, filtering methods etc, but simply with the clarity, stability and legibility of the final output the monitor displays, especially at higher resolutions and refresh rates. There's a lot more to a clear, sharp image than a simple xxHz figure, in the same way that there's more to a monitor than dot pitch and more to a CPU than MHz.
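Just to put rough numbers on the "higher resolutions and refresh rates" point: the card's analogue output stages have to pass a pixel clock roughly proportional to resolution times refresh rate, which is where weak filtering shows up as blur. A quick back-of-envelope sketch (the ~32% blanking overhead is my own assumed typical CRT figure, not anything from a specific card's specs):

```python
BLANKING_OVERHEAD = 1.32  # assumed ~32% extra for horizontal/vertical blanking

def pixel_clock_mhz(width, height, refresh_hz, overhead=BLANKING_OVERHEAD):
    """Approximate pixel clock (MHz) needed to drive width x height at refresh_hz."""
    return width * height * refresh_hz * overhead / 1e6

# e.g. 1600x1200 at 100Hz needs roughly 250 MHz of pixel clock, so the
# analogue signal path is being pushed hard even when the DAC itself copes.
print(round(pixel_clock_mhz(1600, 1200, 100)))  # -> 253
```

So a mode that looks fine at 1024x768 @ 70Hz can fall apart visibly at 1600x1200 @ 100Hz on the very same card.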

I find your verdict of "Rubbish" a little childish and offensive, but I'll root around and see what supporting links I can find on this subject ASAP.