Although there are of course many manufacturers of nVidia cards, a general trend has always been apparent from both users and reviewers:
GF2 cards have poor image quality; GF3 cards are slightly better but still far behind the likes of Matrox and ATI; GF4 cards have excellent image quality, as good as, if not better than, ATI and Matrox cards (perhaps excluding the overpriced, underperforming P512). That said, the majority of users are on 15" monitors and probably wouldn't notice the difference even with a larger, better quality monitor, but that doesn't mean the differences aren't there.

Obviously in DarkFudge2000's case his nice Sony 19" monitor should certainly not be the limiting factor, so it is a bit of a head scratcher. Are you making sure to use the highest possible refresh rates, and have you checked that your monitor is correctly set up in Windows? Also, WinXP usually reverts to the default (60Hz) for gaming, but many fixes are available.
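If you want to double-check what refresh rate Windows is actually feeding the monitor (rather than what the driver panel claims), you can query the current mode through the standard Win32 calls. This is just a rough sketch using EnumDisplaySettings / ChangeDisplaySettings, nothing nVidia-specific, and the 85Hz figure is only an example; CDS_TEST only tests the mode without applying it:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Ask Windows what mode the primary display is really running at. */
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("Current mode: %lu x %lu @ %lu Hz\n",
               (unsigned long)dm.dmPelsWidth,
               (unsigned long)dm.dmPelsHeight,
               (unsigned long)dm.dmDisplayFrequency);
    }

    /* Test (without applying) whether 85Hz at the current resolution
       would be accepted -- 85 is just an example value. */
    dm.dmDisplayFrequency = 85;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;
    if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
        printf("85Hz at this resolution should be accepted.\n");
    else
        printf("85Hz at this resolution was rejected.\n");

    return 0;
}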
Other than that, are you sure all other factors have stayed the same since changing from the GF2 to the GF4? No new hardware, OS or drivers (the Det v29 set are the best), and no items such as speakers sitting near your monitor? A GF4 should certainly be a big leap forward in image quality as well as bringing the obvious performance benefits.
It is important to remember I'm talking about image quality, not 3D performance or rendering quality (AA, aniso, etc.). Most people call it '2D quality', but because it is the quality of the signal being sent to the monitor (not to mention the RAMDACs, filters and all manner of other things too), it affects anything the card displays, from BIOS screens to Windows and games, which is why I call it 'image quality'. It comes down to what you personally find acceptable and can notice; as an example, many people find 70Hz refresh rates more than fine, while others need 100Hz to prevent headaches.
GF2 cards were notorious for poor image quality. That has nothing to do with 3D ability, build quality, ports, AA, filtering methods and so on, but simply with the clarity, stability and legibility of the final output the monitor displays, especially at higher resolutions and refresh rates. There's a lot more to a clear, sharp image than simply xxHz, in the same way that there's more to a monitor than its dot pitch and more to a CPU than its MHz.