<<
A wise man does some research before he takes the plunge >>
No plunge now. Probably when I get my 3 GHz Hammer or something. It was just a general comment about the state of video card reviews.
<<
Considering NVIDIA's marketshare and dominance, either most people don't give a crap about 2D image quality, or it's not as bad as people make it out to be. >>
Heh, well a bazillion people are on AOL so what's your point?

Actually, nVidia GPUs are great, but the OEM board manufacturers sometimes suck, as you very well know. I think people just don't know any better, for a few reasons:
1) Many people only run 1024x768 anyway, so it's irrelevant. Lots of cards that are terrible at 1600x1200x85 run pretty well at 1024x768, or even (the dreaded 5:4 AR) 1280x1024. (See the rough pixel-clock numbers after this list.)
2) Half the people I've talked to in my dept. can't even tell the difference between a 60 Hz and a 75 Hz refresh rate until I point it out to them.
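To put rough numbers on point 1: the analog signal a card has to push gets much harder to keep clean as resolution and refresh rate climb, because the pixel clock (and thus the bandwidth the RAMDAC and output filters must pass) grows roughly with width x height x refresh. Here's a quick back-of-the-envelope Python sketch; the ~32% blanking overhead is just an assumed typical CRT-timing figure, not an exact VESA/GTF calculation.

```python
# Rough sketch: approximate pixel clock needed for a few CRT modes.
# Shows why a card with sloppy output filtering can look fine at
# 1024x768 but smear badly at 1600x1200@85 Hz.

MODES = [
    (1024, 768, 85),
    (1280, 1024, 85),
    (1600, 1200, 85),
]

BLANKING_OVERHEAD = 1.32  # assumed ~32% extra for horizontal/vertical blanking

def pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    """Approximate pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in MODES:
    print(f"{w}x{h} @ {hz} Hz -> ~{pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")
```

That works out to roughly 90 MHz at 1024x768@85 versus well over 200 MHz at 1600x1200@85, which is why cheap output filtering that's invisible at the low end falls apart at the high end.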
I did a minor test in my dept. the other day. I have a high-end video camera hooked up to a microscope and to two TV monitors via 1) single-cable BNC (equivalent to coax or composite, not the 3-to-5-cable BNC used for high-end video) and 2) S-video.
It was clear to me that:
1) The BNC image flickered, but the S-video was solid.
2) The BNC image had slightly (artificially) more contrast and brightness, with enhanced edges, etc. However, the S-video image had more appropriate brightness and contrast, and of course much greater detail (looking at cell nuclei etc).
Out of 5 people (who make their living looking down a microscope all day), 4 preferred the crappier single-cable BNC image, because it had artificially high brightness and contrast. It's no surprise that TVs and monitors come from the factory set with their contrast and brightness way too high. Similarly, it's no surprise that many people can't tell what good 2D on a video card is. (Not saying that's you, though. You've probably had good nVidia cards in the past, and now it's moot since you use DVI.)