a few brief points:
1) Those scores where the Radeon beat the 5500 in UT were D3D only, and were with the old drivers. Using Glide, the 5500 takes down both the Radeon AND the Ultra in UT. Also, the 1.03 drivers fix the 1600x1200 D3D driver bug
2) The 5500's drivers allow for quite a bit of tweaking of both image quality and speed. There are A TON of options. Check out the tweaking guide at www.3dspotlight.com to see all the stuff you can do in there. The washed-out look of the 5500 has more to do with the gamma and LOD bias settings, whose defaults are different from other cards'.
Also, BFG, why do you 'weep' when you see UT benchmarks? Hate to see your nvidia boys get their asses handed to them?
Ben, I understand your points. However, it's all about the games, for me. I'm not about to debate "professional graphics workstations" with you. Not my cup o' java. I play games, cuz I'm dumb and I like mindless entertainment. <g>
as far as the Radeon defaulting to 16-bit, just disable it. You gotta go into the drivers of both the GTS and Radeon to enable anisotropic filtering, so while you're there, make the switch.
"the 2d of the GTS is noticeably inferior to both the 5500 and the Radeon."
"Only on Trinitron tubed monitors."
uh.....so I should change out my excellent Trinitron monitors because the GTS doesn't like them? Trinitrons are pretty darn good monitors, and they're pretty popular. Herein lies my ONLY true complaint about the GTS and GeForce-based cards: you gotta turn your whole computer world upside down to get the damn things to work properly. You gotta futz with different driver revisions, different motherboards, different monitors, WTF???? I like the 5500 cuz I plug it in, and it works great. I play a game, it works great. I change out hardware, it still works great. I get a new driver, I AUTOMATICALLY upgrade to it, because it fixes the minor bugs that were present and improves speed (in the case of 1.03, significantly improves speed).
Look at it this way: I'm a COMPUTER TECHNICIAN, and when a client of mine buys a new GTS card and gives me a call and says "hey, which driver should I use?", I cringe. This person's $$$ is what keeps me in business. He gives me his $$$ because he trusts me and my knowledge. I'd LOVE to say "upgrade to the latest driver, and you can play your favorite games." Unfortunately, I can't do that. Especially if he has an Athlon system.
Brotherman: "FSAA really isnt reality unless you play older games and with so many new games to play i dont have time to play the games that have been played 1000x times"
MDK2, 1024x768x32 w/2xFSAA enabled - just under 75 fps using the 1.03 drivers. Does that meet with your approval? How about 1024x768 w/4xFSAA enabled in NFS-PU, SMOOTH AS SILK? Is that okay?
BW, glad you like to watch demos. Some of us buy these cards to play games. To each his/her own.
Doomguy: "NVidia's cards are less suseptable to pixel popping because of thier far superior opengl drivers. "
I'll agree with you on their superior OGL drivers, no doubt, but your remark about being less susceptible to pixel popping is just borderline ridiculous. <shakes head>
han888: "shanehi,the 64MB gts beat the voodoo5 on 32 bit at 1280 X 1024 on the link u give us, just look funny, voodoo5 is the king of glide, but the geforce card still have a chance to beat the voodoo5"
well, there are a few points here:
1) Those are with the old 1.01 drivers. The 1.03s give a nice performance increase
2) NOBODY with a 5500 plays UT in D3D. It's stupid. They play in Glide, which gives a good 10-15 fps performance increase
3) D3D turns off volumetric lighting and detail textures. Glide enables them by default, and the 5500, with those effects left ON in Glide, still owns the GTS with them left OFF.