hehehe....well now, lookie at what I started....
a few points:
1) Disregard all benchmarks you know about the 5500. The 1.03 drivers are FUGGIN' INCREDIBLE. Firingsquad FINALLY posted something that made sense. Please check their update. 32-bit and FSAA got a HUGE boost now.
2) running UT in 32-bit is funny. It's 16-bit source artwork guys.

3) You MUST run UT/D3d in 32-bit because 16-bit renders improperly, not because it "looks better in 32-bit". Again, 16-bit source artwork. Why not just run Quake1 or Quake2 in 32-bit, eh? <laffs>
Now here's where I LOOOOOOVE to hear you peeps trumpet your anti-glide sentiments in UT when you have no clue, heh....
4) D3d defaults to normal texture quality. Glide defaults to HIGH texture quality. Rerun those D3d benchmarks with texture quality set to high, or rerun those glide benchmarks with texture quality set to normal, and you'll see that glide spanks D3d in 16-bit alone, and obliterates it in 32-bit.
5) Despite the speed increase that the hi-res OGL patch gives to UT, it's still SIGNIFICANTLY (i.e. ~20%) slower than glide. I'm making my comparison between my 5500/P3@933 and my bro's GTS-64MB @ 220/400 + T-Bird 900. Now, can the CPUs make a difference? Sure, maybe 5 fps. But he would STILL be 10 fps behind me, and that is with (again) his *NORMAL* quality textures (which look funny next to the hi-res ones), and glide with its HIGH quality textures.
me : 70 fps cityintro (glide with high-quality textures)
him : 55 fps cityintro (OGL w/patch applied and normal quality textures)
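For the math-inclined, here's that "~20%" worked out from the cityintro numbers above (quick sketch; the helper name is mine, the fps figures are from the two runs):

```python
# Sanity check on the "~20% slower" claim, using the cityintro runs above.
def pct_slower(faster_fps: float, slower_fps: float) -> float:
    """How much slower the second run is, as a percentage of the faster one."""
    return (faster_fps - slower_fps) / faster_fps * 100

glide_fps = 70.0  # me: 5500, glide, high-quality textures
ogl_fps = 55.0    # him: GTS, OGL w/patch, normal-quality textures

print(f"{pct_slower(glide_fps, ogl_fps):.1f}% slower")  # → 21.4% slower
```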
oh yeah, I saw Sharky show his anti-3dfx bias again today. Oy vey....first off, he tests a PCI 5500 and "forgets" to mention that. Then he CONVENIENTLY digs up an ass-old game that NOBODY uses as a benchmark anymore, Revolt. DX6 all the way, CPU-limited up to 1600x1200x32, hell yeah baby!!
So, why did he use this, you may ask? I mean, why not use one of the new racing games that support T&L??? Why not use a racing game that supports DX7???
Was it because he wanted to test a racing game? No. NFS:PU, Grand Prix, F1-2000, MCM2, NASCAR Heat all are far newer, DX7 racing games. We couldn't POSSIBLY use those as examples tho (because the 5500 spanks ass in them), we have to go find a game that he can't run on the 5500 using the new driver set....so he is *forced* to use the 1.01 driver set, and since that game is CPU-limited up to 1600x1200, he only uses 1600x1200 as the resolution. How convenient, since the old driver set had a 1600x1200 bug in DX. Yes indeedie, let's not use NEW games, let's use oldass DX6 games that almost no one plays anymore, just so we can show the 5500 in a bad light. Brilliant.
At least he could've balanced the equation and used games that are actually still PLAYED, like Deus Ex or The Vampire Chronicles, but NOOOOOOOOOO....the 5500 might actually look good, and we can't have that, hahaha....he's fuggin' worse than Dr. Tom these days....
Hey, WTF, why not dig up Tomb Raider 2, eh?
Realistically, why not use Evolva? It has hardware T&L support. It is a new, DX7 game with great graphics. Why don't you see too many peeps using that?
Prolly cuz the 5500 kicks ass in it. Can't have that now, can we?
<rolls eyes>
(Evolva - 1024x768x32 w/1.03 drivers = 91 fps)
Oh, here's what I have found to be the best settings for the games I've been playing a lot lately, with corresponding framerates, on the 5500 using the new drivers. 5500 @ 183 MHz:
Quake3 - 1024x768x32 SHQ (Geometry/textures maxed), with blood, gibs, marks, brass and viewable weapon turned off = 103.5 fps
MDK2 - 1600x1024x22 w/textures maxed and mipmap enabled - 75.5 fps
OR
MDK2 - 1024x768x32 w/textures maxed and mipmap enabled - 72.2 fps
OR
MDK2 - 1024x768x22 w/textures maxed and mipmap enabled - 74.1 fps
(still trying to decide which is best - I'm leaning toward the 2xFSAA, wow!!!!!!)
OH YEAH, this brings up another one - I just love the $hitheads who enabled T&L in MDK2 for the 5500. Duh. That drops the 5500's fps by close to 10. Morons. Let's see how strongly we can stack the deck against the 3dfx cards, eh?
Now, getting back to UT and the "glide vs. d3d" thing.
Are we comparing how well the cards run the game, or are we comparing the cards in D3d performance? "How can you compare two different APIs in a benchmark?"
well, hey, I thought UT wasn't a valid benchmark? The only reason we use it is to see how the cards run one of the most popular games out in the last year, right?
I mean, really. WHO THE HELL would use a 5500 on UT and NOT use glide? WTF?????? Brilliant.
If you and I were to engage in a benchpressing contest, and I was stronger with a close grip, you were stronger with a wide grip, how fair would it be to FORCE both of us to use the same grip?
NOBODY plays UT in D3d with a 5500.
and by the way, glide is a FEATURE, just as much as T&L. It's a far more useable feature, too, judging by the number of games that run SIGNIFICANTLY better in glide, even recent games. T&L makes almost NO DIFFERENCE in ANY games out presently. Evolva, SoF and MDK2 are the only games where T&L makes ANY difference at all, and that difference is almost completely negligible.
Also, EVERY game on the planet can use both of the 5500's cores. There is a very small, non-representative handful of games that actually make use of the GTS's T&L unit, so that is not a valid comparison in the least.
Again, let's compare Evolva scores to see what a difference T&L makes in D3d, not 3dMark2000 (Dot3 sure does make a difference tho)
Straight up facts now boys:
with the new 1.03 drivers, the 5500 has closed the performance gap SIGNIFICANTLY in the areas where it lagged (OGL performance). Its FSAA received a VERY large framerate boost, and FSAA is now becoming quite useable in games where it previously wasn't even an option (UT and MDK2 are 2 that spring to mind now)
and of course, there is the old standby, NFS:PU @ 1024x768 w/4xFSAA enabled - WHOOO-HOOOOOO!!!!!!