These are very, very interesting answers. Naturally, this was a completely loaded question with a devious purpose behind it, heh....
Now then, going along with these responses, can someone please explain why, if these are our meager needs, we live and die by benchmark scores? If all we care about is "60 fps with good visuals," then why is 115 fps better than 100 when no mention is made of visual quality?
I'll tell you why *I* think a super-high framerate is necessary, at least in Q3 (I have yet to find another game that really requires such high FPS).
My "need for speed" seems far greater than anyone else's here, with the exception of shk, natch.

In Q3, the physics necessitates a MASSIVE framerate. So here's how *I* test to see if a video card is fast enough:
I do the following for Q3:
/seta com_maxfps 125
/map q3dm13
Then I do the megahealth jump: backwards, forwards, strafing left, strafing right, strafing backwards-left, strafing backwards-right, strafing forwards-left, strafing forwards-right. If I have ANY problems with ANY of the jumps, I turn down visual details. If I can do them several consecutive times without a problem, I up some visual details (i.e. lodbias, r_picmip, resolution, and various other items) until I can't hit the jump 100% of the time.
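For reference, the whole test setup boils down to a few console settings. Here's a rough config sketch (cvar names are from my memory of the Q3 console, so verify them with /cvarlist before trusting me):

```
// rough sketch of my test config -- cvar names from memory, verify with /cvarlist
seta com_maxfps "125"    // the framerate cap the whole test is built around
seta cg_drawfps "1"      // framerate counter, to verify the cap actually holds
seta r_picmip "1"        // texture detail: lower = sharper, costs fps
seta r_lodbias "0"       // model level-of-detail: raise it if the fps dips
```

Start with details low, confirm the counter stays pinned at 125 through all eight jump variations, then raise the quality knobs until it doesn't.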
This is a *tangible* need: if I'm racing down into that room in a 1v1 match and I miss the jump even once, that could be a 2-frag swing (I get fragged and I'm down by 1, or I get the MH before my opponent and nab a frag, so I'm up by 1) in a duel match. That, to me, is important.
Now, 125 fps was chosen because it has been my observation that some of the weapons simply act "differently" at high framerates, especially the shaft and the rail (and to a certain extent the MG as well). Why? I dunno, but several others have noticed it too.
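For what it's worth, here's a toy sketch (in Python, nothing to do with the actual Q3 source) of one plausible reason the game can behave differently at different framerates: movement is integrated in discrete per-frame steps, so the height of a jump comes out slightly different depending on the timestep. The 270 jump velocity and 800 gravity are the stock Q3 values; the plain Euler integrator here is a simplified stand-in, NOT id's real pmove code, and the exact direction of the shift won't match Q3 (the real game also rounds frame times to whole milliseconds, which is part of why "nice" values like 125 fps, exactly 8 ms per frame, keep coming up). The point is only that discrete timesteps change the physics outcome at all.

```python
# Toy model: discrete per-frame integration makes jump apex depend on framerate.
# 270 (jump velocity) and 800 (gravity) are stock Q3 values; the integration
# scheme is a simplified stand-in, NOT the real Q3 pmove code.

JUMP_VELOCITY = 270.0  # units/sec, Q3 default
GRAVITY = 800.0        # units/sec^2, Q3 default g_gravity

def jump_apex(fps):
    """Highest point reached when physics runs in 1/fps steps."""
    dt = 1.0 / fps
    z, vz = 0.0, JUMP_VELOCITY
    apex = 0.0
    while vz > 0 or z > 0:
        z += vz * dt        # move with this frame's velocity...
        vz -= GRAVITY * dt  # ...then apply one frame of gravity
        apex = max(apex, z)
        if z <= 0 and vz <= 0:  # back on the ground, falling: done
            break
    return apex

if __name__ == "__main__":
    for fps in (60, 100, 125, 333):
        print(f"{fps:3d} fps -> apex {jump_apex(fps):.2f} units")
```

Run it and the apex differs by a unit or so between framerates, which is exactly the kind of gap that decides whether a trick jump connects.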
So again, I have a VERY SPECIFIC and VERY HIGH expectation for framerates. If the card can't do what I want (see above), then I have no use for it.
I recently posted a screenshot to the newsgroup alt.binaries.games.quake: a pic overlooking the rocket launcher on q3dm9, taken using the "graphics.cfg" I had developed. It looked, to be honest, outstanding. And yes, it totally passed my q3dm13 test with flying colors.
EVERY ONE of the peeps thought it was a screenie of my GTS in 32-bit color. I had been bragging about how fast the GTS was (I was one of the first in the alt.games.quake3 newsgroup to pick one up) and had been harassing my buddies about their "slowass" DDRs. D'oh! Everyone knew of my requirements for framerates (I'm a framerate hog, hence the system you see in my sig), so everyone was AMAZED at how good the "GTS" could look while still maintaining my "need for speed".
It was actually a pic taken with the 5500 I "borrowed" from Electronics Boutique. Interestingly, when I posted the next pic (after turning on the framerate counter), the "slowass, dog, POS voodoo" read "140 fps". This was with the highest-quality settings in the drivers turned on, to make the 22-bit post filter look damn near like 32-bit.
When I switched it to 32-bit color, it read "100 fps". Not bad, right?
That seems pretty fast to me. And to think Quake3 was one of the games that the 5500 "sucks at", and the GTS "blows it away" in.
That's why I question so many of you when you say "the 5500 is slow, it sucks, it's lame, blah blah blah"
My question is always HAVE YOU TRIED IT YOURSELF?
Now, admittedly, this was with an overclocked card. Of course, I'm pretty sure that *most* serious gaming peeps will overclock their CPUs and their video cards every chance they get, assuming 100% stability.
I was getting 130 fps in demo001 and ~115 in Quaver, so I upped visual quality (using the lodbias slider) and ended up with ~126 fps and ~110 (honestly can't remember exactly), respectively. The GTS simply couldn't match that image quality at the same resolution; I had to up the resolution to 1152 to even come close while maintaining framerate (see my "com_maxfps 125" test), and it still didn't look as good. If I jumped up to 32-bit color, the framerate sank too low. I could turn on TC with the GTS, which allowed me to use 1280x1024 and maintain my framerate, but the TC artifacts proved too annoying.
So understand just *why* I bristle at those who read the bull$hit reviews of video cards and declare themselves experts when they haven't even tried the card. A very reasonable 20 MHz overclock was all it took to get the incredibly high framerates I got with the 5500.
3Dfx fooked up an outstanding card (the 5500) in 4 ways:
1) They released it too late (dumbasses)
2) They released it without a lodbias slider initially, so visual quality looked crappy
3) They released it at a very low default clock speed (I haven't found many peeps who couldn't hit 183 easily)
4) They released it with some stupid default settings in their drivers (lodbias, alpha blending and mip map dithering)
Since EVERY reviewer out there does ALL their tests with basic default configurations, the 5500 ended up looking pretty weak by comparison. Very few reviewers (check www.voodooextreme.com/reverend and www.3dspotlight.com) actually bothered to sift through the drivers and tweak the card. When they did, they were amazed at just how well that card performed.
I got the 5500 as a joke, mainly. I had just sold my 32MB GTS and ordered me a 64MB GTS from DecoY here on Anand's boards. Unfortunately, I was "late" on the list, so it was going to be a while before I would receive the 64MB GTS beast. So I went down to Electronics Boutique with plans of exploiting them and their "10-day return, no questions asked" policy. I had my eyes on a Herc Prophet II 64MB'er when, on a whim, I decided to pick up the 5500, just for the halibut. I figured it would tide me over until I got my "real card" in, then I'd take the 5500 back to EB (which I, begrudgingly, did).
Funny how well that 5500 performed, and this was in DIRECT COMPARISON to the 32MB GTS, and now, the 64MB GTS.
Anyway, thanks for the responses, and thanks for confirming what I already figured.
