
3dfx just unleashed their mega secret weapon HSR, will NVIDIA respond?


Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Sudheer Anne, lol...that was funny.

now back to the subject, i agree with legion88's idea of categorization. however, i do think that Glide should also be a category. of course we all know Glide does not extend beyond the realm of 3dfx, so the comparison in the Glide category would be a V5 5500 against say a V3 3000, or something along those lines. but hey, if the consumer wants to know about a 3dfx card, then this is the category to look in, provided it exists.

but i also agree with airman6 (sorry about the misspelling) in a way. if the categories are video cards and not APIs, then each API should be tested on each card. some people may say that Glide is an unfair comparison b/c it cant be used on an nVidia, ATI, or Matrox card. but that isnt the only kind of unfair comparison. certain APIs may be optimized for a specific card, but they are also optimized for certain games. for instance, even though 3dfx cards arent too well known for their OGL performance, MDK2 looks great when running on a 3dfx card. why? b/c MDK2 was built on a superb OGL engine. UT was built on a superb Glide engine, and on top of that, 3dfx cards were made to run best using Glide, provided the game being played was built on it. so naturally, the combination of the two is going to kick some arse.

its just that every card should be tested with every API, and the results made known to the consumer so he or she can decide what will be best for his or her needs. the reason i went with the V5 5500 is b/c i play a lot of UT using Glide, and i play a lot of FlightSim98 using D3D. I also tried a Hercules 3D Prophet II GTS 64MB, and found its performance in the games and applications i needed it for to be inferior to the V5 5500's. dont get me wrong, if i played MDK2, Q3, and other OGL based games, i would have gotten the GF2 GTS card. i was very impressed with the T&L and many other features, but the V5 5500 was best for me and my needs, regardless of whether the GF2 GTS is a faster, more powerful chip. this obviously proves that bigger, better, faster isnt always going to come out on top...at least not for my needs.

its all about the consumer and what they want or need.:)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Legion88:

This is why I laughed at the numerous attempts by 3dfx supporters including reviewers insisting that they should compare glide in UT when using 3dfx cards to D3D (or OGL) when using other cards.

My stance on the issue is exactly the same as yours. I'll give you warning now: you are wasting your time arguing with the Glide supporters and trying to convince them they are wrong.

There was a huge argument about it a while back and I was firmly against using different APIs when comparing the different cards. If nothing else it makes a totally invalid benchmark because you are comparing apples to oranges, and the results are totally meaningless.
 

Ahriman6

Member
Oct 24, 2000
78
0
0
"There was a huge argument about it a while back and I was firmly against using different APIs when comparing the different cards. If nothing else it makes a totally invalid benchmark because you are comparing apples to oranges, and the results are totally meaningless."

But of course a software T&L vs. hardware T&L comparison isn't Apples to Oranges and is perfectly valid? If a game is used as a benchmark in a CPU comparison and that game supports SSE and not 3DNow!, what do you do? Is it fair if that game, hand-picked by the reviewer to be the testing standard, supports one set of instructions and not the other? Are all games' support of instructions and APIs equal and fair? Are all pieces of hardware created equal, with identical implementations? No, no, and no.

I'm not a fanatical Glide supporter because, quite frankly, Glide isn't an issue these days. I'm not a Glide supporter but rather a supporter of fair, impartial testing.

How many gamers want to know which video card runs a game best, regardless of why, versus how many want to know which card runs a game best under one particular application programming interface?

I am apparently wasting my time trying to use simple logic. Please don't respond unless you answer the above question concerning software vs. hardware T&L.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
my stance on the API thing:

1) Quake3 and MDK2 make great OGL gaming benchmarks

2) Evolva is the only good D3d gaming benchmark for video cards

3) UT uses an engine shared by 3 VERY popular games right now, so it is fair to test each card on it using its "best" API. Everyone knows UT itself sucks as a benchmark, but gameplay in UT is very important. It just seems silly to hamstring a card that supports Glide when you'll always enable Glide in this game if you can. Kinda like enabling T&L in MDK2.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
But of course a software T&L vs. hardware T&L comparison isn't Apples to Oranges and is perfectly valid?

:confused:

In that case we'd better disable the Voodoo 5's second CPU. And disable their T-Buffer as well so that their FSAA scores aren't higher than the competition's.

Right?
 

Ahriman6

Member
Oct 24, 2000
78
0
0
"In that case we'd better disable the Voodoo 5's second CPU. And disable their T-Buffer as well so that their FSAA scores aren't higher than the competition's."

Absolutely not. Like I wrote above, I've never owned an S3 card but I'm entirely against S3TC not being used. Also like I wrote above, and like Robotech wrote, show each card at its best, whether that's an API, a driver setting, a hardware feature, etc. Start with a baseline of commonality and then move from there when comparing these cards: Glide and maybe FSAA for a V5, hardware T&L and Dot3 bump mapping for a GTS, and maybe anisotropic filtering for a Radeon (I'm not sure about that last one, since the GF cards also support it in OpenGL, though I have heard the Radeon's implementation looks better).

Remember, I believe reviewers should show these things and how they affect performance, good or ill, not hide them because one card has a possibly positive feature and the other doesn't. That goes for hardware T&L on a GF card. . .I would never suggest disabling it in low-rez Quake 3 testing, because that would not reflect how gamers will use the card in real-world situations, which is supposed to be why games, and not synthetics, are used for testing in the first place! To replicate real-world gaming situations: the very reason we fork out $100s for these cards!!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Absolutely not.

So don't complain about T&L then.

Also like I wrote above, and like Robotech wrote, show each card at its best, whether that's an API,

How exactly do you think Glide is the "best" for the Voodoo? If Glide is doing completely different things from Direct3D, how can you say it's the "best"? The comparison is meaningless and worthless.

I want to find out how well the card does, not how well Sweeney programs under Glide compared to Direct 3D.
 

Ahriman6

Member
Oct 24, 2000
78
0
0
"So don't complain about T&L then."

I never complained about T&L. . .only used it as an obvious answer. Oh, and you completely dodged the question I asked you to answer. :(

"I want to find out how well the card does, not how well Sweeney programs under Glide compared to Direct 3D."

I agree with the first part, but don't you want to know how well Carmack optimizes his engines for T&L? Hmmm, methinks you are engaged in a full-blown double standard here. Seriously, if you guys want to throw in the whole "programmers' skills" issue, which in Roscoe's example of UT and MDK2 I think is entirely fallacious, then why not just throw out all benchmarking? A popular game used as a benchmark that runs better under one API than another should be shown as just that. . .a popular game used as a benchmark that runs better under one API than another!!! Give the people the damn information instead of hiding it under the questionable auspices of anti-3dfx biases.

The only people I've ever seen against Glide testing are those who don't have it. Well, as of yet I don't have a hardware T&L video card and yet I'm all for such testing if a game supports it. Hmmmm. . . .
 

AirGibson

Member
Nov 30, 2000
60
0
0
Ahriman6 and Legion88 both have good points, IMO. Don't ya just hate people who "straddle the fence"? :) Legion88 feels that all the cards should be on "equal footing" for the benchmarks (yeah, I know that's a loose translation) while Ahriman6 is for letting the cards demonstrate all of their special features.

Neither stance is silly. The only way I could see the argument ended would be to benchmark everything: all the combinations using all the different APIs available to the game. But then, that can often take a lot of time. Hell, I just spent 5 hours over the last two days compiling my HSR benchmarks :) 55 in all, and that was only High Quality! So now we have the problem of having too many combinations to worry about testing.

It's truly a double-edged sword. Supposing Glide and Open GL APIs are available on a certain game, you have two outcomes:

1) Only Open GL benchmarks are used. The reader sees that card B runs Open GL better than card A, and thus thinks that card B is faster.

2) Glide is used for card A, Open GL for card B. The reader sees that card A has a higher final frame rate in the end, and thus thinks card A is faster.

Both of these cases are wrong in their own respects. Truly, this is a catch 22. Since there would be too many combinations to fully test EVERYTHING (API, FSAA, T&L, HSR, resolution, etc...) that option is out. So what can we do to settle this problem?
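The combinatorial blow-up being described here is easy to quantify. A minimal sketch, assuming a purely illustrative set of cards, APIs, and settings (these lists are placeholders, not an actual test plan from the thread):

```python
from itertools import product

# Hypothetical benchmark dimensions, loosely echoing the thread's examples.
cards = ["Voodoo5 5500", "GeForce2 GTS", "Radeon"]
apis = ["Glide", "OpenGL", "Direct3D"]          # not every card supports every API
resolutions = ["640x480", "800x600", "1024x768", "1600x1200"]
fsaa = ["off", "2x", "4x"]
hsr = ["off", "on"]

# Every combination a "benchmark EVERYTHING" policy would demand, per game.
combos = list(product(cards, apis, resolutions, fsaa, hsr))
print(len(combos))  # 3 * 3 * 4 * 3 * 2 = 216 runs for a single game
```

Even before pruning invalid pairs (e.g. Glide on a GeForce), one game already means hundreds of runs, which is why "test all the combinations" collapses under its own weight.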

And, just because I hate when people don't take sides, I'll say I agree with Ahriman6, and yes, I own a Voodoo 5500 :) I would simply find it disturbing for someone to say "Card A gets 35 fps in this game" when it actually gets 60 if you use the Glide API. Similarly, I think it would be unfair to take away T&L or Trilinear Filtering from an nVidia card.

Legion88 made the comment that Glide shouldn't matter since *only* Voodoo cards use it. Nobody else does. I certainly understand the initial logic behind this. However, refusing to employ a good feature that is native to a single card "just because nobody else has it" would not give you totally *accurate* results of that card's capabilities in the said game. Personally, I wouldn't care if my Voodoo 5500 ran Tribes at 3 frames per second in OpenGL. It's irrelevant. The Voodoo is the best card for Tribes, bar none, due to its great Glide support. However, if you benchmarked it based on its OpenGL results in that particular game, it would look pathetic. Therefore, there cannot be a single all-encompassing API to test for some games.

With all of the different cards, features, APIs, drivers, and games that are out there it is obvious that there is not a single standard. Since there is no standard, you will always run into stumbling blocks such as what you guys are debating.

Edit: If you're interested in seeing my HSR results in a simple table format, check out this link. They're pretty extensive covering 4 resolutions, all FSAA settings, and all HSR settings:
HSR benchmark results
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Oh, and you completely dodged the question I asked you to answer

Of course I dodged it. I'm not getting involved in this pointless debate again. Once was enough.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Air Gibson: "The only way I could see the arguments ended would be to benchmark everything"

FU(K YEAH MAN!!!!!

BENCHMARK 'EM ALL, LET GOD SORT 'EM OUT

BWWAAAHAHAHAHAHAAAAAAAA!!!!!!!!!!!!!!!!!

:D:);):p:D:);):p
:):D:p;):D:):D:p
:D:);):p:p:);):D
:p;):):p:D:):):p
:D:);):p:D:);):p

Honestly, here's the BIGGEST problems with benchmarking (in a nutshell, brief statements):

1) We ASSUME that benchmarking one game in one API is representative of ALL games of that SAME API. Not true. Roscoe hit on it somewhat when he said (basically) "not everyone programs in each API as competently as the next"

2) Using a game, such as UT, as a benchmark is all well and good, but if it supports 3 APIs, then benchmark it in all APIs, or benchmark it in the "best case scenario" API under "as equal conditions as possible", or don't benchmark it at all.

3) "Not all settings are created equal." Anyone who has seen the 5500's "trilinear filtering" in Q3 knows this. Anyone who has seen the Radeon's anisotropic vs. the GTS's anisotropic knows this. Anyone who has seen the 5500's 16-bit vs. the Radeon's 16-bit knows this. Anyone who has seen 3dfx's or the Radeon's 2D vs. the GTS's 2D knows this. Anyone who has seen the 5500's 2xFSAA vs. the others' "~2xFSAA" knows this to be true.

So we can't just say, anymore, "card A is better than card B with these features enabled because card A gets faster framerates with all the same settings".

our pal Roscoe might say (right before he powerbombs me on some cold, hard surface) "the data can be reproduced over and over and over again....The data .... is reliable if it can be reproduced by a third party following the same procedures."

Unfortunately, "not all settings are created equal" these days. I wonder if Dx8 will "solve" that, or at least relieve the current situation?

 

AirGibson

Member
Nov 30, 2000
60
0
0
Robo: "So we can't just say, anymore, 'card A is better than card B with these features enabled because card A gets faster framerates with all the same settings.'"

I agree. In fact, I think all of us here agree with that. The problem doesn't lie so much with us. You, myself, Legion 88, Sunny, we all *know* about those subtle differences that can sway benchmarks or about those features that might not be running. However, average Joe Gamer probably does not. They just want "a really good gaming card" and go read a few benchmarks to find out which is the best. Or, if enough benchmarks are run improperly, even the tech savvy might be fooled. Considering some of these great companies live and die by their products, I don't like the idea of them getting an "unfair" shake on some of these benchmarks. I don't think any of us like that. But I guess "that's life", as they say. Who decides what's fair?
 

DominoBoy

Member
Nov 3, 2000
122
0
0
Poor AirGibson. He's so desperate to have anyone agree with him. RoboTECH is a good guy, but AirGibson is nothing but a 3dfx Zealot, and um........ a crybaby. BooHoo little crybaby.

Waaaaahh, My V5 is not judged fairly. :( Waaaaahh Life is unfair. :( Waaaahh, my penis is really bigger than it looks. :(
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
just when i thought we were all starting to get along, or at least tolerate each other's opinions...:(

i cant honestly say i dont agree with any one thing that everyone has had to say. everybody has made good points, and yet many of the same people who say something i agree with have also said things i dont agree with.

benchmarking is indeed extremely difficult due to the many different factors that must be included in order to give the consumer the information they need to be aware of b4 buying a card. all cards should be benched to show their best abilities. that is why Glide shouldnt be disregarded just b/c it only works for 3dfx cards.

once all cards are benched to show their best and worst performance, then the consumer can make a decision based on the games they would be playing. if a person is looking for a card to play UT, Deus Ex, and other games with Glide based engines, then 3dfx is the way to go. if the person likes to play MDK2, Q3, and other games based on OGL engines, then nVidia is probably the way to go. if a person likes to play a variety of games with engines all based on different APIs, then he or she will have to make a decision and sacrifice the performance of their card of choice in one game for another. unless you have the time, money, and patience to switch video cards, drivers, or API support for every game you want to play, you'll have to make a choice and go with it. it really isnt that bad having to switch APIs b4 playing a game if you need to. and i think a lot of people will be surprised anyways with the performance of their new card in games it really wasnt hyped to be good for. after all, MDK2 looks great with my V5 5500.

there is too much to test, and it would take too much time, but if reviewers and benchmarkers can find the time to test everything and find the peak and worst performances of certain cards in certain games using certain APIs, it would be best for the consumer. that way he or she can make a choice as to which card will be best for his or her needs.
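The "bench everything, then pick per game" idea Sunny129 describes amounts to a simple lookup: record every (game, card, API) result, then for each game let every card use whichever API it runs best. A minimal sketch, assuming an invented results table (all frame rates below are made-up placeholders, not real benchmarks):

```python
# Hypothetical results: frames per second per (game, card, API).
# Every number here is an invented placeholder for illustration.
results = {
    ("UT",   "Voodoo5 5500", "Glide"):    60,
    ("UT",   "Voodoo5 5500", "Direct3D"): 42,
    ("UT",   "GeForce2 GTS", "Direct3D"): 48,
    ("MDK2", "Voodoo5 5500", "OpenGL"):   45,
    ("MDK2", "GeForce2 GTS", "OpenGL"):   70,
}

def best_card(game):
    """Return (card, api, fps) with the highest fps for the given game,
    letting each card compete under whichever API it runs best."""
    rows = [(card, api, fps)
            for (g, card, api), fps in results.items() if g == game]
    return max(rows, key=lambda row: row[2])

print(best_card("UT"))    # ('Voodoo5 5500', 'Glide', 60)
print(best_card("MDK2"))  # ('GeForce2 GTS', 'OpenGL', 70)
```

With placeholder numbers like these, the "winner" flips per game depending on which API each card is allowed to use, which is exactly the consumer-facing information the post argues for.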
 

DominoBoy

Member
Nov 3, 2000
122
0
0
I'm sorry Sunny, you are a good guy and I don't mean to bum you out. It's just that AirGibson is a crybaby. I'm just giving my opinions too.

BTW AirGibson, since Legion88 laid the smack down on you Long and Hard, I wouldn't really say he agrees with you. :)
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
No prob Domino...everyone is entitled to their opinion. i guess its just coming across a bit more harsh than some of us would like. while i dont agree with the way AirGibson and others are throwing remarks at each other, it has been quite some time since i've seen him say anything insulting or derogatory. his last few posts simply express the facts and his opinion on benchmarking. i have to respect him for his original post with all those benchmarks he tested and posted for us. remember, those benchmarks werent put up to knock down those who had bad things to say about the beta drivers, but rather to show us what they did for his system and what good they could potentially do for the rest of us who were interested.

this is quite the thread.:) lots of controversy and masses of information. i think i'll write a sci-fi novel based on this thread.;)
 

DominoBoy

Member
Nov 3, 2000
122
0
0
Haha Sunny. Sorry about all the mess, my mouth was really watering.

BTW, how you like my new signature? ;)
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
lol...funny signature

all of us who remember this thread will understand the inside joke:)
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
damn, what's with all the spit and stuff

gross! take baths, all of you, now!!!!
 

AirGibson

Member
Nov 30, 2000
60
0
0
**rubs temples**

Thanks for contributing yet again, Domino Boy. A quote from Byron feels appropriate here.

"He who ascends to mountain-tops shall find
The loftiest peaks most wrapt in clouds and snow;
He who surpasses or subdues mankind
Must look down on the hate of those below."

-Lord Byron, Childe Harold's Pilgrimage


Futile attacks on me notwithstanding, do you have anything to contribute to the discussion? Penises, crybabies, kisses to other forum members, etc... are all nice and everything, but c'mon... Let it go, bro! Let's talk video cards and clear out the garbage.
 

AirGibson

Member
Nov 30, 2000
60
0
0
I checked Quake 2, Tribes, Homeworld, and Unreal Tournament for HSR performance increases. Nuttin' doin'. Only the games based on the Q3 engine are experiencing any benefit. Now that's a shame :(

On the good side, at least we know what they're working on shows promise. On the bad side, I hate Q3 as a game, so while the performance increase there is nice, it's a bit moot to me. Because we all know how much better Tribes is than Quake 3 :) **ducks**

Not bad for an undocumented, unannounced feature in a beta driver, though.

Getting back to the subject of this whole thread, what will nVidia respond with (if at all) should 3dfx produce something great with HSR? I guess the NV20 would be as good an answer as any. Even HSR wouldn't save the 5500 from the speed that card may produce.