A quick note on benchmarking hypocrisy

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
I see that many sites have been using Unreal Tournament as a benchmark, but all of them refuse to use "glide"

WTF? Why?

"because glide is a blah blah not everyone plays glide blah blah only 3dfx has glide blah blah" etc. etc.

Bull$hit. Straight up biased reporting right there. First off, Glide is now open source, so ANYONE can make a driver for it. However, Glide will always run best on a 3dfx card. Glide is a feature of the 5500, so why not use it?

These same biased peeps are using HW T&L when running 3dMark2000. These same biased peeps set HW T&L to "enabled" in MDK2 (for BOTH the damn cards!! Brilliant!) I wonder why?

Why? "Because it's a feature of the GTS card". Well, WTF? Glide is a "feature" of the 5500, why not use it? Am I to believe that we can use a feature that DOESN'T belong to 3dfx, but we can't use a feature that *DOES*??? I wonder why?

Everyone LOOOOOOOVES to throw up overclocked benchmarks of the GTS cards. Great.....I don't see too many overclocked benchmarks of the 5500. I get an extra half-dozen FPS in MDK2 in fillrate-limited situations by overclocking from 166 to 183. You don't see too much about that, do you? I wonder why?
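A back-of-the-envelope sketch of that overclocking claim (assuming FPS scales roughly linearly with core clock in a purely fillrate-limited scene; the 60 FPS baseline is hypothetical, not a measured MDK2 number):

[code]
# Rough check: in a purely fillrate-limited scene, FPS scales about
# linearly with core clock. The baseline FPS is illustrative only.
stock_mhz = 166
overclocked_mhz = 183
baseline_fps = 60.0  # hypothetical fillrate-limited FPS at stock clock

scale = overclocked_mhz / stock_mhz        # ~1.10, i.e. a ~10% clock bump
estimated_fps = baseline_fps * scale       # ~66 FPS
print(f"clock scale: {scale:.3f}")
print(f"estimated: {estimated_fps:.1f} FPS (+{estimated_fps - baseline_fps:.1f})")
[/code]

At a 60 FPS baseline, a 166-to-183 overclock works out to roughly the "extra half-dozen FPS" quoted.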

 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
I agree with you 100%, RoboTECH. It's a double standard, plain and simple.

Consider the nVidia fans who say "fps means everything" and then claim Glide doesn't count because the GeForce doesn't have it. They make themselves look like dumbasses; they just hate to see their mighty vid card get beat.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,001
126
Bull$hit. Straight up biased reporting right there.

Why is that biased? How can you compare two different APIs in a benchmark? The whole point of running benchmarks is to keep the conditions for the two items as close to each other as possible.

How is that different from running one card in Direct 3D and the other in OpenGL? How can that be a valid benchmark?

These same biased peeps are using HW T&L when running 3dMark2000.

That's a hardware feature. Not testing it is like taking a new card and disabling all of its new features to make it behave like the old card. That's hardly an indicative benchmark, is it?

By your reasoning, shouldn't we be disabling the Voodoo's second VSA-100 chip? After all, the competing boards only have one. Using the second chip is "biased", isn't it?
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91
One quick question: if you cared about glide, why are you using a GF 2?

I do agree with you about websites benchmarking UT without Glide; it's stupid and unfair.
 

han888

Golden Member
Apr 7, 2000
1,586
0
0
Hmm, maybe because of the high demand for GeForce cards in the market (marketing) :p And I have a question here: why do GeForce cards have higher demand than the Voodoo5 in the market? There are a lot of possible reasons. Why do people want to buy GeForce cards? Why don't people buy the Voodoo5? I think that's the answer to your question, Robo :p
 

glen

Lifer
Apr 28, 2000
15,995
1
81
I agree with you, RoboTECH.
It would seem reasonable to me to do some benchmarking where both cards are run in their ideal modes, even if one is in OpenGL and the other is in Glide. That is how the end consumer will run whatever card he gets.

 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Comparing cards on the same API is fine, but failing to mention (or benchmark) that a card can run a game under Glide (at a faster speed) is inexcusable.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
I guess if you're going to benchmark two cards with the same API, why not use Glide?
V5500: pass
GF2: fail
That doesn't mean the GF2 is a piece of crap. I agree they should show what a video card will do with the capabilities it has. I think some sites do show what a V5500 can do with Glide-supported games; they just never show those benchmarks in a "card shootout".
 

glen

Lifer
Apr 28, 2000
15,995
1
81
If you get a 3dfx card, you will run UT in Glide.
If you get an Nvidia card, you will run UT in OpenGL or D3D.
Why not compare best to best?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
It is explained here. At the highest resolutions D3D is faster than Glide, which is what most people look at. The only time Glide is "faster" at the highest resolutions is when the color is set to 32 bits, which Glide doesn't support but allows you to select in UT.

Would you rather sites post numbers that showed the V5 slower? Perhaps adding Glide on top of D3D would be a good idea, but if the V5 is slower running Glide at the resolutions people play, and again it is only 16-bit support, which is fairly useless for most people, why would you want the numbers?

Lower resolution certainly is better, as Glide is less CPU intensive than D3D (which we all knew), so perhaps showing the best scores for each resolution and noting which API was utilized would be best? Don't know, but sites using D3D instead of Glide was explained dozens of times when benching the V5: no 32-bit color support, and slower at higher resolutions, which are failures on exactly the two things most people look for in a ~$300 video card, be it from 3dfx, ATi or nV.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,001
126
The only time Glide is "faster" at the highest resolutions is when the color is set to 32 bits, which Glide doesn't support but allows you to select in UT.

That's exactly right. Not only that, but Glide only supports small textures, which makes its comparison to other APIs even more useless. Do you buy a $300 board so you can run small 16-bit textures? I don't think so.

It's totally invalid to compare Glide to another API, given that such a benchmark would have the Voodoo and the other video card running the game in two very different ways.
 

tr0dd_

Junior Member
Jan 1, 2000
21
0
0
BFG, the comparison is COMPLETELY valid when comparing cards and allowing 3dfx cards to use Glide with UT. To be honest, when benching the 3dfx cards, the testers should pick the API that provides the best results for that particular resolution. No one is going to buy a Voodoo5 5500 and say, "I wanna play UT; since competing cards use D3D, I guess I should too." They will pick the best API for the resolution they will be playing at.
 

HigherGround

Golden Member
Jan 9, 2000
1,827
0
0
No, it's not. The whole point of benchmarking is to provide the same testing environment across the sample spectrum. The benchmarks are not supposed to show which card performs better in game X or Y; they are there to evaluate the test sample's hardware as well as its software (driver) capabilities. That's why the review sites keep the settings the same during the tests.
 

Eug

Lifer
Mar 11, 2000
24,026
1,644
126


<< Would you rather sites post numbers that showed the V5 slower? Perhaps adding Glide on top of D3D would be a good idea, but if the V5 is slower running Glide at the resolutions people play, and again it is only 16-bit support, which is fairly useless for most people, why would you want the numbers? >>


Hmmm... After getting into both Quake 3 and UT, I often wonder how much the benchmark reviewers actually play these games. In Q3, all the hardcore gamers I know play at 16-bit, because speed is key. 1280x1024x32 is not an option on almost any card because it is simply too slow. Indeed, because I use fairly lowish settings, I get about 67 fps out of Q3 demo001 with my Voodoo 3. However, if I were to get a faster card, first and foremost I'd want more speed; 32-bit and higher resolutions would be secondary. I'd go even lower resolution, but I must admit anything below 800x600 is getting too blocky for me. But these guys will play 640x480 with GeForces.

In UT, I would love to see cards benchmarked in Glide, at least in 16-bit, if 32-bit is unfair. As far as I'm concerned, there is probably no video card in existence that actually plays UT adequately at 1280x1024x32, but for lower resolutions it would be nice to know how much of a bonus Glide provides.
 

glen

Lifer
Apr 28, 2000
15,995
1
81
Eug,
Run the X-Isle demo at 640x480 and tell me if you think it is blocky.
 

Weyoun

Senior member
Aug 7, 2000
700
0
0
I'd have to agree with both Ben and RoboTECH here, although that may sound kinda weird :p
The fact that Glide now runs UT slower than D3D is what I'd call an exception. IF Glide were faster than D3D in the desired situation, then I can see no reason why it shouldn't be used.

What is the actual point of creating a benchmark? To show how well a system/card/foo performs under a given situation. Sure, a same-API comparison is useful TO A CERTAIN DEGREE, but how many people do you know who would prefer to run Deus Ex in D3D and not Glide? Benchmarking two cards with the same API shows how well each card performs under that API, not its overall performance. It gives an indicator, and probably an outlying one, of how well a card would perform if it were forced to use the same API under the same engine. Comparing two cards across APIs would give people a more 'real world' benchmark, say if Glide were preferable...

IMHO, when benchmarking a card, reviewers should provide both situations and then determine which is the desirable API to use, considering image quality, speed, etc, etc....

And to BFG: why should he turn off that second VSA-100 on the V5? When I buy a video card, I buy EVERYTHING that's on the PCB, including T&L capability, extra chips, DDR memory, or proprietary APIs! WHATEVER I PAID MY DODGY AUSTRALIAN MONEY FOR!!!

Everything should be considered and more benchmarks given. Time to get working, reviewers....
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,001
126
IF Glide were faster than D3D in the desired situation, then I can see no reason why it shouldn't be used.

Well, it may be "faster", but only because it sacrifices image quality and features. It is totally invalid to compare two different APIs in a benchmark. You are testing apples against oranges and the results are meaningless.

And to BFG: why should he turn off that second VSA-100 on the V5? When I buy a video card, I buy EVERYTHING that's on the PCB, including T&L capability, extra chips, DDR memory, or proprietary APIs!

I agree with you 100%, Weyoun. I don't think you should turn anything off on any video card. I was merely pointing out the flaw in RoboTECH's reasoning when he said that using T&L was "biased" and that it shouldn't be used.
 

Weyoun

Senior member
Aug 7, 2000
700
0
0
[rant] The results are only meaningless if you don't intend to use them in real life. Again, you have to consider what a benchmark actually sets out to do. It shows how well a card stacks up against the competition in a given situation, whether that includes Glide, T&L, bump mapping, or whatever the hell the user feels like testing. What I would like to see is cards benchmarked at their real-world implementation: an optimised configuration of Glide, D3D or OpenGL (whichever is preferable, for speed, image quality, or whatever reason) for the V5, and again an optimised configuration of OpenGL or D3D for the GF2, etc. I'd also like to see a section on the pros and cons of selecting the various APIs (although this is game specific, especially in the case of UT). The only case I can think of where a same-API benchmark is useful, where the API isn't optimal for either or both cards, is determining, yet again, possible future results under the same engine where only that API was available. I am asking every one of you to tell me a reason why a same-API benchmark would be useful, excluding the one previously mentioned, if any one of the cards wasn't going to use the API it was assigned.

And this comparing apples to oranges business is complete BS!! I know this is a pretty big statement, but I don't care if Joe Schmoe in Britain and Schmoe Joe in America have an apple and an orange respectively; it honestly doesn't matter! When I buy a card, yet again, I buy everything on the PCB, whether it be 'a generation ahead', has T&L, has DOT3, etc... As for the 'comparing a Voodoo5 to a GF2 is unfair' argument (I am referring to all those schmucks on the 3DFX Gamers BBS), the claim that the former was targeted at the previous generation means nothing. Why don't we consider price as an apples-vs-oranges case? If the Voodoo5 was intended to compete against the GF1, then why 1) doesn't it cost the same as a GF1, and 2) wasn't it released in the same generation? If I can buy a higher-generation card (GF2) for less than a 'lower-generation card' (V5), should I really care what was intended for what?

What we need here to destroy the whole apples-vs-oranges argument is a new benchmark that considers things other than speed, maybe including a mark for the card's image quality or artifact level. Although this is beyond the scope of this argument, it's something I'd like to see thrown around the net rather than just 'card X runs faster than card Y'. Apples-and-oranges arguments should only apply when two cards have different purposes, something that doesn't apply to the video card industry as a whole.

Call me insane if you will, but everyone making major points here except Ben and a few others has blood on their hands. RoboTECH, I have to agree with 99.9% of your statement, but this disabling-T&L thing is just silly. BFG, you're in the same camp as RoboTECH, except you're rooting for the other team. Enable everything and let it be. Ben deals with the API-selection part of this post and I am 100% with his post. tr0dd_ and Glen, you're in the same boat as Ben. If I am wrong in ANY part of my statement, then let me know...
[/rant]

well, there goes my stress relief for the day :) And Ben, when is b3d expected to be back up again?

ack!! I feel the need for a rant again....

[rant]
I am so goddam sick of this corporate-run, money-driven goddam world!! GODDAM!! Although this might be considered OT, it can be directly applied to the video card industry and benchmarking techniques. Get off your a$$es, make the best product you can, and you WILL have our money!! Ignorant, biased people are what is holding this world back, including about 50% of the people on the 3DFX Gamers BBS (I won't call them the only ones, as I haven't seen the nVidia- and ATI-dedicated BBSs).

I am really starting to feel like cautery... :( :( :( :( :( :(
[/rant]

Hopefully I won't have to do that again for a long time...
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,001
126
I am asking every one of you to tell me a reason why a same-API benchmark would be useful, excluding the one previously mentioned, if any one of the cards wasn't going to use the API it was assigned.

If video card reviewers slapped on an extra set of benchmarks of games running in Glide mode, how would that help you in any way? You would see a graph of Glide benchmarks, and then what? The results would be totally meaningless. Glide is a completely different API with a completely different way of doing things.

The whole point of benchmarks is to keep everything the same but to modify the things which you are testing for. In this case we get an identical system with a clean install of Windows and the game to be tested. We also make sure each game is configured in the same way.

Then we add the modifications which let us test the differences between each board. Running low resolution/low detail mode tests the driver efficiencies and running high resolution/high detail mode tests the video card fillrate.

Modifying the API between the two tests is totally invalid. It's like running a race between two cars and using a high octane fuel in one car and a low octane fuel in another. The results can't be compared in any way.
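A minimal sketch of the controlled-variable setup being argued for here, contrasted with the "best API per card" setup others in the thread want. This is illustrative only: run_timedemo is a hypothetical stand-in for whatever timedemo harness a review site actually uses, and the settings are made-up examples.

[code]
# Sketch of the two benchmarking philosophies in this thread.
# run_timedemo() is a hypothetical stand-in for a real timedemo harness.

FIXED_SETTINGS = {
    "game": "Unreal Tournament",
    "resolution": (1024, 768),
    "color_depth": 16,
    "detail": "high",
}

CARDS = ["Voodoo5 5500", "GeForce2 GTS"]

# Controlled-variable run: the API is held constant, so any FPS difference
# is attributable to the hardware/drivers, not to the API.
SAME_API = {card: "Direct3D" for card in CARDS}

# "Best vs best" run: each card uses the API its buyers would actually pick.
BEST_API = {"Voodoo5 5500": "Glide", "GeForce2 GTS": "OpenGL"}

def run_timedemo(card: str, api: str, settings: dict) -> float:
    """Hypothetical: run the timedemo on `card` under `api`, return avg FPS."""
    raise NotImplementedError("stand-in for a real benchmark run")

def benchmark(api_choice: dict) -> dict:
    # Everything except the card (and, in BEST_API, the API) stays fixed.
    return {card: run_timedemo(card, api_choice[card], FIXED_SETTINGS)
            for card in CARDS}
[/code]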

Suppose we ran the benchmark in Glide on the Voodoo and in OpenGL on the GF, and we got some results. What do these results mean? Say the game was Unreal and Glide happened to score higher. That could be because:

(1) Glide is optimised better than the other APIs on the Voodoo.
(2) Glide is faster than the other APIs, and in order to do this it must have worse image quality/less features.
(3) The game engine itself is programmed in such a way that Glide runs better.

================================================================================
Looking at the two objectives we wanted to test, we see that:

Running low resolution/low detail
We wanted to ascertain how optimised the two drivers are between the boards. The results we have received are meaningless. Is Glide more optimal than OpenGL? What do we even mean by "optimal"? Is nVidia's OpenGL ICD less optimised than 3dfx's Glide? What do we even mean by "less optimised"?

Running high resolution/high detail
We wanted to compare the raw pixel-pushing power of the two video cards. Once again we have received meaningless results. Is Glide modifying how/what pixels are actually getting pushed? Are the same pixels even being pushed as they are in OpenGL? Does the Voodoo have more fillrate than a GTS because it gets higher results? What does "more fillrate" even mean in a test like this? (A rough sketch of the fillrate arithmetic follows below the divider.)

These questions are like asking whether a jacket is better than a pair of trousers. It just doesn't mean anything.
================================================================================
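For concreteness, here is what a fillrate ceiling cashes out to. This is a sketch: the fillrate figures are approximate period specs for the two cards, and the overdraw factor is a rough assumption, not a measured value.

[code]
# What "fillrate-limited" means: the card can draw only so many pixels per
# second, which caps FPS. Fillrates are approximate period specs; the
# overdraw factor (how many times each screen pixel gets drawn) is a guess.
def max_fps(fillrate_mpixels: float, width: int, height: int,
            overdraw: float = 3.0) -> float:
    pixels_per_frame = width * height * overdraw
    return fillrate_mpixels * 1_000_000 / pixels_per_frame

# Voodoo5 5500: 2 VSA-100 chips x 166 MHz x 2 pipes  ~= 667 Mpixels/s
# GeForce2 GTS:                   200 MHz x 4 pipes  ~= 800 Mpixels/s
for name, fill in [("Voodoo5 5500", 667), ("GeForce2 GTS", 800)]:
    print(f"{name}: ~{max_fps(fill, 1024, 768):.0f} FPS ceiling at 1024x768")
[/code]

These are theoretical ceilings; whether a run in Glide and a run in OpenGL even push comparable pixels is exactly the question raised above.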

In the case of Glide the situation is made even worse because it started out as a proprietary API. This alone should ring warning bells, because proprietary APIs are avoided in benchmarks at all costs.

When I buy a card, yet again, I buy everything on the PCB, whether it be 'a generation ahead', has T&L, has DOT3, etc...

In that case a Glide benchmark for you would be valid because you specifically know what you are looking for in the results. But for a general purpose benchmark it would fail.

It's like having a round of general CPU benchmarks and then running a special compression benchmark on one CPU because we know somebody has a specific interest in how it will turn out. While the CPU may blow at the general benchmarks, it may do well in the specific benchmark.

If you are only interested in this benchmark then obviously you will get this CPU even though it is clearly worse than the other CPU. In this case you have a specific interest in mind and you don't care about the rest of the results.

The same goes for a Voodoo. If all you want is to run Glide games and nothing else (and perhaps some FSAA games as well), the Voodoo is the best card for you.

If I am wrong in ANY part of my statement, then let me know

You are wrong about my position. I say the benchmarks being done on websites are fine. Disabling hardware features (other than FSAA, for obvious reasons) is bad and shouldn't be done. Using different APIs across the benchmarks is obviously extremely flawed and yields nothing of use.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Let's take a hypothetical where the GeForce2 runs Unreal Tournament in OpenGL mode at 50% higher speed than in D3D (with approximately equal visual quality). Knowing that anyone playing UT on a GeForce2 would play in this mode, I believe it would be pretty important to compare it against other cards in a different API, as well as against itself in a different API.

Benchmarks are a method of quantifying real-world performance. If people are gonna play UT, they are gonna use the best API for doing so.

In this case, it seems that the V5 running UT is a mixed bag. The latest drivers are showing that neither D3D nor Glide is the clear winner. Nevertheless, it's a feature and is fair game for comparisons. After all, you aren't benchmarking a UT D3D wrapper, you are benchmarking Unreal Tournament.
 

Weyoun

Senior member
Aug 7, 2000
700
0
0
BFG:

Well, yes and no about your statements. What are website benchmarkers actually trying to determine? If it is a general benchmark, then I see your point and it should be same API and same detail/graphical/etc. settings. But on the other hand, what if benchmarkers are trying to determine what performance is going to be like for the average user, who tries to get the game working in its optimal state? That should be taken on at the game-review level, and IMHO it would be very valuable.

Another hypothetical here: what if the GF2 ran better than the V5 in Q3, for example, but the V5 trounced it when optimised? Rather than throwing general benchmarks around the net and comparing two 'general' results, I would rather see the sites show the optimised benchmark if it were for comparing individual games. And I don't see why these results would be meaningless. If Glide has a different way of doing things, then so what!! I couldn't give a flying fudge if someone takes the stairs to the top of the building and his mate takes the elevator; as long as the guy who used the elevator still exercises his tubby butt, it really doesn't matter. And on the comparison to a car race: yes, that applies to the hardware-specific benchmark, but nothing else.

For a hardware-specific comparison, though, general benchmarks would be the best solution.

Conclusions:
Both kinds of benchmarks are very useful, and optimised benchmarks should be implemented by game reviewers. Benchmarkers should also state the intentions of the test and configure it accordingly.

BTW, thanks for the honest post, I really appreciate it :)
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
I think at one time Glide was a valuable benchmark. It had a significant performance advantage over other APIs. It of course required 3DFX hardware. What many people seem to forget (this may be a shocker for some) is that some people actually buy games to play them, not just to run benchmarks! If a game you really liked ran much better in Glide than in any other API (remember Unreal?) and you wanted the best gaming performance, then a Glide benchmark was very important. You wanted to know which API and hardware gave you the best gaming performance. You didn't care if it was supported on everyone's card.

These days, with D3D and OpenGL performing the way they do, I don't think a Glide benchmark is needed any longer. It no longer has any performance advantage; it's actually the opposite. And when I say performance, I'm not just talking about FPS. Performance includes visual quality as well.
 

Eug

Lifer
Mar 11, 2000
24,026
1,644
126
Looks definitely matter.

However, like I've said before, for games like Q3 and UT looks are definitely secondary. Raw speed is what counts.

My favourite game is UT, and knowing that Glide is usually the fastest platform at the lower resolutions in UT, the Glide performance of a card is of extreme importance to me at this point in time. Other than UT and Q3, I play no other games, ever.

In fact, my card choices are strongly influenced by how well they do in a SPECIFIC game. When I played NO games, I bought an ATI Rage Pro. When I started playing UT, I bought a Voodoo 3. When the new generation games come out, I'll probably buy another card. (This may sound expensive, but in fact it isn't because I generally don't buy top-of-the-line cards.)