3dfx just unleashed their mega secret weapon HSR, will NVIDIA respond?


AirGibson

Member
Nov 30, 2000
60
0
0
First off, I need to correct one of my scores. It turns out that 4x FSAA *is not* playable at 1024 x 768. I must have screwed that up somehow. So, I went back and re-checked my numbers on the other benchmarks and they are correct.

Next, Domino Boy, this isn't a personal attack, but I'm not sure why I should respond to some of your comments when you didn't respond to a single one of mine :) In any case, here are my responses to your posting:

"#1- I am not complaining about anything. I was simply saying it's ironic that the same game 3DFX Zealots have trashed for being "A bad benchmark", is the same game they are now praising, that's all. It's just funny. "

Here is a cut and paste from my initial post, and I *dare* ya to try to refute any of them :)
A) It is a simple game to benchmark.
B) It is currently the most popular game to benchmark.
C) It is the game that the GF2 has always beaten the 5500 in (without FSAA) and the game that many nVidia fans use to "rub it in" when arguing with 3DFX fans about how slow their card is.

IMO, it wasn't / isn't a fair bench for the Voodoo, but even in the face of what I think is unfair, it performs excellently now. So my question is why suddenly should Q3 *not* be looked at? Nearly every single review site lately does Q3 benchmarks, yet this has been out two days and you're griping about "only seeing Q3 benchmarks"? C'mon, man... Personally, I honestly don't have any other games installed on my system that use OpenGL except Baldur's Gate 2 and Tribes (runs like ass in OpenGL on a Voodoo). Believe me, I'm all for seeing more games benchmarked, but I also realise that I'm not going to see 30 different games benchmarked the day after a new driver set is released. If it makes ya feel better, people are reporting huge increases in Unreal Tournament. I personally haven't benched that one, though.

"#2- Are you suggesting that we are to believe what one person in a chat forum says over the many people out there that are saying that high resolution is extremely screwed up with these drivers? That's would not be very wise."

Grrr... I get disgruntled when people reply to me without even reading my post. And I quote once again from my last post: "Domino Boy, I know there's a lot of garbage out there regarding these drivers. Don't believe anything. Heck, don't believe me! Go test it, or listen to someone you trust who tests it."
Perhaps you see why your "point #2" is invalid. I agree and never suggested otherwise. I'm simply posting my results and observations. In the end, I suggest you find out for yourself rather than listening to me or to people (e.g. RivaStation.com) who haven't the slightest idea of how to properly use HSR before they post their inaccurate information. Oddly enough, you seem to embrace their inaccurate results as a method of refuting what I posted.


"#3- I see results for only ONE game, and I see mostly problems at high resolutions from what the majority of people say. That is less than acceptable proof for anything meaningfull at all."

Once again, I challenge you to respond to this: How is *doubling* your FPS at 1280 x 1024, which is a high resolution, "not meaningful"? 42.4 to 80.4 (yeah, I know it's actually 1.896 times rather than double... humor me). And please, PLEASE don't tell me that you're going to claim 1280 x 1024 isn't one of the "higher resolutions".


"#4- Direct3D (majority of games) seem to not benefit whatsoever."

Yup.

"#5- Show me ONE SINGLE reputable site or reviewer that has used these drivers and shown any solid evidence of anything."

Man, I honestly couldn't care less about giving "Domino Boy" on Anand's forums "solid evidence". That's not a personal stab at you, but rather at anyone who acts like this. If you're concerned about it, go find your own evidence. My evidence is my results. Ignore it if ya want to and tell me that it's worthless because I don't have a popular reputation. I'm not here to hold your hand, and I'm sure as hell not going to go web surfing for you :)

"#6- Don't give me the "They just came out" story, it doesn't work. Afterall, since you are so quick to proclaim them 'miracle' drivers with no ruputable proof, don't be suprised when people are quick to point out that they are worthless and mean nothing without complete and thorough testing."

Miracle drivers? If you were referring to me or Sunny (since it is directed towards us), I'm wondering where we posted such claims of them being "miracles". Feel free to quote us. Forgive me for getting a preemptive chuckle from knowing that you can't, and that it is simply more unfounded conjecture on your end. If you're going to address us, please address what we say rather than inventing things.

"There seems to be increases in ONE game, and that means squat."

Well, there's no "seems" to it. Q3 has seen HUGE increases at multiple resolutions. I agree that it would be nice to see several other games benchmarked before anyone can say these drivers are "miracles".

"But for now, it's just smoke."

For you, yes. For me, well, I've seen the results first hand, so there's no smoke to it.
 

SleepyTim

Member
Oct 27, 2000
106
0
0
Hello AirGibson. Looking at the last several posts it's obvious that Sunny and Domino don't have a problem anymore, but you certainly appear to.
I understand completely what Domino was saying. If the FPS increases come at the expense of graphic anomalies and bugs, then it's a poor trade-off and not worth it. After looking around the web, it appears that most people are saying the drivers have significant problems above 1024. Again, this is exactly what Domino was saying, and exactly what RoboTech was saying.

You are just looking for trouble with your "I challenge you" and "I dare you" smartass remarks. That's obvious. While you may not like the way a person states the facts, the facts are the same regardless. I noticed that you could not give any credible sources of extensive testing. So since you can't answer his challenge, I don't think you are in a position to be making any of your own.

I actually hope these drivers are not as bad as they seem, because I am thinking about getting a PCI V5 myself to add to either my RADEON or ULTRA rig. But so far it does look like the drivers are more hype & fluff than substance. It's great that you are not having the problems, but the vast majority of people are experiencing significant graphic anomalies and bugs at high resolution. But since they are just beta, maybe they can fix all the problems... or maybe not. We will have to wait and see.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
"IMO, it wasn't / isn't a fair bench for the Voodoo..."

Quake3 is quite nearly the perfect OGL benchmark for "today".

It is repeatable. It is consistent. EVERYTHING you do affects it.

Set your memory from CAS 3 to CAS 2? You'll see improvement. Up your FSB from 100 to 124? You'll see improvement. Drop AGP from 2x to 1x? You'll see a slight drop. Turn off AGP entirely? You'll see a noticeable drop. O/C your CPU, you'll see a gain. O/C your video card, you'll see a gain.

In short, Carmack managed to make an engine that has intense, immense graphical complexity, and uses EVERY single item in your system. No benchmark out there (aside from 3DMark2000) makes such excellent use of your hardware.
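
(For anyone who wants to reproduce this on their own rig, the usual way to get a repeatable number is from the Q3 console, something like:

  timedemo 1
  demo demo001

When the demo finishes, the console prints the frame count, the time, and the average FPS. Change one thing at a time -- CAS, FSB, AGP, clock speeds -- and re-run to see the effect.)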

As far as being "fair", what ISN'T fair is that at HQ settings, the GTS does trilinear, the 5500 only does bilinear. :-/ But you can make up for that almost entirely with a more aggressive negative lodbias setting.

"#4- Direct3D (majority of games) seem to not benefit whatsoever."

Yup.


Guys, think about what you do to enable HSR. IT IS AN ADDED GL EXTENSION. It is undocumented/unsupported, and it's not even in the tools. That's why it has no effect on D3D. The GL extension is, presently, just a "test" for HSR.
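
For the curious, this is roughly what "an added GL extension" means from an application's point of view: the app asks the driver for its extension list and looks for the new token. A minimal C sketch below; note the extension string name is purely a guess on my part, since the real HSR extension is undocumented:

  /* Sketch: scan the driver's extension list for a vendor extension.
     Requires a current OpenGL context (created via wgl/glX beforehand).
     "GL_3DFX_hidden_surface_removal" is a made-up placeholder name;
     nobody outside 3dfx knows what the real token is called. */
  #include <string.h>
  #include <GL/gl.h>

  int has_extension(const char *name)
  {
      const char *ext = (const char *)glGetString(GL_EXTENSIONS);
      /* naive substring match; good enough for a quick check */
      return ext != NULL && strstr(ext, name) != NULL;
  }

  /* usage, once a context is current:
     if (has_extension("GL_3DFX_hidden_surface_removal")) ... */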

Now then, I will repeat myself:

1) Benchmarks are benchmarks. In-game performance is everything. Benchmarks are nothing if they aren't indicative of true, in-game performance.

2) Visual anomalies are not acceptable in any way, shape or form. That's why Q3 benches with "faster" depth precision are, in my eyes, bogus.

3) Stability is key. Some people are reporting stability issues. Not sure if they're just dense or did something wrong (heh... registry hacks are cool...), but until we can verify that:

  • the benchmark scores correspond with playability in game
  • the visual anomalies are dealt with ENTIRELY
  • the drivers get WHQL-certification and maintain 3dfx's reputation for excellent stability

then EVERYONE should reserve judgement. We can't say that these drivers are "good" or "bad" yet, no matter how badly we want them to be (and HOOOOO-BOY, do I want them to be!!!).



 

AirGibson

Member
Nov 30, 2000
60
0
0
This will be my last post in this thread, since it's almost child's play to slaughter this stuff.

"Hello AirGibson. Looking at the last several posts it's obvious that Sunny and Domino don't have a problem anymore, but you certainly appear to."

Forgive me for being disgruntled that Domino was implying I said things I didn't say, and for pointing out multiple points I have observed first hand that refuted his arguments. I guess that equates to "a problem" in the books of some.

"I understand completely what Domino was saying. If the FPS increases come at the expense of graphic anomalies and bugs, then it's a poor trade-off and not worth it."

Yup. I agree totally, but I am saying that in many of the cases there is no trade-off; there is only an increase in FPS. In those cases, these drivers are excellent to have. In the cases where they are glitchy, they are worthless. Once again, have I implied otherwise anywhere? Nope. So why are we pretending that I have, or that I "don't understand" this?

"After looking around the web, it appears that most people are saying the drivers have significant problems above 1024. Again, this is exactly what Domino was saying, and exactly what RoboTech was saying."

After testing this for myself, I can say that at certain points they didn't have their Voodoo 5500 set properly in some of these "tests", or that their setup is certainly not indicative of what others are seeing.

"You are just looking for trouble with your "I challenge you" and "I dare you" smartass remarks. That's obvious."

Actually, the remark was "I dare ya to try to refute any of them :)" I had hoped the smiley face would convey a friendly tone to most readers. Take it however you choose, though. Semantic warfare is a straw man to most and of no consequence to me.

"While you may not like the way a person states the facts, the facts are the same regardless."

Eh, forgive me for saying that some of the "facts" are "fictions". Anyone with a Voodoo 5500 set up properly can prove it. Go on. Try it. Get a V5 5500 and run the tests, or find someone, ANYONE, to run them for ya. See the results first hand with your own eyes. Garbage such as "The HSR driver does nothing for any high resolutions", for example. You think that's fact? Resolve this cognitive dissonance at your own peril, or pretend that I (and many others) didn't get my results. It matters not to me.

"I noticed that you could not give any credible sources of extensive testing. So since you can't answer his challenge, I don't think you are in a position to be making any of your own."

Ah, I see. My challenge is that of a "smartass" while his is legit. Goooood concept. For hopefully the last time, I will say that I've seen it with my own eyes and on my own computer. I'm not here to "convince" you of anything, nor have I searched for any HSR benchmarks from popular sites. All I can do at this point in time is tell you what I have seen and what other 5500 owners have seen, in hopes of piquing your interest. At present, I've yet to see any sites with *any* thorough testing of these drivers, so what exactly am I supposed to say? You're 100% correct: I cannot point you to a website today that has done thorough testing and drawn conclusions. If that makes someone wrong in your mind, que sera sera. I'll deal, I guess.

"But so far it does look like the drivers are more hype & fluff than substance."

I would (and always have) take the same stance until I saw convincing proof otherwise, though I can't say I would blast someone for offering up tons of benchmarks they took the time to run, post, and explain.

"It's great that you are not having the problems, but the vast majority of people are experiencing significant graphic anomalies and bugs at high resolution."

God, this is humorous. Once again, please quote *anywhere* that I've said there aren't any problems at high resolutions with HSR. 1280 x 1024 in Q3, no FSAA, HSR 3 runs fine on my system when it is set up properly. Period. That does not translate to "Gibson says he has no problems in all of the high resolution settings." And you wonder why I got frustrated?

Feel free to correct yourself if ya like.
 

BW

Banned
Nov 28, 1999
254
0
0
Hehe, that was OK, Domino. Sorry for saying what I said. I work nights, and I usually have a few brews when I get home, and that's how that came out. I will ignore your posts from now on, as we seem to clash. Later.
 

legion88

Member
Nov 27, 1999
34
0
0
Check out Crappy Hardware for a short editorial on benchmarks. It actually covered a topic related to the V5 not rendering frames as originally intended -- e.g. trilinear turned on but not used by the card. But as you can see, it also relates to the current HSR topic: frames are not rendered as originally intended, with some objects like floors and walls, which make up a large chunk of the screen, not being rendered at all.

 

DominoBoy

Member
Nov 3, 2000
122
0
0
I think we have all learned a lot from this thread.

# 1- We should all try not to make rude comments. I'm sorry for my first one, which was kinda sassy. :)

# 2- The 3dfx drivers do show some promise, but still need lots of help before they work correctly. But if they ever get the bugs taken care of, and they can work for many games instead of just one, they could be very nice and give the V5 an impressive boost.

# 3- BW, it's ok man. I apologize for my own harsh words too.

# 4- AirGibson apparently has OCD (Obsessive Compulsive Disorder), and despite being proved wrong and plain silly repeatedly, he continues to make huge posts trying to break down each and every word of people in a desperate attempt to hide his own mistakes and ridiculous claims. He seems to be obsessed with this whole thing, and while everyone else is getting along now, and even apologizing, he is spending all his waking moments thinking of what he can say next, and going on & on & on & on & on & on. Everyone else is getting along now except you. I have even apologized to the other people that thought my comments were rude. I can't apologize to you, though, because you keep going on & on & on & on, and I have no respect for you. Look at the length of your posts, man. They are just rambling and trying to start trouble. Get over it, dude.

I have just edited #4 (my comments about AirGibson). They were much longer and very blunt. But they would have driven him crazy and caused more trouble, and I am trying to keep the peace. :)


 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0

I can certainly agree with the point you were trying to make. Take a read of this thread and you'll see that.

But GEEEZ, Roscoe... that went overboard a bit.

Oh, and do me a favor. Can we get a WARM floor, please?

You're always powerbombing me onto the "cold, cement floor"

Let's warm it up a bit. Some carpeting...perhaps turn the heat on in the building...SOMETHING. sheesh..... :D
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Seriously, I've never come across a thread that is so informative, yet entertaining at the same time! I must say I do appreciate everyone who has chosen to share their knowledge, benchmarks, and other tidbits here. We've learned that these beta drivers did some pretty amazing things for someone out there. Yet we also know this doesn't extend beyond Q3, OpenGL, and moderate resolutions yet.

Can't we all just get along? ;)
 

legion88

Member
Nov 27, 1999
34
0
0
No, my editorial here didn't go overboard. It was written at the time when people, on behalf of the Voodoo5, were actively downplaying the significance of faulty benchmark scores being presented. Do you catch my drift?

I don't mind comparisons where one or more cards were given an advantage or disadvantage. What matters is that this handicap or advantage is made known, and reviewers often do not say a thing. It is also important to understand what that handicap or advantage means for the scores, and some people seem to not understand that at all.

At least with trilinear (in multi-texturing), we know that this is a limitation of the hardware, so no driver magic will solve it. With the current HSR drivers, it is up in the air for now. We don't know yet if all this is just an illusion or whether there will be real improvements in the future.

In any event, comparing scores and/or making conclusions based on invalid data often ends up producing invalid comparisons and/or invalid conclusions.

As long as those 'anomalies' exist, the scores are not valid for true HSR performance. They are valid as scores where anomalies are present, such as the floor not being rendered (and that obviously saves fill rate and bandwidth). So people should be comparing the scores to other video cards that exhibit similar visual anomalies.

And one more thing, the example I used in my editorial was of course made-up. It was based on another example provided by a professor of mine. He told the class that he had "scientific proof" that men are smarter than women. According to my professor, some "scientists" in the late 1800s thought that brain sizes were indeed an indication of intelligence. They dug up graves just like I mentioned in my editorial. This led to the conclusion that men were smarter than women. He was trying to explain to people that invalid data can often lead to bad conclusions, regardless of how reproducible that data is. I simply took his example and modified it.
 

legion88

Member
Nov 27, 1999
34
0
0
Three posts in a row...Anyway,

Why is it that there are no MDK2 scores, Quake II scores, or scores from any other OGL application including other Quake III engine games?

With all these Voodoo5 users here, one would think that one of you would test it out in some application other than Quake III.


 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
legion88, since you mentioned MDK2, allow me to add another question to the thread...

I recently downloaded the MDK2 demo and played it using my current system setup (P2 350 and a V5 5500). Now, because MDK2 was built on an OpenGL engine, does that automatically mean that my V5 5500 will default to OpenGL support in order to play the game? The reason I ask is because I know 3dfx is known for less-than-good performance using OpenGL, yet my MDK2 demo runs very smooth and the graphics are beautiful. Now, running Unreal Tournament using OGL is horrendous: the graphics are very dark and ugly. But MDK2 is very pretty and smooth. Can anyone explain this?
 

legion88

Member
Nov 27, 1999
34
0
0

Sunny129, yes, when running MDK2, the OpenGL drivers of your Voodoo5 5500 will be in use.

Regarding smoothness and ugliness, you are comparing MDK2's development team's "mad programming skills in OGL" to UT's development team's "mad programming skills in OGL" (OGL == OpenGL). The difference you are seeing is the difference in their programming skills, which apparently can be significant. MDK2's is better.

This is why I laughed at the numerous attempts by 3dfx supporters, including reviewers, insisting that they should compare Glide in UT when using 3dfx cards to D3D (or OGL) when using other cards. By playing these benchmarking games with the selection of the "right" API, these "non-biased" people are deliberately folding the game developers' differing skills with different APIs into what is supposed to be a hardware comparison.

To give you an example, Tim Sweeney of UT can program in Glide a lot better than in OGL or D3D. Because Sweeney has an easier time with Glide (since he knows it better), it is only natural that the game will run better in Glide.

But there is another problem when playing these API games. Again, let us take UT as an example. I do not remember the site anymore, but someone compared the performance in UT using a GeForce2 GTS in OGL and D3D. At the low resolutions, the OGL scores and D3D scores were roughly the same. I don't remember which was faster, but they were roughly the same. Up the resolution to 800x600 and we see the D3D scores start dropping by over 10 FPS, while the OGL scores dropped about 3 or 4 FPS. As the resolution increased, it became very apparent that the D3D scores were dropping faster than the OGL scores. This means that UT, when using D3D, is consuming more fill rate than in OGL. This indicates that when using D3D, UT is showing more graphical effects, more eye candy, than in OGL, or that there are some very significant programming flaws in its D3D path. If such a difference is present in the same game by simply switching APIs, then it makes benchmark comparisons invalid when different APIs are used for different cards.
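
To make that reasoning concrete: FPS deltas exaggerate at high frame rates, so the honest way to compare the two API paths is to convert FPS to milliseconds per frame and divide by the number of extra pixels. A toy calculation, using invented numbers in the spirit of the scores described above (NOT real benchmark results):

  /* Toy math: invented FPS figures, not real benchmark data. */
  #include <stdio.h>

  int main(void)
  {
      double px_lo = 640.0 * 480.0, px_hi = 800.0 * 600.0;
      double d3d_lo = 70.0, d3d_hi = 58.0;   /* hypothetical: drops ~12 FPS */
      double ogl_lo = 70.0, ogl_hi = 66.0;   /* hypothetical: drops ~4 FPS  */

      /* extra milliseconds spent per frame, per additional pixel drawn */
      double d3d_cost = (1000.0 / d3d_hi - 1000.0 / d3d_lo) / (px_hi - px_lo);
      double ogl_cost = (1000.0 / ogl_hi - 1000.0 / ogl_lo) / (px_hi - px_lo);

      printf("extra ms per extra pixel -- D3D: %.7f, OGL: %.7f\n",
             d3d_cost, ogl_cost);
      return 0;
  }

The much larger per-pixel cost on the D3D path is what points at fill rate (more work per pixel) rather than some fixed overhead.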

Read my editorial here. It discusses what happens when invalid data is used to draw conclusions.
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
I believe HSR is a wonderful advancement when it comes to video cards, as it takes the burden off memory bandwidth. If 3dfx can work out the bugs with these new drivers (which I believe they can do), it will take the 5500 from a good buy now to an incredible buy, since it will add at least a year to the 5500's lifespan. For a video card that is an incredible feat!!

Being a Radeon owner, I'm on the outside looking in, thinking "Damn!! Can ATi bring out more power in HyperZ? I certainly hope so."
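
For anyone wondering why skipping hidden surfaces helps bandwidth so much, here is a toy sketch of the overdraw arithmetic; it is nothing like the real hardware, just the bookkeeping. Picture one wall fully covering one floor:

  /* Toy overdraw count: a near wall completely hiding a far floor. */
  #include <stdio.h>

  int main(void)
  {
      long pixels = 1024L * 768L;
      /* conventional pipeline: both surfaces get textured and written,
         and the z-buffer throws the hidden floor away after the fact */
      long work_naive = 2L * pixels;
      /* with HSR: visibility is resolved first, so only the visible
         wall ever costs texture reads and framebuffer writes */
      long work_hsr = 1L * pixels;
      printf("per-pixel shading ops -- without HSR: %ld, with HSR: %ld\n",
             work_naive, work_hsr);
      return 0;
  }

In a real scene the overdraw factor varies, but every occluded pixel that never gets textured is memory bandwidth saved.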
 

Ahriman6

Member
Oct 24, 2000
78
0
0
Legion88:

This is why I laughed at the numerous attempts by 3dfx supporters including reviewers insisting that they should compare glide in UT when using 3dfx cards to D3D (or OGL) when using other cards. By playing these benchmarking games with the selection of the "right" API, these "non-biased" people are deliberately taking into account the game developers' differing skills in working with different API.

So hardware reviewers should somehow psychically know which API a game's particular development team is better with? I mean, if you owned a GTS, and a game that you love supported both D3D and OpenGL and played, let's say, 10% better in one API than the other, which API would you use? And if hardware reviewers are using games to test *actual* game performance, they should use both, or all, supported APIs, IMO. They shouldn't automatically exclude one API even if it is proprietary, because gamers will want to see the actual speed of that game on their piece of hardware.

And when you say that you "laughed at the numerous attempts by 3dfx supporters...", you really sound like a hardcore Nvidia supporter yourself. It's usually those guys who screamed bloody murder about Glide being used in a review. Hell, Sharky just recently admitted on his own forums that he won't test with Glide because he doesn't want an avalanche of hate mail to fill his inbox. WTF?? Why would anyone care if a game that was chosen to be used as a benchmark is tested with all its supported APIs, especially if one API gives better performance for whatever reason? Gamers want to know how games will perform on their cards, and while people such as yourself are against Glide, ostensibly under the banner of review fairness, such practices are really unfair to a certain segment of gamers who own 3dfx cards. I fully agree that testing with *only* Glide on a 3dfx card would be unfair; a thorough reviewer should test with all APIs to show the varying performance levels, and Glide isn't always a guarantee of better performance (though a few years ago it generally was, because D3D was so dog-slow back then).

So let's break this down and I'll skip your commentary about developer skills with particular APIs because that's utterly ludicrous sophistry (and really makes you look quite biased). Are you for more thorough, fair testing or just simply against Glide being used at all, because it's "unfair"? Me? I'm all for full disclosure of all possible available information, so that gamers can be better equipped to make better choices. I don't need reviewers to tell me what's best for my personal gaming needs, I need them to give me more information so I can make a more educated purchasing decision.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
It seems to me that 3dfx forced Nvidia's hand with FSAA, so Nvidia released drivers supporting it.
Now that 3dfx have themselves released (beta) drivers supporting HSR, will this force Nvidia to release their own drivers supporting this feature? And can they do it? Or is this not supported by the GeForce chip, and/or protected by patents, etc.?
 

legion88

Member
Nov 27, 1999
34
0
0

Ahriman6 wrote:


<< So hardware reviewers should somehow psychically know what API a game's particular development team is better with? >>

It is irrelevant if they knew or they used a crystal ball. The point, which obviously you missed, is that adding a new 'element' into the benchmarks invalidates the comparison.



<< I mean, if you owned a GTS and a game that you love supported both D3D and OpenGL and plays, let's say, 10% better in one API over the other, as an owner of the GTS and that game which API would you use? >>

I've seen this 'excuse' used many times before, and it never ceases to amaze me that you people continue to spread this myth as if it is important. It isn't important. But I recognize that if enough people make enough noise, then it becomes "important" not because it is the right thing but because too many people want it that way.

So let me go over the points, which I'm sure you have read many times before. These people are reviewing a graphics card. It is a hardware review, not a software review. Since it is impossible for a hardware reviewer to benchmark the video cards using every single piece of software available, the reviewer has to categorize the available software. What do I mean by 'categorize'? Look the word up. I will explain it by example.

The reviewer would group the available software into categories: D3D and OGL. Gee, do these groups look familiar?

In the D3D group, the reviewer will use D3D as the API of choice and benchmark all the cards using the same API. There will be no "comparisons" using Glide, since this group is for D3D. Notice I did not mention the game to be benchmarked here. The application is irrelevant; it is the API that matters.

In the OGL group, the reviewer will use OGL as the API of choice and benchmark all the cards using the same API. There will be no "comparisons" using Glide, since this group is for OGL. Notice I did not mention the game to be benchmarked here. The application is irrelevant; it is the API that matters.

It never ceases to amaze me how many attempts you people make to invalidate these comparisons. Sticking a Glide score into a group of D3D scores is nonsense. Always was nonsense and always will be.

Now, I know, you will argue that all APIs should be used. Funny. Glide is available only on 3dfx cards. I would hope that someday you people would understand that, but every day it seems less likely. I will repeat it one more time for effect: Glide is available only on 3dfx cards. If Glide is a factor in a gamer's choice of video card, then a 3dfx card is the only option the gamer has. There is no logical reason to include Glide in the comparison if the gamers, the readers of the reviews, have no interest in using Glide.

The EXCEPTION to the rule is when the reviewer is comparing the performance of various 3dfx cards, so the reader can see which 3dfx card provides the fastest performance in Glide. Obviously, this would only be useful to those considering purchasing a card capable of using Glide. And to make it clear for you, this comparison would only include Glide scores. Since most reviewers do not even bother to use the Voodoo3 or Voodoo4 in their benchmarks, they have no reason to include a Glide comparison.



<< ...you really sound like a hardcore Nvidia supporter yourself. >>

And it is a typical response from people who spend so much effort defending their favorite game, favorite API, or favorite video card/company. Let me educate you in the error of your ways. Read the editorial that I wrote back in February of this year, Four Flaws of 3DMark2000. In that editorial, I criticized Derek Perez's comments regarding 3DMark2000. If you do not know who Derek Perez is, then let me inform you that Mr. Perez is the head of public relations for NVIDIA.

Unlike so many people, I have no feelings of love and joy for a video card. Therefore, I have no interest in making sure that a video card from a particular company comes out on top in the benchmarks. Hell, I do not even get paid for any of the editorials I've written. No money from graphics companies, no money from banner advertisements. Additionally, I have no interest in getting a job at 3dfx, NVIDIA, ATi, Matrox, or any one of those companies. In contrast, we have reviewers actually kissing up to these companies and later being hired by those same companies. My only interest is that accuracy and truth prevail, not propaganda.

This comment from you is quite telling:

<< ...if one API gives better performance for whatever reason? >>

You apparently do not care if the comparisons are valid. Yet you are the one calling me "biased". Why don't we all compare the 16-bit performance of the GeForce2 GTS to the 32-bit performance of the Voodoo5 5500? After all, we are looking at performance here. Who cares why the numbers ended up that way? Just blindly compare the numbers like zombies and proclaim a winner.
 

Ahriman6

Member
Oct 24, 2000
78
0
0
Legion88 (Roscoe Sincero, learned this after reading the 3DGPU article) wrote:

"It is irrelevant if they knew or they used a crystal ball. The point, which obviously you missed, is that adding a new 'element' into the benchmarks invalidates the comparison."

I don't agree with this. You seem quite concerned with levelling the playing field as much as possible when a reviewer compares two cards. Well, isn't it the job of the reviewer when doing a 3D card comparison to show the strengths and weaknesses, advantages and disadvantages of each card throughout the comparison? Seems quite logical, and fair, to me.

Look, there should be a solid baseline of testing that reviewers use. Run the software (games) with the same settings and make the comparisons. But then start tweaking each card based on any particular advantages it might have and show how that affects the game being used as a benchmark! For instance, did the Savage 2K cards run UT faster with the compressed textures on that 2nd CD than with the regular, uncompressed textures? Since, say, a GF card wasn't able to use the compressed textures until recently, a comparison article written earlier this year should've, IMO, tested both cards with the uncompressed textures and then gone on to show whether or not the Savage 2K's possible advantage of using those compressed textures made a performance and/or image quality difference, for better or for worse. That's what I mean when I talk about giving the readers more information by which they can make a better purchasing decision. The same goes for a 3dfx card. When it is compared to a competitor's card with a particular game that supports Glide (let's stick with UT), a fair reviewer would go down through the settings using D3D, and then throw in some Glide benchmarks to see if that bonus that only 3dfx cards have might make a difference, so that readers can better choose. After all, if a game is popular enough to be widely used as a benchmarking tool, gamers want to know which card under which conditions best runs that particular game. By using these features present on different cards, the reviewer might also educate existing card owners on how to better tweak their boards for better performance in the game.

"It never ceases to amaze me how many attempts you people make to invalidate these comparisons. Sticking a Glide score into a group of D3D scores is nonsense. Always was nonsense and always will be."

The real bottom line is which card plays the game best in real-world, readily-available conditions. We're talking actual, not hypothetical, performance, and that's the difference in our arguments. Hell, by not using an advantage such as Glide, S3TC texture compression, or hardware T&L, you're essentially reducing a comparison of these different cards (that possess different features) to a flat baseline. You've essentially stripped out certain key differences. What's the point of comparing different products if you're going to remove the things that differentiate them from each other?

"And it is a typical response from people who spend so much effort defending their favorite game, favorite API, or favorite video card/company. Let me educate you in the error of your ways. Read the editorial that I wrote back in February of this year, Four Flaws of 3DMark2000."

Hmm, just read that editorial. I agree with some of the things you wrote, just as I agree with some of the things you're writing in this thread. Interesting that you go after an unreleased 3dfx product (the V5 6000) while writing about a PR statement from Nvidia. Hmmmm... didn't exactly disprove a bias against 3dfx, IMO.

"Unlike so many people, I have no feelings of love and joy for a video card. Therefore, I have no interest in making sure that a video card from a particular company comes out on top in the benchmarks. Hell, I do not even get paid for any of the editorials I've written. No money from graphics companies, no money from banner advertisements. Additionally, I have no interest in getting a job at 3dfx, NVIDIA, ATi, Matrox, or any one of those companies. In contrast, we have reviewers actually kissing up to these companies and later being hired by those same companies. My only interest is that accuracy and truth prevail, not propaganda."

Same here. I just want fair testing; I want to see reviewers showing me how these different cards play these different games. I don't have the money or time to buy them all myself, which is why I rely on hardware reviewers to show me, and if they're not going to give me all the information then I cannot make an informed decision. "In contrast, we have reviewers actually kissing up to these companies and later being hired by those same companies." Who's been hired by Nvidia, ATI, or 3dfx that used to review 3D cards? I'm sure 3dfx wants any games that still support Glide tested with that API, and I'm sure Nvidia and/or ATI are against it. My only point is that I want and need as much information as possible. By way of example, let me say that while I've never owned an S3 card in my life, I would think it only fair that if a reviewer chooses to test a game that actually did support S3TC (even before MS licensed it into DXTC), and then compares that S3 card against a competitor's card by using that particular game to measure relevant performance, said reviewer should damn well have used S3TC sometime in that comparison.

"You apparently do not care if the comparisons are valid. Yet you are the one calling me "biased". Why don't we all compare the 16-bit performance of the GeForce2 GTS to the 32-bit performance of the Voodoo5 5500? After all, we are looking at performance here. Who cares why the numbers ended up that way? Just blindly compare the numbers like zombies and proclaim a winner."

More ludicrous sophistry from you. I never suggested such a thing, and I very much care that the comparisons are valid. Apparently, your idea of 'comparing' two products is to remove anything that might differentiate one from the other (though it's curious to notice that you're not supporting disabling hardware T&L on a GTS board when it's compared to a V5). I'm all for fair and impartial testing. I clearly wrote that ALL APIs that a tested game supports should be used throughout the benchmarking tests... tests that of course are using the same hardware and settings. I think the difference here is that while I'm advocating for as much information as possible, which would lead us to 3D card comparisons that show benchmark score tables for each of the different APIs a test game supports so that the reader may browse through those for himself, you're apparently advocating never allowing a feature that any card may uniquely support, even when compared to other cards. Preposterous, horribly broken logic. Quake 3 supports hardware T&L, but you'd never even think of a reviewer disabling this feature or advantage to make a comparison with a V5 or Matrox "valid". And I would never suggest such a thing, because readers need to see the impact this advantage has on scores in games that support it. Again, give the readers more information.

I'm sorry for how this sounds, but you have a horrible reputation for attacking 3dfx all the time. And I do mean ALL THE TIME. I remember a friend pointing out some old article you wrote attacking the texturing on a V2 or V3 (can't remember which). So excuse me for questioning why you "laugh at 3dfx supporters" or feel inclined to criticize a never-to-be-released 3dfx product's (the V5 6000's) hypothetical CPU-dependency in an editorial that was supposed to have been talking about Nvidia PR tactics. (And speaking of hypothetical: you're against actual real-world performance being shown in favor of hypothetical situations, yet you won't hesitate to make hypothetical comments about a 3dfx product months and months before much is known about it. In fact, you wrote that 3DGPU article months before even the 5500s were out, so no one even knew then just how CPU-dependent that line of cards was going to be. Hmmmm, that article really, really fails to disprove you being biased; quite the opposite, in fact.) WTF kind of logic is that? But I really don't care about this... the issue is what constitutes fair benchmarking, especially when comparing different video cards. Our own words will show who's really biased, now won't they? :D
 

han888

Golden Member
Apr 7, 2000
1,586
0
0
People who don't know the Voodoo5 5500 will say the GeForce2 is the best. I was like that myself! Before I owned this Voodoo5, I used an Asus V7700 in my system, and it's true the GeForce2 is a damn fast card, and I always said the GeForce2 was better than the Voodoo5. However, after I swapped my V7700 for the Voodoo5, my conclusion is that the Voodoo5 is better. Who needs 122 fps (my Asus V7700's score on a P3-800@1000) to play Quake3 at 1024 x 768 x 16-bit with bad graphics? I'd rather have 100 fps with good graphics!

I just want GeForce2 owners to know: this Voodoo5 rocks!!! Before I owned the Voodoo5 I owned an Asus V6600 and an Asus V7700, and I still have an Asus V7100 in my second system :) So I'm not judging the Voodoo5 as good just because I own it; I saw the difference with my own eyes!!!