Futuremark & Nvidia joint statement on 3DMark03; Futuremark tucks its tail between its legs.


Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,099
16,014
136
Originally posted by: Gstanfor
Sideswipe001 said:
Gstanfor (besides making a lot of technical arguments about the superiority of nVidia's designs) claims this was a set-up: that ATi and FM set up nVidia with the newest version of 3DMark to make them look bad. That they are "too smart" to do something like that.

I don't see it.

You obviously have not followed all of my posts. Do you remember back when the ATi Quack scandal broke? Guess who was responsible for figuring out what was going on in ATi's drivers back then. I'll give you a hint: it wasn't anybody who actually reported on the story - it was nVidia. Go have a read through Tom's editorials.

Now we have allegations - from ATi of all people - that changing the executable name of 3DMark03 causes nVidia driver cheats to become apparent. Sorry guys, but that was ATi's little fiasco and nVidia was the whistle-blower. Do you honestly think they would leave themselves open to the exact same allegation?

Nvidia cheated, period! I am sick of reading your crap supporting their lying. When you rename a program and it changes the benchmark score, then obviously they coded just to get a better score, and that is cheating. I don't normally get involved in these flame threads, but this is ridiculous, and I am sick of seeing it defended. Shut the h$ll up!!

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
No matey, I won't shut the hell up. Certainly not for you.

Where is your proof that nVidia did cheat by the way?
 

Chobits

Senior member
May 12, 2003
230
0
0
Gstanfor... you are kidding me. Go back and read the other posts, then!

Even I am amazed by your bias. I've owned 2 S3 cards (ViRGE and Savage4) and 2 Nvidia cards (GeForce2 GTS, GeForce4 MX420), and my brother has a 9000 Pro. I went S3, S3, NV, ATI (bro's PC), NV. I'm currently using the GeForce4 MX, it's absolutely beautiful to me, and I don't plan on replacing it for a long time (I mainly use it for TV-out). I also have an nForce2, and I just built my friend an nForce2 system with a GF4 Ti4200 (there weren't any real competitors from ATI in the ~$100-120 range), so if anything I guess I'm biased toward Nvidia, though I see myself as a consumer buying whatever I think is a good deal. I decide this via benchmarks of all sorts, and I do take 3DMark into account [but I don't make it my #1 choice; I look at at least 8-ish benchmarks and as many reviews as possible].

If a company goes and tries to alter the benchmarks in a way that creates a pseudo increase in performance in order to deceive me, I think that is wrong. That is why during the Quack fiasco I didn't buy anything ATI (luckily, though, I realized before I plunked down cash on a GeForce3 that I really didn't need it). ATI since that debacle has improved tremendously, probably the most "improved" tech company of the past few years.

Now Nvidia, because it cannot stay ahead of the Radeon by a large margin (actually the margins are really small IN MOST situations, which is why these days I tend to read more reviews that talk about image quality), has to cheat. They get caught, and instead of saying sorry and moving on, they say it is NOT their fault and try to blame the competitor! And on top of that, they try to cheat again immediately! Good god!

How do we know the drivers don't also target Serious Sam 2, Doom 3, Comanche 4, and other games that we look at? That is the serious issue. If we let them get away with a synthetic benchmark cheat (which many consumers put their trust in), what prevents them from cheating elsewhere?

In the end the consumer is at a loss, and you are trying to justify that.

It is like the media when it comes to government... rather than argue the issues, Gstanfor, you argue the party politics of it.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
If you think nVidia (or any other company, for that matter) is cheating in a particular game, then you should rename the .exe and record a new time-demo, significantly different from what is commonly in use for that game.
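As a sketch of that rename test, assuming a hypothetical benchmark that can be launched from the command line and prints a line like "Score: 1234" (the executable names, the score-parsing pattern, and the 10% threshold here are all placeholders, not any real benchmark's interface):

[code]
import re
import shutil
import subprocess

def run_benchmark(exe_path):
    """Launch the benchmark and parse the score it prints to stdout."""
    out = subprocess.run([exe_path], capture_output=True, text=True).stdout
    match = re.search(r"Score:\s*(\d+)", out)
    return int(match.group(1)) if match else None

# Run once under the well-known name, once under an arbitrary name.
# A large score drop on the renamed copy suggests the driver is keying
# its optimizations to the executable's name rather than to its workload.
shutil.copy("3DMark03.exe", "renamed_test.exe")
original = run_benchmark("3DMark03.exe")
renamed = run_benchmark("renamed_test.exe")
print(f"original: {original}, renamed: {renamed}")
if original and renamed and renamed < original * 0.9:
    print("Score dropped more than 10% after rename: app-detection likely.")
[/code]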

Nobody has presented one single shred of solid evidence that nVidia is cheating anywhere but in 3DMark03. They announced their intention to "optimize" that benchmark very loudly a while back and have comprehensively followed through on that intention.

Personally, if this whole saga forces review sites to abandon time-demos forever as a benchmarking tool and instead forces reviewers to actually play games on the card and give a subjective opinion based on that actual experience, it will be a very good thing for the industry indeed.
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
Don't bother arguing with this guy; it's obvious he can't understand anything except for "blah blah Nvidia good! Nvidia cheat, who cares!? It's OK for Nvidia to cheat! Everyone else bad, even if they have products superior to Nvidia's!"
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Something to ponder... if a company is willing to compromise one benchmark to "show" everyone that it's the better performer, why would you think they wouldn't do it to every tool they can (including game benchmarks)?

The same can be said about ATI... they cheated with Quack.exe, AND they admitted to cheating in 3DMark03 (although with a lesser performance gain). I see it this way: ATI cheated on a GAME, NV didn't. Which is the lesser evil? Whether it was 3 years ago or yesterday, they still did it.

And no one has STILL answered my question... if this is a driver issue and not hardware, why is NO ONE getting these optimizations with other NV cards? I see NO degradation with my 44.03 drivers on my GF3 Ti200. Performance is up a little bit in my games (it's normal for NV drivers to gain a LITTLE bit), and I won't run 3DMark AT ALL, because I don't find it viable or real-world. You can't play it, so why bother?

I am not gonna blame either company for what has transpired; I blame Futuremark, for putting out such a buggy program that either or both companies can optimize it beyond its intention. Who do you blame for putting SSE extensions in Photoshop and not 3DNow! extensions? Do you blame Intel or AMD? Nope, you blame Adobe. This is my opinion though, take it with a grain of salt; in the end, you're gonna buy what you're gonna buy. And right now NV has my money, unless they are caught cheating in games. I can't buy from either company right now anyway; I have to pay off my OFNA 9.5 R/C car ;)
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: Markfw900
Nvidia cheated, period! I am sick of reading your crap supporting their lying. When you rename a program and it changes the benchmark score, then obviously they coded just to get a better score, and that is cheating. I don't normally get involved in these flame threads, but this is ridiculous, and I am sick of seeing it defended. Shut the h$ll up!!
Wow, when did you become the communist dictator of the AnandTech forums? Holy crap... Someone had better let Anand know that his forums have been usurped by Markfw900 ASAP!!
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Please head over here:
http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=12996

Download the images on offer and comment, please.

(This is a test using NOLF.exe renamed to 3DMark03.exe, for the lazy; NOLF is a DirectX title.)
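For anyone who wants to compare such screenshots mechanically rather than just by eye, a rough sketch using the Pillow imaging library; the filenames are placeholders for whatever pair you download, and JPEG compression noise means only substantial differences are meaningful:

[code]
from PIL import Image, ImageChops

# Placeholder filenames; substitute the actual screenshots under test.
a = Image.open("normal_name.jpg").convert("RGB")
b = Image.open("renamed_exe.jpg").convert("RGB")

diff = ImageChops.difference(a, b)
bbox = diff.getbbox()  # None if the images are pixel-identical
if bbox is None:
    print("Images are identical.")
else:
    # Report how much of the frame differs, ignoring tiny deltas
    # that are likely just JPEG compression noise.
    hist = diff.convert("L").histogram()
    changed = sum(hist[8:])
    total = a.width * a.height
    print(f"Differing region: {bbox}, ~{100 * changed / total:.1f}% of pixels")
    diff.save("difference.png")  # bright areas mark rendering changes
[/code]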

Could someone please provide hosting for 3 high-quality JPEG images at approx. 370 KB apiece? (The posted images are low quality because of forum space restrictions.)

Oh, and Compddd and all the other detractors: are you just going to run your mouths, or are you going to back your claims up like me?
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
The reason you see the Rage3D gang-ups is simple: they detect Nvidia trying to undermine the consumer, and some webmasters turning a blind eye to it. I frequent nVnews and a lot of us over there have noticed what is going on ourselves. We all want a level playing field where the best product wins and earns our hard-earned cash. Nvidia lowering IQ in an effort to beat out ATI's best offerings is not doing their loyal customers any justice, and it's not "winning".

I happen to have a GF3 Ti200 and a Radeon 256 VIVO. Both cards still serve their purpose, but the Radeon is not in my primary gaming machine right now as it's slower. I'm not a huge gamer, but I get my time in for a few hours a week. I plan on upgrading both cards when I get some bills paid off, but I can guarantee you, with the stuff that I am seeing and reading online right now, Nvidia will not be earning my hard-earned cash.

The only reason Nvidia is discrediting 3DMark is because its limitations get shown when running it. When pure DX9 shaders get thrown into the mix, the FX series tanks! It was a bad design from the beginning, and even though Nvidia admits now that the 5800 was a failure, throwing on mag tires and pinstripes will not make the 5900 a success, as it's still a bad design underneath. I hope Nvidia will make the NV40 a totally different design, less dependent on developers following their proprietary extensions. If they followed the proper DX and OGL specs they would not be sitting in the pile of sh** they are now.

 

dpm

Golden Member
Apr 24, 2002
1,513
0
0
Originally posted by: Gstanfor
Nobody has presented one single shred of solid evidence that nVidia is cheating anywhere but in 3DMark03. They announced their intention to "optimize" that benchmark very loudly a while back and have comprehensively followed through on that intention. Personally, if this whole saga forces review sites to abandon time-demos forever as a benchmarking tool and instead forces reviewers to actually play games on the card and give a subjective opinion based on that actual experience, it will be a very good thing for the industry indeed.

In other news, East Germany has announced that the Olympics are a "synthetic" benchmark, not representative of real human activity, and has therefore announced that it will "optimize" its athletes for it...

Do you really think that reviewers using "subjective opinion" to measure new graphics cards is a good thing? How on earth is that supposed to help anyone decide which new graphics card to buy? How can you measure the new Nvidia FX against the new Radeon when AnandTech says the Nvidia "looks pretty" and Beyond3D says the Radeon has "nice colours"?

Actually, what am I saying? If I read your post more carefully, you are right. Such a move would be "a very good thing for the industry": they would no longer have to concentrate so much on being better than the competition. It's just that it would not be such a good thing for the consumer, namely me.
NB: seeing as we all seem to have to "prove" we don't have any anti-Nvidia bias here, I've run 1 Cirrus, 2 ATI, and 3 Nvidia graphics cards. Currently using a Ti4400.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: dpm

In other news, East Germany has announced that the Olympics are a "synthetic" benchmark, not representative of real human activity, and has therefore announced that it will "optimize" its athletes for it...

Now that's funny.

I decided my physics exam was biased and not indicative of real-world knowledge, so I decided to "optimize" for it (cheat sheet). Thank you, Nvidia, for showing us the way.

I am also considering hacking into my bank and "optimizing" my bank account, because I think I should have more money than I actually do.

 
digitalwanderer

Nov 23, 2002
26
0
0
Hi bstanford, it's me; I'm back. :)

I wish I had checked Rage3D this morning before nVnews; then I would have seen the thread entitled "Never seen someone this oblivious" and spared meself the trouble of arguing with you over at nVnews.


At least ya didn't post up a couple of megs of pictures here to prove absolutely nothing; that much was decent of ya... but that's about all the props I can give you.

On the downside...

You just ain't getting it... not at all. nVidia cheated, nVidia lied, and nVidia is shooting themselves in the foot at every opportunity, because their product is currently inferior to ATi's and they can't handle second place.

Their new line-up wouldn't really be so bad if'n they just sold it on its merits, but all the BS makes anything good about 'em questionable right now... if you could buy one at all!

Please, stop. If you've got a real point, make one, but if you're just trolling for flames you're going to find that you might just get burned.

I didn't bring you up at Rage3D, but the person who posted and said that Rage3D people seem to spring out of the woodwork to dispel the lies of nVidia ain't all that far from the truth. Rage3D has put up with a LOT of sh!t thru the years from nVidiots (hell, I was one of the people giving 'em grief for a while! ;) ), and has done an AWFUL lot to help ATi enthusiasts thru the bad ATi years to get their stuff working right. (I did a lot of that too; I'm weird that way.)

Now you're surprised that we're all committed to defending a company that has finally stepped up and is doing everything we've been asking them to do for YEARS?

Sorry, no. ATi has been busting their balls to help out the ATi community over the last year or so, and defending their product/drivers/support is our way of saying "Thank you!" We tell no lies, we don't distort; we just try to bring the truth to light and to people's attention.

Heck, we don't even organize for this! Someone just has to put up one post saying where and what and a couple of us just naturally swarm in. :)

Oh, and I don't think there is anywhere that's safe, either. One great thing about all the horrible years of ATi is that Rage3D's community has grown so tremendously big & faithful that I dare say it is one of the most active single-company hardware forums out there. :)

So go right ahead and say whatever ya want bs, we'll be there. ;)

 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
digitalwanderer, thanks for the link. That is TRULY a funny read:

So, this is the name the Iraqi Information Minister uses on the Internet.....

BUAHAHAHA:p
 

Seabook

Junior Member
Jun 7, 2003
2
0
0
Originally posted by: Gstanfor
Ever since Microsoft changed the DX9 spec at the last minute, nVidia has had a huge performance problem on their hands with NV3x.

I saw this, so I decided to make a forum account just to reply to this statement. Let's not forget how this all happened: it was a press release by Nvidia saying that they would be using FP32, and so Microsoft set the DX9 minimum requirement to FP24, and now they are having trouble with it. Since the DX9 standard is FP24, and FP32 on Nvidia is VERY SLOW, they cannot fully use FP16 either, or they get no WHQL certification.
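As a rough illustration of why that precision gap matters, a sketch using NumPy's float16 and float32 types, which only approximate the GPU formats being argued about (FP24 has no common CPU equivalent):

[code]
import numpy as np

# Accumulate a small increment many times, the way a long pixel-shader
# instruction sequence accumulates rounding error.
step = 0.001
fp16 = np.float16(0.0)
fp32 = np.float32(0.0)
for _ in range(10000):
    fp16 = np.float16(fp16 + np.float16(step))
    fp32 = np.float32(fp32 + np.float32(step))

print(f"fp16 total: {float(fp16):.4f}")  # stalls around 4.0
print(f"fp32 total: {float(fp32):.4f}")  # stays close to the exact 10.0
# On screen, this kind of low-precision error shows up as banding or
# blocky artifacts in long shader computations.
[/code]

Once the running total reaches 4.0, the 0.001 increment is smaller than half of float16's spacing between adjacent representable values there, so every further addition rounds away to nothing.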


(Don't forget John Carmack stated that FP16 would be sufficient for Doom 3 - FP32 is nice, but not necessary.)

Also don't forget that ARB2 is the highest-quality path you can play Doom 3 in, and the R350/R300 is 2 times faster when it comes to that.

Who is to say Futuremark didn't lay a trap for nVidia to fall into?

Who is to say Nvidia didn't lay a trap for us to fall into by releasing their cheats (smoke and mirrors) so they can hide their problems?
 
digitalwanderer

Nov 23, 2002
26
0
0
See how much more fun it is when you bounce around all the forums rather than just blindly trusting one of them? Me likes the whole cross-referencing thing, to sift the wheat from the chaff... so to speak. ;)
 

rachaelsdad

Member
Aug 26, 2001
130
0
0
Originally posted by: Gstanfor

They can't make a good case against what I say, so they resort to attacking me.

Do you honestly believe that the likes of Microsoft (which has never had an original thought in its entire existence) and ATi can design a better API/language than universities such as Stanford can? Cg isn't just nVidia's invention, you know...

The biggest problem NVidia has is that their hardware is not up to the task. So, just like 3dfx, they will try to push Cg (as in Glide) instead of using what is already on the table. It is much easier for coders and developers to use one set of rules, and it means fewer driver conflicts for users. NVidia of course has to push its own API since it has not been able to create the hardware necessary to run effectively under GL or DX9.

If the hardware is not up to the task, then of course NVidia very sensibly takes control of the user's card and turns off such things as AA when the card slows down. It is not NVidia's hardware that cannot effectively utilize the bandwidth of their memory bus; it is the fault of the user for wanting to run his card with AA on. When a user asks the card to use FP32, NVidia, having built the card, knows of course that FP16 is more than adequate and wisely chooses this option, or even 12-bit precision if the driver recognizes this will be enough should texturing become a problem. Clip planes? Who cares, right?

Perhaps NVidia should just stick to making Xbox chips; at least in that setting all the drivers could be matched to make the fullest use of their hardware's limitations.

I believe that most people who use their PCs to play games would rather have more control over their graphics card than is allowed on the Xbox. ATI allows users their own personal choices; NVidia doesn't.



 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: rachaelsdad
Originally posted by: Gstanfor

They can't make a good case against what I say, so they resort to attacking me.

Do you honestly believe that the likes of Microsoft (which has never had an original thought in its entire existence) and ATi can design a better API/language than universities such as Stanford can? Cg isn't just nVidia's invention, you know...

The biggest problem NVidia has is that their hardware is not up to the task. So, just like 3dfx, they will try to push Cg (as in Glide) instead of using what is already on the table. It is much easier for coders and developers to use one set of rules, and it means fewer driver conflicts for users. NVidia of course has to push its own API since it has not been able to create the hardware necessary to run effectively under GL or DX9.

If the hardware is not up to the task, then of course NVidia very sensibly takes control of the user's card and turns off such things as AA when the card slows down. It is not NVidia's hardware that cannot effectively utilize the bandwidth of their memory bus; it is the fault of the user for wanting to run his card with AA on. When a user asks the card to use FP32, NVidia, having built the card, knows of course that FP16 is more than adequate and wisely chooses this option, or even 12-bit precision if the driver recognizes this will be enough should texturing become a problem. Clip planes? Who cares, right?

Perhaps NVidia should just stick to making Xbox chips; at least in that setting all the drivers could be matched to make the fullest use of their hardware's limitations.

I believe that most people who use their PCs to play games would rather have more control over their graphics card than is allowed on the Xbox. ATI allows users their own personal choices; NVidia doesn't.

nm, fishtankx explained it :D
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
If I remember correctly, wasn't Cg just a compiler for DirectX and OGL?

That doesn't make it an API at all.

Correct me if I'm wrong.
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Well, from what I can gather, Cg will just make it easier to program shaders and crap for the Nvidia platform. Just like coding in C is easier than coding in assembler. I don't see anything inherently wrong with it.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: FishTankX
Well, from what I can gather, Cg will just make it easier to program shaders and crap for the Nvidia platform. Just like coding in C is easier than coding in assembler. I don't see anything inherently wrong with it.

I'm not saying there is, but why would a developer bother to use both Cg and RenderMonkey to code for both platforms instead of just using generic calls?
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
It's simple: RenderMonkey and Cg both give you a simpler set of instructions to work with. Five lines of Cg code can do the equivalent of a hundred lines of regular shader code in some situations.

That being said, Cg (or was it RenderMonkey?) can also automatically create optimized paths for ATi and Nvidia cards.
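The "few lines expand to many" point is the same one any compiler makes. As a loose analogy, not actual Cg output, Python's own dis module shows the lower-level instruction stream a single high-level line turns into (the blend function and the lerp comparison here are illustrative, not from any SDK):

[code]
import dis

def blend(base, light, factor):
    # One high-level line, similar in spirit to a one-line Cg
    # expression like: return lerp(base, light, factor);
    return base + (light - base) * factor

# dis prints the sequence of low-level instructions (loads, arithmetic
# ops, a return) that the single source line compiles into, none of
# which the programmer had to write by hand.
dis.dis(blend)
[/code]

Whether the compiler then emits an ATi-tuned or an Nvidia-tuned instruction stream from that same source is exactly the multi-path idea mentioned above.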
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: FishTankX
It's simple: RenderMonkey and Cg both give you a simpler set of instructions to work with. Five lines of Cg code can do the equivalent of a hundred lines of regular shader code in some situations.

That being said, Cg (or was it RenderMonkey?) can also automatically create optimized paths for ATi and Nvidia cards.

Interesting.