
FutureMark & Nvidia joint statement on 3DMark03; FutureMark tucks its tail between its legs.


Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Go here:
http://briefcase.yahoo.com/gregrstanford
(you will require a yahoo account if you don't already have one).

Go to the Anisotropic folder and download "Container.zip"

It will have two .ace files and two .jpg files in it.

The two JPGs show my driver settings, used for the test.

Aniso Tester.ace is Xmas's anisotropic filtering test app v1.2, as used by this website.

Aniso1.ace contains two TIFF images.

The first of these is the aniso tester run normally with the settings indicated in the picture.

The second is the aniso tester renamed to 3DMark03.exe (which is the exact filename 3DMark 2003 installs as on my system).

Note I only have a GF3 Ti200 currently.

I'd like to see results from GF-FX owners please.

Remember, the allegation is that nVidia's drivers detect the 3DMark executable name and then modify their anisotropic filtering settings based on the name detected. If this is in fact true, this test will pick it up.
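
For anyone who wants to automate the comparison, here is a rough Python sketch of the procedure. The command-line flags and output filenames are made up for illustration (the real tester is driven from its own window, so in practice you save and compare the screenshots by hand):

# Sketch of the rename test: run the same binary under its own name and under
# the benchmark's filename, then diff the two screenshots. If the driver keys
# its anisotropic settings off the executable name, the images will differ.
import shutil
import subprocess
from PIL import Image, ImageChops   # Pillow, used here for the image diff

shutil.copy("AnisoTester.exe", "3DMark03.exe")   # same program, benchmark's filename

subprocess.run(["AnisoTester.exe", "-screenshot", "normal.png"], check=True)
subprocess.run(["3DMark03.exe", "-screenshot", "renamed.png"], check=True)

diff = ImageChops.difference(Image.open("normal.png"), Image.open("renamed.png"))
print("no difference" if diff.getbbox() is None else "filtering changed with the exe name")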
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Megatomic
Originally posted by: NFS4
Looks at Gstanfor's sig...

**Off to bed for me**
Looks at NFS4's rig...

Yep, the pot calling the kettle black.
:rolleyes:

What does my sig have to do with NVIDIA and FutureMark?
:rolleyes:
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Megatomic
Originally posted by: NFS4
Looks at Gstanfor's sig...

**Off to bed for me**
You implied that Gstanfor was biased based on the contents of his sig, while you have an ATI product in the rig linked in your own sig, implying your bias. That's what I was getting at.
(1)
You looked at my comment and simply assumed that I was attacking Gstanfor when I said nothing to him actually. It's just that his sig made me laugh [see #3]

(2)
Just b/c I have an ATI card doesn't make me biased. I could look at your rig and say that since you have a 4800 card, YOU are biased towards NVIDIA. I even went so far as to say that 90% of my graphics cards have been NVIDIA. Does the fact that my notebook has an S3-based graphics chip make me S3 biased? And the system sitting right beside me has an AMD Athlon processor in it. That must make me AMD biased...but wait, that can't be b/c I have a Pentium 4 in my main rig. WHERE DOES IT END!!!!???? Come on, you've gotta try harder than that ;)

(3)
The reason I pointed to Gstanfor's sig was for this reason:
nVidia's proven to be the most reliable shop when it comes to supporting their products and making them perform at the highest level they can. So your choice is this: go for a chip proven to be under-performing and unreliable (VIA), or go for a new product from a company known for its performance, reliability and end-user support (nVidia).
I couldn't bring myself to say that with a straight face right now given the situation that NVIDIA is in. Maybe for their nForce2, but definitely not for the FX series with these driver "issues."
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
Originally posted by: NFS4
I couldn't bring myself to say that with a straight face right now given the situation that NVIDIA is in. Maybe for their nForce2, but definitely not for the FX series with these driver "issues."

i dunno about the nforce either... they don't seem to be able to write a working IDE driver...
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: NFS4
You looked at my comment and simply assumed that I was attacking Gstanfor when I said nothing to him actually. It's just that his sig made me laugh
Ok, sorry then. With all the antics lately I saw this as another of them. My bad. I can edit my post out if you'd like.
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: ElFenix
Originally posted by: NFS4
I couldn't bring myself to say that with a straight face right now given the situation that NVIDIA is in. Maybe for their nForce2, but definitely not for the FX series with these driver "issues."

i dunno about the nforce either... they don't seem to be able to write a working IDE driver...
We should see something soon in regards to the nForce UDAs. I'm confident of this.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Megatomic
Originally posted by: NFS4
You looked at my comment and simply assumed that I was attacking Gstanfor when I said nothing to him actually. It's just that his sig made me laugh
Ok, sorry then. With all the antics lately I saw this as another of them. My bad. I can edit my post out if you'd like.

Seriously though, until this whole 3DMark "fiasco," I've only had two things to bitch about concerning NVIDIA:

(1) The 5800 Ultra was late, performed meagerly, and had a Dustbuster fan on it.
(2) The fact that the 5800 Ultra and 5900 Ultra take up two slots

Other than those two issues, I had no beef with NVIDIA. In fact, if NVIDIA had a comparable card out at the time I bought my Radeon 9700 Pro, I would have gotten it (I had some prior issues with ATI products)...but they didn't. The fastest thing they had at the time was the Ti4600.

Now this whole FM thing is a whole new can of worms...
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Even with the current state of nVidia's drivers, we still enjoy better overall compatibility and stability than ATi users.

Most of the "issues" in nVidia's drivers directly relate to 3DMark2003. nVidia publicly stated it would demonstrate how easy it is to optimize 3DMark2003's performance. They have done exactly that.

No one has yet been able to demonstrate that these optimizations affect anything outside of 3DMark2003.

Lastly, it is still extremely early in the life of NV3x, which is a huge departure from nVidia's previous architecture. The drivers are still being perfected. Not having optimally performing silicon has not helped matters either.

As for the nForce2 drivers, if the SW IDE drivers cause problems, don't use them. They are an optional part of the install and don't really add much performance above and beyond the normal IDE drivers. It's not as though other chipset vendors such as VIA haven't had their own problems with custom IDE drivers in the past either. I think the problem has more to do with Windows itself than with actual driver quality.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,051
32,570
146
I'd just like to see Nvidia stop wasting effort on this whole 3DMark thing and put optimizations like the 3DMurk one into my favorite games, so I can get a performance boost without a drop in IQ I can detect with the naked eye. :light::evil:
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
The 3DMark stuff does not surprise me. Like somebody else said in another forum, I want to see how Therius Tham and Spinters Bell run on that 5900 Ultra.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
alternative link to my file for anyone having trouble (thanks Indio from Beyond3d!)
alternate link

Ever since Microsoft changed the DX9 spec at the last minute, nVidia has had a huge performance problem on their hands with NV3x.

This is because NV3x depends upon multi-precision render targets to work well. nVidia realised that, just like 32-bit color in the old days, it would take a long time for 32-bit floating point to be universally used. nVidia also realised that 60% or more of the time 32-bit FP precision is simply unnecessary - most rendering can be done at FP16 or less with no loss whatsoever (don't forget John Carmack stated that FP16 would be sufficient for Doom3 - FP32 is nice, but not necessary).

Unlike what ATi would have you believe, multi-precision rendering is a step forwards, not backwards - it allows you to have your cake and eat it too. Fancy effects when needed, and fast, efficient rendering for the rest of the scene at a data size ideally suited to what is being rendered - no wastage of resources/precision where it is unnecessary.

Why render a pixel at a higher precision than necessary if you don't have to? It makes no sense whatsoever. The extra cycles saved per pixel can be put to use on even more eyecandy than before!
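
To put a rough number on the "no loss" claim, here is a small Python sketch. It uses numpy's float16 purely as a software stand-in for FP16 (it says nothing about NV3x hardware): for plain 0-to-1 color values, the FP16 rounding error comes out far below one step of an 8-bit-per-channel display, which is the easy case being described; long chains of shader math and large intermediate values are where FP32 starts to matter.

# Quantify FP16 rounding error on ordinary 0..1 color values and compare it
# to the size of one 8-bit display step (numpy float16 as a stand-in for FP16).
import numpy as np

colors = np.linspace(0.0, 1.0, 1024, dtype=np.float32)      # "true" FP32 values
roundtrip = colors.astype(np.float16).astype(np.float32)    # squeezed through FP16

max_err = float(np.max(np.abs(colors - roundtrip)))
print(f"worst FP16 rounding error on 0..1 values: {max_err:.2e}")
print(f"one 8-bit display step:                   {1 / 255:.2e}")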

ATi are also being obstructive by refusing to support Cg (both multi-precision rendering and Cg would help ATi just as much as they do nVidia, if they stopped to think about it). Cg is what supports multi-precision rendering. It is an open standard co-developed with Stanford University. It is in everybody's best interest to support it.

ATi's brute-force FP24 approach will eventually hit a performance brick wall, and there will be no easy way around it, since FP24 is the only FP render target it supports. NV3x can make better use of its performance budget by intelligent use of precision - wasting fewer resources.

Unfortunately, for reasons known only to itself, Microsoft has decided that brute-force rendering (the ATi way) is better than nVidia's intelligent approach. As I said way back in the thread, I believe this was brought on by MS losing the Xbox arbitration.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Gstanfor
alternative link to my file for anyone having trouble (thanks Indio from Beyond3d!)
alternate link

Ever since Microsoft changed the DX9 spec at the last minute, nVidia has had a huge performance problem on their hands with NV3x.

This is because NV3x depends upon multi-precision render targets to work well. nVidia realised that, just like 32-bit color in the old days, it would take a long time for 32-bit floating point to be universally used. nVidia also realised that 60% or more of the time 32-bit FP precision is simply unnecessary - most rendering can be done at FP16 or less with no loss whatsoever (don't forget John Carmack stated that FP16 would be sufficient for Doom3 - FP32 is nice, but not necessary).

Unlike what ATi would have you believe, multi-precision rendering is a step forwards, not backwards - it allows you to have your cake and eat it too. Fancy effects when needed, and fast, efficient rendering for the rest of the scene at a data size ideally suited to what is being rendered - no wastage of resources/precision where it is unnecessary.

Why render a pixel at a higher precision than necessary if you don't have to? It makes no sense whatsoever. The extra cycles saved per pixel can be put to use on even more eyecandy than before!

ATi are also being obstructive by refusing to support Cg (both multi-precision rendering and Cg would help ATi just as much as they do nVidia, if they stopped to think about it). Cg is what supports multi-precision rendering. It is an open standard co-developed with Stanford University. It is in everybody's best interest to support it.

ATi's brute-force FP24 approach will eventually hit a performance brick wall, and there will be no easy way around it, since FP24 is the only FP render target it supports. NV3x can make better use of its performance budget by intelligent use of precision - wasting fewer resources.

Unfortunately, for reasons known only to itself, Microsoft has decided that brute-force rendering (the ATi way) is better than nVidia's intelligent approach. As I said way back in the thread, I believe this was brought on by MS losing the Xbox arbitration.

No offense, but is that taken from an NVIDIA inter-office memo?;)
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Ever since microsoft changed the DX9 spec at the last minute, nVidia has had a huge performance problem on their hands with NV3x.
So since they couldn't win fairly in DX9 benchmarks they needed to cheat?

Like I said elsewhere, they should hire Sammy Sosa as their spokesman. They might have to exempt him from the employee drug testing policy though.
 

WarCon

Diamond Member
Feb 27, 2001
3,920
0
0
Gstanfor - You make a lot of good points (some simply defend Nvidia's choices and are only one side of an argument/discussion, but they are valid because they represent a viewpoint). The only problem with your arguments is that they do not pertain to the topic. The topic is: did Nvidia knowingly try to violate the public's trust by circumventing the benchmarking process, optimizing specifically for the benchmark?

Let me take the perspective of Joe Consumer (specifically, the head of purchasing/sales/marketing for a company like DELL that bases its multi-million-dollar parts contracts on a known and widely used benchmark).

Joe Consumer wants to know which card is going to make his machine perform the best and make consumers continue to buy his products, for the least amount of money. So he picks a known benchmark and supports it (primary funding for said benchmark). A large video card manufacturer seems to be losing the limelight that it once shared with no one, and has developed a huge appetite for money. Said company runs tests of the upcoming benchmark on its upcoming flagship and discovers that it doesn't perform well enough to guarantee a solo spot in the limelight. Said company pleads with the benchmarking company to rewrite the benchmark so that it reflects better on their product, rather than being written strictly to the standards set by Microsoft for DX9. The benchmarking company says it won't compromise itself because it wants to stay unbiased and fair to all manufacturers, even though it realizes the large company has major market share. The large video manufacturer withdraws from the benchmarking program and starts on a plan to discredit the benchmarking company while making itself look better than it really is. The large company writes drivers to specifically detect portions of the test, and because the test runs along a static path it is easily manipulated to give readings other than what the benchmarking software was intended to produce.

The question is..............Is this obvious manipulation and misuse of a benchmarking program a legitimate, straightforward practice that Joe Consumer is simply going to have to watch for in making his decisions, or is it just some sleazy marketing trap to break Joe Consumer's faith in benchmarks, so that the one with more advertising funds can sell based upon the typical half-truths and outright lies we find in marketing everywhere? No static benchmark will be safe from this tripe (this includes the much-vaunted game benchmarks that everyone says show true performance), and it's just possible that several have already fallen prey to this mischief. Any form of this mischief has to be suspect, including this latest find (obviously out at the same time as the other, just recently found) with Nvidia's driver specifically altering its behavior when run against a specific .exe. It doesn't matter what you think looks best, as that is opinion; the fact is something is different just for this program. If it were truly an optimization, it would be implemented for general use and not just for a single program.

Anyway....................
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
I think Gstanfor is a paid Nvidia board guy or something; no normal, self-thinking person could spew this much blatant bull**** and believe it.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Compddd
I think Gstanfor is a paid Nvidia board guy or something; no normal, self-thinking person could spew this much blatant bull**** and believe it.

Ouch man. A little harsh, but to the point.
 

First

Lifer
Jun 3, 2002
10,518
271
136
Originally posted by: NFS4
Originally posted by: Compddd
I think Gstanfor is a paid Nvidia board guy or something; no normal, self-thinking person could spew this much blatant bull**** and believe it.

Ouch man. A little harsh, but to the point.

Normally I wouldn't care to comment on an immature statement like this, but I just got a pm from a fellow forum member that I thought was pretty funny. Here it is:

Isn't it pathetic to see all the Rage3D fanboys come over to AT and bash anyone who is positive about NVIDIA or negative about ATI? When in doubt about an ATI fanboy search their username at Rage3D, for example here where Compddd reveals himself in all his unbiased glory.
:rolleyes:


LOL you can apply this to the retard NVIDIOTS as well, just go to NVnews. Though even NVnews has been partially taken over by fanATIcs.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Evan Lieb
Originally posted by: NFS4
Originally posted by: Compddd
I think Gstanfor is a paid Nvidia board guy or something; no normal, self-thinking person could spew this much blatant bull**** and believe it.

Ouch man. A little harsh, but to the point.

Normally I wouldn't care to comment on an immature statement like this, but I just got a pm from a fellow forum member that I thought was pretty funny. Here it is:

Isn't it pathetic to see all the Rage3D fanboys come over to AT and bash anyone who is positive about NVIDIA or negative about ATI? When in doubt about an ATI fanboy search their username at Rage3D, for example here where Compddd reveals himself in all his unbiased glory.
:rolleyes:


LOL you can apply this to the retard NVIDIOTS as well, just go to NVnews. Though even NVnews has been partially taken over by fanATIcs.

YGPM :D
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
This was my first pro-ATI post on AT :) but Gstanfor's posts were so blatantly "blah blah, Nvidia didn't cheat, they can't do anything wrong" that I had to comment, and at least I don't make a bunch of posts trying to make a company that blatantly cheated look good. I've owned 1 3Dfx graphics card, 2 Nvidia cards and 1 ATI card so far. After all this stuff they've pulled, I doubt I will ever buy Nvidia again.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I knew the fanATIcs would come out wielding their baseball bats and shouting taunts and insults.

They can't make a good case against what I say, so they resort to attacking me.

The ATi brute force method simply will not last guys. It looks good now for sure. So did the GF2 series when first introduced. They were brute force renderers too, and they quickly ran out of steam. ATi will suffer the same fate with R3xx.

Show me what is in the pipeline that is more efficient than multi-precision rendering.

Do you honestly believe that the likes of Microsoft (which has never had an original thought in its entire existence) and ATi can design a better API/language than universities such as Stanford can? Cg isn't just nVidia's invention, you know...

As for being an nVidia spokesperson, that I most certainly am not. In fact, I believe nVidia has done a remarkably poor job of selling NV3x's strengths and Cg's strengths so far, and what selling they did do came off as arrogant, without explaining the benefits (the "way it's meant to be played" campaign). Why this is so, I don't know; it is not normally an nVidia weakness. My advice to nVidia at the moment is: stop focusing on what ATi is doing and being so reactive towards them, and focus on your own strengths. The ATi lead will not last. Provide real, solid information about your products in well-thought-out campaigns, which we know from history you are capable of.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Outside of 3DMark, I couldn't care less what NV did or didn't do. There is a difference between 3DMark and Quack3. Quake 3 was a GAME that A LOT of people play/played; if NV starts using this tactic outside of 3DMark, then I might have something to complain about.

To me, 3DMark is nothing. The games are where it's at. Show me proof of the gaming cheats/optimizations, and then I will start complaining. And don't give me those screenshots of UT2K3 that have ATI at 90 fps and NV at 30... I can do the SAME identical thing, BOTH on my lowly GF3 Ti200. You just take a screenshot at 1600x1200 (the 30 fps one) and take another screenshot at 1024x768 (the 90 fps one), then resize that 1600x1200 screenshot to 1024x768 in, say... Paint Shop Pro. Then both screenshots "appear" to be taken at 1024x768, giving the other screenshot the "appearance" of having an FPS advantage. As of now, I still stand by NVidia...........for now.
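
A rough Python/Pillow sketch of the trick being described (filenames are placeholders): downscaling a 1600x1200 capture to 1024x768 makes the two shots look like they came from the same resolution, so the framerate gap between them is no longer an apples-to-apples comparison.

# Resize a high-resolution screenshot down so it "appears" to match a native
# 1024x768 capture, as described above (hypothetical filenames).
from PIL import Image

big = Image.open("ut2k3_1600x1200_30fps.png")     # the 30 fps, 1600x1200 capture
small = big.resize((1024, 768))                   # now looks like a 1024x768 shot
small.save("ut2k3_resized_1024x768.png")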


To Evan,

I ask and implore you to remain unbiased in every benchmark or hardware review AT does, no matter who it hurts; I like and prefer the truth. This is the only reason I come to AT: for its unbiased reporting. Other tech sites "appear" bought off, and I don't trust them. The reason I like AT is that, even though they might have an advertisement for, say, Dell, AT will still report on a Dell, whether bad or good :)

Thanks,
Shamrock
 

WarCon

Diamond Member
Feb 27, 2001
3,920
0
0
Something to ponder......If a company is willing to compromise one benchmark to "show" everyone that it's the better performer, why would you think they wouldn't do it to every tool they can (including game benchmarks)?

I personally carry no company loyalties, but am instead interested in which product is the best overall - which product will give me the most satisfaction for the dollar. With Futuremark being discredited and manipulated, and several game benchmarks showing mystery gains that seem unreasonable (probably similar "optimizations" being applied), where do I turn for advice? It seems like most people who post about video cards base their knowledge on these benchmarks and on propaganda from the manufacturers (a lot are just outright loyalists who support a product mindlessly), not on intimate experience with both of the cards in the price range I am looking at, at that time.

Anyway............. :| :disgust: :frown:
 

Sideswipe001

Golden Member
May 23, 2003
1,116
0
0
Ok, I've been reading the arguments and, I'm sorry- some of this just doesn't seem logical to me.

If I'm getting this wrong, let me know.

Gstanfor (besides making a lot of technical arguments about the superiority of nVidia's designs) claims this was a setup: that ATi and FM set up nVidia with the newest version of 3DMark to make them look bad, and that nVidia is "too smart" to do something like that.

I don't see it.

Renaming an executable should not change how that .exe file runs, whatsoever. Why would FM program 3DMark to run less efficiently on nVidia cards when it's RENAMED? If I rename 'Winword' to 'Free Money', I don't get free money by double-clicking on it; I open up Microsoft Word. It doesn't do it faster or slower, it just does it.

It seems bogus to me to claim that FM would write something like that in. I don't even know if it's POSSIBLE to have the code run differently if the name is different - I'm no programmer. Personally, I'd like to see someone try this on an older version of 3DMark. If nVidia doctored their drivers, the behavior should still show up no matter what version of 3DMark 2003 is running; if they didn't, that should become apparent that way too.
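
(For what it's worth, it is possible: on Windows a driver DLL can ask which executable loaded it - via GetModuleFileName, for example - and branch on the answer. A toy Python sketch of the general idea, purely illustrative and not a claim about how nVidia's driver is actually written:)

# Toy illustration: a program (or a library/driver loaded into it) can read the
# name of the executable it is running as and change behavior based on it.
import os
import sys

exe_name = os.path.basename(sys.argv[0])

if exe_name.lower() == "3dmark03.exe":
    print("benchmark filename detected - special-case settings could kick in here")
else:
    print(f"running as {exe_name} - default settings")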

And yes, it matters - because they would do it for one and only one reason: PR. It makes them look better to have a faster card. It doesn't matter if it's a game or not. People base buying decisions off benchmarks like this, and if they are doctoring their score, they are doing it to try to sell more cards by deceiving people.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Sideswipe001 said:
Gstanfor (besides making a lot of technical arguments about the superiority of nVidia's designs) claims this was a setup: that ATi and FM set up nVidia with the newest version of 3DMark to make them look bad, and that nVidia is "too smart" to do something like that.

I don't see it.

You obviously have not followed all of my posts. Do you remember back when the ATi Quack scandal broke? Guess who was responsible for figuring out what was going on in ATi's drivers back then. I'll give you a hint: it wasn't anybody who actually reported on the story - it was nVidia. Go have a read through Tom's editorials.

Now we have allegations - from ATi of all people - that changing the executable name of 3DMark2003 causes nVidia driver cheats to become apparent. Sorry guys, but executable-name detection was ATi's little fiasco, and nVidia was the whistle-blower. Do you honestly think they would leave themselves open to the exact same allegation?