Forceware 53.03 gives FM the finger

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
Super ;) :beer: ... a real KO blow from Nvidia... now let's see Futuremark slap back!! ;)
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
This is too much. When the hell is this 3DMark issue going to end????!!!!!!
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
What I find funny is that scores in other benchmarks don't change. Here's a way to settle this debate over which 3DMark2003 score is correct... nVidia's way, or FM's way...

Performance difference in % between the 9800 XT and FX 5950 according to AnandTech:

Aquamark3 - nVidia 1.5% behind ATI
C&C Generals - nVidia 26% behind ATI
F1 Challenge '99-'02 - nVidia 7.5% behind ATI
FFXI - nVidia 5.3% behind ATI
Halo - nVidia 10.7% behind ATI
Homeworld 2 - nVidia 22.8% ahead of ATI
Jedi Knight 3 - nVidia 15% ahead of ATI
Flight Sim 2004 - nVidia 40% behind ATI
Neverwinter Nights - nVidia 13.5% ahead of ATI
SimCity 4 - nVidia 28.6% behind ATI
Splinter Cell - nVidia = ATI
UT2k3 - nVidia 3.9% behind ATI
X2 - nVidia 4.2% ahead of ATI
Warcraft 3 - nVidia 10.4% behind ATI
Wolfenstein ET - nVidia 1.5% ahead of ATI

nVidia is about 5.5% slower than ATI on average

Let's look at 3DMark scores... I found a user running stock 9800 Pro speeds and a 3000 MHz Pentium 4... he scored 6500... so, if 3DMark is accurate, an FX 5900 Ultra with the same-speed processor should score about 5.5% lower than 6500, which would be roughly 6150... so let's see... I see a P4 3000 MHz with an FX 5900 Ultra at 450/850 scoring 5300... which is well shy of the 6150... let's look some more... here's another that's below 5300... have a look for yourself if you want... use "project search and compare" and sort based on the relevant criteria. That was with the 340 patch; let's see what kind of numbers we get from build 330... I see a few in the 6300s, and even more in the 6200s.
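For anyone who wants to double-check the arithmetic, here's a quick sketch. The percentages are just the AnandTech list above, and the expected 5900 Ultra score is nothing fancier than the 6500 baseline scaled by the average gap; it lands within a few points of the figures quoted:

```python
# Quick sanity check of the averaging above (percentages are the AnandTech
# numbers from the list; positive = nVidia ahead, negative = nVidia behind).
diffs = {
    "Aquamark3": -1.5,
    "C&C Generals": -26.0,
    "F1 Challenge '99-'02": -7.5,
    "FFXI": -5.3,
    "Halo": -10.7,
    "Homeworld 2": 22.8,
    "Jedi Knight 3": 15.0,
    "Flight Sim 2004": -40.0,
    "Neverwinter Nights": 13.5,
    "SimCity 4": -28.6,
    "Splinter Cell": 0.0,
    "UT2k3": -3.9,
    "X2": 4.2,
    "Warcraft 3": -10.4,
    "Wolfenstein ET": 1.5,
}

average = sum(diffs.values()) / len(diffs)        # unweighted mean, ~ -5.1%
baseline = 6500                                   # stock 9800 Pro + 3.0 GHz P4 score quoted above
expected = baseline * (1 + average / 100)         # what an FX 5900 Ultra "should" score

print(f"average gap: {average:.1f}%")                    # about -5.1%
print(f"expected FX 5900 Ultra score: {expected:.0f}")   # roughly 6170, vs the ~5300 seen on build 340
```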

BTW... I won't tell you which projects I looked at, so you can all go look for projects with the video cards at stock clock speeds and similar processors at similar clock speeds, and see the results for yourself.

So it looks as though Futuremark is handicapping nVidia in a way that DOES NOT reflect real-world performance. It may reflect performance in a few select games, but look at Homeworld 2... nVidia is WAY ahead of ATI there... so by that logic you could say 3DMark2003 isn't accurate because nVidia should be scoring at least 20% higher than ATI, since it does in Homeworld 2.

I don't see how even the ATI fanboys can argue against this...
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Jeff, 3DMark is a DX9 benchmark, and DX9 is where Nvidia is very weak. In DX8 they perform similarly, but DX9 is ATI's territory.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: McArra
Jeff, 3DMark is a DX9 benchmark, and DX9 is where Nvidia is very weak. In DX8 they perform similarly, but DX9 is ATI's territory.

I don't see your point... elaborate.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
I mean: Game 4, which is DX9, carries a large percentage of the marks. As we know, ATI has much stronger PS 2.0 performance, and its Game 4 score without cheats/optimizations is quite a bit higher. That one test contributes a very large share of the overall score, and that's why Nvidia, without optimizing/cheating, falls behind by a large margin.
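To see why that weighting matters, here's a toy sketch. The weights and framerates below are made up for illustration, not Futuremark's actual scoring formula; the only point is that when the lone DX9 test contributes a big chunk of the total, a PS 2.0 deficit there drags the whole score down even with DX8 parity:

```python
# Toy model: final score as a weighted sum of the four game-test framerates.
# NOTE: these weights are invented for illustration; they are NOT Futuremark's
# real coefficients.
def total_score(gt_fps, weights=(7.0, 37.0, 47.0, 39.0)):
    return sum(fps * w for fps, w in zip(gt_fps, weights))

# Hypothetical framerates: identical on GT1-GT3 (DX7/DX8 tests), but a 30%
# deficit on GT4 (Mother Nature, the only DX9 / PS 2.0 test).
ati    = (150.0, 30.0, 25.0, 25.0)
nvidia = (150.0, 30.0, 25.0, 25.0 * 0.7)

print(total_score(ati), total_score(nvidia))   # 4310.0 vs 4017.5 -- ~7% lower overall
```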
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: McArra
I mean: Game 4, which is DX9, carries a large percentage of the marks. As we know, ATI has much stronger PS 2.0 performance, and its Game 4 score without cheats/optimizations is quite a bit higher. That one test contributes a very large share of the overall score, and that's why Nvidia, without optimizing/cheating, falls behind by a large margin.

That's the thing - how is it cheating when IQ remains the same?
 

vshah

Lifer
Sep 20, 2003
19,003
24
81
Have there been any image quality comparisons done on GeForce FXs with builds 330 and 340? Is it actually a cheat that reduces IQ, or was it simply the compiler doing its job?

I was under the impression that the community had reached some consensus that if an optimization doesn't affect IQ, it is legal and welcome.


-Vivan
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Here's an IQ comparison from Elite Bastards' investigation of the 53.03 drivers. It's animated, so give it a second to load.

IQ Comparison Pic

Anyone who says the new driver looks any better or worse than the old one is full of sh!t. I can't tell which driver looks better. The difference is maybe 20 pixels, and all it amounts to is certain areas becoming slightly lighter or darker. For all we know, it could be an image quality FIX.

The whole article can be found at:

http://www.elitebastards.com/page.php?pageid=2647&head=1&comments=1


Frankly, back when this whole thing began, I thought Nvidia and Futuremark were both being childish little weenies about it, but now I'm starting to see Futuremark as the ridiculous one. Nvidia obviously buried the hatchet and FM is still throwing tantrums like a two-year-old... I was never that interested in the 3DMark benchmarks before this whole bit, but now I have no intention of using them at all.

 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Even if nVidia's drivers are optimized for 3DMark, what difference does it make? They'll be optimized for future games too, so shouldn't the benchmark be an indication of future games? nVidia wants to rewrite the driver every other month to optimize for new games... I have no problem with that. It's necessary because of design differences in the GPUs... we'll see what NV40 brings, since now nVidia knows what DX9 is all about.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
nVidia wants to rewrite the driver every other month to optimize for new games... I have no problem with that. It's necessary because of design differences in the GPUs... we'll see what NV40 brings, since now nVidia knows what DX9 is all about

You mean they've backed away from a single driver release per year now?
:roll:
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
I'm against optimizations; neither ATI nor Nvidia should use them to score more in 3DMark. If Nvidia thinks 3DMark is irrelevant, they should stop making optimizations for it. The thing is, it's not only 3DMark that shows poor PS 2.0 performance: ShaderMark 2.0, Tomb Raider, HL2... also show it. Not to mention that developers are putting in long hours just so Nvidia's new hardware can play their games smoothly. Even Doom 3 has had special development for Nvidia and has cut FP down to 16-bit precision.
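For a feel of what dropping from FP32 to FP16 actually costs in precision, here's a small numpy sketch (nothing to do with Doom 3's real shader code, just the raw precision difference between the two formats):

```python
import numpy as np

# FP32 vs FP16: how much resolution you give up (purely illustrative).
print(np.float32(1.0) + np.float32(1e-4))   # 1.0001 -- FP32 keeps the small offset
print(np.float16(1.0) + np.float16(1e-4))   # 1.0    -- FP16 rounds it away entirely

print(np.finfo(np.float16).eps)             # ~0.000977: roughly 3 decimal digits
print(np.finfo(np.float32).eps)             # ~1.19e-07: roughly 7 decimal digits
```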

My opinion is that Nvidia is having a bad time; they've tried to make some kind of "Glide" and have failed to make the hardware perform as it should with standard DX9 code. I used to like Nvidia cards a lot (I love my nForce2 400U mobo), but they've made a misstep, which I'm sure they're going to correct in the next-gen cards. Right now ATI has the lead, with great performance and much-improved drivers.
 

vshah

Lifer
Sep 20, 2003
19,003
24
81
The problem with this 3DMark situation is that even if Nvidia believes it is not representative of game performance, it is still a (flawed) industry-standard benchmark. When companies like Dell or Gateway decide which video card to stick in their next high-end PC, they don't want to waste time with a battery of benchmarks when they can run the one program the enthusiast community has used for a while.

And Nvidia and ATI are obviously concerned about keeping their market share.


-Vivan
 

Naffer

Member
Oct 21, 2003
28
0
0
Optimizations are just fine if they aren't done at the expense of image quality.

(edit: How did I manage to post an incomplete thought... I must be going nuts)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I thought Nvidia was doing very well with market share? If so, why are they still acting like they have something to hide? Personally I don't really care, as I won't be buying a new card for some time and am quite happy with my 9700 Pro, but I do think the PR here is very bad for Nvidia. If it's bad for FM, so what, that still doesn't affect ATI. Having the second-fastest card must have a negative impact on sales of mainstream cards, which seems like a case of dumb consumers that need educating. :beer:
 

SilverLock

Member
Nov 18, 2003
112
0
0
Originally posted by: McArra
I'm against optimizations; neither ATI nor Nvidia should use them to score more in 3DMark. If Nvidia thinks 3DMark is irrelevant, they should stop making optimizations for it. The thing is, it's not only 3DMark that shows poor PS 2.0 performance: ShaderMark 2.0, Tomb Raider, HL2... also show it. Not to mention that developers are putting in long hours just so Nvidia's new hardware can play their games smoothly. Even Doom 3 has had special development for Nvidia and has cut FP down to 16-bit precision.

My opinion is that Nvidia is having a bad time; they've tried to make some kind of "Glide" and have failed to make the hardware perform as it should with standard DX9 code. I used to like Nvidia cards a lot (I love my nForce2 400U mobo), but they've made a misstep, which I'm sure they're going to correct in the next-gen cards. Right now ATI has the lead, with great performance and much-improved drivers.

Err....

Would you get rid of 3Dnow? SSE? SSE2?
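That analogy is worth spelling out. Here's a toy sketch of the usual pattern: detect what the hardware supports and dispatch to a vendor-tuned path, with a generic fallback that gives the same result (the names and the feature strings are placeholders, not a real CPUID query):

```python
# Toy illustration of the 3DNow!/SSE analogy: applications routinely detect
# CPU features at runtime and pick a vendor-optimized code path, falling back
# to a generic one. Same inputs, same output -- nobody calls that cheating.
def dot_generic(a, b):
    return sum(x * y for x, y in zip(a, b))

def dot_vendor_optimized(a, b):
    # stand-in for an SSE/3DNow! hand-tuned routine
    return sum(x * y for x, y in zip(a, b))

def pick_dot(cpu_features):
    return dot_vendor_optimized if "sse" in cpu_features else dot_generic

dot = pick_dot({"sse", "sse2"})
print(dot([1, 2, 3], [4, 5, 6]))   # 32 either way
```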
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: SilverLock
Originally posted by: McArra
I'm against optimizations; neither ATI nor Nvidia should use them to score more in 3DMark. If Nvidia thinks 3DMark is irrelevant, they should stop making optimizations for it. The thing is, it's not only 3DMark that shows poor PS 2.0 performance: ShaderMark 2.0, Tomb Raider, HL2... also show it. Not to mention that developers are putting in long hours just so Nvidia's new hardware can play their games smoothly. Even Doom 3 has had special development for Nvidia and has cut FP down to 16-bit precision.

My opinion is that Nvidia is having a bad time; they've tried to make some kind of "Glide" and have failed to make the hardware perform as it should with standard DX9 code. I used to like Nvidia cards a lot (I love my nForce2 400U mobo), but they've made a misstep, which I'm sure they're going to correct in the next-gen cards. Right now ATI has the lead, with great performance and much-improved drivers.

Err....

Would you get rid of 3Dnow? SSE? SSE2?


Exactly what people don't seem to realize about this whole situation. Thanks for pointing that out.

 

Is

Member
Sep 16, 2003
64
0
0
Originally posted by: SilverLock

Err....

Would you get rid of 3Dnow? SSE? SSE2?


I would get rid of that stuff if it came at the expense of standard FPU performance, which is analogous to what's going on here.