Nvidia does not endorse 3DMark 2003

andreasl

Senior member
Aug 25, 2000
419
0
0
http://gamespot.com/gamespot/stories/news/0,10870,2910707,00.html

On a related note, Nvidia has contacted us to say that it doesn't support the use of 3DMark 2003 as a primary benchmark in the evaluation of graphics cards, as the company believes the benchmark doesn't represent how current games are being designed. Specifically, Nvidia contends that the first test is an unrealistically simple scene that's primarily single-textured, that the stencil shadows in the second and third tests are rendered using an inefficient method that's extremely bottlenecked at the vertex engine, and that many of the pixel shaders use specific elements of DX8 that are promoted by ATI but aren't common in current games.

In response to Nvidia's statements, Futuremark's Aki Jarvilehto said, "We've been working for the last 18 months to make 3DMark 2003 an accurate and objective benchmark. Nvidia was one of our beta members until December, and it's not our place to comment on why they've decided to withdraw from the program. After working on the project for almost two years with all the leading manufacturers, we do strongly believe that 3DMark 2003 is an accurate representation of game performance for modern 3D accelerators."
 

andreasl

Senior member
Aug 25, 2000
419
0
0
Seems like Nvidia jumped ship in December last year after being a beta tester for years. Strange, though: didn't the FX perform pretty well with the latest drivers?
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
How convenient. Your "brand new" FX Ultra gets slapped around by a six-month-old card, it sounds like a jet engine, and it may not even be released...

Yeah, I can see why they don't endorse it :D
 

Dacalo

Diamond Member
Mar 31, 2000
8,778
3
76
Originally posted by: GTaudiophile
They'll say that until they beat ATi in every benchmark again...then rapidly promote it.

Ditto. Now that the Radeon 9700 Pro is destroying their flagship, they need some lame excuse.

 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
3DMark isn't a perfect benchmark anyway. I'd rather see NV's driver programmers work to improve performance in general than optimize for a benchmark (which I'm sure both companies do).
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Sour grapes. They know that the competition already makes them look silly on 3DMark01. With 03, Nvidia's whole product line will be dropping off the charts. I'm no fan of 3DM, per se, but they are certainly peeing into the wind if they think their opinion will prevent 03 from quickly being adopted as the standard gaming benchmark.
 

Vic

Elite Member
Jun 12, 2001
50,422
14,337
136
Originally posted by: Auric
Sour grapes. They know that the competition already makes them look silly on 3DMark01. With 03, Nvidia's whole product line will be dropping off the charts. I'm no fan of 3DM, per se, but they are certainly peeing into the wind if they think their opinion will prevent 03 from quickly being adopted as the standard gaming benchmark.
Ahem. Actually, the GFFX beat the 9700 in 3DMark03 using the most recent drivers. I don't think this has anything to do with sour grapes. Many reviewers are criticizing 3DMark03 as a VERY unrealistic benchmark, not based on any current or planned gaming technology.
 

Chad

Platinum Member
Oct 11, 1999
2,224
0
76
Can anyone say "damage control"? Yup, it's in full effect over there at nVidia. Shouldn't they... I don't know... GET A FREAKING NEW VIDEO CARD OUT SOMETIME THIS CENTURY?!?!?!!?

Stop working the media and spin-doctoring 3DMark, nVidia. Work on something that's actually good and spare our intelligence with this BS brainwashing nonsense. It's not like we can't see through it anyway. How insulting.
 

Chad

Platinum Member
Oct 11, 1999
2,224
0
76
Beyond3D *ENDORSED* 3DMark 2003 as a valuable tool. I know who you're talking about: Kyle over at H-OCP. Great, a web developer. Yes, Kyle... web developer, please school us on the intricacies of how software interacts with various hardware, especially video cards... you know, Kyle, the "technical stuff" you picked up from developing the H-OCP website for the GeForce 4s.

C'mon fellas, this isn't that hard. HardOCP used 2001SE authoritatively for years, and it works on all the same principles they now cite as the reason for NOT using 2003. The difference between then and now is that nVidia's cards suck at it.

To make matters worse, look at HardOCP's diatribe on the matter, then look at nVidia's, which came just a few hours after HardOCP's. They look awfully similar to me. Don't you find that coincidence strange? I'd almost bet nVidia wrote them both.

Put 2 and 2 together and stop being naive.

I think it's safe to say Beyond3D knows a HELLUVA lot more about this than Kyle at HardOCP, who's not even a programmer (so someone's filling his head with this crap - read: nVidia).

I know Dave from Beyond3D from WAAAAAAAAAAAAAY back, and he's actually WORKED on video cards and knows his stuff, intimately.

Conclusion: Kyle is a freaking idiot on a subject that he is speaking authoritatively on. Hmm... On the take?
 
Jan 9, 2002
5,232
0
0
Originally posted by: Adul
Originally posted by: majewski9
I don't know if Nvidia will ever be on top again!!!

never underestimate a wounded prey ;)

...or the market of people like me who couldn't care less about benchmarks or gaming and just want a great video card with top-notch driver support for the business environment. I doubt I'll ever need the pixel-crunching power of a GF4 Ti4600. If I do, that's what Quadros and Wildcats are for.
 

DillonStorm

Junior Member
Jan 30, 2003
10
0
0
Excuse me, but the GFFX *barely* beats the 9700, and it actually does it by winning 2 of the 4 scoring tests by 5 fps or less...

Can we please remember that the GFFX is an overclocked, overheating monstrosity with a 175MHz clock advantage? Go look at the Futuremark leaderboard. 9700s are already crossing the 6000 mark, 1000 points faster than the GFFX. And the funny thing is, those 9700s are still clocked 100MHz lower than the FX Ultra.

 

oldiex

Member
Oct 10, 1999
78
0
0
Nvidia will not endorse it until they have a card on the market that will run all the tests.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: DillonStorm
Excuse me, but the GFFX *barely* beats the 9700, and it actually does it by winning 2 of the 4 scoring tests by 5 fps or less...

Can we please remember that the GFFX is an overclocked, overheating monstrosity with a 175MHz clock advantage? Go look at the Futuremark leaderboard. 9700s are already crossing the 6000 mark, 1000 points faster than the GFFX. And the funny thing is, those 9700s are still clocked 100MHz lower than the FX Ultra.

God, go back to Game Test 3. Do you ONLY show up when the FX challenges your precious 9700 Pro?
:rolleyes:


Chiz
 

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
I don't blame 'em for not endorsing it.

You could probably get the SAME-looking benchmark scenes from a better engine that runs smoothly on current hardware.

They're just stuffing their benchmark with so much extra crap that no programmer would ever consider doing it that way.

In fact, I kept looking for what was making it slow down. It leaves a lot to be desired if it's bringing video cards down to 1-3 fps.


Look at Doom 3. It looks better, IMHO, and will run fine on current hardware.


 

bigshooter

Platinum Member
Oct 12, 1999
2,157
0
71
I still think the GeForce FX is like the P4. When it first came out, older technology (the Athlon) beat it in almost every test. It took later revisions of the processor to really take the lead; AMD is keeping up, but they are not the performance king. Nvidia is not dumb. They have plenty of engineers working on driver updates, and probably even more working on the NV31 and NV34 low-cost and updated versions. I was under the impression this is the hardware they'll base their next couple of generations of cards on, so they have time to make it work right and add features and performance improvements. Look how many generations Nvidia got out of the GeForce technology, pretty much all the way through the GeForce 4. I wouldn't discount them yet.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
This isn't surprising. The GFFX aside, in a lot of the tests the rendering pipeline makes heavy use of PS 1.4 capabilities over PS 1.1 (the Radeons are 1.4, the GeForce 3 and 4 are 1.1). For the tests 3DMark uses, the 1.4 pipeline is much faster than the 1.1 pipeline, and that doesn't leave Nvidia sitting happy. There have been some complaints about unrealistic coding in the 1.1 stencil buffer path, along with general arguments about how similar 1.4 and 1.1 really are (compared to 2.0, 1.1 and 1.4 might as well be the same; Futuremark seems to be exploiting 1.4 for all it's worth). I'm sure Nvidia will "hop on the bandwagon" again later, but for now they're in defensive mode, which is what any smart company should be doing.
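For anyone wondering what that 1.4-vs-1.1 decision looks like on the application side, here's a rough D3D9-style caps check. This is just a sketch from memory, not Futuremark's actual code; the printf labels and the single-pass/multi-pass comments are my own illustration of why the 1.1 fallback ends up doing more work.

#include <windows.h>
#include <d3d9.h>
#include <stdio.h>

// Sketch only: query the HAL device's pixel shader caps and pick a path.
int main()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4)) {
        // PS 1.4 path: enough texture reads per pass that the per-pixel
        // lighting effects can be collapsed into fewer passes.
        printf("PS 1.4 available - single-pass path\n");
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) {
        // PS 1.1 fallback: same image, but the geometry gets resubmitted
        // for the extra passes, which is where the GF3/GF4 lose time.
        printf("PS 1.1 only - multi-pass fallback\n");
    } else {
        printf("No pixel shader support\n");
    }

    d3d->Release();
    return 0;
}

The point is just that a 1.4 part can fold several 1.1 passes into one, so any test weighted toward 1.4 will naturally flatter the Radeons.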
 

Operandi

Diamond Member
Oct 9, 1999
5,508
0
0
Graphics cores last on average roughly 6-10 months, so comparing the FX to the P4 doesn't make much sense. The NV30 chip doesn't stand much of a chance without the FX Ultra. If Nvidia is going to end up back on top of the performance charts, it's not going to be with the NV30.
 

magomago

Lifer
Sep 28, 2002
10,973
14
76
I'm going to have to agree with Nvidia, provided they don't jump right back on that bandwagon when the NV35 comes along.

We need to get away from benchmarking stuff unrealistically, and they need to work on day-to-day driver performance (something ATI is getting better at).

So if that means saying "see ya" to 3DMark, I'm all for it.
 

codehack2

Golden Member
Oct 11, 1999
1,325
0
76
Originally posted by: Chad
Beyond3D *ENDORSED* 3DMark 2003 as a valuable tool. I know who you're talking about: Kyle over at H-OCP. Great, a web developer. Yes, Kyle... web developer, please school us on the intricacies of how software interacts with various hardware, especially video cards... you know, Kyle, the "technical stuff" you picked up from developing the H-OCP website for the GeForce 4s.

C'mon fellas, this isn't that hard. HardOCP used 2001SE authoritatively for years, and it works on all the same principles they now cite as the reason for NOT using 2003. The difference between then and now is that nVidia's cards suck at it.

To make matters worse, look at HardOCP's diatribe on the matter, then look at nVidia's, which came just a few hours after HardOCP's. They look awfully similar to me. Don't you find that coincidence strange? I'd almost bet nVidia wrote them both.

Put 2 and 2 together and stop being naive.

I think it's safe to say Beyond3D knows a HELLUVA lot more about this than Kyle at HardOCP, who's not even a programmer (so someone's filling his head with this crap - read: nVidia).

I know Dave from Beyond3D from WAAAAAAAAAAAAAY back, and he's actually WORKED on video cards and knows his stuff, intimately.

Conclusion: Kyle is a freaking idiot on a subject that he is speaking authoritatively on. Hmm... On the take?

Errr... you need to go back and read that article. It clearly states in the header that it was written by Brent Justice, and it even goes as far as referring to "Kyle" in the third person. I'm not validating or condoning anything Mr. Justice says, but insulting Kyle on this matter isn't going to get you very far.

CH2