Futuremark Responds to Nvidia 3DMark03 Criticism


Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: kazeakuma
For all those saying 3DMark03 tests future tech etc.: can anyone explain why it has only one DX9 test, and not a fully DX9 test at that?

Simply because it uses DX9 where appropriate. In "Mother Nature" it uses DX9 (read: PS2.0) where it should be used, but there are places where it uses PS1.4 instead, since PS2.0 would be horrible overkill. It uses PS1.4 mostly because DX9 hardware also supports PS1.4.

3DMark has never been more than eye candy IMO; pretty to look at, but it means nothing.

I guess that's why even HardOCP uses 3DMark2001 in their benchmarks.

As people have said, 3dmark isn't representative of a real game (future or current).

It is representative of games that take advantage of advanced shaders (like Doom3).

If you were a programmer, would you use PS1.1 as a fallback for PS1.4/PS2.0 when PS1.3 is easier and more representative of the market out there?

If you used PS1.4, you would use PS1.1 as a fallback, since PS1.3 and 1.2 don't bring any real advantages when compared to 1.1. 1.4 is a lot better than 1.1; 1.2 and 1.3 are only minor improvements over 1.1 with nonexistent performance benefits.
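To make that fallback logic concrete, here is a minimal sketch; the function name, inputs, and thresholds are hypothetical, and a real engine would query the hardware caps through the graphics API instead.

```python
# Hypothetical sketch of the fallback described above: target the best
# pixel shader version the card reports, skipping 1.2/1.3 because they
# add little over 1.1. Names and inputs are invented for illustration.
def pick_pixel_shader_path(max_ps_version: float) -> float:
    if max_ps_version >= 2.0:
        return 2.0   # DX9 parts (e.g. R300, NV30)
    if max_ps_version >= 1.4:
        return 1.4   # DX8.1 parts (e.g. R200): fewer passes per effect
    return 1.1       # everything below 1.4, incl. PS1.3 parts like the GF4

print(pick_pixel_shader_path(1.3))  # a GF4 Ti falls back to the 1.1 path
```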

(GF4 Ti4200s are out there by the droves.) No, you wouldn't, so why did Futuremark? Either they got lazy, or Nvidia pissed them off (by dropping their subscription) and this is mostly political.

Again, 03 is meant for future tech (read: DX9 and 8.1); the GF4 is older tech. Should FM cripple 3DMark just because one vendor has decided not to support advanced features? Of course not!


The one thing that annoys me here is that many people who whine about 3DMark just parrot the party line spoon-fed to them by NV!

Of course, now that Nvidia has discredited 3DMark (for whatever reasons), anyone who says the same is spouting the party line, despite the fact that a lot of us never put any stock in 3DMark in the first place.

HardOCP used 3DMark01 extensively. Now, all of a sudden, when almighty NV flames 03, they too decide that it sucks and must not be used. Yes, a lot of the people who flame 03 do so only because NV did it too. They are nothing but sheep who do everything NV does.
 

MadRat

Lifer
Oct 14, 1999
11,999
307
126
It ought to be more concerned with testing the same scenes using multiple techniques to sort out how compliant the individual card is, and then give 3DMarks by DirectX generation rather than one specific score. It makes no sense whatsoever to give one summed-up score but then have some ambiguous rating system to divide the scores so that faster DX7 cards score less than slower DX9 cards. The whole idea of 3DMark200x just stinks.
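As a rough sketch of what that per-generation reporting could look like (all names and numbers here are hypothetical):

```python
# Hypothetical sketch: run the same scenes under each DirectX code path
# a card supports and report one mark per generation, with "n/a" where
# the feature set is missing, instead of one summed-up score.
def generation_marks(fps_by_gen: dict) -> dict:
    return {gen: ("n/a" if score is None else round(score))
            for gen, score in fps_by_gen.items()}

# e.g. a DX8-class card: strong DX7/DX8 marks, no DX9 path at all
print(generation_marks({"DX7": 9800.0, "DX8": 5200.0, "DX9": None}))
```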
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: MadRat
It ought to be more concerned with testing the same scenes using multiple techniques to sort out how compliant the individual card is, and then give 3DMarks by DirectX generation rather than one specific score. It makes no sense whatsoever to give one summed-up score but then have some ambiguous rating system to divide the scores so that faster DX7 cards score less than slower DX9 cards. The whole idea of 3DMark200x just stinks.

Makes sense to me. 3DMark is about speed and feature sets; if a card lacks one of those, it scores badly. The GeForce4 lacks the feature sets needed for DX9.
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
HardOCP used 3DMark01 extensively. Now, all of a sudden, when almighty NV flames 03, they too decide that it sucks and must not be used. Yes, a lot of the people who flame 03 do so only because NV did it too. They are nothing but sheep who do everything NV does.
Many people have been saying this for some time now! Way before Nvidia's statement and before HardOCP's little article. I'm sure you know that.
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: NOX
HardOCP used 3DMark01 extensively. Now, all of a sudden, when almighty NV flames 03, they too decide that it sucks and must not be used. Yes, a lot of the people who flame 03 do so only because NV did it too. They are nothing but sheep who do everything NV does.
Many people have been saying this for some time now! Way before Nvidia's statement and before HardOCP's little article. I'm sure you know that.

Yes, people have said that 3DMarks don't matter, real games do. But still, HardOCP (one of the sites flaming 03) happily used 01 in their benchmarks. Even NV used 01! Now that 03 supports features that NV has been too lazy to implement, it's suddenly a sucky benchmark that should not be used. If 3DMark sucks and you should use games instead (as NV says), why did NV use 01? I guess it's OK to use certain benchmarks, but only if they favor NV hardware.
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
3DMark is as valid as any other benchmark; you don't judge a card by just one benchmark, but by how it performs in all of them.

What if Doom3 suddenly made the NV30 look like crap because the game requires features that the NV30 doesn't have? Should it be dismissed as a benchmark?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Czar
3DMark is as valid as any other benchmark; you don't judge a card by just one benchmark, but by how it performs in all of them.

What if Doom3 suddenly made the NV30 look like crap because the game requires features that the NV30 doesn't have? Should it be dismissed as a benchmark?

Well, Doom3 doesn't make the NV30 "look like crap". In fact, "old" cards like a GF4 still run the game better than a supposedly more "feature-rich", PS1.4-compliant part (the R200), which is why much of the criticism of 3dMiss2K3 has surfaced.

For those saying it's a benchmark designed to test future compatibility of games, I'm sure you know that the actual DX 9 capabilities are limited to 2 of 10 tests in GT 4, yet that test carries the heaviest weight. The fact that 3dMarketing2k3 almost completely dismisses CPU processing power is particularly laughable. If you can find a single game (now or in the future) that benefits 3x more from a 25MHz GPU core overclock than a 300MHz increase in CPU clock speed, let me know :)

Chiz
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: chizow
Originally posted by: Czar
3DMark is as valid as any other benchmark; you don't judge a card by just one benchmark, but by how it performs in all of them.

What if Doom3 suddenly made the NV30 look like crap because the game requires features that the NV30 doesn't have? Should it be dismissed as a benchmark?

Well, Doom3 doesn't make the NV30 "look like crap". In fact, "old" cards like a GF4 still run the game better than a supposedly more "feature-rich", PS1.4-compliant part (the R200), which is why much of the criticism of 3dMiss2K3 has surfaced.

Yes and no. Doom3 uses PS1.4 when available. That means that Ati cards will need fewer passes to do their work than GF4-class cards need. That is a fact. Of course, that doesn't automatically mean that an Ati card will outperform a GF4, since there are other things at play as well (drivers etc.).

For those saying it's a benchmark designed to test future compatibility of games, I'm sure you know that the actual DX 9 capabilities are limited to 2 of 10 tests in GT 4, yet that test carries the heaviest weight.

Like I said, it uses PS2.0 (DX9) where it makes sense. It doesn't use PS2.0 just for the sake of using it (PS2.0 is horrible overkill in some cases). And no, GT4 doesn't carry the most weight if I remember correctly; the game tests are pretty evenly balanced.

The fact that 3dMarketing2k3 almost completely dismisses CPU processing power is particularly laughable.

What I find laughable is that 3DMark01 (a vid-card test) was in many cases limited by the CPU. And people complained that it's more a CPU test than a vid-card test. So they made sure that 03 isn't limited by the CPU. And now people whine that it doesn't use the CPU! Sheesh, some people are hard to please!

If you can find a single game (now or in the future) that benefits 3x more from a 25MHz GPU core overclock than a 300MHz increase in CPU clock speed, let me know :)

Games that are limited by vertex and pixel shaders, for starters. Just about all games that are limited by the vid-card.

Seriously, this whining is getting ridiculous! Maybe those NV fanboys should build their own NV-Mark and shut the hell up! I find it rather strange that they whine because NV hardware doesn't take advantage of features offered by the DX standard, and they whine when NV hardware does poorly in a benchmark that tests those very same features! If you want to whine to someone, whine to NV, not Futuremark. NV is the one who decided to keep on pushing inferior tech to the market.

What is the core thingy of this whining? It's that 03 takes advantage of PS1.4 shaders and NV doesn't have PS1.4 in their mainstream cards. Why does 03 use PS1.4? Simple: it's a DX9 benchmark, and ALL DX9 vidcards support PS1.4! The problem that NV doesn't have a PS1.4-capable mainstream vidcard (their mainstream vidcards are still stuck at DX7!) is NV's problem, and no one else's.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Nemesis77
Yes and no. Doom3 uses PS1.4 when available. That means that Ati cards will need fewer passes to do their work than GF4-class cards need. That is a fact. Of course, that doesn't automatically mean that an Ati card will outperform a GF4, since there are other things at play as well (drivers etc.).

That's a fact, but it still doesn't translate into pure performance. Yes, a GF3/4 will require multiple passes to render the same scene, but if those multiple passes are just as fast as a single pass, there isn't any real-world benefit. Efficiency for efficiency's sake means nothing if it doesn't translate into performance. It's no secret that the GF series cards use fragment programs to accomplish the same work in multiple passes that an R200/R300 would in a single pass, but the end result is negligible when it comes to image quality, and actual performance/"efficiency" is self-evident in REAL benchmarks. The same can NOT be said for 3dmark2k3.
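A back-of-the-envelope illustration of that point (the pass counts and per-pass times are invented purely for illustration):

```python
# Illustrative only: fewer passes win only if per-pass cost is similar.
def frame_time_ms(passes: int, ms_per_pass: float) -> float:
    return passes * ms_per_pass

single_pass = frame_time_ms(1, 9.0)  # PS1.4-class part: one big pass
multi_pass = frame_time_ms(3, 3.0)   # PS1.1-class part: three cheaper passes
print(single_pass, multi_pass)       # 9.0 9.0 -- a wash in practice
```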

Like I said, it uses PS2.0 (DX9) where it makes sense. It doesn't use PS2.0 just for the sake of using it (PS2.0 is horrible overkill in some cases). And no, GT4 doesn't carry the most weight if I remember correctly; the game tests are pretty evenly balanced.

PS 1.4 is a DX 8.1 compliant feature, so please stop confusing it with DX 9. Yes, DX 9 falls back to 1.4 natively, but it also falls back to 1.1 if 1.4 isn't supported. I think that's the rub here. All the supporters of 3dmark2k3 keep saying it's a true test of DX9 hardware, but in reality 95% of the tests are composed of DX 8.1 features. That's the issue here, FutureMarketing and ATI obviously collaborated on the project to make it a test of DX 8.1 or better cards (they say as much in their responses to nVidia's remarks). What started as a DX9 benchmark has shifted to a DX 8.1-or-better benchmark. The irony is that a Radeon 8500 gets higher |3SMarks than a Ti4600, yet there isn't a single game in which an 8500 (even with the more "efficient" PS 1.4 rendering methods) would outperform a Ti4600. Doom 3 included. In fact Carmack has said numerous times that a GF3 requiring multiple passes still outperforms an R200 that only requires a single pass. I still feel that 3DMark2k3 is either optimized to render using PS 1.4, or it arbitrarily assigns a penalty to non-PS 1.4 parts.

What I find laughable is that 3DMark01 (a vid-card test) was in many cases limited by the CPU. And people complained that it's more a CPU test than a vid-card test. So they made sure that 03 isn't limited by the CPU. And now people whine that it doesn't use the CPU! Sheesh, some people are hard to please!

Another common defense for 3DMark2k3. If it's just a "vid-card test" or a "3D capabilities test", why do they claim to be "The Gamer's Benchmark"? Why are references to Doom 3 all over their white papers? They claim that GT 2 and 3 were specifically designed to "mimic" the rendering engine of Doom 3, yet both fail miserably. Without taking into consideration the actual PLATFORM and PROCESSOR that the video card is running on, it would be a helluva lot easier to just arbitrarily assign values to a family of GPUs.

GF3 = 1000
GF4 = 2000
R200 = 2500
R300(9500/pro) = 3000/3500
R300(9700/TX/GOLD/PRO/OC'd) = 4000/4200/4500/4500+
FX = 4500

Oh wait, I guess that's what they do anyways LoL!
The most glaring shortcoming is still the fact that CPU/Platform differences have almost no effect on the total score. Good luck explaining to some poor fellow who gets 4500 in 3DMark2k3 with his Celeron 1.1 and 9700pro why he can't run Doom 3 smoothly (as his score would indicate). :)

Games that are limited by vertex and pixel shaders, for starters. Just about all games that are limited by the vid-card.

You said it yourself, there are no games that are limited by either. PS 2.0 is only used in certain instances "where it's not overkill". I don't think a game will extensively use either of these features until DX10 at the earliest, and by then this benchmark will be even more useless. Not to mention any card that currently has full DX 9 support will be obsolete or a value part.

Seriously, this whining is getting ridiculous! Maybe those NV fanboys should build their own NV-Mark and shut the hell up! I find it rather strange that they whine because NV hardware doesn't take advantage of features offered by the DX standard, and they whine when NV hardware does poorly in a benchmark that tests those very same features! If you want to whine to someone, whine to NV, not Futuremark. NV is the one who decided to keep on pushing inferior tech to the market.

What is the core thingy of this whining? It's that 03 takes advantage of PS1.4 shaders and NV doesn't have PS1.4 in their mainstream cards. Why does 03 use PS1.4? Simple: it's a DX9 benchmark, and ALL DX9 vidcards support PS1.4! The problem that NV doesn't have a PS1.4-capable mainstream vidcard (their mainstream vidcards are still stuck at DX7!) is NV's problem, and no one else's.

Again, confusing a DX 8.1 feature with DX 9. Just as all DX 9 parts support 1.4, DX 9 games will support PS 1.1 (with obviously better results than those portrayed by 3DMark2k3). I find it funny when people use the "inferior tech" argument; I'm sure you're one of those people who rush out and buy a card b/c of some great feature, only to realize there isn't a single game within 6 months to a year that will implement it. When it finally shows up in a game, the card is obsolete and the next generation of cards is already upon us. Personally (and most reviews echo the same sentiment), I'd rather see implementations of current features over emphasis on future capabilities. The best example is the R200, a completely overambitious part that suffered miserably b/c it banked on "future tech". I guess there is some respite for the 8500; it does REALLY well in 3DMark2k3 now that PS 1.4 is finally here, a good year and a half after its release. :)

As for 3DMark2K3, it's still a broken benchmark. They might as well slap a "Built By ATI" sticker on it and call it "The ATI Video Card Benchmark". It's certainly not "The Gamer's Benchmark", and does nothing to mirror real-world gaming performance, in future or current games.

Chiz
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: Nemesis77
Originally posted by: NOX
HardOCP used 3DMark01 extensively. Now, all of a sudden, when almighty NV flames 03, they too decide that it sucks and must not be used. Yes, a lot of the people who flame 03 do so only because NV did it too. They are nothing but sheep who do everything NV does.
Many people have been saying this for some time now! Way before Nvidia's statement and before HardOCP's little article. I'm sure you know that.

Yes, people have said that 3DMarks don't matter, real games do. But still, HardOCP (one of the sites flaming 03) happily used 01 in their benchmarks. Even NV used 01! Now that 03 supports features that NV has been too lazy to implement, it's suddenly a sucky benchmark that should not be used. If 3DMark sucks and you should use games instead (as NV says), why did NV use 01? I guess it's OK to use certain benchmarks, but only if they favor NV hardware.
Yeah, but just because two entities all of a sudden want to take the stance that a lot of people have been taking for years now, doesn't change a thing.

That's why in another thread I stated that I question the timing of Nvidia's statement. However, again, it doesn't change the fact that knowledgeable people have been stating this for a while now, since 01.
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: chizow
That's a fact, but it still doesn't translate into pure performance.

Of course shaders are not the be-all and end-all of performance. Like I said, there are other things at play besides them.

Yes, a GF3/4 will require multiple passes to render the same scene, but if those multiple passes are just as fast as a single pass, there isn't any real-world benefit. Efficiency for efficiency's sake means nothing if it doesn't translate into performance. It's no secret that the GF series cards use fragment programs to accomplish the same work in multiple passes that an R200/R300 would in a single pass, but the end result is negligible when it comes to image quality, and actual performance/"efficiency" is self-evident in REAL benchmarks. The same can NOT be said for 3dmark2k3.

LOL! People whine that 03 is not a "real" benchmark, yet they happily used 01 to benchmark their vidcards, NV included! Why is 01 a "real" benchmark (after all, why would NV use it?), whereas 03 is not? Because it supports standard features that NV is too lazy to implement?

FWIW: I have seen 03 benchmarks with an R9700 Pro using PS2.0, 1.4 and 1.1, and the difference in performance is quite striking.

PS 1.4 is a DX 8.1 compliant feature, so please stop confusing it with DX 9.

I'm well aware that PS1.4 is a DX8.1 feature, thank you very much.

Yes, DX 9 falls back to 1.4 natively, but it also falls back to 1.1 if 1.4 isn't supported. I think that's the rub here. All the supporters of 3dmark2k3 keep saying it's a true test of DX9 hardware, but in reality 95% of the tests are composed of DX 8.1 features.

Like I said, PS2.0 is used where it's the smart thing to do. And that's how games are made. There probably won't be any games that are 100% PS2.0.

That's the issue here, FutureMarketing and ATI obviously collaborated on the project to make it a test of DX 8.1 or better cards (they say as much in their responses to nVidia's remarks). What started as a DX9 benchmark has shifted to a DX 8.1-or-better benchmark.

And the problem is... what? Like I said, all DX9 (PS2.0) cards support DX8.1 (PS1.4), so using PS1.4 is a valid thing to do.

The irony is that a Radeon 8500 gets higher |3SMarks than a Ti4600, yet there isn't a single game in which an 8500 (even with the more "efficient" PS 1.4 rendering methods) would outperform a Ti4600.

In games that take advantage of PS1.4, there would be differences (assuming the 8500 wasn't otherwise bottlenecked).

Doom 3 included. In fact Carmack has said numerous times that a GF3 requiring multiple passes still outperforms an R200 that only requires a single pass. I still feel that 3DMark2k3 is either optimized to render using PS 1.4, or it arbitrarily assigns a penalty to non-PS 1.4 parts.

No, it doesn't arbitrarily penalize 1.1 cards, other than requiring them to do more passes to achieve the same results.

Another common defense for 3DMark2k3. If it's just a "vid-card test" or a "3D capabilities test", why do they claim to be "The Gamer's Benchmark"?

Now you whine about their slogans? Why don't you whine about NV's "the way it's meant to be played" (with DX7!) slogan?

Why are references to Doom 3 all over their white papers?

D3 uses PS1.4, just like 03 does.

They claim that GT 2 and 3 were specifically designed to "mimic" the rendering engine of Doom 3, yet both fail miserably.

Yes, it mimics D3 in that it uses PS1.4. But why do you say it "fails"? Because NV loses? Oh no, we can't have that, so the benchmark is obviously flawed! Yes, that must be it.

Oh wait, I guess that's what they do anyways LoL!
The most glaring shortcoming is still the fact that CPU/Platform differences have almost no effect on the total score. Good luck explaining to some poor fellow who gets 4500 in 3DMark2k3 with his Celeron 1.1 and 9700pro why he can't run Doom 3 smoothly (as his score would indicate). :)

How do you know that D3 will be CPU-limited and not vid-card limited?

You said it yourself, there are no games that are limited by either.

Not yet. 3DMark is a forward-looking benchmark.

PS 2.0 is only used in certain instances "where it's not overkill". I don't think a game will extensively use either of these features until DX10 at the earliest, and by then this benchmark will be even more useless. Not to mention any card that currently has full DX 9 support will be obsolete or a value part.

Like I said, 3DMark is a forward-looking benchmark. And there are already games coming up that take advantage of advanced shaders.

Again, confusing a DX 8.1 feature with DX 9.

No I'm not; I'm well aware that PS1.4 is a DX8.1 feature. It's a feature that will see wide use even with DX9, it's officially part of the DX standard (but of course it shouldn't be used in benchmarks 'cause NV is too lazy to implement it) and it's supported by all vid-cards that support DX9. So I REALLY fail to see the problem here.

Just as all DX 9 parts support 1.4, DX 9 games will support PS 1.1 (with obviously better results than those portrayed by 3DMark2k3).

How do you know? PS1.4 is A LOT better than 1.1 is!

I find it funny when people use the "inferior tech" argument; I'm sure you're one of those people who rush out and buy a card b/c of some great feature, only to realize there isn't a single game within 6 months to a year that will implement it.

You're sure, eh? You're wrong! I currently have a GF2 GTS.

And yes, NV pushes inferior tech. Carmack has said that the MX doesn't cut it with D3, while Ati's entire lineup is DX9 (with the exception of the 9100). NV has fallen behind, that is a fact.

When it finally shows up in a game, the card is obsolete and the next generation of cards is already upon us.

And they can still use 3DMark03 to benchmark those cards.

Personally (and most reviews echo the same sentiment), I'd rather see implementations of current features over emphasis on future capabilities.

3DMark has always been future-looking. And A LOT of reviewers use it, so there doesn't seem to be a problem.

The best example is the R200, a completely overambitious part that suffered miserably b/c it banked on "future tech".

Really? I think it's a pretty good product. First it competed successfully against the GF3, then it competed against the GF4.

As for 3DMark2K3, it's still a broken benchmark.

Yeah, because NVIDIA says so.


They might as well slap a "Built By ATI" sticker on it and call it "The ATI Video Card Benchmark".

What do you suggest they do? Cripple the benchmark so it fits better with NV's crippled feature set?

It's certainly not "The Gamer's Benchmark", and does nothing to mirror real-world gaming performance, in future or current games.

You have a crystal ball that tells you all of this? You know more about game developers and their intentions than Futuremark does?

Head over to the Beyond3D forums. There are A LOT of smart people there (many are in one way or another involved in the 3D business). They don't see a problem with 3DMark03; neither did NVIDIA, until Futuremark decided to support advanced tech that they were too lazy to implement. And, again, that really is NV's problem.

What you are basically suggesting is that every benchmark should be designed around NV hardware; otherwise, the benchmark is inaccurate.
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: NOX
Yeah, but just because two entities all of a sudden want to take the stance that a lot of people have been taking for years now, doesn't change a thing.

Taking a stand... So I guess HardOCP and NV used 01 against their wills, then? I find it too big of a coincidence that NV happily supported 01, but now that 03 supports features that they themselves have chosen not to support, it's suddenly a sucky benchmark. NV is just annoyed because 03 clearly shows that Ati's lineup is a lot more future-proof than NV's is. That Ati has DX9, whereas NV is still busy pushing DX7!

I applaud the fact that Futuremark designed a benchmark that takes advantage of the latest tech, instead of designing their benchmark around NV hardware.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
I don't have the time or the energy to point out all the inaccuracies in your replies, so I'll just leave you with this thought:

Wait until Doom 3 is released ('bout a month or so), since it seems to be the most common (and only) argument for the validity of 3DMark2K3 as a benchmarking tool. I think you'll find that the ATI family cards (R300 and above, since the older R200 and R250 parts just suck no matter which way you look at it) will not see a 3x to 4x advantage over a "crippled" part that lacks the DX 9 or DX 8.1 features used in 3DMark2K3.

Chiz

Btw, I don't need a crystal ball to realize the majority of currently released games are as CPU-dependent as they are GPU-dependent.
 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
Originally posted by: chizow
I don't have the time or the energy to point out all the inaccuracies in your replies, so I'll just leave you with this thought:

Wait until Doom 3 is released ('bout a month or so), since it seems to be the most common (and only) argument for the validity of 3DMark2K3 as a benchmarking tool. I think you'll find that the ATI family cards (R300 and above, since the older R200 and R250 parts just suck no matter which way you look at it) will not see a 3x to 4x advantage over a "crippled" part that lacks the DX 9 or DX 8.1 features used in 3DMark2K3.

Chiz

Btw, I don't need a crystal ball to realize the majority of currently released games are as CPU-dependent as they are GPU-dependent.


1. Doom 3 isn't the only game that's going to be released in the next 2 years.

2. A crystal ball is used to predict the future, I certainly hope you don't need one to see the present. ;)

3. Saying a post is filled with inaccuracies and not even being able to point them out and correct them is lame.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Spicedaddy
1. Doom 3 isn't the only game that's going to be released in the next 2 years.

No, it won't be. But considering its engine will be the basis for many of the games that take advantage of the so-called "advanced features" emphasized in 3DMark2K3, and the fact that GT 2 and 3 are specifically designed to mimic its engine, it's the only relevant example. Not to mention there'll in all likelihood be some new iteration of 3DMark that does an even worse job of forecasting future gaming requirements.

2. A crystal ball is used to predict the future, I certainly hope you don't need one to see the present. ;)

The original comment suggested that there was no indication that future games would necessarily be CPU dependent. I was just pointing out the fact that a crystal ball isn't necessary, as current games are CPU dependent and there is nothing to suggest that requirements will change in the future.

3. Saying a post is filled with inaccuracies and not even being able to point them out and correct them is lame.

Being able to type out responses to 20 sentence fragments and incomplete thoughts and caring enough to do so are two different things; it's not how I care to spend a rare day off (thanks to the snow :)). You should've seen that from the post in the video discussion where you just started spamming quotes from various releases. Info without insight = worthless.

Chiz
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: Nemesis77
Originally posted by: NOX
Yeah, but just because two entities all of a sudden want to take the stance that a lot of people have been taking for years now, doesn't change a thing.

Taking a stand... So I guess HardOCP and NV used 01 against their wills, then? I find it too big of a coincidence that NV happily supported 01, but now that 03 supports features that they themselves have chosen not to support, it's suddenly a sucky benchmark. NV is just annoyed because 03 clearly shows that Ati's lineup is a lot more future-proof than NV's is. That Ati has DX9, whereas NV is still busy pushing DX7!

I applaud the fact that Futuremark designed a benchmark that takes advantage of the latest tech, instead of designing their benchmark around NV hardware.
You totally missed my point.

I'm not just talking about Nvidia or HardOCP. My point is this has been discussed over and over and over again on these forums. People, knowledgeable people for that matter, have stated over and over again that 3dmark0x is not a suitable benchmark to measure REAL WORLD PERFORMANCE.

Just take a look at Anand's last video card review; notice he did not use 3DMark, and I believe he didn't use it in the review before that, and the one before that too.
 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
Being able to type out responses to 20 sentence fragments and incomplete thoughts and caring enough to do so are two different things; it's not how I care to spend a rare day off (thanks to the snow :)). You should've seen that from the post in the video discussion where you just started spamming quotes from various releases. Info without insight = worthless.


haha, you just summed up your problem: too much insight with nothing based on real info. :p
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Spicedaddy
Being able to type out responses to 20 sentence fragments and incomplete thoughts and caring enough to do so are two different things; it's not how I care to spend a rare day off (thanks to the snow :)). You should've seen that from the post in the video discussion where you just started spamming quotes from various releases. Info without insight = worthless.


haha, you just summed up your problem: too much insight with nothing based on real info. :p

Real info? I'm basing my insights on REAL GAMING performance, both current and the glimpses Carmack has provided to us. What's your backup again? Oh yeah, the future benefits of 3DMark2K3. Go figure.


Chiz
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
I have always viewed 3DMark as eye candy only. It has some useful applications: if someone tells me their system is running slow, I tell them to run 3DMark2001 and compare their score against others with similar configurations. I don't put much value, though, in the score 3DMark2001 gives. ATI and NVIDIA both cheat on this benchmark because there are so many people who put so much value in it. I wish 3DMark2003 would consist of benchmarks from popular games and not their own demos. And make it so that you cannot cheat.
 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
Originally posted by: chizow
Originally posted by: Spicedaddy
Being able to type out responses to 20 sentence fragments and incomplete thoughts and caring enough to do so are two different things; it's not how I care to spend a rare day off (thanks to the snow :)). You should've seen that from the post in the video discussion where you just started spamming quotes from various releases. Info without insight = worthless.


haha, you just summed up your problem: too much insight with nothing based on real info. :p

Real info? I'm basing my insights on REAL GAMING performance, both current and the glimpses Carmack has provided to us. What's your backup again? Oh yeah, the future benefits of 3DMark2K3. Go figure.


Chiz

A few quotes:

For those saying it's a benchmark designed to test future compatibility of games, I'm sure you know that the actual DX 9 capabilities are limited to 2 of 10 tests in GT 4, yet that test carries the heaviest weight.

Proof? (hint: it doesn't...) --> 3DMark score = (GT1fps * 7.3) + (GT2fps * 37) + (GT3fps * 47.1) + (GT4fps * 38.7)


That's the issue here, FutureMarketing and ATI obviously collaborated on the project to make it a test of DX 8.1 or better cards (they say as much in their responses to nVidia's remarks).

ATI isn't the only BETA partner. Here's a list: AMD, ATI, Intel, Microsoft, Creative, Matrox, S3Graphics, SiS, ALi, Dell, CNET, Gateway, Imagination Technologies/PowerVR, InnoVISION Multimedia, Trident Microsystems. So basically, all these companies grouped together and decided to give nVidia the shaft?


In fact Carmack has said numerous times that a GF3 requiring multiple passes still outperforms an R200 that only requires a single pass. I still feel that 3DMark2k3 is either optimized to render using PS 1.4, or it arbitrarily assigns a penalty to non-PS 1.4 parts.

Proof? (hint: he was talking about GF4 (not GF3) being faster even though it had to do multiple passes... The GF3 bit is about higher precision rendering.)


I find it funny when people use the "inferior tech" argument; I'm sure you're one of those people who rush out and buy a card b/c of some great feature, only to realize there isn't a single game within 6 months to a year that will implement it. When it finally shows up in a game, the card is obsolete and the next generation of cards is already upon us.

Don't you own a 9700 Pro? ;)


They might as well slap a "Built By ATI" sticker on it and call it "The ATI Video Card Benchmark". It's certainly not "The Gamer's Benchmark", and does nothing to mirror real-world gaming performance, in future or current games.

Seriously, how much nVidia stock do you own? :p
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: Spicedaddy
Seriously, how much nVidia stock do you own? :p
You see, that's the problem: people will always turn it into a product X vs. product XX debate. Why can't some people understand that's not what this is all about? This is something that has been stated over and over! 3dmark0x does not represent REAL WORLD PERFORMANCE, not by any stretch of the human imagination!
Don't you own a 9700 Pro?
Then why would you ask if he owns nvidia stock? Clearly he is not blinded by product loyalty.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
For those saying it's a benchmark designed to test future compatibility of games, I'm sure you know that the actual DX 9 capabilities are limited to 2 of 10 tests in GT 4, yet that test carries the heaviest weight.

Proof? (hint: it doesn't...) --> 3DMark score = (GT1fps * 7.3) + (GT2fps * 37) + (GT3fps * 47.1) + (GT4fps * 38.7)

Judging by the fact you highlighted 2 random numbers, it doesn't appear you know why those multipliers were chosen. The multipliers attempt to normalize the framerates so each test constitutes about 25% of the total score. The third test is not being given higher priority than test 4 because it was given a higher multiplier. Test 4 is not weighted any more than the other 3, but when the card gets a fat "0" for the score because it doesn't support one feature, it does tend to put it in a hole it isn't going to climb out of in the other tests.
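To make the normalization concrete, here is a quick sketch using the weights quoted above; the fps figures are invented purely for illustration.

```python
# Weights are the ones quoted in this thread; the fps numbers are made up.
WEIGHTS = {"GT1": 7.3, "GT2": 37.0, "GT3": 47.1, "GT4": 38.7}

def total_score(fps_by_test: dict) -> float:
    return sum(w * fps_by_test.get(test, 0.0) for test, w in WEIGHTS.items())

dx9_card = {"GT1": 150.0, "GT2": 30.0, "GT3": 25.0, "GT4": 28.0}
dx8_card = {"GT1": 160.0, "GT2": 33.0, "GT3": 27.0, "GT4": 0.0}  # no PS2.0

print(total_score(dx9_card))  # ~4466: each test lands near 25% of the total
print(total_score(dx8_card))  # ~3661: the GT4 zero digs a hole the other
                              # tests can't climb out of
```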

For some reason the 9500/9700 series ATi cards kill the competition in tests 2 and 3 as well, which makes one wonder how that can be when DirectX 8 games are not future tech; they are out now, with completely different results. 3DMark2001SE is a DirectX 8.1a benchmark; why doesn't it give the same odd results that 3DMark 2003 does in games 2 and 3? Which benchmark is the better example of real game performance?

Aquanox

"Aquanox makes heavy use of pixel and vertex shaders in the DirectX 8 standard."

Again, similar situation, much different results. You want to claim that 2003 is for future games, but even the DirectX 8 benchmarks give bogus results when compared to current games that use the same features.

They might as well slap a "Built By ATI" sticker on it and call it "The ATI Video Card Benchmark". It's certainly not "The Gamer's Benchmark", and does nothing to mirror real-world gaming performance, in future or current games.

Seriously, how much nVidia stock do you own?

This has nothing to do with video card manufacturers; I don't know why you keep dragging the discussion back to that.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Spicedaddy
Proof? (hint: it doesn't...) --> 3DMark score = (GT1fps * 7.3) + (GT2fps * 37) + (GT3fps * 47.1) + (GT4fps * 38.7)

Considering 5% of the test requirements equate to 30% of the score, yes, it's still given the highest weight. 0fps * 38.7 = 0

ATI isn't the only BETA partner. Here's a list: AMD, ATI, Intel, Microsoft, Creative, Matrox, S3Graphics, SiS, ALi, Dell, CNET, Gateway, Imagination Technologies/PowerVR, InnoVISION Multimedia Trident Microsystems. So basically, all these companies grouped and decided to give nVidia the shaft?

That list is meaningless, you might as well look at the advertising decals on a race car. There's a ton of companies represented, but there's only 1 force driving/financing the vehicle. It's the one that paid the most money to have their logo emblazoned on the hood of the car.

Proof? (hint: he was talking about GF4 (not GF3) being faster even though it had to do multiple passes... The GF3 bit is about higher precision rendering.)

That's proof of nothing other than your ability to point out a quote I originally brought to your attention. Now that you've "corrected" me, why don't you comment on how a GF4 requiring multiple passes outperforms an R200, since you seem to think PS 1.4's efficiency over PS 1.1 gives validity to 3DMark2K3's claim that lower scores are a result of having to render using PS 1.1?

Don't you own a 9700 Pro? ;)

Yes, I do. Does that mean I can't comment on how a benchmark, or any product in general, is FUBAR? I guess it's black or white, chocolate or vanilla for some people.

Even before nVidia's comments were released, I noticed that CPU and system specs had almost no influence on the total scoring, as did many others who could see beyond the typical nVidia vs. ATI rhetoric.

Seriously, how much nVidia stock do you own?

What does my portfolio have to do with the comments I make here? I don't make financial decisions based on personal preferences or the opinions I read here; if anything, I'd invest against the popular opinion here. AT is a cross-section of the enthusiast market and has little effect on the financial health of any company in my portfolio. By the same token, I don't make purchasing decisions based on brand loyalty or preference either. Maybe that's your problem, as it seems you find it difficult to objectively evaluate one product simply because it casts another one of your products in a favorable light.

Chiz
 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
Judging by the fact you highlighted 2 random numbers, it doesn't appear you know why those multipliers were chosen. The multipliers attempt to normalize the framerates so each test constitutes about 25% of the total score. The third test is not being given higher priority than test 4 because it was given a higher multiplier. Test 4 is not weighted any more than the other 3, but when the card gets a fat "0" for the score because it doesn't support one feature, it does tend to put it in a hole it isn't going to climb out of in the other tests.

Exactly my point... If it's there to normalize each test so that they approx. account for 25% each, then the DX9 test (GT4) isn't more important than the other 3 tests, which Chizow implied. (Exact words: "I'm sure you know that the actual DX 9 capabilities are limited to 2 of 10 tests in GT 4, yet that test carries the heaviest weight.") If the card gets a fat 0, then it wasn't meant for 3DMark03.


That's proof of nothing other than your ability to point out a quote I originally brought to your attention. Now that you've "corrected" me, why don't you comment on how a GF4 requiring multiple passes outperforms an R200, since you seem to think PS 1.4's efficiency over PS 1.1 gives validity to 3DMark2K3's claim that lower scores are a result of having to render using PS 1.1?

For the same reason a 9700 forced to run PS 1.1 is still faster than an 8500... PS speed doesn't translate to game speed directly, and I never said it did.


That list is meaningless, you might as well look at the advertising decals on a race car. There's a ton of companies represented, but there's only 1 force driving/financing the vehicle. It's the one that paid the most money to have their logo emblazoned on the hood of the car.
and
What does my portfolio have to do with the comments I make here? I don't make financial decisions based on personal preferences or the opinions I read here; if anything, I'd invest against the popular opinion here. AT is a cross-section of the enthusiast market (blablablabla...)

I just thought it was funny that you'd attack someone about buying the latest tech when that's what you seem to do. :) And I really couldn't care less about your portfolio; it's just weird that you try to blame this on a huge conspiracy against nVidia, with ATI somehow being the only one financing FutureMark, etc. Where's your proof of that "insight"?