Forceware 53.03 gives FM the finger

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Jeff7181
Set everyone straight then, oh great one.
Try reading one of the dozen or so other threads on the subject of 3DM03 and the FX series. Those of us who spend the time to understand and explain the issues shouldn't have to repeat ourselves every single time someone resurrects an old issue in a new thread. Being a regular, you should've read the other threads by now. If you had, you'd understand why I said what I did in reference to a lot of this thread up to my post.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Pete
Originally posted by: Jeff7181
Set everyone straight then, oh great one.
Try reading one of the dozen or so other threads on the subject of 3DM03 and the FX series. Those of us who spend the time to understand and explain the issues shouldn't have to repeat ourselves every single time someone resurrects an old issue in a new thread. Being a regular, you should've read the other threads by now. If you had, you'd understand why I said what I did in reference to a lot of this thread up to my post.

Your comment better fits the discussion that went on after you posted... when I started the thread, it was to inform everyone that nVidia has once again bypassed FM's "anti-cheat" patch.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Jeff, your opening post is safe enough, but then you veer into questionable territory before my first post:

So it looks as though Futuremark is handicapping nVidia in such a way that it DOES NOT reflect real world performance. It may reflect performance in a few select games, but look at Homeworld 2... nVidia is WAY ahead of ATI there... so you could say 3DMark 2003 isn't accurate because nVidia should be scoring at least 20% higher than ATI because it does in Homeworld 2.

I don't see how even the ATI fanboys can argue against this...

Even if nVidia's drivers are optimized for 3DMark, what difference does it make? They will be optimized for future games, so shouldn't the benchmark be an indication of future games?

1. It really does reflect real-world performance. In unoptimized DX9 games, nVidia is often twice as slow as ATi (AM3, Halo, HL2).
2. The point is, it's neither a wise nor feasible strategy to hand-optimize every game after release. It doesn't help consumers who buy the game at release, either.
3. Most of the games you listed aren't DX8 or DX9, so what relevance do they have to a DX8/9 benchmark? I don't think Homeworld 2 will stress a video card with DX8 shaders. Heck, I doubt the game even has DX8 shaders.
4. Here's the deal: 3DM03 is a straight-up DX8/9 stress test. It's not an indication of how games will play *now*, but rather a rough guide to how your card will perform with future games of an equivalent D3D level. It's an indication of how your card will perform compared to other cards with games coded to the D3D API.
5. Agreed, the FX architecture seems handicapped compared to ATi in terms of shader-heavy games. OTOH, it excels at older games. Each company placed priorities (and had the engineering talent/time to focus) on slightly different directions. Don't blame 3DM for exposing a hardware's strengths and weaknesses. If you don't want to test for unoptimized D3D performance, don't use 3DM.
6. Again with the fanboy nonsense. Can we stop slinging mud, especially preemptively?
7. Says who? What guarantees you that nV will spend time optimizing for every game you play? And what happens if your favorite game isn't a high-profile title like Doom 3 or Half-Life 2 or Halo (which are guaranteed time with nV b/c of their guaranteed high sales)?
8. 3DM is useful as a peek into unoptimised D3D performance, nothing else. It won't tell you how fast your card will run Game X in absolute terms, but it can offer a rough predictor of performance compared to other cards.


You veer further into ignorance (not meant as an ad-hom, but as an observation of fact bearing no malicious intent):

Yes... it SHOULD... but FM has a bug in their ass for some reason. I got better performance in every game I own by switching from 45.xx to 52.xx... but 3DMark2003 scores don't reflect that when FM creates a patch to disable nVidia's "application specific optimizations."

Solution to the problem... Industry Standard Benchmark out... game benchmarks in.
If FM wants to stay in business, they should get a crapload of demos of the popular games, and measure performance with those instead of running "games" that nobody will ever play.

1. 3DM03 is available far ahead of any other DX8/9 game.
2. FM isn't the one with the "bug in their ass." They're simply following their own rules in enforcing their vision of 3DM's purpose. (Besides, nV didn't complain about 3DM01, did they?)
3. Testing only games is a great idea, as it removes having to think about what 3DM represents. But it's a short-sighted idea. We had DX9 cards for months before we had a single way of testing the performance of their big new feature, and almost literally a year before we had games that exploited fancy shaders. So what do you do a year earlier? Hope your card will do well at the bleeding edge in a year, or use the tools available to make an educated guess as to how it'll perform?

I'll stop here, as I don't feel like going into whether nV or MS are to blame for nV's poor initial DX9 performance. We've covered this so many times in so many threads, this'll be my last half-hearted attempt at communicating 3DM's value and the total lack of value to gamers in nV's actions. AT has a working search function, and you can use it to mull over previous threads.

If you don't think I know what I'm talking about and think Gabe Newell is a paid shill, maybe you'd give John Carmack's opinions more weight? His .plan file from November 2001 on optimizations is eerily prescient. Here's a taste, but you really should read it all:

Attempt to guess the application from app names, window strings, etc. Drivers
are sometimes forced to do this to work around bugs in established software,
and occasionally they will try to use this as a cue for certain optimizations.

My positions:

Making any automatic optimization based on a benchmark name is wrong. It
subverts the purpose of benchmarking, which is to gauge how a similar class of
applications will perform on a tested configuration, not just how the single
application chosen as representative performs.

It is never acceptable to have the driver automatically make a conformance
tradeoff, even if they are positive that it won't make any difference. The
reason is that applications evolve, and there is no guarantee that a future
release won't have different assumptions, causing the upgrade to misbehave.
We have seen this in practice with Quake3 and derivatives, where vendors
assumed something about what may or may not be enabled during a compiled
vertex array call. Most of these are just mistakes, or, occasionally,
laziness.

Sounds a lot like what Gabe Newell said at Shader Days, doesn't it? So are they both wrong, or does nVidia know best?
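
To make Carmack's first point concrete, here's a rough sketch (purely illustrative; the titles, profile names, and function are invented, and this is not any vendor's actual code) of how a driver could key its behavior off the running executable's name:

/* Illustrative sketch only: how a driver *could* pick an "optimization
   profile" by matching the host application's executable name, the kind
   of detection Carmack describes above. All names here are made up. */
#include <stdio.h>
#include <string.h>

typedef enum { PROFILE_DEFAULT, PROFILE_BENCHMARK_X, PROFILE_GAME_Y } profile_t;

static profile_t pick_profile(const char *exe_name)
{
    /* Simple substring checks against a hard-coded list of known titles. */
    if (strstr(exe_name, "3DMark") != NULL)
        return PROFILE_BENCHMARK_X;  /* e.g. swap in hand-tuned shaders */
    if (strstr(exe_name, "quake3") != NULL)
        return PROFILE_GAME_Y;       /* e.g. tweak a conformance detail */
    return PROFILE_DEFAULT;          /* everything else: generic path   */
}

int main(int argc, char **argv)
{
    const char *exe = (argc > 1) ? argv[1] : "unknown.exe";
    printf("profile for %s: %d\n", exe, (int)pick_profile(exe));
    return 0;
}

Nothing stops a matched profile from changing the workload or the output, which is exactly why Carmack calls benchmark-name detection a subversion of benchmarking.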

Insomniak, you should get ta readin'. :p I didn't arrive at my understanding of the situation by talking or listening solely to forum posters, but by reading as much about the issue as I could. If you're not so inclined, or want a shortcut or summation of the main issues surrounding 3DM03, I suggest starting with Beyond3D's great coverage of the issue (both in its articles and its forums).
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
1. It really does reflect real-world performance. In unoptimized DX9 games, nVidia is often twice as slow as ATi (AM3, Halo, HL2).
In all the tests I've seen, Valve has forced the reviewer to use 45.23 drivers. So in reality, you don't know how nVidia cards will perform in a game that is still at least 4 months away from store shelves. (the same applies to Doom 3 and ATI)
Where do you see a Halo benchmark where the GFFX is twice as slow as an ATI card?
Where do you see an Aquamark 3 benchmark where the GFFX is twice as slow as an ATI card?
(the reviews on AnandTech's site look like the gap is never greater than 10% in those two games)
2. The point is, it's neither a wise nor feasible strategy to hand-optimize every game after release. It doesn't help consumers who buy the game at release, either.
By the time HL2 and Doom3 are out, which are the two major titles that will make use of DX9 shaders, we won't be using the video cards we have right now... at least, hardcore gamers won't be.
3. Most of the games you listed aren't DX8 or DX9, so what relevance do they have to a DX8/9 benchmark? I don't think Homeworld 2 will stress a video card with DX8 shaders. Heck, I doubt the game even has DX8 shaders.
What relevance do games that are incomplete, unoptimized, and not even ready for beta testing have?
4. Here's the deal: 3DM03 is a straight-up DX8/9 stress test. It's not an indication of how games will play *now*, but rather a rough guide to how your card will perform with future games of an equivalent D3D level. It's an indication of how your card will perform compared to other cards with games coded to the D3D API.
Agreed. But it's also an industry standard benchmark that OEMs and less informed consumers use to make hardware purchases... and for someone buying hardware right now to play games they can get their hands on right now, 3DMark2003 is NOT indicative of how those cards will perform... the FX5900 Ultra is not 20% slower in games currently on the market than the 9800 Pro.
5. Agreed, the FX architecture seems handicapped compared to ATi in terms of shader-heavy games. OTOH, it excels at older games. Each company placed priorities (and had the engineering talent/time to focus) on slightly different directions. Don't blame 3DM for exposing a hardware's strengths and weaknesses. If you don't want to test for unoptimized D3D performance, don't use 3DM.
The FX GPU is handicapped compared to ATI's in terms of DX9 shaders. Why would you test for unoptimized performance when in the real world, those optimizations WILL make a performance difference? That's like disabling HT on P4s for benchmarks because it's an optimization that the opposition doesn't use or doesn't need. The fact is that it's there, it works, and it should be used.
6. Again with the fanboy nonsense. Can we stop slinging mud, especially preemptively?
Ok
7. Says who? What guarantees you that nV will spend time optimizing for every game you play? And what happens if your favorite game isn't a high-profile title like Doom 3 or Half-Life 2 or Halo (which are guaranteed time with nV b/c of their guaranteed high sales)?
I didn't say they have to optimize for every game... the 52.xx Detonators are proof of that... they increased performance in EVERYTHING... not just DX9 tests, and not just 3DMark.
8. 3DM is useful as a peek into unoptimised D3D performance, nothing else. It won't tell you how fast your card will run Game X in absolute terms, but it can offer a rough predictor of performance compared to other cards.
Unfortunately the industry uses it to tell how fast cards will run Game X, Y, and Z... that being so... one would think FM would allow optimizations to be used.

As I pointed out before, nVidia and ATI cards score relatively similarly in Aquamark3 using the 52.xx drivers, with only a 1.5% difference between the two. Does that mean nVidia has an application specific optimization for Aquamark3? There seems to be no controversy over that benchmark.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
And the replies illustrate my point. nVidia got beat badly this round, and instead of taking one on the chin and going back to the drawing board with a vengeance, they chose to act like babies. By lying, and cheating. They've lost a large consumer base, one that they wouldn't have lost otherwise. Even if they produce a better card next round, I won't be buying one because of how they handled themselves. It's pathetic, and they should be embarrassed.

But go ahead and cheer for cheating. Maybe you'll get another great driver configuration that doesn't allow you to use true trilinear filtering. Because hey, speed is all that matters.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Ackmed
And the replies illustrate my point. nVidia got beat badly this round, and instead of taking one on the chin and going back to the drawing board with a vengeance, they chose to act like babies. By lying, and cheating. They've lost a large consumer base, one that they wouldn't have lost otherwise. Even if they produce a better card next round, I won't be buying one because of how they handled themselves. It's pathetic, and they should be embarrassed.

But go ahead and cheer for cheating. Maybe you'll get another great driver configuration that doesn't allow you to use true trilinear filtering. Because hey, speed is all that matters.

IIRC you couldn't force trilinear ONLY if you set AF to application control... but I could be wrong.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Ackmed: Nvidia had 56% of the standalone market in the 3rd qtr. While it was down from 65% in the 2nd qtr, they still hold the majority of the standalone market. This is the market where people purchase cards from the retail shelves.

I am interested in seeing the 4th qtr results. I want to see if Nvidia regained market share due to the low price of the 5900NU.

1. It really does reflect real-world performance. In unoptimized DX9 games, nVidia is often twice as slow as ATi (AM3, Halo, HL2).

HL2 is not released and won't be for a while, and those benchmarks used old drivers with an old engine build. Halo half as fast on the FX????? And AM3 is half as fast and is now considered a game?

3DM is useful as a peek into unoptimised D3D performance, nothing else. It won't tell you how fast your card will run Game X in absolute terms, but it can offer a rough predictor of performance compared to other cards.

While this may be true, how many game companies do you know of that don't optimize their code for a specific GPU? On top of that, from my understanding 3dmark03 is not only unoptimized, it is of a poor design to boot. So you have 2 problems with using this to gauge anything in the real world. Not a good sign...

One thing to note from Carmack's .plan: I think he is referring to cutting IQ corners for increased performance. From what it looks like, Nvidia made the shader pretty much mathematically identical to 3dmark03's shader but optimized it for the FX series of cards. Now if game companies are going back and completely rewriting shaders after the game is released, then I guess the issue is valid. But I think what John is saying is that when vendors cut IQ corners, the game dev changes things around, and all of a sudden you are rendering blue instead of red due to how you changed the output of the shader.

But maybe I am wrong on this. It wouldn't be the first time :)
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Insomniak, you should get ta readin'. :p I didn't arrive at my understanding of the situation by talking or listening solely to forum posters, but by reading as much about the issue as I could. If you're not so inclined, or want a shortcut or summation of the main issues surrounding 3DM03, I suggest starting with Beyond3D's great coverage of the issue (both in its articles and its forums).

Oh, nah, I'm well aware of what's going on with this situation - it's been beaten to death everywhere. I just like to stir sh!t up and troll.

Thanks for typing that though. I feel satisfied now :)

 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
First of all, I'm mistaken. nV was never half as fast as ATi in Halo. They were slower initially, but not close to half as fast. Sorry about that.


Jeff:

In all the tests I've seen, Valve has forced the reviewer to use 45.23 drivers. So in reality, you don't know how nVidia cards will perform in a game that is still at least 4 months away from store shelves. (the same applies to Doom 3 and ATI)
Where do you see a Halo benchmark where the GFFX is twice as slow as an ATI card?
Where do you see an Aquamark 3 benchmark where the GFFX is twice as slow as an ATI card?
(the reviews on AnandTech's site look like the gap is never greater than 10% in those two games)

I specifically mentioned unoptimized drivers, implying that it takes nV time (usually after a game's release) to work on their drivers to improve performance to competitive levels.
As I said, I was mistaken about Halo.
I believe it was THG that showed nV doubled performance in AM3 going from Det 44 to Det 52.

By the time HL2 and Doom3 are out, which are the two major titles that will make use of DX9 shaders, we won't be using the video cards we have right now... at least, hardcore gamers won't be.
And this helps gamers buying cards to last them at least a year how? (Again, D3 is not DX9--at least, not to the extent HL2 is.)

What relevance do games that are incomplete, unoptimized, not even ready to beta test have?
Greater relevance to 3DM03 and DX8/9 performance than many of the games you listed? I'm not sure of your point. 3DM is meant to test future performance, not current performance. Maybe I'm being too understanding in terms of 3DM's intended purpose. It's possible 3DM is advertised as something other than what it is, and I've ignored that. I mainly focused on how reviewers present it to their readers, though. I'm sure 3DM wouldn't be as popular as it is without reviewers having backed it en masse.

Agreed. But it's also an industry standard benchmark that OEMs and less informed consumers use to make hardware purchases... and for someone buying hardware right now to play games they can get their hands on right now, 3DMark2003 is NOT indicative of how those cards will perform... the FX5900 Ultra is not 20% slower in games currently on the market than the 9800 Pro.
Again, 3DM03 tests future performance. I didn't see any DX8/9-heavy games out at the beginning of the year. I don't see the 5800 anywhere anymore, either, so 3DM03 did have some (positive) effect on nVidia (also reflected in the 5600->5700 transition).


Genx87:

HL2 is not released and won't be for a while, and those benchmarks used old drivers with an old engine build. Halo half as fast on the FX????? And AM3 is half as fast and is now considered a game?
Again, my mistaken memory WRT Halo.
AM3 is built on the Aquanox 2 engine. It was half as fast, as I mentioned to Jeff.

While this may be true, how many game companies do you know of that don't optimize their code for a specific GPU? On top of that, from my understanding 3dmark03 is not only unoptimized, it is of a poor design to boot. So you have 2 problems with using this to gauge anything in the real world. Not a good sign...
1. How many companies do you know of that have id's or Valve's resources? Not everyone has the time and money to optimize for everything. I believe game devs would benefit from being able to code to one API without worrying about unique hardware speedups.
2. Yes, nV said a lot of disparaging things about 3DM03. (If you recall, PS1.4 use was labelled as unrealistic WRT the hardware market. Now nV encourages using it over PS2.) 3DM isn't about making things run as fast as possible, but about putting as much stress on your 3D hardware as possible.

Yes, Carmack did talk about vendors changing results on you, but he also mentioned the fragile nature of app-specific optimizations. That's what I was focusing on.
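
To illustrate what I mean by fragile, here's a purely hypothetical sketch (the shader text, names, and "replacement" are invented, not nV's actual code): a driver that recognizes one exact shader and silently swaps in a hand-tuned version stops matching, and stops being safe, the moment the application ships a slightly different shader.

/* Hypothetical illustration of app-specific shader substitution and why
   it is fragile. Shader text, names, and the replacement are invented. */
#include <stdio.h>
#include <string.h>

/* Exact shader text the driver was tuned against (its fingerprint). */
static const char *KNOWN_SHADER = "mul r0, r0, c0";

/* Hand-tuned replacement believed to be equivalent for this one app. */
static const char *TUNED_SHADER = "mul_pp r0, r0, c0"; /* partial precision */

/* The driver's "compiler": substitute only on an exact fingerprint match. */
static const char *compile_shader(const char *source)
{
    if (strcmp(source, KNOWN_SHADER) == 0)
        return TUNED_SHADER;   /* detected: run the hand-tuned version */
    return source;             /* anything else: generic path          */
}

int main(void)
{
    /* Shipped version of the app: detection fires, tuned shader is used. */
    printf("v1.0 -> %s\n", compile_shader("mul r0, r0, c0"));

    /* A later patch changes one constant register: the fingerprint no
       longer matches, the speedup silently disappears, and any assumption
       baked into the tuned shader is no longer guaranteed to hold. */
    printf("v1.1 -> %s\n", compile_shader("mul r0, r0, c1"));
    return 0;
}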
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,001
126
What I find funny is that scores in other benchmarks don't change.
Of course they don't change, and why should they? None of the other benchmarks are employing anti-cheat measures like Futuremark is.