Jeff, the thread starter is safe enough, but then you veer into questionable territory before my first post:
So it looks as though Futuremark is handicapping nVidia in such a way that it DOES NOT reflect real world performance. It may reflect performance in a few select games, but look at Homeworld 2... nVidia is WAY ahead of ATI there... so you could say 3DMark 2003 isn't accurate because nVidia should be scoring at least 20% higher than ATI because it does in Homeworld 2.
I don't see how even the ATI fanboys can argue against this...
Even if nVidia's drivers are optimized for 3DMark, what difference does it make? It will be optimized for future games, so shouldn't the benchmark be an indication of future games?
1. It really does reflect real-world performance. In unoptimized DX9 games, nVidia often runs at half ATi's speed (AM3, Halo, HL2).
2. The point is, it's neither wise nor feasible to hand-optimize every game after release. It doesn't help consumers who buy the game at launch, either.
3. Most of the games you listed aren't DX8 or DX9, so what relevance do they have to a DX8/9 benchmark? I don't think Homeworld 2 will stress a video card with DX8 shaders. Heck, I doubt the game even has DX8 shaders.
4. Here's the deal: 3DM03 is a straight-up DX8/9 stress test. It's not an indication of how games will play *now*, but rather a rough guide to how your card will perform with future games of an equivalent D3D level. It's an indication of how your card will perform compared to other cards with games coded to the D3D API.
5. Agreed, the FX architecture seems handicapped compared to ATi in shader-heavy games. OTOH, it excels at older games. Each company placed its priorities (and focused its engineering talent and time) in slightly different directions. Don't blame 3DM for exposing the hardware's strengths and weaknesses. If you don't want to test for unoptimized D3D performance, don't use 3DM.
6. Again with the fanboy nonsense. Can we stop slinging mud, especially preemptively?
7. Says who? What guarantee do you have that nV will spend time optimizing for every game you play? And what happens if your favorite game isn't a high-profile title like Doom 3 or Half-Life 2 or Halo (which are guaranteed time with nV because of their guaranteed high sales)?
8. 3DM is useful as a peek into unoptimized D3D performance, nothing else. It won't tell you how fast your card will run Game X in absolute terms, but it can serve as a rough predictor of performance relative to other cards.
You veer further into ignorance (not meant as an ad-hom, just an observation of fact, with no malicious intent):
Yes... it SHOULD... but FM has a bug in their ass for some reason. I got better performance in every game I own by switching from 45.xx to 52.xx... but 3DMark2003 scores don't reflect that when FM creates a patch to disable nVidia's "application specific optimizations."
Solution to the problem... Industry Standard Benchmark out... game benchmarks in.
If FM wants to stay in business, they should get a crapload of demos of the popular games, and measure performance with those instead of running "games" that nobody will ever play.
1. 3DM03 was available far ahead of any DX8/9 game.
2. FM isn't the one with the "bug in their ass." They're simply following their own rules in enforcing their vision of 3DM's purpose. (Besides, nV didn't complain about 3DM01, did they?)
3. Testing only games is a great idea, as it removes the need to think about what 3DM represents. But it's a short-sighted idea. We had DX9 cards for months before we had a single way of testing the performance of their big new feature, and almost literally a year before we had games that exploited fancy shaders. So what do you do during that year? Hope your card will do well at the bleeding edge down the road, or use the tools available to make an educated guess as to how it'll perform?
I'll stop here, as I don't feel like going into whether nV or MS is to blame for nV's poor initial DX9 performance. We've covered this so many times in so many threads that this'll be my last half-hearted attempt at communicating 3DM's value and the total lack of value to gamers in nV's actions. AT has a working search function, and you can use it to mull over previous threads.
If you don't think I know what I'm talking about and think Gabe Newell is a paid shill, maybe you'd give John Carmack's opinions more weight?
His .plan file from November 2001 on optimizations is eerily prescient. Here's a taste, but you really should read it all:
Attempt to guess the application from app names, window strings, etc. Drivers are sometimes forced to do this to work around bugs in established software, and occasionally they will try to use this as a cue for certain optimizations.

My positions:

Making any automatic optimization based on a benchmark name is wrong. It subverts the purpose of benchmarking, which is to gauge how a similar class of applications will perform on a tested configuration, not just how the single application chosen as representative performs.

It is never acceptable to have the driver automatically make a conformance tradeoff, even if they are positive that it won't make any difference. The reason is that applications evolve, and there is no guarantee that a future release won't have different assumptions, causing the upgrade to misbehave. We have seen this in practice with Quake3 and derivatives, where vendors assumed something about what may or may not be enabled during a compiled vertex array call. Most of these are just mistakes, or, occasionally, laziness.
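To make the "guess the application from app names" part concrete, here's a rough sketch of what such a check could look like. This is purely my own hypothetical illustration (the executable name, flag, and function names are made up), not code from any actual driver:

```c
/* Hypothetical illustration only -- not code from any real driver.
 * Shows the kind of app-name sniffing Carmack describes: the driver
 * inspects the host executable's name and flips on special-case
 * behavior when it recognizes a benchmark. */
#include <windows.h>
#include <string.h>
#include <stdbool.h>

/* Assumed flag consumed elsewhere in this hypothetical driver. */
static bool g_useBenchmarkShortcuts = false;

void DetectHostApplication(void)
{
    char exePath[MAX_PATH] = {0};

    /* Win32 call: full path of the executable for the current process. */
    GetModuleFileNameA(NULL, exePath, sizeof(exePath));

    /* If the host process looks like the benchmark, enable shortcuts that
     * only ever fire for this one executable -- exactly the practice
     * Carmack calls out as subverting the point of benchmarking. */
    if (strstr(exePath, "3DMark03") != NULL) {
        g_useBenchmarkShortcuts = true;
    }
}
```

The problem is plain from the sketch: whatever those shortcuts do, they only kick in for the one executable being measured, so the score stops telling you anything about the broader class of applications the benchmark is supposed to represent.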
Sounds a lot like what Gabe Newell said at Shader Days, doesn't it? So are they both wrong, or does nVidia know best?
Insomniak, you should get ta readin'.

I didn't arrive at my understanding of the situation solely by talking to or listening to forum posters, but by reading as much about the issue as I could. If you're not so inclined, or want a shortcut or a summary of the main issues surrounding 3DM03, I suggest starting with Beyond3D's great coverage of the issue (both in its articles and its forums).