Originally posted by: Matthias99
The point is that Futuremark has asked the GPU manufacturers not to use application detection in 3DMark03.
This is their call, and NVIDIA should not be trying to sidestep their restrictions. NVIDIA is refusing to go along because they believe the restriction is unrealistic (although ATI seems to have no problem with it!). Whether it is a good idea or not is an entirely separate issue from whether NVIDIA should follow the rules that have been established.
First NVIDIA claimed that 3DMark03 was a bad benchmark because it didn't use realistic enough code (and then got caught "cheating" at it after disparaging it as inaccurate and meaningless). Now their problem is that FM won't let them hand-optimize for it, even in ways that could be used for real programs. I also notice that since a few real DX9.0 games have come out, they've stopped trashing its numbers as unrealistic.
ATI put out a pretty amusing statement a few weeks back about the percentage of games that actually get hand-optimized shaders and the like written for them by the driver team. Basically, only the most popular games and benchmarks will get worked on by NVIDIA and ATI. FM wants 3DMark03 to be representative of the raw hardware/baseline driver performance of the cards, not of how much time the driver team spent working on custom shaders for it. If you don't think that's a good idea, or that it's representative of "real" performance from the cards, then just disregard all 3DMark03 results.
But it does... ask anybody with a GeForce FX... the 52.16 drivers provide better performance than the 45.xx drivers in every game.
Then shouldn't they provide better results in 3DMark03, even without app-specific optimizations? If not, then all NVIDIA has been doing for the last six months is writing application-specific enhancements rather than making their drivers faster in general.
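For anyone wondering what "application detection" actually looks like in practice, here's a rough, purely hypothetical sketch (in Python, just for illustration; real drivers are native code, and none of these executable or shader names come from NVIDIA or ATI). The idea is a shim that checks the name of the running executable and silently swaps in a hand-tuned shader for a known title:

import os
import sys

# Hypothetical per-application replacement table; the executable names and
# shader identifiers are invented for illustration only.
HAND_TUNED_SHADERS = {
    "3dmark03.exe": "ps_2_0_reordered_variant",
    "popular_game.exe": "ps_2_0_reduced_precision_variant",
}

def select_shader(submitted_shader):
    """Return the shader that actually gets compiled and run.

    If the host executable matches a known title, the shader the application
    submitted is silently replaced with a hand-tuned one; otherwise it is used
    as-is. This kind of per-application substitution is what Futuremark's
    guidelines prohibit for 3DMark03.
    """
    exe_name = os.path.basename(sys.argv[0]).lower()
    return HAND_TUNED_SHADERS.get(exe_name, submitted_shader)

if __name__ == "__main__":
    # The application always submits the same generic shader; what runs
    # depends only on which executable the detection logic sees.
    print(select_shader("generic_dx9_pixel_shader"))

The point of FM's rule is that a score produced this way measures how much hand-tuning the driver team did for that one executable, not the baseline performance every application gets.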