Pidge:
ATI is well known to include special catches in their drivers which search for benchmarks and set up special tweaks for that particular benchmark.
That's hardly the same thing as having defective DXT1 hardware, forcing 32-bit textures down to 16 bits when compressing them, and forcing a 16-bit Z-buffer.
Also, does ATi's/3dfx's image quality take a hit like nVidia's when those tweaks are applied? I think not. And if you don't like ATi's/3dfx's tweaks, you can turn them off.
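For anyone wondering why that matters so much for image quality, here's a rough sketch of the precision loss. It's my own illustration, not anyone's actual decoder: DXT1 stores two RGB565 endpoints per 4x4 block plus two blended colours in between, and if the decode is kept at 16-bit precision instead of being expanded to 32-bit, those in-between colours collapse back onto the endpoints. That's exactly the banding you see in smooth gradients like the Quake 3 sky.

```c
/* Rough sketch (my own illustration) of why a 16-bit DXT1 decode bands. */
#include <stdio.h>

/* Expand a 5-bit channel to 8 bits (standard bit replication). */
static int c5to8(int c) { return (c << 3) | (c >> 2); }

/* Round an 8-bit channel back down to 5 bits, which is what a 16-bit
 * internal format effectively does to the red/blue channels. */
static int c8to5(int c) { return (c * 31 + 127) / 255; }

int main(void)
{
    /* Two neighbouring red endpoint values, as in a sky gradient. */
    int r0 = 20, r1 = 21;

    /* Weights for the four DXT1 colour slots:
     * c0, c1, 2/3*c0 + 1/3*c1, 1/3*c0 + 2/3*c1. */
    int w0[4] = {3, 0, 2, 1}, w1[4] = {0, 3, 1, 2};

    for (int i = 0; i < 4; i++) {
        /* 32-bit path: expand the endpoints to 8 bits, then blend. */
        int full = (w0[i] * c5to8(r0) + w1[i] * c5to8(r1)) / 3;

        /* 16-bit path: the blended result gets squeezed back into 5 bits,
         * so the two intermediate colours snap onto the endpoints. */
        int truncated = c5to8(c8to5(full));

        printf("slot %d: 32-bit decode = %3d, 16-bit decode = %3d\n",
               i, full, truncated);
    }
    return 0;
}
```

With a 32-bit decode you get four distinct steps (165, 167, 170, 173); kept at 16 bits, the two blends snap back to the endpoints and you get 165, 165, 173, 173: a visible band instead of a gradient.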
ATI's drivers also force a 16-bit Z-buffer, which greatly helps with benchmarks.
I doubt that *very* much, especially since there are settings in their drivers in both OpenGL and Direct3D that change the Z-buffer settings. And this link clearly shows that the default settings do not force a 16-bit Z-buffer.
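And just to put numbers on what forcing a 16-bit Z-buffer would actually cost, since that's the accusation being thrown around: here's a quick back-of-the-envelope sketch using the standard perspective depth mapping. The near/far plane values are made-up examples, not taken from any particular game.

```c
/* Back-of-the-envelope sketch of what a forced 16-bit Z-buffer costs in
 * precision, using the standard perspective depth mapping.  Near/far are
 * example values only. */
#include <stdio.h>
#include <math.h>

/* Smallest depth difference (in world units) that a Z-buffer with the
 * given number of bits can still resolve at eye distance z, for a
 * perspective projection with planes [znear, zfar]. */
static double z_resolution(int bits, double znear, double zfar, double z)
{
    double steps = pow(2.0, bits) - 1.0;
    /* One buffer step back-projected to eye space at distance z. */
    return z * z * (zfar - znear) / (zfar * znear * steps);
}

int main(void)
{
    double znear = 1.0, zfar = 5000.0;

    for (int bits = 16; bits <= 24; bits += 8) {
        printf("%d-bit Z, near=%.0f far=%.0f:\n", bits, znear, zfar);
        for (double z = 100.0; z <= 4000.0; z *= 4.0)
            printf("  at distance %6.0f: smallest resolvable gap ~ %.2f units\n",
                   z, z_resolution(bits, znear, zfar, z));
    }
    return 0;
}
```

With those example planes, a 16-bit buffer can only separate surfaces about 39 units apart at a distance of 1600, versus roughly 0.15 units for a 24-bit buffer. That's the trade: forcing 16-bit Z buys framerate through bandwidth, and pays for it with Z-fighting in the distance.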
You can see that ATi even have a setting to force 32-bit textures into 16-bit textures, which just happens to be what nVidia are already doing with their DXT1 compression scheme.
How do you disable nVidia's forced 16-bit Z-buffer? How do you disable nVidia's forced 16-bit textures in DXT1? Registry hacks, probably, because in these Direct3D and OpenGL screenshots I don't see anything to change any of those tweaks. In fact, nVidia don't even tell you that they're tweaking something.
DaveB3D was right to tweak the V5 in his reviews. nVidia are quite clearly doing so already without telling anyone, and you have no way to turn their tweaks off if you want to (unless you know how to hack the registry). And Dave's tweaks have no effect on image quality, unlike nVidia's.
Don't give me this surprised look as if NVIDIA is the only one who tweaks their drivers.
This is "tweaking" to the point where you have unacceptable image quality, and you don't even have a choice not to use the tweaks. All nVidia care about is getting the highest towers in the framerate bar graphs. And bizarrely not one of the reviewers seem to care about the sub-par image quality. IMO nVidia's boards should all be benchmarked with S3TC turned
off and compared to everyone else with S3TC turned
on.
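It's not hard to see why leaving S3TC on inflates the scores so much, by the way. Rough numbers only (raw texture storage, ignoring mipmaps and caching; the texture size is just an example): DXT1 packs a 4x4 block of texels into 8 bytes, i.e. 4 bits per texel, versus 32 bits per texel uncompressed.

```c
/* Rough arithmetic on why S3TC/DXT1 helps benchmark numbers: it cuts
 * texture size (and hence memory bandwidth) 8:1 versus uncompressed
 * 32-bit textures.  Example texture size; mipmaps/caching ignored. */
#include <stdio.h>

int main(void)
{
    int w = 1024, h = 1024;                  /* example texture size */
    long uncompressed = (long)w * h * 4;     /* 32 bits = 4 bytes per texel */
    long dxt1 = (long)(w / 4) * (h / 4) * 8; /* 8 bytes per 4x4 block */

    printf("1024x1024, 32-bit: %ld KB\n", uncompressed / 1024);
    printf("1024x1024, DXT1:   %ld KB\n", dxt1 / 1024);
    printf("ratio: %ld:1\n", uncompressed / dxt1);
    return 0;
}
```

That 8:1 saving in texture traffic is the whole reason the tweak shows up in the bar graphs; the question is whether the quality hit above is an acceptable way to get it.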
A good quote from the article:
Thanks to his pioneering investigative work, we finally know the real reason behind this problem that nVidia and id were silent about.
That means no more pointing the finger at Carmack. It's a fault in nVidia's hardware.