BFG10K
Lifer
- Aug 14, 2000
- 22,709
- 3,003
- 126
> If we pile all the evidence of this together we have nothing, nothing except comments from a man we know was paid millions of dollars from ATi.

And how much did nVidia pay you to ignore everything that's been going on?
> You have ranted on about cheats in numerous other benches, show any evidence of it at all.

ROFL. Where have you been for the last six months?
Let's start with something simple: if nVidia's cheats are really bugs, then explain to me why nVidia would take the time to implement measures to stop anti-cheat programs from working. If the "bugs" have been fixed, then how could those programs still have any effect on them? Why go to so much trouble to hide something that isn't there?
> The things that supported your 'cheat' accusation against every bench nV ran are gone.

Right, so the compression has been removed, you've tried the anti-cheat programs, and you've verified that there is no difference in either performance or image quality when they're employed? If not, then unless you have access to nVidia's driver source code you simply have no evidence to make that claim. And calling Gabe a liar isn't evidence either, and it doesn't disprove his comments.
> Actually I've mentioned in multiple threads the reason I won't buy a FX is because of driver bugs. Funny that.

Right, so instead you'll stick to an inferior NV25 board and continue to shout that ATi's AF is unusable.
> Which version was it directly before 3DMark2K3 came out?

Catalyst 2.5 was the last DX 8.x driver set; Catalyst 3.0 was the first DX 9 driver set.
> Then they released their driver sets with replaced code and the performance went up considerably.

Prove it. Show me the report from FutureMark that illustrates their findings, just like the cheats they found in nVidia's boards.
You won't find it of course. The only thing you'll find is one shader that was optimised but unfortunately relied on application detection to work. For good measure ATi promised to remove it and they did.
OTOH nVidia's cheats were numerous, and when anti-cheat programs like RivaTuner were applied, performance across the board dropped to levels similar to those of earlier drivers that hadn't been "optimised" yet. nVidia's response? In addition to badgering FutureMark, they released drivers that stopped the anti-cheat programs from working.
Rather bizarre actions from a company that you claim has genuine driver bugs that have now been fixed, wouldn't you say?
> If they already had a bunch of shader replacement code in there, and they added a whole bunch more, how does their file size keep dropping?

If they changed the previous code, which I'm sure they did. Hell, I could give you a 1 MB driver and compress it, then give you a 2 MB driver that compresses even smaller than the 1 MB one did. More code, but a smaller file.
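The point about file sizes is easy to demonstrate: compressed size tracks how repetitive the data is, not how big it is. A minimal Python sketch (the two buffers and their contents are entirely made up for illustration, not actual driver files):

```python
import os
import zlib

# Hypothetical stand-ins for two driver builds (contents made up):
# 1 MB of random bytes -- high entropy, barely compressible.
small_input = os.urandom(1 * 1024 * 1024)
# 2 MB of mostly repeated patterns -- twice the size, far lower entropy.
big_input = (b"shader_replacement_stub_" * 43)[:1024] * 2048

small_compressed = zlib.compress(small_input, 9)
big_compressed = zlib.compress(big_input, 9)

# The 2 MB input produces the smaller compressed file.
print(len(small_compressed), len(big_compressed))
```

So a driver that gains code but becomes more self-similar can genuinely ship as a smaller archive, which is why the shrinking download size proves nothing either way.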
However the real issue here is the compression to begin with: if these are truly driver bugs that have now been fixed then what does nVidia have to hide? Why are they preventing the public from testing them out for themselves?
> You have to work under the assumption that nVidia changed their driver structure for the purpose of looking better on a few benches instead of it being planned already. If they did it in response to the anti-detect, they have some incredibly fast-thinking people on their staff to come up with a countermeasure and release it as quickly as they did.

You didn't appear to answer the question at all.
Answer the question: if nVidia is not cheating and have fixed what you call genuine driver bugs then why implement counter-measures against programs that illustrate cheating?
> Now they get their drivers sorted out and performance is up and you still can't acknowledge the fact that maybe they just plain fvcked up. It has to be a giant conspiracy.

When they drop the compression mechanism, the anti-cheat programs prove the drivers work just fine, all of the flaky "does not use AA even when requested" problems are fixed, and the likes of Gabe and Futuremark release detailed reports outlining that the issues have been fixed, then I'll be glad to admit it.
Until then all I've got is your comments which I'm afraid don't hold a lot of weight.
> The answer to that is no if you look at S3's implementations. They followed the creator, they didn't one-up them.

The reason I asked was because you brought up the Quack issue. If you claim ATi was cheating then you must also claim that nVidia was cheating at the hardware level, since all early Quake III benchmarks were run with texture compression enabled, and that meant the nVidia boards were all rendering with 16 bit textures. This was also cheating that could not be fixed, and it continued through several generations of boards. Even the NV25 still loads 16 bit textures, but it uses a dithering technique to mask the problem; I'm not sure about the NV3x series.
Thus nVidia have been cheating in every single benchmark using DXT1/S3TC that was run on NV25 boards and earlier.
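For context on why DXT1 implies 16 bit colour: the two endpoint colours in every DXT1/S3TC block are stored as RGB565 (5 bits red, 6 bits green, 5 bits blue). This is only a sketch of that quantisation in plain Python, not nVidia's actual decompression path:

```python
def rgb888_to_565(r, g, b):
    # DXT1/S3TC block endpoints are stored as RGB565:
    # 5 bits red, 6 bits green, 5 bits blue = 16-bit colour.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rgb565_to_888(c):
    # Expand back to 8 bits per channel by replicating the high bits.
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# Two neighbouring 24-bit colours from a smooth sky gradient collapse
# to the same 16-bit value -- this kind of quantisation causes banding.
a = rgb565_to_888(rgb888_to_565(100, 150, 200))
b = rgb565_to_888(rgb888_to_565(101, 150, 201))
print(a, b, a == b)
```

Neighbouring shades collapsing to one value is exactly the sort of banding people saw in the Quake III sky when compression was left on.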
So nVidia was cheating then? Don't hide behind the spec, just answer yes or no.
> Do not mistake your own zealotry at the time for mine.

I'm not the only one in this thread who feels the same way about your attitude.
