JB-
Sorry, but I don't agree. They had trilinear before. They removed it to inflate FPS scores. That, in my book, is a cheat.
So ATi's hardware engineers don't know how to do true anisotropic filtering? Why? Is it too complicated for them? What about PVR's box filter? Are you now switching your former position and stating that Kristof and crew are also into cheating? Yes, I'm trying to put you on the spot here, since you now either have to say that PowerVR is also cheating and has been for years, or acknowledge a difference between them. I did not slam PVR back then for cheating. My position has remained constant; yours is the one that had to change.
A hardware limitation is not even in the same ballpark as this "hack," as you call it.
What about PVR?
You said he is more or less lying and you have no proof to back it up.
And he ran off at the mouth for some time about how they bent over backwards for the NV3X architecture, while they didn't even compile it for optimal performance on their default code path. Do I think he is in any way, shape or form impartial? Not even close.
Do you know that for a fact?
Carmack's .plan updates are public record; everyone knew what he was doing with Doom3.
Why did JohnC feel he had to add a custom code path for the NV30?
Because it mapped to what he wanted to do in Doom3. Carmack's requests for exactly what he wanted in terms of features were public knowledge back in the early NV2X days. nVidia gave him custom extensions that allowed the specific features he wanted to perform well on their boards. When ATi releases some extensions of their own that improve performance and Carmack doesn't use them, then you can certainly make the argument that he is as biased as Newell.
Why not just use the common ARB path that ATI uses?
ATi is running at reduced quality versus nVidia if you want to take that discussion to its logical conclusion. Carmack went on record years ago stating he needed 64-bit color for his next engine; ATi didn't offer it, nVidia did. He has gone on record stating there is no discernible quality difference going beyond FP16, so why would you? If you state that it is better to run at higher precision no matter what (which is a valid line, to be sure), then you must acknowledge that nVidia is running at higher precision than ATi using ARB2. And in comparison to HL2, I'm not suggesting that Valve use anything nV-specific, just MS's provided product.
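To put rough numbers on the precision point, here is a minimal sketch (the helper name is mine, exponent range is ignored, and I'm assuming the commonly cited mantissa widths: 10 bits for FP16, 16 for ATi's FP24, 23 for FP32) that truncates a float's mantissa to each width:

```c
/* Sketch only: illustrates mantissa step size for the shader formats
 * under discussion. Not any driver's actual code. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Keep only the top keep_bits (<= 23) of an IEEE-754 single's mantissa.
 * Exponent range differences between the formats are ignored here. */
static float trunc_mantissa(float x, int keep_bits)
{
    uint32_t u;
    memcpy(&u, &x, sizeof u);
    if (keep_bits < 23)
        u &= ~((1u << (23 - keep_bits)) - 1);  /* zero the low mantissa bits */
    memcpy(&x, &u, sizeof x);
    return x;
}

int main(void)
{
    float v = 0.123456789f;
    printf("FP32 (23 mantissa bits): %.9f\n", v);
    printf("FP24 (16 mantissa bits): %.9f\n", trunc_mantissa(v, 16));
    printf("FP16 (10 mantissa bits): %.9f\n", trunc_mantissa(v, 10));
    return 0;
}
```

On an input like this, all three agree to three or four decimal places, which is far finer than an 8-bit framebuffer can display; that is roughly the gist of the "nothing discernible beyond FP16" claim for color work.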
You mean those same beta drivers that showed image corruption in 3DGPU's AQ3 test? Sorry, but DriverHeaven and other sites looked at those drivers and found many issues.
No, I meant the 52.14s.
This is no different than what NV did with those Doom3 tests, except that HL2 was closer to being done.
If id had forced reviewers to use the most recently released drivers, then the 5200Ultra and 9800Pro would have been quite close in the bench.
You, from your many posts here, have gone out of your way to again make excuses for NV when they have clearly been caught time and time again lowering IQ to gain speed in BENCHMARK GAMES only.
If you will state that PowerVR has been cheating for years and that ATi is cheating in TRAoD, then I would say you have a valid argument. Otherwise, you're shifting your standards to accommodate how you want to see things.
BTW- Thanks for the Halo info
BFG-
So let me ask you again: do you claim shader substitution - along with the rest of Futuremark's findings - is a bug or a cheat?
They were cheating in 3DMark2K3; I've said it numerous times.
Well, they did it in 3DMark, so there's a high chance they're doing it in most games, especially since they went to great lengths to hide it.
Why do you make that leap?
Also, we're seeing all manner of cheats all over the place, in a wide range of popular games that just happen to be benchmarked.
List some. List the exact title, exactly what the cheat is, and what it causes. I'll run through the bugs in the Cats and see how many comparable issues I can come up with for you to put against your list, so you know in advance. You list something that is without a doubt a cheat and I'll gladly say they cheated. So far I've seen a bunch of jump-to-conclusion BS that has left sites backpedaling, or ignoring it when the issues are fixed and performance is up even more.
nVidia never admitted anything
That would be a very d@mning point if it were true. They did admit it, and they followed that up by posting their new guidelines for drivers (I'm not saying they follow these, but it was posted when they admitted to the cheating).
Right, and the "protection" just happened to come straight after the 3DMark cheats were exposed and Unwinder's publicly available RivaTuner defeated nVidia's cheating, and you continue to insist that this action was nVidia's plan all along?
You honestly think they could come up with a scheme that the community couldn't hack through inside of a few days? If they could, the entertainment industry would gladly pay nVidia's driver team millions and millions of dollars to sort out their piracy issue.
However, you are missing the big picture, which is simply that nVidia needed to use DXT3 in order to achieve image quality comparable to ATi's DXT1.
Not quite. nVidia's DXTC3 was superior to ATi's DXTC1 in terms of image quality. Don't take my word for it; Oldfart already mentioned it in this thread.
However, I read your interpolation explanation, but I still fail to see what possible gain a vendor has from storing 32 bits' worth of data and then, after using up the required resources to do so, suddenly choosing to do a crappy interpolation which, AFAICT, gains them absolutely nothing except horrendous image quality. Why do something like that? And why do it for three generations' worth of hardware?
That was what S3 did, and they created the standard. You said it yourself: it was only a slight performance decrease using '3', despite its using 32-bit interpolation and having a lower level of compression. They followed the creator of the standard; they didn't exceed it. Why keep it the same? How many games does it impact? Would they have been better served reducing the amount of time they spent on another aspect of their hardware to solve an issue that affects a very limited number of games, one of them never in an official capacity? They knew they had an issue; they implemented a switch in the registry to force the use of S3TC3 for those that wanted it (unfortunately that would not work for UT, as its textures were all precompressed, which was not the case with the other titles that compressed at run time).
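For anyone lost in the interpolation back-and-forth, here is a minimal sketch of the mechanism (function names and sample values are mine, not from any driver or from S3's spec text) showing the two ways a decoder can build DXT1's 2/3 intermediate color from the block's pair of RGB565 endpoints:

```c
/* Sketch: two routes to DXT1's color2 = (2*c0 + c1) / 3, assuming
 * 4-color mode. Illustration only, not any vendor's silicon. */
#include <stdint.h>
#include <stdio.h>

/* Expand one RGB565 endpoint to 8:8:8. */
static void rgb565_to_888(uint16_t c, int *r, int *g, int *b)
{
    *r = ((c >> 11) & 0x1F) * 255 / 31;
    *g = ((c >>  5) & 0x3F) * 255 / 63;
    *b = ( c        & 0x1F) * 255 / 31;
}

/* Higher-precision route: expand both endpoints to 888 first,
 * then interpolate. */
static void interp_high(uint16_t c0, uint16_t c1, int *r, int *g, int *b)
{
    int r0, g0, b0, r1, g1, b1;
    rgb565_to_888(c0, &r0, &g0, &b0);
    rgb565_to_888(c1, &r1, &g1, &b1);
    *r = (2 * r0 + r1) / 3;
    *g = (2 * g0 + g1) / 3;
    *b = (2 * b0 + b1) / 3;
}

/* Minimum route: interpolate the raw 5/6-bit channels in 16-bit
 * color and expand afterwards, discarding low-order bits -- the
 * source of the banding on smooth gradients like the Q3 sky. */
static void interp_565(uint16_t c0, uint16_t c1, int *r, int *g, int *b)
{
    int r2 = (2 * ((c0 >> 11) & 0x1F) + ((c1 >> 11) & 0x1F)) / 3;
    int g2 = (2 * ((c0 >>  5) & 0x3F) + ((c1 >>  5) & 0x3F)) / 3;
    int b2 = (2 * ( c0        & 0x1F) + ( c1        & 0x1F)) / 3;
    *r = r2 * 255 / 31;
    *g = g2 * 255 / 63;
    *b = b2 * 255 / 31;
}

int main(void)
{
    uint16_t c0 = 0x6D7F, c1 = 0x5CDE;  /* two nearby sky-blue endpoints */
    int r, g, b;
    interp_high(c0, c1, &r, &g, &b);
    printf("expand then interpolate: %3d %3d %3d\n", r, g, b);
    interp_565(c0, c1, &r, &g, &b);
    printf("interpolate then expand: %3d %3d %3d\n", r, g, b);
    return 0;
}
```

Both routes produce a legal DXT1 result; the 565 route just keeps fewer intermediate bits per step, which goes unnoticed on busy textures and turns into visible banding on smooth gradients. That is the sense in which following the spec's letter and exceeding it part ways here.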
nVidia's DXT1 was so bad that it was unusable, but the benchmark graphs with it enabled sure didn't tell anyone that.
Same with S3's; why aren't you bashing them about it?
No, as long as clear image quality screenshots were posted. But most sites took the "run-and-gun" approach, and all people ever saw were the framerate graphs.
That sure as hell wasn't the case at the Basement.
However, nVidia's DXT1 issue was the most blatant one, because it wasn't even in the same ballpark as the image quality offered by the competitors.
Again, what about S3?
Not supporting a feature by design is a totally different problem to supporting a common feature in a faulty fashion for three generations of hardware.
No, no, no. S3 created the feature; nVidia followed their implementation to the letter, including all formats. Your issue is that they did not exceed the specification. Using that same logic, the R9700Pro has faulty PS2.0 support: it can't support 1,000 instructions in a single PS, nor can it run FP32. If your line is that following a spec exactly is wrong, then ATi is wrong with their PS 2.0 support.
So explain how it worked on your Ti4600, then. Also, I guess the Ti4600 must be "cheating" for not supporting PS 2.0, right? And the NV1x is "cheating" for not supporting any version of PS, right?
If I applied your logic then absolutely. I don't apply your logic however.
The issue in question requires both application detection and prior knowledge in the driver for nVidia to be able to pull it off, and for that reason it's a cheat.
The brilinear? They aren't using app detection for that; they do it for all D3D games. As far as using app detection for optimizations goes, PowerVR does this an incredible amount. Are they cheating in d@mn near every game you've heard of?