
ATI's Radeon X800 texture filtering game at Tech Report

nemesismk2

Diamond Member
Please read this article and then explain to me why some of you keep defending ATI?

"Whatever the merits of ATI's adaptive trilinear filtering algorithm, ATI appears to have intentionally deceived members of the press, and by extension, the public, by claiming to use "full" trilinear filtering "all of the time" and recommending the use of colored mip map tools in order to verify this claim.

Encouraging reviewers to make comparisons to NVIDIA products with NVIDIA's similar trilinear optimizations turned off compounded the offense. Any points ATI has scored on NVIDIA over the past couple of years as NVIDIA has been caught in driver "optimizations" and the like are, in my book, wiped out."

ATI's Radeon X800 texture filtering game at Tech Report
 
That's weak. How about waiting for someone to extensively examine what's going on at the driver and hardware levels? Simple plays on words, comments, or press releases prove nothing.
 
I think that sums it up pretty well. It isn't like it is only his opinion on the matter. He is showing what they said by quoting the online chat, as well as providing pictures and slides (supplied by ATi) of what they claimed, and then stating what they were really doing (goes back to the chat as well as comp.de's findings).

Granted it isn't a retest, but it is all to go on at the moment. I agree that more has to be done before a concrete conclusion can be made in either direction.
 
Anyone that is sidetracked at this point by saying "there's nothing wrong with optimizations" needs to take some ritalin and focus. Optimizations that improve performance but keep all other factors intact are GOOD. Now stay with me here a moment... Mip map tests are like the 3DMark of IQ. The driver detects colored mip map tests and then enables full trilinear. Otherwise it uses a lesser algorithm that yields better performance and lesser IQ. Are people still arguing that there's nothing wrong with this? 😕
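To make the distinction concrete, here's a rough sketch (my own illustration, not ATI's or nVidia's actual algorithm) of how a reduced "brilinear" blend differs from full trilinear, assuming a simple fractional-LOD model:

```python
# Hypothetical illustration: full trilinear blends the two nearest mip
# levels across the entire fractional LOD range, while a "brilinear"
# shortcut only blends in a narrow band around the mip transition and
# falls back to plain bilinear (single-mip) sampling elsewhere.

def trilinear_weight(lod_frac):
    """Blend factor between mip N and mip N+1 for full trilinear."""
    return lod_frac  # linear blend over the entire [0, 1) range

def brilinear_weight(lod_frac, band=0.25):
    """Reduced blend: only interpolate within `band` of the transition."""
    lo = 0.5 - band / 2
    hi = 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0            # sample mip N only (cheap bilinear)
    if lod_frac >= hi:
        return 1.0            # sample mip N+1 only
    return (lod_frac - lo) / (hi - lo)  # short linear ramp between mips

# Away from the transition, brilinear collapses to single-mip sampling,
# which saves texture bandwidth; full trilinear keeps blending everywhere.
print(trilinear_weight(0.1), brilinear_weight(0.1))  # prints 0.1 0.0
```

Colored mip map tests expose exactly this difference, because the abrupt color change at each mip boundary makes the narrower blend band visible.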
 
It is ironic that when something we really like and trust is accused of deceitful tactics, with overwhelming proof, people will defend the accused to the end. It sounds familiar, like fans of Michael Jackson trying to justify his actions. LOFL!! It is ok because of who or what they are. Get real! Fanboys to the end.
 
Originally posted by: Transistor
Anyone that is sidetracked at this point by saying "there's nothing wrong with optimizations" needs to take some ritalin and focus. Optimizations that improve performance but keep all other factors intact are GOOD. Now stay with me here a moment... Mip map tests are like the 3DMark of IQ. The driver detects colored mip map tests and then enables full trilinear. Otherwise it uses a lesser algorithm that yields better performance and lesser IQ. Are people still arguing that there's nothing wrong with this? 😕

They are not detecting colored mip maps specifically... they are analyzing the mips to determine the best filtering mode they can use without a loss in IQ. For colored mip maps, this analysis results in full trilinear being the best filtering mode that can be done. This is because the mips are not box-filtered equivalents of each other (like real-world apps would actually use). If an app were to send down mips that looked nothing like each other (one a smiley face, one a texture of the sky, for example), then this analysis would come to the same conclusion and enable full trilinear. This is not an application-specific cheat; it doesn't just target benchmarks like so many of nVidia's cheats. EVERY GPU company does something like this.
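The kind of analysis described above can be sketched roughly like this (a hypothetical illustration of the idea, not ATI's actual driver logic): check whether each mip level is approximately the box-filtered downsample of its parent, and if not, fall back to full trilinear.

```python
# Hypothetical sketch: if each mip level is (approximately) a box-filtered
# downsample of the one before it, a reduced blend is visually safe; if the
# levels differ (colored mip maps, or a smiley face vs. a sky texture),
# force full trilinear filtering instead.

def box_downsample(tex):
    """Average each 2x2 block of a square grayscale texture (list of lists)."""
    n = len(tex) // 2
    return [[(tex[2*i][2*j] + tex[2*i][2*j+1] +
              tex[2*i+1][2*j] + tex[2*i+1][2*j+1]) / 4.0
             for j in range(n)] for i in range(n)]

def needs_full_trilinear(mips, tolerance=1.0):
    """Return True if any mip deviates from the box filter of its parent."""
    for parent, child in zip(mips, mips[1:]):
        expected = box_downsample(parent)
        diff = max(abs(e - c) for er, cr in zip(expected, child)
                             for e, c in zip(er, cr))
        if diff > tolerance:
            return True   # inconsistent mips -> full trilinear needed
    return False          # ordinary mip chain -> optimized filtering is safe

# An ordinary chain (each level built by box filtering) passes the check:
base = [[float(i + j) for j in range(4)] for i in range(4)]
chain = [base, box_downsample(base)]
print(needs_full_trilinear(chain))    # prints False

# Colored mip maps (levels bear no relation to each other) trigger trilinear:
colored = [base, [[255.0, 0.0], [0.0, 255.0]]]
print(needs_full_trilinear(colored))  # prints True
```

This also shows why colored mip map tools can't catch the optimization: the colored levels fail the consistency check, so the driver renders them with full trilinear anyway.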
 
Originally posted by: 413xram
It is ironic that when something we really like and trust is accused of deceitful tactics, with overwhelming proof, people will defend the accused to the end. It sounds familiar, like fans of Michael Jackson trying to justify his actions. LOFL!! It is ok because of who or what they are. Get real! Fanboys to the end.

What "overwhelming proof"? Last year's Nvidia cheats were thoroughly examined and understood. So far one website has noted some phenomena, but not offered explicit detail on what's going on.
 
Originally posted by: sandorski
Originally posted by: 413xram
It is ironic that when something we really like and trust is accused of deceitful tactics, with overwhelming proof, people will defend the accused to the end. It sounds familiar, like fans of Michael Jackson trying to justify his actions. LOFL!! It is ok because of who or what they are. Get real! Fanboys to the end.

What "overwhelming proof"? Last year's Nvidia cheats were thoroughly examined and understood. So far one website has noted some phenomena, but not offered explicit detail on what's going on.

They didn't have to... ATi themselves told us in the live chat that they were running an optimization. The problem is not that they are running an optimization that has little or no effect on IQ, but that they instructed reviewers to disable nVidia's optimizations during testing to keep the testing fair.
 
They didn't have to... ATi themselves told us in the live chat that they were running an optimization. The problem is not that they are running an optimization that has little or no effect on IQ, but that they instructed reviewers to disable nVidia's optimizations during testing to keep the testing fair.
Very unfair. Each card should have every optimization enabled to see once and for all who is truly the man at fps, at least.
 
Originally posted by: VIAN
They didn't have to... ATi themselves told us in the live chat that they were running an optimization. The problem is not that they are running an optimization that has little or no effect on IQ, but that they instructed reviewers to disable nVidia's optimizations during testing to keep the testing fair.
Very unfair. Each card should have every optimization enabled to see once and for all who is truly the man at fps, at least.
Actually, they should be tested both ways, with and without the optimizations. That will tell us who has the most raw, brute power as well as who has the best optimization in terms of IQ and performance.
 
Please read this article and then explain to me why some of you keep defending ATI?

Oh yes, I just let everybody else make up my mind for me :roll:

Encouraging reviewers to make comparisons to NVIDIA products with NVIDIA's similar trilinear optimizations turned off compounded the offense

Seems fair considering the competitor actually has quality differences between modes, while ATI does not...
 
Originally posted by: nitromullet
Originally posted by: sandorski
Originally posted by: 413xram
It is ironic that when something we really like and trust is accused of deceitful tactics, with overwhelming proof, people will defend the accused to the end. It sounds familiar, like fans of Michael Jackson trying to justify his actions. LOFL!! It is ok because of who or what they are. Get real! Fanboys to the end.

What "overwhelming proof"? Last year's Nvidia cheats were thoroughly examined and understood. So far one website has noted some phenomena, but not offered explicit detail on what's going on.

They didn't have to... ATi themselves told us in the live chat that they were running an optimization. The problem is not that they are running an optimization that has little or no effect on IQ, but that they instructed reviewers to disable nVidia's optimizations during testing to keep the testing fair.

When did they say that, and were Nvidia's optimizations affecting quality?
 
Long and short, what ATI did was dirty... no sense in arguing against it... it just shows that their "hardware" isn't as impressive as first thought... it has convinced me to try Nvidia again, as I'm sure many others will...
 
Well, OMG, ATI outsmarted Nvidia for two freaking years, LOL!!! ATI should be hired by Canada as a Canadian secret police, they bushwhacked Nvidia so well. Amazing, isn't it? I was buying an Nvidia 6800 whatever and now I feel even better about it. I think Nvidia should sue ATI for fraudulent marketing in an attempt to put Nvidia out of business. ATI accuses Nvidia of the very thing ATI is doing, but ATI manages to hide it far better and do it better to boot. Tragic and funny at the same time. FREAKING AMAZING!!!
 
I'm starting to get some doubts. These incidents need answers.

Well, OMG, ATI outsmarted Nvidia for two freaking years, LOL!!! ATI should be hired by Canada as a Canadian secret police, they bushwhacked Nvidia so well. Amazing, isn't it? I was buying an Nvidia 6800 whatever and now I feel even better about it. I think Nvidia should sue ATI for fraudulent marketing in an attempt to put Nvidia out of business. ATI accuses Nvidia of the very thing ATI is doing, but ATI manages to hide it far better and do it better to boot. Tragic and funny at the same time. FREAKING AMAZING!!!
But they were only secretive for two years, not to mention ATI being behind almost every major leak.
 
Okay, maybe I don't know what I'm talking about - and that's entirely possible - but in my opinion ATI isn't really cheating. The way I understand it depending on what is being rendered they are sometimes using a workaround and the IQ result is the same with a higher frame rate (at least that's the way I understand it). If that's really the case, who cares?

And as far as benchmarks being unfair, maybe and maybe not. If ATI's "workaround rendering" results in equal or better IQ than Nvidia's full rendering then it's not unfair. I look at it as a feature rather than a cheat and just because Nvidia's cards can't do it doesn't mean ATI should have to disable it or turn it off in benchmarks.

I've read a lot of posts that don't agree with that but I don't understand the reasoning behind it. I want to see proof that ATI's "cheating" results in lower IQ than Nvidia's rendering. If it's true I'm sure the Nvidia camp would've exposed this by now and AFAIK they haven't. Until then I think ATI's only mistake was not being up front with exactly what's going on.
 
It seems to me that people want to argue about the optimization being a cheat. Looks like a great technique to me. The problem is, both cards should have been benched either with both using optimizations or with both using full trilinear filtering. This is not a hard concept to understand or agree upon. People want to get a clear picture of a card's true performance, apples to apples. How about some new benchies 😎
 
Originally posted by: kingmike
It seems to me that people want to argue about the optimization being a cheat. Looks like a great technique to me. The problem is, both cards should have been benched either with both using optimizations or with both using full trilinear filtering. This is not a hard concept to understand or agree upon. People want to get a clear picture of a card's true performance, apples to apples. How about some new benchies 😎
This is where I disagree. If ATI has a feature that Nvidia does not, why should they be forced to turn it off just to "level the playing field"? It would be no different than asking Nvidia to turn off their PS 3.0 feature in tests because ATI doesn't offer it. If ATI can produce equal IQ with and without the feature enabled, then they shouldn't be forced to disable it in tests. Just my opinion though...
 
No one said you need to turn it off as the only benchmark to be used. Just make comparisons using both settings on both cards, then compare.
 
Whether it's a cheat or not, I couldn't care less. What I do care about is the false advertising. Their little booklet says "full trilinear...all the time," and whatever small effect the optimization may have on IQ, that claim is, as ATI itself admits, NOT the case.

-Vivan
 
I am still waiting for a video showing some (any) IQ difference in actual gameplay. I understand that Nvidia owners feel hard done by with all the bad press about game-by-game optimizations, and misery loves company. Unfortunately, ATI looks like they will ride this one out. Maybe if Nvidia had an open chat with top people to answer questions .......
 
I agree with Damage that ATi's marketing/PR dept should take a credibility hit from this, probably down to nV's current level.

The jury is still out WRT IQ, though, and we'll have to wait a week or two for people to get some comprehensive screen-grabbing and comparing done before we can conclude that ATi's engineers deserve some scorn too (similar to that served up for their angle-dependent AF). If they are indeed able to reduce "trilinear" memory accesses, though, they should consider adding an all-angle AF mode.
 