ATI cheating in AF?


Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ChkSix
Matthias: No, I am glad they said something; however, what they said was very neutral, leaving the possibility wide open for more cheats to be uncovered. But that's normal practice in business, and I understand this.

Their answer satisfied me. I don't know what more they could say about it other than explaining what was causing the performance delta that was being seen.

As far as what I said about the 20% hit the R420 would take when removing the optimizations and performing on par with a 9800XT clock for clock, it's all available on comp.de or the Beyond3D forums.

...in German. :p

I tried to read their article through a translator, and I'm still not sure I got all the details, but yes, I believe that it is clock-for-clock with a 9800XT. Considering that the R420 is built on top of the R360, this is not surprising. However, the R360 was no slouch at AF itself.

And I doubt the R420 would still be ahead without the opts, because in 99.9% of the benchmarks it was never 20% ahead of the 6800U in anything, but that depends on the benchmarks and sites you use for your own personal analysis.

It's *very* hard to say without a more thorough suite of tests being run. comp.de only showed numbers for 4xAA/16xAF -- not even AF alone!

I based my conclusion on looking at performance drops from non-AA/AF to 16xAF modes for the 6800 compared with the R420. For instance, here, the 6800 takes a good 35% performance hit at 16xAF, whereas the X800XT loses less than 5%, and the X800Pro is losing only around 10%. Beyond3D showed a similar hit in their testing -- the 6800 dropped in excess of 30% in some tests with AF enabled, while the X800s only lost 5-10% (R420 NV40). It's far from conclusive, though -- what we need are graphs like this for multiple games with colored mipmaps enabled and disabled. That would show the performance hit/benefit (from your perspective) that ATI's optimizations are responsible for.
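The percentage hits quoted above are just the relative FPS drop from the no-AA/AF score to the 16xAF score. A trivial sketch with made-up numbers (not actual benchmark data):

```python
# Percent performance lost when a filtering mode is enabled.
# The FPS figures below are hypothetical, for illustration only.
def af_hit(fps_base, fps_af):
    return (fps_base - fps_af) / fps_base * 100

# A card dropping from 100 fps (no AA/AF) to 65 fps at 16xAF
# takes a 35% hit:
print(af_hit(100.0, 65.0))  # 35.0
```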
 

ChkSix

Member
May 5, 2004
192
0
0
I was utterly shocked by his response. I thought he would say a little more than what just pertained to a PR release, but that wasn't the case at all. Hence my verbal attack in the beginning (which was the most recent email). Since then, he has been put on my Mailwasher "blacklist".

And yet, the minute an optimization is uncovered for Nvidia, it makes Driverheaven and HardOCP headline news.
 

ChkSix

Member
May 5, 2004
192
0
0
Originally posted by: Matthias99
Originally posted by: ChkSix
Matthias: No, I am glad they said something; however, what they said was very neutral, leaving the possibility wide open for more cheats to be uncovered. But that's normal practice in business, and I understand this.

Their answer satisfied me. I don't know what more they could say about it other than explaining what was causing the performance delta that was being seen.

As far as what I said about the 20% hit the R420 would take when removing the optimizations and performing on par with a 9800XT clock for clock, it's all available on comp.de or the Beyond3D forums.

...in German. :p

I tried to read their article through a translator, and I'm still not sure I got all the details, but yes, I believe that it is clock-for-clock with a 9800XT. Considering that the R420 is built on top of the R360, this is not surprising. However, the R360 was no slouch at AF itself.

And I doubt the R420 would still be ahead without the opts, because in 99.9% of the benchmarks it was never 20% ahead of the 6800U in anything, but that depends on the benchmarks and sites you use for your own personal analysis.

It's *very* hard to say without a more thorough suite of tests being run. comp.de only showed numbers for 4xAA/16xAF -- not even AF alone!

I based my conclusion on looking at performance drops from non-AA/AF to 16xAF modes for the 6800 compared with the R420. For instance, here, the 6800 takes a good 35% performance hit at 16xAF, whereas the X800XT loses less than 5%, and the X800Pro is losing only around 10%. Beyond3D showed a similar hit in their testing -- the 6800 dropped in excess of 30% in some tests with AF enabled, while the X800s only lost 5-10% (R420 NV40). It's far from conclusive, though -- what we need are graphs like this for multiple games with colored mipmaps enabled and disabled. That would show the performance hit/benefit (from your perspective) that ATI's optimizations are responsible for.


I agree with you, Matt. Sorry if I jumped to any conclusions. Let me see if I can find the original claim from comp.de in English for you (Beyond3D had it, I think). Here it is:

Here
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ChkSix
Wow you guys are not only extremely biased... ...only fanboys of ATi are the only people that take your site or benchmarking seriously.

How much does ATi fund you, or better yet, how much profit do you gain by trying to discount what six or so other tech forums found to be the case?

I was a member on HardOCP forums, and those primitive fools would tell you the sky is black if you stuck an ATi label on it...

...I'll stay with others who are a little less slanted.

And you got a response to that? I'm impressed. I would've trashed your email in a heartbeat. It's clear you've already made up your mind about them...

Edit:

I was utterly shocked by his response. I thought he would say a little more than what just pertained to a PR release, but that wasn't the case at all. Hence my verbal attack in the beginning (which was the most recent email). Since then, he has been put on my Mailwasher "blacklist".

And yet, the minute an optimization is uncovered for Nvidia, it makes Driverheaven and HardOCP headline news.

Ah, that *was* your response. You got a terse reply to your first email -- but think about how many emails they're probably getting over this issue. "Don't be a slanted or biased website"? Gimme a break. You didn't say "Hey, I saw this article, and I think you should post it", you said "Hey, post this article or you're biased idiots!" I'm not surprised you got a 1-sentence reply.
 

ChkSix

Member
May 5, 2004
192
0
0
No, I didn't get a response to the top email in my thread, but to everything from the one at the bottom up to the last one I wrote. Yes, my mind is pretty much made up regarding them, and it took a long time to get there. You can formulate your own opinion, but watch them, their stories, and their reviews closely, and you will clearly see a pattern emerge: they trash Nvidia at every chance they get while glorifying ATi even when something as controversial as this comes to light. Driverheaven in my eyes is no different.

Bottom line is that they are biased, in my opinion. Their reviews show it (they oddly differ drastically from other sites running the same exact benchmarks with similar cards), as does their choice of news (i.e., Nvidia optimizations exposed, nothing for ATi, even with this current scandal).

Again, you don't have to agree with me. In fact, we can agree to disagree. I have written to him before, which could explain my biased comments in my early emails a little better; however, I don't have those anymore.
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
ATI and Nvidia both cheat when it comes to trilinear. Everyone now wants ATI to admit they are cheating when it comes to it. Are Nvidia and ATI supposed to announce new "cheats"?

Neither company will be willing to sacrifice 15%-20% FPS for an almost indistinguishable amount of higher quality. A lot of people in here are now saying that ATI will get their @ss handed to them if they remove that optimization, and I guess they probably would. Nobody here is telling Nvidia to remove theirs, though, or implying the consequences (FPS penalty), or how ATI would crush them if they did.
 

ChkSix

Member
May 5, 2004
192
0
0
The only 'supposed' optimization that came out of the beta drivers Nvidia used to test the 6800U dealt with Far Cry. As far as 'crushing' goes, again, I simply doubt this would be the case, since with optimizations it doesn't crush it at all but is more neck and neck with it.

Anandtech's initial test on both cards is more than proof of this.

No, I guess not, as far as announcing cheats by both parties is concerned. However, it was ATi that had a stance of invulnerability regarding this, something that was frequently mentioned (and thrown harshly at Nvidia...understandably) when discussing ATi and Nvidia cards in the past. I guess going forward, this cannot be used as ammunition on either side of the fence.
 

ChkSix

Member
May 5, 2004
192
0
0
Hahahah!! Pick me up one too! Or maybe an STB Velocity perhaps? Wasn't it 3dfx that took over STB? :confused:
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
Originally posted by: ChkSix
Hahahah!! Pick me up one too! Or maybe an STB Velocity perhaps? Wasn't it 3dfx that took over STB? :confused:

My STB Velocity 4400 ownd all. :D
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Originally posted by: ChkSix
Hey Bar, maybe you should post that link again to show him just what we're talking about. Oh wait, I got it right here.
http://hanisbottomline.com/pics/IMG0007852_1.jpg

That, my friend, is deceitful to the consumer who spends half a grand on a video card. Not telling the truth about what you do, then lying about it in your official .pdf while claiming a holier-than-thou attitude as far as cheating goes, is lies, cheating, and complete nonsense.
Is this Trilinear AF mode using their adaptive algo on by default? If so, it's no lie. ATI's trilinear implementation is optimized to switch to the most efficient mode of rendering without harming IQ. That seems to be the *ENTIRE* point. NVidia's approach of simply disabling it I could see someone complaining about, as it affects IQ in a slightly apparent way, but I don't see how this algorithm is blatant all-out cheating. They even encourage people to look at the IQ:
(from that image of the X800 IQ pdf)
"ATI recognizes that the customer wants the best image quality by default, with the option to decrease Image Quality later if so desired. We recommend that reviewers visually check their benchmarks with color mipmap tools such as 3DAnalyzer, or use in-game tools (such as firstcoloredmip in UT) to review our image quality advantages."

I like how you guys refuse to answer some of my other points. You are focused on the fact that they don't apply trilinear AF where it isn't necessary to maintain a level of IQ. You are literally pissed off that they are being more efficient, and it is only "misleading" if you are foolish enough to take a simple marketing PDF's statement as a 100% solid explanation of the process by which they have implemented AF. That is laughable. Enjoy your hating; I imagine most people who read the benchmarks and understand what ATI has said they're doing in their AF implementation won't lose too much sleep over it.

There is an old adage about making mountains out of molehills. I think this thread is a classic illustration of that.
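To make the adaptive-vs-textbook distinction concrete, here is a rough sketch of standard trilinear next to a narrowed adaptive blend. This is purely illustrative Python under my own assumptions (a `bilinear(level)` helper returning a bilinearly filtered sample from the given mip level, and an arbitrary `band` width); it is not ATI's actual algorithm.

```python
import math

def trilinear(bilinear, lod):
    # Textbook trilinear: blend the two nearest mip levels
    # across the whole transition.
    lo = int(math.floor(lod))
    frac = lod - lo
    return (1 - frac) * bilinear(lo) + frac * bilinear(lo + 1)

def adaptive_trilinear(bilinear, lod, band=0.5):
    # Only blend inside a band around the mip boundary; outside it,
    # a single bilinear sample looks (nearly) identical anyway.
    lo = int(math.floor(lod))
    frac = lod - lo
    lo_edge = 0.5 - band / 2
    hi_edge = 0.5 + band / 2
    if frac <= lo_edge:
        return bilinear(lo)          # pure bilinear, nearer mip level
    if frac >= hi_edge:
        return bilinear(lo + 1)      # pure bilinear, next mip level
    t = (frac - lo_edge) / band
    return (1 - t) * bilinear(lo) + t * bilinear(lo + 1)
```

The idea is that the blend region is where mip transitions would be visible; shrinking it saves texture fetches where the two answers are near-identical, which is why any IQ difference is hard to spot without colored mipmaps.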
 

ChkSix

Member
May 5, 2004
192
0
0
I'm not hating at all. I have equally blamed both parties in this entire thread and in many of my responses. I can grab snippets of it if you like and post them here.

As far as what you're saying regarding ATi's methods, I encourage you to look here:

http://www.beyond3d.com/forum/viewtopic.php?t=12486

I'm no expert, but there seem to be a few of them posting in that thread above. And from what I have gathered, what they have found on their own does seem a bit 'fishy'.

To sum it up best, I'll quote a post on page 31 of the Beyond3D topic.

----------------------------------------------------------------------------------------------
I think the problem is that up until now, ATI has been considered the "Pure IQ" company. Hell, people have defended them in that way to the death on forums around the globe.

What you're seeing here is known as backlash. ATI should have seen it coming; they should know by now how worked up people get over these issues. It would have been less painful to just expose this themselves or to have put a toggle in the C.P.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Dean
Nobody is here telling Nvidia to remove theirs though or implying the consequences(FPS penalty) or how ATI would crush them if they did.



Well, right now that's not the point of focus. The point is everyone put ATi on a pedestal because they never pulled any BS, and now they've been caught doing something everyone was bashing Nvidia over.

Also, chksix, HardOCP works with BFG Tech, man; they are an NV company, and OCP praises their stuff highly.
So I can't really call them biased, since when someone does screw up, they do call 'em on it.
 

ChkSix

Member
May 5, 2004
192
0
0
Yeah, I hear you, Visable. I have a lot of animosity towards them for several reasons, and it is only my point of view. I don't expect everyone to agree with me on it; in all honesty, it can even be personally driven on my end.

:beer:
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: ChkSix
Yeah, I hear you, Visable. I have a lot of animosity towards them for several reasons, and it is only my point of view. I don't expect everyone to agree with me on it; in all honesty, it can even be personally driven on my end.

:beer:


I can't argue with ya, man; I have no idea what transpired between you and them. Whatever happened, man, I'm sorry :(
They've been real good to me the last, hell, 4 years I've been with 'em :)
 

ChkSix

Member
May 5, 2004
192
0
0
No problem, bro, and thanks. I personally just need to let certain things go, honestly. It's old news and water under the bridge... maybe it's time for me to release the grudge and start them off with a clean slate. At least give them a chance again before formulating an opinion based on an old event. That would be the best way for me to progress a little further in life for the better. Cheers

:beer:
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
What exactly happened between you and HOCP to cause you to address Brent in such a hostile manner?
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Why are people dragging AF into the mix? This is about Trilinear, right?

And ATI is NOT doing Brilinear; it's Trilinear.


ATI:

"The objective of trilinear filtering is to make transitions between mipmap levels as near to invisible as possible. As long as this is achieved, there is no 'right' or 'wrong' way to implement the filtering.

Microsoft does set some standards for texture filtering and the company's WHQL process includes extensive image quality tests for trilinear filtering and mipmapping. CATALYST passes all these tests..."
What don't you people understand about these statements? ATI is doing Trilinear.

This is great news for ATI users. ATI has found a faster way of doing Trilinear. They're moving technology forward. I can get Trilinear quality on my 9600XT with less of a performance hit. Some people are so caught up in trying to even the "cheating" score in this thread that they're trying to make this better technology look like a cheat. Ridiculous.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I thought the definition of trilinear was fixed, unlike AF. So ATi may in fact not be performing trilinear as previously defined, though whether its new sampling algorithm is any worse has yet to be determined.
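For what it's worth, the "fixed" definition in question is the standard one: a trilinear sample at LOD $\lambda$ is a blend of bilinear samples $B$ from the two nearest mip levels,

$$T(\lambda) = \big(1 - \mathrm{frac}(\lambda)\big)\,B\big(\lfloor \lambda \rfloor\big) + \mathrm{frac}(\lambda)\,B\big(\lfloor \lambda \rfloor + 1\big),$$

applied across the entire transition region. Any scheme that narrows or skips that blend deviates from this formula, which would be the sense in which an adaptive filter is not trilinear "as previously defined," whatever its IQ turns out to be.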
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Blastman
Why are people dragging AF into the mix? This is about Trilinear, right?

And ATI is NOT doing Brilinear; it's Trilinear.


ATI:

"The objective of trilinear filtering is to make transitions between mipmap levels as near to invisible as possible. As long as this is achieved, there is no 'right' or 'wrong' way to implement the filtering.

Microsoft does set some standards for texture filtering and the company's WHQL process includes extensive image quality tests for trilinear filtering and mipmapping. CATALYST passes all these tests..."
What don't you people understand about these statements? ATI is doing Trilinear.

This is great news for ATI users. ATI has found a faster way of doing Trilinear. They're moving technology forward. I can get Trilinear quality on my 9600XT with less of a performance hit. Some people are so caught up in trying to even the "cheating" score in this thread that they're trying to make this better technology look like a cheat. Ridiculous.


Well, Nvidia found a faster way of doing trilinear too... sure, it suffered from some slight IQ loss, nothing super big, but everyone b1tched... ATi does it, and it's OK? How the hell does this work?
 

imported_Aelius

Golden Member
Apr 25, 2004
1,988
0
0
I'm not going to judge the ATI card I want to get based on current information.

I want to see how retail drivers operate. Not beta drivers.

Also I don't care if Tri is turned off if whatever I'm looking at cannot use it. If it can use it and the driver turns it off then I care. Same with AA/AF.

If they cannot get their act in gear I will find myself another video card to buy.

I sure as heck ain't buying an $800 (CAN) video card just to get less image quality than I should.

I won't be running it at insanely high resolutions. Maybe 1024x768 or one notch higher.

The reason I'm getting an insanely expensive video card is to get super high image quality at mid resolution in a game like WoW that evolves over time and will require a powerful vid card as time goes on.

If no card can provide this then I will get some second stringer card like the 9800Pro and stick to that until something decent comes along that isn't a ripoff.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: VisableAssassin

Well, Nvidia found a faster way of doing trilinear too... sure, it suffered from some slight IQ loss, nothing super big, but everyone b1tched... ATi does it, and it's OK? How the hell does this work?
How do you know this method of Trilinear produces worse IQ?

This is about the old way of doing Trilinear versus a new, better, faster way of doing Trilinear -- Adaptive Trilinear, if you want to call it that. Sure it's faster, but if it doesn't reduce IQ, why wouldn't you want to use it? And did you consider that maybe it produces better IQ than the old Trilinear? The indications from a couple of posts at Beyond3D (in that big thread) are that the IQ on the X800 is better than on the 9800. If that's the case, then this is really pushing technology forward. Faster doesn't necessarily mean worse. A faster Trilinear with better IQ is having your cake and eating it too.

So my hat's off to the ATI engineers and their Adaptive Trilinear -- well done.