ATI cheating in AF?


VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Matthias99
Originally posted by: DAPUNISHER
However, I personally think there should not be *any* silent optimizations like this in the drivers. At the very least, you should be able to disable it, and get "full" trilinear all the time, even if that results in a big performance hit.
There are some dubious reasons why they were silent about this, but personally I am all for optimizations that can increase performance without trashing IQ. Perhaps the reason they didn't announce this feature was because they were a bit timid given the atmosphere a year ago ;) Besides, it might have taken heat off of nV for their optimizations, despite the fact theirs did trash IQ in some instances, and you should never interrupt the enemy when they are making a mistake.

I'm all for optimizations that increase performance without lowering IQ. However, if you set a graphics card to give you "trilinear" filtering, I would normally expect that to be what some folks over on B3D are calling "naive" trilinear -- that is, complete trilinear filtering on all mip-map levels (which is, indeed, what you get right now if you use colored mip-maps). On the X800, though, you're actually getting some sort of adaptive trilinear, which is not identical to full trilinear in most real-world situations. Now, it may be 99% as good 99% of the time, and if it is, I'm thrilled. But they still should have *said* they were doing this, and given an option to turn it off.
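The distinction Matthias99 draws can be sketched in a few lines. This is a hypothetical illustration of the two filtering strategies, not ATI's actual algorithm; the threshold value and function names are invented:

```python
# Hypothetical sketch of "naive" full trilinear vs. an adaptive
# ("brilinear") scheme. sample_a/sample_b are texels from the two
# nearest mip levels; lod_fraction is the fractional LOD between them.

def full_trilinear(sample_a, sample_b, lod_fraction):
    # Naive trilinear: always blend the two nearest mip levels.
    return sample_a * (1.0 - lod_fraction) + sample_b * lod_fraction

def adaptive_trilinear(sample_a, sample_b, lod_fraction, threshold=0.15):
    # Adaptive scheme: near the center of a mip level, use only one
    # level (effectively bilinear) and save texture bandwidth; blend
    # only in a narrow band around the mip transition.
    if lod_fraction < threshold:
        return sample_a
    if lod_fraction > 1.0 - threshold:
        return sample_b
    # Rescale so the blend still sweeps the full 0..1 range.
    t = (lod_fraction - threshold) / (1.0 - 2.0 * threshold)
    return sample_a * (1.0 - t) + sample_b * t
```

ATI's statement reportedly described analyzing texture content to decide when the shortcut is safe, which would be consistent with colored mip-maps (where adjacent levels differ maximally) always getting the full blend, as Matthias99 notes.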

Either way, you know very well 99% of the time no one is gonna turn it off,
especially those who need those extra FPS. Granted, yes, it would be nice if you could turn it off... but a lot of people would not bother since they can't see a difference.
 

jamesbond007

Diamond Member
Dec 21, 2000
5,280
0
71
Sorry I cannot input much into the content of this thread, but I've come to ask myself...

...am I the only one on this planet who does not use any AF/AA? I have a 5900XT and run 1280x1024 (LCD resolution). If my videocard can accomplish this, then I am happy. When I turn on AA/AF, I see the difference only when I stand still, but when I am moving around (I play FPS games mostly), I cannot notice any difference, except lower FPS on my older nVIDIA/ATi cards.

I encourage nVIDIA/ATi/AnandTech to run a poll to see how many consumers actually use the AF/AA features of their cards.

In a nutshell, a higher resolution is something I notice rather than AA/AF. To be frank, it's the only thing I care about.

My $0.02.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,070
32,594
146
You are right, Matt, and I agree; however, it is a poor business strategy. Honesty and forthrightness are not always the best policy for a corporation to follow. Pros and cons are carefully considered, and when the powers that be err in judgement, it is then time for damage control. That's what you are seeing now, damage control. The damage isn't great, so it's not a difficult task. nV's was a far more difficult situation to manage.

The culmination of this will be that the number of potential customers for their products will not be significantly affected by this optimization that wasn't spoken of until discovered. Therefore it'll prove a winning strategy, since the bottom line is profit, and the performance figures this optimization provides, with minimal impact on IQ, will help that bottom line ;)

Ethical considerations are another matter, and perhaps some level of outrage is warranted here, but that evidently does not fall within the purview of ATi's, or nV's for that matter, business strategy.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Dean
Geesh. You guys are acting like the ATI implementation of Brilinear is the Devil's doing when in fact Nvidia has been doing it as well for the past two years (and globally at that).
Apparently ATI has been doing it for over a year as well, at least. For those of us who had "ATI's superior AF" rammed down their throat like Ron Jeremy's unit finding Ginger Lynn's tonsils about every day for that year plus, this is fairly huge news.

Rollo, you said that ATI's implementation of Quack was a cheat and gave it a speed enhancement of 20%. Show me the proof, please, as every site I saw showed the FPS to be the same after the so-called "cheat" was removed. What you are saying is an out and out LIE!!
Rollo never lies. The sooner you accept that, the closer to peace you will be. LOL
The truth about Quack
Analysis
Even with stock drivers the RADEON 8500 isn't able to keep up with the GeForce3 and GeForce3 Ti 500 in high quality mode. We do see the performance increase it brings, roughly 15% or greater (although this figure does vary depending on the resolution). As you can see, without the driver modifications the RADEON 8500 finishes behind the GeForce3 Ti 200, thus saving a potentially embarrassing situation for ATI.

More Quack hijinks
The Facts As We See Them
It certainly seems to us here at HardOCP that ATi has in fact included application-specific instructions in their version 5.13.01.3276 Win2K drivers that make Quake 3 Arena benchmarks faster by over 15%.
You may be the only person I've ever seen dispute that Quack was an app-specific cheat on ATI's part that lowered IQ to raise FPS. Where do you get your info? ATI's web page? LOL
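For context, the mechanism behind the Quake/Quack discovery was application detection: renaming the executable changed the driver's behavior. A hypothetical illustration (invented function and names, not actual driver code):

```python
# Hypothetical sketch of app-specific detection: a driver keys an
# optimization off the executable name, so renaming quake3.exe to
# quack3.exe defeats the check -- which is how the behavior was exposed.

def select_texture_quality(exe_name):
    # Case-insensitive match on the running application's name.
    if exe_name.lower() == "quake3.exe":
        return "reduced"  # lowered filtering/texture quality for speed
    return "full"         # default path for unrecognized applications
```

Renaming the binary makes the string comparison fail, so the driver takes the default path and the benchmark numbers drop back down.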


I consider all these Brilinear optimizations to be cheap. I would rather have true tri when I want it, but in the holy grail of "my b@lls are bigger than yours" mentality that has afflicted the entire graphics card industry, things will only get worse. We will see more from Nvidia and ATI trying to one-up each other, squeezing as much FPS as possible to beat the competition. We saw it in a global incarnation with the NV3X and it will continue this generation.

No one would care about the brilinear if it was public knowledge and selectable to make it easy to bench cards at similar settings.

-Vindicated Rollo
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
I wonder what the true performance of the X800 XT is when you consider that ATI has been cheating. For ATI to do this, I would imagine the 6800 Ultra was wiping the floor with it, like the 9800 XT did to the 5900, so they had to cheat to keep up. ;)
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
The thread on Beyond3D is up to 30 pages! Any insight discovered there is lost in a sea of text.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,070
32,594
146
Originally posted by: nemesismk2
I wonder what the true performance of the X800 XT is when you consider that ATI has been cheating. For ATI to do this, I would imagine the 6800 Ultra was wiping the floor with it, like the 9800 XT did to the 5900, so they had to cheat to keep up. ;)
"Cheat", "optimize", just words :) A company does what it must to achieve victory. I am just pleased to see that it is becoming more and more evident, to those who believed otherwise, that there is no moral high ground for any particular corporation. My cynicism towards corporations and governments is something that was learned through experience, not arrived at through whim or fancy ;)

EDIT: BTW, I find this "issue" to be far less disturbing than the collusion between ATi and Valve that I believe took place. That kind of subterfuge and dissemination of disinformation smacks of contempt for the consumer, and that is something I will go on the warpath over! :evil:
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
A quote from ATI:

"New graphics performance records set with ATI's RADEON X800 XT Platinum Edition visual processing unit."

Well, we know how they achieved such performance records now, don't we? By cheating! ;)
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,070
32,594
146
Originally posted by: nemesismk2
A quote from ATI:

"New graphics performance records set with ATI's RADEON X800 XT Platinum Edition visual processing unit."

Well, we know how they achieved such performance records now, don't we? By cheating! ;)
It's OK to be an nV minion, just don't fall on your own sword ;)
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Originally posted by: DAPUNISHER
Originally posted by: nemesismk2
A quote from ATI:

"New graphics performance records set with ATI's RADEON X800 XT Platinum Edition visual processing unit."

Well, we know how they achieved such performance records now, don't we? By cheating! ;)
It's OK to be an nV minion, just don't fall on your own sword ;)


LOL!
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Ackmed
Originally posted by: Matthias99
This was put up by the Inquirer; it is an official statement from ATI on this:

That was posted last page.

Whoops. I totally missed the last 10 posts or so on page 2 of this thread. Sorry about reposting that whole thing. Comments still stand. :)
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: DAPUNISHER
Originally posted by: nemesismk2
A quote from ATI:

"New graphics performance records set with ATI's RADEON X800 XT Platinum Edition visual processing unit."

Well, we know how they achieved such performance records now, don't we? By cheating! ;)
It's OK to be an nV minion, just don't fall on your own sword ;)

I have never been an NV minion because I am a PowerVR minion! ;)
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Looking the same doesn't mean they are doing the same thing, especially when one considers the all-too-important FPS meter.

A quick example:

1. 6800U doing full trilinear and getting 40fps at 1600x1200 4xAA/8xAF
2. X800 doing 'brilinear' and getting 45fps at 1600x1200 4xAA/8xAF

And now the kicker (the all-important FPS meter):

3. 6800U doing full trilinear and getting 40fps at 1600x1200 4xAA/8xAF
4. X800 doing full trilinear and getting 25fps at 1600x1200 4xAA/8xAF

The image can look the same while the performance is quite different. That would make a huge difference to review sites, as well as sway potential buyers.

Once again, games are what matters.
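Working through nitromullet's numbers (the FPS figures in the example above are illustrative, not measurements):

```python
def pct_faster(fps_a, fps_b):
    # How much faster A is than B, in percent.
    return (fps_a / fps_b - 1.0) * 100.0

# Brilinear X800 (45 fps) vs. full-trilinear 6800U (40 fps):
print(round(pct_faster(45, 40), 1))  # 12.5 -- a modest lead on the meter

# Full-trilinear 6800U (40 fps) vs. full-trilinear X800 (25 fps):
print(round(pct_faster(40, 25), 1))  # 60.0 -- the comparison flips hard
```

Same image quality on screen, but a 12.5% lead in one benchmark setup becomes a 60% deficit in the other, which is exactly why review settings matter.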

Either way, you know very well 99% of the time no one is gonna turn it off,
especially those who need those extra FPS. Granted, yes, it would be nice if you could turn it off... but a lot of people would not bother since they can't see a difference.

These three ideas are all related. Having the optimization present is OK in my opinion, as long as it is made public and can be disabled. If I am going to spend $500 on a vid card, I want raw power. If, on top of this raw hardware power, I have the option to enable optimizations to improve my framerates by accepting varying degrees of IQ reduction, then that should be up to me.
 

ChkSix

Member
May 5, 2004
192
0
0
If I am going to spend $500 on a vid card, I want raw power. If, on top of this raw hardware power, I have the option to enable optimizations to improve my framerates by accepting varying degrees of IQ reduction, then that should be up to me.

Definitely. :beer:

By the way Pun, I'm going to have to get that book now. :D
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
Originally posted by: Rollo
Originally posted by: Dean
Geesh. You guys are acting like the ATI implementation of Brilinear is the Devil's doing when in fact Nvidia has been doing it as well for the past two years (and globally at that).
Apparently ATI has been doing it for over a year as well, at least. For those of us who had "ATI's superior AF" rammed down their throat like Ron Jeremy's unit finding Ginger Lynn's tonsils about every day for that year plus, this is fairly huge news.

Rollo, you said that ATI's implementation of Quack was a cheat and gave it a speed enhancement of 20%. Show me the proof, please, as every site I saw showed the FPS to be the same after the so-called "cheat" was removed. What you are saying is an out and out LIE!!
Rollo never lies. The sooner you accept that, the closer to peace you will be. LOL
The truth about Quack
Analysis
Even with stock drivers the RADEON 8500 isn't able to keep up with the GeForce3 and GeForce3 Ti 500 in high quality mode. We do see the performance increase it brings, roughly 15% or greater (although this figure does vary depending on the resolution). As you can see, without the driver modifications the RADEON 8500 finishes behind the GeForce3 Ti 200, thus saving a potentially embarrassing situation for ATI.

More Quack hijinks
The Facts As We See Them
It certainly seems to us here at HardOCP that ATi has in fact included application-specific instructions in their version 5.13.01.3276 Win2K drivers that make Quake 3 Arena benchmarks faster by over 15%.
You may be the only person I've ever seen dispute that Quack was an app-specific cheat on ATI's part that lowered IQ to raise FPS. Where do you get your info? ATI's web page? LOL


I consider all these Brilinear optimizations to be cheap. I would rather have true tri when I want it, but in the holy grail of "my b@lls are bigger than yours" mentality that has afflicted the entire graphics card industry, things will only get worse. We will see more from Nvidia and ATI trying to one-up each other, squeezing as much FPS as possible to beat the competition. We saw it in a global incarnation with the NV3X and it will continue this generation.

No one would care about the brilinear if it was public knowledge and selectable to make it easy to bench cards at similar settings.

-Vindicated Rollo

You are comparing quake.exe and quack.exe on the same bug-ridden driver set. There was a bug in those drivers that affected the new 8500. With the following set of drivers, the Quack fiasco was gone and the FPS was restored. That so-called cheat was an optimization for the r100 series cards, and it had a detrimental effect on the 8500 series. The only thing ATI was guilty of back then was shoddy drivers. NEXT!!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,070
32,594
146
Originally posted by: ChkSix
If I am going to spend $500 on a vid card, I want raw power. If, on top of this raw hardware power, I have the option to enable optimizations to improve my framerates by accepting varying degrees of IQ reduction, then that should be up to me.

Definitely. :beer:

By the way Pun, I'm going to have to get that book now. :D
You will read it 20 times and still find new insights coming to you about its content :beer: I think you will enjoy the parts that deal with his exploits as well. He was a filthy and haggard old man who smelled to high heaven because of his philosophy after his "bath house incident". Bushido is a very rigid philosophy with very little room for the touchy-feely crowd, that's for certain. Apologies for the OT post :)
 

ChkSix

Member
May 5, 2004
192
0
0
Apparently ATI has been doing it for over a year as well, at least. For those of us who had "ATI's superior AF" rammed down their throat like Ron Jeremy's unit finding Ginger Lynn's tonsils about every day for that year plus, this is fairly huge news.

Dean just made quote of the year in my book! I loved that one bro!! :D:beer:
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Bump, because people can't seem to look down 5 posts and feel like making another thread on the same topic.
 

ChkSix

Member
May 5, 2004
192
0
0
Originally posted by: DAPUNISHER
Originally posted by: ChkSix
If I am going to spend $500 on a vid card, I want raw power. If, on top of this raw hardware power, I have the option to enable optimizations to improve my framerates by accepting varying degrees of IQ reduction, then that should be up to me.

Definitely. :beer:

By the way Pun, I'm going to have to get that book now. :D
You will read it 20 times and still find new insights coming to you about its content :beer: I think you will enjoy the parts that deal with his exploits as well. He was a filthy and haggard old man who smelled to high heaven because of his philosophy after his "bath house incident". Bushido is a very rigid philosophy with very little room for the touchy-feely crowd, that's for certain. Apologies for the OT post :)



Yeah, I look forward to reading it. Thanks again bro. Cheers :beer:
 

ChkSix

Member
May 5, 2004
192
0
0
Well, I would hope they redo their testing, with and without optimizations. I believe (but could be wrong) that without the optimizations, the NV40 would easily knock the R420 out of the park, if what the German website reported (a 20% performance hit on the ATI card without optimizations) is true.