ATI cheating in AF?


Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: nitromullet
Maybe true, but not relevant. If you claim to do something with your hardware, then it should do it. Period.

...of course all of this is assuming that there is some cheating going on. We'll have to see about that.


spank spank spank spank spank spank spank spank
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: reever
Originally posted by: Blastman
Quack was a bug. The next driver revision fixed it with no performance difference.

Do you really think he even cares?

Hell, I don't even care. The point is they did indeed cheat. Fixed or not, they aren't as clean as everyone makes them out to be. Sh1t, there isn't an honest company out there anymore.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
This is interesting, and if Dave Baumann says something is fishy, there is something up.

BTW, reading the B3D thread, Dave mentions this has been in since at least the Cat 3.4s.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,036
32,521
146
Originally posted by: VisableAssassin
Originally posted by: reever
Originally posted by: Blastman
Quack was a bug. The next driver revision fixed it with no performance difference.

Do you really think he even cares?

Hell, I don't even care. The point is they did indeed cheat. Fixed or not, they aren't as clean as everyone makes them out to be. Sh1t, there isn't an honest company out there anymore.
Never were many :(
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
What is the performance difference between trilinear and bilinear on these cards? Is it really that huge? Just wondering how you can call NV the winner now. (link plz)
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Marsumane
What is the performance difference between trilinear and bilinear on these cards? Is it really that huge? Just wondering how you can call NV the winner now. (link plz)

Myself, I don't notice much of a difference... but IQ is always subjective; what looks good to one may look like ass on a window to another.
And I believe they're referencing NV as the winner now due to them not "cheating" this go-around.
 

ChkSix

Member
May 5, 2004
192
0
0
Trilinear means more of a hit FPS-wise. This finding not only explains the minimal hit ATi was taking when going from low resolution/no AF to high resolution/high AF, but can also explain the slight lead it had in some of the benchmarked games. If both are running full trilinear at high resolutions, given what is known from the original benchmarks and reviews and what has been found recently about ATi's AF methods, the X800 wouldn't come out ahead at all, but would possibly be behind the 6800 Ultra (maybe not at low res/no AF, which could yield similar results, but definitely at high res/high AF). This is all still speculation, of course.

As far as a winner is concerned, there have been many respectable sites (including Anandtech) that never claimed a sure victor, but said it all comes down to preference and the consumer.
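For anyone wondering why trilinear carries a bigger hit than bilinear, here is a rough back-of-the-envelope sketch. The fetch counts are the textbook ones for these filter modes, not measurements of either card, and the function names are made up for illustration:

```python
# Why trilinear costs more than bilinear: a bilinear sample reads one
# 2x2 texel footprint from the nearest mip level, while a trilinear
# sample reads two 2x2 footprints (adjacent mip levels) and blends them.

def texel_fetches(filter_mode: str) -> int:
    """Texel reads needed for one filtered texture sample."""
    if filter_mode == "bilinear":
        return 4          # one 2x2 footprint on the nearest mip level
    if filter_mode == "trilinear":
        return 8          # two 2x2 footprints, one per adjacent mip level
    raise ValueError(filter_mode)

def af_fetches(filter_mode: str, max_aniso: int) -> int:
    """Anisotropic filtering takes up to max_aniso probes along the
    line of anisotropy, each one a bilinear or trilinear sample."""
    return texel_fetches(filter_mode) * max_aniso

print(af_fetches("bilinear", 16))   # 64 texel reads per AF sample
print(af_fetches("trilinear", 16))  # 128 texel reads per AF sample
```

In the worst case the bandwidth cost roughly doubles, which is why substituting a cheaper blend for full trilinear under 16xAF is worth real frames per second.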
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
If this is true then ATi is definitely cheating. However I can't see how they could possibly detect coloured mip-maps in the first place.



What what what?! BFG, no defense of ATI's "superior" AF?!?!!?

[Rollo drops to the floor, out cold]


LOL Well, I guess the 8,782,415 posts I've read about "that damn nVidia brilinear" just became irrelevant. What will fanATIcs beat into the ground now?
 

ChkSix

Member
May 5, 2004
192
0
0
I sent Brent over at HardOCP an email asking him to cover this on his website. I am interested to see if he'll look into it further, or just brush it off as BS (which doesn't seem to be the case).
 

reever

Senior member
Oct 4, 2003
451
0
0
What will fanATIcs beat into the ground now?

Or what will the Nvidia defenders with their "oh, if it doesn't reduce IQ it's fine" argument do now, when they're turning the opposite of the argument they used not too long ago into an attack on ATI...
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: reever
What will fanATIcs beat into the ground now?

Or what will the Nvidia defenders with their "oh, if it doesn't reduce IQ it's fine" argument do now, when they're turning the opposite of the argument they used not too long ago into an attack on ATI...




Well, the tables have now turned: "wholesome" ATi is now busted, so the fanboys are all gonna come in with excuses to justify their purchase. Whenever a company screws up, the fanboys always justify that company.
Personally, as long as it don't look like ass when I'm playing, do what ya need to to get my FPS :)
Right now everyone's beloved ATi has been busted, and there will be a lot of people that'll defend that. Just watch :)
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Originally posted by: VisableAssassin
Originally posted by: reever
What will fanATIcs beat into the ground now?

Or what will the Nvidia defenders with their "oh, if it doesn't reduce IQ it's fine" argument do now, when they're turning the opposite of the argument they used not too long ago into an attack on ATI...




Well, the tables have now turned: "wholesome" ATi is now busted, so the fanboys are all gonna come in with excuses to justify their purchase. Whenever a company screws up, the fanboys always justify that company.
Personally, as long as it don't look like ass when I'm playing, do what ya need to to get my FPS :)
Right now everyone's beloved ATi has been busted, and there will be a lot of people that'll defend that. Just watch :)


Sorta like when they said 2-slot solutions suck and they don't have room for the extra slot, then they start buying VGA Silencers to turn their ATI cards into 2-slot solutions.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Schadenfroh
Originally posted by: VisableAssassin
Originally posted by: reever
What will fanATIcs beat into the ground now?

Or what will the Nvidia defenders with their "oh, if it doesn't reduce IQ it's fine" argument do now, when they're turning the opposite of the argument they used not too long ago into an attack on ATI...




Well, the tables have now turned: "wholesome" ATi is now busted, so the fanboys are all gonna come in with excuses to justify their purchase. Whenever a company screws up, the fanboys always justify that company.
Personally, as long as it don't look like ass when I'm playing, do what ya need to to get my FPS :)
Right now everyone's beloved ATi has been busted, and there will be a lot of people that'll defend that. Just watch :)


Sorta like when they said 2-slot solutions suck and they don't have room for the extra slot, then they start buying VGA Silencers to turn their ATI cards into 2-slot solutions.


Yeah, I've mentioned that in another thread.
I'm just gonna sit back and watch as this thread either goes great and offers some good conversation, or turns into a fanboy flame war...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
As far as a winner is concerned, there have been many respectable sites (including Anandtech) that never claimed a sure victor, but said it all comes down to preference and the consumer.
That is precisely why this is a big deal. Currently, ATi holds the paper-launch lead, with the X800 XT ahead of the 6800 Ultra by a small gap. If the claims that ATi is cheating are true, it may turn that small lead for ATi into a legitimate lead for nVidia.
 

Cawchy87

Diamond Member
Mar 8, 2004
5,104
2
81
Originally posted by: Naustica
Originally posted by: Schadenfroh
Originally posted by: Naustica
So Matrox is the only honest company left. :)


nope, they got caught a few years ago, guess you will have to go with Cirrus Logic


I'd rather go with Intel Extreme Graphics.


There's nothing extreme about them, lol; false advertising! Now there's something on them too.
 

ChkSix

Member
May 5, 2004
192
0
0
Originally posted by: nitromullet
As far as a winner is concerned, there have been many respectable sites (including Anandtech) that never claimed a sure victor, but said it all comes down to preference and the consumer.
That is precisely why this is a big deal. Currently, ATi holds the paper-launch lead, with the X800 XT ahead of the 6800 Ultra by a small gap. If the claims that ATi is cheating are true, it may turn that small lead for ATi into a legitimate lead for nVidia.


Exactly, Nitro. That slight lead ATi held in some benchmarks because of this will now go away, and Nvidia will clearly be leading the boards in every benchmark and at every resolution, instead of being neck and neck with ATi half of the time.

What surprises me most is that supposedly this 'cheat' has existed since the Cat 3.4s (I forget who said it here, but it's mentioned on another forum, Beyond3D perhaps), and yet it took this long to be uncovered.

If anything, follow the Beyond3D forum thread regarding this. The guys in that thread are very knowledgeable in the area and are continuing to uncover more about it. Supposedly, ATi is using an 'optimized' filter on its X800 that was not on its 9800 series, which can account for some of the increase in speed on the new card. But don't ask me about it... I haven't the first clue. Just point the browser to Beyond3D's thread on this and learn as much as you can.

Off topic... I just wish they would ship the 6800s already. I want the Ultra, and my damn CC finger is very itchy at the moment.

:D
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
This is what Dave Baumann asked Quasar, the original finder of the "cheat":

"Have you actually shown where any of this gives a detrimental output in games?"


This is Quasar's response: "If I'd say both me and my co-author did notice IQ differences in the published UT2003 shots after an 18-hour day of hard labour late at night, would you believe me?
Without magnification or contrast-changing tools, that is.

I guess not.

As far as detrimental (I had to look up that word) is concerned - I am not sure if I'd call myself the authority to call upon in such cases.
I think everyone should decide for him or herself."



This does not seem so clear-cut to me. I'm interested in ATI's response to this. It seems they are basing all their findings on comparisons of the older generation of cards (R3x0) to the new (R420), expecting the filtering to be exactly the same. If they compared the filtering on the NV3x class against the NV40, it would not be the same either. I think what we have going on here is fanATIcs and nVidiots on both sides attempting to dig up whatever garbage they can on their respective "enemy". We saw it first with the garbage that was spewing out of Driver Heaven a couple weeks ago against Nvidia, and now we are seeing more pointless garbage pointed towards ATI by computerbase.de.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
I think we should punish ATI by shorting their stock. It's overpriced anyways.
 

ChkSix

Member
May 5, 2004
192
0
0
This came from techreport, but is posted on page 18 of the Beyond3d thread regarding this topic.
-------------------------------------------------------------------------------------------------------------------------
Damn, those translated articles are tough to read, but here's my interpretation (someone correct me if I'm wrong).

1) Using UT2k3, they compared the X800 vs the 9800 pro.

1280x1024 4xAA 16xAF:
Radeon X800 pro 62.26 fps
Radeon 9800 XT 39.89 fps

2) Then they underclocked the x800 to 9800XT speeds:

1280x1024x32 4xAA 16xAF:
X800 @ 9800XT 49.68 fps
Radeon 9800 XT 39.89 fps

3) Then they used coloured mipmaps:

R360 vs. R420 "speciale" firstcoloredmip 1
1280x1024x32 4xAA 16xAF:
X800 @ 9800 XT ColorMips 1 39.59 fps
9800 XT ColorMips 1 39.67 fps

The contention is that regardless of what setting is selected in the control panel or application, the drivers use 'brilinear'. Only when the driver detects colored mipmaps does it use trilinear, taking a 20% performance hit and making it the same speed as a 9800 XT clock-for-clock.

The differences in the image quality may be subtle, but now you have to be careful when comparing benches between different architectures. For example, if a 6800U and X800 show similar performance scaling, but the X800 suddenly jumps ahead when AF is turned on, is it a fair comparison if the 6800U is using trilinear while the X800 is using brilinear?

[edit] I personally think this issue is looking a little shady (though more info may be forthcoming).

If you select trilinear, you should get trilinear.

If ATI has come up with some whizz-bang 'brilinear' method, then fantastic; but put another checkbox in the control panel that users can explicitly enable.

Is trilinear a poor performer you don't want people to use in the current drivers? That's fine; grey out the option for trilinear in the current drivers and enable it when it's ready.
------------------------------------------------------------------------------------------------------------------------
We saw it first with the garbage that was spewing out of Driver Heaven a couple weeks ago against Nvidia, and now we are seeing more pointless garbage pointed towards ATI by computerbase.de

This is not pointless garbage, especially when you're considering dropping 400-500 dollars on a card that doesn't perform like they claim it does. Findings like this are important for the consumer, whichever side they paint the bad picture for. Knowing the facts can not only make you wiser, but better at choosing the hardware that will give you the best bang for your buck. If you think it's stupid, then they should just stop all the benchmarking/testing/reviewing/uncovering on the internet and let consumers "guess" what is best for them.
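The alleged behavior in the quoted analysis (brilinear everywhere, full trilinear only when colored mipmaps are detected) can be sketched in a few lines. Everything here is illustrative: the blend-band width, the function names, and the detection flag are assumptions made to show the shape of the accusation, not ATI's actual driver logic:

```python
# Full trilinear blends the two nearest mip levels across the whole
# transition; "brilinear" only blends inside a narrow band around the
# crossover point and snaps to a single mip level elsewhere, saving
# texel fetches. Band width of 0.2 is a made-up number for illustration.

def trilinear_weight(lod: float) -> float:
    """Blend factor between mip N and N+1 for a given LOD."""
    return lod % 1.0  # linear blend across the whole [0, 1) transition

def brilinear_weight(lod: float, band: float = 0.2) -> float:
    """Blend only within +/- band/2 of the crossover; snap outside it."""
    frac = lod % 1.0
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0             # pure mip N (bilinear-only region)
    if frac > hi:
        return 1.0             # pure mip N+1
    return (frac - lo) / band  # short linear ramp across the band

def driver_weight(lod: float, colored_mipmaps_detected: bool) -> float:
    # The accusation in a nutshell: full trilinear only when a filtering
    # test (colored mipmaps) is detected, brilinear in normal gameplay.
    if colored_mipmaps_detected:
        return trilinear_weight(lod)
    return brilinear_weight(lod)

# Sanity-checking the quoted numbers: forcing trilinear via colored
# mipmaps dropped the underclocked X800 from 49.68 to 39.59 fps.
print(f"{(49.68 - 39.59) / 49.68:.1%}")  # ~20.3%, matching the ~20% claim
```

If filtering really switches on a detection flag like this, the benchmark comparison problem is obvious: the tester's IQ tool sees one code path while the game being benchmarked runs another.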
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
This is not pointless garbage, especially when you're considering dropping 400-500 dollars on a card that doesn't perform like they claim it does. Findings like this are important for the consumer, whichever side they paint the bad picture for. Knowing the facts can not only make you wiser, but better at choosing the hardware that will give you the best bang for your buck. If you think it's stupid, then they should just stop all the benchmarking/testing/reviewing/uncovering on the internet and let consumers "guess" what is best for them.


It becomes pointless garbage when these sites find something slightly questionable, then go on a holy crusade to every forum on the web and incite a mob-riot mentality in the fanboys within. It's all done with the hope of attracting more "HITS". This current scandal may turn out to be a cheat, but I'm not convinced yet.

Next week some ATI fans will find another "cheat" in Nvidia cards, and we'll be through this mess all over again.

What these sites fail to do is investigate all the variables. They see an anomaly and jump to conclusions instead of double-checking everything, including other driver sets, possible driver bugs, and possible hardware bugs.

Everything is a cheat these days...
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
"Have you actually shown where any of this gives a detrimental output in games?"

How can that question even be relevant? Isn't the basic issue that when colored mipmaps are used to test the filtering, it is somehow detected and trilinear filtering is properly applied to make it appear everything is working properly, but in any other instance, brilinear filtering is substituted for a performance gain, with no way for the user to force trilinear?
It's detrimental to comparative benchmarking (if that's the case), and whether it is noticeable or not is irrelevant.
 

ChkSix

Member
May 5, 2004
192
0
0
Maybe, Dean. But so far this 'cheat' is turning more into a reality and less into mere suspicion, as it is popping up on more and more tech forums where people have the knowledge and insight to do their own investigative work and report their personal findings. So far, what they have found supports the German website's findings.

But as with the 5800-5900 debacle, I think it is very important for people to read about information like this before jumping the gun and dumping a lot of cash on anything strictly out of brand loyalty. No one cares about us, yet we give these companies our hard-earned money. And anyone who invests their hard-earned money in anything (house, car, clothes, computers, etc.) should know as much as possible about all the pros and cons before making the leap.

This is a big thing. If what they are saying is true, then the X800 series needed filters and optimizations just to compete with or slightly nudge past Nvidia this round. If the optimizations are removed, the X800 loses, and probably by more than just a small margin. And just as I regretted jumping the gun and buying a 5900 (thank God for Nvidia's driver fixes, as well as my 9800 Pro in my second computer) because I didn't know all of the facts, this can happen again, only now the roles are reversed with ATi's current lineup.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Should be fun! Optimizations versus power usage. So I guess we need an estimate of how much the extra power used would cost over a year, and then micro-analyze all drivers for cheats and not upgrade until better tech is released. The not upgrading would smarten both these companies up quickly.

I am seriously concerned about cheats, because it feels like ATI is following the proven Nvidia example: most will accept more speed for a tiny loss in quality. Disappointing, as all sites seemed to agree that the IQ, power efficiency, and performance of the X800 were very nice. I am more concerned with power usage, as global warming and oil wars are now a fact of life.