OMG!!! ATI isn't CHEATING!!!


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Have people missed the fact that nVidia does the same thing?

If you want to slam it as a cheat then fine but be sure to slam both vendors instead of just singling out ATi.

Like I said before, I don't have a problem with either vendor doing it.



WTF? Since when? You had a big problem with nVidia doing it until you found out ATI was doing the same thing and everything you had said was untrue. :roll:

http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1272268&highlight_key=y&keyword1=AF
The 9700 Pro might not be faster than the 5950 but crank up both cards to their maximum AF and AA and use a decent resolution like 1600 x 1200 and the gap will be much smaller than you think, not to mention that the 9700 Pro will have better image quality and it also doesn't cheat.

http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1297019&highlight_key=y&keyword1=Brilinear
The two modes are bilinear and trilinear. Brilinear was born out of nVidia's cheating during the 5xxx days when they used a combination of the two to increase performance.
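The bilinear/trilinear/"brilinear" distinction can be sketched in a toy one-dimensional form (entirely my own simplification for illustration, not any vendor's actual sampler): full trilinear always blends the two nearest mipmap levels, while "brilinear" samples a single level almost everywhere and only blends in a narrow band around the mip transition, which is where the performance saving comes from.

```python
def lerp(a, b, t):
    """Linear interpolation between two sampled values."""
    return a + (b - a) * t

def trilinear(mip_a, mip_b, frac):
    """Full trilinear: always blend the two nearest mip levels.
    frac is the fractional distance between the levels (0..1)."""
    return lerp(mip_a, mip_b, frac)

def brilinear(mip_a, mip_b, frac, band=0.25):
    """'Brilinear': pure single-level (bilinear) sampling except in a
    narrow band around the mip transition, where it blends like
    trilinear. The band width here is an arbitrary illustrative value."""
    if frac < band:
        return mip_a          # cheap path: only the finer level
    if frac > 1.0 - band:
        return mip_b          # cheap path: only the coarser level
    # remap the remaining transition band to 0..1 and blend
    t = (frac - band) / (1.0 - 2.0 * band)
    return lerp(mip_a, mip_b, t)
```

Halfway between two mip levels the two modes agree; near either level, brilinear skips the blend entirely, which is why the banding is hard to spot in stills but shows up in filter testers.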

http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1227849&arctab=arc&highlight_key=y&keyword1=Brilinear
nVidia does exactly the same and worse.

For example, when they detected UT2003.exe they used brilinear AF but if the executable was renamed it used correct trilinear AF. Now you can't even rename the app as nVidia uses brilinear AF in all Direct3D applications.

Yeah, you never cared.
This is sort of like that anti-smoking commercial where the kid says: "Tough to get hung by your own words".
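The executable-renaming trick described in the quote above implies the driver was keying its filtering mode off the process name. A minimal sketch of that kind of detection (the table and function names are hypothetical, purely to illustrate the mechanism the renaming test exposed):

```python
import os

# Hypothetical per-application override table keyed by executable name.
# Renaming the binary defeats exactly this sort of lookup.
APP_OVERRIDES = {
    "ut2003.exe": "brilinear",
}

def pick_filtering(exe_path, requested="trilinear"):
    """Return the filtering mode a driver using name-based app
    detection would actually apply, given the user's requested mode."""
    exe = os.path.basename(exe_path).lower()
    return APP_OVERRIDES.get(exe, requested)
```

With this scheme, `ut2003.exe` silently gets brilinear while the same binary renamed to anything else gets the requested trilinear, matching the behavior BFG10K described.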
 

nick1985

Lifer
Dec 29, 2002
27,153
6
81
Originally posted by: VIAN
This is ATI's response to Driver Heaven's inquiry.

Makes sense.

I feel bad for Nvidia. ATI are better business people; they know how to persuade.

i bet you feel awful for nvidia. you probably cry yourself to sleep knowing only a few months ago you were an insane nvidia freak. did you do your penance?
 

Gagabiji

Golden Member
Oct 1, 2003
1,460
0
0
it degrades IQ, it is a cheat. ATI is trying to cover it up. over at NVNEWS.net they have several posts about it, and even the ATI "pre-release" meeting documents in pdf format, explaining "tricks" on AF filtering. Yes, I linked to nvnews, but that doesn't make it a biased post; the forums are littered with ATI folks, and they even feel betrayed.



Hey, there is this forum, I can't remember the name of it, but it says nVidia's cheating MAJOR time, more than we even know, and it's nVidia fanboys saying it.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Rollo, don't confuse Brilinear in UT2K3 on NV cards with the adaptive Trilinear they say they are using. In UT2K3 on NV cards, for a long time, there was NO WAY to force true trilinear filtering. At least, that's what I remember the bulk of the arguing being about. There is a difference between an adaptive algo and an ON/OFF switch.

I'm all for app optimizations, and an adaptive trilinear algo. that can give me the looks of trilinear AF and the speed of a mixed mode is only a good thing IMO, especially since this applies across the board in all games.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
You can't force true Trilinear on an ATI card now either.

BTW Rollo, those are some classic quotes from BFG10K.
It is amazing how people's tunes change when something is being done in their camp, isn't it?

BTW I am also enjoying how the people who bash Nvidia for the same thing take the PR release as the word of god :)
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
I'm just waiting for AT and other review sites to get full trilinear on both cards for the (more) final review of these next-gen cards. As for the pertinence of the 'cheat': if you put an NV card on 8x AF and it decided that 2x AF looked just as good, and that you probably wouldn't notice, so it would go ahead and change it for you, the fanboys (both sides, I would hope) would explode with rage. But then again, frankly, it has nothing to do with IQ; it has to do with the benches. Benches are about running different cards on the exact same settings through and through and seeing which does it better... if they aren't on the same settings, they serve no purpose. <sarcasm>I mean, did you know that the 6800 gets 82.8 fps in FarCry and the x800XT only gets 47.2?</sarcasm> Oh wait, that's 1024 no AA/AF vs 1600 4x/8x, so it means exactly squat.

I just want an even benchmark instead of this shady business of telling us what is good enough...
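ZobarStyl's point about identical settings can be stated as a trivial rule. A sketch of a hypothetical benchmark-comparison guard (my own construction, not any review site's harness): refuse to compare two runs whose settings differ at all.

```python
def compare_bench(run_a, run_b):
    """Return the fps delta between two benchmark runs, but only if
    every setting (resolution, AA, AF, filtering mode, ...) matches;
    otherwise the comparison is meaningless."""
    if run_a["settings"] != run_b["settings"]:
        raise ValueError("settings differ; the numbers mean exactly squat")
    return run_a["fps"] - run_b["fps"]

# The FarCry numbers from the post above fail this check, because one
# run is 1600x1200 4xAA/8xAF and the other is 1024x768 with neither:
high = {"settings": {"res": "1600x1200", "aa": 4, "af": 8}, "fps": 47.2}
low  = {"settings": {"res": "1024x768",  "aa": 0, "af": 0}, "fps": 82.8}
```

Calling `compare_bench(high, low)` raises, which is the whole point: a driver quietly changing filtering mode breaks the settings-equality precondition the same way a resolution mismatch does.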
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Shamrock
who cares if it's an "official response"? as said in my thread by Chksix, it's PR BS. you actually think they would admit to it? and as far as lowering IQ, according to computerbase.de it DOES lower IQ

i don't believe computerbase.de ever stated it lowers iq. i recall him saying in a reply on a forum somewhere (b3d maybe? i'm not gonna take the time to search) that it wasn't his place to judge; rather, the users should decide for themselves..
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: alexXx
typical nvloser. The website is named nvnews. Think about it.


not at all.. sure there are a few fanboys (nowhere near what you'll find at r3d tho, and one might argue there's more ati fanboys at nvnews trolling than there are nv fanboys, lol) but most of the posters there are pretty objective. the mods certainly are.. and mike chambers, who runs it, got his x800pro the same time i did.. and he has an x800xt on preorder as well, heh.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
You can't force true Trilinear on an ATI card now either.

Isn't the adaptive trilinear only enabled on the 9600/X800? Don't the rest of the lineup use the older method?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: chsh1ca
Rollo, don't confuse Brilinear in UT2K3 on NV cards with the adaptive Trilinear they say they are using. In UT2K3 on NV cards, for a long time, there was NO WAY to force true trilinear filtering. At least, that's what I remember the bulk of the arguing being about. There is a difference between an adaptive algo and an ON/OFF switch.

I'm all for app optimizations, and an adaptive trilinear algo. that can give me the looks of trilinear AF and the speed of a mixed mode is only a good thing IMO, especially since this applies across the board in all games.

you can't force trilinear on my x800pro either (you can, i suppose.. it just doesn't actually use it)
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: rbV5
You can't force true Trilinear on an ATI card now either.

Isn't the adaptive trilinear only enabled on the 9600/X800? Don't the rest of the lineup use the older method?

correct.. it works on 9700/9800
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Oh, and in case you didn't see it in the other thread, ATI will be holding a live chat:

Chat With ATI
ATI's texture filtering algorithms
There has been a lot of discussion about our trilinear filtering recently. In an effort to clear this up, ATI is holding a live web chat today, May 19, 2004 at 3pm EST, 9pm CET.

Speakers:

Andy Pomianowski, Staff Engineer, and architect of our image analysis and compression algorithms.
Raja Koduri, Engineering manager, graphics architecture and performance tools, one of ATI's experts on performance and image quality.
To chat live, please visit us at 3pm EST, 9pm CET


Link
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
i bet you feel awful for nvidia. you probably cry yourself to sleep knowing only a few months ago you were an insane nvidia freak. did you do your penance?
Oh yes :(.

:)

To me, it just seems like the gaming community misunderstood what trilinear filtering actually does. ATI cleared it up, but Nvidia didn't, and that's why they got the stick.


Shoot, I want them to bring adaptive trilinear to the 9700 Pro. Bring it.
 

TStep

Platinum Member
Feb 16, 2003
2,460
10
81
Can the adaptive trilinear be applied to the 9700/9800 series? I sure would like it to be, since a 9800 is currently in my system.

I am definitely for 99% of the maximum possible image quality at only a minor performance hit. Other than most of those on the current bashfest, who wouldn't want this? Hey, it would definitely extend the usable life of the card performance-wise, would it not? I know I would love to be able to run an overclocked 9800np with eye candy and, clock for clock, get the same performance as an X800pro. Is there something I am missing from all of the chit-chat going on?
 

imported_obsidian

Senior member
May 4, 2004
438
0
0
I love this. ATI does the EXACT same thing as NVIDIA does with "brilinear" filtering, and ATI is all good because they put out a press release? Please. Personally, I think brilinear filtering is a good thing and always have. It is free performance. However, watching the ATI fanboys who have been bashing NVIDIA on this subject for so long try to come up with justification for ATI's change makes me laugh.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Shamrock
Both vendors might do it, but at least NVidia doesn't deny it or try to hide it. The drivers actually turn it off and on automatically when an application is detected. :\ At least with NV's drivers there is an option to turn it off by going to High Quality.
Actually, nV also slipped bri under the radar, starting with the benchmark favorite UT2K3 (and it was evidently an app-specific opt b/c renaming the filter tester to ut2003.exe showed bri :)). Eventually, nV extended bri to all games in both APIs. And a generation later, they included a checkbox to disable tri opts, for which they deserve props.

I agree that slipping it under the radar is not a nice move, and I hope ATi eventually offers a checkbox to disable their tri opts.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: nemesismk2
Originally posted by: Ackmed
Originally posted by: Shamrock
it degrades IQ, it is a cheat. ATI is trying to cover it up. over at NVNEWS.net they have several posts about it, and even the ATI "pre-release" meeting documents in pdf format, explaining "tricks" on AF filtering. Yes, I linked to nvnews, but that doesn't make it a biased post; the forums are littered with ATI folks, and they even feel betrayed.

You can't have trilinear filtering with this optimization. There's not even a selection to toggle it on or off.

Show me a pic where it degrades IQ?

btw, just because it's posted at nvnews doesn't mean anything. That site is worthless.

Typical ATIdiot: just because a website doesn't fully support ATI video cards, it's worthless! :disgust:

Nice childish name calling.

I dont care who it supports, but banning people for saying something bad (often times true) about NV is just silly.
 
MajorCatastrophe

Feb 28, 2004
72
0
0
This isn't cheating, this is a perfectly sensible optimisation. If no/less filtering is required in particular situations, apply filtering as appropriate. So long as there are no visible differences in actual gameplay situations, this is like calling z-culling cheating. If the GPU decides it doesn't need to render some pixels, it doesn't render them. Everyone seems happy with that, but now ATI are saying their filtering algorithm can detect how much filtering it actually needs to do to achieve the same visual effect, and you're calling that cheating?! What nVidiot fanboi came up with that!

So long as it doesn't degrade the image quality noticeably when I'm playing games, and gets me an FPS boost at the same time, I don't care what it's doing. And so long as this is the generic behaviour of the algorithm, I'm fine with it, since I'm totally against app detection (and against developers writing code paths to cater for specific GPUs, which makes me wonder whether these people forget why we have standard APIs like Direct3D and OpenGL in the first place).
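The z-culling analogy can be made concrete with a toy decision rule (a guess at the general principle only; ATI has not published the actual algorithm, and `adaptive_trilinear` and its threshold are my own invention): when the two adjacent mip levels are nearly identical, the full blend buys no visible quality, so a cheaper single-level sample is returned instead.

```python
def adaptive_trilinear(mip_a, mip_b, frac, threshold=0.01):
    """Illustrative adaptive filtering: skip the expensive blend when
    the two mip-level samples are close enough that the result would
    look the same anyway, like a z-cull skipping hidden pixels."""
    if abs(mip_a - mip_b) < threshold:
        # cheap path: the levels barely differ, take the nearer one
        return mip_a if frac < 0.5 else mip_b
    # levels differ visibly, so do the full trilinear blend
    return mip_a + (mip_b - mip_a) * frac
```

The point of the sketch is that the output only deviates from full trilinear in cases where the deviation is, by construction, below the visibility threshold; whether ATI's real threshold is actually invisible is exactly what the thread is arguing about.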
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
See, I was thinking it's along the lines of z-culling. If the card is more efficient, and applies trilinear filtering any time it is required (for quality purposes), what is the issue? NV enabling brilinear I don't really have a problem with either, but from the description this sounds far less like NV's Brilinear mode and far more like a trilinear optimization to help performance. They aren't hurting IQ at all, apparently, and I have yet to see anyone show tangible evidence that they are.
If you have two ways of accomplishing the same task, the method that requires less work is the more efficient method. It seems a lot of people are hung up on ATI choosing to be more efficient. The fact is, this has been in place since the Cat 3.4s and NOBODY NOTICED IT until ATI basically handed people the "how to check on IQ being harmed or not". If they hadn't mentioned that specifically in the PDF, this would still be something purchasers of ATI cards would be reaping benefit from without even knowing about it. That is the key part: nobody could tell there was something being done differently, for quite some time now. I mostly agree with what MajorCatastrophe has said: if it doesn't affect IQ and gives you higher performance, great, I'm all for it. Where's the discussion of the IQ differences? How much of a difference is there, if any, and are the differences making the image quality worse or better?
 

hysperion

Senior member
May 12, 2004
837
0
0
actually, it has been in place since Cat 3.4, but only on the 9600 series of cards. most sites do IQ tests on the high-end cards, aka the 9800s, which didn't have it in place, so it was probably never noticed. remember Shader Day, with the 9600 beating the 5900U? now we know why. then again, the NV 5xxx series wasn't that great in the first place. this news did make my mind up for me: moving from a 9800XT, which has been a beautiful card, to the BFG 6850/6800 UE upon release.