ATI's Radeon X800 texture filtering game at Tech Report


AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
I think the ATI technique is fine.

I think the ATI benches are fine (since they represent what most people would probably use anyway).

I am disappointed that ATI felt the need to hide this "feature" (which makes me believe they did not want people to know they were doing this).

I am disappointed that ATI advertises full trilinear by default.

I am disappointed that ATI recommended that reviewers compare their optimized filtering with Nvidia's full trilinear filtering.

I would actually like to see a retest of the 6800 with Nvidia's optimizations enabled, as well as a comparison of their optimized IQ against ATI's optimized IQ to see if they are similar (maybe it really is better to compare ATI optimized against Nvidia full trilinear if Nvidia's optimizations degrade IQ significantly - in which case I can forgive ATI one of their faults).

If anyone knows of benchies where the 6800 was tested with an "optimized" filtering algorithm, please post.

Thanks,
D'oh!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Any points ATI has scored on NVIDIA over the past couple of years as NVIDIA has been caught in driver "optimizations" and the like are, in my book, wiped out.
That's an extremely naive comment, especially since even now, if you poked around nVidia's drivers, you'd probably find application detection and shader substitution going on.

Besides, we already knew that ATi's control panel AF didn't provide full trilinear AF long before this fiasco happened. I wonder why Tech-Report didn't have a problem with it months ago?
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Some of you are missing the point of the whole article: it's not about cheating but about ATI's false advertising of their products!

Imagine if ATI sold cars, they would advertise power steering but it would only work some of the time! :D

What we need is for Nvidia to release drivers with ALL of their optimised code in place (most of which was disabled for the 6800) and redo all of the recent reviews because they are worthless! :(
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: nitromullet
Originally posted by: VIAN
They didn't have to... ATi themselves told us in the live chat that they were running an optimization. The problem is not them running an optimization that has little or no effect on IQ, but that they instructed reviewers to disable nVidia's optimizations during testing to keep the testing fair.
Very unfair; each card should have each optimization enabled to see once and for all who is truly the man at fps, at least.
Actually, they should be tested both ways, with and without the optimizations. That will tell us who has the most raw, brute power as well as who has the best optimization in terms of IQ and performance.

The two optimizations are not even in the same ballpark. NV's degrades IQ, whereas ATi's doesn't. It's obvious some of you don't know the difference between brilinear and adaptive trilinear... you really shouldn't be putting your two cents in.
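For readers unsure what that distinction is, here is a minimal sketch; the blend-band width is invented purely for illustration, since neither vendor has published its exact numbers. Full trilinear blends between the two nearest mip levels across the whole fractional LOD range, while "brilinear" snaps to cheap bilinear outside a narrow band around the mip transition:

```python
def trilinear_weight(lod: float) -> float:
    """Full trilinear: blend between the two nearest mip levels
    across the entire fractional LOD range."""
    return lod - int(lod)  # fractional part drives the blend

def brilinear_weight(lod: float, band: float = 0.25) -> float:
    """'Brilinear': stay on cheap bilinear for most of the range and
    only blend inside a narrow band around the mip transition. The
    band width here is made up for illustration; real drivers tune
    it internally."""
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0                  # pure bilinear, nearer mip
    if frac > hi:
        return 1.0                  # pure bilinear, farther mip
    return (frac - lo) / (hi - lo)  # short blend across the band

# The narrower the band, the fewer texels are fetched from two mips
# at once (faster), but the more visible the banding at transitions.
for lod in (3.1, 3.4, 3.5, 3.6, 3.9):
    print(lod, round(trilinear_weight(lod), 2), round(brilinear_weight(lod), 2))
```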
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
Imagine if ATI sold cars, they would advertise power steering but it would only work some of the time!

This is oddly a valid comparison, as you really only need power steering to work some of the time: when stopped and at low speed.
 
MajorCatastrophe

Feb 28, 2004
72
0
0
I don't think this is deception; I think it's a missed marketing opportunity. As one of the ATi guys who was interviewed said, they could have called it ATi SmartFilter or something and no one would have minded.

I don't think this is as bad as the nVidia brilinear thing, since ATi's method only kicks in when there will be no loss in image quality. So they've done a performance optimisation but have been very mindful of IQ, which cannot be said of nVidia's brilinear. It took nVidia a while before they actually owned up and put an option in their control panel. And it's not like they're detecting the application either, which would suck.

This doesn't affect my choice of vid card when I upgrade next, though. I do hope nVidia's IQ in Far Cry is due to the drivers/game thinking NV40 = NV30, because I want a GT, but the screenshots taken on the X800 Pro clearly look better.
 

ericlp

Diamond Member
Dec 24, 2000
6,137
225
106
Far Cry...

ATI X800: 65.8 fps, Nvidia 6800 Ultra: 62.0 fps

ATI takes the lead again. I need to upgrade my 9700 Pro.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
Also, let's say ATI or nVidia, either one, came up with a new way of doing AA, adaptive AA if you will. Now let's say this was done "generically", so it did not require app detection, and therefore you didn't have to wait around for a new specifically coded driver to take advantage of it in your spankin' new game that just came out yesterday. This new technique only applies AA to the lines/places on the screen needed to create a jaggy-free image, so we can assume (just for an example) this cuts workload by oh... say 2/3 over traditional AA methods, and let's just assume it causes no or virtually no degradation of image quality over the previous methods.

Assuming all the above variables are true, is this a feature people would like to see disabled in benchmarking? Would anyone consider this a cheat simply because the competitor's card cannot take advantage of such a feature?
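A minimal sketch of the hypothetical scheme YBS1 describes, with invented numbers and a 3x3 box blur standing in for a real AA resolve: the extra work is spent only where a local contrast test flags likely jaggies.

```python
import numpy as np

def adaptive_aa(image: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Spend the expensive smoothing (a box blur stands in for a real
    AA resolve) only on pixels whose neighborhood has high contrast,
    i.e. likely jaggies; flat regions pass through untouched, which is
    where the hypothetical 2/3 workload saving would come from."""
    h, w = image.shape
    out = image.copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = image[y - 1:y + 2, x - 1:x + 2]
            if window.max() - window.min() > threshold:  # edge detected
                out[y, x] = window.mean()                # smooth it
    return out

# Toy test: a hard diagonal edge gets smoothed, flat areas are skipped.
img = np.triu(np.ones((8, 8)))
print(adaptive_aa(img))
```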
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
Originally posted by: MajorCatastrophe
I don't think this is deception; I think it's a missed marketing opportunity. As one of the ATi guys who was interviewed said, they could have called it ATi SmartFilter or something and no one would have minded.

I fully agree; had they called this a hot new feature, people would be throwing roses at their feet instead of trying to drag them through the mud. I would change my mind if someone could show a real-world situation in any actual game where this degrades IQ to a noticeable degree and isn't an obvious bug/mistake such as the bilinear COD shots.

To clarify my "I would change my mind" statement: I believe it is wrong for either company to "decide" what level of image quality I need. I'll decide what fps/IQ compromise I'm willing to make; I don't need nor want someone else to do it for me. If there is any noticeable degradation of image quality from an optimization, I don't have a problem with the optimization remaining in place, I just want the option to turn it off.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
People want to get a clear picture of a card's true performance, apples to apples
Apples to apples isn't possible, and if it were, then the scores would be the same. I say run at the card's optimal performance/IQ and then BENCH.

This is where I disagree. If ATI has a feature that Nvidia does not, why should they be forced to turn it off just to "level the playing field"? It would be no different from asking Nvidia to turn off their PS3.0 features in tests because ATI doesn't offer them. If ATI can produce equal IQ with and without the feature enabled, then they shouldn't be forced to disable it in tests. Just my opinion though...
EXACTLY
 

TStep

Platinum Member
Feb 16, 2003
2,460
10
81
On the marketing crap: all those disturbed by this should have realized by now that you have always been buying "the best". Haven't you seen that the Ford-Chevy-Dodge commercials always proclaim they are the best in class? Everything is advertised as the answer to end all. Be informed, do the research, find your choice. I personally don't agree with the ATI marketing technique on this one, intentional or not, but worthy of a holy war? I think not.

VIAN - on the apples to apples, I agree. Nvidia's techniques are not the same as ATI's, nor will they produce the exact same image. Being very simple-minded, I take the stance that the ends justify the means. The end is the most convincing image produced.

In my naive little world, apples-to-apples benchmarks should be based on equivalent image quality. We don't care how many cubic inches of engine or how much horsepower it takes; we care about how quick a car will go 0-60 or a quarter mile. If one company's 2xAA and 4xAF produce an image equivalent to another's 4xAA and 8xAF, should they not then be benchmarked at those equivalent image settings? This will naturally bring up the question of what equivalent settings are, similar to Intel 2.4 = AMD 2400+. I don't buy that clicking equivalent boxes in a control panel, benchmarking, then proclaiming a card the best is an apples-to-apples comparison. Do the research and be an informed buyer.

To each his own, but remember you have already purchased the best, probably 1000 times over at this point in your lives.

My $0.02
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,045
32,547
146
Originally posted by: nemesismk2
Some of you are missing the point of the whole article: it's not about cheating but about ATI's false advertising of their products!
That's what the problem is, and I haven't seen this point effectively countered yet. I'm on record as saying I believe this optimization is a good thing, but it is false advertising based on ATi's marketing of their products, plain and simple. Is it par for the course? Absolutely, but that does not excuse or mitigate the tactics employed.

My advice to anyone who reads this is to be cynical towards all corporations and not develop a warm fuzzy feeling towards any of them, AKA fanboyism, because you will always end up being burned by them in the end ;)
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
The problem when comparing the adaptive AA/AF techniques is that while jagged edges come and go depending on the angle you view an object from (making adaptive AA a good idea), the only time trilinear turns on in ATi's algorithm is for colored mipmaps, i.e. when you are looking for it. I'm all for adaptive AA and AF; those, if properly tuned, can be godsends. However, other than colored mipmaps, no one has pointed out when these cards actually turn on full trilinear. It's not a beneficial feature like adaptive AA; it's an old-fashioned bait-and-switch. Compound that with the fact that they say it's always on, and you have an obvious problem. Kudos to ATi for having good enough IQ that no one noticed this until now, but no one has given a real situation where this actually turns on trilinear and thus has beneficial results.
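ATi has not disclosed its heuristic, so the following is only an assumed illustration of how a driver could tell "real" mip chains from colored ones: check whether each mip level is simply a box-filtered copy of the level above it, and fall back to full trilinear when it is not.

```python
import numpy as np

def downscale(level: np.ndarray) -> np.ndarray:
    """2x2 box-filter downscale: how standard mip chains are built."""
    h, w = level.shape[0] // 2, level.shape[1] // 2
    return level.reshape(h, 2, w, 2).mean(axis=(1, 3))

def needs_full_trilinear(mip_hi: np.ndarray, mip_lo: np.ndarray,
                         tol: float = 1e-3) -> bool:
    """Assumed heuristic, for illustration only: if the lower mip is
    just a filtered copy of the upper one, blending between them adds
    little visually, so cheaper filtering is deemed 'safe'. Colored
    mipmaps (hand-painted per level, exactly what reviewers use to
    detect trilinear) fail the test, so full trilinear switches on."""
    return bool(np.abs(downscale(mip_hi) - mip_lo).max() > tol)

base = np.random.rand(8, 8)
standard_mip = downscale(base)       # derived the normal way
colored_mip = np.full((4, 4), 0.5)   # a 'detection' mip level
print(needs_full_trilinear(base, standard_mip))  # False -> optimize
print(needs_full_trilinear(base, colored_mip))   # True  -> full trilinear
```

This is exactly why a colored-mipmap test can be fooled: the check fires on the detection texture itself, so the reviewer sees trilinear even though ordinary game textures take the cheaper path.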
 

MechxWarrior

Senior member
Mar 2, 2004
565
0
76
Originally posted by: ZobarStyl
However, other than colored mipmaps, no one has pointed out when these cards actually turn on full trilinear.

You can say the same about PS3.0. Where does it give a "real" visual improvement over PS2.0?

I personally do NOT blame ATi for this. I blame their PR/marketing team for hiding this fact. It causes NO visual issues, so it's no problem for me.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: DAPUNISHER
Originally posted by: nemesismk2
Some of you are missing the point of the whole article: it's not about cheating but about ATI's false advertising of their products!
That's what the problem is, and I haven't seen this point effectively countered yet. I'm on record as saying I believe this optimization is a good thing, but it is false advertising based on ATi's marketing of their products, plain and simple. Is it par for the course? Absolutely, but that does not excuse or mitigate the tactics employed.

My advice to anyone who reads this is to be cynical towards all corporations and not develop a warm fuzzy feeling towards any of them, AKA fanboyism, because you will always end up being burned by them in the end ;)


Exactly, no one speaks of the deceitfulness there :(
Now, NV's brilinear on the 5xxx series was not very good looking. If they can implement it right and make it look good... well, it should be able to compete with ATi's "adaptive" method.
But either way, I really don't like being told "this is how this will run" when it indeed runs a different way.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
You can say the same about PS3.0. Where does it give a "real" visual improvement over PS2.0?

Mech, the advantages of PS3.0 are many; most notably, it can improve efficiency with dynamic branching and can also (due to its huge instruction count) create in one pass graphical effects that take PS2.0 multiple passes. As for the visual benefits, look up the Unreal 3 video and take a look at all the effects they are doing with PS3.0... one is making a flat brick wall look like a 3D masterpiece (laugh all you like, but if you see that video you will drool). Here's a reference from Microsoft (take it with a grain of salt, though), and for whatever it is worth, at least devs from many companies are touting the abilities of PS3.0... is anyone in the industry praising ATi for using a lower-level filtering technique to save fps? You can argue whether or not many games will use PS3.0, but don't compare a step in the right direction to ATi's regression to old filtering. All ATi did was realize that if all the people doing benchies are running AA/AF at the highest levels, the bilinear filtering's effect on IQ will be cleaned up in the final image; they said they sold us solid gold, but it's just electroplated :disgust:
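To make the dynamic-branching point concrete, here is a toy sketch in plain Python (not shader code; the workload and mask are invented): an SM3.0-style path can skip expensive work per pixel, while an SM2.0-style path must evaluate everything and mask the result afterwards.

```python
def expensive_lighting(pixel: float) -> float:
    """Stand-in for a long, costly shader path."""
    return pixel * 0.8 + 0.2

def sm2_style(pixels, lit_mask):
    """No true per-pixel branching: the costly path is evaluated for
    every pixel and the mask merely selects results afterwards, so
    the full cost is paid everywhere."""
    computed = [expensive_lighting(p) for p in pixels]  # always runs
    return [c if m else 0.0 for c, m in zip(computed, lit_mask)]

def sm3_style(pixels, lit_mask):
    """Dynamic branching: pixels the mask rejects skip the costly
    path entirely, which is where the efficiency win comes from."""
    return [expensive_lighting(p) if m else 0.0  # branch skips work
            for p, m in zip(pixels, lit_mask)]

pixels = [0.1, 0.5, 0.9, 0.3]
lit = [True, False, True, False]
# Same final image either way, but the SM3.0-style path ran the
# expensive function only half as many times.
assert sm2_style(pixels, lit) == sm3_style(pixels, lit)
```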
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Imagine if ATI sold cars, they would advertise power steering but it would only work some of the time!
It sounds a lot like nVidia's application detection and shader substitution.

Besides, ATi's adaptive trilinear is working all the time. It's not like it only works for one game.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Originally posted by: BFG10K
Imagine if ATI sold cars, they would advertise power steering but it would only work some of the time!
It sounds a lot like nVidia's application detection and shader substitution.

Besides, ATi's adaptive trilinear is working all the time. It's not like it only works for one game.

That's the problem... adaptive trilinear means "Hey, we disabled trilinear but told you it's always on". ATi has released a card that does NO trilinear except in certain situations that would never occur in normal gaming (no one has pointed out a situation other than colored mipmaps that turns on trilinear).
You are right... ATi's adaptive trilinear is always working, against you.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
As far as the visual benefits, look up the Unreal 3 video and take a look at all the effects they are doing with PS3.0
Sorry, but I think the UE3 demo is SM2.0 at this point (it was also shown on an X800).

ATi has released a card that has NO trilinear except in very certain situations that would never occur in normal gaming (no one has pointed out a situation other than colored mipmaps that turns on trilinear).
Proof, please? Now we've gone from trylinear not using tri all the time to not using it most of the time? I don't think that's the case, but we haven't seen enough evidence yet to say one way or the other.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
I'm not sure what video you watched, but I was talking about the shakycam one from the 6800U launch, in which the speaker (an Epic employee) says that the effects being used are part of PS3.0. But as we've also pointed out, large PS3.0 shaders can mostly be run on PS2.0; it just takes many more passes and is less efficient. So you could be right; I never saw the X800 video of UE3. That aside, the original point remains the same, Pete: no one should talk down PS3.0 (an advancement) as being similar to the trick that is adaptive trilinear (a regression). If the effects in UE3 are completely capable of running in PS2.0, then I'm sorry, but I don't think anyone is going to argue that PS2.0 is better than 3.0 and that we shouldn't look forward to it.
As for the trilinear itself, I was referencing the article from which this thread was started, and I keep asking for more situations, but no one has shown a case other than colored mipmaps where trilinear is being used. The article points out that the ATi reps cited "dynamically generated texture maps" as another example, but this has not been confirmed or denied, so like anything from a PR guy, I'll take it with a grain of salt. But since the employees defending this technique can only come up with two examples of when trilinear is on, one of which is a technique for testing for the presence of trilinear, I think it's safe to say it's not on much at all.

I agree with you that more evidence would be nice, Pete, and I look forward to this issue being resolved... but to be honest, all of the existing evidence makes this little algorithm look bad, and furthermore their little chat only made them look like they were floundering for a response. If this were really a valid technique, they would have had much better answers and examples for why they had this thing in the first place, as well as why they never told us. I've nothing against ATi; they make good hardware, but they're going to have to come up with a top-notch excuse for this foul-up if they want to save any of the credibility they gained while NV was doing the same.

I personally just want to see both NV and ATi focus harder on the hardware instead of stooping to these lousy optimizations that negate what they accomplished by building such good chips.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
as being similar to the trick that is adaptive trilinear (a regression)
I think you are wrong. I don't think that adaptive anything is a regression. Nvidia has adaptive precision; they don't have adaptive trilinear, but they do have adaptive AF. Adaptive is the new trend where more performance is gained with minimal IQ loss, usually unnoticed, especially in FPSs. Adaptive is sort of like the way mip-maps work: each mip-map further away has less resolution, instead of one straight resolution.
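VIAN's mip-map analogy in sketch form, assuming the common box-filter construction (real tools and hardware vary): each level halves the previous one, so distant surfaces sample progressively coarser copies.

```python
import numpy as np

def build_mip_chain(texture: np.ndarray) -> list:
    """Each successive level halves the previous one by averaging 2x2
    blocks, so geometry further away samples a coarser copy instead
    of one full-resolution texture. Assumes a square, power-of-two
    texture for simplicity."""
    chain = [texture]
    while chain[-1].shape[0] > 1:
        prev = chain[-1]
        h, w = prev.shape[0] // 2, prev.shape[1] // 2
        chain.append(prev.reshape(h, 2, w, 2).mean(axis=(1, 3)))
    return chain

levels = build_mip_chain(np.random.rand(16, 16))
print([lvl.shape for lvl in levels])  # (16,16), (8,8), (4,4), (2,2), (1,1)
```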
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
We've gone from using trilinear to bilinear; can that be described as anything near progress? In the article that spawned this thread there is a picture of an ATi slide that says reducing trilinear to bilinear is an unacceptable filtering trick, and I'm inclined to agree with ATi on that one. I've already compared adaptive AA and adaptive trilinear; please see a couple of posts above for my stance on that. Also, as I said before, if this technique were actually useful like adaptive AA, why did no one bother to tell us, and why, when we asked the devs that question as well as what this algorithm does, were they struggling to answer? One-line answers to complex questions? Corporate run-around, nothing more.

Also, when someone asked (rightly so) when they would restore full trilinear, they simply stated they never took it away and that no benches were invalidated by this... the point of this whole debacle is that we would like a level playing field to bench on, and they flat out tell us no, they can't give us full trilinear? I've stated before that I have no problem with ATi, but this kind of corporate childishness is inexcusable. This is our only real demand: a choice. And if the benches will still be valid, let us run them with full trilinear to restore your credibility; you have nothing to fear.

Then they said the reason they didn't put in an 'always trilinear' option was to keep their settings layout simpler... they'll put in a full slider bar for this algorithm but not a single lousy checkbox?

But this was the real turning point for me:
We are constantly tuning our drivers and our hardware. Every new generation of hardware provides the driver with more programmability for features and modes that were hard-wired in the previous generation. We constantly strive to increase performance and quality with every new driver release.
Sometimes many such optimizations are not even communicated internally to marketing and PR teams for example. And many optimizations are very proprietary in nature and we cannot disclose publicly anyways.
Point 1) This is not a new-generation algorithm: it has been around since the 9600.
Point 2) They didn't bother to tell marketing? They lost the memo? After NV catches so much flak for brilinear, ATi implements a filtering technique of their own and hasn't gotten around to telling marketing since Cat 3.4 (note: Cat 3.4 was released mid-May 2003)?
Point 3) Since the X800 is not the more feature-rich of the next-gen cards, wouldn't a couple more buzzwords really help the marketing of this card? If the X800 had PS3.0 and FP32 there would have been no question that this was the card I wanted, but the feature set is lacking... why not shore it up by touting its new Adaptri Algorithm or some other hyped name? It could make millions more.
Occam's razor: they don't want reviewers running full trilinear because it slows the card.

People on both sides are blasting each other and I'm really trying to avoid a flamefest, but the burden of proof is really on ATi now to get out and show us a convincing reason why all this happened. If they can actually run full tri and maintain their performance lead over NV, it just might change my mind on which card to buy... but if they can't be bothered to give me a fully functional card as claimed, I can't see a good reason to own an X800.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
No real problem with the optimization, but the way they went about it was dirty. And with the X800 in a virtual dead heat performance-wise while offering much less of a feature set, Nvidia will be getting my business again in the fall.

6800GT here I come.
 

pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: nitromullet

The problem is not them running an optimization that has little or no effect on IQ, but that they instructed reviewers to disable nVidia's optimizations during testing to keep the testing fair.

Yes - that does come across as pretty sneaky and underhanded. Hmmm... if I didn't think ATi were cheating before, after seeing those slides and instructions ATi sent out, they definitely don't come off looking good. :roll: