ATi's filtering methods explained

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Hanners has written a nice little article on ATI's filtering methods:

In recent weeks a lot of space, discussion and bandwidth have been dedicated to ATi's texture filtering methodology, particularly on their new Radeon X800 boards.

Away from all that, one thing I realised as a result of much of the forum discussion surrounding recent issues is that there is a great deal of confusion about ATi's texture filtering method in general, and more specifically the various different optimisations that are implemented as part of it. So, rather than go down the road of simply discussing the filtering method used on ATi's R420 and RV3x0 boards, I wanted to create a more catch-all explanation of the different facets of their techniques - Mainly how they work, what they mean for users and how (if at all) they can be controlled.

Read the full article here:
Link
 

pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: rbV5
Hanners has written a nice little article on ATI's filtering methods:

In recent weeks a lot of space, discussion and bandwidth have been dedicated to ATi's texture filtering methodology, particularly on their new Radeon X800 boards.

Away from all that, one thing I realised as a result of much of the forum discussion surrounding recent issues is that there is a great deal of confusion about ATi's texture filtering method in general, and more specifically the various different optimisations that are implemented as part of it. So, rather than go down the road of simply discussing the filtering method used on ATi's R420 and RV3x0 boards, I wanted to create a more catch-all explanation of the different facets of their techniques - Mainly how they work, what they mean for users and how (if at all) they can be controlled.

Read the full article here:
Link

Thanks for linking the article - looks like a good read >>> I'll be going through that this afternoon after lunch then :)
 

PliotronX

Diamond Member
Oct 17, 1999
8,883
107
106
Originally posted by: Schadenfroh
/pulls up chair and grabs popcorn
/me ups the ante with some frosty :beer: (Sam Adams summer brew to be exact)
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I think Hanners does a very good job explaining the different filtering modes. The pics and animations pretty clearly demonstrate what he is describing (however, I'm not seeing much difference between bilinear and trilinear in the animation).

Obviously, the "trylinear" filtering produces a different image than full trilinear as demonstrated, but the fact that the brightness had to be cranked up 400% on a synthetic, static demonstration just to show it illustrates how well the implementation works IMO.
From a more real-world perspective, some testing in Unreal Tournament 2004 failed to bring up any circumstances where filtering quality was noticeably compromised to the naked eye.
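
For anyone curious how those brightness-boosted comparison shots work, it's basically just an amplified difference image. Here's a rough C sketch of my own (not Hanners' actual tool - single-channel pixels and the 4x factor are just assumptions):

#include <stdlib.h>

/* Amplify the per-pixel difference between two screenshots so subtle
   filtering changes become visible (the "brightness 400%" trick). */
unsigned char *diff_amplified(const unsigned char *a,
                              const unsigned char *b,
                              size_t nbytes)
{
    unsigned char *out = malloc(nbytes);
    if (!out) return NULL;
    for (size_t i = 0; i < nbytes; i++) {
        int d = abs((int)a[i] - (int)b[i]) * 4;   /* 400% boost */
        out[i] = (unsigned char)(d > 255 ? 255 : d);
    }
    return out;
}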

I'd like to see the same approach taken comparing the filtering on the X800 vs. the 6800 cards after they become available.
 

skace

Lifer
Jan 23, 2001
14,488
7
81
So should/can nVidia implement something similar? Will they get burned at the stake for doing so?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
They tried to last year, and yes they were burned at the stake for doing so.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Genx87
They tried to last year, and yes they were burned at the stake for doing so.

To clarify:

NVIDIA last year added this 'brilinear' mode that is often mentioned. What this does is a (fairly shoddy) mix of bilinear and trilinear, with no 'intelligence' at all; it uses the same filtering quality across the board. In some situations it looks OK, but in others it looks, well, bad. On top of this, the way they 'introduced' it was to silently force its use in certain applications (specifically UT2K3, which is heavily used for benchmarking). This, of course, gave them a nice boost in their UT2K3 numbers, but was widely decried as 'cheating', since it noticeably drops IQ in some situations.
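
To put the difference in concrete terms, here's a little C sketch of my own (the real hardware LOD math and NVIDIA's actual blend thresholds aren't public, so the band placement and numbers are illustrative only):

#include <math.h>

/* Full trilinear: blend linearly between the two nearest mip levels
   across the entire transition. */
double trilinear_weight(double lod)
{
    return lod - floor(lod);   /* fractional mip distance, 0..1 */
}

/* 'Brilinear': pure bilinear (one mip level) over most of the range,
   with a narrow blend zone around the mip boundary. band = 1.0
   degenerates to full trilinear, band = 0.0 to pure bilinear. The
   fixed band is the point: it never adapts to texture content. */
double brilinear_weight(double lod, double band)
{
    double f  = lod - floor(lod);
    double lo = 0.5 - band * 0.5;
    double hi = 0.5 + band * 0.5;
    if (f <= lo) return 0.0;           /* nearer mip only */
    if (f >= hi) return 1.0;           /* farther mip only */
    return (f - lo) / (hi - lo);       /* compressed blend zone */
}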

ATI has what seems to be a pretty decent adaptive trilinear algorithm, which does not appear to cause noticeable IQ drops in 'normal' situations -- and even in pretty contrived ones, it seems to do OK, since it often falls back to full trilinear. And while ATI did silently introduce this algorithm in the RV360 (9600/9600XT), it uses it all the time (as opposed to just turning it on for certain benchmarks). So there are a few differences in what each company did, although neither is blameless.
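
ATI haven't published the exact heuristic, but conceptually it would be something like the sketch below - note that the 'compare each mip level to a downsampled copy of the one above it' test, the grayscale pixels, and the threshold are all my own assumptions, not their actual algorithm:

#include <stdlib.h>

/* Hypothetical adaptive-trilinear decision, single-channel mips: if a
   mip level looks like a standard box-filtered reduction of the level
   above it, a narrow blend band is visually safe; if it diverges
   (e.g. hand-authored mips), fall back to full trilinear. */
double choose_blend_band(const unsigned char *mip_upper,  /* level N,   2w x 2h */
                         const unsigned char *mip_lower,  /* level N+1,  w x  h */
                         int w, int h)
{
    long err = 0;
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            /* 2x2 box filter of the upper level */
            int avg = (mip_upper[(2*y)  *(2*w) + (2*x)  ] +
                       mip_upper[(2*y)  *(2*w) + (2*x+1)] +
                       mip_upper[(2*y+1)*(2*w) + (2*x)  ] +
                       mip_upper[(2*y+1)*(2*w) + (2*x+1)]) / 4;
            err += abs(avg - mip_lower[y*w + x]);
        }
    }
    double mean_err = (double)err / ((double)w * h);
    /* Threshold is invented for illustration. */
    return (mean_err < 2.0) ? 0.25 : 1.0;   /* narrow band vs. full trilinear */
}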
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: skace
So should/can nVidia implement something similar? Will they get burned at the stake for doing so?

With programmable GPUs, shader compilers, etc., it's all about optimizing performance while maintaining IQ. Sure, Nvidia should be doing whatever they can to offer the most performance to their customers, and of course it's possible for them to do it. That's why it's important to not just measure performance with benchmarks, but to objectively analyze IQ as well.

As far as getting burned at the stake, it's a competitive business with a rabid fanbase...it's part of the territory. Most of it is crap, but some of the "burning" is what keeps the industry moving forward.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
So should/can nVidia implement something similar?
They already have adaptive trilinear and it appears to work in a similar fashion to ATi's. The problem is that you can't enable it without also enabling brilinear.