Full Trilinear filtering enabled with registry keys?


ZobarStyl

Senior member
Mar 3, 2004
657
0
0
I'll give ATi kudos for the fact that the AF covers up most of the IQ problems, but don't kid yourself: they did this to win the benchies, and where you see a 12-17% gain, I see a card that had to cut corners to crown itself the speed champion. And would you people please stop saying "I wish my 9xxx Radeon had this"... it does; it's called "turn on bilinear instead of tri." All this 'optimization' does is do that without asking, and still tells you it's tri. If you think being forced into something that used to be an option (and then being lied to about it) is a benefit, you've got problems.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
I can't wait till ATi changes this "optimization" in the next driver release. Then we can get back to a pure SM3.0 vs. SM2.0 debate :p
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
And would you people please stop saying "I wish my 9xxx Radeon had this"... it does; it's called "turn on bilinear instead of tri." All this 'optimization' does is do that without asking, and still tells you it's tri.

That, of course, is incorrect. The "Trylinear" filtering is not simply bilinear filtering :roll:. There are no bilinear or trilinear filtering options in the Catalyst drivers; there are quality and performance settings. Nothing tells you you're using trilinear filtering when you are in fact not.

Of course I wish it were supported on the r3xx cards. The fact is, there will probably be a way to disable it on the cards where it's implemented, and no way to enable it on the cards where it isn't.
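For anyone fuzzy on the terms, the whole argument comes down to whether the sampler blends between mip levels. A minimal Python sketch of the two textbook filters (toy code for illustration only, not how any driver actually implements them):

```python
import math

def bilinear_sample(tex, u, v):
    """Standard 2x2 texel blend within a single mip level (tex: 2-D list of floats)."""
    h, w = len(tex), len(tex[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear_sample(mips, u, v, lod):
    """Blend bilinear samples from the two nearest mip levels of a mip chain."""
    lo = min(int(math.floor(lod)), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    t = min(max(lod - lo, 0.0), 1.0)  # fractional distance between the two levels
    # Plain bilinear stops after sampling mips[lo]; skipping the second sample
    # and the blend below is exactly what makes mipmap boundaries visible.
    return (1 - t) * bilinear_sample(mips[lo], u, v) + t * bilinear_sample(mips[hi], u, v)
```

So "trylinear" sits somewhere between those two functions; the question in this thread is how often it takes the cheap early-out.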
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Please, if you're going to argue that point, rbV, post some example of when the X800s are actually using trilinear that doesn't involve colored mipmaps or some other artificial means of detecting it, because AFAIK there are none. I mean, as the OP, you of course realized that trilinear isn't on, or it wouldn't need to be hacked back in, would it?

There are only two times I know of that tri is enabled: 1) with colored mipmaps, and 2) with a reg hack. So the point is, every single user out there with a normal X800 is getting nothing but bilinear, since colored mipmaps never occur in normal gaming. Is that right? Wouldn't that make "trylinear" "simply bilinear filtering"? Once again, please post an example if you would.

Also, refer to Cainam's posts if you would, since he's the only person around here who has an X800, and he states that he's quite able to see that it's bilinear only.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
There are only two times I know of that tri is enabled: 1) with colored mipmaps, and 2) with a reg hack. So the point is, every single user out there with a normal X800 is getting nothing but bilinear, since colored mipmaps never occur in normal gaming. Is that right? Wouldn't that make "trylinear" "simply bilinear filtering"? Once again, please post an example if you would.

Take the time to read this article on ATI's filtering methods. Link
 
Apr 14, 2004
1,599
0
0
Also, refer to Cainam's posts if you would, since he's the only person around here who has an X800, and he states that he's quite able to see that it's bilinear only.
No, he is able to see where the card switches from trilinear to bilinear.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Cainam has stated (and posted) screenshots showing a clear and obvious line that would not be visible with any form of trilinear filtering. And Rb, that article, though interesting, still shows that bilinear is being used in "trylinear" and (as I stated) has a small but detrimental effect on IQ. However, what I asked is whether there are any actual situations when trilinear filtering is used, because (as has been discussed multiple times and summed up in the Tom's article) the only time it's explicitly visible to anyone is in the places ATi told us to look.

Don't get me wrong I by no means think the x800 series is sub-par but I'm very interested in one of these next-gen cards and I want the one that is truly fastest for the money: if ATi releases a full-trilinear driver that bests the NV offering then I'll go ATi but if they go out of their way to trick us into thinking the benches were even-ground so they could win, then I'm less open to handing my money to them.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ZobarStyl
Cainam has stated (and posted) screenshots showing a clear and obvious line that would not be visible with any form of trilinear filtering.

In one game (Dark Age Of Camelot, hardly the most popular title around), in a few places. It appears to be a flaw in ATI's 'trylinear' algorithm; it probably needs to be applying full trilinear filtering on certain textures and it isn't for some reason.

And Rb, that article, though interesting, still shows that bilinear is being used in "trylinear" and (as I stated) has a small but detrimental effect on IQ. However, what I asked is whether there are any actual situations when trilinear filtering is used, because (as has been discussed multiple times and summed up in the Tom's article) the only time it's explicitly visible to anyone is in the places ATi told us to look.

From the article:

What 'trylinear' filtering does is determine when and where full trilinear filtering is and isn't needed, and thus only uses bilinear filtering on parts of the scene that it is decided doesn't need full trilinear. In other words, what you get in most situations is a mixture of bi and trilinear filtering, rather than 'true' trilinear at all times.

'Trylinear' uses a mix of trilinear and bilinear, and ideally attempts to use bilinear whenever it can get away with it without an IQ hit, and trilinear when it can't (for example, with colored mipmaps, or when the application provides its own mipmaps instead of letting the drivers generate them). It certainly seems to give an IQ that's a *lot* better than bilinear filtering, and in most cases seems indistinguishable from trilinear. They even said:

From a more real-world perspective, some testing in Unreal Tournament 2004 failed to bring up any circumstances where filtering quality was noticeably compromised to the naked eye, and there was certainly no problems with mipmap boundaries being clearly visible as was seen on occasions with the early implementation of nVidia's 'brilinear' filtering (which has since improved markedly).

Does it matter if it's 'real' trilinear or not if the end result is the same?
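To make "adaptive" concrete: nobody outside ATI knows the real heuristic, but a decision of roughly this shape is what the article describes. A purely hypothetical Python sketch (the `Texture` fields and the 0.05 threshold are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Texture:
    has_custom_mipmaps: bool  # app supplied its own mip chain (e.g. colored mipmaps)
    mip_contrast: float       # how different adjacent mip levels look, 0..1

def choose_filter(tex: Texture, threshold: float = 0.05) -> str:
    # Driver-generated mip chains are predictable, so the driver can judge
    # them; app-supplied chains get full trilinear as the safe fallback --
    # which would explain why colored-mipmap tests always show trilinear.
    if tex.has_custom_mipmaps:
        return "trilinear"
    # Where adjacent mip levels are nearly identical, the trilinear blend
    # is invisible anyway, so using bilinear there is "free" performance.
    return "bilinear" if tex.mip_contrast < threshold else "trilinear"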
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
And Rb, that article, though interesting, still shows that bilinear is being used in "trylinear" and (as I stated) has a small but detrimental effect on IQ.

What it's saying is that a mix of trilinear and bilinear is used in each scene, depending on where it can be applied without affecting image quality, and obviously it's not "perfect", because we've seen where there's a difference between r3xx and rv3xx/r4xx. It's pretty good, though, for the same reasons.


However, what I asked is whether there are any actual situations when trilinear filtering is used, because (as has been discussed multiple times and summed up in the Tom's article) the only time it's explicitly visible to anyone is in the places ATi told us to look.

Since it's only recently been discovered how to disable the adaptive trilinear, the jury is still out until IQ is carefully examined. From what I've seen, it's far easier to point out the few instances where it impacts IQ than it is to state whether it's using full trilinear or trylinear filtering. To say trylinear = bilinear indicates a misunderstanding of how the filtering works. So far, it's been shown that "most" of the time you can't tell the difference between trylinear and trilinear.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Thanks for your post, Matthias; it was well reasoned and helpful. Frankly, I'm not arguing that "trylinear" isn't capable of actual trilinear, but it all goes back to how this algorithm determines when/where the filtering is needed, and for what reason, and that's why the situation as a whole bothers me. This all stems back to the ATI dev chat, which left me sour: it felt like a runaround when they couldn't just say straight out how it worked, citing its proprietary nature (it's been in the drivers for a year, get a copyright already). If you're right and it's just a mix (as shown here), then it's the same as brilinear; this "trylinear" moniker needs to be dropped, and they need to release full tri for accurate benching, as nV was finally compelled to do after their similar implementation.

That said, I'm frankly getting confused at this point... if it's brilinear, why are the ATi people who so vehemently hated nV's bri suddenly loving this, when previously their hard line was tri, nothing less accepted? If it was such a great improvement, why did they lie and tell us it was full tri, and that we needed to make sure nV was doing similar filtering? nV came right out with their brilinear, but it was blasted left and right; so now ATi does it but lies, and it's OK? And as for Cainam and his DAoC problems, as you said it's not the most popular game, whereas the extremely popular UT2K4 has no problems; are we possibly looking at an application-detection aspect of this algorithm?

And the most important little tidbit... why is it so important that it's full tri? Simple: like I said before, I really want to buy one of these next-gen cards, and I want the one that's actually faster for the money. Therefore, I want some even-level benches before I spend a big chunk of change on some silicon. And secondly, it's important because they said it was there... and I don't like being lied to.

I saw these same arguments (from opposite sides) in the 9xxx/5xxx wars a while back, but with that product line the better IQ also came with the faster performance (the Radeons), so it was no real mental leap as to which was worth the money; however, I'm still on the fence for this generation, and this shady stuff (from both camps) is really starting to annoy.
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
Originally posted by: ZobarStyl
nV came right out with their brilinear, but it was blasted left and right; so now ATi does it but lies, and it's OK?
Brilinear reduced image quality to the point where it was visible, and you didn't need coloured mipmaps to see it. ATi's adaptive trilinear never really compromised IQ to the point where you could see it in-game. Just like ZombieJesus said:

Why bother, when the adaptive trilinear filtering produces better performance with no IQ impact? Unless you play games with a giant 6x magnifying glass in front of your monitor.

You needed to look at coloured mipmaps and in-game screenshots just to find out it was there. And think about it (I don't remember which site found it): the site that discovered it might even have looked right over the spot and not thought anything was wrong. Seriously, can you tell from any screenshots that it's there? Did you look at the pictures in reviews of the cards and think, "WTF? What's with the weird filtering?" Sure, I may be an ATI fan, but I don't think there's anything wrong with it. And anyway, I have a 9600 Pro; it's supposed to have the trylinear filtering too, but it's never been a problem for me.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
why is it so important that it's full tri? Simple: like I said before, I really want to buy one of these next-gen cards, and I want the one that's actually faster for the money.

Like most people. If you want to get the best value for the money, being an early adopter is hardly the way to go about it anyway. 6800 cards are still not really showing up in retail yet, and their feature set isn't even fully supported in the drivers. The X800 cards haven't been fully tested either, so I don't see how a fair comparison can be made until we have the cards available, their actual price points set by the marketplace, and their respective feature sets enabled in available drivers.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Fair enough, rbV... talk to you then =). Really, though, sorry if I seem cross, but right now I'm on a Ti4200 with dual 19's and the old girl just can't take much more, so all this waiting and then getting the runaround is tiring me out, and I'm getting cranky with my hurting FPS. I frankly don't care who makes my next card, so long as I get what I pay for.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: ZobarStyl
Fair enough, rbV... talk to you then =). Really, though, sorry if I seem cross, but right now I'm on a Ti4200 with dual 19's and the old girl just can't take much more, so all this waiting and then getting the runaround is tiring me out, and I'm getting cranky with my hurting FPS. I frankly don't care who makes my next card, so long as I get what I pay for.

I've been eagerly awaiting the new-gen cards as well. I'm more in wait-and-see mode, since my present AIW card suits my main rig for now. I'm more interested in the video and display aspects of the new cards than raw gaming performance, so patiently I wait.....
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ZobarStyl
Thanks for your post, Matthias; it was well reasoned and helpful. Frankly, I'm not arguing that "trylinear" isn't capable of actual trilinear, but it all goes back to how this algorithm determines when/where the filtering is needed, and for what reason, and that's why the situation as a whole bothers me.

Well, OK, but *any* adaptive algorithm is like that. Does it bother you that MSAA may not be properly deciding where to apply antialiasing? Or that your angle-dependent AF may not be optimal?

This all stems back to the ATI dev chat, which left me sour: it felt like a runaround when they couldn't just say straight out how it worked, citing its proprietary nature (it's been in the drivers for a year, get a copyright already).

I can see you've never dealt with patent law (not copyright). It can take several years to get a patent, assuming things go well.

If you're right and it's just a mix (as shown here), then it's the same as brilinear; this "trylinear" moniker needs to be dropped, and they need to release full tri for accurate benching, as nV was finally compelled to do after their similar implementation.

ATI does an *adaptive* mix of trilinear and bilinear filtering, which, in theory, should give quality very close to trilinear (assuming you're doing it right, which they appear to be). NVIDIA's 'brilinear' just uses the same lower-quality filtering for everything across the board -- even when you *do* need trilinear filtering for full detail (as you can see from comparisons using colored mipmaps: the 6800 with optimizations is clearly filtering less, whereas in those cases the X800 kicks back into full trilinear).

Also, Tom's provides two pictures -- one comparing X800 'trylinear' against bilinear, and one comparing 9800XT trilinear against bilinear. However, they don't show the original screenshots. Try subtracting their two shots from each other in Photoshop (giving you X800 'trylinear' compared to 9800XT trilinear). They're really not as different as the article makes them sound.
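If you don't have Photoshop handy, the same subtraction is a few lines with the Python Imaging Library (the filenames are made up; use whatever you saved Tom's shots as):

```python
from PIL import Image, ImageChops

# Hypothetical filenames for the two screenshots being compared.
trylinear = Image.open("x800_trylinear.png").convert("RGB")
trilinear = Image.open("9800xt_trilinear.png").convert("RGB")

diff = ImageChops.difference(trylinear, trilinear)  # per-pixel absolute difference
diff.save("filtering_diff.png")  # a mostly-black result means nearly identical images
```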

If it was such a great improvement, why did they lie and tell us it was full tri, and that we needed to make sure nV was doing similar filtering?

Their opinion is that it *is* full trilinear, because it defaults back to full trilinear whenever that is needed (or at least it's supposed to). 'Brilinear' (at least on things like colored mipmaps) provides a lower level of filtering. However, as you've noted, it's impossible to tell mathematically which is 'better' in real-world situations.

nV came right out with their brilinear, but it was blasted left and right; so now ATi does it but lies, and it's OK?

Um, actually, NVIDIA 'introduced' brilinear by silently forcing its use in UT2K3 last year in order to inflate their benchmark numbers. Then they forced it on all the time, and until recently you had to use various driver hacks to get rid of it on the GeForceFX cards (now, I believe, you can just turn it off in the drivers). I'm not happy with how either company has handled this in terms of marketing, but I think ATI's technical implementation is a lot better.

And as for Cainam and his DAoC problems, as you said it's not the most popular game, whereas the extremely popular UT2K4 has no problems; are we possibly looking at an application-detection aspect of this algorithm?

He's only seeing issues with certain textures in DAoC, not across the entire game. I don't think it's app detection; more likely ATI didn't test very well (if at all) with that game, and it does something weird with some of its textures that trips up the algorithm. Seems like the most likely explanation to me.

And the most important little tidbit... why is it so important that it's full tri? Simple: like I said before, I really want to buy one of these next-gen cards, and I want the one that's actually faster for the money. Therefore, I want some even-level benches before I spend a big chunk of change on some silicon. And secondly, it's important because they said it was there... and I don't like being lied to.

If it provides full trilinear quality (or very, VERY close to it) with a decent performance boost, does it make sense to force the X800 to bench with the optimizations off? This starts to get back to whether you can ever get truly 'apples to apples' benchmarks. ATI maintains that their optimized trilinear is just as good as the 'real thing', and that you should bench it against NVIDIA's full trilinear. NVIDIA obviously disagrees. Who do you believe? How do you do your benchmarking? Should you test ATI without its optimizations, even though 99% of the time there's no IQ difference, and 99% of the people that buy the card will run it with them on?
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
I think we can all agree that, be it nVidia, ATi or whoever, when we enable a feature we expect to get that feature 100% of the time, not when they prescribe it to us. Can you imagine the kind of crap the manufacturers would try to pull on us if we didn't have such a vigilant community?
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
Originally posted by: Matthias99
Originally posted by: ZobarStyl
Cainam has stated (and posted) screenshots showing a clear and obvious line that would not be visible with any form of trilinear filtering.

In one game (Dark Age Of Camelot, hardly the most popular title around), in a few places. It appears to be a flaw in ATI's 'trylinear' algorithm; it probably needs to be applying full trilinear filtering on certain textures and it isn't for some reason.

Dark Age of Camelot is the #4 game in its entire genre; it's not exactly a no-name title. The graphics engine is almost pure DX8.1, and it's moving to a DX9 engine in an expansion this December.

MMORPGs are also one of the most graphically demanding genres: you can have so much geometry and texturing on screen that even a 5950 or 9800XT can be dragged below 5 fps.
 

Cawchy87

Diamond Member
Mar 8, 2004
5,104
2
81
I have a 9600XT vid card. Will I see any performance benefits from doing this, either in picture quality or an FPS increase?
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Matthias, as arbitrary as benches are, if there isn't some form of even level between two pieces of hardware, then the point of the comparison seems moot. ATi even agrees, since they went out of their way to inform reviewers how to disable nV's optimizations. Like I said to rb, I just want what truly is the best deal, so I want to see the most accurate representation of the true power of the hardware, not just what either company feels is good enough for me. To be honest, the only other thing stopping me from getting an X800-line card is the ugly alligator thing on the HS :laugh:
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Acanthus
Originally posted by: Matthias99
Originally posted by: ZobarStyl
Cainam has stated (and posted) screenshots showing a clear and obvious line that would not be visible with any form of trilinear filtering.

In one game (Dark Age Of Camelot, hardly the most popular title around), in a few places. It appears to be a flaw in ATI's 'trylinear' algorithm; it probably needs to be applying full trilinear filtering on certain textures and it isn't for some reason.

Dark Age of Camelot is the #4 game in its entire genre; it's not exactly a no-name title. The graphics engine is almost pure DX8.1, and it's moving to a DX9 engine in an expansion this December.

It's the number four game in a *fairly* small genre; it probably has, what, a few thousand active players? Big-name mainstream games sell hundreds of thousands of copies. I'm just saying that ATI cannot reasonably be expected to exhaustively test every single game on the market.

MMORPGs are also one of the most graphically demanding genres: you can have so much geometry and texturing on screen that even a 5950 or 9800XT can be dragged below 5 fps.

It also doesn't help that the first-generation MMORPGs (*cough* EverQuest *cough*) didn't have very well-written game engines. Most have improved significantly, though, and the latest crop are a lot better in terms of efficiency.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Unfortunately, until we see memory speeds increase as quickly as GPU speeds, we're probably only going to see more of these IQ "compromises." Think about it: raw fillrate went from 3296 Mpixels/s on the 9800XT to a whopping 6400 Mp/s on the 6800U to an astounding 8320 Mp/s on the X800XT -- an increase of roughly 150%. Memory, OTOH, went from 380MHz to 560MHz, an increase of just under 50%.
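(Quick sanity check on those ratios in Python, using the figures from the post:)

```python
fillrate = {"9800XT": 3296, "6800U": 6400, "X800XT": 8320}  # peak Mpixels/s
memclock = {"9800XT": 380, "X800XT": 560}                   # memory clock, MHz

fill_gain = fillrate["X800XT"] / fillrate["9800XT"] - 1  # ~1.52
mem_gain = memclock["X800XT"] / memclock["9800XT"] - 1   # ~0.47
print(f"fillrate up {fill_gain:.0%}, memory clock up just {mem_gain:.0%}")
```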

OTOH, I suppose an increase in shader usage may shift the bottleneck to shader ops/s rather than memory bandwidth, and both IHVs may then make full trilinear easily accessible (and possibly even the default). It probably won't happen, due to the benchmark wars, but it should become more of a possibility moving forward.

There are only two times I know of that tri is enabled: 1) with colored mipmaps, and 2) with a reg hack. So the point is, every single user out there with a normal X800 is getting nothing but bilinear, since colored mipmaps never occur in normal gaming. Is that right? Wouldn't that make "trylinear" "simply bilinear filtering"? Once again, please post an example if you would.
Matthias already answered this, but here are a boatload of pics. Please point me to one in-game instance where a screenshot shows a trylinear or even brilinear mipmap transition as obvious as with bilinear. Try and bri are not "nothing but bilinear"; they're more than that. There's no need to throw more misinformation into an already heated debate. The discussion is whether bri/try are good enough, not whether they're equal to bi.

Obviously there are corner cases where bri/try lead to obvious mipmap transitions, hence the outcry for a full-tri option. But we may just have to grudgingly accept compromise with this generation of cards, which happens to straddle the recent set of texture-heavy games and the upcoming set of shader-heavy ones. In the end, I'll bet you'd rather game at 16-bit if it gave you 50fps than at 32-bit if it offered only 40fps, and that likely figured at least partly into nV's and ATi's thinking. But the choice should ultimately be left to us, and I hope ATI got the message.