trilinear filtering on radeon 8500 ??

boran

Golden Member
Jun 17, 2001
1,526
0
76
Will it be possible for future drivers to add trilinear filtering and trilinear anisotropic filtering to the Radeon 8500? This is my main reason right now NOT to go with a Radeon. I really prefer trilinear filtering over bilinear, and I'd like to know if there's any way it will ever get enabled, or if there's already a way to hack it into the Radeon. Thanks in advance.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
Well, on my 7500, I used anisotropic in OpenGL through the control panel, no hacks needed.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
It depends entirely on the hardware. If there is hardware support for the feature and it's just not enabled for some reason, then yes. Otherwise the chances are very slim, since proper trilinear filtering is nearly impossible to do without hardware support.

There is a way of faking it with n-sample supersampling FSAA. You can't get true trilinear, but by shifting LOD bias between subsamples, you can split the ugly bilinear mipmap border into n borders. This way you'll get pseudo-trilinear without a performance hit if you're using FSAA anyway. The blend will show banding, but it's not that noticeable on low contrast textures. Some guy at Beyond3D forums came up with this idea for Voodoo5, and to my recollection 3dfx even gave him a prototype V5-6000 for it.
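A rough sketch of the trick in C (my reconstruction of the idea, not 3dfx's code; bilinear_at_bias() is a hypothetical stand-in for a bilinear fetch at a given LOD bias):

    /* Stand-in fetch so the sketch is self-contained; a real renderer
       would sample the texture here. */
    static float bilinear_at_bias(float u, float v, float lod_bias)
    {
        return u + v + lod_bias; /* placeholder value, not a real texel */
    }

    /* With n FSAA subsamples per pixel, each subsample still does plain
       bilinear but gets a slightly different LOD bias; averaging the
       subsamples smears the single mip border into n fainter ones. */
    float pseudo_trilinear(float u, float v, int n_subsamples)
    {
        float sum = 0.0f;
        for (int i = 0; i < n_subsamples; i++) {
            /* Spread the biases across one mip level's worth of LOD. */
            float bias = (float)i / (float)n_subsamples;
            sum += bilinear_at_bias(u, v, bias);
        }
        return sum / (float)n_subsamples;
    }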

I'm in the exact same situation as you: considering the recent driver improvements and very competitive pricing, I would've already bought the R8500 if true trilinear and per-pixel mip mapping were there. The lack of proper depth-based per-pixel mip mapping is especially strange. For crying out loud, it has been a standard feature of hardware accelerators ever since the days of the Voodoo1.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
According to ATI there is NO NEED for trilinear filtering with their implementation of dynamic anisotropic filtering and SmoothVision (bilinear, 16-tap, 32-tap, 64-tap aniso depending on image content and the angle of 3D objects).

They say additional trilinear filtering would bring no benefit and would only burn more GPU calculating power, since the anisotropic/bilinear filtering already does all that.

From what I know (and from the reviews I've read, including lots of screenshots comparing GF3 Ti 500 and Radeon filtering), the Radeon has the better, crisper picture with anisotropic enabled, at a fraction of the FPS drop the GF3 takes when you enable aniso on it.

I don't understand the decision to say 'I won't go with card XYZ because it doesn't have trilinear'. This subject is a bit more complex than 'trilinear yes or no', and in my opinion ATI did the absolutely right thing!

Why should they implement a trilinear mode that is totally unnecessary, costs GPU power and gains NOTHING? For your decision on which card to get, I'd recommend reading the review of the Radeon with the new drivers at Tom's Hardware (same article at Rivastation). Look at the 'anisotropic filtering' section and THEN tell me which card has the crisper, better texture resolution: it's the Radeon, without a doubt. And if you cranked up the GF3 to look the same, it would lose almost 60% of its FPS; the Radeon only maybe 3%. (It's a no-brainer IMHO.)


 

Tripleshot

Elite Member
Jan 29, 2000
7,218
1
0
Flexy,
You seem to vacillate between the Ti500 and Radeon 8500 daily. You must have deep pockets.;)

Edit.....

Dang it, I hit the wrong key on this flippen keyboard again. This keyboard goes in the trash after I finish this.

Here is a link to THG on the test he did on the Radeon, which explains exactly my support for the ATI Radeon 8500. It just flat-out outperforms the nVidia card and looks better, which is why you buy a quality video card in the first place. If FPS is what drives you, and the video is blurred, what good is that bragging right?

NADA!

"A very impressive aspect is the Radeon's performance when using anisotropic filtering. The R8500 simply leaves the competition by Nvidia in the dust. "


RADEON 8500 - Driven To New Heights
 
Jun 18, 2000
11,211
775
126


<< "A very impressive aspect is the Radeon's performance when using anisotropic filtering. The R8500 simply leaves the competition by Nvidia in the dust. " >>


Well, duh. That's because the 8500 disables trilinear when anisotropic filtering is on. Some of you are saying you don't need trilinear when anisotropic filtering is enabled; I'm not sure I agree with that. From what I hear, aniso filtering on the Radeon leads to more texture aliasing. This can't be seen in pics. All pics show are the cleaner textures. Of course, turning on FSAA will clear up any texture aliasing, albeit with a huge performance hit.

I haven't seen the 8500's image quality first hand, but that is a common complaint I've heard. Until I see it with my own eyes, I'll reserve judgment. Don't buy into the PR hype. Of course ATi will say you don't need trilinear+anisotropic. That's because they don't have it.

BTW, I believe this is a hardware issue, and can't be enabled with driver revisions.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
<< According to ATI there is NO NEED for trilinear filtering with their implementation of dynamic anisotropic filtering and SmoothVision (bilinear, 16-tap, 32-tap, 64-tap aniso depending on image content and the angle of 3D objects).

They say additional trilinear filtering would bring no benefit and would only burn more GPU calculating power, since the anisotropic/bilinear filtering already does all that. >>


Of course they'll downplay a feature their hardware doesn't support. This isn't the first time features have been downplayed for marketing reasons; does anyone remember 3dfx's statements about hardware T&L, AGP texturing and 32-bit color? I'm guessing ATI's reason for dropping trilinear has to do with the fact that it requires non-uniform memory access (interleaved access between two mip map levels during the blending operation), and this would've hindered the performance of the Radeon's rather traditional memory controller or texture caching subsystem. Since bilinear-anisotropic only takes samples within the same texture and the same mip map level, access can be cached much more easily.

Anisotropic and trilinear filtering are completely separate and complementary filtering techniques. Anisotropic addresses the issue of blurriness in high-angled polygon textures caused by mipmapped bilinear or trilinear filtering - or texture aliasing caused by non-mipmapped bilinear filtering - by using more texture samples to produce the output pixel. Trilinear filtering addresses the issue of ugly borders in mipmapped bilinear filtering by depth-selectively blending two distinct mipmap levels into the output pixel. The best image quality is produced by using both.
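To make the relationship concrete, here's a rough C sketch of how the two compose: each anisotropic probe can itself be a full trilinear sample. (This is only an illustration with a stand-in texture fetch, not how any real chip is wired.)

    typedef struct { float r, g, b; } Color;

    /* Stand-in bilinear fetch (procedural checkerboard) so the sketch is
       self-contained; real hardware reads a 2x2 texel block from mip
       level 'mip' here. */
    static Color bilinear_fetch(int mip, float u, float v)
    {
        float c = (float)((((int)(u * 8.0f)) + ((int)(v * 8.0f)) + mip) & 1);
        Color out = { c, c, c };
        return out;
    }

    /* Trilinear = blend of two bilinear fetches from adjacent mips. */
    static Color trilinear_sample(float u, float v, float lod)
    {
        int   mip = (int)lod;
        float f   = lod - (float)mip;
        Color a = bilinear_fetch(mip, u, v);
        Color b = bilinear_fetch(mip + 1, u, v);
        Color out = { a.r + (b.r - a.r) * f,
                      a.g + (b.g - a.g) * f,
                      a.b + (b.b - a.b) * f };
        return out;
    }

    /* Anisotropic = several probes spread along the axis of anisotropy
       (du, dv); each probe here is itself a full trilinear sample. */
    static Color aniso_trilinear(float u, float v, float lod,
                                 float du, float dv, int taps)
    {
        Color sum = { 0.0f, 0.0f, 0.0f };
        for (int i = 0; i < taps; i++) {
            float t = ((float)i + 0.5f) / (float)taps - 0.5f;
            Color c = trilinear_sample(u + t * du, v + t * dv, lod);
            sum.r += c.r; sum.g += c.g; sum.b += c.b;
        }
        sum.r /= (float)taps; sum.g /= (float)taps; sum.b /= (float)taps;
        return sum;
    }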

Using just anisotropic CANNOT eliminate the ugly mip map borders of bilinear filtering. The idea behind anisotropic is that when you have a lot more samples at your disposal than in bilinear/trilinear filtering, you can decrease texture LOD up to a certain point - determined by maximum supported anisotropy degree or number of "taps" - without causing texture aliasing.

If enabling anisotropic on the Radeon 8500 indeed causes texture aliasing, it means ATI is adjusting the LOD too aggressively. If that's the case, no reviewer should post image quality comparisons without at least mentioning the texture aliasing issue.
 

Tripleshot

Elite Member
Jan 29, 2000
7,218
1
0
<< Until I see it with my own eyes, I'll reserve judgment. >>

I have seen it with my own eyes. I have made the same comparisons of visual quality that Anand and Tom Pabst have. The eyes do not deceive. When you can get beyond your obvious bias, you may be able to see it too, and make the right choice.

The hands-down winner is the ATI Radeon 8500.

The people who review this for a living have declared it so. Get over it.:)
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106


<< Flexy,
You seem to vacillate between the Ti500 and Radeon 8500 daily. You must have deep pockets.;)
>>




lol ;)

Actually, your post here a couple of weeks ago about what you'd seen of the Radeon at Comdex was what got me rethinking my GF3 Ti 500 purchase.

I cancelled my GF3 Ti 500 order in the meantime (it had been on backorder for over 2 weeks) and now a retail Radeon 8500 is on its way...

No... I don't have deep pockets, especially now that I also have to get Christmas presents in addition to building my new system... but the Radeon saved me $35, which is great too ;)


 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106

triple,

BTW, the demos you saw at Comdex... were they the ones (Nature, Dolphins, Rachel, etc.) that are now available for download from ATI's website?

 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106

<< Of course they'll downplay a feature their hardware doesn't support >>

lol... look at the HARDWARE specs of the Radeon: not only is it clocked faster, it supports DX8.1 in hardware, pushes more polygons per second, has faster RAM, etc.

So, do you think (looking at the features and specs) that they 'couldn't' implement trilinear, so they have to downplay the fact that they don't have it?

I don't think so. I'd rather think it's a decision about which features make sense and which only add unnecessary overhead, not an issue of 'not being able to' implement trilinear ;)



 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
jpprod, I'm glad I've found somebody else who agrees with me about the whole bilinear/anisotropic issue on the Radeon. After the hammering nVidia got for their crappy DXT1 implementation, we now see a similar situation with the Radeon's anisotropic filtering, and the whole thing is being played down.

I'm still amused when I see ATi zealots claim that the Radeon has better IQ than the GF and when they claim that the Radeon's bilinear + anisotropic looks better than the GF's trilinear + anisotropic. Nope, not even close.

It's also amusing to watch them claim that the Radeon "demolishes" the GF in terms of performance when running anisotropic filtering even though it's not doing the same operations as the nVidia card is.
 

Tripleshot

Elite Member
Jan 29, 2000
7,218
1
0
<< BTW, the demos you saw at Comdex... were they the ones (Nature, Dolphins, Rachel, etc.) that are now available for download from ATI's website? >>

Yes, and I saw gameplay as well: Serious Sam, Half-Life Counter-Strike, and Quake III Arena.

You made a good choice. Tom Pabst agrees.;)


BFG10K

We find you amusing too. You get what you pay for. And by the way you talk, you'll probably get what you deserve, a crappy, expensive GF Ti500. You obviously do not have any idea about quality.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
<< I'm still amused when I see ATi zealots claim that the Radeon has better IQ than the GF and when they claim that the Radeon's bilinear + anisotropic looks better than the GF's trilinear + anisotropic. Nope, not even close. >>

lol... I think 'we' don't have to make claims. There's an invention called pictures, and enough websites where you can SEE comparisons of a Radeon with aniso filtering on. For the SAME FREAKIN' IQ a GF3 has to be cranked up to 64-tap, and the GF3 takes a huuuuuuge 60% performance drop.

The Radeon loses like 5 or 6 fps with the SAME IQ...

The other pictures also showed CLEARLY how blurred the textures were with the GF3, using its AA implementation and standard 16-tap, 32-tap aniso... it's a JOKE, sorry... nothing but blurry, washed-out textures...

I think the 'ATI zealots' don't have to CLAIM anything when there's already proof.

Some people just take a bit longer to realize the truth and make the more logical, right decision.





 
Jun 18, 2000
11,211
775
126
What the hell? My obvious bias? Get off yourself. I said I'd reserve judgment until I saw it with my own eyes. Spare me the childish 'bias' comments.

It would seem I'm at least open-minded enough to listen to both sides of the argument. Technically, bilinear+anisotropic is inferior to trilinear+aniso. Whether that works out in the real world has yet to be seen BY MY OWN EYES. Hell, I haven't even seen the GF3 in action yet (I've got a Kyro2). I'm not quite ready to crown a winner. Apparently my bias is clouding my vision, even though just last week I recommended that a coworker pick up an 8500 because it hits an amazing price/performance point.


<< The people who review this for a living have declared it so. Get over it. >>


Are you some sort of lemming? I don't give a flip if a million reviewers all say the same thing.

"A million voices lying doesn't add up to the truth." - The Pharcyde
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
<< We find you amusing too. You get what you pay for. And by the way you talk, you'll probably get what you deserve, a crappy, expensive GF Ti500. You obviously do not have any idea about quality. >>

<rolleyes>

<< lol... I think 'we' don't have to make claims. There's an invention called pictures, and enough websites where you can SEE comparisons of a Radeon with aniso filtering on. For the SAME FREAKIN' IQ a GF3 has to be cranked up to 64-tap, and the GF3 takes a huuuuuuge 60% performance drop. >>

That's because the 64-tap setting on the GF3 is the same thing as ATi's 16-'tap' setting; they're just using different naming schemes.

It's all marketing bullsh*t on ATi's part. If nVidia called theirs 1-tap, would it be better?

<< The Radeon loses like 5 or 6 fps with the SAME IQ... >>

That's because the Radeon is using bilinear filtering while the GF3 is not. But of course you knew that, right?

<< ...nothing but blurry, washed-out textures... >>

And of course the person taking the shots remembered to adjust nVidia's LOD values to match ATi's defaults and remembered to use Digital Vibrance? I'm sure you checked this, with all your 'informed' decisions and all.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
In case anyone missed it, this thread proves that ATi are using a Quack trick in 3DMark to improve performance at the expense of performance.

From the article:
==========================================
The Radeon 8500, in general, has poorer image quality than the VisionTek GeForce3 in 3DMark2001. This conclusion may or may not be transferable to other applications, especially OpenGL applications. However, we can all say with 100% certainty that the VisionTek has the better image quality in all four of 3DMark2001's image quality tests.

Moreover, I do not see any evidence to support the common myth that NVIDIA enhanced the performance scores in 3DMark2001 by degrading the image quality, by making the images "blurrier". On the contrary, not only did NVIDIA keep the level of detail the same but they also fixed graphical anomalies. In two of the four image quality tests, the deviation from the reference image actually went down. This means that the image quality actually improved in two of the four tests.
==========================================

And guess what? The image quality degraded on the Radeon when using newer drivers. From the article:

==========================================
ATi, on the other hand, went a different route. The deviation from the reference images went up with the latest drivers, while performance improved at the same time. There is a degradation in image quality in all four of the image quality tests. There may be a correlation between the performance gains of ATi's latest drivers and the degradation in image quality.
==========================================
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
The discussion about the allegedly worse picture quality with the new ATI drivers (compared to the GF3 or the older ATI drivers) is so old it already has a beard...

If it just goes over people's heads that it's because of different default D3D settings in the new DRIVERS, and they STILL continue to shout 'worse IQ'... well, it's not my problem, since I know it's a driver issue and not a general IQ issue. (See the old ATI drivers: there it's crisp again.)

But this is a totally different subject, and since I've already written about it in at least 2 different forums (DRIVER ISSUE... NOT IQ ISSUE), I don't want to spend more time on that.

The Radeon OF COURSE is faster using 16-tap aniso. And WHY? Because it uses a dynamic aniso filtering mechanism, unlike the GF3, and according to ATI trilinear is not needed.

Yes, I 'knew' that ATI uses bilinear instead of trilinear, but this is totally unimportant to me, since it's the 'better' dynamic aniso implementation (in combination with bilinear) that makes the ATI faster than the GF3.

Sorry, the comparison pictures I have seen to date LOOK better on the ATI, no matter if bilinear, trilinear, 16-tap, 32-tap or 64-tap. I think it's just smarter of ATI to implement it that way than to waste GPU power on, e.g., image parts where no aniso is needed at all. It's just the BETTER way to do it!


 
Jun 18, 2000
11,211
775
126
BFG10K, to be fair, those tests prove nothing. They just show where the video card renders pixels differently than the reference image. Who is to say the reference image is perfect, or in fact the better image? For 3DMark, the reference may be the better-looking image. However, if ATi's drivers use a more aggressive LOD for mipmaps compared to the reference renderer, it will show up differently in the XOR image. I did see a few anomalies in the 8500 images, and there I agree wholeheartedly. However, that may not be the case in all games/benchmarks.

There was a great discussion on this over at Beyond3D's forums.
 

Agent004

Senior member
Mar 22, 2001
492
0
0
BFG10K, obviously you haven't used nVidia cards for a while, or you don't have a good memory. Ever used the Detonator 6, 7, 10 series? There are massive numbers of people saying the same thing: version xxx has the correct LOD/IQ and version xxx is faster. A large number of forum topics exist just to test the differences between driver versions.

So your argument that nVidia has not dropped image quality for speed is invalid. You already know about the DXT1 (or whatever they call it) that nVidia implemented in Quake3 for a significant speed increase and a very ugly image. Need I say more? ;) (I'm referring to the GeForce series in general.)

Why do you think users complain about nVidia cards having poor image quality? Because they want to? Or do the cards really have poor image quality?

I just don't see why someone would go so far to dismiss ATI for doing the same thing nVidia got applauded for. It's really OK for nVidia to do it, because it's everyone's favourite, but not ATI.

As you rightly said



<< I'm sure you checked this, with all your "informed" decisions and all. >>




 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
BFG won't care about previous drivers, because he's in nVidia's camp, which means his response will be on the order of: oh well, it doesn't matter because they fixed it :)

anywho..

jpprod, I read the FAQ on bilinear and trilinear filtering, and I noticed that the largest problems they cause are simply the blurring of distant textures (like the ground, not a wall) to reduce aliasing. My question is: how would those things look WITHOUT that filtering effect on far-off pixels? I mean, in most 3D games I doubt you'd think things looked worse at all, except in flight sims (and even then, flight sims tend to have really bad aliasing anyway, though I don't know if I've ever tried anisotropic filtering in a flight sim).

Could you run anisotropic WITHOUT bilinear and trilinear? You said anisotropic filtering increases aliasing; how is that so? I mean, it can't pull more detail out of a texture than what exists, so I'm assuming we generally don't see all of a texture's detail anyway (to prevent aliasing effects).

Generally I look upon FSAA as the solution to aliasing in textures, though now with SmoothVision the effectiveness of FSAA on textures might not be enough (considering how badly trilinear and bilinear filtering blur things).

BTW, I've looked at A LOT of the screenshots (some from the thread BFG10K pointed out above), and I have to say that in general, in 3DMark2001 with the 7191 drivers, the Radeon looked better than even the reference images (due to less blurring), especially in the Dragothic scene. Of course, this is IMHO, and it was only a picture, so aliasing effects can't really be seen. Also, due to less aggressive blurring (I don't know exactly what causes it, perhaps less aggressive mipmap settings?), I couldn't really see any of those mipmap borders (probably partly due to no motion, but generally I can spot a mipmap border in a picture pretty well).

<< There may be a correlation between the performance gains of ATi's latest drivers and the degradation in image quality. >>

Granted, the 7206 drivers make 3DMark2001 look blurrier than even the GF3 does (again, from the screenshots), and no doubt that's the reason it gained performance in everything but the Nature scene (that one I attribute to the fact that they mention their pixel shader performance was increased). Does that mean you're forced to use those drivers? Hell, nVidia users aren't always using the latest ones, because they get problems with them, which are often fixed in later updates.

<< In case anyone missed it, this thread proves that ATi are using a Quack trick in 3DMark to improve performance at the expense of performance. >> I think you mean performance at the expense of quality :)

Umm, really, can you tell me where in that thread someone actually went into a hex editor and found 3DMark mentioned in the drivers? Perhaps you should think again about the blanket statement you just made, and think about other Direct3D games. Have you seen any image quality comparisons in them (comparing the 7191s to the 7206s)? I haven't (I think), so I can't say you're wrong in saying ATi changed filtering and LOD settings specifically in 3DMark2001, but I'd have to say that what you're saying isn't proven.

What that thread DOES prove is that we all wish ATi and nVidia made all of those options that are available in tweakers available in their built-in control panels (ATi more so, because you can't even change/enable anisotropic filtering in DirectX, even with the two tweakers I've tried).

Interesting to note: on my Radeon LE in Counter-Strike (after disabling TruForm due to nearly unacceptable framerates), with trilinear filtering you see a blurred mipmap border and A LOT of blurring, but with anisotropic (and therefore bilinear as well) on (I think at Very High) you see NO mipmap borders. I could take pictures and post them if you like (as soon as I figure out how to take screenshots in Half-Life). Bilinear filtering by itself, I think, left me with the distinct mipmap borders (though I can't remember for sure).

Also, with Star Trek Elite Force, with anisotropic filtering on, no matter what driver setting (High or Very High), I still see the mipmap borders, but with it disabled I don't see any mipmap borders and things look crystal clear (i.e., not nearly as bad as with anisotropic disabled in Half-Life).

BTW, I only enabled anisotropic in the drivers for Half-Life (I don't know if it has its own setting for that); however, to enable trilinear (with anisotropic disabled in the drivers) I had to use the command in the console, whereas with Star Trek I think I had to enable it both in the drivers and in the game in order to see anything different. Also, I think Star Trek defaults to trilinear when anisotropic is disabled (otherwise I would have still had bad problems with blurring, I think).

I could try posting some pics, though you'll have to give me time (I don't play these games much due to a 56k net connection :-( but play StarCraft instead :).

On the issue of why ATi didn't allow trilinear and anisotropic to be enabled together, I don't know why, though I think I can safely assume it was a conscious decision on their part, because they knew the original Radeon didn't have it, and some people (people like the Reverend) found that out. So either they thought the people buying these cards wouldn't notice again (which would be a safe assumption, considering most people didn't know about this at all until recently, including me), or they truly thought they could drop it and still maintain quality (which they may or may not have accomplished, but IMHO the Radeon's 7191 drivers certainly look VERY nice by default compared to the GF3 by default, and I don't see mipmap borders in Half-Life with aniso enabled, and to get rid of the borders in Star Trek I just disable aniso).
 

boran

Golden Member
Jun 17, 2001
1,526
0
76
Okay, no Radeon for me... I need trilinear. No matter what you say, trilinear+anisotropic will always be better than bilinear+anisotropic, because even with anisotropic bilinear filtering there will be banding. You won't see it in any screenshots, but when moving you'll find that the textures 'move along with you', something that doesn't happen with trilinear...

You can call it smart of ATI; I call it stupid. Trilinear filtering on a GeForce comes with almost no performance drop: going from bilinear to trilinear anisotropic (16-tap) on my GF2 MX I lost 5 fps in Quake 3 at 800x600.

And you claiming ATI only needs 16-tap: that's because their 'tap' refers to the number of bilinear-filtered blocks they use (one bilinear-filtered block is 2 by 2 texels), so they also use 64 texels to build up their 16-tap, while nVidia's 'tap' refers to the number of texels used, so 64-tap uses 64 texels. The quality being the same is therefore no surprise. But ATI uses bilinear anisotropy while nVidia uses trilinear anisotropy... someone should compare bilinear anisotropy on the nVidia cards to ATI's anisotropy...
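Spelling out the counting in code (just my own back-of-the-envelope illustration of the two naming schemes, nothing official):

    /* ATI counts filter operations ("taps"); nVidia counts texels read.
       One bilinear tap reads a 2x2 block of 4 texels, so both schemes
       describe the same amount of sampling. */
    int texels_read(int bilinear_taps)
    {
        const int texels_per_tap = 4;   /* one 2x2 bilinear block */
        return bilinear_taps * texels_per_tap;
    }
    /* texels_read(16) == 64: ATI's "16-tap" equals nVidia's "64-tap". */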

Anyway, I will get myself a Ti200.
(By the way, should I go Elsa or Hercules?)


 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
<< So, do you think (looking at the features and specs) that they 'couldn't' implement trilinear, so they have to downplay the fact that they don't have it? >>

No and yes. They certainly could've implemented it; it's trivial compared to the other features the Radeon 8500 has. The pixel value in trilinear is as easy as Pixel = TextureMip1*(1-(Z/Mipmaprange)) + TextureMip2*(Z/Mipmaprange). The basic multiplier/adder hardware is certainly there already. But for some reason ATI chose not to implement trilinear. The reason might've been the required non-uniform memory access, which is a non-issue for the GeForce3 since it has four separate memory controllers.
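In C, that formula is literally just the following (a sketch of the math only, not of any actual hardware datapath; the names mirror the formula above):

    /* Linear blend between two already-bilinear-filtered mip samples,
       weighted by where the pixel's depth falls inside the mip range. */
    float trilinear_pixel(float texture_mip1, float texture_mip2,
                          float z, float mipmap_range)
    {
        float w = z / mipmap_range;            /* 0 at mip1, 1 at mip2 */
        return texture_mip1 * (1.0f - w) + texture_mip2 * w;
    }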

<< The Radeon OF COURSE is faster using 16-tap aniso. And WHY? Because it uses a dynamic aniso filtering mechanism, unlike the GF3, and according to ATI trilinear is not needed.

Yes, I 'knew' that ATI uses bilinear instead of trilinear, but this is totally unimportant to me, since it's the 'better' dynamic aniso implementation (in combination with bilinear) that makes the ATI faster than the GF3. >>


Dynamic anisotropic is (please correct me if I'm wrong) only a way to take fewer than the 'tap' number of texels when they're not needed. It greatly improves performance, but does not affect image quality (except that you can afford a higher degree of anisotropy for a given performance hit).

You're missing the point here: anisotropic and trilinear filtering are totally separate and complementary filtering techniques. They address completely different flaws in texture filtering. One cannot replace the other.


<< jpprod, I read the FAQ on bilinear and trilinear filtering, and I noticed that the largest problems they cause are simply the blurring of distant textures (like the ground, not a wall) to reduce aliasing. My question is: how would those things look WITHOUT that filtering effect on far-off pixels? I mean, in most 3D games I doubt you'd think things looked worse at all, except in flight sims (and even then, flight sims tend to have really bad aliasing anyway, though I don't know if I've ever tried anisotropic filtering in a flight sim). >>

Bilinear without mipmaps would look nice in screenshots, but far off you'd see horrible texture aliasing. (There's no such thing as trilinear without mipmaps, BTW; trilinear is just bilinear mipmapping combined with linear mipmap filtering.) Mipmaps address the problem of aliasing, but introduce ugly texture level-of-detail borders. Trilinear addresses the issue of borders, but causes even a bit more texture distance blurring than bilinear with mipmaps. High-tap anisotropic with trilinear is the combination to have: no borders, no blurring, no aliasing.
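For reference, here's roughly how a renderer picks a mip level; a simplified sketch of the standard math (real hardware approximates the derivatives per 2x2 pixel quad):

    #include <math.h>

    /* LOD is the log2 of the pixel's footprint in texel space: distant,
       minified texels land on smaller, pre-filtered mip images instead
       of aliasing. Inputs are the texel-space derivatives of the pixel. */
    float mip_lod(float dudx, float dvdx, float dudy, float dvdy)
    {
        float fx = sqrtf(dudx * dudx + dvdx * dvdx);   /* footprint in x */
        float fy = sqrtf(dudy * dudy + dvdy * dvdy);   /* footprint in y */
        float footprint = fx > fy ? fx : fy;
        return log2f(footprint);   /* 0 = base level; higher = blurrier */
    }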

<< Could you run anisotropic WITHOUT bilinear and trilinear? >>

Of course. With an infinite degree of anisotropy, you wouldn't need mipmaps at all. :) As soon as the distant texture begins to have more bilinear samples on a pixel's area than the maximum supported anisotropy degree, aliasing occurs.

<< You said anisotropic filtering increases aliasing; how is that so? >>

Actually it's the other way around. Anisotropy enables better pixel accuracy; you could think of it as texture antialiasing. When you have anisotropic at your disposal, the LOD can be adjusted to higher detail without aliasing occurring.
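As a sketch (this follows the general shape of the LOD formula in the OpenGL anisotropic filtering extension; the function and names are mine):

    #include <math.h>

    /* With N probes along the major axis of the pixel footprint, the LOD
       only has to cover the minor axis, so it can sit up to log2(N)
       levels sharper. Once the footprint ratio exceeds the hardware's
       maximum anisotropy, sharpness has to be traded for aliasing. */
    float aniso_lod(float p_major, float p_minor, float max_aniso)
    {
        float n = ceilf(p_major / p_minor);   /* desired number of probes */
        if (n > max_aniso) n = max_aniso;     /* hardware "tap" limit */
        return log2f(p_major / n);            /* vs log2(p_major) w/o aniso */
    }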

Someone mentioned that enabling anisotropic on Radeon 8500 introduces aliasing. If this is true, ATI is adjusting LOD too aggressively, and is therefore effectively cheating: it's impossible to pick out aliasing artifacts from a screenshot.

<< BTW, I've looked at A LOT of the screenshots (some from the thread BFG10K pointed out above), and I have to say that in general, in 3DMark2001 with the 7191 drivers, the Radeon looked better than even the reference images (due to less blurring), especially in the Dragothic scene. >>

The point of the aforementioned image quality tests is to see whether a card is rendering correctly. With default settings, as in no special driver LOD tweaks applied, a card should in my opinion produce an image that is as close to the reference image as possible. That's what content developers have to rely on.

<< I couldn't really see any of those mipmap borders (probably partly due to no motion, but generally I can spot a mipmap border in a picture pretty well). >>

It's pretty hard to pick them out from a screenshot, especially with aggressive LOD settings. In motion, however, they're very noticeable.
 

Rellik

Senior member
Apr 24, 2000
759
0
0
To provide some more info, let me share my findings on the topic with you.

I have just received my R8500 and have tried it for the past 2 days. It is still too early for a detailed analysis, but here are the main points I noticed:

Prior to this card I owned a Radeon 64DDR VIVO. That card gave me a texture quality I had never witnessed before. I could compare it to the GeForce DDR and the GF2 GTS, and both cards looked worse than the ATI card. I have to say this: if you only use the drivers, you cannot get the full potential out of the card. I used RadeonTweaker to a great extent and found it essential for unleashing the card's full potential. If you enabled functions in the tweaker, you saw them in Q3TA immediately. Even when I activated trilinear it seemed to be ignored, since I had also activated anisotropic filtering (64-tap according to the 'nVidia way' of counting, which, BTW, is wrong: the industry standard is the way ATI counts, where you state how many taps, i.e. the degree of filtering, are done, not the result of a 64-texel read). (If I am confusing pixel and texel, please correct me, jukka.)

Now here is the beef: the Radeon 8500 is not equal to the old Radeon. Q3TA looks worse. I had trouble with the whole settings business: since no RadeonTweaker is available, I had to use the Rage3D tweaker, which is horrible. It offers many settings, but 'forgets' them and is in general extremely inefficient to use. I fixed the S3TC detail textures in UT (applied the atioglxx.dll from the 3276 driver), but all OpenGL-based games just look blurrier than before. LOD is at 8 like before, and the same settings are used (as far as I could verify). My test is the first CTF level in TA: you start on top near your flag, where the floor is some kind of metal floorplates. There you can easily see banding and/or whether AF is used. With AF I can get that floor to look good, but the floor below looks blurred. In close-up it is really bad. So are the walls. The sky issue is widely known, and I for one don't get why ATI 'broke' that again. They had a really good OpenGL driver for the Radeon, but the R200 needs some work. I now know how nVidia users felt last year. I have not had the chance to see a GF3 in action, so don't bother stating that it is better.

What I don't get is why the Ti series should provide better IQ. It is the SAME AS THE GF3, which was never declared superior to the Radeon. But at the moment the 8500 looks worse than the old Radeon (R6 chip). The 2D, however, is very good, the same as the old card's if not better. Only Matrox is better. I have not seen one nVidia card offering the same IQ in 2D (and I have seen the Elsa GF1, which everyone claims to be the best).

Sorry for the long post, needed to get it out there :)

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
<< If it just goes over people's heads that it's because of different default D3D settings in the new DRIVERS, and they STILL continue to shout 'worse IQ'... >>

So why did ATi lower the default image quality, if for no reason other than to inflate performance?

Or is it so that people buy the card based on default performance, then end up with lower real-world performance once they find the image quality too low and have to raise it?

<< Yes, I 'knew' that ATI uses bilinear instead of trilinear, but this is totally unimportant to me, since it's the 'better' dynamic aniso implementation (in combination with bilinear) that makes the ATI faster than the GF3. >>

No matter how good ATi's anisotropic filtering is, it still can't remove the ugly bilinear mipmapping lines. You still don't seem to understand that trilinear filtering and anisotropic filtering complement each other; they don't replace each other.

<< They just show where the video card renders pixels differently than the reference image. >>

Exactly. That means in terms of rendering accuracy nVidia is better than ATi.

And is it a coincidence that ATi's 3DMark score increased at the same time their image quality degraded? I don't think it is. The very thing nVidia have been accused of for so long turns out to be exactly what ATi is doing.

<< BFG10K, obviously you haven't used nVidia cards for a while, or you don't have a good memory. Ever used the Detonator 6, 7, 10 series? >>

Of course I did. In fact the 6xx & 7xx drivers improved on the image quality of the 5xx drivers by fixing bugs and allowing DXT3 in OpenGL.

<< You already know about the DXT1 (or whatever they call it) that nVidia implemented in Quake3 for a significant speed increase and a very ugly image. Need I say more? ;) >>

Agreed. 16-bit DXT1 sucks big time and it's completely unusable. But I doubt they did it to increase speed in Quake3, since the original GeForce was designed well before Quake3 was released.

ATi users also have an advantage when running UT with the high-res textures, because they don't get the ugly banding around the coronas that nVidia cards have.

<< I just don't see why someone would go so far to dismiss ATI for doing the same thing nVidia got applauded for. It's really OK for nVidia to do it, because it's everyone's favourite, but not ATI. >>

nVidia have never pulled the same blatant Quack stunts that ATi have. In fact there's no evidence anywhere that suggests nVidia degrades image quality in favour of performance, except maybe in the case of DXT1.