
Learn me....AA vs Anisotropic Filtering

Tweakin

Platinum Member
What is the difference between AA and Anisotropic filtering? I believe AA is the filter that corrects the jagged edges on a display. Additionally, are there any visual gains above 12x10 resolution? Thanks for the help.
 
AA creates softness and blending along the edges of polygons. AF takes away the blurriness of a limited-resolution texture (most textures are 128x128, 256x256, or 512x512) and makes it sharper.

4xAA is usually the best compromise between performance and quality; any higher and there's really no difference except in extreme cases.

16xAF is usually good because you can crank AF up much higher without too much of a performance penalty.
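The edge-smoothing half of this is easy to sketch in a few lines. The following is a toy model, not how any GPU actually rasterizes: supersampling-style AA takes several subpixel samples per pixel and averages their coverage, so a pixel cut by a polygon edge gets a blended value instead of an all-or-nothing one. The edge equation here is made up purely for illustration.

```python
# Toy model of supersampling-style AA: average several subpixel samples
# against a polygon edge instead of taking one sample per pixel. The edge
# (y < 0.3 + 0.5*x) is an arbitrary slanted line chosen for illustration.

def coverage(px, py, samples_per_axis):
    """Fraction of a pixel's samples that land inside the polygon."""
    n = samples_per_axis
    inside = 0
    for i in range(n):
        for j in range(n):
            sx = px + (i + 0.5) / n  # evenly spaced subpixel positions
            sy = py + (j + 0.5) / n
            if sy < 0.3 + 0.5 * sx:  # below the edge = inside the polygon
                inside += 1
    return inside / (n * n)

# A unit pixel at (0, 0) straddles the edge:
no_aa = coverage(0.0, 0.0, 1)  # 1 sample: pixel snaps fully on or off
aa_4x = coverage(0.0, 0.0, 2)  # 2x2 = 4 samples: partial, blended coverage
```

With one sample the pixel is either fully covered or not at all; with four it lands at an intermediate value the hardware can blend, which is exactly the softening along polygon edges described above.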
 
It's all about perception. I can see little details that some of my friends can't.

To me, on a 17" CRT monitor, 1280x1024 is the max at which I can see detail; anything higher and I can't tell the difference. On a 19" CRT I like to run games at 1600x1200 if possible.

It's generally better to run at a higher resolution than to turn on AA. (You are correct about AA.) But if you've got an older game that will only go so high, then AA definitely makes an improvement.
Dungeon Keeper 2 only runs stable at 800x600. But with 6X anti-aliasing even I have a hard time seeing pixels on a 19" display.

If you were to go all out and get a 22" monitor, then I would recommend moving back a little.

Also, if you have an ATI X600 or X800, then you can play the Ruby demo. On that you can tell a difference with AA. Even 2x AA moves it from a really good 3D computer demo to a movie-quality experience.
 
It can depend on the game; I know HL2 can be very aliased and definitely needs some smoothing, even at very high resolutions.
 
If you play games at 1280x1024 or 1600x1200 then you really don't need AA: at these high resolutions more pixels are being displayed, so AA becomes less important. But AF is necessary at any resolution, and 8x AF is usually enough.🙂
 
AA smooths out jagged edges on the screen. AF sharpens textures when you look at them at an angle.

Yes there are visual gains above that resolution.

The best combination of AA/AF is 4xAA/8xAF. If you have an Nvidia card 2xQAA/8xAF.
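The "sharpens textures at an angle" part has a neat numeric intuition. In this simplified model (real GPU heuristics are more involved), a pixel on a surface viewed at a steep angle covers a long, thin footprint in texture space. Trilinear filtering must pick a mip level blurry enough to cover the footprint's long axis; anisotropic filtering instead takes several taps along that axis and can stay at a much sharper mip level.

```python
import math

# Simplified model (real hardware heuristics are more involved) of why AF
# keeps angled textures sharp. A pixel footprint in texture space has a
# major and a minor axis; trilinear blurs enough to cover the major axis,
# while AF takes multiple taps along it at a sharper mip level.

def filtering_lods(major_axis, minor_axis, max_aniso=16):
    """Return (trilinear mip level, anisotropic mip level, AF tap count)
    for a pixel footprint measured in texels."""
    trilinear_lod = math.log2(max(major_axis, 1.0))        # blur to fit long axis
    taps = min(round(major_axis / minor_axis), max_aniso)  # degree of anisotropy
    aniso_lod = math.log2(max(major_axis / taps, 1.0))     # sharper mip level
    return trilinear_lod, aniso_lod, taps

# Floor texture seen at a steep angle: footprint 16 texels long, 1 wide.
tri_lod, af_lod, taps = filtering_lods(16.0, 1.0)
# Trilinear falls back to mip 4 (16x coarser); 16x AF can stay at mip 0.
```

This is also why the performance cost scales with the tap count, and why high AF levels only kick in on the surfaces that actually need them, so 16x AF is cheaper than it sounds.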

 
If you play games at 1280x1024 or 1600x1200 then you really don't need AA

On what, a 14" monitor? Even running 2048x1536 you still benefit from AA (although that setting is much crisper than 1600x1200), and you'll benefit from AF as well. On a typical consumer display with normal eyes, you would need to run a resolution around 8000x6000 to eliminate noticeable aliasing, although you certainly reach a point of diminishing returns well before that.
 
Thanks for all the input. I don't think I'll be able to do any of these settings until I get my new card. My 9600xt just barely plays HL2 at 12x10...forget any extras.
 
Originally posted by: BenSkywalker
If you play games at 1280x1024 or 1600x1200 then you really don't need AA

On what, a 14" monitor? Even running 2048x1536 you still benefit from AA (although that setting is much crisper than 1600x1200), and you'll benefit from AF as well. On a typical consumer display with normal eyes, you would need to run a resolution around 8000x6000 to eliminate noticeable aliasing, although you certainly reach a point of diminishing returns well before that.

Agreed.

1920x1200 on my LCD still looks pixelated with no AA. 2x AA looks MUCH better but is unplayable. 🙁
 
As you increase the resolution, AA diminishes in importance but AF increases, although you certainly still get benefits from AA too (I use 1920x1440 with 4xAA/8xAF).

The best combination of AA/AF is 4xAA/8xAF. If you have an Nvidia card 2xQAA/8xAF.
If you have any card faster than a 9700 Pro there's no reason to use anything lower than 16xAF.
 
I just turned on 16x AF and it didn't affect performance too badly. I never used AF before because I thought it would severely hurt performance. I enabled 16x AF through the driver, since most games don't have an option for setting AF.
 
I have a question. What happens if you enable AA and AF through the video driver and the game? Say it's doom3 and you have x4AA enabled in the game, but you have x2 enabled in the driver. Does the game override the driver or is it vice versa?
 
Originally posted by: g33k
I have a question. What happens if you enable AA and AF through the video driver and the game? Say it's doom3 and you have x4AA enabled in the game, but you have x2 enabled in the driver. Does the game override the driver or is it vice versa?


Drivers override the game... unless of course you have the drivers set to application controlled.
 
Originally posted by: otispunkmeyer
Originally posted by: g33k
I have a question. What happens if you enable AA and AF through the video driver and the game? Say it's doom3 and you have x4AA enabled in the game, but you have x2 enabled in the driver. Does the game override the driver or is it vice versa?


Drivers override the game... unless of course you have the drivers set to application controlled.

In which case nothing would be overriding anything because the drivers are set to use whatever the application wants. 😉
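The precedence the last two posts describe boils down to one tiny function. This is just a sketch of the behavior as described in this thread, not actual driver code:

```python
# Toy model of driver-vs-game AA precedence: a forced driver setting wins,
# while "application controlled" defers to whatever the game asks for.

APP_CONTROLLED = "application controlled"

def effective_aa(driver_setting, game_setting):
    """AA level that actually gets applied."""
    if driver_setting == APP_CONTROLLED:
        return game_setting   # driver defers to the game's request
    return driver_setting     # forced driver setting overrides the game

# Doom 3 asking for 4x while the driver forces 2x -> you get 2x.
driver_forced = effective_aa(2, 4)
# Driver on application controlled -> the game's 4x is used.
app_choice = effective_aa(APP_CONTROLLED, 4)
```

So in the Doom 3 example, with the driver forced to 2x you get 2x; only on "application controlled" does the in-game 4x setting take effect.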
 
If you have any card faster than a 9700 Pro there's no reason to use anything lower than 16xAF.
16x is just a bit better than 8x and not worth it. I would rather have the performance increase IMO.
 
I just tried 4xAA and 8xAF @ 12x10 in HL2 and my poor 9600 just can't cut it. The graphics improved and it's a really neat world to be in... I just need a faster video card if I want to play at this res with those filters...

I guess it's time I just buy an X800XL....
 