AA quality

flexy

Diamond Member
Sep 28, 2001

I know this has been discussed to death.

My situation:

I was with Nvidia in the early years, starting with a Riva 128, then a TNT and TNT2, which I thought were awesome cards.

Then I switched over to ATI and had an 8500, a 9800 XT, and an X850 XT, with the 9800 XT probably the best card of them all in its time.

Now I am back with Nvidia and have an 8800 GTS (G92).

My standard setting is to always run with 4xAA.

It is my subjective opinion that NV's 4xAA quality is still not as good as ATI's. I can only describe it as NV's 4xAA looking "uneven", while on all my ATI cards the picture (subjectively speaking) looked smoother with 4xAA on.

Also, since I now run all games at 1680x1050 or 1440x900, I would expect fewer visible jaggies than I had with ATI running 4:3 at 1360x1024.

My $0.02

 

AzN

Banned
Nov 26, 2001
Nvidia's 8x and 16x CSAA look pretty good too, with only about a 15% performance hit over 4xAA.
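(To put that figure in perspective with made-up round numbers, not benchmarks: a game running at 60 fps with 4xAA would land around 60 × 0.85 ≈ 51 fps with 16x CSAA if the hit is really 15%.)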
 

lopri

Elite Member
Jul 27, 2002
Originally posted by: flexy

I know this has been discussed to death.
Correct.

I believe the general consensus is that it's subjective, while NV has a technical upper hand.
 

Extelleron

Diamond Member
Dec 26, 2005
ATI did have an image quality advantage in the X1000 vs. GeForce 7 battle, but from all I've seen, that is not really true anymore.

Were you running Adaptive Anti-Aliasing with your ATI card? On the 8800, make sure Transparency Multisampling or Supersampling is enabled, and try one of the Quality modes (8xQ or 16xQ) if the game is older and performance is still OK.
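For anyone wondering where the in-game "4xAA" setting ends and these control-panel modes begin: the game itself only requests a multisample level from the driver, and modes like 8xQ, 16x CSAA, or Adaptive AA are the driver substituting or extending that request. Here is a minimal, hypothetical D3D9-era sketch (function name and parameter choices are my own illustration, not from any title in this thread) of how an application asks for 4x MSAA:

```cpp
#include <d3d9.h>

// Sketch only: ask the driver whether 4x MSAA is available for this mode
// and fill in the present parameters accordingly.
bool RequestFourXMSAA(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
{
    DWORD levels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,          // back buffer format
        FALSE,                    // full screen
        D3DMULTISAMPLE_4_SAMPLES, // the 4xAA being discussed
        &levels);

    pp.SwapEffect = D3DSWAPEFFECT_DISCARD; // required for multisampling

    if (FAILED(hr) || levels == 0)
    {
        pp.MultiSampleType = D3DMULTISAMPLE_NONE; // fall back to no AA
        return false;
    }

    pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
    pp.MultiSampleQuality = 0; // plain 4x; NVIDIA reportedly exposed its
                               // CSAA modes through the higher quality
                               // levels in D3D9
    return true;
}
```

Whatever the game requests here, the control panel can still force or enhance it, which is why two cards can look different at the "same" 4xAA setting.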
 

Tempered81

Diamond Member
Jan 29, 2007
I think the AA on my old 9800 looked better than on my 8800GS, but it looked okay on my 8800GT. I noticed a difference in AA image quality when I switched from driver 169.25 to 174.16; it seemed like it got worse (I noticed jaggies a little more).