Zebo
Elite Member
- Jul 29, 2001
Originally posted by: Matthias99
Originally posted by: PhatoseAlpha
I don't like bluriness, and AA, to my eyes, just makes things blurry, not more accurate.
In real life, objects don't have uber-razor-sharp edges like the objects your video card is rendering. It's not "blurring" the image; it's taking extra samples to blend colors around polygon transitions, making it look smoother and more natural. Mathematically, this is also more accurate; it compensates somewhat for only taking one color sample per output pixel in the normal rendering pipeline without AA.
Tack on the performance penalties, and it's not even close to worth considering.
Unless you want your games to look good or something...
Still.
Take a look at these comparison screen shots from AT's X1900 review (make sure to zoom them to 100% so you're seeing the exact pixels):
no AA
6xAA
Look at the edges of the pages of the binder on the desk, and the transitions between the panels on the walls. It looks *horrible* without AA; the 'jaggies' stand out much much more without the colors being blended.
Exactly, and you really notice it in gameplay, since the jaggies tend to crawl along object edges, creating a rippling or tearing effect. Methinks this dude doesn't even have a computer, saying stuff like "AA is blurry" and "not more accurate." Either that or he's a pro gamer; all those guys play 10x7 minimum settings. Third choice: he's clueless about his environment. Fourth: blind.
