1920x1200 resolution: do you use AA/AF?

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I use 0xAA 16xAF in games that use deferred rendering, since there's no point to 'proper' AA there anyway, and 0xAA 16xAF for Crysis obviously, but everything else is usually 2xAA 16xAF @ 1920.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: cmdrdredd
I just don't understand why people say that 1920x1200 with 0xAA looks bad when a lot of people are using 4xAA at 1280x1024 and the game looks spectacular. I would think that the increased resolution was more beneficial than doing AA.

It's all about pixel pitch http://en.wikipedia.org/wiki/Dot_pitch

Notice that the pixel pitch on a 24" 1920x1200 screen is 0.270mm, while the pixel pitch on a 17" 1280x1024 screen is 0.264mm. The pixels on the 17" screen are actually (slightly) smaller than on the 24" screen, so the edges would actually look sharper without AA on the 17" than on the 24".
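
To put rough numbers on that comparison, here is a small Python sketch that derives pixel pitch from a screen's diagonal and native resolution (assuming square pixels and the nominal diagonal size; real panel specs can differ slightly):

```python
import math

def pixel_pitch_mm(diagonal_inches, width_px, height_px):
    """Pixel pitch in mm, assuming square pixels and the nominal diagonal."""
    diagonal_px = math.hypot(width_px, height_px)   # diagonal length in pixels
    return diagonal_inches * 25.4 / diagonal_px     # 25.4 mm per inch

print(f'24" 1920x1200: {pixel_pitch_mm(24, 1920, 1200):.3f} mm')  # close to the 0.270mm quoted above
print(f'17" 1280x1024: {pixel_pitch_mm(17, 1280, 1024):.3f} mm')  # close to the 0.264mm quoted above
```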

Needless to say, the 24" gaming experience will be more impressive simply due to the wow factor a 24" screen has over a 17" screen, which is why they generally don't make 17" widescreen monitors with a 1920x1200 resolution for super-sharp, non-AA gaming. There are, however, laptops with 17" screens like that, and a lot of people complain that the desktop icons are too small.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I just don't understand why people say that 1920x1200 with 0xAA looks bad when a lot of people are using 4xAA at 1280x1024 and the game looks spectacular.
They probably haven't seen better. We were all blown away by how good GLQuake looked @ 640x480 until we saw Unreal running at 1024x768, for example.