AA off looks terrible no matter how high your resolution is. What matters is pixel density, and a 27" 1440p panel doesn't have enough to remove aliasing.
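For reference, the pixel density here is easy to work out; a quick sketch for the 27" 2560x1440 case (the PPI where aliasing stops being visible depends on viewing distance, so treat the number as context, not a verdict):

```python
import math

# 2560x1440 panel: diagonal length in pixels
diag_px = math.hypot(2560, 1440)  # ~2937.2 px

# Pixels per inch on a 27" diagonal
ppi = diag_px / 27                # ~108.8 PPI
print(f"{ppi:.1f} PPI")
```

Roughly 109 PPI, which at typical desktop viewing distances is nowhere near dense enough for individual pixels to disappear.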
At 2560x1440 it isn't that bad to turn AA off.
I don't see how this thread is any different from someone asking whether an FX-8120/8150 is good enough for gaming. Yes, it is, but is it as good as the alternative?
I have gamed at 2560x1600 and didn't feel I had to have AA. Stop being a snob, really. No single GPU will handle these resolutions with the AA you consider absolutely necessary.
Are you one of the people who said Crysis 1 looked terrible without AA, then complained that it ran at 10 fps with it on?
You'll have to turn off AA (or use FXAA) just to play some games at 1440p. Try running 4x MSAA in Metro 2033 maxed out at that resolution on a single card.
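To put rough numbers on why MSAA hurts at that resolution, here's a back-of-the-envelope for raw MSAA buffer memory. This is a simplification that ignores framebuffer compression and extra render targets; the function and the byte sizes are illustrative assumptions, not measured figures:

```python
def msaa_framebuffer_mb(width, height, samples,
                        color_bytes=4, depth_bytes=4):
    """Rough MSAA color+depth buffer size in MB.

    Ignores driver-side compression and additional render
    targets, so real-world usage differs; this only shows
    how the cost scales with resolution and sample count.
    """
    pixels = width * height
    per_sample = color_bytes + depth_bytes
    return pixels * samples * per_sample / (1024 ** 2)

# 4x MSAA at 1080p vs 1440p
print(msaa_framebuffer_mb(1920, 1080, 4))  # ~63.3 MB
print(msaa_framebuffer_mb(2560, 1440, 4))  # ~112.5 MB
```

And that's a single color+depth pair; a deferred renderer like Metro 2033's multiplies the multisampled storage across its G-buffer targets, which is a big part of why MSAA gets so expensive there.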
I see aliasing at 1080p, 1200p, and 1600p alike; the resolution makes no difference to my eyes, as the jaggies are always there without AA.
I'd rather take playable performance over AA. AA isn't absolutely necessary. If you can have both, fine, but with a single card at 2560x1440/1600 you can't. That was my point. You can show that 8x AA looks better on one card than another, but when both are around 35 fps, I'll take neither and turn AA off entirely.
Well, you can clearly see here that FXAA can blur in some games: http://hardforum.com/showthread.php?t=1698508&highlight=fxaa&page=3
PS: PINGING all blur inspectors!!
Test your skillz on this Skyrim FXAA vs NoAA screenshot.
http://null-url.com/T00D0dW0i0
GL
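For anyone wondering why FXAA blurs in shots like that: it's a pure post-process filter, so it only sees the final image and blends across any high-contrast edge, whether that edge is a jaggie or legitimate text/texture detail. A toy sketch of that core idea in Python, assuming a grayscale numpy image in [0, 1] (this is not the real FXAA 3.11 shader, which adds luma weighting, sub-pixel filtering, and an edge-direction search):

```python
import numpy as np

def toy_fxaa(img, threshold=0.1):
    """Toy FXAA-style pass on a grayscale image in [0, 1].

    Only illustrates why post-process AA blurs: it blends
    across every high-contrast pixel, detail included.
    """
    out = img.copy()
    # Neighbors of each interior pixel
    n = img[:-2, 1:-1]; s = img[2:, 1:-1]
    w = img[1:-1, :-2]; e = img[1:-1, 2:]
    c = img[1:-1, 1:-1]
    # Local contrast: max minus min luma in the neighborhood
    lo = np.minimum.reduce([n, s, w, e, c])
    hi = np.maximum.reduce([n, s, w, e, c])
    edge = (hi - lo) > threshold
    # Blend flagged pixels toward the neighbor average
    avg = (n + s + w + e) / 4.0
    out[1:-1, 1:-1] = np.where(edge, 0.5 * c + 0.5 * avg, c)
    return out
```

Because the blend criterion is just local contrast, fine text and sharp texture detail get smoothed right along with the aliasing, which is exactly the softness visible in those comparison screenshots.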