I have gamed at 2560x1600 and didn't feel I had to have AA. Stop being a snob, really. No single GPU will handle these resolutions with AA on like you think is absolutely necessary.

AA off looks terrible no matter how high your resolution is. What matters is pixel density, and a 27" 1440p panel doesn't have enough to remove aliasing.
At 2560x1600 you definitely still notice no AA and it is still annoying. That said, AA may be difficult in some games with a single card due to the performance hit, but you would still notice "jaggies". I suppose you could use FXAA in that situation, since the performance hit for FXAA is pretty minor...

At 2560x1440 it isn't that bad to turn AA off.
I absolutely agree with that. There is no harm in providing more information to the OP. If he/she SPECIFICALLY said "I will not buy an AMD card" and RS was still recommending an AMD, I can see why people would get angry.

I don't see how this thread is any different than if a person asked whether an FX8120/8150 is good enough for gaming. Yes, it is, but is it as good as the alternative?
I have gamed at 2560x1600 and didn't feel I had to have AA. Stop being a snob, really. No single GPU will handle these resolutions with AA on like you think is absolutely necessary.
Are you one of the people who said Crysis 1 looked terrible without AA, but then complained that it ran at 10 fps?
You'll have to turn off AA (or use FXAA) just to play some games at 1440p. Try running 4x MSAA on Metro 2033 maxed all the way out at that res on a single card.
But if you stood further back, would there be a point where your eyes just couldn't tell? Then take the flip side: instead of moving away from the monitor, you increase the pixel density. Would you be able to tell on a 4K display at the same screen size and distance as before, without standing further back? There has to be an objective, math-based way to calculate when the jaggies are no longer resolvable by a human eye, I think? I know subjectively some people are more sensitive, but I also think there is a physiological limit to the human eyeball.

I see aliasing at 1080P, 1200P, and 1600P; it makes no difference to my eyes, as the jaggies are always there without AA.
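There is a common back-of-the-envelope way to put numbers on this: work out how many pixels fit into one degree of your field of view (PPD) for a given panel size, resolution and viewing distance, then compare that against normal visual acuity, which is usually quoted as about one arc-minute, i.e. roughly 60 PPD. This is only a rough Python sketch under those assumptions; the ~60 PPD figure is a rule of thumb for 20/20 vision, and the panel sizes and 28" viewing distance below are just illustrative.

```python
import math

def pixels_per_degree(h_res, v_res, diag_inches, view_dist_inches):
    """Rough pixels-per-degree for a flat panel viewed straight on."""
    aspect = h_res / v_res
    # Panel width derived from its diagonal and aspect ratio.
    width_inches = diag_inches * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch = width_inches / h_res                  # inches per pixel
    # Angle a single pixel subtends at the given viewing distance.
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch / (2 * view_dist_inches)))
    return 1.0 / deg_per_pixel

# ~60 PPD is the usual rule of thumb for 1 arc-minute (20/20) acuity.
for label, args in [('27" 1440p at 28"', (2560, 1440, 27, 28)),
                    ('30" 1600p at 28"', (2560, 1600, 30, 28)),
                    ('27" 4K at 28"',    (3840, 2160, 27, 28))]:
    print(f"{label}: ~{pixels_per_degree(*args):.0f} PPD")
```

On these assumptions a 27" 1440p panel comes out around 53 PPD and a 30" 1600p panel around 49 PPD at desk distance, both below the ~60 PPD threshold, which fits the "the jaggies are always there" observation; a 27" 4K panel at the same distance lands around 80 PPD, where aliasing should be much harder to resolve.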
That's why flexibility is a gamer's best friend, and it's actually my signature at Rage3d. You may be fine with no AA; another person with FXAA or MLAA; another with 2x AA; some need supersampling; some need 4x or 8x AA. Which way is correct? As long as there is flexibility for gamers, I'm pretty happy overall. That "right" balance may differ for each individual based on subjective tastes, tolerances, platform and display.

I'd rather take playable performance over AA. AA isn't 100% absolutely necessary. If you can have both, fine, but with a single card at 2560x1440/1600 you can't. That was my point. I'm saying you can show 8xAA is better on one card than another, but when both are around 35 fps I'll take neither and remove AA entirely.
I disagree. AA off looks terrible no matter how high your resolution is. What matters is pixel density, and a 27" 1440p panel doesn't have enough to remove aliasing.
Well, you can clearly see here that FXAA can blur in some games: http://hardforum.com/showthread.php?t=1698508&highlight=fxaa&page=3 (see the toy sketch at the end of this post for why post-process AA can soften detail).

PS
PINGING all blur inspectors!!
Test your skillz on this Skyrim FXAA vs NoAA screenshot.
http://null-url.com/T00D0dW0i0
GL!
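Since the FXAA-blur question came up above: the reason post-process AA can soften an image is that it operates purely on the finished frame, blending pixels toward their neighbours wherever it detects luma contrast, so it cannot tell a jagged polygon edge from crisp texture detail. The snippet below is only a toy illustration of that idea in Python/NumPy, not the real FXAA algorithm (which does edge-direction searches and sub-pixel blending); the threshold value is an arbitrary assumption.

```python
import numpy as np

def toy_postprocess_aa(rgb, contrast_threshold=0.1):
    """Toy 'FXAA-like' pass: blend high-contrast pixels toward their
    neighbours. rgb is a float array of shape (H, W, 3) in [0, 1].

    NOT real FXAA -- just a minimal sketch of why a purely screen-space
    filter smooths jaggies and legitimate texture detail alike."""
    luma_weights = np.array([0.299, 0.587, 0.114])   # Rec. 601-style luma
    luma = rgb @ luma_weights

    # Average of the four axis-aligned neighbours (edge pixels clamped).
    padded = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    neighbour_avg = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    neighbour_luma = neighbour_avg @ luma_weights

    # Blend toward the neighbour average only where luma contrast is high;
    # the filter cannot distinguish an aliased edge from sharp texture.
    contrast = np.abs(luma - neighbour_luma)
    blend = np.clip((contrast - contrast_threshold) /
                    (1.0 - contrast_threshold), 0.0, 1.0)[..., None]
    return rgb * (1.0 - blend) + neighbour_avg * blend

# Example: a hard black/white vertical edge gets softened.
frame = np.zeros((4, 4, 3)); frame[:, 2:] = 1.0
print(toy_postprocess_aa(frame)[0, :, 0])   # values near the edge move toward grey
```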