A better thread for discussing AA and DX11, imo, would be one addressing the unfortunate situation of so many DX11 games using deferred shading/lighting, which makes conventional AA methods useless.
To compensate, many new games are shipping with FXAA/MLAA implementations, which, if you care about AA quality, are just atrocious. It's not even AA as far as I'm concerned - it's a nasty blur effect applied not just to edges, but to every texture and visual you see. Yes, we do see some DX11 games offering deferred MSAA, but it's extremely taxing on performance and doesn't catch all edges. And when I say taxing, we're talking a 20% or higher hit to framerates; see Battlefield 3.
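For anyone wondering *why* FXAA/MLAA smear textures and not just geometry edges, here's a toy sketch of the general post-process idea (this is NOT the actual FXAA algorithm - just a stand-in that works the same way in principle): the filter runs on the final image with no geometry info, so it can only look at luminance contrast, and a high-contrast texture trips it exactly like an aliased edge does.

```python
# Toy post-process "AA" sketch (not real FXAA): detect edges purely
# from per-pixel luminance contrast, then blur the detected pixels.
# Because textures also have high local contrast, they get blurred too.

def neighbor_contrast(img, x, y):
    """Max luminance difference between a pixel and its 4 neighbors."""
    c = img[y][x]
    h, w = len(img), len(img[0])
    diffs = [0.0]
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < w and 0 <= ny < h:
            diffs.append(abs(img[ny][nx] - c))
    return max(diffs)

def post_process_aa(img, threshold=0.2):
    """Blur every pixel whose local contrast exceeds the threshold."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if neighbor_contrast(img, x, y) > threshold:
                # Average with neighbors: the "blur", applied to
                # textures and geometry edges alike.
                samples = [img[y][x]]
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < w and 0 <= ny < h:
                        samples.append(img[ny][nx])
                out[y][x] = sum(samples) / len(samples)
    return out

# Left half: a hard geometry edge (0.0 vs 1.0) - a legit AA target.
# Right half: a checkerboard "texture" that is NOT an edge.
img = [[(1.0 if x >= 2 else 0.0) if x < 4 else float((x + y) % 2)
        for x in range(8)] for y in range(4)]
smoothed = post_process_aa(img)
# The checkerboard pixels get averaged down just like the edge did -
# real texture detail is lost, which is the complaint above.
```

MSAA avoids this because it resolves extra coverage samples per geometry edge at rasterization time, which is also exactly why it breaks down when a deferred renderer shades from G-buffers instead.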
Diablo 3, a highly anticipated blockbuster title, is using deferred lighting/shading on the DX9 path - and guess what? FXAA/MLAA is the only 'anti-aliasing' option.
The sad situation lately is that more and more recent games are using FXAA/MLAA with no real AA modes available at all. In the few that do offer deferred MSAA, you'd better be running SLI/CF with the best cards on the market to even enable it. Contrast that with games of the past, where you could turn on 4x MSAA with any mid-range card for a minimal performance hit.
True anti-aliasing looks to be dying, with developers opting for the trash that is FXAA/MLAA because it's basically free, incurring hardly any performance hit.
:thumbsdown::thumbsdown: