AAA Versus TrAA

Page 3 of a discussion thread on the AnandTech Forums.

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
A better thread on AA and DX11, imo, would be one addressing the unfortunate situation of so many DX11 games using deferred shading/lighting, which makes conventional AA methods useless.

To compensate, many new games are shipping with FXAA/MLAA implementations, which, if you care about AA quality, are just atrocious. It's not even AA as far as I'm concerned; it's just a nasty blur effect applied not just to edges but to every texture and visual you see. Yes, we see some DX11 games using deferred MSAA, but it's extremely taxing on performance and does not affect all edges. And when I say taxing, we are talking a 20% or higher hit to framerates; see Battlefield 3.
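The "blur applied to everything" complaint comes from how post-process AA works: the filter has no geometry information, so it edge-detects on final-frame luma and blends anything that contrasts, shading and texture detail included. A toy sketch of that mechanism follows (this is not any shipped FXAA/MLAA code; the function names and the threshold value are made up for illustration):

```python
import numpy as np

def luma(rgb):
    # Perceptual brightness, the quantity post-process AA edge-detects on
    return rgb @ np.array([0.299, 0.587, 0.114])

def post_process_aa(img, threshold=0.1):
    """Toy FXAA/MLAA-style filter: find luma edges, blend them with neighbors.

    img: float array of shape (H, W, 3), values in [0, 1].
    """
    lum = luma(img)
    # Luma contrast against the four direct neighbors (edge-clamped)
    pad = np.pad(lum, 1, mode="edge")
    up, down = pad[:-2, 1:-1], pad[2:, 1:-1]
    left, right = pad[1:-1, :-2], pad[1:-1, 2:]
    contrast = (np.maximum.reduce([up, down, left, right])
                - np.minimum.reduce([up, down, left, right, lum]))
    edge = contrast > threshold
    # Blend flagged pixels with a 3x3 box average -- this is the "blur":
    # it acts on texture detail exactly as it acts on geometry edges.
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = lum.shape
    box = sum(padded[dy:dy + h, dx:dx + w]
              for dy in range(3) for dx in range(3)) / 9.0
    out = img.copy()
    out[edge] = box[edge]
    return out
```

Because the edge test only sees luma contrast, a sharp texture or specular highlight trips it just as readily as a polygon silhouette, which is exactly why these filters soften the whole image.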

Diablo 3, a highly anticipated blockbuster title, is using deferred lighting/shading and the DX9 path - guess what? FXAA/MLAA is the only 'anti-aliasing' option.

The sad situation lately is that more and more recent games are using FXAA/MLAA and offer no real AA modes at all. In the few that do offer deferred MSAA, you had better be running SLI/CF with the best cards on the market just to enable it. Contrast this with being able to turn on 4x MSAA on any mid-range card in games of the past with a minimal performance hit.

True anti-aliasing looks to be dying, with developers opting for the trash that is FXAA/MLAA because it's basically free AA, incurring hardly any performance hit.

:thumbsdown::thumbsdown:

Kudos for bringing up the most important point as far as I'm concerned.

I have yet to see any difference between "very high quality" and "super high quality". So for the few people interested in stuff like that, have at it.

Personally, I think gameplay is king, and I'd rather there were more focus on that. In effect, I'd like them to just "go with the trash that is FXAA/MLAA" and spend resources/time on gameplay rather than very silly stuff.

Immersion is important, but damn, I was way more immersed playing Baldur's Gate 1-2 than I ever was playing an FPS.

Making stuff look "photorealistic" as per Max Payne is cool, but to throw away important gameplay mechanics in order to include more graphical fanciness is a shot in one's own foot, in my opinion.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76


Consoles.




That's great, then, because AMD has better IQ options than Nvidia.

Of course, if you really want, you can scour the internet for AA compatibility bits and then use the highly unreliable Nvidia Inspector and pray that something works! :D

A simple test is Dead Space 2 on Nvidia hardware versus AMD hardware. On AMD, you click one item in CCC to enable SSAA and it works. On Nvidia: scour the net for an AA compatibility bit, rummage through Nvidia Inspector, and update your Windows registry with new settings. Nvidia SSAA doesn't work (the game turns into a slide show). You can enable up to 8x MSAA, but you have to disable in-game shadows or you will get corruption.

I think I like option 1 better. AMD has better IQ in this example, and I'll be happy to name more examples if you'd like.

Dead Space not having any AA options is not NV's fault. Enable 4x SGSSAA, force 4x MSAA, enable the right compatibility bit for it, and it works just fine. The reason you have to go through more steps is that SGSSAA (like TrSSAA) is triggered from MSAA, and since Dead Space doesn't really support MSAA, you have to work around it.

AMD's SGSSAA doesn't work with an MSAA trigger, it just works with the CCC setting. But it also doesn't work with DX10+.

Anyway, why is this relevant? Go complain to the makers of Dead Space, not NV and AMD.
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126

It also addresses the OP. Nvidia does offer TrAA in DX10, whereas AMD doesn't do AAA in DX10. But neither can do squat in DX11, because you can't force AA of any type in these new DX11 games.

You are limited to what the developer implemented, or to using an out-of-game FXAA/MLAA injector, which is trash.

I have no idea what DX12 holds, but hopefully it is not more of the same limited AA options. In the realm of anti-aliasing, shockingly, DX11 has been a regression rather than an evolution.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Essentially. It was your post that said AMD didn't have AAA for DX10/11, so I decided to look it up and found that I had missed this in the review of the 7970. So, yeah... it saved me from making a purchase that would have disappointed me in some respects.

With that said, since the 7970 was designed to be an Eyefinity card, it has massive fill rate. So perhaps AMD does have the power to apply SSAA in some newer games @ 1680x1050. If that is the case, it might put me back into the 'maybe' camp for a 7970 purchase. But I would need benchmarks to confirm this...


Oh my god... it's great that it saved you from making a purchase you would otherwise be disappointed by.

When are you making your next "this brand doesn't have this function, what do you think about it? Did you know that the other brand HAS that function?" topic?

I mean, since when did we start making topics about every little thing "we found out that we personally didn't know" ("but should have known"), instead of asking about it in a thread where much the same discussion is already going on?

Color me disappointed.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Color me disappointed.

I am sorry to have disappointed you.

I actually like these types of threads. They bring out good information that only a select few really know. I have already learned several things from this thread. That is the goal of the video forum, in my opinion: to discuss and learn things.
 
Last edited:

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
By all means.
Something tells me you're trying to balance out your previous "did you know nvidia can't do this and that amd can" topic, and why shouldn't you. I'm not bothered at all.

The more important discussion, imo, should have been about what type of games we want, not whether you can spot a grain of detail on a 2600x1600 display (?) and judge it better than another (to most eyes) identical grain on another display with another video card.

This is like discussing two things that are irrelevant in the big picture, or that will become irrelevant if the trends groove pointed out keep moving forward.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
So very true. I won't lose any sleep over developers implementing post-process AA as an additional option, but it's in no way a proper replacement for real AA.


I also only see shader-based AA as an add-on/refinement of MSAA/SSAA... not a replacement.
In order to replace something, one would want better IQ... or at least the same IQ with better performance... not lower IQ for more performance.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0

Shaderbased AA as a refinement of SSAA? What?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
I see. I just had a friend with a 6970 check Minecraft with AAA and they're getting the same artifacts (and still no AA on transparent textures), so it sounds like you're mostly right.
Yep, with AMD you should get AAA working for alpha textures, but using SSAA will be undefined.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Are you sure about this? I was under the impression with nvidia drivers 258 and onward, that this was fixed.
100% positive; it was never fixed in OpenGL. I’ve actually asked nVidia to fix it several times.

Trust me, I’d know as soon as it was fixed.
OGL should be applying TrAA with the applicable setting, but I am not at home to try it out.
No, because nVidia’s TrAA doesn’t work in OpenGL.

The artifacting is a result of how SSAA does business. The image is rendered internally at a substantially higher resolution, and then each pixel is colored from the average of all the appropriate pixels of the high-resolution image. The artifacts come from bugs in the Minecraft engine: the approximated pixel colors, when downscaled, produce artifacts onscreen.
Uh, no. The pink screenshots clearly have NO AA being applied to them. Furthermore, the TrAA screenshots actually have SSAA applied and look perfectly fine.

It’s likely a driver glitch from trying to apply an undefined driver setting.
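For what it's worth, the supersampling mechanism the quoted post describes (render at a higher internal resolution, then average blocks of samples down to each screen pixel) can be sketched in a few lines. This is a toy numpy illustration of ordered-grid supersampling, not driver code; the function name is made up:

```python
import numpy as np

def ssaa_downsample(hi_res, factor):
    """Box-filter an image rendered at `factor`x resolution down to target size.

    Each output pixel is the average of a factor x factor block of samples,
    which is what turns partially covered edge pixels into blended shades.
    """
    h, w, c = hi_res.shape
    assert h % factor == 0 and w % factor == 0
    return hi_res.reshape(h // factor, factor,
                          w // factor, factor, c).mean(axis=(1, 3))
```

A hard geometry edge in the high-resolution render that cuts through a sample block comes out as an intermediate shade in the final pixel, which is the anti-aliasing effect itself; any remaining artifacts would have to come from what was rendered into those samples, not from the averaging step.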
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I await the day nVIDIA unleashes all of its AA modes in the control panel... it would give them such an advantage in terms of flexibility. I still have no idea why they don't implement them. Those hybrid modes are so good for old games.