AAA Versus TrAA

Page 2

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Eh, MSAA + TrSSAA doesn't really match true SSAA. With MSAA + TrSSAA you're anti-aliasing polygon edges and alpha textures; however, you're not anti-aliasing shaders. The high cost of true SSAA means this is usually a worthwhile tradeoff, but there's an obvious quality difference, particularly in games that have heavy shader aliasing due to specular lighting.

That might be. I just know that I barely notice any aliasing at all when I use MSAA+SSTrAA. Maybe it is just the games that I play? I hate aliasing, but I also like a decent frame rate.

BTW, I think nVidia released a tool that allows the 400+ series to use 2X SSAA and 4X SSAA.
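
To illustrate the shader-aliasing point from the quote above, here is a toy 1D sketch in plain Python (nothing to do with any real driver or API): a narrow specular-like highlight is sampled once per "pixel", roughly how a pixel shader runs under MSAA, versus averaged over several subsamples per pixel, roughly how SSAA effectively shades. The function, pixel count, and subsample count are all made up for illustration.

```python
import math

def specular_like(x):
    # Stand-in for a high-frequency shader term (e.g. a tight specular highlight).
    return 1.0 if math.sin(40.0 * x) > 0.95 else 0.0

pixels, subsamples = 32, 8

shade_once_per_pixel = []   # MSAA-style: the shader runs once, at the pixel centre
shade_per_subsample = []    # SSAA-style: the shader runs for every subsample

for p in range(pixels):
    centre = (p + 0.5) / pixels
    shade_once_per_pixel.append(specular_like(centre))
    acc = sum(specular_like((p + (s + 0.5) / subsamples) / pixels)
              for s in range(subsamples))
    shade_per_subsample.append(acc / subsamples)

# The once-per-pixel result snaps between 0.0 and 1.0 (shader aliasing remains),
# while the per-subsample result contains filtered, in-between values.
print(["%.2f" % v for v in shade_once_per_pixel])
print(["%.2f" % v for v in shade_per_subsample])
```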
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Furthermore, the fact remains that nvidia does not have SSAA in the driver
That's not true, there's driver-level support for it.

You can only "try" to enable it in nvidia inspector, and that doesn't work most of the time. Please prove me wrong on that one; I have 200+ games on Steam and I have tried all sorts of combinations with both sets of hardware. Nvidia override doesn't work a GREAT DEAL of the time. Christ, you can plainly see it in the games list in nvidia inspector: most of them don't have AA compatibility bits, which means any type of override does nothing, and a great number of them have "treat override as application preference" flagged. AMD absolutely has more reliable override than nvidia does, end of.
I think your confusion is coming from the fact that you don't fully understand how profiles work, how to use nVidia Inspector, and/or how nVidia’s AA combinations work.

Do you consider SSAA via nvidia inspector a proper means of enabling SSAA (when it works, which is probably 20% of the time)? Do you think more than 1% of enthusiasts even have nvidia inspector?
SSAA will work anywhere MSAA will work. Is it in the stock control panel? No, but then AMD's missing useful stuff in their control panel too. Third party apps are essential for anyone who wants the best from their GPU.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Please, show me these details because that makes no sense. SSAA is transparent to the application; it simply shifts the pixel centre when rendering and then combines the values. There’s absolutely no reason why it should misinterpret the alpha channel.

I’ve tried TrAA in ~45 OpenGL games, and in every case I’m getting SSAA.

Hmm? How are you getting SSAA out of TrAA? TrSS only affects polygons with transparent textures and does not do all of the brute-force work that SSAA does.
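
For reference, the quoted "shifts the pixel centre when rendering and then combines the values" description maps onto something like the sketch below: render the same frame several times with sub-pixel offsets, then average the passes. The placeholder renderer and the 2x2 offset pattern are invented for illustration and are not any vendor's actual implementation.

```python
def render_scene(width, height, x_off, y_off):
    # Placeholder renderer: a hard diagonal edge, the classic aliasing case.
    return [[1.0 if (x + x_off) > (y + y_off) else 0.0 for x in range(width)]
            for y in range(height)]

def ssaa_multipass(width, height, offsets):
    passes = [render_scene(width, height, ox, oy) for ox, oy in offsets]
    # Combine the passes: per-pixel average.
    return [[sum(p[y][x] for p in passes) / len(passes) for x in range(width)]
            for y in range(height)]

offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]  # assumed 2x2 pattern
for row in ssaa_multipass(8, 8, offsets):
    print(" ".join("%.2f" % v for v in row))  # intermediate values appear along the edge
```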
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Please, show me these details because that makes no sense. SSAA is transparent to the application; it simply shifts the pixel centre when rendering and then combines the values. There’s absolutely no reason why it should misinterpret the alpha channel.

I’ve tried TrAA in ~45 OpenGL games, and in every case I’m getting SSAA.
Sure. This isn't the best example, but it's the easiest thing I could whip up.

Minecraft 0x AA

Minecraft 8x SSAA

Minecraft 8x TrSSAA

Notice the pink banding in the water, and the other various artifacts with true SSAA; or for that matter how SSAA still only fixes polygon aliasing, but misses texture aliasing. These do not occur with TrSSAA because of how TrSSAA handles transparencies differently.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
How are you magically getting SSAA out of TrAA? TrSS *only* affects polygons with transparent textures and does not do all of the brute-force work that SSAA does. Using MSAA + TrSS doesn't magically give you SSAA. They're not the same thing.
The same way SSAA was first discovered on Fermi: the setting was broken. The setting was fixed in D3D but not in OpenGL. I thought you knew all this stuff?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Yes, the GeForce SSAA Tool. Though don't let the name fool you, it's for any Fermi card. It's just setting registry entries.

Hmm, I wasn't aware of this. Still, nvidia should really just add the ability to use it within the control panel (in the global settings or program settings tab); I've been rallying for it on the geforce.com forums :( It seems like trying to use SSAA in older titles with nvidia is a huge hassle compared to AMD (where you can literally do it with one click).

I don't think it's realistic to use SSAA in modern (DX11) games, but it's a great perk for 2+ year-old games. Anyway, I've walked away from this thread with more knowledge; my bad if I got a little too wound up.
 
Last edited:

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Yes, the GeForce SSAA Tool. Though don't let the name fool you, it's for any Fermi card. It's just setting registry entries.

Yep, that is it. Is that what you used for Minecraft? Does anyone know the performance hit of 4X SSAA? I may do some benchmarks when I get home. But if someone has already done the homework, I'd be interested to see what performance hit 4X SSAA has. I would guess that screen resolution would have a huge impact on the performance drop of 2X or 4X. Fill-rate demands of 1680x1050 @ 4X SSAA would be more than 2560x1600 without AA, correct?
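
For what it's worth, the raw sample-count arithmetic supports that guess. This ignores memory bandwidth, shading cost, and everything else, so treat it only as a rough back-of-the-envelope comparison:

```python
ssaa_samples = 1680 * 1050 * 4   # 4x SSAA shades/stores ~4 samples per pixel: 7,056,000
native_2560  = 2560 * 1600       # no AA at 2560x1600: 4,096,000 pixels
print(ssaa_samples / native_2560)  # ~1.72, so yes, noticeably higher fill-rate demand
```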
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
SSAA will work anywhere MSAA will work.
I'm not sure I'd make such a wide claim. Personally I'd give up someone else's left arm for SSAA in BF3. MSAA works if enabled in BF3, but forcing SSAA on top of that on NVIDIA cards doesn't.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Yep, that is it. Is that what you used for Minecraft? Does anyone know the performance hit of 4X SSAA? I may do some benchmarks when I get home. But if someone has already done the homework, I'd be interested to see what performance hit 4X SSAA has. I would guess that screen resolution would have a huge impact on the performance drop of 2X or 4X. Fill-rate demands of 1680x1050 @ 4X SSAA would be more than 2560x1600 without AA, correct?
For Minecraft I'm using TrSSAA (which is available in NVIDIA's regular control panel), not true SSAA. Though I did use that tool to force true SSAA for the screenshots I took for BFG.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Sure. This isn't the best example, but it's the easiest thing I could whip up.
Minecraft 0x AA
Minecraft 8x SSAA
Minecraft 8x TrSSAA
Notice the pink banding in the water, and the other various artifacts with true SSAA; or for that matter how SSAA still only fixes polygon aliasing, but misses texture aliasing. These do not occur with TrSSAA because of how TrSSAA handles transparencies differently.
I think you need to look at your screenshots again. 8xTrSSAA.png is clearly giving SSAA across the entire scene, just like I said it was (TrAA controls SSAA in OpenGL).

8xSSAA.png is not applying any AA reliably and is likely just driver artifacting from a setting that isn't implemented properly in OpenGL.

I'm not sure I'd make such a wide claim.
Based on theory and practice, I've never seen a game where it wasn't true, including deferred-renderer engines with their own MSAA. I don't have BF3 so I can't test it, but I'm 100% confident I could get it to work.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I think you need to look at your screenshots again. 8xTrSSAA.png is clearly giving SSAA across the entire scene, just like I said it was (TrAA controls SSAA in OpenGL).

8xSSAA.png is not applying any AA reliably and is likely just driver artifacting from a setting that isn't implemented properly in OpenGL.
I'd completely agree with you if SSAA worked on AMD cards, but it doesn't. AMD cards produce the same artifacts using their SSAA mode (and I believe their AAA mode is the same).

I'm not 100% sure what NVIDIA is doing, but their TrSSAA mode under OpenGL is distinct from anyone's true SSAA mode.

Based on the theory and practice, I've never seen a game where it wasn't true, including deferred renderer engines with their own MSAA. I don't have BF3 so I can't test it, but I'm 100% confident I could get it to work.
I wish BF3 wasn't so expensive as I'd love to get you a copy to put that to the test. It hasn't been for a lack of effort on my part; BF3 seems to completely ignore what NVIDIA's drivers tell it to do.
 
Last edited:

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
ArchAngel777, is this thread my fault? :D

Everything BFG10K said is truth.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
I'd completely agree with you if SSAA worked on AMD cards, but it doesn't. AMD cards produce the same artifacts using their SSAA mode (and I believe their AAA mode is the same).
AMD doesn't support SSAA in OpenGL, only AAA. You can't make a comparison there.

I'm not 100% sure what NVIDIA is doing, but their TrSSAA mode under OpenGL is distinct from anyone's true SSAA mode.
There's no TrAA in OpenGL with nVidia; in OpenGL that setting controls SSAA.

Your 8xTrSSAA.png is 8xSSAA while 8xSSAA.png is undefined and driver artifacting as a result.
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
AMD doesn't support SSAA in OpenGL, only AAA. You can't make a comparison there.


There's no TrAA in OpenGL with nVidia; in OpenGL that setting controls SSAA.

Your 8xTrSSAA.png is 8xSSAA while 8xSSAA.png is undefined and driver artifacting as a result.
I see. I just had a friend with a 6970 check Minecraft with AAA and they're getting the same artifacts (and still no AA on transparent textures), so it sounds like you're mostly right. :)

Anyhow, unless I'm interpreting this wrong, this would mean that NVIDIA's broken TrSSAA-that's-really-SSAA mode is the only form of enhanced AA under OpenGL. Everything on the AMD side, as well as NVIDIA's official SSAA mode, is actually plain MSAA.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
AMD doesn't support SSAA in OpenGL, only AAA. You can't make a comparison there.


There's no TrAA in OpenGL with nVidia; in OpenGL that setting controls SSAA.

Your 8xTrSSAA.png is 8xSSAA while 8xSSAA.png is undefined and driver artifacting as a result.

Are you sure about this? I was under the impression that with nvidia drivers 258 and onward this was fixed. OGL should be applying TrAA with the applicable setting, but I am not at home to try it out. I'll send an email to confirm whether this is the case, but you are talking about something that was broken years ago. I'm fairly certain that it has been fixed.

The artifacting is a result of how SSAA does business. The scene is rendered internally at a substantially higher resolution, and then each output pixel is colored from the average of all appropriate pixels of the high-resolution image. The artifacts come from bugs in the Minecraft engine: the approximated pixel colors, when downscaled, produce artifacts onscreen.
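
That "render big, then average down" view of SSAA is easy to sketch. The snippet below is a minimal illustration with a hypothetical placeholder renderer, not how any driver actually implements it:

```python
def render_high_res(width, height, k):
    # Placeholder: a hard diagonal edge rendered at k*width by k*height.
    return [[1.0 if x > y else 0.0 for x in range(width * k)]
            for y in range(height * k)]

def box_downsample(hi, width, height, k):
    # Each output pixel is the average of its k*k block of high-res samples.
    return [[sum(hi[y * k + j][x * k + i] for j in range(k) for i in range(k)) / (k * k)
             for x in range(width)]
            for y in range(height)]

k = 2  # "2x2" supersampling, i.e. 4 samples per output pixel
final = box_downsample(render_high_res(8, 8, k), 8, 8, k)
# If the high-resolution content itself is wrong (engine bugs), the averaging
# step simply bakes those errors into the final pixels.
```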
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
A better thread for discussing AA and DX11, imo, would be one addressing the unfortunate situation of so many games under DX11 using deferred shading/lighting, making conventional AA methods useless.

To compensate, many new games are coming with FXAA/MLAA implementations, which, if you care about AA quality, are just atrocious. It's not even AA as far as I am concerned; it's just a nasty blur effect applied not just to edges but to every texture and visual you see. Yes, we see some DX11 games using deferred MSAA, but it's extremely taxing on performance and does not affect all edges. And when I say taxing, we are talking a 20% or higher hit to framerates; see Battlefield 3.

Diablo 3, a highly anticipated blockbuster title, is using deferred lighting/shading and the DX9 path, and guess what? FXAA/MLAA is the only 'anti-aliasing' option.

The sad situation lately is that more and more recent games are using FXAA/MLAA and you get no real AA modes available at all. In the few that offer deferred MSAA, you'd better be running SLI/CF of the best cards on the market to even enable it. Contrast this with being able to turn on 4x MSAA with any mid-range card in games of the past with a minimal performance hit.

True anti-aliasing looks to be dying in favour of developers opting to go with the trash that is FXAA/MLAA because it's basically free AA, incurring hardly any performance hit.

:thumbsdown::thumbsdown:
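
To make the "blur applied to everything" complaint concrete, here is a deliberately crude sketch of the post-process idea, not actual FXAA or MLAA; the threshold and blend weight are invented. The point is that the filter only sees the final framebuffer, so it cannot tell a polygon edge from sharp texture detail, and it blends both.

```python
def post_process_aa(img, threshold=0.25):
    # img: greyscale image as a list of rows of floats in [0, 1].
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            local_avg = (img[y - 1][x] + img[y + 1][x] +
                         img[y][x - 1] + img[y][x + 1]) / 4.0
            # Contrast-based "edge" test: geometric edges and crisp texture
            # detail look identical here, so both get blended (blurred).
            if abs(img[y][x] - local_avg) > threshold:
                out[y][x] = 0.5 * img[y][x] + 0.5 * local_avg
    return out

# Tiny demo: a single bright texel ("texture detail") gets softened too.
frame = [[0.0] * 5 for _ in range(5)]
frame[2][2] = 1.0
print(post_process_aa(frame)[2][2])  # no longer 1.0: the detail was blurred
```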
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
I was disappointed to learn (I should have known this before, but I must have missed it!) that AMD does not support Adaptive AA in DX10 or 11. They only support it in DX9.

On the other hand, nVidia does support this in DX9/10/11. I have been a TrAA junkie ever since it was implemented. I always use it and always wish reviewers would use it in their tests. If you doubt this, check my post history; I am sure I have posted about TrAA a hundred times over the years. Anyway, this is a staple option [for me], which means the 7970 is no longer on my 'to buy' list. Unless someone can convince me that I am mistaken? Am I mistaken?

How do others feel about this? I really hate aliasing in alpha textures. I don't have a lot of experience with AMD cards except in my laptop, and it performs exceptionally, but because it is a mobile GPU I don't expect a lot from it, although the 6770M is an impressive notebook chip and is quite up to the task for most games.

Edit ** I always use [when possible] the supersampling 4X TrAA combined with regular MSAA for polygon edges. I found that the MS (multisampling) version of TrAA often does not work.


Oh, and no PhysX either for the AMD boys... the list goes on and on: CCC driver issues, etc... lol
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
games of the past with a minimal performance hit.

True anti-aliasing looks to be dying in favour of developers opting to go with the trash that is FXAA/MLAA because it's basically free AA, incurring hardly any performance hit.

:thumbsdown::thumbsdown:

So you want to LOWER the IQ for the sake of performance... fine.
Just don't think we all want to follow that path... some of us care about IQ more than we care about +100 FPS...
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
So you want to LOWER the IQ for the sake of performance... fine.
Just don't think we all want to follow that path... some of us care about IQ more than we care about +100 FPS...

That's great then, because AMD has better IQ options than nvidia.

Of course, if you really want, you can scour the internet for AA compatibility bits, then use the highly unreliable nvidia inspector and pray that something works! :D

A simple test is using Dead Space 2 on nvidia hardware and AMD hardware. AMD: you click one item in CCC to enable SSAA and it works. Nvidia: scour the net for an AA compatibility bit, rummage through nvidia inspector, and update your Windows registry with new settings. Nvidia SSAA doesn't work (the game turns into a slide show). You can enable up to 8x MSAA, but you have to disable in-game shadows or you will get corruption.

I think I like option 1 better. AMD has better IQ in this example, and I'll be happy to name more examples if you'd like.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
So you want to LOWER the IQ for the sake of performance... fine.
Just don't think we all want to follow that path... some of us care about IQ more than we care about +100 FPS...

I think you need to re-read what I posted. Honestly, I'm starting to think the offensive nature of most of your posts is that you just don't comprehend English very well and can't understand what you are reading.

Then you post some caustic babble due to that.

I already had to educate you in the other thread where you didn't understand the Kepler delays, now here we are again, with you not understanding another post.
 
Last edited:

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
ArchAngel777, is this thread my fault? :D

Everything BFG10K said is truth.

Essentially. It was your post that said AMD didn't have AAA for DX10/11, so I decided to look it up and found that I had missed this in the review for the 7970. So, yeah... It saved me from making a purchase that would have disappointed me in some aspects.

With that said, since the 7970 was designed to be an Eyefinity card, it has massive fill rate. So, perhaps AMD does have the power to apply SSAA for some newer games @ 1680x1050. If that is the case, it might put me back into the 'maybe' camp for the 7970 purchase. But I would need benchmarks to confirm this...
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
A better thread for discussing AA and DX11, imo, would be one addressing the unfortunate situation of so many games under DX11 using deferred shading/lighting, making conventional AA methods useless.

To compensate, many new games are coming with FXAA/MLAA implementations, which, if you care about AA quality, are just atrocious. It's not even AA as far as I am concerned; it's just a nasty blur effect applied not just to edges but to every texture and visual you see. Yes, we see some DX11 games using deferred MSAA, but it's extremely taxing on performance and does not affect all edges. And when I say taxing, we are talking a 20% or higher hit to framerates; see Battlefield 3.

Diablo 3, a highly anticipated blockbuster title, is using deferred lighting/shading and the DX9 path, and guess what? FXAA/MLAA is the only 'anti-aliasing' option.

The sad situation lately is that more and more recent games are using FXAA/MLAA and you get no real AA modes available at all. In the few that offer deferred MSAA, you'd better be running SLI/CF of the best cards on the market to even enable it. Contrast this with being able to turn on 4x MSAA with any mid-range card in games of the past with a minimal performance hit.

True anti-aliasing looks to be dying in favour of developers opting to go with the trash that is FXAA/MLAA because it's basically free AA, incurring hardly any performance hit.

:thumbsdown::thumbsdown:
So very true. I won't lose any sleep over developers implementing post-process AA as an additional option, but it's in no way a proper replacement for real AA.