Why was the smearAA promoted over pure edge detect AA?

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
IIRC, edge detect AA was the best thing AMD ever came up with - other than the fact that it didn't use double precision.

Why did Smear ("morphological") AA become more popular?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

MLAA and features like FXAA were welcomed based on their compatibility with games that moved to deferred rendering. Sometimes those games didn't have in-game AA, and the IHVs' control panels couldn't force multi-sampling AA on them. Also, they work in conjunction with the traditional methods.

Never looked at these methods as replacements, but as flexible features that may enhance the experience for a gamer.
 

BoFox

Senior member
May 10, 2008
689
0
0
MSAA (including Edge Detect AA) does not work for many edges in the newer games. Alpha AA (or Adaptive AA) does the job for some transparent textures, but there is still so much shader aliasing (especially with various HDR lighting) that these AA methods only get rid of maybe 30-50% of all the jaggies on the screen, while incurring a large performance hit due to the nature of these demanding games. Back then, a simple DX8 game would look practically jag-free with MSAA plus TrAA/AAA.
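To illustrate the shader-aliasing point: with MSAA the pixel shader still runs only once per pixel, so a sharp specular term is sampled just as coarsely as with no AA at all; only SSAA evaluates shading at every sub-sample. A toy 1-D sketch (the `specular` function and sample placement are made up for illustration, not any real renderer):

```python
import math

def specular(x, power=200):
    # A sharp, high-frequency specular-style lobe; large exponents like this
    # are exactly what causes shader aliasing in HDR-lit games.
    return max(0.0, math.cos(x)) ** power

def shade_msaa(pixel_center, samples):
    # MSAA: the pixel shader runs ONCE at the pixel center; the extra samples
    # only smooth geometric edge coverage, not the shading signal itself.
    return specular(pixel_center)

def shade_ssaa(pixel_center, samples):
    # SSAA: shading is evaluated at every sub-sample position and averaged,
    # which is why it is the only method here that tames shader aliasing.
    offsets = [(i + 0.5) / samples - 0.5 for i in range(samples)]
    return sum(specular(pixel_center + o) for o in offsets) / samples
```

At a pixel sitting on the highlight, the single MSAA evaluation still returns the full-strength peak while the per-sample SSAA average has already smoothed it down, no matter how many coverage samples MSAA uses.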

SSAA was, in my opinion, the best thing AMD came out with (other than AAA) with the 5870, but it only worked for DX9-and-under games, and didn't offer LOD adjustment to make up for the blurry oversampled textures. Now these problems have been largely resolved. Yet its performance hit is MASSIVE.
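The two SSAA details above (the resolve, and the LOD adjustment that fixes the texture blur) can be sketched numerically. Assuming a 2x2 ordered grid, the resolve is a box filter, and the usual counter-measure for over-filtered textures is a negative mip LOD bias of -log2(samples per axis). Function names here are mine, purely illustrative:

```python
import math

def downsample_2x2(img):
    # Box-filter a 2x-per-axis supersampled render down to target resolution
    # (i.e. the resolve step of 2x2 ordered-grid SSAA).
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

def ssaa_lod_bias(factor_per_axis):
    # Texture LOD bias that restores sharpness under SSAA: since each pixel
    # is built from factor^2 sub-samples, mips are selected one level too
    # blurry per doubling; bias by -log2 to compensate (e.g. -1.0 for 2x2).
    return -math.log2(factor_per_axis)
```

The missing LOD bias is what made early driver SSAA look soft: the hardware picked mip levels for the supersampled resolution, then the box filter blurred them a second time.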

MLAA/MLAA 2.0/FXAA 1-4.0/SMAA try to do what SSAA does, but usually with even less of a performance hit than 4x AA. The quality is questionable at times, but is getting better as better methods are developed/used. TXAA is a bit demanding, but looks "cinematic", with temporal aliasing (jitter on aliased edges in motion) being addressed.

Stereo3D gaming with shutter glasses practically gives free 2x1 SSAA as the left and right images are combined/converged, with two different images of the same edge blending together. Looks awesome, with the depth - MY GOSH!!!
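The "free 2x1 SSAA" from stereo can be shown with an idealized 1-D example: if the same edge lands half a pixel apart in the two eye views, averaging the views is exactly the same arithmetic as 2x1 ordered-grid supersampling. This is a deliberately simplified model (real stereo disparity varies with depth), just to show why the fused image looks smoother:

```python
def scene(x):
    # A hard vertical edge at x = 3.7 in a 1-D "image" - the classic
    # point-sampling aliasing case.
    return 1.0 if x < 3.7 else 0.0

def sample_row(offset, width):
    # Point-sample one row with a sub-pixel camera offset
    # (a stand-in for one eye's view of the edge).
    return [scene(i + offset) for i in range(width)]

def ssaa_2x1(width):
    # Conventional 2x1 supersampling: two horizontal sub-samples per pixel.
    return [(scene(i + 0.25) + scene(i + 0.75)) / 2.0 for i in range(width)]

left = sample_row(0.25, 8)    # one eye
right = sample_row(0.75, 8)   # other eye: same edge, half a pixel over
fused = [(l + r) / 2.0 for l, r in zip(left, right)]
# in this idealized setup, 'fused' is pixel-for-pixel identical to ssaa_2x1(8)
```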
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Anarchist420 said:
Why did Smear ("morphological") AA become more popular?

because AA requires:
1. more processing power, which consoles do not have
2. more RAM, which consoles do not have
3. modifying the code to run AA

So developers are all "f that, I am just gonna take the finished image and apply a horribad-looking blur filter to it to reduce the noticeability of aliasing at the cost of image clarity, very little system resources, and 0 actual work"... it's zero work, since you have nVidia and AMD kindly providing the smear filter.
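The "finished image plus blur filter" recipe really is only a few lines. Here is a crude luma edge-detect-and-blend in the spirit of MLAA/FXAA - a toy sketch, not any vendor's actual filter - which shows both why it costs almost nothing and where the "smear" comes from:

```python
def luma(px):
    # Rec. 601-style luminance from an (r, g, b) tuple in [0, 1]
    r, g, b = px
    return 0.299 * r + 0.587 * g + 0.114 * b

def postprocess_aa(img, threshold=0.1):
    # Single pass over a FINISHED frame: wherever luma contrast against a
    # 4-neighbour exceeds the threshold, blend the pixel with its
    # neighbourhood. This is the "smear": edges soften, but so does any
    # fine texture detail that trips the detector.
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = luma(img[y][x])
            if any(abs(c - luma(n)) > threshold
                   for n in (img[y][x-1], img[y][x+1], img[y-1][x], img[y+1][x])):
                out[y][x] = tuple(
                    (img[y][x][ch] + img[y][x-1][ch] + img[y][x+1][ch] +
                     img[y-1][x][ch] + img[y+1][x][ch]) / 5.0
                    for ch in range(3))
    return out
```

Note that it touches nothing but the final color buffer: no engine changes, no extra render-target memory, no interaction with the renderer at all, which is exactly why it can be bolted on by a driver.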
 

omeds

Senior member
Dec 14, 2011
646
13
81
BoFox said:
Stereo3D gaming with shutter glasses practically gives free 2x1 SSAA as the left and right images are combined/converged, with two different images of the same edge blending together. Looks awesome, with the depth - MY GOSH!!!

Great post, couldn't have said it better myself, on all accounts including AA in 3D. I've also noticed jaggies on the same edge are in different positions in each eye in 3D, and when looking with both eyes in motion there is a temporal AA type effect too.

+rep!
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
BoFox said:
these AA methods only gets rid of maybe 30-50% of all the jaggies on the screen, while incurring a large performance hit due to the nature of these demanding games. Back then, a simple DX8 game would look practically jag-free with MSAA plus TrAA/AAA.
This sounds like you are claiming that MSAA + TrAA only anti-aliases 30-50% of the pixels in DX9+ games but did 100% of them in DX8 games. I am pretty sure that is not right.
For one thing, I am pretty sure both MSAA and TrAA did not exist in DX8 days. For another, I am pretty sure they get 100% of the work done even on DX9-11 games.

There are, however, some crappy console ports in which MSAA and its successor CSAA CANNOT work at all, even if you set them to forced on in the driver.

Of course SSAA always looks best, but it is very expensive.
 

omeds

Senior member
Dec 14, 2011
646
13
81
What he's saying is, in older games MSAA + TrAA was enough to clean up the image, but nowadays, with deferred rendering and whatnot, MSAA + TrAA often does not clean up the entire image.
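Why deferred rendering makes MSAA so painful is easy to put numbers on: every G-buffer attachment has to keep every sub-sample alive until the lighting pass resolves them, so MSAA multiplies the memory (and bandwidth) of the whole G-buffer. A back-of-the-envelope sketch - all figures here are hypothetical, not any console's real layout:

```python
def gbuffer_bytes(width, height, render_targets, bytes_per_pixel_per_rt, msaa):
    # Rough memory for a multisampled G-buffer: every attachment stores
    # every sub-sample until the (now per-sample) lighting pass runs.
    return width * height * render_targets * bytes_per_pixel_per_rt * msaa

# Hypothetical 720p deferred renderer: 4 G-buffer targets at 8 bytes/pixel each
no_aa = gbuffer_bytes(1280, 720, 4, 8, 1)   # ~28 MB for this made-up layout
msaa4 = gbuffer_bytes(1280, 720, 4, 8, 4)   # 4x that - ~112 MB
```

On RAM-starved console hardware that multiplier alone rules MSAA out, before even counting the cost of running the lighting shader per sample, which is why a screen-space filter over the final image is so attractive.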
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
taltamir said:
Because AA requires: 1. more processing power, which consoles do not have; 2. more RAM, which consoles do not have; 3. modifying the code to run AA. So developers are all "f that, I am just gonna take the finished image and apply a horribad-looking blur filter to it"... it's zero work, since you have nVidia and AMD kindly providing the smear filter.

ding ding ding we have a winner
 

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
I remember when AA was new, and it was edge anti-aliasing. Then somehow they started anti-aliasing everything, including the textures, which takes a lot more processing power. Why did the option to AA only the edges vanish?
 

BoFox

Senior member
May 10, 2008
689
0
0
omeds said:
What he's saying is, in older games MSAA + TrAA was enough to clean up the image, but nowadays, with deferred rendering and whatnot, MSAA + TrAA often does not clean up the entire image.

Thanks for saying it better than I did! ;)

I still play UT2004, a DX8 game, with TrAA ever since the option first became available with the 7800GTX (and was it also available for the X800XT, or not until the X1800XT came out?).. but SSAA actually cleans it up even more. There are just a few alpha textures that TrAA seems to miss - something to do with specular light maps? With my GTX 460 1GB, either 2x2 SSAA or 4x SGSSAA still brings the frame rates well below the 90fps I like to maintain (Vsync at 90Hz) during intensive battles, and the game is already 8 years old!!!
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
omeds said:
What he's saying is, in older games MSAA + TrAA was enough to clean up the image, but nowadays, with deferred rendering and whatnot, MSAA + TrAA often does not clean up the entire image.

I figured he was; that is what I was referring to as well.

I just wanted to clarify that it's not that MSAA and TrAA are now incapable of cleaning up the image, it's that they are not being applied. Since this rendering method is incompatible with those AA schemes, they are not used at all (even if you set the driver to force them on).

ding ding ding we have a winner

:)
 

BoFox

Senior member
May 10, 2008
689
0
0
taltamir said:
I just wanted to clarify that it's not that MSAA and TrAA are now incapable of cleaning up the image, it's that they are not being applied.

Yay, you win!!!!!!!!!!!!!!!!!!! o_O Just kidding! :cool:
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Tell that to AMD and Square Enix. In Sleeping Dogs, it's either FXAA/MLAA/SMAA/WTF/BBQ or SSAA, or both. There isn't an option for MSAA.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
You can do some amazing things with MSAA in DirectCompute, but it takes time to code, and it's an open standard (read: no money changes hands), so good luck getting all the companies involved on board. As others mentioned, there's a trade-off between performance and quality that also must adapt with rendering engines. As deferred rendering engines become ever more popular because console hardware is ancient, better AA techniques get put on the back shelf. This is because there's little purpose in developing IQ enhancements when your current hardware can barely push the stock image. Hell, that's why you see undersampling being used as developers try to squeeze every last drop of performance out of consoles that are going on 7 years old.

Moving forward, your best results will come from solutions that improve image quality with almost no impact on performance. That said, I'm a fan of SMAA and use an injector in some games (e.g. GW2): http://www.iryoku.com/smaa/ ; http://mrhaandi.blogspot.com/p/injectsmaa.html
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
BoFox said:
I still play UT2004, a DX8 game, with TrAA ever since the option was first available with 7800GTX (and was it also available for the X800XT, or not until X1800XT came out?).. There are just a few alpha textures that TrAA seems to miss - something to do with specular light maps?

Imho,

UT2004 had a lot of transparent alpha blends, while adaptive AA and nVidia's transparency AA only touched transparent alpha tests, imho! If one plays World of WarCraft, certain grass and foliage textures still have aliasing even with transparency or adaptive AA turned up -- these are also transparent alpha blends.

One of the great things about ATI's release of their adaptive AA shortly after nVidia's was that the feature could be used on all ATI product SKUs -- it brought added value to products like the x800 and x850 generations and older.

Very good observation with Stereo 3D -- with each eye receiving a different position, the blending by the brain offers a much smoother full-scene image. This point isn't raised too much, and it's nice to read.

If gamers remember FarCry and Doom 3: they brought exciting specular, lighting and shading abilities, but they also brought specular and shader aliasing -- and it was difficult to curb these efficiently with the traditional methods.

In a way, it is wonderful to see the focus on trying to improve these features by offering flexibility with super-sampling, FXAA, MLAA and even TXAA, which tries to tackle temporal aliasing with some efficiency.
 

BoFox

Senior member
May 10, 2008
689
0
0
Ahh, those damn transparent alpha blends - that's what you call them! :)
 

snarfbot

Senior member
Jul 22, 2007
385
38
91
SMAA is the bomb - if you haven't used the injector, you should. It's really leaps and bounds better than FXAA or MLAA, and doesn't blur the image at all.

I'm sure this has been posted on the forum before, but here's a video of it implemented in CryEngine 3.

http://vimeo.com/31247769

It's funny, because the Crysis 3 alpha seems to be using FXAA, while SMAA is significantly better.
 

omeds

Senior member
Dec 14, 2011
646
13
81
It does blur the image slightly - if you take screenshots and flick through, you can see it on the finer details (say, hi-res textures) - but it's very slight, and without looking at screen caps I wouldn't notice.
I was using it for months; then, after the NV driver that introduced FXAA, I noticed an abnormally large performance hit in games with heavy alphas compared to older drivers. Not sure what's going on there, but yeah, it takes a bigger hit than it used to.
I was using it for months, then after the Nv driver that introduced FXAA I noticed an abnormally large performance hit in games with heavy alphas compared to older drivers, not sure whats going on there, but yeh, it takes a bigger hit than it used to.