What happened to edge only anti-aliasing?

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
I remember in the '90s, when 3D gaming was relatively new, we had full-scene (or full-screen) anti-aliasing, where the whole image was rendered at a higher resolution and then downscaled. But then a much more efficient method was invented: edge anti-aliasing, which only anti-aliased polygon edges and left polygon faces alone. Now, as far as I can tell, modern anti-aliasing methods are all variants on the full-screen approach (or blurring applied to the final image, in the case of FXAA). Why was the edge method abandoned?

http://gaming.stackexchange.com/que...the-different-anti-aliasing-multisampling-set

According to this article, the edge-only method is called "object-based anti-aliasing".
http://en.wikipedia.org/wiki/Spatial_anti-aliasing
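The full-scene approach described above can be sketched in a few lines. This is a toy illustration, not a real renderer: the `shade` function is a made-up stand-in (a hard-edged circle), and the "AA" is just rendering at 4x resolution and box-filtering down.

```python
import numpy as np

def shade(x, y):
    """Toy 'renderer': a hard-edged circle, the kind of shape that aliases."""
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.1 else 0.0

def render(width, height, factor=1):
    """Render at factor-times the target resolution, one sample per pixel."""
    img = np.empty((height * factor, width * factor))
    for j in range(height * factor):
        for i in range(width * factor):
            img[j, i] = shade((i + 0.5) / (width * factor),
                              (j + 0.5) / (height * factor))
    return img

def downsample(img, factor):
    """Box filter: each output pixel averages a factor-by-factor block."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

aliased = render(16, 16)                        # 1 sample/pixel: hard stairsteps
ssaa = downsample(render(16, 16, factor=4), 4)  # 4x4 = 16 samples/pixel
```

The aliased image contains only pure 0s and 1s; the supersampled one has fractional grey values along the circle's edge, which is exactly the smoothing FSAA buys.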
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Today, aliasing doesn't just happen at edges but everywhere (specular aliasing, for example). And if you have lots of alpha tests (vegetation), those alias as well. So I don't think this type of AA would be very effective today in terms of image quality.
 

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
Today, aliasing doesn't just happen at edges but everywhere (specular aliasing, for example). And if you have lots of alpha tests (vegetation), those alias as well. So I don't think this type of AA would be very effective today in terms of image quality.

I seem to recall, though, that somewhere around 2007 the nVidia settings had an option for texture anti-aliasing, specifically for transparent textures.
 

jimhsu

Senior member
Mar 22, 2009
705
0
76
Also, deferred rendering techniques (which produce many of the "fancy modern effects" in today's games) are generally not compatible with multisampling techniques without "a lot of work" (work being relative). http://en.wikipedia.org/wiki/Deferred_shading

Still, things like ENB (developed by a "giant" development team of one person) do manage to implement forms of EdgeAA/TemporalAA on top of deferred-rendered engines. So again, "work" is relative. FSAA/MLAA techniques are just "so much easier" and also work on textures/alpha. They're cheaper performance-wise too, and with 50% of everything being a console port... well, the choice is obvious.
 
Last edited:

jimhsu

Senior member
Mar 22, 2009
705
0
76
I seem to recall though that somewhere around 2007 the nVidia settings had an option for texture anti-aliasing, specifically for transparent textures

Compare the performance of MSAA + transparency AA + temporal AA vs. an MLAA-style technique like FXAA. Then realize that most people still have relatively low-res monitors/TVs that hide the blurriness of FXAA and don't let the advantages of MSAA become apparent. As a developer, what would you spend your limited budget on?

As 4K makes its way through the marketplace, though, these discussions (along with the "aliasing" problem) will cease to be relevant. Well, maybe except for temporal AA.
 

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
Compare the performance of MSAA + transparency AA + temporal AA vs. an MLAA-style technique like FXAA. Then realize that most people still have relatively low-res monitors/TVs that hide the blurriness of FXAA and don't let the advantages of MSAA become apparent. As a developer, what would you spend your limited budget on?

As 4K makes its way through the marketplace, though, these discussions (along with the "aliasing" problem) will cease to be relevant. Well, maybe except for temporal AA.

Do developers have to do anything? Can't the video card driver handle it?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Edge anti-aliasing is still quite popular. That's what MSAA does: it fixes the aliasing around the edges of objects. But as mentioned, aliasing happens everywhere and has become much more apparent as graphics have advanced, so some have started using SMAA, FXAA, or, if you have the muscle, SSAA.
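The "shade once, sample coverage many times" idea behind MSAA can be caricatured in a few lines. Everything here is a toy: `inside` is a made-up half-plane standing in for a triangle edge, and the sub-sample offsets are illustrative, not any real hardware's sample pattern.

```python
def inside(x, y):
    """Toy coverage test: a half-plane 'polygon' whose edge is the line y = x."""
    return y < x

def msaa_pixel(px, py, color_in, color_bg, samples):
    """Shade once per pixel, but test coverage at several sub-sample
    positions; resolve by averaging the per-sample results."""
    covered = [inside(px + dx, py + dy) for dx, dy in samples]
    shaded = color_in                              # fragment shader runs once
    per_sample = [shaded if c else color_bg for c in covered]
    return sum(per_sample) / len(per_sample)

# Four illustrative sub-sample offsets within the pixel (not a real pattern)
offsets = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]

edge_px = msaa_pixel(2.0, 2.0, 1.0, 0.0, offsets)  # edge passes through: blended
full_px = msaa_pixel(5.0, 0.0, 1.0, 0.0, offsets)  # fully covered: solid colour
```

A pixel the edge crosses resolves to a fractional value (here 2 of 4 samples are covered, giving 0.5), while interior pixels come out solid. That's why MSAA smooths geometric edges but leaves shading inside polygons untouched.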
 

jimhsu

Senior member
Mar 22, 2009
705
0
76
Do developers have to do anything? Can't the video card driver handle it?

As I mentioned, implementing MSAA directly on deferred-rendered engines (the vast majority of newish games) does not work. You need access to the actual geometry, which means developers have to spend time modifying their engines (or you have someone like Boris, who does it for free in ENB). This is also engine-specific, so there are no general solutions. IANA graphics engineer, so I don't know what exactly goes into successfully implementing MSAA on these engines.

People way smarter than me talk about these issues here: https://www.opengl.org/discussion_boards/showthread.php/169925-Antialiasing-in-Deferred-shading
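One way to see the core of the problem: shading is nonlinear, so averaging geometry into a per-pixel G-buffer before lighting gives a different answer than lighting each sub-sample and averaging afterwards (which is what an MSAA resolve needs). A toy numpy illustration, with a made-up Phong-like specular term and invented normals for one edge pixel straddling two polygons:

```python
import numpy as np

def specular(normal, light, power=32):
    """Toy nonlinear (Phong-like) shading term."""
    return max(0.0, float(np.dot(normal, light))) ** power

light = np.array([0.0, 0.0, 1.0])
# Two sub-samples inside one edge pixel, from two differently-facing polygons.
n_a = np.array([0.0, 0.0, 1.0])                     # faces the light head-on
n_b = np.array([0.0, np.sqrt(0.5), np.sqrt(0.5)])   # tilted 45 degrees away

# What an MSAA-style resolve needs: shade each sub-sample, then average.
per_sample = (specular(n_a, light) + specular(n_b, light)) / 2

# Naive deferred: the G-buffer stores one normal per pixel (here, the average),
# so the pixel is shaded once from already-merged geometry.
n_avg = (n_a + n_b) / 2
n_avg /= np.linalg.norm(n_avg)
per_pixel = specular(n_avg, light)
```

Here `per_sample` comes out around 0.5 (one bright sample, one dark), while `per_pixel` comes out under 0.1, because the averaged normal misses the highlight entirely. To get the MSAA answer, the engine has to keep per-sample geometry in the G-buffer and shade the extra samples, which is the "a lot of work" part.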
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
I remember in the '90s, when 3D gaming was relatively new, we had full-scene (or full-screen) anti-aliasing, where the whole image was rendered at a higher resolution and then downscaled. But then a much more efficient method was invented: edge anti-aliasing, which only anti-aliased polygon edges and left polygon faces alone.
That wasn't edge AA, it was MSAA. It's still in use today.

What you're probably talking about is Matrox's FAA, which only existed on the Parhelia. And, to a lesser degree, CSAA, where coverage samples "switch off" in certain cases.
 

lamedude

Golden Member
Jan 14, 2011
1,206
10
81
I guess D3D killed it. It was popular on the N64 and on [url=http://www.vogons.org/viewtopic.php?t=35169#p300965]proprietary-API-for-everyone era 3D cards[/url].
 

Shamrock

Golden Member
Oct 11, 1999
1,439
560
136
I don't know if this is the same thing, but AMD's drivers have an anti-aliasing filter that you can switch from "standard" to "edge-detect".
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
That wasn't edge AA, it was MSAA. It's still in use today.

What you're probably talking about is Matrox's FAA, which only existed on the Parhelia. And, to a lesser degree, CSAA, where coverage samples "switch off" in certain cases.
There was also an edge AA method back in the good old days of the Verite and original 3dfx cards. It worked by rendering VU-lines on the edges of polygons.

It created perfect edge gradients, at the cost of slightly fattening the object. The problem was that the lines had to be rendered back to front for proper sorting. (The Z-buffer was used to clip them to the rendered geometry.)

As game engines became more complex, it just wasn't feasible anymore.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Since so much of the effect and colour of the pixels we see on screen comes from post-processed effects, it has become increasingly difficult to do anti-aliasing. Deferred rendering is the main reason, but using post-processed lighting rather than fixed-function hardware lighting is what makes a modern game look modern. I think with DX12 Microsoft mentioned they will be changing the API to allow hardware anti-aliasing to be applied later in the pipeline. That might help performance and AA quality a bit compared to the currently more popular post-processed AA.
 

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
That wasn't edge AA, it was MSAA. It's still in use today.

What you're probably talking about is Matrox's FAA, which only existed on the Parhelia. And, to a lesser degree, CSAA, where coverage samples "switch off" in certain cases.

What exactly is MSAA, and how does it differ from edge AA?

Edit: According to Wikipedia it's something else that I don't understand. http://en.wikipedia.org/wiki/Multisample_anti-aliasing
"The specification dictates that the renderer evaluate the fragment program once per pixel, and only "truly" supersample the depth and stencil values."
 
Last edited:

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
I don't know if this is the same thing, but AMD's drivers have an anti-aliasing filter that you can switch from "standard" to "edge-detect".

I noticed that too... It sounds like the fake anti-aliasing that gets applied to the final 2D image
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I flat-out refuse to run FXAA or other cheapo versions of AA unless it's either that or supersampling or something. MSAA for now, no AA at 4K later. :D
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
What exactly is MSAA, and how does it differ from edge AA?

Edit: According to Wikipedia it's something else that I don't understand. http://en.wikipedia.org/wiki/Multisample_anti-aliasing
"The specification dictates that the renderer evaluate the fragment program once per pixel, and only "truly" supersample the depth and stencil values."
MSAA is a slightly altered version of classic supersampling AA. In SSAA you render the image with a certain number of sub-samples per pixel (i.e., render the image at double the size and then scale it down to screen resolution).

MSAA shades only once per pixel and writes the same result to all sub-samples within the triangle being rendered. (Thus in the centre of a polygon you don't get any AA; on edges you automagically get results from multiple polygons and thus an AA effect.)

Edge AA methods either find edges in the image or re-render the significant polygon edges with a line-drawing method, which gives a nice AA gradient.
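The "find edges from the image" family can be caricatured in a few lines. Real MLAA/FXAA/SMAA classify edge shapes and compute coverage-based blend weights; this toy just detects strong horizontal luminance steps and blends 50/50 across them, which is enough to show the detect-then-blend skeleton:

```python
import numpy as np

def edge_blend(img, threshold=0.25):
    """Toy post-process edge AA: find strong luminance steps between
    horizontal neighbours and blend the two pixels across each edge."""
    out = img.astype(float).copy()
    diff = np.abs(np.diff(img, axis=1))          # neighbour differences
    ys, xs = np.nonzero(diff > threshold)        # detected vertical edges
    for y, x in zip(ys, xs):
        mixed = (img[y, x] + img[y, x + 1]) / 2  # 50/50 blend across the edge
        out[y, x] = out[y, x + 1] = mixed
    return out

# A two-row stairstep, the classic jaggy:
stairstep = np.array([[0.0, 0.0, 1.0, 1.0],
                      [0.0, 1.0, 1.0, 1.0]])
smoothed = edge_blend(stairstep)
```

Each detected step becomes a pair of 0.5 pixels, softening the jaggy. Note that, working purely from the final image, this has no idea whether a step is a polygon edge or intentional texture detail, which is exactly why these filters blur textures.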
 
Last edited: