And then there is SMAA.
http://mrhaandi.blogspot.nl/p/injectsmaa.html
There are several ways to look at it:
Picture quality in screenshots.
Picture quality with moving camera.
Performance.
Compatibility.
SSAA is the oldest form of AA. It gives nice screenshot-quality images. But when you play the game, you will see "pixel creep", also called "temporal aliasing": basically (white) flickering around the edges of objects as the camera moves. The biggest drawback, though, is the loss of framerate. SSAA renders 4x (or more) the number of pixels your screen has, then averages each group of 4 virtual pixels down to 1 pixel on your screen. That is roughly 4x the work, and thus a ~75% framerate loss.
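To make the "render 4x, average down" idea concrete, here is a minimal sketch (not any driver's actual implementation): a hi-res image at 2x width and 2x height gets box-averaged, 2x2 block by 2x2 block, into screen pixels. The function name and the brightness values are just illustrative.

```python
def downsample_4x(hires, width, height):
    """Average each 2x2 block of a hi-res image (2*width x 2*height,
    row-major list of brightness values) into one screen pixel."""
    screen = []
    for y in range(height):
        for x in range(width):
            block = [
                hires[(2 * y) * (2 * width) + (2 * x)],
                hires[(2 * y) * (2 * width) + (2 * x + 1)],
                hires[(2 * y + 1) * (2 * width) + (2 * x)],
                hires[(2 * y + 1) * (2 * width) + (2 * x + 1)],
            ]
            # 4 virtual pixels -> 1 screen pixel: 4x the rendering work.
            screen.append(sum(block) / 4)
    return screen

# A hard black/white edge in the 4x4 hi-res image becomes a gray
# in-between value on the 2x2 screen, which is what smooths the jaggy:
hires = [0, 0, 255, 255,
         0, 0, 255, 255,
         0, 255, 255, 255,
         0, 255, 255, 255]
print(downsample_4x(hires, 2, 2))  # -> [0.0, 255.0, 127.5, 255.0]
```

The mixed 2x2 block (two black, two white virtual pixels) lands at 127.5, the halfway gray that softens the staircase edge.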
MSAA is the 2nd oldest. It only does extra work on pixels close to the edges of objects on your screen. Again, nice picture quality in screenshots, but it suffers from temporal aliasing too. The performance hit is a lot smaller, because the game is rendered at native resolution and the extra work is done on only a subset of pixels.
SSAA and MSAA are done early, or in the middle, of the rendering pipeline. Nowadays a lot of engines do "deferred rendering", which means they do a lot of "post-processing": altering the picture after the real 3D rendering has been done. Things like HDR (lighting), motion blur, etc. There is a newer family of anti-aliasing that works as a post-processing effect: FXAA, MLAA, TXAA and SMAA. FXAA and TXAA are nVidia technologies, MLAA is AMD's, and SMAA is 3rd party and works on both cards. All 4 types are relatively light on processing power, and they are all much better at getting rid of temporal aliasing than MSAA and SSAA. But they will also make the screen (particularly the textures) more blurry.
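The post-process idea shared by this family can be sketched very roughly: scan the finished image for high-contrast neighbours and blend across them. This is a toy illustration, not real FXAA/MLAA/SMAA (those detect edge shapes and subpixel coverage much more cleverly); the threshold and blend weight are made-up values. It does show where the slight blur comes from.

```python
EDGE_THRESHOLD = 64   # contrast needed to count as a jaggy edge (assumption)
BLEND = 0.25          # how much of the neighbour to mix in (assumption)

def postprocess_aa(scanline):
    """Soften hard transitions in one row of brightness values,
    working purely on the finished image (no extra rendering)."""
    out = list(scanline)
    for x in range(1, len(scanline) - 1):
        left, here, right = scanline[x - 1], scanline[x], scanline[x + 1]
        # Pick the strongest-contrast neighbour and blend toward it
        # if the contrast exceeds the edge threshold.
        neighbour = left if abs(here - left) >= abs(here - right) else right
        if abs(here - neighbour) > EDGE_THRESHOLD:
            out[x] = (1 - BLEND) * here + BLEND * neighbour
    return out

row = [0, 0, 0, 255, 255, 255]
print(postprocess_aa(row))  # -> [0, 0, 63.75, 191.25, 255, 255]
```

Note the cost model: one cheap pass over pixels already rendered, instead of rendering extra samples. That is why these methods are so light compared to MSAA/SSAA, and the blending across edges is also exactly what softens texture detail.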
The differences:
FXAA and MLAA are the baseline deferred AA technologies. AFAIK they are essentially the same thing: FXAA for nVidia, MLAA for AMD.
TXAA focuses on 1) requiring even less processing power, 2) getting rid of jaggies even better, and 3) trying even harder to get rid of temporal aliasing. The big downsides: a) it gives the blurriest picture of all AA methods, and b) it needs native support in each game. Only a handful of games support TXAA.
SMAA was made by people other than nVidia or AMD. The injector works by dropping 4 files (including one dll) into the directory where your game's executable is. It requires very little processing power, and imho it is the least blurry of the 4 deferred AA methods. It also works pretty well at getting rid of temporal aliasing. On top of that, you can enable it together with MSAA (or SSAA).
My preferred way of doing AA in games is now:
1) Enable 4xMSAA. (Or even 8xMSAA if the game is not too demanding).
2) Enable Transparency MSAA.
3) Drop SMAA in the game folder (keeping settings at the default High).
I do this with nVidia Inspector. I also set the LOD bias to -1. And I prefer to enable SSAO (which has nothing to do with anti-aliasing).
This gives the best picture quality, for a reasonable performance price.
Note, I have a pretty decent system (GTX 680 + i5-3570K). But even on my old GTX 260 I would run 4xMSAA in most games. (I hadn't heard of SMAA back then.)
And then there is SGSSAA: Sparse Grid Super Sampling.
There used to be a bug in the nVidia drivers where, if you enabled Transparency SSAA, it would actually do some form of SSAA over the whole picture (not only the transparent parts). People liked this, so nVidia kept it in their drivers. If you enable SGSSAA you get something that looks like SSAA: a very nice picture, but very, very heavy on the framerate (>50% framerate loss). And I don't think it does anything against temporal aliasing.
There are all kinds of 3rd party tools to mess with settings.
ENB Series and SweetFX seem to be the most popular.
The problem with those 2 tools is that they change so much about the picture that it might take days to find the settings you like best.