I found the article to be UNDERwhelming... In most of the images I had trouble distinguishing the differences they pointed out so clearly.
To tell the truth, I don't see why I should run 12x AA, perhaps crippling performance in the process, if my card performs better at 8x. 24x, maybe, but I wouldn't raise AA levels unless I could double what I already have.
Not to mention MSAA vs CSAA and whatever other xxAA they can think of. Personally, I would be happy with a maximum of 8x AA and no added frills. I haven't really found myself stopping to look at the slightly jaggy edges on some overhead cables; I'm too busy playing to notice.
Don't get me wrong; these developments are as important as any other, but I really don't know why I should buy a graphics card for the sole reason that it handles AA better than the next one, or supports a higher level of AA.
Keep AA and AF for CGI purposes and I'll be glad to look at smooth pre-rendered scenes or great CGI movies, unlike the awful cutscenes some games have (coughs)... like Resident Evil 4. Other than that, I don't see myself using these insane levels of filtering any time soon. That's my personal preference, but to each their own. I just want to play the game!