Thanks for responding, I'll definitely have to give your article a read.
I never said they were the same thing, but the effects of SSAA seem to override the effects of AF. In any game where I turn on SSAA, the textures get slightly blurry, which I assume is due to the downsampling. If this isn't the norm or we're talking about different things, I'd be interested to know.
This is from a lame MMO player perspective (EverQuest 2, yeah, so sue me, I'm lame). I have a GTX 470 in my main system and a 5770 in my second box, so this is from a DX9 point of view:
1. On the GTX 470: SSAA doesn't seem to override AF. However, it does make the lack of AF much less noticeable if you turn AF off. This is with SSAA forced on in nHancer at 2x or 3x, in DX9 games; I don't know about DX10/11.
2. The 5770 does get more blurry when you turn supersampling AA on. However, if you download a third-party tweak utility and set the LOD bias slightly negative, it will eliminate the blur, and it seems to stabilize the image slightly as well. It's a hassle to tweak the LOD, though.
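For anyone wondering how far negative to go: a common rule of thumb (not anything official from ATI or the tweak tools) is to offset the mipmap LOD by -0.5 * log2(N) for N-sample supersampling, since each screen pixel is being rendered from N subsamples. A quick sketch of that math (the function name is mine):

```python
import math

def ssaa_lod_bias(samples):
    """Rule-of-thumb negative LOD bias to counter SSAA texture blur.

    With N-sample supersampling, each axis is effectively rendered at
    sqrt(N) times the resolution, so shifting mipmap selection by
    -0.5 * log2(N) restores roughly the original texture sharpness.
    """
    return -0.5 * math.log2(samples)

print(ssaa_lod_bias(4))  # 4x (2x2) SSAA -> -1.0
print(ssaa_lod_bias(2))  # 2x SSAA -> -0.5
```

So for 2x SSAA you'd try around -0.5, and for 4x around -1.0, then nudge from there by eye; going too far negative brings back shimmering.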
They each have advantages. Obviously the GTX 470 is in another price bracket, but its 32x CSAA mode works great with very little performance hit, as does its transparency AA. However, the game I mainly play hardly uses shaders at all, and the 5770's edge-detect AA performs extremely well and looks *amazing*. Both drivers are solid.
I think the transparency supersampling mode on NVIDIA's card is less "glitchy" than ATI's. HOWEVER, ATI's multisampling transparency AA actually works; NVIDIA's seems to just do horrible, horrible things to many objects, like turning them black, lol.