I doubt AMD's blog is a good source of information for this; it is far too superficial and lacks in-depth detail. I don't have the proper background or information, but I would assume Crysis 3 uses DirectCompute as well as tessellation. To what extent I cannot say, and neither can anyone else outside the developers.
Recent AMD-promoted titles use OGSSAA, which is the most inefficient supersampling method I know of, probably because AMD can only bring its advantage in raw shading power to bear at higher resolutions. I prefer SGSSAA, which provides a far better image-quality/performance ratio than OGSSAA (downsampling is essentially the same thing). The latter should be a last resort only. I own Sleeping Dogs, and with SSAA set to high I still see pixel crawling while losing two-thirds of my performance versus no AA. SGSSAA usually gets rid of all jaggies and shimmering at a performance cost of about 45%.
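For anyone wondering why a sparse grid beats an ordered grid at the same sample count, here is a minimal sketch in C++ (my own illustration; the sample positions are textbook rotated-grid values, not vendor-exact). It counts the distinct horizontal and vertical offsets each 4x pattern provides, which determines how many coverage steps a near-horizontal or near-vertical edge gets:

#include <cstdio>
#include <set>

struct Sample { float x, y; };

int main() {
    // 4x OGSSAA: samples on a regular 2x2 grid inside the pixel.
    Sample og[4] = {{0.25f, 0.25f}, {0.75f, 0.25f},
                    {0.25f, 0.75f}, {0.75f, 0.75f}};
    // 4x SGSSAA: each sample on its own row and column (rotated grid).
    Sample sg[4] = {{0.375f, 0.125f}, {0.875f, 0.375f},
                    {0.125f, 0.625f}, {0.625f, 0.875f}};

    auto distinct = [](const Sample* s, const char* name) {
        std::set<float> xs, ys;
        for (int i = 0; i < 4; ++i) { xs.insert(s[i].x); ys.insert(s[i].y); }
        std::printf("%s: %zu distinct x offsets, %zu distinct y offsets\n",
                    name, xs.size(), ys.size());
    };
    distinct(og, "OGSSAA"); // 2 and 2: near-axis edges get only 2 coverage steps
    distinct(sg, "SGSSAA"); // 4 and 4: same cost, twice the edge gradation
}

Same four samples per pixel, but the sparse grid spends them where aliasing is worst, which is why its quality/performance ratio is better.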
Interestingly, AMD's GCN is unusually weak with SGSSAA: apparently it makes a difference whether you render many small pixels once (one pass at a higher resolution, as in OGSSAA, which suits GCN) or render normal-sized pixels repeatedly (several jittered passes at native resolution, as in SGSSAA, which does not), even though the total shading work is the same (see the sketch after the link). Almost every benchmark with SGSSAA that I know of shifts the picture in favor of Nvidia. Recently, AMD has promoted SGSSAA in their SDK, starting with slide 51:
http://www.pcgameshardware.de/AMD-Radeon-Hardware-255597/Downloads/AMD-Demo-DX11-Download-1045674/
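To make that cost comparison concrete, here is a minimal sketch (my own illustration; the 1920x1080 resolution is just an assumption) showing that 4x OGSSAA and 4x SGSSAA shade the same number of pixels per frame, so the performance gap between the two modes has to come from how the hardware handles one large pass versus four native-resolution passes:

#include <cstdio>

int main() {
    const long w = 1920, h = 1080; // native resolution (assumed)

    // 4x OGSSAA: one pass at doubled width and height, then a box downfilter.
    long og_pixels = (2 * w) * (2 * h);

    // 4x SGSSAA: four passes at native resolution, each with a different
    // sub-pixel jitter on the projection, then a weighted resolve.
    long sg_pixels = 4 * w * h;

    std::printf("OGSSAA shades %ld pixels, SGSSAA shades %ld pixels\n",
                og_pixels, sg_pixels);
    // Both print 8294400: the sample budget is identical; only the
    // access pattern differs.
}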
Hopefully this SDK sample leads to SGSSAA replacing the current SSAA implementations in games soon. That would be a really nice development.