I suspect it's game engines not handling rendering at extremely large screen resolutions. From a technical standpoint you'd assume SSAA would be the most compatible technique, since from the engine's perspective all it has to do is render out to a much bigger resolution (which is then scaled down).
But that's not really how modern SSAA works. Modern SSAA is an extension of MSAA, which is to say it's based on taking multiple samples of the scene per pixel, usually in a rotated or other non-orthogonal pattern. The effect of SSAA is that you render x times as many samples, equivalent to a higher resolution, but internally the renderer is still working with buffers at the lower display resolution.
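To make that distinction concrete, here's a minimal D3D11-flavored sketch (my own illustration, not from any particular engine, with a 4x factor assumed): the naive approach allocates a render target with four times the pixels, while the MSAA-style approach keeps the target at display resolution and packs four samples into each pixel.

```cpp
#include <d3d11.h>

// Naive "render big, then downscale" supersampling: the render target itself
// is 2x wider and 2x taller than the display.
D3D11_TEXTURE2D_DESC DescribeOrderedGridSSAA(UINT displayW, UINT displayH)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = displayW * 2;   // 4x the pixels overall
    desc.Height           = displayH * 2;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;              // one sample per (oversized) pixel
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    return desc;
}

// Sample-based SSAA in the MSAA mold: the render target stays at display
// resolution, but every pixel stores 4 samples laid out by the hardware
// (typically a rotated/jittered pattern rather than an ordered grid).
D3D11_TEXTURE2D_DESC DescribeSampleBasedSSAA(UINT displayW, UINT displayH)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = displayW;       // buffer dimensions unchanged
    desc.Height           = displayH;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 4;              // 4 samples per pixel instead
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET;
    return desc;
}
```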
What's happening in this case is that a buffer is not being supersampled correctly. When the samples are merged/reduced (the resolve step), it's presumably working from 1/4 as many samples as it expects, resulting in a buffer 1/4 the size. A game doesn't have to be capable of rendering at a high resolution for SSAA to work, and conversely just because it can render at a high resolution doesn't mean SSAA will work. Just take one look at BF3, for example.
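As a rough illustration of where that 1/4 figure comes from, here's a hypothetical CPU-side resolve (a plain box filter over greyscale values, my own sketch rather than anything a real engine ships): the output size is derived from the sample factor the resolve assumes, so a buffer that was never actually supersampled comes out at a quarter of the intended area.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical resolve for ordered-grid supersampling: averages each
// 'factor' x 'factor' block of source samples down to one output pixel.
// Dimensions are in samples; greyscale values keep the example short.
std::vector<uint8_t> ResolveSSAA(const std::vector<uint8_t>& src,
                                 int srcW, int srcH, int factor)
{
    const int dstW = srcW / factor;
    const int dstH = srcH / factor;
    std::vector<uint8_t> dst(static_cast<size_t>(dstW) * dstH);

    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x) {
            int sum = 0;
            for (int sy = 0; sy < factor; ++sy)
                for (int sx = 0; sx < factor; ++sx)
                    sum += src[(y * factor + sy) * srcW + (x * factor + sx)];
            dst[y * dstW + x] = static_cast<uint8_t>(sum / (factor * factor));
        }
    }
    return dst;
}

// If one buffer in the chain was never actually rendered with 2x the width
// and height (i.e. it holds 1/4 as many samples as the resolve expects),
// feeding it through the same resolve yields an image 1/4 the intended area,
// which matches the undersized-buffer symptom described above.
```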
Anyhow, to answer the question at hand: SSAA has so many compatibility issues because it's a complex process that very few developers take into account. Deferred rendering breaks traditional SSAA implementations just like it does MSAA, so right out of the gate it only works on forward renderers. Then you have to take into account how it interacts with the various shader stages, and at what point the additional samples are taken. This isn't so bad in DX9 games due to the relatively fixed nature of the pipeline (assuming a forward renderer), but DX10+ games can do basically whatever they want, which means MSAA and SSAA must be custom tailored to the game in question if the developer goes at all off the beaten path.
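For a sense of why the resolve point matters, here's a hedged D3D11 sketch (function names, format, and the 4x count are my own assumptions): in a forward renderer the multisampled color buffer can be resolved once after shading, but applying the same resolve to a deferred G-buffer averages normals and depth before the lighting pass, which is why deferred engines end up needing per-sample lighting or other custom handling.

```cpp
#include <d3d11.h>

// In a forward renderer the multisampled color buffer is simply resolved
// once shading is done: the hardware averages the 4 shaded samples per
// pixel into the final image.
void ResolveForwardColor(ID3D11DeviceContext* ctx,
                         ID3D11Texture2D* resolvedColor,   // SampleDesc.Count == 1
                         ID3D11Texture2D* msaaColor)       // SampleDesc.Count == 4
{
    ctx->ResolveSubresource(resolvedColor, 0, msaaColor, 0,
                            DXGI_FORMAT_R8G8B8A8_UNORM);
}

// In a deferred renderer the same call applied to the G-buffer is wrong:
// averaging normals/depth/material data before the lighting pass produces
// blended surface attributes that never existed in the scene, so the engine
// instead has to light each sample individually (or detect edges and shade
// per sample) -- exactly the kind of custom, per-game work described above.
```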