I'm very happy to see this being widely adopted. Frames alternating between 10ms and 30ms are going to "feel" something like a consistent 25ms, not like 20ms. It's just the way we're hardwired. Yet the former will display as 50fps while the latter displays as 40fps, so the former will be assumed to provide the better gaming experience...
This is why micro-stutter is so incredibly noticeable at sub-60fps. As for keeping your fps above 60 to avoid its problems: say the above frame times are halved. The comparison then becomes 100fps vs 80fps, and there you just happen to be in a range where both feel smooth, so the difference is much harder to tell.
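To put rough numbers on it (just my own quick sketch, not from any review tool; the 75/25 weighting toward the slow frame is an assumption chosen to mirror the "feels like ~25ms" figure above, not a measured perceptual model):

```python
def reported_fps(frame_times_ms):
    """FPS counters average the frame times, which hides the alternation."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    return 1000.0 / avg

def perceived_fps(frame_times_ms, slow_weight=0.75):
    """Rough guess: perceived smoothness is weighted toward the slower frames."""
    fast, slow = min(frame_times_ms), max(frame_times_ms)
    felt_ms = slow_weight * slow + (1 - slow_weight) * fast
    return 1000.0 / felt_ms

# AFR-style 10/30ms alternation, then the halved (5/15ms) case from above.
for pattern in ([10, 30], [5, 15]):
    print(pattern,
          f"reported ~{reported_fps(pattern):.0f} fps,",
          f"feels more like ~{perceived_fps(pattern):.0f} fps")
```

Running that gives 50 vs ~40 "felt" fps for the 10/30ms case and 100 vs ~80 for the halved case, which is the whole point: the counter reports the average, but your eyes track the slow frames.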
Glad to see AFR finally being exposed for what it is: a load of crap when it's advertised as near-perfect scaling, because even though that's true in one sense, it's not true in the sense that actually matters. This is why those of us who have witnessed this first-hand have a tough time deciding between something like a GTX 690 and a GTX Titan. We already sure as hell know that something like GTX 660 SLI is not going to provide a better gaming experience than a single GTX 680, despite putting up higher frame numbers.
And this is a very real issue for those who are still unaware of it (some of whom may even be running CF or SLI setups); otherwise we wouldn't be seeing any of this.
Edit: And the example I gave is actually a very poor one, because micro-stutter becomes much more erratic in real gaming scenarios.