Yeah still trying to wrap my head around this new method and what those numbers mean to ME.
I agree, it's confusing, but I think TechReport has done a lot to clear things up.
http://en.wikipedia.org/wiki/Micro_stuttering
Micro stuttering is inherent to multi-GPU configurations using alternate frame rendering (AFR), such as Nvidia SLI and AMD CrossFireX, but can also exist in certain cases in single-GPU systems.[1][2][3][4]
It's a term used in Nvidia's unofficial SLI guide, which has now been lost in the new forums, at least in its original form, and in many other articles over the last five years or so.
That Wikipedia article is technically correct, but a little out of date.
While microstutter could be described as the variation in frame times, variation by itself is not what a user experiences as stuttering. In other words, a deviation from the average frame time won't be perceived if it's too small for the eye to detect.
What I believe TechReport has documented as perceptible "latency" is a frame time significantly longer than the average, where the average itself is very low. Their threshold is 50 ms, and they have been able to match the spikes over 50 ms to perceptible pauses in the game's video output.
Now, it still helps to have a fast card or set of cards. A card where every frame time is over 50 ms could be very consistent and not suffer from microstutter or "latency", but it would simply be slow in terms of average frames per second.
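To make that concrete, here's a toy Python sketch. The two frame-time traces are invented for illustration (not real benchmark data); the 50 ms threshold is the one TechReport uses.

```python
# Toy frame-time traces, invented for illustration; the 50 ms
# threshold is TechReport's perceptibility cutoff.

THRESHOLD_MS = 50.0

def summarize(name, frame_times_ms):
    avg = sum(frame_times_ms) / len(frame_times_ms)
    spikes = [t for t in frame_times_ms if t > THRESHOLD_MS]
    # Total time the run spends past the threshold.
    time_beyond = sum(t - THRESHOLD_MS for t in spikes)
    print(f"{name}: avg {avg:.1f} ms/frame ({1000 / avg:.0f} fps), "
          f"{len(spikes)} frames over {THRESHOLD_MS:.0f} ms, "
          f"{time_beyond:.0f} ms spent beyond it")

# Fast on average but with occasional long frames: a healthy fps
# number, yet the spikes are long enough to read as pauses.
fast_but_spiky = [16.7] * 57 + [80.0] * 3

# Slower on every single frame, but perfectly consistent: no spikes,
# just a lower average frame rate.
slow_but_steady = [45.0] * 60

summarize("fast but spiky", fast_but_spiky)
summarize("slow but steady", slow_but_steady)
```

The spiky trace averages about 50 fps but still produces three visible hitches; the steady one never trips the threshold and just feels slower. And note the caveat from above: an absolute threshold only signals "stutter" when the average is low, since a card at a constant 60 ms/frame would exceed 50 ms on every frame yet wouldn't stutter at all.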
So think of it this way - video cards have three main performance parameters:
(1) average speed (most accurately reported as milliseconds/frame rather than frames/second)
(2) ability to maintain average speed, or essentially standard deviation of frame times
(3) longest frame time (analogous to what we've traditionally looked at as minimum fps, but at a much more granular level)
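If you want to pull those three numbers out of a log yourself, the arithmetic is trivial. Here's a rough Python sketch; the frame times are invented, and in practice you'd read them from whatever frametime logger you use (FRAPS, for instance, can record per-frame times to a file).

```python
import statistics

# Invented sample data; in practice frame_times_ms would be loaded
# from a per-frame log.
frame_times_ms = [16.7, 17.1, 16.5, 52.3, 16.9, 17.0, 16.6, 71.8, 16.8, 17.2]

# (1) Average speed: mean ms/frame; fps is just its reciprocal.
mean_ms = statistics.mean(frame_times_ms)

# (2) Ability to maintain that average: standard deviation of frame times.
stdev_ms = statistics.stdev(frame_times_ms)

# (3) Worst case: the longest single frame, the per-frame analogue
# of the traditional "minimum fps".
worst_ms = max(frame_times_ms)

print(f"(1) average: {mean_ms:.1f} ms/frame ({1000 / mean_ms:.0f} fps)")
print(f"(2) consistency: {stdev_ms:.1f} ms standard deviation")
print(f"(3) longest frame: {worst_ms:.1f} ms ({1000 / worst_ms:.0f} fps momentarily)")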
I posted about this 18 months ago, when I perceived stutter on my single HD 5850 in BF3 despite a frame rate that would typically have been considered smooth (45 fps average, 30 fps minimum): the image would simply appear to stop moving. It was then that I suggested that microstutter was not a phenomenon limited to dual-card setups.
That is why the Wikipedia article is out of date, and why the term microstutter itself can be misleading, given its association with dual-card setups. There is nothing unique about a dual-card setup that confines frame-time anomalies to such systems. Any card can suffer peak frame times significantly and perceptibly out of line with its average frame time.