I read the article, and two things stood out:
"1) For graphical applications like games that involve interaction, I don't think we'd want frame times to go much higher than that. I'm mostly just winging it here, but my sense is that a frame time over 50 ms is probably worthy of note as a mark against a gaming system's performance. Stay above that for long, and your frame rate will drop to 20 FPS or lower—and most folks will probably start questioning whether they need to upgrade their systems."
Just like Kyle, he chose an arbitrary "smoothness level", in this case a 20 FPS minimum. Say you have a card that drops to 20 FPS minimums (>50 ms frame times) five times, but 99% of the time maintains minimums of 35 FPS and an average of 60. In the other case, you have a card that never drops to 20 FPS but spends most of its time at a far lower average of 50 FPS with minimums of 30 FPS. Which card would I take? I would take the first one, yet by the criterion he set it would be the worse performer.
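To make the comparison concrete, here's a quick sketch (the frame-time numbers are hypothetical, just matching the scenario above) of how a hard "minimum FPS" cutoff ranks the two cards:

```python
def fps(frame_time_ms):
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

def summarize(frames):
    """Return (average FPS, minimum FPS) for a list of frame times in ms."""
    avg_fps = fps(sum(frames) / len(frames))
    min_fps = fps(max(frames))  # the longest frame sets the minimum FPS
    return avg_fps, min_fps

# Card A: 99 frames at ~16.7 ms (60 FPS), one 50 ms spike (a 20 FPS dip).
card_a = [1000 / 60] * 99 + [50.0]

# Card B: steadily slower, ~50 FPS most of the time, 30 FPS worst case.
card_b = [1000 / 50] * 99 + [1000 / 30]

a_avg, a_min = summarize(card_a)  # ~58.8 FPS average, 20 FPS minimum
b_avg, b_min = summarize(card_b)  # ~49.7 FPS average, 30 FPS minimum
```

Under a strict "never below X FPS" rule, Card A fails and Card B passes, even though Card A is faster 99% of the time.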
"2) Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly six and twenty milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem."
So he can measure the micro-stutter/jittering, but in the real world he can't perceive the difference, later citing his IPS monitor as perhaps "too slow" to reproduce this latency issue. That adds another component to the mix: micro-stuttering may differ across TN and IPS panels.
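A rough sketch of the jitter patterns he describes (illustrative numbers only, taken straight from the quote) shows why the size of the swing matters as much as its presence, and why a 60 Hz refresh (one frame every ~16.7 ms) could plausibly hide the mildest pattern:

```python
# Three alternating frame-time patterns (ms) from the quoted article.
patterns = {
    "5/15 ms": [5, 15] * 5,
    "15/45 ms": [15, 45] * 5,
    "6/20 ms": [6, 20] * 5,  # the worst case actually measured with Fraps
}

REFRESH_MS = 1000.0 / 60  # ~16.7 ms per refresh on a 60 Hz display

for name, frames in patterns.items():
    swing = max(frames) - min(frames)            # jitter amplitude in ms
    avg_fps = 1000.0 * len(frames) / sum(frames)  # effective frame rate
    worst_fps = 1000.0 / max(frames)              # rate implied by slow frames
    # Frames faster than one refresh interval can't be shown any sooner,
    # which may mask small swings entirely on a 60 Hz panel.
    masked = min(frames) < REFRESH_MS
    print(f"{name}: swing {swing} ms, avg {avg_fps:.0f} FPS, "
          f"worst {worst_fps:.0f} FPS, fast frames within one refresh: {masked}")
```

In the 5/15 and 6/20 patterns the fast frames land inside a single 60 Hz refresh window, so the display itself smooths part of the swing; the 15/45 pattern swings well past a whole refresh, which fits the intuition that it would be far more annoying.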
Basically, short of a person actually trying CF/SLI for themselves, it seems impossible to write off CF/SLI as a micro-stuttering mess, because:
1) Micro-stutter may exist, but depending on the speed of the monitor it may be a non-issue altogether
2) Micro-stutter may exist, but the user may or may not notice it
3) Micro-stutter differs across AMD/NV brands and even across games (i.e., game engines and GPU load matter too), so suddenly there are way too many factors
4) As Tom's Hardware pointed out, micro-stuttering is alleviated somewhat if you run faster GPUs and/or add a 3rd GPU into the mix, creating yet another variable.
So we are back to square one: it makes sense why some people report issues with micro-stutter and others don't really have them.