It is a bit like buying shoes (or a similar product category) over the Internet. Yes, you can read about how great the materials are, what fancy features they have, and how someone else thinks they fit. But ultimately it is your own experience that matters, and without trying them you won't know. In my case I can measure microstutter, but I do not see it, so to me personally this topic is pretty useless; that does not mean it cannot be a real problem for someone else.

My realisation in all this is that all the reviews are near worthless.
No review in the world will be able to tell you whether you are going to find stuttering annoying or not. All the reviewer can do is say "Hey, these are the frame times, make of them what you want", while the manufacturer can solve it at the source if it becomes a large enough problem for a large enough group of customers.
Indeed. We are, in fact, spoiled. It used to be a matter of buying new hardware just so you could run the game at all, and buying again next year because both software and hardware were evolving so quickly. These days we demand everything turned up to max with no stuttering, at framerates we cannot possibly perceive as human beings.

Since then, game development has graphically stagnated, as has LCD tech (we're stuck at 1080p and 1600p), and therefore gaming cards can max today's games with room to spare.
There is one thing, though, that I see very little discussion about: there were claims that NV had larger input lag in SLI, which correlates with how they regularize their frame times. It would be nice to see that included in upcoming studies. This is another thing I cannot see, at least not on single-GPU cards, but my gaming performance improved in first-person shooters when moving from NV to AMD. It is anecdotal as well as subjective, I know, but I wonder if a small difference in input lag could account for it. And this I cannot measure as easily with Fraps, so I hope some review site finds a way to do it!
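For what it's worth, the frame-time side is easy to poke at yourself. Below is a minimal sketch of what I mean by "measuring" microstutter without seeing it: it turns a Fraps-style frametimes log into per-frame durations and a couple of rough variability numbers. The file layout (one cumulative timestamp in milliseconds per row, after a header) and the file name are assumptions; adjust the parsing to whatever your own dump looks like.

```python
import csv
import statistics

def frame_time_stats(path):
    # Assumes each data row ends with a cumulative timestamp in milliseconds
    # (Fraps-style frametimes log); adjust parsing to your actual file layout.
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        times = [float(row[-1]) for row in reader if row]

    # Per-frame durations derived from the cumulative timestamps.
    deltas = [b - a for a, b in zip(times, times[1:])]
    ordered = sorted(deltas)

    return {
        "avg_ms": statistics.mean(deltas),
        "99th_percentile_ms": ordered[int(0.99 * (len(ordered) - 1))],
        # Crude microstutter proxy: average jump between consecutive frame times.
        "frame_to_frame_variation_ms": statistics.mean(
            abs(b - a) for a, b in zip(deltas, deltas[1:])
        ),
    }

if __name__ == "__main__":
    print(frame_time_stats("frametimes.csv"))
```

A high frame-to-frame variation with a decent average is exactly the pattern people describe as microstutter, and it shows up in numbers like these even when, as in my case, you cannot actually see it. Input lag, unfortunately, never appears in such a log, which is why I hope the review sites pick it up.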

