*-*
The whole argument of "if you can't see it, then it's irrelevant" needs to be shot down and never brought up again. If you couldn't see the FPS being produced and were perfectly happy with 15 FPS, then there wouldn't be much purpose in reviewing graphics cards, other than to note that their boxes are pretty and the HSF is cool looking! The standards have been raised. If microstutter doesn't bother you, then you're not perceptive enough to see it, you're in denial, your eyes and brain aren't trained to recognize it, or you just don't care. Whatever the case, you're out of this discussion.
*-*
Wow, someone gets it. If we never looked at the FPS counter and the game played without any hitching, we would never know we were only getting 50 FPS or whatever the number was.
Remember the time when NO card was getting more than 25 FPS average on Ultra in Crysis? The game ran smoothly, but at 25 FPS. So what did I do? I ran it at 25 FPS, turned FRAPS off, and not once was my experience ruined.
I've gotten picky now that I have SLI, but only to a point. I want good framerates, but I don't sacrifice graphics quality settings to get them. So I turn off FPS counters, turn everything up to max, and start from there. If anything feels off, I start by reducing AA, and then check online for possible fixes if that doesn't smooth out the game for me.
More arrogant BS from someone who is telling everyone they are too stupid to notice they have stutter. Not one single person in this entire thread has stated they get 15 FPS and are happy with it. So feel free to create a strawman argument so you can knock it down with ease.
How can you assume that because I (or any other person) said we notice no stutter, we are simply failing to notice it? What if, shock horror, we are actually correct?
If someone tests two separate GPUs, a GTX 680 and an HD 7970, but notices no stutter with either, why is their opinion to be discounted? Why are they to be told they have no place in this discussion?
Have you tested both current-gen AMD and Nvidia GPUs in games? Do you notice any stutter? I care not one jot about the opinion of someone who has tested neither, but implies a chart has more truth to it than the honest opinion of someone who has physically tested the cards in question.
1) You reference Crysis back in 2007. That game never went above 30 FPS on Ultra, even with a $600 GPU at the time. Nobody cried about the FPS then, because the game was properly coded, so even that low a framerate produced smooth gameplay.
2) Nobody in this thread is telling you that single-GPU solutions stutter to the point that it's distracting. Sure, a few people might have mentioned it, but this is more of a problem with SLI vs X-Fire. I have followed every post, and more than once people have said that single-card solutions are about the same, with one post I remember just a page back mentioning single-card microstutter being seen.
3) It matters not what you see or don't see. You don't see the unemployment rate in front of your eyes, but with that fancy chart on the TV you can see the trend of the job market, right? It's the same thing: turn off the FPS counter and you don't see the FPS, but that chart in the benchmarks will still tell you how a particular GPU handles a specific game (see the sketch at the end of this post for what those charts boil down to).
When you say things like "I don't see it, so it doesn't exist," it's like saying that because you have never seen a snow leopard, snow leopards must be fake.
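For anyone curious what's actually behind those benchmark charts, here is a minimal sketch of how a frame-time log turns into the numbers reviewers plot. It assumes a FRAPS-style frametimes dump (one cumulative millisecond timestamp per frame); the file name and the 95th-percentile cutoff are just illustrative choices on my part, not anyone's official methodology.

```python
# Minimal sketch: turn a log of cumulative frame timestamps (ms) into
# frame-time stats. Assumes a FRAPS-style frametimes dump; the file
# name "frametimes.csv" and the 95th-percentile cutoff are placeholders.
import statistics

def load_timestamps(path):
    """Read cumulative millisecond timestamps, one frame per line.
    Tolerates an optional 'frame, time' CSV layout by keeping the
    last comma-separated field on each line."""
    stamps = []
    with open(path) as f:
        for line in f:
            field = line.strip().split(",")[-1]
            try:
                stamps.append(float(field))
            except ValueError:
                continue  # skip headers and blank lines
    return stamps

def frame_times(stamps):
    """Per-frame render times (ms): deltas between consecutive timestamps."""
    return [b - a for a, b in zip(stamps, stamps[1:])]

times = frame_times(load_timestamps("frametimes.csv"))  # hypothetical log
avg = statistics.mean(times)
p95 = sorted(times)[int(len(times) * 0.95)]

print(f"average frame time: {avg:.1f} ms (~{1000 / avg:.0f} FPS)")
print(f"95th percentile:    {p95:.1f} ms")
# A 95th percentile far above the average is exactly what the charts
# show: the *average* FPS looks fine, but individual frames arrive
# unevenly, and that unevenness is the microstutter being argued about.
```

The point of the charts is exactly that gap: two cards can post the same average FPS while one has a much uglier 95th-percentile line, whether or not a given pair of eyes picks it up.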