
Microstuttering, will it bother me?

I understand your statement, but take into account: I find it likely he doesn't know what microstutter is at all, that he made his assertion from reading forums, and that he has no firsthand experience with the issue. Perhaps he has never even used a dual-card configuration in his life, yet found it prudent to post advice on the topic here - it spreads FUD.

Well, yes obviously. If he knew what microstutter is he wouldn't have thought a 120Hz monitor would help somehow. I was focusing more on the "don't post if you don't know" part. While we should correct people whenever they are wrong (and be corrected in turn when we are wrong), it's hard not to post misinformation if you are misinformed, because you believe yourself to be correct.

@Drivenbyvoltage, Grooveriding, & RavenSEAL: Triple buffering and Vsync help against tearing, not microstutter. 120Hz monitors (with a fast enough GPU) result in a smoother gaming experience (a 60Hz monitor can only display 60 FPS max, a 120Hz monitor can display up to 120 FPS).

Micro-stuttering is when rendering a single frame takes a lot longer than the previous/following frames.

FPS = 1000 / time to render (in ms)
60 FPS = ~16.67 ms per frame
FPS is normally measured by simply counting how many frames have been rendered in a given second (or by converting from an average of ms values). So if you see 60 FPS, that means the average frame took 16.67 ms to render, but individual frames could be a lot faster or a lot slower. If the time to render individual frames fluctuates a little, say between 14 and 19 ms, no big deal. But if it fluctuates between 10 ms and 50+ ms, you have a problem (specifically, a micro-stutter problem). You can have micro-stutter even though your average FPS is 60+.

10ms = 100FPS
20ms = 50FPS
30ms = 33FPS
40ms = 25FPS
50ms = 20FPS
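To make the arithmetic above concrete, here is a small Python sketch. The frame-time lists are made-up illustrations (not real captures), and the function names are mine; both lists sum to the same total, so an averaged FPS counter reports the same number for each even though one has severe spikes:

```python
# Hypothetical frame times in ms for roughly one second of gameplay.
# Both lists sum to 984 ms over 60 frames, so averaged FPS is identical.
smooth = [16.4] * 60
stutter = [50 if i % 15 == 0 else 14 for i in range(60)]  # four 50 ms spikes

def avg_fps(frame_times_ms):
    """Average FPS: frames rendered divided by total elapsed seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def worst_instant_fps(frame_times_ms):
    """FPS implied by the single slowest frame (1000 / max frame time)."""
    return 1000.0 / max(frame_times_ms)

print(round(avg_fps(smooth), 1), round(worst_instant_fps(smooth), 1))
print(round(avg_fps(stutter), 1), round(worst_instant_fps(stutter), 1))
```

Both captures average about 61 FPS, but the stuttery one bottoms out at an instantaneous 20 FPS on its 50 ms frames, which is exactly the kind of dip an averaged counter hides.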
 

So could one minimize microstuttering by going for IQ settings that keep frame rendering time lower than 16.67 ms? Obviously that sort of thing is difficult to measure, but I'm thinking if you find a balance of IQ level and desired framerate you could minimize microstuttering as well?
 

It really depends on the game in question and why it specifically has micro-stutter. If you are taking less than 16.67 ms to render every frame, then your min FPS is over 60 - that is amazing FPS. Microstutter is just your FPS dropping and recovering so quickly that, when averaged, the FPS counter does not report it.
But in theory, and in general, yes: lowering IQ does improve rendering times (thus improving individual frame render time, min FPS, average FPS, and max FPS) and could mitigate the problem.
But that isn't guaranteed. If multi-GPU issues, lack of CPU power, an HDD bottleneck, or some sloppy coding is the cause of the stuttering, then lowering IQ is unlikely to help. In such cases specific per-game solutions should be found, and sometimes there is no solution.
 
Well, I can say that I've never personally noticed it before, and I tend to be bothered by that kind of thing. I've seen stuff that I at first thought was micro-stuttering, but was able to recreate it using one card. It's unnoticeable enough that I couldn't even tell you what it looks like.
 

That is good - a lot of people are unable to detect it. Detecting micro-stutter is not a blessing but a curse: it has no benefit to you and makes your gaming less enjoyable.
 
It really depends on the game in question and why it specifically has micro-stutter. If you are taking less than 16.67 ms to render every frame, then your min FPS is over 60 - that is amazing FPS.

That is how I like to game. An FPS of 60 is very important to me. Some people insist on 4x or 8x AA, but I can live with 2x or none; I'd rather have 60 FPS and consistently smooth gameplay.

That being said I do like eye candy, but fps is my priority.
 
What's the excuse for the first Stalker then? The frame rate is averaging at 130-140 IIRC, if FRAPS isn't BSing me.

Not making excuses for anything. Honestly, if you are over 100 FPS you should not have microstutter, as the AFR latency should not be noticeably different - both cards are drawing over 60 FPS. Perhaps AFR doesn't work well with that game for some reason; have you checked whether there is a specific profile for it?
 
Is microstuttering limited to SLI/CF, and if so, why?
It's a problem with how they sync the cards. Unfortunately, there's just overhead in how the frames are stacked/rendered, and it leads to uneven frame times. So rather than having frames rendered like so (1 and 2 designate cards in a multi-GPU config followed by their associated rendering times):

(1-16ms)(2-16ms)(1-16ms)(2-16ms)

they end up being rendered not so evenly:

(1-22ms)(2-10ms)(1-22ms)(2-10ms)

In either case, the FPS is still the same, ~62, but the first case will feel MUCH smoother, despite both being over 60 FPS. Furthermore, this effect is greatly exaggerated when frame rates drop.

Unfortunately, this is just the nature of the beast until the technology improves. I've dabbled with multi-GPU configs since my HD 4870's and it's been the same since.
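The pacing arithmetic above can be sketched in a few lines of Python. The gap lists simply restate the hypothetical 16/16 vs 22/10 patterns from this post (not a real capture), and the function names are mine:

```python
# Frame-to-frame gaps in ms, restating the two hypothetical AFR patterns.
even_pacing   = [16, 16, 16, 16]  # (1-16ms)(2-16ms)(1-16ms)(2-16ms)
uneven_pacing = [22, 10, 22, 10]  # (1-22ms)(2-10ms)(1-22ms)(2-10ms)

def avg_fps(gaps_ms):
    """Average FPS over the window: frame count divided by elapsed seconds."""
    return len(gaps_ms) * 1000.0 / sum(gaps_ms)

def pacing_spread_ms(gaps_ms):
    """Longest gap minus shortest gap; 0 means perfectly even pacing."""
    return max(gaps_ms) - min(gaps_ms)

print(avg_fps(even_pacing), pacing_spread_ms(even_pacing))      # 62.5 and 0
print(avg_fps(uneven_pacing), pacing_spread_ms(uneven_pacing))  # 62.5 and 12
```

Both sequences come out to exactly 62.5 FPS, yet one delivers frames with a 12 ms swing between consecutive gaps - that swing, not the average, is what you perceive as microstutter.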
 
MrK6 explains the why of it. As for the first half of your question, it isn't limited to SLI/CF, but it is more common there.
True, I didn't address that 😀. There's technically microstuttering with single cards as well, but it's largely unnoticeable because the difference in frame times is so small. When you start trying to sync multiple GPUs, the overhead greatly increases the differences in frame times, making microstuttering much more noticeable.
 