I'm considering SLI - Does microstutter always occur?

mashumk

Member
May 19, 2012
40
0
0
SLI newbie here.

I'm seeing a modest increase in companies pumping out 1440p/1600p IPS monitors and foresee a modest price war in the coming months. That's what I've been waiting for.

I've never had more than 1 GPU. The only microstutter I've seen is in YouTube clips. I don't know how accurately they represent the problem, but I'm a person who would be bugged more and more over time by that tiny but always-present stutter.

So I'd like to upgrade to a single 1440 or 1600p IPS when the reliable brands start dropping prices and adding more choices. Not gonna gamble on the Korean monitor Ebay slot machine.

I'm sure my GTX 670 4GB would start to struggle at ultra settings on such a monitor. I'm guessing it's best to wait til the next gen and lower visual settings for games in the meantime.

Basically, is microstuttering still not wiped out? Is there significant improvement with this generation of cards?

Thank you.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Honestly, many years ago I ran 2x 7800GT in SLI, and a few years ago I ran GTX 260 SLI, and now I have GTX 670 SLI. I've never experienced any microstutter.

However, Sweclockers.com did a test on this, and it turned out Crossfire was far worse than SLI with regards to microstuttering.

They also found that midrange/lower-end cards in SLI/Crossfire had more microstuttering than high-end cards in SLI/Crossfire. But as said, Crossfire was far worse and was really noticeable for them.
 
Last edited:

Agent-A01

Member
Jul 6, 2012
105
0
71
I don't experience any microstutter with 580 SLI. Kepler is supposed to be better about that too.
 

borisvodofsky

Diamond Member
Feb 12, 2010
3,606
0
0
Micro-stutter is only really apparent when using Vsync. It's not like a frame dip, but a slight "glitched" frame feel.

When you're playing competitively with vsync disabled, microstutter can't be seen whatsoever, because, one, you're focused on the game, and two, the screen is tearing the shitznit already...

Micro-stutter is an issue, "if you're looking for it", but in terms of actual gaming experience, it's not that big a hindrance. :D

HOWEVER, to answer your question. YES, microstutter WILL ALWAYS BE THERE, in SLI or crossfire.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
I run GTX 470 SLI and I don't even know what microstutter is. I always hear people complain about it, yet I've never found out what it is or experienced it.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I've had a very good gaming experience from GTX 460 SLI as well. Even on older games that can run on 1 card with lower settings, I know they run faster and better with the 2nd card, which is a positive experience for the tech.
 

Annisman*

Golden Member
Aug 20, 2010
1,918
89
91
I have only ever had one definite case of microstutter, and I've run a whole lot of multi-GPU setups.

It was with my 4870X2; I forget which game it was. But microstutter seems to only have a chance of happening (or at least being visible) below 30-40 fps, and considering most people using SLI/CF have higher-end cards, fps doesn't usually dip that low anyway.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Depends on the game, the fps and your perception.

I often clearly notice microstutter with my 580 SLI below 60fps. VSync seems to help, but only if one stays above 60fps, of course. Some games are very smooth even at 30-40fps, others unplayable. Kepler is supposed to be a lot better if there is not much change in the scenery (forward movement in a shooter, for example), but if you strafe or turn, it will microstutter (almost) as badly as any other SLI setup.

It is definitely something to consider, but no amount of theory will prepare you for it, so you just have to try it out yourself and see how you feel about it.

Micro-stutter is only really apparent when using Vsync. It's not like a frame dip, but a slight "glitched" frame feel.

When you're playing competitively with vsync disabled, microstutter can't be seen whatsoever, because, one, you're focused on the game, and two, the screen is tearing the shitznit already...

Micro-stutter is an issue, "if you're looking for it", but in terms of actual gaming experience, it's not that big a hindrance. :D

HOWEVER, to answer your question. YES, microstutter WILL ALWAYS BE THERE, in SLI or crossfire.

It's not really about VSync; if anything the opposite is true (VSync helps when staying above 60fps). Below that, it doesn't make any difference.
 
Last edited:

lopri

Elite Member
Jul 27, 2002
13,209
594
126
I notice micro-stutters ALL THE TIME with a single GPU. I have no idea how you can possibly NOT notice them in SLI/CF.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Micro-stutter is only really apparent when using Vsync. It's not like a frame dip, but a slight "glitched" frame feel.

When you're playing competitively with vsync disabled, microstutter can't be seen whatsoever, because, one, you're focused on the game, and two, the screen is tearing the shitznit already...

Micro-stutter is an issue, "if you're looking for it", but in terms of actual gaming experience, it's not that big a hindrance. :D

HOWEVER, to answer your question. YES, microstutter WILL ALWAYS BE THERE, in SLI or crossfire.

It is not true that it is only seen when Vsync is on.

It is true that microstutter will be less likely to be noticed the higher your frame rate is, because in general a higher rate allows a better choice of frame to be sent to the display. As a side effect of this, sometimes Vsync without triple buffering or a modern technique could exacerbate microstutter - but it is false to say that turning it off will eliminate it.

Take, for instance, the possibility that you are getting less than 60 FPS in a game - quite possible in a few games if you have the settings turned up, depending on your resolution. Turning Vsync off in this case may actually increase microstutter.

It should also be noted, as some users have stated, that this is not an issue only for SLI and Crossfire setups; it can happen on a single card, although it is amplified by multiple cards. Microstutter is when the time between rendered frames varies greatly for some reason.
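If you want to put a number on that variation, here is a minimal sketch (not tied to any particular tool; the function name and the made-up frame timestamps are mine, purely for illustration). Given a list of per-frame timestamps in milliseconds, it compares the average frame rate a counter would report with how uneven the gaps between frames actually are:

def stutter_stats(timestamps_ms):
    # Per-frame times: gap between consecutive timestamps.
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_ms = sum(frame_times) / len(frame_times)
    # Average jump between neighbouring frame times; near zero means evenly
    # paced delivery, large means the frame-to-frame spacing varies a lot.
    diffs = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]
    pacing_error = sum(diffs) / len(diffs)
    return 1000.0 / avg_ms, pacing_error

# Evenly paced ~30 fps: reports ~30 fps and a pacing error of ~0 ms.
print(stutter_stats([i * 33.3 for i in range(91)]))
# Frames arriving in pairs (3 ms apart, then a ~64 ms wait): still ~30 fps on
# the counter, but a pacing error of roughly 60 ms - that is microstutter.
print(stutter_stats([x for i in range(45) for x in (i * 66.6, i * 66.6 + 3.0)]))

Same average fps in both cases; only the second one has the "time between frames varies greatly" problem described above.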

I have only really seen it in one game, which was Stalker: Clear Sky. It was very bad, unplayable. I have run 285 SLI, 570 SLI, and now 670 SLI (although very few games with the latter, obviously). However, with 570 SLI I often ran games in surround view, which may give a different experience than single-monitor gaming.

My advice would be to see if you can find someone with an SLI setup that you can see running. Some people are far more sensitive to it than others.

Another user has stated that Kepler is supposed to have improved this; however, I have not read that anywhere. Can you give a source, boxleitnerb?

http://techreport.com/articles.x/22735/2

Here is an article about adaptive vsync, but the author states that it is NOT advertised as helping with microstutter issues (indeed, it is meant to produce a balance between tearing and smoothness in games where the frame rate deviates above and below the monitor's refresh rate, and where standard vsync would therefore lock the FPS to an even divisor of 60). The only statement I can find is in reference to the 690, which has a better bridge chip to help alleviate this problem - but that has no effect on SLI 680s.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I have been using dual cards for many years, AMD and now NVidia. I am also quite sensitive to microstutter and see it easily. To me it's about whether a game feels smooth and the motion tricks my eyes. If there is microstutter, I see jitters. It also shows up in the Fraps frame time captures as uneven times to render alternate frames.

Comparing 2x 7970 vs 2x 680 there is absolutely no contest, NVidia wins mightily. The 7970s microstutter badly in almost everything. Anything above or below 60 fps was worse than a game maintaining precise vsync, whether vsync is on or off. The 680s, on the other hand, only stutter a bit, but it's present in almost all games. I would say that NVidia fools me into seeing motion in SLI when the fps is >45. A single 680, however, fools me into seeing motion down to 30 fps, with the gap being the impact of microstutter. So in my mind, if a game scales perfectly (100%) then you get about 50% extra performance to play with, because you only get to use half of the power given or you'll start to see stutter.

Bear in mind even at >45 I occasionally see hitches and such, but in general SLI has been massively better than Crossfire. I am not entirely convinced it's worthwhile over a single card if you can see MS.
 
Feb 19, 2009
10,457
10
76
Vsync + triple buffering = I don't notice microstutter.

OP, if you want to know for sure, try to sit in front of someone's PC with SLI/CF and play games. That's the only way to know for sure, because MS is very subjective.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Vsync + triple buffering = I don't notice microstutter.

OP, if you want to know for sure, try to sit in front of someone's PC with SLI/CF and play games. That's the only way to know for sure, because MS is very subjective.

You certainly can't run triple buffering with AMD's Crossfire on 7970s; I know because I tried it. What happens is every game crashes. It takes a while, minutes to even hours, but it will blue screen in the end.

NVidia on the other hand provides the option in their drivers and it seems to work just fine.

Just another clear win for NVidia unfortunately.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
You certainly can't run triple buffering with AMD's Crossfire on 7970s; I know because I tried it. What happens is every game crashes. It takes a while, minutes to even hours, but it will blue screen in the end.

NVidia on the other hand provides the option in their drivers and it seems to work just fine.

Just another clear win for NVidia unfortunately.

The driver option only affects OpenGL titles. Microsoft does not allow triple buffering at the driver level with DX, so you need to force it via something like Direct3D Overrider.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
OP, if you want to know for sure, try to sit in front of someone's PC with SLI/CF and play games. That's the only way to know for sure, because MS is very subjective.
It is subjective per individual's sensitivity, but at the same time it is not an illusion, either. It is objectively measurable.

If one doesn't perceive micro-stutters using SLI/CF, then there is no reason to look for them. Ignorance is truly a blessing in that case, and I say that without any condescension. As an analogy, there are folks who go hell-bent over a couple of dropped frames in a 5-minute movie trailer if you go to where the HTPC maniacs reside. I am glad not to be bothered by such things, thanks to my ignorance.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
You certainly can't run triple buffering with AMD's Crossfire on 7970s; I know because I tried it. What happens is every game crashes. It takes a while, minutes to even hours, but it will blue screen in the end.

NVidia on the other hand provides the option in their drivers and it seems to work just fine.

Just another clear win for NVidia unfortunately.

Nvidia's option works only in OpenGL. You have to use 3rd-party tools for TB in DirectX.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I don't think some of you really know what microstutter looks like. You're only going to really notice it when you have vsync on and are barely around or below 60 fps. All it does is make the frame rate feel lower than what is being reported. With SLI/Crossfire, anything below 40 fps is generally going to feel slower than a single GPU would at 30 fps.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0


I don't notice the microstutter in that video when it's playing at "normal" speed.

When they slow it down by a factor of 300, you can see the image updates come at a more "fixed", evenly spaced amount of time between frames with a single GPU versus a multi-GPU setup.

However, since I, like most people, play games at "normal" speed and don't record videos in 300x slow motion to look for stutter, it probably isn't an issue.

Maybe some people actually see it, maybe they "think" they do, but either way it's probably not nearly as big an issue as people make it out to be.
 

mashumk

Member
May 19, 2012
40
0
0
Thanks for all the replies. Still sorting through all the info. If the youtube videos claiming stutter are legit, yeah, it will definitely bug me. Just like a single dead pixel would bug me on any monitor. Otherwise I'd be all over those Ebay Catleaps.

A new question has come to mind after reading these posts.

For 1440/1600p single-monitor gaming, is VRAM less of a priority than the actual power of the GPU? If so, does it ever reach a point where VRAM can become a limiting factor on a single hi-res monitor?

Let's say any given game is set to as high a graphical setting as the card allows while maintaining a 45-60fps minimum.

For example, would two 4GB 680s in SLI have the same graphics ceiling as two 2GB 680s in SLI on a single 1600p monitor?

I'm assuming it only matters in special situations like a heavily modded Skyrim?
 

serpretetsky

Senior member
Jan 7, 2012
642
26
101
I notice micro-stutters ALL THE TIME with a single GPU. I have no idea how you can possibly NOT notice them in SLI/CF.
I'm not really sure how you notice micro stutter on a single card. Micro-stutter has been defined by the community as a problem that only exists with multiple card setups. Single cards can't have micro stuttering because there is nothing to desynchronize or synchronize; there's only one card.

I'm also not sure why vsync and triple buffering got mixed into this. These technologies have nothing to do with micro-stuttering. They may make microstuttering more visible simply because they may bring the fps to an even lower number. Microstuttering is more visible when your fps gets low.

Microstuttering is NOT some sort of loading stutter.

Here's a quick rundown:
http://hardforum.com/showthread.php?t=1317582

Basically, how microstuttering happens is easily described by thinking through this simple thought experiment.

When you have two cards running in alternate frame rendering mode (most SLI and Crossfire setups default to this), each card renders an entire frame.

Ideally each card will render a frame out of sync with the other card:
----card1-----FRAME-----FRAME-----FRAME-----FRAME-----FRAME
----card2----------FRAME-----FRAME-----FRAME-----FRAME-----FRAME
visible frames:---*----*-----*----*-----*----*----*----*-----*----*

But here's the worst case scenario:
----card1-----FRAME-----FRAME-----FRAME-----FRAME-----FRAME
----card2-----FRAME-----FRAME-----FRAME-----FRAME-----FRAME
visible frames:---**--------**---------**---------**---------**

Notice how in the worst-case scenario both cards render their frames at almost the same time; this leaves a giant gap of time between the double frames that is far worse than the ideal setup. While the delay between frames is increased, the cards are still using the same amount of power, doing the same amount of work, and any FPS counter (like Fraps) will count the same frames per second for both the ideal and worst-case setups (if you count the frames you will find they output the same amount!).

So both of those setups could be showing 30fps (by the fps counter), but the worst-case scenario will play as though it's at 15fps, while the ideal one will be true to its word and play at 30fps.

Edit: once you consider those things, the reason why microstutter is most noticeable at low fps becomes obvious: it's easier to tell the difference between 30fps and 60fps than between 60fps and 120fps.
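To put rough numbers on the two diagrams, here is a toy Python sketch (my own illustration, assuming two cards that each take 66.6 ms per frame, i.e. ~15 fps each and ~30 fps combined; nothing here is measured from real hardware):

def fps_and_worst_gap(timestamps_ms):
    # Frames per second an ordinary counter would report, plus the single
    # longest wait between two visible frames.
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    seconds = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return len(gaps) / seconds, max(gaps)

# Ideal AFR: card 2 starts half a frame after card 1, so output is evenly spaced.
ideal = sorted([i * 66.6 for i in range(30)] + [33.3 + i * 66.6 for i in range(30)])
# Worst case: both cards finish almost together, then there is a long wait.
worst = sorted([i * 66.6 for i in range(30)] + [2.0 + i * 66.6 for i in range(30)])

print(fps_and_worst_gap(ideal))  # ~30 fps, longest gap ~33 ms
print(fps_and_worst_gap(worst))  # ~30 fps, longest gap ~65 ms

Both lists contain the same number of frames over roughly the same stretch of time, so an fps counter reads about 30 for each; only the spacing differs, which is exactly the difference between the two diagrams.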
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
I've never noticed it. I've had 8800 GTS SLI, GTX 260 SLI, GTX 260 triple SLI, and my current GTX 480 SLI, and have never noticed this under any circumstances, but clearly some people do and it's a subjective issue.

Best advice is to test it out on someone else's setup if at all possible.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
serpretetsky has the best explanation, and his link explains it well too. Mind you, those threads are very old, going back to July 2008 (exactly 4 years ago)! Maybe in 2012, with newer cards, it's not as big a problem? Is there any way to better ensure a multi-card setup follows the first scenario, wherein frames are rendered in between each other? Or is it all driver related? Also, does the CPU have any effect on microstutter, or is it totally independent of that?

Thanks for all the replies. Still sorting through all the info. If the youtube videos claiming stutter are legit, yeah, it will definitely bug me. Just like a single dead pixel would bug me on any monitor. Otherwise I'd be all over those Ebay Catleaps.

A new question has come to mind after reading these posts.

For 1440/1600p single-monitor gaming, is VRAM less of a priority than the actual power of the GPU? If so, does it ever reach a point where VRAM can become a limiting factor on a single hi-res monitor?

Let's say any given game is set to as high a graphical setting as the card allows while maintaining a 45-60fps minimum.

For example, would two 4GB 680s in SLI have the same graphics ceiling as two 2GB 680s in SLI on a single 1600p monitor?

I'm assuming it only matters in special situations like a heavily modded Skyrim?

Yes, that's the most important question right there. I don't understand people that get these high-end cards with just 2GB of RAM each when they're going to be gaming @ high resolutions. BF3 can chew up 1.5-1.8GB of VRAM, and Shogun 2 has been shown to gobble up to 2.5GB with huge battles (although very rare in game) @ 1200 resolution. So if these 2011 games can almost push 2GB of VRAM @ a moderate resolution, how does that bode for this year and next year, especially @ higher resolutions? This doesn't even consider the ultra-quality texture mods that are so popular.

Basically, 4GB per card is a must if going SLI, as you'll hit the VRAM limit long before the GPU limit! (Indeed, I imagine GTX 670s in SLI on the GPU end will be able to play games @ 60fps+ for at least 2-3 years to come, but 2GB of RAM will become a problem as early as next year.) 3GB would be ideal, as 4GB is a bit of overkill, but only the Radeons have 3GB.
 
Last edited: