[Tom's] Interesting article on microstuttering

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Link: http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995-3.html

I've linked to the page where the most interesting discussion takes place, but read the whole article to see what it's all about.

This is of course most relevant to the question of whether CrossFire/SLI have drawbacks that can't be seen in either average or minimum frames-per-second benchmarks, but I'd wager it's also relevant to all GPU analyses. We depend heavily on fps to tell us what we need to know, yet so much of the user experience comes down to the time between frames.
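The point about fps hiding the time between frames can be sketched with a toy calculation (the two frame-time traces below are invented for illustration, not measured):

```python
# Two hypothetical frame-time traces (milliseconds) with the same total:
# a smooth single-GPU trace and an alternating multi-GPU (AFR) trace.
smooth = [33.3] * 10
alternating = [13.3, 53.3] * 5  # same mean interval, very uneven pacing

def avg_fps(frame_times_ms):
    """Average FPS as normally reported: frames / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(round(avg_fps(smooth)))       # 30
print(round(avg_fps(alternating)))  # 30 -- identical average, yet one stutters
```

Both traces report ~30 FPS, so an average (or even minimum) fps figure cannot distinguish them; only the per-frame intervals can.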
 

arredondo

Senior member
Sep 17, 2004
841
37
91
Fantastic write-up. I was just looking for info on this topic last night. A lot of my questions (and more) were answered.

I imagine the next-gen cards will come with marketing language that addresses micro-stuttering, to get people more interested in dual-card setups (if they do indeed address it).
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
The big question is: Is the micro-stuttering perceptible?

The article talks about micro-stutter a lot, and the author tells us when it is observed subjectively and when it improves, but he doesn't seem to measure it beyond the one exploded pie chart, which lacks actual numbers. The chart does suggest an alternation of perhaps 20 ms between frames at 30 FPS; that's roughly half the time it takes to blink.

Is that significant? (Honest question; I've never observed micro-stutter, personally.)
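For rough context on those numbers (assuming "an alternation of perhaps 20 ms" means successive frame intervals differ by about 20 ms around the 33.3 ms mean at 30 FPS, which is one reading of the chart):

```python
# Rough numbers behind the question above. Assumed interpretation:
# successive frame intervals differ by ~20 ms around a 30 FPS average.
mean_interval = 1000.0 / 30                            # 33.3 ms at 30 FPS
short, long_ = mean_interval - 10, mean_interval + 10  # ~23.3 ms / ~43.3 ms

# If perception tracks the longer gap, the "felt" rate is lower:
felt_fps = 1000.0 / long_
print(round(felt_fps, 1))  # ~23.1 FPS -- in line with the article's claim
                           # that 30 FPS can feel like 20-25 FPS
```

Under that assumption the longer gaps alone imply a "felt" rate in the low 20s, which would make the alternation significant even though the average still reads 30 FPS.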
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
The big question is: Is the micro-stuttering perceptible?

The article talks about micro-stutter a lot, and the author tells us when it is observed subjectively and when it improves, but he doesn't seem to measure it beyond the one exploded pie chart, which lacks actual numbers. The chart does suggest an alternation of perhaps 20 ms between frames at 30 FPS; that's roughly half the time it takes to blink.

Is that significant? (Honest question; I've never observed micro-stutter, personally.)

They answered that:

You end up with what feels like a stuttering engine. Yes, it's going 30 MPH just like a smooth inline-six. But this one hits the same speed and feels like it has one cylinder out of whack.


When Can Micro-Stuttering Be Seen?


In a nutshell, all of the time. The lower the average frame rate, the more the frame rate is perceived as being lower than the actual average frame rate. Thus, as bad luck would have it, a frame rate of 30 FPS may be perceived as merely 20 to 25 FPS. The human eye does, however, still notice differences in when frames show up on-screen beyond 60 FPS.


This is one of the reasons why we prefer testing with higher frame rates in the GPU scaling tests on the following pages. It continues to amaze us how, even beyond the generally-accepted target of 40 FPS, you can still see the impact of micro-stuttering once rendering becomes imbalanced.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
That info sounds correct to me, but honestly I could have written that page just based on stuff I've picked up over the years here (even though I've never used SLI/CrossFire). Instead of giving us a "scenario," why not just measure the actual time to render each frame?

edit: oops, read the rest of the article, next page has some pretty charts...:oops:

From going through the article, I'm curious what monitor Tom's used to see the "noticeable" stuttering. I count 8 frames out of 49 in which CrossFire 6870 drops below 120 fps, and all but 3 of those dips end up very close to 120.

After reading the entire article, I'd like to see all of those systems at 2560x1600 instead of 1680x1050 and 1920x1080. Most of the cards tested, and especially the 2+ card setups, are designed for higher resolutions. I'm not sure why they didn't just run the tests at the higher resolutions instead of trying to extrapolate from frame rates that are too high for the monitor to display anyway.

I'd be interested to see what BFG10k, apoppin, grooveriding, keysplayr, plus any others who have owned sli/xfire over the years have to say about this.
 
Last edited:

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
[Image: Crossfire/SLI stuttering chart]


I wonder what kind of monitor they were using. If it was a 60Hz monitor, I don't think the stuttering would be perceptible.

Yeah, I have had my share of experience with MS, most notably in F1 2010, but driver releases after the game came out mitigated that issue and made it playable.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
The reason I asked for comments from experienced SLI/CF users is that I wonder about the way that article was written. BFG10k is one of the few people I've ever seen claim he can personally detect MS at very high frame rates (not trying to put words into his mouth, but I seem to remember him saying that even at 70-80 fps he can still detect a difference). When all but 3 of those frames are over 100 fps, and the 3 dips are widely spaced, it seems unlikely that the author was actually able to detect the MS at all.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I gave up SLI GTX 470s for a single GTX 580 because of microstutter and input lag.

monitors used: Dell U2711 (2560x1440, 60Hz) and BenQ XL2410T (1920x1080, 120Hz)

Vsync is one solution for microstutter, but it means even more input lag.

IMO, CF/SLI is only good for redonkulous levels of eyecandy at tolerable frame rates, not blistering speed
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The reason that I mentioned having experienced sli/cf users comment is that I wonder about the way that article was written.
It is very subjective. i can see micro stutter also. But then i also notice single-GPU jitters which is sometimes confused with multi GPU micro-stutter.

Overall it does not bother me. All gaming is compromise. It involves suspension of disbelief. i believe that CF should be used primarily to allow for higher settings; not make a nearly unplayable experience on a single GPU faster with two of them.

i was planning to explore microstutter shortly with GTX 580 SLI and GTX 590 as well as using HD 6990 and HD 6970 CF and CF-X3. But my HD 6990 is artifacting even on the desktop on my Intel system and i have asked AMD for advice.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The big question is: Is the micro-stuttering perceptible?

The article talks about micro-stutter a lot, and the author tells us when it is observed subjectively and when it improves, but he doesn't seem to measure it beyond the one exploded pie chart, which lacks actual numbers. The chart does suggest an alternation of perhaps 20 ms between frames at 30 FPS; that's roughly half the time it takes to blink.

Is that significant? (Honest question; I've never observed micro-stutter, personally.)

While they give a lot of evidence to support their position, their conclusions are all subjective and prejudiced by prior knowledge of the measurements and of what equipment is running. It's still Tom's, after all, where they write the evidence to fit their conclusions. Unfortunately, this article proves nothing.

Testing would have to be done double-blind (neither the person administering the test nor the subject can know what equipment is running at the time), and frame rates would need to be synced so that frame rate itself isn't what they're noticing. Then they would need multiple subjects to do the comparison and tell with near-100% accuracy when 1, 2, 3, or 4 GPUs were running to prove anything. That's just off the top of my head where proper scientific method hasn't been followed.
 
Dec 30, 2004
12,553
2
76
I went from a 4890 to 2x 5770, and it was very noticeable even at a solid 60 fps: the distance objects moved between frames was visibly uneven. Needless to say, I'm back on a single 4890 and not looking back :) :) :)
 
Feb 19, 2009
10,457
10
76
Exactly.

If you go into these tests actively paying attention and looking for MS, you will find it. If you pay attention to the actual gaming instead, it becomes a non-factor. Input lag is the lamest excuse ever, as if we are pro gamers who even notice or care about an extra 2ms of lag.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
It is very subjective. i can see micro stutter also. But then i also notice single-GPU jitters which is sometimes confused with multi GPU micro-stutter.

Overall it does not bother me. All gaming is compromise. It involves suspension of disbelief. i believe that CF should be used primarily to allow for higher settings; not make a nearly unplayable experience on a single GPU faster with two of them.

i was planning to explore microstutter shortly with GTX 580 SLI and GTX 590 as well as using HD 6990 and HD 6970 CF and CF-X3. But my HD 6990 is artifacting even on the desktop on my Intel system and i have asked AMD for advice.


This is pretty much my angle on CF/SLI. I wouldn't choose two 6770's over a single 6950, even if they were technically faster. I would choose two 6950's over a GTX580.

It's all about the trade-offs. I imagine the difference between frames when you're over 100 FPS has to be pretty small on a multi-GPU setup. And if the two-card setup is fast enough, microstutter should be minimized well enough to give the user a better overall experience than a single card.
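A back-of-envelope check on that intuition, assuming a simple two-interval AFR cadence where the longer gap takes a fixed share of each two-frame period (the 2/3 split below is an assumption for illustration, not a measurement):

```python
# At high average FPS, even a strongly uneven AFR cadence leaves
# every individual gap short in absolute terms.
def felt_fps(avg_fps, long_share=2 / 3):
    """'Felt' rate if the longer of two alternating intervals dominates;
    long_share is the longer interval's share of a two-frame period."""
    period_ms = 2 * 1000.0 / avg_fps          # time covered by two frames
    return 1000.0 / (period_ms * long_share)  # rate implied by the long gap

print(round(felt_fps(30)))   # ~22 -- a big, noticeable drop at low FPS
print(round(felt_fps(100)))  # 75  -- gaps stay short at high FPS
```

With the same relative unevenness, 30 FPS degrades to a "felt" rate in the low 20s, while 100 FPS still feels like 75: the absolute length of the worst gap is what matters.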
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
This is pretty much my angle on CF/SLI. I wouldn't choose two 6770's over a single 6950, even if they were technically faster. I would choose two 6950's over a GTX580.

It's all about the trade-offs. I imagine the difference between frames when you are over 100FPS has to be pretty small on a multi-GPU setup. And if the two card setup is fast enough microstutter should be minimized well enough to give the user a better overall experience than a single card.
That is what i have largely observed. When you have 100fps, then you want ridiculously good IQ and all of the enhancements instead of just more framerates. When you are marginal, dial back the settings or the resolution.

Tom apparently found that Nvidia's SLI was smoother than CrossFire. i can't agree as each game and each situation is different. However, some games that had bad micro-stutter, had it ameliorated by later driver releases. And some game engines appear more prone to it than others.

That said - overall, i prefer to play with Multi-GPU because i game at such high resolutions (1920x1080/2560x1600/5760x1080) and or play in S3D (where the experience *must be* smooth).

And this is *weird* .. have you ever heard of a video card that works in one PC but not in the other? Simply changing the MB makes the artifacting and BSoDs vanish? i feel like starting a new thread here.
o_O
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I've never been too big on Tom's reviews; they always manage to inject opinion or praise that is suspect and not quantifiable, as they did in this review :\

He did make some valid points, but at the same time he seems to have tested at a pointless resolution. That's really perplexing, as one of his accurate points is that microstutter is prominent when your framerate is low... so why test setups that crush a resolution like 1920x1080? Another point he made, which I have also experienced firsthand, is that 3 GPUs show less microstutter than 2. I've actually read a good write-up explaining why this is; I can't remember if it was a review or a forum post. I'll try to dig it up. It's related to the timings with three cards under AFR being tighter than with two, if I recall correctly.

Micro-stutter definitely exists and you do notice it. Generally, though, I only see it in certain games and not at all in others. The caveat would be running a game I usually don't notice it in under multi-GPU, then quickly switching to single GPU and running it again. Simply put, in most games you get used to it and stop seeing it; in others it's evident most of the time because of the game engine. Engines that are really bad for microstutter are CryEngine 2, Source, and S.T.A.L.K.E.R.'s X-Ray engine, in my experience. The biggest and worst culprit is CryEngine 3: Crysis 2 has major microstutter.

If you ever have any doubt, the best way to show someone an example of microstutter is to run the original Crysis gpu_benchmark on a multi-gpu and single-gpu system side by side. You can't miss it.

http://www.youtube.com/watch?v=XWwGkz_Zx5A

This is an old benchmark from when the 5870 was released, using the Crysis benchmark to compare a 5870 to a GTX 295. I can see the microstutter pretty clearly. Pay close attention to the pass by the shack in the water and when it goes up around the hill with the radio tower. I've also done this at home, and it's really obvious in person.

A single GPU is definitely superior, but if you are playing at resolutions no single GPU can handle, microstutter is not significant enough to rule out a multi-GPU setup that will deliver decent framerates.

This upcoming 28nm GPU generation should be an exciting one. Games have gone stagnant in pushing graphical boundaries. With the new 28nm GPUs we will finally have single cards that can handle anything out there at 1920x1200 and below. I say this because right now two 6950s can run anything at 1920x1200, and I assume we'll get that performance from a single 28nm GPU.

Framerates being equal, a single GPU is much better than multi-GPU, not just because of microstutter but because of all the other multi-GPU drawbacks: heat, noise, power, case space, etc.
 
Last edited:

pw38

Senior member
Apr 21, 2010
294
0
0
I did 2x 5770 and seemed to notice it on a few titles. Assassin's Creed seemed to be the worst. I have a 6950 now and don't see the need for another one. I'll wait for the 7K cards before upgrading. Crossfire was an interesting experiment for me that I don't really care to repeat.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
And this is *weird* .. have you ever heard of a video card that works in one PC but not in the other? Simply changing the MB makes the artifacting and BSoDs vanish? i feel like starting a new thread here.
o_O

Uh oh, sounds like the vertex3 of video cards there :eek:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Uh oh, sounds like the vertex3 of video cards there :eek:
It's my HD 6990. It started to artifact at stock (factory non-overclocked) speeds. 775W Thermaltake PSU is fine; it runs GTX 590 and HD 6970 CF with no issues. Switched MB slots. Changed drivers. Same issue and it is not temp related.

Then i contacted AMD who was not surprised and said to try my HD 6990 in another MB - that they sometimes have that issue. i stuck it in my Phenom 980BE System and it runs fine
- they sometimes have this issue? This is a first for me
o_O

Start another thread or let it go? i don't want to buy another Intel CPU/MB until X58 is officially replaced
--No HD 6990 Microstutter on my Intel system any longer
:'(

UPDATE: AMD intended for me to run the HD 6990 in another MB and then move it BACK to the Intel system. And it works!! ... Weirdest "fix" i have ever encountered and worth trying before an RMA, perhaps
- added to my tool chest. Unfortunately, i did not have time to include the HD 6990 in my Cat 11-8 Perf Analysis

 
Last edited:

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
Another thing about CrossFire that isn't always mentioned is that new games may not support multiple GPUs at launch. With The Witcher 2, if you had a CrossFire setup you were out of luck for a while until the drivers and updates caught up. In that instance a single 5850 would've beaten a dual 5770 setup, simply because single GPUs don't rely so heavily on software to perform. You may be stuck playing a new game at poor resolutions and settings for months if you didn't go with the fast single-GPU solution.

For me, that makes a CrossFire setup a deal-breaker. Unless I'm forking over enough that, even when a game leaves me running one-legged, the single card will destroy it anyway, it isn't worth it. And at that point, what's the point of another card anyway?

From what I see crossfire and SLI doesn't make a whole lot of sense for the mainstream gaming market.
 
Last edited:

GotNoRice

Senior member
Aug 14, 2000
329
5
81
Interesting article. I really don't notice any micro-stutter with my 2x 4870X2, and my games seem just as silky-smooth as any single-GPU setup I've seen. Their results seem to confirm that there is very little micro-stutter with triple or quad GPU setups.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
It's my HD 6990. It started to artifact at stock (factory non-overclocked) speeds. 775W Thermaltake PSU is fine; it runs GTX 590 and HD 6970 CF with no issues. Switched MB slots. Changed drivers. Same issue and it is not temp related.

Then i contacted AMD who was not surprised and said to try my HD 6990 in another MB - that they sometimes have that issue. i stuck it in my Phenom 980BE System and it runs fine
- they sometimes have this issue? This is a first for me
o_O

Start another thread or let it go? i don't want to buy another Intel CPU/MB until X58 is officially replaced
--No HD 6990 Microstutter on my Intel system any longer
:'(

UPDATE: AMD intended for me to run the HD 6990 in another MB and then move it BACK to the Intel system. And it works!! ... Weirdest "fix" i have ever encountered and worth trying before an RMA, perhaps
- added to my tool chest. Unfortunately, i did not have time to include the HD 6990 in my Cat 11-8 Perf Analysis


Along those lines, I had a friend who at one point could not get his first-gen SB Live to work in his system, and that was after two motherboard replacements (two different models, as he went through hardware like crazy) and at least one OS reinstall. It worked perfectly fine elsewhere.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Along those lines, I had a friend who at one point could not get his first-gen SB Live to work in his system, and that was after two motherboard replacements (two different models, as he went through hardware like crazy) and at least one OS reinstall. It worked perfectly fine elsewhere.
Well, my HD 6990 simply won't work in my Intel x58 MB any longer even though it still works in my Phenom II 980 BE system. Moving it from the Intel to the Phenom II system may have reset some ICs temporarily; but sadly the "fix" that AMD suggested is no longer working and the 6990 artifacts immediately on the Intel CPU-powered desktop and shortly afterward BSoDs the system.

HD 6970 and HD 6970 CF work fine in the same system; GTX 590 and GTX 580 also work great - nothing is changed except the video cards (and the HDD for Nvidia cards), i'd say the card is defective, especially because AMD admitted that "it sometimes happens"
o_O
 
Last edited:

bryanW1995

Lifer
May 22, 2007
11,144
32
91
It's my HD 6990. It started to artifact at stock (factory non-overclocked) speeds. 775W Thermaltake PSU is fine; it runs GTX 590 and HD 6970 CF with no issues. Switched MB slots. Changed drivers. Same issue and it is not temp related.

Then i contacted AMD who was not surprised and said to try my HD 6990 in another MB - that they sometimes have that issue. i stuck it in my Phenom 980BE System and it runs fine
- they sometimes have this issue? This is a first for me
o_O

Start another thread or let it go? i don't want to buy another Intel CPU/MB until X58 is officially replaced
--No HD 6990 Microstutter on my Intel system any longer
:'(

UPDATE: AMD intended for me to run the HD 6990 in another MB and then move it BACK to the Intel system. And it works!! ... Weirdest "fix" i have ever encountered and worth trying before an RMA, perhaps
- added to my tool chest. Unfortunately, i did not have time to include the HD 6990 in my Cat 11-8 Perf Analysis


I've had that several times over the years. Either the gpu wasn't seated properly or the cord was loose.

edit: n/m, looks like you're having problems again.
 
Last edited: