ATI 5970 microstuttering tests

Daedalus685

Golden Member
Nov 12, 2009
I have never been able to perceive this microstuttering stuff... I'm not sure if that means my eyes are terrible (they probably are) or it is something that doesn't bother me (or doesn't, gasp, exist, lol). I suppose it's similar to how many people get headaches from 60Hz displays. I think people oversimplify it, though, and that it is not entirely an AFR issue.
 

dguy6789

Diamond Member
Dec 9, 2002
The review needs to show what those graphs would look like with a single card before we can determine how big or small this microstuttering issue is.
 

SHAQ

Senior member
Aug 5, 2002
I've never seen it either, and I had a GX2, 260 SLI, and now a 295. I think you have to overload the cards to 30 FPS or less to really see it. I use SLI to turn a 30 FPS game into a 60 FPS game. That's the way it should be used.
 

adairusmc

Diamond Member
Jul 24, 2006
I have never been able to perceive this microstuttering stuff... I'm not sure if that means my eyes are terrible (they probably are) or it is something that doesn't bother me (or doesn't, gasp, exist, lol).

Same here. I have never seen microstuttering in any of the multi-GPU setups I have owned.
 

Daedalus685

Golden Member
Nov 12, 2009
Can someone explain this to me, or correct me if I am wrong:

As I understand it, microstuttering is supposed to be a result of AFR producing frames at an inconsistent rate: two frames very close together, then a relatively large gap, then another pair of frames. The FPS (frames over total time) is "doubled" in the best case, but the perception, in the worst case, is no FPS increase over a single card.

It isn't at all what the forum linked in the OP goes on about, which is stuttering. The "perception" of microstuttering is that it adds nothing, not that it looks "worse," jumpy, or anything.

The idea that it looks worse, or makes one's head hurt (beyond the possible motion sickness it could cause), seems to be a misunderstanding of regular stuttering, which is made worse by a lack of dual-GPU optimizations.

To actually show microstuttering, the graph would have to calculate the FPS for every single frame, based on the time between it and the last (I suppose FPS = 1/(time - last frame time)), or just display the time gap between each frame. I can't see the graphs from this computer; is that what they are doing?
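
For what it's worth, here is a minimal Python sketch of the calculation I mean; the timestamps are invented purely to illustrate the paired-frame pattern:

```python
# Invented timestamps (in seconds) showing the paired pattern AFR could
# produce: two frames close together, then a gap, then another pair.
frame_times = [0.000, 0.005, 0.033, 0.038, 0.066, 0.071, 0.100]

for prev, curr in zip(frame_times, frame_times[1:]):
    gap = curr - prev  # time between this frame and the last
    fps = 1.0 / gap    # instantaneous FPS = 1 / (time - last frame time)
    print(f"gap = {gap * 1000:5.1f} ms -> instantaneous FPS = {fps:6.1f}")

# The average over the whole run hides the alternation entirely:
avg_fps = (len(frame_times) - 1) / (frame_times[-1] - frame_times[0])
print(f"average FPS = {avg_fps:.1f}")
```

The short gaps report an enormous instantaneous FPS and the long gaps a low one, yet the average still works out to a healthy 60 FPS.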
 

MrK6

Diamond Member
Aug 9, 2004
Can someone explain this to me, or correct me if I am wrong:

As I understand it, microstuttering is supposed to be a result of AFR producing frames at an inconsistent rate: two frames very close together, then a relatively large gap, then another pair of frames. The FPS (frames over total time) is "doubled" in the best case, but the perception, in the worst case, is no FPS increase over a single card.

It isn't at all what the forum linked in the OP goes on about, which is stuttering. The "perception" of microstuttering is that it adds nothing, not that it looks "worse," jumpy, or anything.

The idea that it looks worse, or makes one's head hurt (beyond the possible motion sickness it could cause), seems to be a misunderstanding of regular stuttering, which is made worse by a lack of dual-GPU optimizations.

To actually show microstuttering, the graph would have to calculate the FPS for every single frame, based on the time between it and the last (I suppose FPS = 1/(time - last frame time)), or just display the time gap between each frame. I can't see the graphs from this computer; is that what they are doing?
The graphs show the frame number and the time (in ms) taken to render it. Microstuttering does look worse because even though the framerate is higher, it's jumping all over the place, which can be annoying to some and even give headaches to others. There's a reason I'm driving my 2560x1600 with only a single 5870. Sure, I could run out and buy an HD 5970 if I wanted 40+ FPS in Crysis, but the testing confirms my suspicion that the stuttering is still there. After dumping my GTX 295 and going back to a single card with my 5870, I can definitely say I'm not going back to dual GPU until the technology is better. Until then, I'll just buy the fastest single GPU available and overclock the living hell out of it.
 

Daedalus685

Golden Member
Nov 12, 2009
The graphs show the frame number and the time (in ms) taken to render it. Microstuttering does look worse because even though the framerate is higher, it's jumping all over the place, which can be annoying to some and even give headaches to others. There's a reason I'm driving my 2560x1600 with only a single 5870. Sure, I could run out and buy an HD 5970 if I wanted 40+ FPS in Crysis, but the testing confirms my suspicion that the stuttering is still there. After dumping my GTX 295 and going back to a single card with my 5870, I can definitely say I'm not going back to dual GPU until the technology is better. Until then, I'll just buy the fastest single GPU available and overclock the living hell out of it.

But the framerate isn't jumping all over the place in any way that we can perceive. Unless the FPS is abysmally low, you won't be able to tell that it is changing at all; it will just look slower than it is reported as. It will cycle from "high FPS" to "low FPS" every 1/20 of a second if you get anything remotely playable.

Anything that 'looks' like the FPS is jumping is not microstuttering, it is stuttering, which most will agree is worse on dual GPU (for many reasons, optimizations mainly). However, that is an entirely different issue, and is not a fundamental flaw of AFR.

If the FPS were rather low, microstuttering might make it look "funny" from person to person, but if it looks like it has sections of slow and fast FPS, that is not microstuttering at all.

As for the graphs (I'm at home now and can look at them), they don't really show anything. The time between the draws of each frame is where we would see microstuttering. All that shows is that each frame is of a new scene (duh), and it means even less without showing the non-CrossFire comparison. Microstuttering is a phase discrepancy between the two cards that would not be seen in the time to render a frame (though the time to render is part of it).

Edit: Don't get me wrong, MS is in theory a very real thing, just like any phase distortion. I am one of the many who can't seem to see it, mind you. However, it is important not to lump it in with regular old stuttering, which many seem to do. Obviously there are benefits and negatives to multiple GPUs. I'd like to see a more thorough test, with counterexamples (with just one card).
 
Last edited:

scooterlibby

Senior member
Feb 28, 2009
I think I have seen it, but it may actually be something else going wrong with my system. Every time I come back from sleep mode, MW2 gets choppy nowadays. I've had SLI setups since the 7600s and always had occasional random choppiness that couldn't be explained by a lack of SLI profiles. Still high frame rates, yet a very perceptible 'stutter.'
 

Daedalus685

Golden Member
Nov 12, 2009
This was probably the most insightful article on the net about micro-stuttering. Take some time to read and digest it:


http://www.rage3d.com/reviews/video/ati4870x2cf/index.php?p=2

That is a good article, though I remember reading it before. Pretty much more detail of what I said, no? I like how they measure the MS.

More to the point: if the performance is there already, you won't notice; if it isn't, you might be in trouble.
 

toyota

Lifer
Apr 15, 2001
Oh and the problems I would have were often solved by turning off SLI.
You don't have to make another post one minute after the other. Just use the edit feature, especially since you were not even replying to anyone. ;)

Back on topic: I have always been a fan of using a single fast card. There are just too many issues, including this microstuttering, for SLI/CrossFire to be worth it for me.
 
Last edited:

Daedalus685

Golden Member
Nov 12, 2009
You don't have to make another post one minute after the other. Just use the edit feature, especially since you were not even replying to anyone. ;)

Back on topic: I have always been a fan of using a single fast card. There are just too many issues, including this microstuttering, for SLI/CrossFire to be worth it for me.

I don't think anyone would disagree with you. One single card is better than an equivalent dual-card setup in almost every single way. Though the better the card handles the game, the less of an issue microstuttering is, and at the high end you do see pretty good improvements provided it is close to playable beforehand, just not if it was ass to start.
 
Nov 26, 2005
Please tell me they are doing these 'reviews' on 120Hz LCDs, right?

My experience:
I've gone from a 19" 2ms 75Hz LCD to a 22" 110Hz LCD, and it is noticeably smoother than the stuttering 75Hz display.
 

MrK6

Diamond Member
Aug 9, 2004
But the framerate isn't jumping all over the place in any way that we can perceive. Unless the FPS is abysmally low, you won't be able to tell that it is changing at all; it will just look slower than it is reported as. It will cycle from "high FPS" to "low FPS" every 1/20 of a second if you get anything remotely playable.

Anything that 'looks' like the FPS is jumping is not microstuttering, it is stuttering, which most will agree is worse on dual GPU (for many reasons, optimizations mainly). However, that is an entirely different issue, and is not a fundamental flaw of AFR.

If the FPS were rather low, microstuttering might make it look "funny" from person to person, but if it looks like it has sections of slow and fast FPS, that is not microstuttering at all.
It sounds to me like you are confusing the terms "framerate" and "FPS." While sometimes used interchangeably, they are not the same. FPS is a specific method of reporting framerate, but framerate itself is as depicted on the graphs posted at XS, and with microstuttering it is jumping all over the place.

As for the graphs (I'm at home now and can look at them), they don't really show anything. The time between the draws of each frame is where we would see microstuttering. All that shows is that each frame is of a new scene (duh), and it means even less without showing the non-CrossFire comparison. Microstuttering is a phase discrepancy between the two cards that would not be seen in the time to render a frame (though the time to render is part of it).

Edit: Don't get me wrong, MS is in theory a very real thing, just like any phase distortion. I am one of the many who can't seem to see it, mind you. However, it is important not to lump it in with regular old stuttering, which many seem to do. Obviously there are benefits and negatives to multiple GPUs. I'd like to see a more thorough test, with counterexamples (with just one card).
Despite what goes on under the hood (see the article SirPauly posted), the manifestation of microstutter as perceived by the user is the delta from the mean frametimes. The video card setup used makes no difference here. In fact, there are even measurable deltas in single-card configurations; it's just not enough to be perceived as microstuttering by most people. Microstuttering is perceived differently from regular stuttering, as you stated, but that doesn't mean it's not perceived, nor does it change what is actually happening.
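
A rough Python sketch of that idea follows; the frametime lists are invented, and the percentage measure is just one plausible way to express the delta, not necessarily the metric the article uses:

```python
from statistics import mean

def stutter_delta(frametimes_ms):
    """Average absolute deviation from the mean frametime,
    expressed as a percentage of the mean."""
    m = mean(frametimes_ms)
    return 100.0 * mean(abs(t - m) for t in frametimes_ms) / m

single_gpu = [20, 21, 20, 19, 20, 21]  # fairly even pacing (hypothetical)
dual_gpu = [5, 28, 5, 29, 5, 28]       # AFR-style paired frames (hypothetical)

print(f"single GPU: {stutter_delta(single_gpu):.1f}% deviation from the mean")
print(f"dual GPU:   {stutter_delta(dual_gpu):.1f}% deviation from the mean")
```

Both lists have a comparable average frametime, but the dual-GPU pattern deviates from its mean by a large margin while the single card barely moves.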
 

Daedalus685

Golden Member
Nov 12, 2009
It sounds to me like you are confusing the terms "framerate" and "FPS." While sometimes used interchangeably, they are not the same. FPS is a specific method of reporting framerate, but framerate itself is as depicted on the graphs posted at XS, and with microstuttering it is jumping all over the place.

FPS is a unit of framerate (framerate simply being a frequency). I'm not confusing them; I know the difference, though I was not clear. But it isn't so much "jumping all over the place" as out of phase. I said it was not jumping around in a way we can perceive, not that it isn't jumping around at all, but that it is difficult to see. We don't see the jerkiness unless the rate is very low. What we get is the impression of a lower framerate the higher the delta is. It would look "strange" if the delta were large enough for us to notice on its own; otherwise it might just look like 50 FPS instead of the reported 75.
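
As a quick sanity check on those numbers, here is a toy Python calculation with assumed gap values; treating the longest gap as what sets perceived smoothness is only a rough heuristic, not an established model:

```python
# Assumed gaps: frames arriving in pairs, alternating ~6.7 ms and 20 ms.
gaps_ms = [6.67, 20.0] * 3

reported_fps = 1000 * len(gaps_ms) / sum(gaps_ms)  # frames over total time
perceived_fps = 1000 / max(gaps_ms)                # pacing set by the long gap

print(f"reported ~{reported_fps:.0f} FPS, feels closer to ~{perceived_fps:.0f} FPS")
```

With those gaps the counter reports about 75 FPS, while the pacing of the long gaps corresponds to 50 FPS.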

Framerate is not depicted on the graphs; framerate lives in frequency space, while the graphs show the period, which is in time space.

Despite what goes on under the hood (see the article SirPauly posted), the manifestation of microstutter as perceived by the user is the delta from the mean frametimes. The video card setup used makes no difference here. In fact, there are even measurable deltas in single-card configurations; it's just not enough to be perceived as microstuttering by most people. Microstuttering is perceived differently from regular stuttering, as you stated, but that doesn't mean it's not perceived, nor does it change what is actually happening.
I'm not sure how what you just said is different from what you quoted. Perhaps I'm missing something.

Microstuttering is by definition a phase discrepancy; I didn't mention anything about different "under the hood" aspects. What we want is for the two GPUs to be perfectly pi out of phase. When they are not, we call it microstuttering. In theory it could be bad enough that the GPUs are entirely in phase, to the point that the frames overlap and we see only the FPS of one card.

We can't perceive the deltas unless the FPS is very low. What we see is more of a smearing of a couple of frames together, with the impression of a lower framerate. Certainly this might look terrible to some people, I suppose, though I'm not sure. It might even make people sick who are susceptible to that sort of thing. But many people confuse this, which still looks "smooth" (at least it is still consistent), with regular old stuttering that may also be caused by a dual-GPU setup (where we see obvious parts of a game drop FPS for a good chunk of seconds).
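
To illustrate that phase picture, here is a toy Python sketch with idealized fixed periods (real drivers are messier than this): a half-period offset ("pi out of phase") gives even pacing, a small offset gives the paired frames, and a zero offset collapses to one card's framerate:

```python
def combined_gaps(period_ms, offset_ms, n=4):
    """Interleave two GPUs' frame times under AFR and return the gaps
    between consecutive frames of the combined output."""
    gpu_a = [i * period_ms for i in range(n)]
    gpu_b = [i * period_ms + offset_ms for i in range(n)]
    times = sorted(gpu_a + gpu_b)
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

period = 33.3  # each card alone manages ~30 FPS (assumed)
for offset in (period / 2, period / 6, 0.0):  # pi out of phase -> in phase
    print(f"offset {offset:4.1f} ms -> gaps {combined_gaps(period, offset)} ms")
```

At zero offset the gaps alternate between 0 ms and a full period: the frames overlap and you effectively see one card's FPS, exactly the worst case described above.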