Is this micro stuttering?

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
I have a 5970 pitted against a (single) GTX 480 in identical systems. The 5970 will score higher in benchmarks, but the 480 just APPEARS smoother to the eye. If it weren't for the benchmark numbers I would pick the 480 over the 5970. :eek:

The 5970 seems to chug and struggle with certain things - namely Heaven 2.0 and Stone Giant. Now Stone Giant is actually showing lower FPS on the 5970. :\

Would this be related to crossfire micro stuttering?
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Well, if it's only in Heaven 2.0 and Stone Giant, it'd probably be the tessellation units holding back the rest of the card(s) (the GTX 480 would be much faster at tessellation). If it's in everything, check your drivers.
 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
The 5970 will score higher in benchmarks, but the 480 just APPEARS smoother to the eye.

That sounds exactly like it from what I remember.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I have a 5970 pitted against a (single) GTX 480 in identical systems. The 5970 will score higher in benchmarks, but the 480 just APPEARS smoother to the eye. If it weren't for the benchmark numbers I would pick the 480 over the 5970. :eek:

The 5970 seems to chug and struggle with certain things - namely Heaven 2.0 and Stone Giant. Now Stone Giant is actually showing lower FPS on the 5970. :\

Would this be related to crossfire micro stuttering?

Yes. Try enabling Vsync and Triple buffering and see if the 5970 looks any smoother.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Yes, it is microstuttering... I've always been affected by it with multi-GPU, but once you enable vsync it cleans up nicely. The trick is that some programs don't like vsync (Dead Space), or sometimes you have to use nHancer (Clear Sky) to be able to use it.

I don't know how some people find it to be not noticeable. Anyhow, I always end up getting Crossfire or SLI anyway. The benefits outweigh the drawbacks.
 

JRW

Senior member
Jun 29, 2005
569
0
76
The 5970 will score higher in benchmarks, but the 480 just APPEARS smoother to the eye. If it weren't for the benchmark numbers I would pick the 480 over the 5970. :eek:

Yes, you've been exposed to microstutter. This is one of the main reasons I'll never go back to a multi-GPU config.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
The micro stuttering thing is completely overblown.

I made up my mind a long time ago that I would never use a XF/SLI setup because of what people had said about micro-stuttering.

I ended up with an Alienware M17x with two 4870's (didn't buy it, Dell gave it to me as a replacement) and I can safely say that if you have vsync/triple buffering forced (D3DOverrider, which comes with RivaTuner, lets you force triple buffering) there's nothing at all to complain about.

If you play with vsync off, then it might be noticeable, but I don't know why anyone would play games with vsync off assuming you can force triple buffering.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I have a 5970 pitted against a (single) GTX 480 in identical systems. The 5970 will score higher in benchmarks, but the 480 just APPEARS smoother to the eye. If it weren't for the benchmark numbers I would pick the 480 over the 5970. :eek:

The 5970 seems to chug and struggle with certain things - namely Heaven 2.0 and Stone Giant. Now Stone Giant is actually showing lower FPS on the 5970. :\

Would this be related to crossfire micro stuttering?

Benchmark it with FRAPS. FRAPS gives you a table of intervals between frames that is very easy to plug into Excel to make a graph.

Ideally you want the interval between frames to be the same, and your graph to be a straight line. Of course this never happens, but a larger amplitude means less consistency from one frame to the next. If the inconsistency is great enough, you will perceive the frame rate as lower than the indicated average.

Here's a graph I did between a GTX 280 and a 4870 X2 almost two years ago showing a close up of the intervals between frames that I think exemplifies this the best:

[image018.gif - graph of frame intervals, GTX 280 vs. 4870 X2]


The above graph is from a little write up I did about this topic. It is available here if you're interested: http://steelforest.net/video/4870X2-GTX280.htm
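The interval math behind a graph like that is simple to reproduce. This is a minimal sketch in Python (not the author's actual spreadsheet workflow), using invented timestamps in place of a real FRAPS frametimes log:

```python
# Sketch: convert FRAPS-style cumulative frame timestamps (ms) into
# per-frame intervals - the quantity worth graphing for smoothness.
# The timestamp lists below are made up for illustration.

def frame_intervals(timestamps_ms):
    """Gaps between consecutive frames, in milliseconds."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

smooth = [0, 17, 34, 51, 68]    # near-constant pacing
stutter = [0, 2, 34, 36, 68]    # alternating short/long gaps

print(frame_intervals(smooth))   # [17, 17, 17, 17]
print(frame_intervals(stutter))  # [2, 32, 2, 32]
```

Both runs average roughly 60 fps, but the second list is the one that looks stuttery in person.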
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
The numbers are better with tessellation turned off, but the 480 still looks smoother. The graphs may look better for one system, but actually sitting in front of the screen and watching the action shows that the "slower" part actually provides smoother action. Pity reviews don't really touch on this - it's all about graphs and frame rates. :thumbsdown:
 

Martimus

Diamond Member
Apr 24, 2007
4,488
152
106
They canned sideport on this one because problems with TSMC's 40nm process required them to add a lot of redundant circuitry (making the die larger). Sideport was their planned method of frame control between the two GPUs, which would alleviate this problem. Perhaps we will see it in Northern Islands, and hopefully they can find the room to put it back into Southern Islands.

The reason for the microstuttering is that the two processors are processing different frames and are not synched to process frames at even intervals. As an example, say each processor can process an entire frame in 20ms. If the second processor starts its frame 10ms after the first, output would be perfectly smooth and you would notice no difference. If it starts its frame 2ms after the first, you would see 2 frames nearly on top of each other with 18ms of nothing in between. Microstutter is what happens in the second example.

EX1: 1.........2.........1.........2
EX2: 1.2.................1.2........
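The two timelines above can be reproduced with a short sketch (Python chosen arbitrarily; the 20ms/10ms/2ms numbers are the ones from the example):

```python
# Two GPUs in alternate-frame rendering: each completes a frame every
# frame_ms milliseconds, and the second GPU runs offset_ms behind the
# first. The resulting output timestamps show why the offset matters.

def afr_output_times(frame_ms, offset_ms, pairs):
    """Timestamps (ms) at which frames reach the display."""
    times = []
    for i in range(pairs):
        times.append(i * frame_ms)               # GPU 1's frame
        times.append(i * frame_ms + offset_ms)   # GPU 2's frame
    return times

print(afr_output_times(20, 10, 3))  # [0, 10, 20, 30, 40, 50] - even pacing
print(afr_output_times(20, 2, 3))   # [0, 2, 20, 22, 40, 42] - 2/18 ms gaps
```

Both cases deliver the same number of frames per second; only the spacing differs, which is exactly the microstutter effect described.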
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The numbers are better with tessellation turned off, but the 480 still looks smoother. The graphs may look better for one system, but actually sitting in front of the screen and watching the action shows that the "slower" part actually provides smoother action. Pity reviews don't really touch on this - it's all about graphs and frame rates. :thumbsdown:
And now when people say "microstuttering is a myth," you can laugh in their faces. Microstuttering is, and for the foreseeable future will be, a problem for multi-GPU setups, because the drivers must synchronize each frame render for it to appear smooth, i.e. each frame needs to be rendered x milliseconds apart and each card needs to render every other frame. Typically, driver updates tend to improve multi-GPU synchronization, but some applications are overlooked. Vsync is a partial solution, as you get the associated input lag with it.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Yes, you've been exposed to microstutter. This is one of the main reasons I'll never go back to a multi-GPU config.

No, you see, microstutter happens with a single GPU as well. Microstutter was brought up as a subject with multi-GPU setups because it's more apparent there due to latency, but it also affects single GPUs.

Any time you are not using vsync, you are essentially witnessing microstutter. The reason is that frames per second is not a unit of measure accurate enough to describe the motion perceived by our eyes. Your fps counter can say 60 fps, but that doesn't mean that your card is putting out a frame every 16.6 ms. The counter is simply the total number of frames rendered in that second, not representative of the interval between them. And that's what our eyes pick up, the interval.

The only remedy to that is vsync.
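The point about the counter hiding the intervals can be put in code form (a hypothetical sketch with made-up interval lists): an fps counter only tallies frames completed within a second, so steady and bursty pacing can report the same number.

```python
# An fps counter reports frames completed within one second; it says
# nothing about how evenly those frames were spaced.

def frames_in_one_second(intervals_ms):
    """Count frames whose cumulative delivery time fits in 1000 ms."""
    elapsed, frames = 0.0, 0
    for gap in intervals_ms:
        elapsed += gap
        if elapsed > 1000.0:
            break
        frames += 1
    return frames

steady = [16.0] * 60        # a frame every 16 ms
bursty = [2.0, 30.0] * 30   # frames delivered in uneven pairs

print(frames_in_one_second(steady), frames_in_one_second(bursty))  # 60 60
```

Both runs read "60 fps" on a counter, yet the second delivers frames in clumps.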
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
The only remedy to that is vsync.

I've tried it and the effect of forcing vsync is worse - pumping and lagging feels horrible.

What we need is displays that can do 300Hz refresh. ;) Of course that puts the bandwidth in the stratosphere. :eek:
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I've tried it and the effect of forcing vsync is worse - pumping and lagging feels horrible.

Then something is wrong with your setup. 60 fps + vsync = perfectly smooth. The only problem is input lag.
 

Martimus

Diamond Member
Apr 24, 2007
4,488
152
106
I've tried it and the effect of forcing vsync is worse - pumping and lagging feels horrible.

What we need is displays that can do 300Hz refresh. ;) Of course that puts the bandwidth in the stratosphere. :eek:

If the frame was rendered at a point in the past, it doesn't matter when it is displayed; it will still display the frame that was rendered. Vertical synchronization will not change microstutter.

He is right, though, that microstutter occurs on single GPUs as well; it is just most prevalent on multi-GPU setups because of a lack of synchronization between the processors. However, ATI has a solution that was even included in the 4800 series, but it was never enabled due to power constraints. It was removed entirely from the 5800 series due to size constraints, but hopefully it will show up in future iterations. I don't know what nVidia is doing to combat the issue, though.
 

JRW

Senior member
Jun 29, 2005
569
0
76
No, you see, microstutter happens with a single GPU as well. Microstutter was brought up as a subject with multi-GPU setups because it's more apparent there due to latency, but it also affects single GPUs.

Any time you are not using vsync, you are essentially witnessing microstutter. The reason is that frames per second is not a unit of measure accurate enough to describe the motion perceived by our eyes. Your fps counter can say 60 fps, but that doesn't mean that your card is putting out a frame every 16.6 ms. The counter is simply the total number of frames rendered in that second, not representative of the interval between them. And that's what our eyes pick up, the interval.

The only remedy to that is vsync.

Microstutter has been documented plenty of times (Google it); the effect is far more noticeable when using multi-GPU configurations.
 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
It's been a long time since I used multi GPU, but I remember that you could nullify the microstuttering by using either both vsync and triple buffering or a non-AFR mode. However, most games I tried back then had various problems with triple buffering, even though it worked fine on a single GPU.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Then something is wrong with your setup. 60 fps + vsync = perfectly smooth. The only problem is input lag.

I have no idea - this has been the observation on many a system since I started using 60Hz LCDs (around 2000). When I had CRTs running 150+ Hz refresh rates, using vsync was a joy.

I've pretty much dealt with the tearing issues of having it off. (Plus the screaming noise when menus are rendered at 5000 fps is neat. ;) )
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
The numbers are better with tessellation turned off, but the 480 still looks smoother. The graphs may look better for one system, but actually sitting in front of the screen and watching the action shows that the "slower" part actually provides smoother action. Pity reviews don't really touch on this - it's all about graphs and frame rates. :thumbsdown:

That's because all they graph is the fps. This was my point in my previous post. While most reviewers use FRAPS, almost none give you the interval graph, which is easily generated with FRAPS. I guess they don't think readers will understand the impact frame interval has on the overall smoothness of a game - they are probably right.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
That's because all they graph is the fps. This was my point in my previous post. While most reviewers use FRAPS, almost none give you the interval graph, which is easily generated with FRAPS. I guess they don't think readers will understand the impact frame interval has on the overall smoothness of a game - they are probably right.

Kyle Bennett uses plots, but the sampling rate must not catch the peaks and valleys?

Perhaps new criteria need to be developed for testing these cards, as the current way just doesn't do them justice.

It's like gauging how a loudspeaker is going to sound by using only its sensitivity and frequency response. :D
 

Martimus

Diamond Member
Apr 24, 2007
4,488
152
106
Kyle Bennett uses plots, but the sampling rate must not catch the peaks and valleys?

Perhaps new criteria need to be developed for testing these cards, as the current way just doesn't do them justice.

You can't plot more than a second or so of the kind of data that shows microstutter, because even 1s worth of data is approximately 50 data points. Microstutter is the lag that causes every other frame to be displayed after or before it is supposed to be, relative to the previous frame. Reading your comments, I am not sure you quite understand it.
 

MarcVenice

Moderator Emeritus <br>
Apr 2, 2007
5,664
0
0
So he mentions Heaven 2.0 and Stone Giant, two tessellation-heavy benchmarks, and that doesn't ring any alarm bells?

Do you play Heaven 2.0 or Stone Giant, or do you play actual games? Do you see the stutter in those actual games? No? Then the HD 5970 is the card to get.

If you do, pick a GTX 480, which is very fast too...
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
1) Regardless of how you define it, it's there and very annoying!
2) I notice it on EVERYTHING, not just benchmarks - the benchmarks are used to tell which is the fastest, right? Like a speedometer in a car tells you how fast you're going so you don't get pulled over. ;)