
Multi GPU stuttering captured with 300FPS camera

Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
People have different sensitivities to different things; I may notice micro-stuttering while another person may not.
 
Thanks for the video.

Why is it that the delay between frames becomes larger? Shouldn't the delay between frames for each GPU be the same as a single GPU? Or are certain frames being thrown out?
 
I thought we did a poll on this forum that showed almost nobody has ever seen it?

The key word is "almost": almost no one saw it, but there are people who may be bothered by it.

I personally don't see it; I was just giving an example, lol.
 
Why is it that the delay between frames becomes larger? Shouldn't the delay between frames for each GPU be the same as a single GPU?

It does not become larger per se; instead, it becomes erratic.

For example, in a fast-paced game where transitions are extremely fast, micro-stuttering is quite evident (more so if the FPS is constantly below the refresh rate of the monitor). Imagine one frame (F1) being rendered on GPU 1 that is not graphically intensive, while the next frame (F2) on GPU 2 is extremely intensive. In such scenarios you can make out this stuttering, as the delay becomes erratic.
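As a toy illustration of that point (all numbers invented): two interval patterns with the same average FPS can feel completely different, because an FPS counter hides the frame-to-frame swing.

```python
# Invented frame-delivery intervals (ms): both patterns average 20 ms
# (50 FPS), but the "erratic" one swings 30 ms between consecutive frames.
smooth = [20, 20, 20, 20, 20, 20]
erratic = [5, 35, 5, 35, 5, 35]

for name, deltas in (("smooth", smooth), ("erratic", erratic)):
    avg_fps = 1000 / (sum(deltas) / len(deltas))
    swing = max(deltas) - min(deltas)
    print(f"{name}: {avg_fps:.0f} FPS average, {swing} ms swing")
```

Both lines report 50 FPS, which is exactly why an average-FPS number can't show micro-stutter.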
 
Key word is almost no one saw it but there are people that may be bothered by it.

I personally don't see it I was just giving an example lol.

At Rage3D, I personally was a poster who discussed micro-stuttering a lot, and I was glad to see this fantastic article by Alex. Alex and I banged heads a bit on this because he couldn't perceive it, and then he investigated it.

What may be surprising to some is that with Quad, one might expect more micro-stuttering, but there is actually a lot less.

This article discusses Micro-stuttering and Quad:

http://www.rage3d.com/reviews/video/ati4870x2cf/index.php?p=2

Alex is now the editor at Beyond3D.
 
Thanks for the video.

Why is it that the delay between frames becomes larger? Shouldn't the delay between frames for each GPU be the same as a single GPU? Or are certain frames being thrown out?

The delay between frames is almost never constant, even with a single gpu.

IIRC, what happens in a dual-GPU system is that the second GPU is rendering the next frame at the same time the first GPU is rendering the current frame. So the next frame is ready to go immediately when the first GPU gets done with the current frame. The problem occurs because the first GPU is not yet done rendering the frame after that, so you have a gap before that next frame. This repeats itself over and over again.

With a single gpu the frames are rendered and displayed as fast as the gpu can do it, so as long as the scene stays somewhat consistent the intervals between frames will be more consistent than a dual gpu system. Although, they will still vary.

Of course, the higher your frame rate is, the less likely it is that the gaps between frames are noticeable. That's why they did this video at 30fps.
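A rough sketch of that pacing pattern (timings invented, model heavily simplified): the CPU hands out frames faster than either GPU can finish them, each GPU takes every other frame, and the presented intervals come out alternating short/long instead of even.

```python
# Simplified alternate-frame-rendering (AFR) pacing model. Both GPUs
# take 30 ms per frame and the CPU issues a new frame every 5 ms (both
# numbers made up). Frame n runs on the same GPU as frame n-2, so it
# must wait for that GPU to free up before it can start.
FRAME_COST = 30.0   # ms of GPU work per frame (assumed)
DISPATCH_GAP = 5.0  # ms between the CPU issuing consecutive frames (assumed)

finish = []
for n in range(8):
    issued = n * DISPATCH_GAP
    gpu_free = finish[n - 2] if n >= 2 else 0.0
    finish.append(max(issued, gpu_free) + FRAME_COST)

# Gaps between presented frames alternate short/long -- microstutter:
gaps = [round(b - a, 1) for a, b in zip(finish, finish[1:])]
print(gaps)  # [5.0, 25.0, 5.0, 25.0, 5.0, 25.0, 5.0]
```

The average gap is 15 ms (a healthy-looking 66 FPS), but what actually reaches the screen is a 5 ms gap followed by a 25 ms gap, over and over.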
 
The key word is "almost": almost no one saw it, but there are people who may be bothered by it.

I personally don't see it; I was just giving an example, lol.

Keep in mind too that there was nothing scientific about that poll either. In order to verify the "noticeable existence" of micro-stutter, there would have to be a test where participants sat in front of a screen that was switching at irregular intervals (say 10-30 secs.) between multi-GPU and single-GPU solutions. I would suggest having 3 different multi-GPU and 3 different single-GPU setups being switched between randomly. The individuals being tested would then be required to accurately report whether they were watching a single- or multi-GPU render. Just being able to see any difference whatsoever would have to be scientifically verified first. Then micro-stutter would have to be verified as the discernible difference.

I don't know why this, or something similar, has never been done.
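Something like the proposed protocol could be scored along these lines (a sketch; the rig names and trial count are invented). A viewer who genuinely can't tell the difference should land near 50% over many randomized trials; a reliably higher score would be the evidence that something is discernible.

```python
import random

# Three multi-GPU and three single-GPU rigs, as proposed (names invented):
SETUPS = [("multi", f"multi-rig-{i}") for i in (1, 2, 3)] + \
         [("single", f"single-rig-{i}") for i in (1, 2, 3)]

def run_blind_test(guess, trials=1000, seed=42):
    """Show randomly chosen rigs; return the fraction labelled correctly."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        truth, rig = rng.choice(SETUPS)
        if guess(rig, rng) == truth:
            correct += 1
    return correct / trials

# A pure coin-flip guesser scores near chance:
score = run_blind_test(lambda rig, rng: rng.choice(["multi", "single"]))
print(f"coin-flip guesser: {score:.0%}")
```

A real test would of course hide the rig identity from the viewer entirely; the point of the sketch is only the scoring against chance.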
 
Single GPUs have micro-stuttering too...

It can have it, yes, but it's much more common on multi-GPU setups.

@OP: I personally prefer "ms to render each frame" counts for showing microstutter; I easily showed it that way before. And numbers are clearer than a video, I think. But this works too; I can clearly see the micro-stutter in the video.
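For what it's worth, that per-frame-ms view is easy to produce from a frametime log (the timestamps below are invented): compute each frame's delta plus a simple frame-to-frame jitter number, which sits near zero when pacing is even.

```python
# Hypothetical frame timestamps (ms), e.g. from a FRAPS-style frametime log:
timestamps = [0, 8, 33, 41, 66, 74, 99, 107]

# Milliseconds spent on each frame:
frametimes = [b - a for a, b in zip(timestamps, timestamps[1:])]

avg = sum(frametimes) / len(frametimes)
# Average change between consecutive frametimes: ~0 for even pacing,
# large when frame delivery is erratic.
jitter = sum(abs(b - a) for a, b in zip(frametimes, frametimes[1:])) / (len(frametimes) - 1)

print(f"frametimes: {frametimes}")
print(f"average: {avg:.1f} ms (~{1000 / avg:.0f} FPS), jitter: {jitter:.1f} ms")
```

Here the counter would say ~65 FPS, while the frametime column shows 8 ms and 25 ms frames alternating, which is what a jitter number exposes and an FPS average hides.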
 
The times I've seen it on my current system were with a single card: a 4770 in the Heaven benchmark. Interactive in-game, I did not spot it with that setup. With my SLI setup, no.
 
I'd rather have neither. I'm happy with my overclocked GTX 580. I don't get any stuttering, and there aren't many games I can't run at very high FPS. If I find a game is really demanding, I lower AA and get a nice clean 120 FPS with no stuttering. There is a reason the 580s cost $500 and haven't seen a price drop, even though everyone says "what a bad deal the 580s are, since 2 GTX 460s offer equivalent performance for much less money." Yeah, enjoy your slideshow gaming.

No, there really isn't. Other cards offer 90% of the performance for 65% of the price and you can turn settings down with them too. The 580 is one of the worst deals out right now. Not that it's a bad card by any means, just overpriced.

You should at least try a powerful dual card solution before you pass judgment. Having tried both myself, microstutter rears its ugly head very rarely and the rest of the time dual cards allow some very nice graphics settings.
 
Wouldn't micro-stutter be far less apparent as long as you were getting 60 FPS (or more)? At 30 FPS there is a relatively large amount of time between frames on average. At 60 FPS you are getting four frames in the time you'd get 2 @ 30 FPS (on average), so micro-stutter should be a lot less noticeable, right?
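The arithmetic backs that up. Assuming (purely hypothetically) that microstutter swings alternate gaps to ±50% of the average, the same relative error is half as many absolute milliseconds at 60 FPS:

```python
# Same hypothetical +/-50% pacing swing evaluated at two frame rates:
for fps in (30, 60):
    avg_gap = 1000 / fps               # average ms between frames
    short, long_ = avg_gap * 0.5, avg_gap * 1.5
    print(f"{fps} FPS: avg {avg_gap:.1f} ms, swinging {short:.1f}-{long_:.1f} ms")
```

So the worst gap shrinks from 50 ms at 30 FPS to 25 ms at 60 FPS under this assumption, which is why higher frame rates tend to mask the effect.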
 
Didn't someone try something very similar with HL2 before? I seem to remember the crux of the issue being that multi-GPU configurations and the internal Source engine framerate cap did not play well together.
 
Enjoy your Dragon Age 2 performance 😀

Lol, they fixed that. But it's still hard to enjoy a ruined game. Should have gotten it for my 360 instead (sad) so I could have gotten some of my money back. Anyway, I agree with OILFIELDTRASH. I enjoy my fast single GPU with no stutter. Tried xfire 6870s and I noticed micro-stutter almost immediately. Sorry if that bothers you. Tease me about how I can't run 3 monitors on 1 card or something if it makes you feel better 😀...................🙄
 
No, you wouldn't. Vsync enables double or triple buffering, which buffers frames so that they can be spit out at you evenly, whether that's 16.666 ms apart for 60 Hz or 8.333 ms apart for 120 Hz.

This is why vsync introduces input lag, because you can't buffer frames and still be immediately responsive to changes.

But there's a trick to avoid that too, and enjoy perfectly even frames without input lag.

Trick? You should have told us!

Although, as I recall, you have discussed this issue before, and you suggested capping the framerate below your monitor's refresh rate, which would solve the input lag.

After you said this I did an experiment in TF2. I enabled V-sync, triple buffering and all, and when the framerate was hitting 60 and above, I had input lag. When I capped the framerate at 59, the input lag went away. I also tested this same method in the Jedi Knight games, and got the same result: Input lag with V-Sync and the framerate reaching 60+, no input lag when framerate capped below refresh rate.
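The cap-below-refresh workaround amounts to a simple frame limiter. A minimal sketch (the render step is a stand-in, and 59 FPS mirrors the TF2 test above):

```python
import time

CAP_FPS = 59                  # just under a 60 Hz refresh, as in the TF2 test
FRAME_BUDGET = 1.0 / CAP_FPS  # ~16.95 ms per frame

def render_frame():
    pass  # stand-in for the game's actual per-frame work

def run(frames):
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        # Sleep off whatever remains of this frame's time budget, so the
        # GPU never races ahead of the display and fills vsync's queue:
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

run(5)  # five capped frames take at least 5/59 of a second
```

Because frames never pile up in the buffer queue, the most recently rendered frame is always close to the player's latest input, which matches the "no input lag below refresh" result reported in the experiment.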
 
Lol, they fixed that. But it's still hard to enjoy a ruined game. Should have gotten it for my 360 instead (sad) so I could have gotten some of my money back. Anyway, I agree with OILFIELDTRASH. I enjoy my fast single GPU with no stutter. Tried xfire 6870s and I noticed micro-stutter almost immediately. Sorry if that bothers you. Tease me about how I can't run 3 monitors on 1 card or something if it makes you feel better 😀...................🙄

I meant more along the lines that you need multi-GPU DX11 at 1080p or above. And I'm not teasing you, I already made fun of the dual GPUs I own...

Dragon Age 2 also has the worst AA implementation I've ever seen.

On a static scene in a room with a lot of candles, I had 23 fps at 8xAA, 34fps at 4xAA, 43fps at 2xAA, and 57 with 0xAA, and this is with dual 6970s. The fans were ramping up and the card was going to 95C trying to render this POS engine.

All the reviews I've seen online cap their benchmarks at 4xAA. The performance hit going to 8xAA is obscene.

[Benchmark chart: 2560x1600_VH_update.png]
 
Trick? You should have told us!

Although, as I recall, you have discussed this issue before, and you suggested capping the framerate below your monitor's refresh rate, which would solve the input lag.

After you said this I did an experiment in TF2. I enabled V-sync, triple buffering and all, and when the framerate was hitting 60 and above, I had input lag. When I capped the framerate at 59, the input lag went away. I also tested this same method in the Jedi Knight games, and got the same result: Input lag with V-Sync and the framerate reaching 60+, no input lag when framerate capped below refresh rate.
That makes so much sense that I'm /facepalming that I didn't think of it. Nice! :thumbsup:
 
if you WANT to notice something, you'll notice it. if you want to enjoy your game rather than nitpick at microstuttering, then you won't even know it's there. just like those dots in movies at the theater. when i get bored of a movie i start to notice all those dots that flash to entertain myself. when i'm enjoying a movie, i don't ever see them.
 
if you WANT to notice something, you'll notice it. if you want to enjoy your game rather than nitpick at microstuttering, then you won't even know it's there.

Great advice, next time I injure myself I will just "want to not feel the pain" and it will go away.
 
if you WANT to notice something, you'll notice it. if you want to enjoy your game rather than nitpick at microstuttering, then you won't even know it's there. just like those dots in movies at the theater. when i get bored of a movie i start to notice all those dots that flash to entertain myself. when i'm enjoying a movie, i don't ever see them.

One may notice it because it's there and it exists.
 