Is this micro stuttering?


nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Kyle Bennett uses plots, but the sampling rate must not catch the peaks and valleys?

Perhaps a new criterion needs to be developed to test these cards, as the current method just doesn't do them justice.

It's like gauging how a loudspeaker is going to sound by using its sensitivity and frequency response only. :D

Yeah, Kyle's graphs are actually among the best in terms of showing fps. They at least let you see the distribution of the max and min throughout the benchmark. He doesn't show the interval between frames, though. No one does, AFAIK.

Most sites stopped talking about micro stutter and frame interval after the 4870 X2 launched and micro stutter was declared to have been magically resolved by ATI. Most of the internets just took this proclamation at face value because they don't know any better, and the topic sort of disappeared from the public eye. The issue still existed with the 4870 X2, and I proved that frame intervals were more erratic on the 4870 X2 than they were on a GTX 280 in the write-up I linked to earlier in this thread.

The thing you have to understand about micro stutter is that it requires a number of things to converge before it bothers people.

1) an erratic interval between frames
2) the fps have to be low enough for the largest interval to be perceived as "slow" to the observer - anything running at 120+ fps is most likely going to be smooth regardless of the interval between frames
3) 1 and 2 have to happen on a somewhat regular basis
4) You have to notice it. I don't think everyone does, but some people don't notice AA/AF either.

Since graphics cards (especially dual-GPU setups) have generally been outpacing game requirements for the most popular games over the past few years, the only place you would really notice micro stutter would be in very demanding games or benchmarks. Chances are that people playing MW2 or anything based on the Unreal 3 engine will never have low enough fps for an erratic frame interval to manifest itself as micro stutter, so they don't care (rightfully so).

I haven't run a multi-GPU setup since I had dual GTX 280s, so I cannot say for certain whether the issue has been resolved with the 5970. You, on the other hand, have access to both a GTX 480 and a 5970... Some quick benchmarks with FRAPS with the interval option enabled could confirm whether the issue you are seeing is micro stutter or not. It might not be; it could just be crappy performance...
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Games fly on both - it's just that the 480 gives the better, smoother presentation. That's what I am noticing. Vantage scores, 3DMark, etc. are all higher on the 5970. It's like a speaker engineer telling me the sound is tuned right and my ears saying it's not. (I'm in the biz of pro audio and this actually does happen, and this reminds me of it very much!)

There's really nothing wrong with the 5970 per se; it's just that the 480 seems better. Next I suppose is to add more 480s and see what happens. ;)
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
omg! That's almost a car analogy. Watch it Rubycon!

So does microstutter appear to be noticeable on single-GPU cards? If so, what's causing it? Too-narrow pipelines? Some bottleneck on the memory side?

what?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
1) Regardless of how you define it, it's there and very annoying!
2) I notice it on EVERYTHING not just benchmarks - the benchmarks are used to tell which is the fastest, right? Like a speedometer in a car tells you how fast you go so you don't get pulled over. ;)

Well, you seem to be sensitive to whatever it is that is causing your issue. Since you're in the somewhat rare position to have both cards in your possession at the same time, I say damn the benchmarks and conventional "wisdom" of the internet, and stick with the card that makes YOU happy.

Someone posted my GTX 280 vs 4870 X2 data on the rage3d forums, and I took a beating for my efforts. I don't buy gear to make some douche bags on rage3d happy about their purchases, so I stuck with the card that gave me the better experience which was the GTX 280.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
omg! That's almost a car analogy. Watch it Rubycon!

So does microstutter appear to be noticeable on single-GPU cards? If so, what's causing it? Too-narrow pipelines? Some bottleneck on the memory side?

what?

I don't think so. If we go back further to AGP, I could regale you with lots of issues related to chipsets, IRQs, sidebands, you name it. At least there were workarounds for those.

I notice lots of driver updates seem to include hotfixes for specific titles. Perhaps they are fine-tuning them? I thought DirectX was supposed to fix that issue altogether. ;) (fix one issue and everything is fixed - yeah right!)
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Games fly on both - it's just that the 480 gives the better, smoother presentation. That's what I am noticing. Vantage scores, 3DMark, etc. are all higher on the 5970. It's like a speaker engineer telling me the sound is tuned right and my ears saying it's not. (I'm in the biz of pro audio and this actually does happen, and this reminds me of it very much!)

There's really nothing wrong with the 5970 per se; it's just that the 480 seems better. Next I suppose is to add more 480s and see what happens. ;)

Have you run it with vsync/triple buffering? I'd like to know if you still notice it then.
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
Curious. I've never tried my 5770 Crossfire in any of the benchmarks (who cares imho lol), but in Crysis Warhead I never notice any stuttering; it looks perfect. Then again, I have vsync on, so hmmmmmms lol.

Still, I think the thing you should figure out is: do you notice it in any games, and if so, which ones? If not, then use whichever card you feel is giving the better gameplay experience. It sounds like you like the 480 better, so use it...
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
I'll try the triple buffering "trick" in a bit. The HD5970 is in a system that's being used for something where I just cannot jump in and start messing around.

I do have another 5970 still in the box but I don't want to pull my 480 out now either. :hmm:
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I found that micro stuttering occurred in Vantage and Bad Company 2. I found out that removing one of the two Crossfire bridges diminished the effect to the point that it rarely occurs. The counter may indicate it's running at 40fps, but it feels like 20 or less, which is annoying.

I use D3DOverrider, and it is the best thing ever created for forcing Triple Buffering in games: it increases the minimum frame rates and gives you a silky smooth frame rate with no noticeable input lag. Without it, V-Sync will give you input lag issues.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Triple buffering should always be used with Vsync if possible. See Derek's excellent article on the subject. From what I remember it's only AFR that exhibits this problem, so if there were a way to force SuperTiling or Scissor mode in Crossfire (which there isn't, AFAIK) it'd be interesting to see the results. Can't you force the above non-AFR modes in nHancer with SLI, though?
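The arithmetic behind why triple buffering matters can be sketched with a toy model (my illustration, not taken from the article): with plain double-buffered vsync, a finished frame has to wait for the next vblank, so the frame interval gets rounded up to a whole refresh period.

```python
import math

def double_buffered_vsync_fps(render_ms, refresh_hz=60.0):
    """Toy model of double-buffered vsync: a finished frame waits for the
    next vblank, so its interval rounds up to a whole refresh period."""
    period_ms = 1000.0 / refresh_hz
    interval_ms = math.ceil(render_ms / period_ms) * period_ms
    return 1000.0 / interval_ms

# A frame that renders in 16 ms makes every 60 Hz vblank and runs at 60 fps;
# at 17 ms it misses one vblank per frame and the rate halves to 30 fps.
# Triple buffering adds a back buffer so the GPU keeps rendering instead of
# stalling, which avoids this hard quantization.
```

That hard 60-to-30 step on a one-millisecond slip is exactly the kind of jump an erratic multi-GPU frame interval keeps tripping over.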

Someone should have done that years ago. Attention Derek, BFG or Appopin if you are reading this thread- this would be a unique article :D
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Triple buffering should always be used with Vsync if possible. See Derek's excellent article on the subject. From what I remember it's only AFR that exhibits this problem, so if there were a way to force SuperTiling or Scissor mode in Crossfire (which there isn't, AFAIK) it'd be interesting to see the results. Can't you force the above non-AFR modes in nHancer with SLI, though?

Someone should have done that years ago. Attention Derek, BFG or Appopin if you are reading this thread- this would be a unique article :D

At least it can be forced with Crossfire using the ATi CF Xtension; it allowed me to force Crossfire support in Assassin Creed 2, which doesn't have good Crossfire scaling. The AFR method didn't work reliably. Scissor mode showed much better results with a great minimum frame rate, but never exceeded 60% GPU usage even with 24x Edge Detect Anti-Aliasing, while SuperTiling maxed my GPU usage and showed a more consistent frame rate than Scissor mode (aka Split Frame Rendering). Assassin Creed 2 is a severely CPU-bound game with a sickening frame rate cap of 62fps, which is awful with a 75Hz refresh rate.
 

CP5670

Diamond Member
Jun 24, 2004
5,512
589
126
Assassin Creed 2 is a severely CPU-bound game with a sickening frame rate cap of 62fps, which is awful with a 75Hz refresh rate.

I'm not sure what engine that game uses, but a lot of UE3 games have this problem. In those games, it's caused by frame rate smoothing being on by default, and it's easy to disable in the .ini files.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
At least it can be forced with Crossfire using the ATi CF Xtension; it allowed me to force Crossfire support in Assassin Creed 2, which doesn't have good Crossfire scaling. The AFR method didn't work reliably. Scissor mode showed much better results with a great minimum frame rate, but never exceeded 60% GPU usage even with 24x Edge Detect Anti-Aliasing, while SuperTiling maxed my GPU usage and showed a more consistent frame rate than Scissor mode (aka Split Frame Rendering). Assassin Creed 2 is a severely CPU-bound game with a sickening frame rate cap of 62fps, which is awful with a 75Hz refresh rate.

Wow, I just googled the 'Crossfire Xtension' and that is one sweet application; I'm surprised I haven't heard more about it on AT. How do the other methods work with other games? How are you measuring GPU usage: is it combined usage (both GPUs), or individual usage per GPU? I guess lower utilisation makes sense with Scissor, as the screen is split 50/50 and one half may have more complex geometry than the other, so one GPU is making full use of its resources while the other perhaps sits idle for some cycles as it completes the less complex half of the screen quickly.

Supertiling should give a much more even distribution of GPU load in a given scene; it's a shame these differing modes are not explored more in reviews.

Thanks for the info.
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
At least it can be forced with Crossfire using the ATi CF Xtension; it allowed me to force Crossfire support in Assassin Creed 2, which doesn't have good Crossfire scaling. The AFR method didn't work reliably. Scissor mode showed much better results with a great minimum frame rate, but never exceeded 60% GPU usage even with 24x Edge Detect Anti-Aliasing, while SuperTiling maxed my GPU usage and showed a more consistent frame rate than Scissor mode (aka Split Frame Rendering). Assassin Creed 2 is a severely CPU-bound game with a sickening frame rate cap of 62fps, which is awful with a 75Hz refresh rate.

That's pretty cool that scissor mode can be forced. Was it a noticeable improvement over running in single card mode? (ie. estimate 30-40% scaling?)

I've made a thread about the program. Thanks for mentioning it.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Yes, it is microstuttering... I've always been affected by it with multi-GPU, but once you enable vsync it cleans up nicely. The trick is that some programs don't like vsync (Dead Space), or you sometimes have to use nHancer (Clear Sky) to be able to use it.

I don't know how some people find it unnoticeable. Anyhow, I always end up getting Crossfire or SLI anyway. The benefits outweigh the drawbacks.

Even with triple-buffered vsync you can tell there is mouse lag. I enabled vsync in L4D after Anand's article and immediately had to change back; it simply caused my mouse response to differ slightly and made it much more difficult to place accurate shots.

I only use vsync with non fps games.

Microstutter is the reason I went from two GTX 285s to one 5870. I do not plan on using multi-GPU again personally; maybe in a few years they'll have solutions to these issues.
 

jimhsu

Senior member
Mar 22, 2009
705
0
76
Microstuttering is basically one specific instance of the minimum frame rate problem (which is common to all setups). In every case, something that runs at a constant 30 fps will appear smoother than something that averages 60 fps but spikes down to 15, even though the latter has the higher average frame rate. The minimum frame rate (i.e. the maximum frame latency) would be the more relevant measurement to report in benchmarks.
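That point is easy to show with numbers (illustrative traces, not measured data; `fps_stats` is a made-up helper): a run that averages 60fps but hitches to 15fps every tenth frame reports double the average of a locked 30fps run, while its worst frame is twice as slow.

```python
def fps_stats(intervals_ms):
    """Average fps and 'minimum fps' (the slowest single frame) computed
    from a list of frame intervals in milliseconds."""
    avg_fps = 1000.0 * len(intervals_ms) / sum(intervals_ms)
    min_fps = 1000.0 / max(intervals_ms)  # worst frame, as an fps figure
    return avg_fps, min_fps

# A locked 30 fps trace: every frame takes 33.3 ms.
steady = [1000.0 / 30.0] * 120
# A trace that averages 60 fps but hitches: nine 11.1 ms frames
# followed by one 66.7 ms frame (a 15 fps spike), repeated.
spiky = ([1000.0 / 90.0] * 9 + [1000.0 / 15.0]) * 12
```

`fps_stats(steady)` comes out to roughly (30, 30) and `fps_stats(spiky)` to roughly (60, 15): an average-only chart ranks the spiky trace twice as fast even though its worst frames feel half the speed of the steady one.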
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Wow, I just googled the 'Crossfire Xtension' and that is one sweet application; I'm surprised I haven't heard more about it on AT. How do the other methods work with other games? How are you measuring GPU usage: is it combined usage (both GPUs), or individual usage per GPU? I guess lower utilisation makes sense with Scissor, as the screen is split 50/50 and one half may have more complex geometry than the other, so one GPU is making full use of its resources while the other perhaps sits idle for some cycles as it completes the less complex half of the screen quickly.

Supertiling should give a much more even distribution of GPU load in a given scene; it's a shame these differing modes are not explored more in reviews.

Thanks for the info.

I monitored the GPU usage individually per GPU using ATI Tray Tools. Using Scissor mode, each GPU averaged only about 60% usage. For some reason the per-GPU usage with Split Frame Rendering didn't vary by much. Using SuperTiling maxed the GPU usage at 99%.

I'm not sure what engine that game uses, but a lot of UE3 games have this problem. In those games, it's caused by frame rate smoothing being on by default, and it's easy to disable in the .ini files.

As far as I know, it uses a proprietary engine. The UE3 engine scales very well with a dual core, with only a very slight improvement going quad; Assassin Creed 2 scales incredibly well with a quad core.

That's pretty cool that scissor mode can be forced. Was it a noticeable improvement over running in single card mode? (ie. estimate 30-40% scaling?)

I've made a thread about the program. Thanks for mentioning it.

The game was so CPU bound that I couldn't find a huge improvement in frame rate. Using 24x Edge Detect Anti-Aliasing at 1280x1024, the game felt choppy, with slowdowns dipping into the mid 20s at the minimum frame rate and an average of 33fps. Going to Crossfire allowed me to have a minimum frame rate of 33fps with a maximum of 62fps, which is the limit of the game's frame rate cap. The performance difference between SuperTiling and Scissor mode was very hard to notice; SuperTiling offers slightly higher minimum frame rates at the expense of huge GPU usage, near 99% on each GPU, compared to the 60% average using Scissor mode.