Microstutter: SLI/CrossFire users, give us your experience please.

In another thread we were having a civil discussion about microstutter on the older series of cards vs the newer series.

One conclusion that we came to, and seemed to all agree upon, is that microstutter is real and will never completely go away with the current AFR (alternate frame rendering) method.

The goal of this thread is to determine whether the newer series of cards, like the GTX 400/500 series and the HD 6000 series, still have the problem, and whether it has gotten any better since the last generation.

If you have any CrossFire/SLI setup, new or old, your input is much appreciated.

Please list your current setup, resolution, the games you are/were playing, your average FPS, whether you noticed microstutter, and how bad it was if you did notice it.

Thanks for your input.
 
Current system: 2x GTX 570 paired with an i7 920 @ 4GHz, 6GB DDR3-1600.

Have not yet noticed microstutter in the system. I do tend to change the settings to always stay over 60 fps.

In AvP with Surround I sometimes drop to 50, but usually only momentarily.

Previously used

2x GTX 285 paired with a Q9550 @ 3.4GHz, 4GB DDR3.

Noticed microstutter in only one game: STALKER.

The framerates fluctuated quickly in the game which I believe exacerbated the problem.

My theory is that it is most noticeable when your frame rate is lower. The more frames you have per second, the closer they must be to an equal interval, and games that put a quickly varying strain on the GPU, swinging from 100 FPS to 40 FPS, can cause it to become more noticeable.
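That theory can be put in rough numbers. A toy sketch (all timestamps and names below are invented for illustration) of how the same average FPS can hide very uneven frame delivery, which is the classic AFR microstutter signature:

```python
# Illustration only: frame timestamps in milliseconds.
# "smooth" delivers frames at even 25 ms intervals; "stuttery" alternates
# 10 ms / 40 ms intervals, as two AFR GPUs pacing badly might.

def frame_times(timestamps):
    """Intervals between consecutive frames, in ms."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def avg_fps(timestamps):
    """Average frames per second over the capture."""
    times = frame_times(timestamps)
    return 1000.0 / (sum(times) / len(times))

def stutter_ratio(timestamps):
    """Longest frame interval divided by shortest; 1.0 = perfectly even pacing."""
    times = frame_times(timestamps)
    return max(times) / min(times)

smooth = [0, 25, 50, 75, 100, 125, 150, 175, 200]
stuttery = [0, 10, 50, 60, 100, 110, 150, 160, 200]

print(avg_fps(smooth), stutter_ratio(smooth))      # 40.0 FPS, ratio 1.0
print(avg_fps(stuttery), stutter_ratio(stuttery))  # 40.0 FPS, ratio 4.0
```

Both captures average 40 FPS, but the second one effectively shows each pair of frames back-to-back and then pauses, which the eye reads as stutter even though an FPS counter looks identical.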

Perhaps when I get around to installing Metro 2033 I will be able to comment more on the new system.
 
I have a very positive experience as I stated in other threads.
That does not mean I always think SLI is perfect or the right choice in every case.
Only full-screen mode allows consistently 'correct' functionality.
Your experience is somewhat dependent on the dual-GPU profile and driver release. Microstutter can go away in some games with some driver releases.
There is more complexity in everything: heat released, power used.

Performance checking, whether that means using monitoring tools, on-screen displays, or switching back and forth to a single card while noting your experience, is all added work. I got my cards at different times to suit my budget; I could not justify buying a 'single' 400 dollar card 🙂. Separate cards for a couple of reasons, and because I can.

When it all works, I find the experience smooth, without hitching. It takes extra work at times to make sure the second card does not drop clocks, and to deal with other behavior unique to having two GPUs.

Despite all the positive comments, I would trade the added performance, and what I paid for the two cards, for a single GTX 570, which would cost me less.
But I did not actually pay for both cards in full; one was a gift, and I got them at different dates. So it works for me.
Also, with dual NVIDIA cards I can go to three monitors if I ever choose (probably not), and there is plenty of GPU overhead to do 3D (?).

I believe it helps to synchronize your cards' clock speeds (if they are different) with Afterburner or another utility.
 
Phenom II 940 @ 3.4GHz, 8GB DDR2-1066, 180GB Vertex 2, two Radeon HD 5850s, 27" full-HD 1ms Samsung.

With my setup the only possible sources of stuttering are poorly optimized games and drivers. Some games like Fallout New Vegas just stutter a lot on anything but maybe the most extreme systems. In such cases you might be surprised at how much of what you think is microstutter is actually the game paging the HDD. The only problems I've had with microstutter have been with newly released games and video cards, both of which are usually patched within a few weeks. All I can say is if you buy a newly released game or video card, much less crossfire them, be prepared to deal with issues.
 
If you are playing games where you can get > 70 FPS, I find you won't notice it (unless you are looking for it). If you are getting < 50, it will be very pronounced and gives me a headache. The problem is worse if you are using a 120Hz monitor.
 
Noticed some microstutter in Black Ops with my older card, an 8800 GS. I don't know why, but it went away when I upgraded to a quad core, and now I haven't noticed it with the HD 6850.

I was thinking low RAM on the 8800 GS, but who knows.
 
How is that a problem, besides the ~5% performance impact?


If we assume everything is configured correctly and the rig is stable: a P55 x16/x4 setup runs 16 lanes through the PCIe controller in the CPU, and the other 4 lanes through the P55 chipset. It's territory not addressed very often, but I did read that it could introduce added issues beyond the actual bandwidth difference (lost the link).
A reviewer theorized it might add latency. That setup might behave differently from the tests a couple of hardware sites ran, where they dialed the lanes back at the video card slot by taping the gold contact pads.
This is probably more complex than it even sounds. What if the user has his rig overclocked and the PCIe controller is under-volted, or has other problems?

I'm theorizing that not all x16/x4 setups are equally good or bad.

Any motherboard's multi-GPU performance can also be affected by BIOS updates.

Another variable in why some might have subpar experiences.
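For reference, the raw bandwidth side of an x16/x4 split is easy to put numbers on; the latency question is separate. A back-of-envelope sketch (the function is illustrative, not from any tool):

```python
# PCIe 1.x and 2.0 use 8b/10b encoding, so 10 bits go on the wire for every
# usable byte: usable GB/s per direction = lanes * (GT/s) / 10.

def pcie_bandwidth_gbps(lanes, gt_per_s):
    """Usable GB/s per direction for a given lane count and transfer rate."""
    return lanes * gt_per_s / 10.0

print(pcie_bandwidth_gbps(16, 5.0))  # x16 at PCIe 2.0 (5 GT/s):  8.0 GB/s
print(pcie_bandwidth_gbps(4, 5.0))   # x4  at PCIe 2.0 (5 GT/s):  2.0 GB/s
print(pcie_bandwidth_gbps(4, 2.5))   # x4  at PCIe 1.x (2.5 GT/s): 1.0 GB/s
```

So on paper the chipset slot has a quarter (or, if its lanes run at the 2.5 GT/s rate, an eighth) of the bandwidth of the CPU's x16 slot; whether the extra hop through the chipset also adds meaningful latency is the part the taped-contact tests can't show.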
 
Who are you quoting? What a pile of sh1t.


This manner of posting is counter-productive and inflammatory. Please do not post in this manner.

Please familiarize yourself with the AnandTech Forum Guidelines:
We want to give all our members as much freedom as possible while maintaining an environment that encourages productive discussion. It is our desire to encourage our members to share their knowledge and experiences in order to benefit the rest of the community, while also providing a place for people to come and just hang out.

We also intend to encourage respect and responsibility among members in order to maintain order and civility. Our social forums will have a relaxed atmosphere, but other forums will be expected to remain on-topic and posts should be helpful, relevant and professional.

We ask for respect and common decency towards your fellow forum members.

Moderator Idontcare
 
In my experience it's very game/engine dependent. Games with noticeable microstutter:

Crysis/Warhead
World of Warcraft
Call of Duty: Black Ops (this may be an NVIDIA driver issue with SLI though; there are many complaints about this game on NVIDIA's forums)
Stalker series

I notice it in some other games as well, but these are the standouts.
 
I will chime in, even though what I'm going to say I have already said in a few threads.

I notice no microstutter in my current overclocked 460 SLI setup. My framerates are always above 60 in 95% of the games I play. I also play with Vsync forced through the drivers and triple buffering forced through third-party applications, since the drivers will not allow it; even though there is an option for it, the driver option does nothing.

I have noticed microstutter on an older 8800-series SLI setup a friend asked me to help smooth out. Microstutter was unavoidable even with vsync and triple buffering, but the framerates were lower, and I think this was the issue. I never did get it to smooth out, and my friend was 100% against the idea of lowering details/AA to get better framerates, so he is living with the issue.

I have spent more than 30 hours gaming on a buddy's 470 SLI setup and notice no microstutter with this setup either, and he is on a 120Hz display. But he has his system set up so 90%+ of his games run at 120 FPS or above; he lowers details as needed to reach this speed. He also has vsync and triple buffering enabled. I will ask him if he lowered details/AA for speed because of microstutter and reply when I get a response.

my .02
 
Using the rig in my sig. No noticeable microstutter. Resolution 1920 x 1200, vsync is set to off on all my games, triple buffering enabled, cards both set to 950/1100.
I currently play CoD-BO, Just Cause 2, Borderlands, Crysis, TF2, the older CoD games, lotsa Steam games, UT3, etc. I won't list them all. Trying the Crysis 2 leak tomorrow perhaps.
 
I would comment about my 6850s, but they don't even work; I'm actually getting negative scaling when CF is enabled, so 😛
 
Using the rig in my sig. No noticeable microstutter. Resolution 1920 x 1200, vsync is set to off on all my games, triple buffering enabled, cards both set to 950/1100.
I currently play CoD-BO, Just Cause 2, Borderlands, Crysis, TF2, the older CoD games, lotsa Steam games, UT3, etc. I won't list them all. Trying the Crysis 2 leak tomorrow perhaps.
Triple buffering does nothing if you are not using vsync. Also, unless you are using a third-party app, triple buffering is only applied to OpenGL games anyway.
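Why vsync and buffer count interact can be shown with a toy model (not a real driver; the refresh rate and render times below are made up). With vsync and only two buffers, a frame that misses one refresh has to wait for the next, so 60Hz snaps down to 30 FPS; a third buffer lets the GPU keep rendering instead of stalling:

```python
import math

REFRESH_HZ = 60
T = 1000.0 / REFRESH_HZ  # ms per refresh interval

def fps_double_buffered(render_ms):
    # With vsync + double buffering, the GPU stalls until the next vsync
    # after each frame, so the effective interval rounds up to a whole
    # number of refreshes.
    return 1000.0 / (math.ceil(render_ms / T) * T)

def fps_triple_buffered(render_ms):
    # With a third buffer the GPU never stalls; on average the display
    # picks up frames at the render rate, capped at the refresh rate.
    return min(1000.0 / render_ms, float(REFRESH_HZ))

print(fps_double_buffered(20))  # 20 ms/frame at 60Hz: snaps to ~30 FPS
print(fps_triple_buffered(20))  # same workload with triple buffering: 50 FPS
```

With vsync off there is no refresh boundary to wait on in the first place, which is the point above: the extra buffer only buys you something when vsync is the thing causing the stall.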
 