I have never been able to perceive this microstuttering stuff... I'm not sure if that means my eyes are terrible (they probably are) or it is something that doesn't bother me (or doesn't, gasp, exist, lol).
Quote:
The graphs show the frame number and the time (in ms) taken to render it. Microstuttering does look worse because even though the framerate is higher, it's jumping all over the place, which can be annoying to some and even give headaches to others. There's a reason I'm driving my 2560x1600 with only a single 5870. Sure, I could run out and buy an HD5970 if I wanted 40+ FPS in Crysis, but the testing confirms my suspicion that the stuttering is still there. After dumping my GTX295 and going back to a single card with my 5870, I can definitely say I'm not going back to dual GPU until the technology is better. Until then, I'll just buy the fastest single GPU available and overclock the living hell out of it.

Can someone explain this to me, if I am incorrect:
As I understand it, microstuttering is supposed to be a result of AFR producing frames at an inconsistent rate: two frames very close together, a relatively large gap, then another pair of frames, such that the FPS (frames over total time) is "doubled" in the best case, but the perception, in the worst case, is of no FPS increase over a single card.
It isn't at all like what the forum thread linked in the OP goes on about, which is plain stuttering. The "perception" of microstuttering is that it adds nothing, not that it looks "worse," jumpy, or anything like that.
The idea that it looks worse, or makes one's head hurt (beyond the possible motion sickness it could cause), seems to be a misunderstanding of regular stuttering, which is made worse by a lack of dual-GPU optimizations.
To actually show microstuttering, the graph would have to calculate the FPS for every single frame, based on the time between it and the last (I suppose FPS = 1/(time - last frame time)), or just display the time gap between each frame. I cannot see the graphs from this computer; is that what they are doing?
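For illustration, here is a minimal sketch of that calculation in Python. The timestamps are made-up numbers chosen to imitate the AFR pattern described above (pairs of frames close together, then a larger gap); a real log would come from a frametime capture tool.

```python
# Given per-frame timestamps (in seconds), compute the gap to the previous
# frame and the instantaneous FPS = 1 / (time - last frame time).

# Hypothetical timestamps for 8 frames of an AFR setup: two frames close
# together, a large gap, then another pair, and so on.
timestamps = [0.000, 0.005, 0.040, 0.045, 0.080, 0.085, 0.120, 0.125]

for i in range(1, len(timestamps)):
    gap = timestamps[i] - timestamps[i - 1]   # time between frame draws
    fps = 1.0 / gap                           # per-frame ("instantaneous") FPS
    print(f"frame {i}: gap = {gap * 1000:5.1f} ms, FPS = {fps:6.1f}")

# Frames over total time -- the number a benchmark would report -- looks
# much better than the long gaps would suggest:
elapsed = timestamps[-1] - timestamps[0]
print(f"reported average: {(len(timestamps) - 1) / elapsed:.1f} FPS")
```

Run on these invented numbers, the per-frame FPS alternates between 200 and about 29, while the reported average is 56; that mismatch between the reported and perceived rates is exactly the effect described above.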
This was probably the most insightful article on the net about micro-stuttering. Take some time to read and digest it:
http://www.rage3d.com/reviews/video/ati4870x2cf/index.php?p=2
Oh, and the problems I would have were often solved by turning off SLI.
You don't have to make another post one minute after the other; just use the edit feature, especially since you were not even replying to anyone.
Back on topic: I have always been a fan of using a single fast card. There are just too many issues, this microstuttering included, for SLI/CrossFire to be worth it to me.
Quote:
It sounds to me like you are confusing the terms "framerate" and "FPS." While sometimes used interchangeably, they are not the same. FPS is a specific method of reporting framerate, but framerate itself is as depicted on the graphs posted at XS, and with microstuttering it is jumping all over the place.

But the framerate isn't jumping all over the place in any way that we can perceive. Unless the FPS is abhorrently low, you won't be able to tell that it is changing at all; it will just look slower than it is reported as. It will cycle from "high FPS" to "low FPS" every 1/20 of a second if you get anything remotely playable.
Anything that 'looks' like the FPS is jumping is not microstuttering, it is stuttering, which most will agree is worse on dual GPU (for many reasons, optimizations mainly). However, that is an entirely different issue, and is not a fundamental flaw with AFR.
If the FPS were rather low, microstuttering might make it look "funny" on a person-to-person basis, but if it looks like it has sections of slow and fast FPS, that is not microstuttering at all.
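To put rough, purely illustrative numbers on that claim (the gap values here are assumptions picked to give a reported 40 FPS):

```python
# Sketch of the point above: with alternating short/long gaps, one full
# "high FPS / low FPS" oscillation lasts short + long seconds -- far too
# brief to perceive as distinct slow and fast sections.

def afr_cycle(short_gap, long_gap):
    reported = 2.0 / (short_gap + long_gap)   # frames over total time
    perceived = 1.0 / long_gap                # pacing set by the long gap
    period = short_gap + long_gap             # one high/low oscillation
    return reported, perceived, period

reported, perceived, period = afr_cycle(short_gap=0.005, long_gap=0.045)
print(f"reported: {reported:.0f} FPS, perceived: ~{perceived:.0f} FPS, "
      f"oscillation period: {period * 1000:.0f} ms")
```

With a 5 ms gap followed by a 45 ms gap, the reported rate is 40 FPS, the perceived rate is closer to 22 FPS, and the whole oscillation repeats every 50 ms, i.e. every 1/20 of a second, matching the figure above.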
Quote:
Despite what goes on under the hood (see the article SirPauly posted), the manifestation of microstutter as perceived by the user is the delta from the mean frametimes. The video card setup used makes no difference here. In fact, there are even measurable deltas in single-card configurations, it's just not enough to be perceived as microstuttering by most people. Microstuttering is perceived differently from regular stuttering, as you stated, but that doesn't mean it's not perceived, nor does it change what is actually happening.

As for the graphs (I'm at home now and can look at 'em), they don't really show anything. The time taken between the draws of each frame is where we would see microstuttering. All the graphs show is that each frame is of a new scene (duh), and they mean even less without showing the non-CrossFire comparison. Microstuttering is a phase discrepancy between the two cards, which would not be seen in the time to render a frame (though the time to render is part of it).
Edit: Don't get me wrong, MS is in theory a very real thing, just like any phase distortion; mind you, I am one of the many that can't seem to see it. However, it is important not to lump it in with regular old stuttering, which many seem to do. Obviously there are benefits and negatives to multi-GPU. I'd like to see a more thorough test, with counter-examples (with just one card).
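If anyone wants to put a number on the "delta from the mean frametimes" idea from the quoted post, here's one possible way to compute it. The two traces below are invented for illustration: both average roughly 16.7 ms per frame (about 60 FPS), but one alternates short and long gaps the way AFR does.

```python
# One possible way to quantify "the delta from the mean frametimes":
# take the frame-to-frame gaps, compute the mean, and average how far
# each gap deviates from it. A steady single card should show tiny
# deltas; AFR microstutter shows large alternating ones.

def mean_delta(frametimes_ms):
    mean = sum(frametimes_ms) / len(frametimes_ms)
    avg_delta = sum(abs(t - mean) for t in frametimes_ms) / len(frametimes_ms)
    return mean, avg_delta

single_gpu = [16.5, 17.1, 16.8, 17.0, 16.6, 17.2]   # steady pacing
dual_gpu_afr = [5.0, 28.0, 6.0, 27.0, 5.5, 28.5]    # alternating pacing

for name, trace in (("single GPU", single_gpu), ("dual GPU AFR", dual_gpu_afr)):
    mean, avg_delta = mean_delta(trace)
    print(f"{name}: mean = {mean:.1f} ms, avg |delta| = {avg_delta:.1f} ms")
```

Both traces report roughly the same average framerate, but the single-card deltas come out to a fraction of a millisecond while the AFR ones are around 11 ms; that difference, not the average itself, is what the delta-from-mean measurement captures.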
I'm not sure how what you just said is different from what you quoted. Perhaps I'm missing something.