Discussion in 'Video Cards and Graphics' started by Final8ty, Jan 2, 2013.
Yes, it seems likely that this only affects a few games and can hopefully be fixed, in some cases by driver updates and in others by simpler means. I'm not sure where anyone has accused this issue of being more widespread than the TechReport article suggests?
While I'm sure AMD didn't deliberately sacrifice smoothness, I think there are some questions about how well-resourced their driver team is and how long it takes them to optimise the software.
Here you had the chance to show the way, for example by showing the numbers and graphs that correspond to what you perceive as annoying, but instead you add just one more personal anecdote to the collection...
These are the frame times I obtained for the scene I reproduced as closely as possible from TR's microstuttering video. The frame times are Gaussian distributed, as you would expect from microstuttering, with a mean of 16.68 ± 0.79 ms. Put another way, the microstuttering is 4.7% of the mean value. That contradicts TR's report, so I would very much like to know how they got such bad results. If you do not believe me when I say that I cannot see the microstuttering, maybe these graphs and numbers will convince you? Or do you still think this is irrelevant?
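For anyone who wants to get the same kind of number from their own logs, here is a minimal sketch of how a mean ± standard deviation and a "stutter as % of mean" figure can be computed. The frame times below are synthetic placeholders, not my actual data; real values would come from a FRAPS frametimes CSV.

```python
# Quantify microstutter as the standard deviation of per-frame times,
# expressed as a percentage of the mean frame time.
import statistics

frame_times = [16.7, 15.9, 17.4, 16.1, 17.2, 16.5, 16.9, 16.3]  # ms, synthetic

mean = statistics.mean(frame_times)
stdev = statistics.pstdev(frame_times)  # population std dev over the trace
stutter_pct = 100 * stdev / mean

print(f"mean = {mean:.2f} ms, stdev = {stdev:.2f} ms, stutter = {stutter_pct:.1f}% of mean")
```

With a real trace you would read the frame-time column from the CSV instead of hard-coding the list.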
Your signature suggests you're playing this at 1080p, not 2560x1440. Is that the case?
If that is the case then that would explain why you see less microstutter than they do: your card simply isn't being pushed anywhere near as hard. One point the TR guys make in their recent podcast is that they only get visible problems when the card is pushed reasonably hard, and then only in the latest crop of games they tested. The old games they had didn't show the problem but the new ones do. They also said dropping the settings can reduce the problem.
So if it's the case that you're running at 56% of the pixels, it's no surprise the results are different. It's also consistent with what they found.
Hmm, I didn't add any anecdote?
As TR showed, frame latencies differed hugely from area to area. In the area where the 7950 performed poorly, even the 660 Ti showed frames over 50 ms, so I'm guessing you haven't managed to replicate the conditions.
Now hopefully this suggests that the stuttering is relatively rare. Hopefully we will start to see more sites examining this issue. The more evidence that is built up from a wide range of reputable sources the better.
Their initial test was in town, which when they first tested the game many months ago was a decent test of performance. But then they started going elsewhere and found an outdoor area that pushed the cards a lot harder and, after all the driver changes, turned out to be more representative of the game. Town has become an easy test over time, while the scrubland they now test in is harder.
They don't say the problem is rare, but they do say it differs from area to area. I don't remember them saying the 660 Ti does the same thing anywhere in any of their podcasts or in the articles themselves, so I'm not sure what your reference for that is. As far as I know the problems across these games are isolated to the 7950s; they haven't yet shown up on any other cards.
My personal experience with the 7970 is that I saw this problem in the vast majority of games, and that changing settings didn't help much. It was still the case with the first 12.11 beta, so I still think there is a wider problem, but no one has reported on that yet.
If that is the case that the high resolution is the problem that seems kinda ironic. Everyone assumed that the 660 Ti's 192 bit bus would cripple it at higher resolutions, but maybe it's actually doing it a favour?
Potentially, by holding the card back from completing super-fast sub-10 ms frames like the 7950 achieves, it also prevents the accompanying slow frames.
From the graphs it seems like the faster you complete an individual frame the worse the stutter is.
...what in the world does this have to do with you?
My post was actually constructive. It highlighted the hypocritical change in agenda of AMD apologists that occurred with the release of the HD 7000 series. The fact that Nvidia "zombies" are all riled up is unsurprising.
It's "zAMDies" DAAMIT
I can't wrap my head around how to pronounce that.
What that graph shows is one frame being rendered faster, which makes the next frame look slower relative to the faster one. If the prior frame had been rendered at the same interval as the others, the next frame would not appear so slow.
For example, the first "faster" frame has a latency of ~5 ms. The next frame is ~24 ms. The mean of the two would be ~14.5 ms, which is just about exactly where all of the other frames lie.
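That pairing can be checked numerically. A minimal sketch, with made-up frame times matching the example (a ~5 ms frame followed by a ~24 ms one inside an otherwise steady ~14.5 ms cadence):

```python
# Synthetic frame times (ms): one fast/slow pair inside a steady cadence.
frame_times = [14.5, 14.5, 5.0, 24.0, 14.5, 14.5]

# Averaging the fast frame with the slow one that follows lands
# right back on the surrounding cadence.
pair_mean = (frame_times[2] + frame_times[3]) / 2
print(pair_mean)  # 14.5
```

In other words, the slow frame isn't extra work appearing from nowhere; it's the time the fast frame "borrowed" being paid back.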
I don't see how this is shameful. Go work in any industry: in general the solution is to stay quiet. Once it reaches a point that the sales teams can't ignore, they go to engineering, which typically begins to work feverishly on the issue. It can take a long time to isolate where the problems are, and my bet is that they've been working on this for at least a few months.
Once you are well on your way to implementing the solution (or some degree of it)... THEN and only THEN do you come forward and talk about it. Otherwise, you are stuck with a "We don't know..." that sticks for a year.
BrightCandle, that was the single most useful post in this thread! I was going nuts over why I could almost reproduce their indoor results, but not this wilderness result. I think you just found out why! In the later tests they did not mention the resolution, so I missed it!
With the resolution in mind, I see that they use 1080p for Hitman: Absolution, so I decided to try that. It is definitely stuttering by default with TR's settings, and the results are not far from what TR reported. However, they run with FXAA on for reasons I cannot imagine. It makes the game very ugly, and keeping everything at Ultra but turning FXAA off (while still using MSAA) not only gives a more visually appealing game but also somewhat smoother performance. I attach the frame distribution for the benchmark using that setting (Ultra, no FXAA, 1080p).
The mean is 8.8 ms. The frame times have a larger standard deviation compared to my Skyrim results, 1.5 ms, but the benchmark also does not give perfectly flat FPS over its span, so there is some innate variance. Visually I see that the frames are not butter smooth, but it is at the level where I have to actively look for it to see it.
Sorry that was not you. Mea culpa.
Can you share the FRAPS frame times CSV? As I say, I am very keen on collecting more and more data on this so we can start to build a picture across a broad range of people and what they can and cannot see stuttering in.
This may be a stupid question, but would going with a second 7970 help or hurt this problem? I've considered the option, but don't want to end up worse than with a single card.
The graph you quoted suggests that the issue can likely be resolved without a decrease in overall framerate.
That is, if it is representative of the overall issue.
Sure! Hitman no FXAA
In another thread I showed the microstutter structure for a couple of cases. A slow frame is always paired with a fast frame, but the interesting thing is that it is periodic: sometimes you have four frames at 16 ms followed by the microstuttering frame pair, and the sequence repeats itself, while other times you have six frames at 16 ms spacing between the slow+fast pairs. Their occurrence is not random, but the magnitudes of these frame times are Gaussian.
Exactly, we all know that the 7950 is capable of perfectly good FPS scores, it is this stutter which is the concern.
Also there seem to be several different varieties of stutter, implying several different causes. Outdoors Skyrim shows this fast-slow pattern, while Borderlands 2 has the distinctive slow-fast-fast-fast stutter. It's hard to tell with other games where we don't have a close up of the graphs, but it seems there are other patterns as well.
It will be fascinating once PCPer have some results to publish with their more advanced testing.
That is a beautiful trace. It shows a lot of microstutter and stuttering without the frames per second changing wildly.
One thing I have been playing with is ways to show the amount of microstutter. So below is a graph of the difference between consecutive frame times sorted from highest to lowest. The ideal picture here would be an empty graph, all 0.
So roughly 1/5 of the frame time differences are above 6 ms, which I am beginning to think is about the threshold for most people, based on the traces I have collected over the last year.
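The graph can be reproduced with a few lines. A sketch using synthetic frame times for illustration (real values would come from a FRAPS frametimes CSV):

```python
# Differences between consecutive frame times, sorted from highest to
# lowest. A perfectly smooth trace would give all zeros.
frame_times = [16.0, 16.5, 8.0, 24.0, 16.2, 15.8, 16.1, 23.0, 9.5, 16.3]  # ms

diffs = sorted(
    (abs(b - a) for a, b in zip(frame_times, frame_times[1:])),
    reverse=True,
)
over_threshold = sum(1 for d in diffs if d > 6.0) / len(diffs)
print(diffs)
print(f"{over_threshold:.0%} of consecutive diffs exceed 6 ms")
```

Plotting `diffs` directly gives the sorted-variance graph described above; the fraction over the 6 ms threshold is the single number I keep an eye on.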
16 ms is a monitor refresh, so anything above that is a pretty big jolt, but even if you start talking about 8 ms swings you end up with the following time pattern of world updates on the CPU:

Time of frame shown: 0, 16, 32, 48, 64
Pattern (swing): +8, -8, +8, -8, +8
World time shown: 8, 8, 40, 40, 72

Each pair of consecutive frames ends up showing the same world time, so even 8 ms is highly destructive to smooth gameplay: it halves the effective frame rate.
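The arithmetic above can be verified with a tiny sketch (assumed Python, just mirroring the table):

```python
# Display refreshes every 16 ms; the world update alternately runs
# 8 ms ahead of or behind each refresh.
display_times = [0, 16, 32, 48, 64]
swings = [8, -8, 8, -8, 8]

world_times = [t + s for t, s in zip(display_times, swings)]
print(world_times)  # [8, 8, 40, 40, 72]
# Consecutive frames show the same world time, so the rate of
# visible world updates is halved.
```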
I really want to see their analysis. I think it's going to move the state of the art on in a big way, but I also fear it won't actually give us anything we don't have already; it will simply validate frame times as a valid approach for determining the relative ages of the frames the GPU delivers.
It's going to be really hard to show the multidimensional data they will be getting, but the age of the frame shown is what I suspect we all want to see, and I don't know if they will have that at all.
Still, if it does validate frame times, that will be good, in that it means you can do accurate testing without having to have really fancy equipment.
I think there are bound to be a few surprises in there as well though. I'd like to know more about that tiny sliver of a frame in the sleeping dogs example and how common that sort of thing is.
Glad to be of assistance! Like I said, that Hitman file has noticeable microstutter, but it is at a level that does not bother me. If I keep FXAA on, however, it becomes a stuttering mess (and blurry like a Renaissance painting).
What do you make of this one? This is the outdoor scene from Skyrim I posted the first graph for. On this one I cannot visually detect any microstuttering.
Well, in theory yes. But in practice the sequence is finer, more like
Pattern: 0, 0, 0, 0, +8, -8, 0, 0, 0, 0, +8, -8
which means every sixth frame is too fast to see, while every other sixth frame has up to a 50% increase in frame time, corresponding to an instantaneous FPS drop from 60 to 40.
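The 60-to-40 figure follows directly from the frame-time arithmetic; a quick sketch:

```python
# At 60 FPS the baseline frame time is ~16.67 ms; an 8 ms swing on top
# of that gives a ~24.7 ms frame, i.e. roughly 40 FPS instantaneously,
# and close to a 50% increase over the baseline frame time.
base_ms = 1000 / 60
slow_ms = base_ms + 8
instant_fps = 1000 / slow_ms
print(round(instant_fps, 1))
```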
Nah, we had examples of good frame times that still give a stuttering impression. I think the cause of that is different from what we are discussing here, which is why FRAPS is blind to it. Frame times are useful but do not tell the full story, I am afraid.
(Now, if I could just find time to actually play anything! I have had so much fun with this issue that I have not played anything in over a month!)
Can I have a trace of that as well?
Stuttering versus a real mess is also a useful distinction.
There is a period of possibly noticeable stutter in the middle and about 3/4 of the way through, but for the most part it's below 5 ms, so I doubt I would notice it either. The absolute variance graph is dramatically less severe on this one; the 10% mark is already below 5 ms.