new AMD Catalyst driver from Alpha Micro stuttering


Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
That's a very good point, and it applies to me, given that I lean heavily towards nVidia and am nVidia-centric at this time.

Because AMD is important, specifically in the choices they make and what they offer their customers across their SKUs, I take notice of their importance and talent. Companies change direction, change where they spend resources, change strategies. For example, one of my constructive nit-picks was developer relations, and AMD is really trying to build momentum with Gaming Evolved; that gets noticed.

When AMD speaks -- I listen! When nVidia speaks -- I listen! What AMD and nVidia offer or do are very important based on their compelling choices for PC gaming, which I am very passionate about!

Fair enough. My comment was not really directed at rational posters. I'm sorry for resorting to personal attacks, but I don't know a better way to put this. My brother is a loyal AMD fan and has been enjoying his 7950 DC II TOP for a long time. If there are issues, they should be fixed regardless of camp.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The runt frames were still full frames being rendered by the GPU. The issue was that the next frame came way too early and covered up the majority of the runt frame. They were not cheating to get performance; the tiny slivers of frames being rendered were almost unnoticeable, so it seemed like those frames didn't exist when viewing the output.

I never implied they were cheating, and I'm aware they were accused of it, which is why I even asked this question. It seems to fall on what some people were saying before, the frames were always there just covered by the next frame.

Good to see that's two claims of cheating laid to rest.
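For readers following along, the runt-detection idea being discussed can be sketched in a few lines. This is a hypothetical illustration of the FCAT-style rule mentioned later in the thread (a frame counts as a runt when its visible sliver falls below some fraction of the recent average), not PCPer's actual tooling; the heights and the 20% threshold here are assumptions.

```python
def filter_runts(frame_heights, threshold=0.20, window=10):
    """Drop frames whose on-screen sliver (in scan lines) is below
    `threshold` times the average height of the previous `window` frames,
    mimicking a 'perceived FPS' pass over FCAT-style capture data."""
    kept = []
    for i, h in enumerate(frame_heights):
        prev = frame_heights[max(0, i - window):i]
        avg = sum(prev) / len(prev) if prev else h
        if h >= threshold * avg:  # tall enough to be visible: keep it
            kept.append(h)
    return kept

# Two full frames, a 5-line sliver mostly covered by the next frame, a full frame:
print(filter_runts([540, 540, 5, 540]))  # the 5-line runt is dropped
```

Bumping `threshold` up or down changes which frames get discarded, which is exactly why the cutoff is argued about later in the thread.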
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Think of it like one of those flipbook cartoons from when you were a kid. Flip through the pages and you see smooth animation. Now pull out every third page from the book and flip through again, and you see jerks in the movement because of the missing pages of the animation. That's the equivalent of a runt frame.


Not quite, but a very good analogy nonetheless.

And vsync fixes the issue because it forces those runt frames, which are normally displayed for a very short period of time (practically equivalent to not displaying them at all, like in the example above), to wait their turn and display evenly according to your refresh rate.

And since there is a valid remedy for input lag, why anybody plays with vsync off is beyond my level of empathy. If you enjoy looking at torn, irregular frames, by all means carry on.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
GPU reviews never test with vsync because it caps performance, which of course renders the results pointless for the purpose of a review. It is for this very reason that none of the initial GTX 680 reviews noticed the serious vsync stutter bug, and that the serious CF issues went unnoticed for so long.

Vsync and/or FPS caps do indeed fix stutter and runt frames very effectively in CF. In fact, they also reduce stuttering in SLI, which is not immune to micro stutter. The thing is that not everyone likes vsync, and SLI does at least work much better out of the box.

And yet when I was running 2 7970s on Crossfire, neither vsync nor frame rate caps (nor any other of the myriad things I tried) ever 'fixed' micro stutter. Sure, they mitigated it noticeably. But that is not the same thing as saying that I was satisfied with the outcome. The smoothness was still orders of magnitude behind what a single 7970 was delivering.

Now, I'll ask again: can anyone actually provide a link to an article in which PCPer or any other review site says that vsync 'fixes' runt frames / stutter from Crossfire? And no, the dynamic vsync 'fix' reported months ago by Tech Report will not suffice. I tried that too and, again, it only reduces the problem without coming anywhere near fixing it.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Not quite, but a very good analogy nonetheless.

And vsync fixes the issue because it forces those runt frames, which are normally displayed for a very short period of time (practically equivalent to not displaying them at all, like in the example above), to wait their turn and display evenly according to your refresh rate.

And since there is a valid remedy for input lag, why anybody plays with vsync off is beyond my level of empathy. If you enjoy looking at torn, irregular frames, by all means carry on.

My main issue with Vsync is that there is no definitive proof that it chooses the correct frame to display, out of all the frames rendered above 60 FPS, on each refresh. I believe it was alluded to in the PCPer article on Vsync that Vsync will in fact give you a nice flat line on the FRAPS graph, and good complete frames in FCAT, but there is no guarantee (that I have seen) that the frame Vsync displays is the one that results in the smoothest progression of frames.

It's better, but still not perfect. Also, I'd take unsmooth animation over input lag any day of the week, but that's because I'm such a l33t pro.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Reviews have already said that Vsync fixes it. Even PCPer has stated that vsync does away with all stuttering and runt frames by forcing the engine to display frames only when the monitor refreshes.

This whole storm in a teacup is about non-vsync performance. It always has been.

You CANNOT display runt frames when Vsync is turned on. FACT!

They did not say it does away with stuttering. Vsync causes stuttering if you don't maintain a constant FPS that divides evenly into the refresh rate. They even have a quick article on that specific point. It fixes the problem if you can maintain 60 FPS on a 60hz monitor with Vsync on.
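The arithmetic behind that point is simple to sketch. This is a back-of-envelope model of my own, not anything from the reviews: with vsync on a 60Hz panel, any rendered rate below 60 FPS forces some refreshes to repeat the previous frame.

```python
def hitches_per_second(fps, hz=60):
    """With vsync, only `fps` new frames fill `hz` refreshes each second;
    the rest must repeat the previous frame, each repeat turning a
    16.7 ms frame into a 33.3 ms one on a 60 Hz panel."""
    return max(0, hz - min(fps, hz))

print(hitches_per_second(58))  # 2 repeated frames every second
print(hitches_per_second(60))  # 0: perfectly even pacing
print(hitches_per_second(30))  # 30: every frame shown twice (even again, at 30 FPS)
```

At 58 FPS you get two 33ms frames per second scattered among 16.7ms ones, which is the stutter being described; at exactly 30 FPS the repeats become uniform again, which is why only rates dividing evenly into the refresh rate stay smooth.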
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
My main issue with Vsync is that there is no definitive proof that it is choosing the correct frame to display out of the frames over 60fps to display on the refresh.

That doesn't make any sense. Vsync imposes a frame cap on the GPU: it is only rendering 60 frames per second when vsync is on, so there are no frames over 60 to choose from. This is easily demonstrated, because your card produces less heat and noise when vsync is on, due to the GPU load being lower (in games that would otherwise render well over 60 frames). If it were rendering the same number of frames as with vsync off and then "choosing", as you hypothesize, your card would produce the same heat and noise as with vsync off.


They did not say it does away with stuttering. Vsync causes stuttering if you don't maintain a constant FPS that divides evenly into the refresh rate. They even have a quick article on that specific point. It fixes the problem if you can maintain 60 FPS on a 60hz monitor with Vsync on.

That's assumed; you can't pull frame rates between 30 and 60 down into 60 evenly, so you will always get some unevenness when your performance dips. But triple buffering is still somewhat smoother than 1:2 pulldown with double buffering. And if you can maintain 55+ FPS, you generally get a very smooth display feed with only a few repeated frames. Moving to 120Hz dissolves these issues even further: even with double-buffered vsync, you should be able to maintain an almost perfectly smooth feed when it dips to 60 FPS (unless you are very sensitive).
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
And you also don't see another evil, which is tearing. You just have to frame limit to do away with input lag, which we already know is a valid remedy.

Once you do all that, the single number of FRAMES PER SECOND that the 7990 churns out is very much a valid metric. And the card is definitely faster than a 690 and Titan.

Of course, that doesn't mean it's necessarily the better choice, with power and noise taken into account. For example, in a multi-monitor configuration, Titan puts both cards AWAY. Not only does it have a much larger buffer (which helps tremendously at huge resolutions), but its multi-monitor idle power consumption is nothing short of amazing.

Is it faster? http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official/17
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
That's assumed; you can't pull frame rates between 30 and 60 down into 60 evenly, so you will always get some unevenness when your performance dips. But triple buffering is still somewhat smoother than 1:2 pulldown with double buffering. And if you can maintain 55+ FPS, you generally get a very smooth display feed with only a few repeated frames. Moving to 120Hz dissolves these issues even further: even with double-buffered vsync, you should be able to maintain an almost perfectly smooth feed when it dips to 60 FPS (unless you are very sensitive).

Those dips below 60 cause stuttering. How much it annoys you depends on the person. When I play a game in 3D (where I use normal Vsync, because it is forced for 3D), I immediately see when FPS drops below 60. It may not be bad all the time, but even 58 FPS induces a little stutter into the image. Too much of this stutter makes me nauseated, so I'm very careful about staying over 60 FPS.

Those 33ms frames, when every other frame is 16ms, are jarring to me.

When in 2D, those drops below 60 FPS are less noticeable, as they produce 25ms frames rather than 33ms frames. They are still noticeable, just not as much. (This is a bonus of a 120hz monitor.)
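The 33ms-versus-25ms point follows directly from the refresh interval. A quick sketch of the arithmetic (my own, not from the post): under vsync a missed frame is held for one extra refresh, and a 120Hz refresh interval is half as long as a 60Hz one.

```python
def hitch_ms(hz, base_fps=60):
    """Length of a single missed-frame hitch under vsync: the frame is held
    for one extra refresh interval beyond its normal 1/base_fps slot."""
    refresh = 1000.0 / hz
    return 1000.0 / base_fps + refresh

print(round(hitch_ms(60)))   # 33 ms hitch on a 60 Hz panel
print(round(hitch_ms(120)))  # 25 ms hitch on a 120 Hz panel
```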
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It will be once the new drivers are made available. The graphs are slightly worse (probably not perceptibly), but the perceived framerate is still higher on the 7990 than on the 690 and Titan.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Improves-CrossFire-Prototype-Driver

We don't know that until they are tested. Anyways, it all depends on what games you test. Either way, SLI is smoother, even with the prototype drivers, but at least it is getting closer.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Plus the 7990 packs 3GB per core, whereas the 690 packs only 2GB. If you're buying this card for high-resolution gaming (why else would you?), that's a major downside of the 690.
A 690 is quite useful for 3D Vision, as it needs no more vram than a normal 1080p experience, and 120hz, when trying to achieve 120 FPS. I've also never seen 2GB limitations at 1440p or 1600p unless using mods.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
We don't know that until they are tested. Anyways, it all depends on what games you test. Either way, SLI is smoother, even with the prototype drivers, but at least it is getting closer.

It did get tested, and they said that the 7990 has the higher perceived framerate in most of the games they tested. Smoothness is a product of perceived framerate.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
A 690 is quite useful for 3D Vision, as it needs no more vram than a normal 1080p experience, and 120hz, when trying to achieve 120 FPS. I've also never seen 2GB limitations at 1440p or 1600p unless using mods.

I agree, but I was referring to surround resolutions, or 4K in the near future. If you're going to spend $1K on a GPU, you'd like to hope that it can drive a 4K monitor, should one crop up next year. The 690 will likely fall flat on its face trying to drive that.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It did get tested, and they said that the 7990 has the higher perceived framerate in most of the games they tested. Smoothness is a product of perceived framerate.

Can you show me where that was written?

I can show you two that said the opposite.
http://www.tomshardware.com/reviews/radeon-hd-7990-review-benchmark,3486-18.html
AMD wants $1,000 for this new flagship—the same price as GeForce GTX 690, which yields a higher practical frame rate in six of our eight benchmarks as it delivers frames more smoothly across the board.
But when we combine the quantitative data enabled by video capture-based performance analysis and the subjective judgments of a panel of gaming enthusiasts who simply want to play their favorite titles on the best hardware possible, Nvidia’s thousand-dollar GeForce GTX 690 outshines the similarly-priced Radeon HD 7990. Our early look at AMD’s prototype driver suggests that more evenly pacing the rate at which frames are shown on-screen helps minimize frame time variance, which our gamers definitely noticed. But that release isn’t expected for months—the second half of 2013 is as specific as AMD gets.
You may have read the second quote and misunderstood what was written: their gamers noticed that the new prototype drivers are better, but they did not say they were better than SLI. The data in the article doesn't show it beating SLI either, though it is definitely better than the old driver.

http://www.pcper.com/reviews/Graphi...ire-Prototype-Driver/great-start-AMDs-CrossFi

Do you have any recent reviews that say Crossfire is smoother that I'm not aware of?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I agree but I was referring to surround resolutions or 4K in the near future. If you're going to spend 1K on a GPU, you'd like to hope that it can drive a 4K monitor, should one crop up next year. The 690 will likely fall flat on it's face trying to drive that.

I'd be interested in seeing 4K tested on 2GB of VRAM. I bet you'd be surprised at how little VRAM is needed for the 4K buffers themselves. The problem is usually when AA is used, which you might not use at that resolution. 4K only needs an extra 20-25MB per buffer; AA, on the other hand, will be the killer. Though there may be other things that use more VRAM at higher resolutions, which is why I'd also like to see tests.
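That 20-25MB figure checks out as back-of-envelope arithmetic for a single 32-bit color buffer (my own sanity check, not from the post; real usage varies with buffer counts, formats, and MSAA):

```python
def buffer_mib(width, height, bytes_per_pixel=4):
    """Size in MiB of one 32-bit color buffer at a given resolution."""
    return width * height * bytes_per_pixel / 2**20

print(round(buffer_mib(1920, 1080), 1))  # ~7.9 MiB per 1080p buffer
print(round(buffer_mib(3840, 2160), 1))  # ~31.6 MiB per 4K buffer
print(round(buffer_mib(3840, 2160) - buffer_mib(1920, 1080), 1))  # ~23.7 MiB extra
```

MSAA stores multiple samples per pixel, multiplying these numbers several times over, which is why AA is the killer at 4K.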
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
From PC Per.

About BF3:
How does it affect our perceived average frame rate? The HD 7990 actually takes the performance lead over the GTX 690 here and is able to hold on to it until we cross the 90th percentile. Look at the orange line that removes the runt frames and you can tell it was running at nearly half the effective performance.

About Crysis 3:
Much like we saw with the Battlefield 3 results the revised driver on the HD 7990 is completely changing the performance aspects of the card. The average frame rate of 36 FPS is higher than the 34 FPS of the GTX 690 and the 25 FPS of the GTX Titan.

There is no actual text in the article about Dirt 3, but in the perceived framerate graph the prototype driver is clearly higher than the GTX 690 and Titan.

Far Cry 3:
Impressive! The average frame rate of the HD 7990 with the prototype driver is around 50 FPS, higher than the 41 FPS average on the GTX 690. Even though the prototype result does trail off after the 90th percentile, clearly this updated driver is improving results over the 38 FPS of the 13.5 beta release.

It should be noted that Far Cry 3 is awful on all cards.

Skyrim:
Our minimum FPS percentile graph shows that same modest improvement over the 13.5 beta, with an increase in the average frame rate of 10 FPS. In both cases though, the GTX Titan and GTX 690 remain ahead.
There weren't major issues in Skyrim to begin with, but the prototype driver improved things; all cards stay above 60fps.

I could be wrong, but the perceived FPS graph is the FCAT data with runt frames dropped from the FPS metric. A higher perceived FPS is a smoother experience.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Oh, you are referring to something different than I was: perception of smoothness versus FCAT FPS with runt frames removed.

There is a difference, as you can see in the THG article. A runt frame, by both review sites' definitions, is quite small. As a result, it doesn't take much for a frame to count on either system. However, both sites show Crossfire has more frame variance, which has a big effect on smoothness (by definition that is most of what counts, though high FPS plays a part).
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Oh, you are referring to something different than I was: perception of smoothness versus FCAT FPS with runt frames removed.

There is a difference, as you can see in the THG article. A runt frame, by both review sites' definitions, is quite small. As a result, it doesn't take much for a frame to count on either system. However, both sites show Crossfire has more frame variance, which has a big effect on smoothness (by definition that is most of what counts, though high FPS plays a part).

Is perceived framerate not the metric most closely tied to the smoothness you see on screen?

There is more variance, but is it visible in the real world? These are all questions that still have not been answered which contributes a lot to all the confusion about these new testing methods.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Is perceived framerate not the metric most closely tied to the smoothness you see on screen?

There is more variance, but is it visible in the real world? These are all questions that still have not been answered which contributes a lot to all the confusion about these new testing methods.

No, perceived framerate is not the metric most closely tied to smoothness. Variance is, though a high perceivable FPS also helps create smoothness, so both matter.

Look it up in a dictionary: http://dictionary.reference.com/browse/smooth
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
No, perceived framerate is not the metric most closely tied to smoothness. Variance is, though a high perceivable FPS also helps create smoothness, so both matter.

Look it up in a dictionary: http://dictionary.reference.com/browse/smooth

The whole reason crossfire wasn't as smooth prior to this prototype driver is that the variance caused runt frames, which, when removed from the FPS metric, drastically reduced the perceived framerate. Variance causes runt frames; runt frames are removed from the FPS metric, giving you the perceived FPS metric. Therefore, perceived FPS = smoothness?

That's my understanding of this. I don't know if it's right. It is how I interpret the information. Most of the reviews are sorely lacking in providing readers an explanation of how to interpret the data.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The whole reason crossfire wasn't as smooth prior to this prototype driver is that the variance caused runt frames, which, when removed from the FPS metric, drastically reduced the perceived framerate. Variance causes runt frames; runt frames are removed from the FPS metric, giving you the perceived FPS metric. Therefore, perceived FPS = smoothness?

That's my understanding of this. I don't know if it's right. It is how I interpret the information. Most of the reviews are sorely lacking in providing readers an explanation of how to interpret the data.

Well, runt frames are a metric invented by PCPer. They give a lot of leeway, as a runt is defined as less than 20% of the average frame size over the previous 10 frames. They could bump that up to 50% and call that a runt frame, which would make things look vastly different. They could go with THG's current definition of 20 frames or less, which would look different again.

The point is, evenly spaced out frames creates smoothness. At what point someone notices a difference is a little up in the air, and may vary, but the variance of frames is clearly a metric that matters too.

The end result is that the test gamers chose SLI over it, so while Crossfire is improving, it isn't quite as good yet. And that is with the prototype driver, not due for release for a few months; by then it might be better. It should be. The one thing that worries me is that they are doing it on a per-game basis, rather than with a general algorithm.
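The "evenly spaced frames" point can be made concrete: two frame-time sequences can average the same FPS while pacing very differently. A hypothetical illustration (the numbers are made up, not from either review):

```python
from statistics import mean, pstdev

def fps_and_jitter(frame_times_ms):
    """Average FPS plus the frame-to-frame spread: the same mean can hide
    very different pacing, and the spread is what reads as stutter."""
    return 1000 / mean(frame_times_ms), pstdev(frame_times_ms)

even = [20, 20, 20, 20]  # steady 50 FPS, zero variance
jerky = [5, 35, 5, 35]   # also 50 FPS on average, but alternating wildly

print(fps_and_jitter(even))   # (50.0, 0.0)
print(fps_and_jitter(jerky))  # (50.0, 15.0)
```

An average-FPS graph rates both sequences identically, while the second would look far jerkier on screen; that is why the variance metric matters alongside perceived FPS.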
 
Last edited:

Leadbox

Senior member
Oct 25, 2010
744
63
91
Well, runt frames are a metric invented by PCPer. They give a lot of leeway, as a runt is defined as less than 20% of the average frame size over the previous 10 frames. They could bump that up to 50% and call that a runt frame, which would make things look vastly different. They could go with THG's current definition of 20 frames or less, which would look different again.

The point is, evenly spaced out frames creates smoothness. At what point someone notices a difference is a little up in the air, and may vary, but the variance of frames is clearly a metric that matters too.

The end result is that the test gamers chose SLI over it, so while Crossfire is improving, it isn't quite as good yet. And that is with the prototype driver, not due for release for a few months; by then it might be better. It should be. The one thing that worries me is that they are doing it on a per-game basis, rather than with a general algorithm.

"Runt frames" is an nVidia term.
I believe the algorithm has a general aspect to it but will need game-specific tweaking for better performance.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I'd be interested in seeing 4K tested on 2GB of VRAM. I bet you'd be surprised at how little VRAM is needed for the 4K buffers themselves. The problem is usually when AA is used, which you might not use at that resolution. 4K only needs an extra 20-25MB per buffer; AA, on the other hand, will be the killer. Though there may be other things that use more VRAM at higher resolutions, which is why I'd also like to see tests.

I'd say you're gonna crack past 2GB at 4K, and if you add MSAA, way more than 2GB. It might still run fine (who knows what the driver is doing in the background to manage memory), but I wouldn't want to have my VRAM at 99% usage... it's just not something that should happen on a $1K piece of hardware.

Titan will not risk bottlenecks, not now, and not for another 3 years IMO.