AMD Comments on GPU Stuttering, Offers Driver Roadmap & Perspective on Benchmarking


Rikard

Senior member
Apr 25, 2012
428
0
0
They used much better tools which did not evidently show that they had a problem compared to Nvidia.
Huh? The article said something completely different:
The shortest answer also the bluntest answer: AMD had a stuttering problem because AMD wasn’t looking for a stuttering problem. AMD does a great deal of competitive analysis (read: seeing what NVIDIA is doing) on overall performance, but AMD was never doing competitive analysis for stuttering.
Or do you mean that someone in this thread said something like that (in which case they really need to learn to read before commenting)?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
The answer to your question is actually contained in the text you quoted:



It could be. It could also not be. That is the point I wish you would understand. Fraps is a blunt instrument, but even blunt instruments have their uses.

I read that, but:
AMD’s second problem then is that even when FRAPS is being used to measure frame intervals, due to the issues we’ve mentioned earlier it’s simply not an accurate representation of what the user is seeing. Not only can FRAPS sometimes encounter anomalies that don’t translate to the end of the rendering pipeline, but FRAPS is going to see stuttering that the user cannot. It’s this last bit that is of particular concern to AMD. If FRAPS is saying that AMD cards are having more stuttering – even if the user cannot see it – then are AMD cards worse?
And then that:
At the very start of this odyssey AMD’s single-GPU frame interval problem was so bad that even FRAPS could see it.
Explain it to me:
Is Fraps bad because it doesn't show the full extent of the problem, or is it bad because it was used to discover a problem with AMD graphics cards?
 
Last edited:

ICDP

Senior member
Nov 15, 2012
707
0
0
I read that, but:
And then that:
Explain it to me:
Is Fraps bad because it doesn't show the full extent of the problem, or is it bad because it was used to discover a problem with AMD graphics cards?

These points do not contradict each other. FRAPS is not entirely accurate, but that does not mean it is useless. It is just NOT the best way to test for latency issues. Nvidia obviously thinks so as well, or the FCAT tool would never have been developed.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
It's the best tool we had, and it spawned a movement.

Jackie Robinson wasn't the greatest player ever, but that doesn't mean he didn't change the game forever.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
It's the best tool we had, and it spawned a movement.

Jackie Robinson wasn't the greatest player ever, but that doesn't mean he didn't change the game forever.

Again, where did I say it is/was totally worthless and that the results from it are completely irrelevant? I use it myself to check I am getting the best performance from my CF 7950s. But there have been occasions when FRAFS/FRAPS was telling me I was getting serious stutter while the game played perfectly smoothly. This has been the main bone of contention for many people since this whole frame-time issue became popular. In many (not all) cases Nvidia people were reading a FRAPS graph and stating "that runs like crap on AMD cards", while AMD people were saying "I don't see any stutter, so it isn't a problem." Do we believe a graph/chart that isn't always accurate, or do we believe the perception of the card users? And before you jump on that statement, I am aware that people have different tolerances for what counts as smooth.

Nvidia developed FCAT, and PCPer has stated in the actual article you are fond of linking to that FRAPS is not very accurate. Yes, it can be and has been used to highlight obvious problems, but the data is not always accurate.

"We actually showed our readers a runt in our first article about Frame Rating. The implications of runts and drops should be pretty apparent to you if you are following our logic thus far, but just to be sure, let’s elaborate. In both cases, drop and runt, FRAPS essentially thinks the frame is being shown to the user like just any other frame. With a drop though, this isn’t the case – the user actually never sees the frame on the screen and thus the FRAPS data is just wrong."

I am not defending AMD cards by claiming FRAPS/FRAFS has always been wrong. My point is that it wasn't always right and Nvidia obviously agrees or FCAT would never have been developed.
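
Just to make the runt/drop mechanics concrete, here is a rough sketch of the kind of classification FCAT-style analysis does (my own hypothetical illustration, not PCPer's actual scripts; the 20-scanline runt threshold is made up for the example). It assumes you already have, for each frame the game submitted, the number of scanlines of that frame's colour-bar overlay that actually showed up in a capture of the display output:

```python
# Hypothetical sketch of FCAT-style frame classification (illustration only).
# visible_scanlines[i] = how many scanlines of frame i's overlay colour were
# actually captured on screen. FRAPS, hooking Present(), counts every one of
# these frames as delivered; the capture-side view can disagree.

RUNT_SCANLINES = 20  # made-up threshold for the example, not FCAT's real value

def classify_frames(visible_scanlines):
    """Label each frame 'dropped', 'runt', or 'normal'."""
    labels = []
    for lines in visible_scanlines:
        if lines == 0:
            labels.append("dropped")  # submitted, but never shown on screen
        elif lines < RUNT_SCANLINES:
            labels.append("runt")     # shown, but only as a thin sliver
        else:
            labels.append("normal")
    return labels

if __name__ == "__main__":
    sample = [540, 530, 12, 545, 0, 538]  # invented capture data
    print(classify_frames(sample))
    # FRAPS would count six delivered frames here; only four were really seen.
```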
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Again, where did I say it is/was totally worthless and that the results from it are completely irrelevant? I use it myself to check I am getting the best performance from my CF 7950s. But there have been occasions when FRAFS/FRAPS was telling me I was getting serious stutter while the game played perfectly smoothly. This has been the main bone of contention for many people since this whole frame-time issue became popular. In many (not all) cases Nvidia people were reading a FRAPS graph and stating "that runs like crap on AMD cards", while AMD people were saying "I don't see any stutter, so it isn't a problem." Do we believe a graph/chart that isn't always accurate, or do we believe the perception of the card users? And before you jump on that statement, I am aware that people have different tolerances for what counts as smooth.

Nvidia developed FCAT, and PCPer has stated in the actual article you are fond of linking to that FRAPS is not very accurate. Yes, it can be and has been used to highlight obvious problems, but the data is not always accurate.

"We actually showed our readers a runt in our first article about Frame Rating. The implications of runts and drops should be pretty apparent to you if you are following our logic thus far, but just to be sure, let’s elaborate. In both cases, drop and runt, FRAPS essentially thinks the frame is being shown to the user like just any other frame. With a drop though, this isn’t the case – the user actually never sees the frame on the screen and thus the FRAPS data is just wrong."

I am not defending AMD cards by claiming FRAPS/FRAFS has always been wrong. My point is that it wasn't always right and Nvidia obviously agrees or FCAT would never have been developed.

FCAT was developed because NVIDIA wanted a better metric...not because they weren't aware of the problem...and not because FRAPS didn't show a problem.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
These points do not contradict each other. FRAPS is not entirely accurate, but that does not mean it is useless. It is just NOT the best way to test for latency issues. Nvidia obviously thinks so as well, or the FCAT tool would never have been developed.

FCAT is used to show dropped and runt frames, which have an influence on the frame latency Fraps is showing.

AMD on the other hand declared Fraps useless because it shows problems which are not there.

BTW: AMD's über software did not show a stuttering problem in certain games. So without Fraps they would never have delivered a driver update to fix these problems which, in their mind, were not there...
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Fraps never shows "a problem that isn't there". Everything you see in fraps is causing a knock-on effect on the rendering and game simulation loop and is indicative of problems in the ongoing pipeline. It also shows the frame times in a way that translates quite well to what ultimately comes out on the screen in all cases where the manufacturer's solution isn't totally broken (just AMD Crossfire at this point). Variations in frame times are still a concern, as are the spikes.

Fraps shows the variation on the input, FCAT shows the variation on the output; now what I want to see is the latency linking one to the other. I really want to see how much overall latency there is in the pipeline, and how much it varies, so we can get the full picture.
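
To put the input/output distinction in concrete terms, here is a small hypothetical sketch (my own illustration with made-up numbers, not FRAPS or FCAT code): the input side is just the gap between successive Present() calls, the output side is the gap between frames actually reaching the screen, and the per-frame difference between the two timelines is the pipeline latency I'd like to see measured:

```python
# Hypothetical illustration of "variation on the input" vs "variation on the output".
# present_times: timestamps (ms) of the game's Present() calls -> what FRAPS sees
# display_times: timestamps (ms) when each frame hit the screen -> what FCAT sees

def intervals(timestamps):
    """Frame-to-frame gaps in ms."""
    return [round(b - a, 1) for a, b in zip(timestamps, timestamps[1:])]

def spread(gaps):
    """Crude pacing metric: biggest deviation from the average gap."""
    mean = sum(gaps) / len(gaps)
    return round(max(abs(g - mean) for g in gaps), 1)

if __name__ == "__main__":
    # Made-up numbers: Present() calls are uneven, but buffering in the
    # pipeline delivers frames to the display more evenly.
    present_times = [0.0, 10.0, 35.0, 45.0, 70.0, 80.0]
    display_times = [40.0, 57.0, 73.0, 90.0, 107.0, 123.0]

    print("input  intervals:", intervals(present_times))  # what a FRAPS graph plots
    print("output intervals:", intervals(display_times))  # what an FCAT graph plots
    print("input  spread:", spread(intervals(present_times)))
    print("output spread:", spread(intervals(display_times)))
    # The per-frame difference display_times[i] - present_times[i] is the
    # pipeline latency linking one view to the other.
```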
 
Last edited:

ICDP

Senior member
Nov 15, 2012
707
0
0
FCAT was developed because NVIDIA wanted a better metric...not because they weren't aware of the problem...and not because FRAPS didn't show a problem.

FCAT is used to show dropped and runt frames, which have an influence on the frame latency Fraps is showing.

AMD on the other hand declared Fraps useless because it shows problems which are not there.

BTW: AMD's über software did not show a stuttering problem in certain games. So without Fraps they would never have delivered a driver update to fix these problems which, in their mind, were not there...

Fraps never shows "a problem that isn't there". Everything you see in fraps is causing a knock-on effect on the rendering and game simulation loop and is indicative of problems in the ongoing pipeline. It also shows the frame times in a way that translates quite well to what ultimately comes out on the screen in all cases where the manufacturer's solution isn't totally broken (just AMD Crossfire at this point). Variations in frame times are still a concern, as are the spikes.

Fraps shows the variation on the input, FCAT shows the variation on the output; now what I want to see is the latency linking one to the other. I really want to see how much overall latency there is in the pipeline, and how much it varies, so we can get the full picture.

Lol, where in any of my posts did I say that FRAPS was worthless? I even stated the data can be useful, as I use it myself. The problem here is that you are all reading this as AMD saying it is totally worthless, when they in fact said the following.

"AMD’s problem then is twofold. Going back to our definitions of latency versus frame intervals, FRAPS cannot measure “latency”. The context queue in particular will throw off any attempt to measure true frame latency. The amount of time between present calls is not the amount of time it took a frame to move through the pipeline, especially if the next Present call was delayed for any reason. AMD’s second problem then is that even when FRAPS is being used to measure frame intervals, due to the issued we’ve mentioned earlier it’s simply not an accurate representation of what the user is seeing. Not only can FRAPS sometimes encounter anomalies that don’t translate to the end of the rendering pipeline, but FRAPS is going to see stuttering that the user cannot. It’s this last bit that is of particular concern to AMD. If FRAPS is saying that AMD cards are having more stuttering – even if the user cannot see it – then are AMD cards worse?"


Someone please point out to me where in this statement AMD "declared FRAPS totally and completely useless", as the three of you imply?

I have seen occasions where FRAFS/FRAPS has reported what many would claim should be serious stutter, yet the experience was very smooth without any noticeable stutter.

Two FRAPS frame-time graphs taken on the exact same system using Hitman Absolution at the exact same settings, at 2560x1600 resolution.

Vsync on, Triple Buffering on, Flip queue default
[FRAPS frame-time chart: Hitman Absolution, Ultra, Bloom off, DOF off, 2xMSAA, 7950 CrossFire, vsync on]



Vsync off, Triple Buffering off, flip queue 1.
[FRAPS frame-time chart: Hitman Absolution, Ultra, Bloom off, DOF off, 2xMSAA, 7950 CrossFire, vsync off]


The bottom chart looks like a stuttery mess due to very large frame-time variations, but the experience was smooth and very playable. The top chart showed almost no variation in frame times and was smoother overall, but that was more down to the absence of tearing than to the tighter frame-time variations. Ultimately there comes a point where what the individual perceives is what matters, not what a graph tells us.
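
For anyone who wants numbers rather than eyeballing the charts, the usual summary stats pulled from a FRAPS log are easy to compute. This is only a sketch with invented frame times, not the actual data behind the two charts above:

```python
# Hypothetical sketch: summary stats for a FRAPS frame-time log (ms per frame).
# The values below are invented for illustration, not the Hitman data above.

def percentile(values, p):
    """Nearest-rank percentile (values need not be pre-sorted)."""
    ordered = sorted(values)
    idx = max(0, int(round(p / 100.0 * len(ordered))) - 1)
    return ordered[idx]

def summarize(frame_times_ms):
    jumps = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return {
        "avg_ms": round(sum(frame_times_ms) / len(frame_times_ms), 1),
        "99th_pct_ms": percentile(frame_times_ms, 99),
        "worst_jump_ms": max(jumps),  # biggest frame-to-frame swing
    }

if __name__ == "__main__":
    vsync_on  = [16.7, 16.7, 16.6, 16.8, 16.7, 16.7]  # flat, capped line
    vsync_off = [9.0, 21.0, 11.0, 25.0, 10.0, 19.0]   # jagged line
    print("vsync on :", summarize(vsync_on))
    print("vsync off:", summarize(vsync_off))
    # A big worst_jump_ms looks alarming on a chart; whether it is actually
    # perceived as stutter is exactly the point being argued here.
```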
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Someone please point out to me where in this statement AMD "declared FRAPS totally and completely useless", as the three of you imply?

AMD’s second problem then is that even when FRAPS is being used to measure frame intervals, due to the issues we’ve mentioned earlier it’s simply not an accurate representation of what the user is seeing. Not only can FRAPS sometimes encounter anomalies that don’t translate to the end of the rendering pipeline, but FRAPS is going to see stuttering that the user cannot. It’s this last bit that is of particular concern to AMD. If FRAPS is saying that AMD cards are having more stuttering – even if the user cannot see it – then are AMD cards worse?

Come on...
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yes, not only is it lost performance, but it's got its own list of negatives to go with it besides that.


Otherwise do we start recommending AMD to noobs and granny gamers, and Nvidia to competitive and hardcore gamers?

Simple fact is Nvidia doesn't share this non-vsync problem that AMD has, so vsync should never be considered a solution unless you already bought and paid for your stutter.

Not only was vsync enabled, but so was triple buffering. It's like putting two cloudy windows between you and your expensive vista view.
 
Last edited:

ICDP

Senior member
Nov 15, 2012
707
0
0
Come on...

Seriously, you bold parts of a sentence and decide to take each sub-statement out of context?

Example
I think the deranged madman was wrong when he said, "I will kill you all."

Becomes

"I will kill you all."

You can't take a sentence and pick out single words or phrases as separate entities; they lose all context.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Yes, not only is it lost performance, but it's got its own list of negatives to go with it besides that.

So does Adaptive v-sync...but I'm sure that's fine to use, no?


Otherwise do we start recommending AMD to noobs and granny gamers, and Nvidia to competitive and hardcore gamers?

Using v-sync differentiates the noobs from the professionals? Well that's a stretch.

Simple fact is Nvidia doesn't share this non-vsync problem that AMD has, so vsync should never be considered a solution unless you already bought and paid for your stutter.

Not only was vsync enabled, but so was triple buffering. It's like putting two cloudy windows between you and your expensive vista view.

You're right, Nvidia doesn't have this issue, and AMD should fix it. What does that have to do with someone who chooses to use v-sync?
 

ICDP

Senior member
Nov 15, 2012
707
0
0
Sorry, but how can you view losing 35% of your performance as a positive?

Let me put that another way. How can you possibly think tearing is acceptable? See how personal preference works?

Me preferring no tearing does not = wrong. If someone else wants that extra performance they have the choice of running vsync off; it still gives a very smooth experience as long as you can live with tearing.

That wasn't the point of the graph. It was to show that despite FRAFS/FRAPS showing what should be excessive stutter due to the higher frame-time variance, it was actually a smooth experience. It was to show that BrightCandle's statement that "Fraps never shows a problem that isn't there" is not accurate.

Currently Crossfire is having serious problems; I would only recommend it over SLI if the person already had an AMD HD 7xx0 card. Me declaring that FRAPS is only useful when taken alongside other metrics does not = FRAPS is totally useless.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
So does Adaptive v-sync...but I'm sure that's fine to use, no?




Using v-sync differentiates the noobs from the professionals? Well that's a stretch.



You're right, Nvidia doesn't have this issue, and AMD should fix it. What does that have to do with someone who chooses to use v-sync?


In some games, but adaptive doesn't use TB as far as I know; that's why it's different in a better way.

Not really, casual gamer vs competitive would probably be a better example. Then of course you have to ask yourself why a casual is paying $550x2 to play games at 1600p.

AFAIK AMD is fixing it, without needing to tell people to use vsync. Because using vsync and TB causes a large number of other issues, including a large input lag factor... What type of person plays a game where they don't care that they're at a huge disadvantage against an Nvidia user not using vsync/TB?
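
(Rough back-of-envelope only, on my own simplified assumption that each extra buffered frame can add up to one refresh of latency; real behaviour depends on the game, driver, and flip-queue settings.)

```python
# Simplified model of the extra display latency vsync + buffering can add.
# Assumption (mine, for illustration): each buffered frame can add up to one
# full refresh interval before the rendered frame reaches the screen.

REFRESH_HZ = 60
FRAME_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh at 60 Hz

def worst_case_added_latency_ms(buffered_frames):
    return buffered_frames * FRAME_MS

if __name__ == "__main__":
    for buffers in (1, 2, 3):  # up to vsync + triple buffering
        print(f"{buffers} buffered frame(s): up to ~{worst_case_added_latency_ms(buffers):.1f} ms extra")
```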
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
In some games, but adaptive doesn't use TB as far as I know; that's why it's different in a better way.

Not really, casual gamer vs competitive would probably be a better example. Then of course you have to ask yourself why a casual is paying $550x2 to play games at 1600p.

...you calling me out? ;) Well I guess I only have a 1440p monitor haha.

AFAIK AMD is fixing it, without needing to tell people to use vsync. Because using vsync and TB causes a large number of other issues, including a large input lag factor... What type of person plays a game where they don't care that they're at a huge disadvantage against an Nvidia user not using vsync/TB?

I don't recall AMD telling anyone to use v-sync to fix it. I recall a lot of forum posters saying it helps, and now an article that is baffled by the result that it actually helps. Caveats about input lag aside, of course.

Something tells me the "huge advantage" you are referring to is very rare in terms of frequency. With both users on a 60Hz display, unless the Nvidia guy's frame refresh catches a twitch response, they're both off by a few ms in terms of response time.

But, I've heard people can dodge bullets, so then perhaps the response time issue is a valid argument.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
So pointing out Fraps' limitations = saying it's completely useless
It took me a while, but I got it now :|

AMD’s second problem then is that even when FRAPS is being used to measure frame intervals, due to the issues we’ve mentioned earlier it’s simply not an accurate representation of what the user is seeing.
So, every number Fraps is creating is not an accurate representation of what the user sees.

Or shorter: Fraps is simply useless.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
In some games, but adaptive doesn't use TB as far as I know; that's why it's different in a better way.

Not really, casual gamer vs competitive would probably be a better example. Then of course you have to ask yourself why a casual is paying $550x2 to play games at 1600p.

AFAIK AMD is fixing it, without needing to tell people to use vsync. Because using vsync and TB causes a large number of other issues, including a large input lag factor... What type of person plays a game where they don't care that they're at a huge disadvantage against an Nvidia user not using vsync/TB?

I don't play online ever, so being competitive is not a worry for me. Asking why a person would pay big money for a CF setup to play on a 2560x1600 monitor is the same as asking why someone would pay $1000 for a GPU.

I have a 2560x1600 monitor because I used to do a lot of graphics work and I only paid £340 all in for both of my HD 7950s. Does this fact somehow make my gaming preferences invalid?