Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing

csbin

Senior member
Feb 4, 2013
904
605
136
http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Testin

Single GPU Configurations – Performance as Expected
Today’s results focus on the Radeon HD 7970 GHz Edition and the NVIDIA GeForce GTX 680 as well as their SLI/CrossFire options, but let’s start with a quick talk about the results we see with the single-card and single-GPU configurations. Frame Rating still tells an interesting and unique story compared to FRAPS, and thanks to some of our data analysis (Min FPS percentiles, International Stutter Units), the HD 7970 and GTX 680 compare differently than they otherwise might.
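To make that kind of analysis concrete, here is a minimal Python sketch of how percentile-based frame-time metrics might be computed. The function names, the exact percentile math and the variance measure are illustrative assumptions only, not PC Perspective's actual Frame Rating / FCAT analysis code.

# A minimal, hypothetical sketch of percentile-style frame-time analysis, NOT
# the actual Frame Rating / FCAT scripts. Assumes frame_times_ms is a list of
# per-frame times in milliseconds from one benchmark run.

def min_fps_percentiles(frame_times_ms, percentiles=(90, 95, 99)):
    """For each percentile P, report the FPS that P% of frames meet or exceed."""
    ordered = sorted(frame_times_ms)                      # fastest frames first
    results = {}
    for p in percentiles:
        idx = min(len(ordered) - 1, int(len(ordered) * p / 100))
        results[p] = 1000.0 / ordered[idx]                # slow frame time -> FPS floor
    return results

def frame_to_frame_variance(frame_times_ms):
    """Average change in frame time between consecutive frames (ms);
    bigger numbers mean a less consistent, more stuttery delivery."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas) if deltas else 0.0

# Example: a run that alternates between 16.7 ms and 33.3 ms frames averages
# ~40 FPS but carries ~17 ms of frame-to-frame variance, which is felt as stutter.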
We definitely can’t say the same for the multi-GPU results, but when using only a single GPU both AMD and NVIDIA platforms show consistent results on a run-to-run basis, as well as when we compare Frame Rating to the traditional FRAPS average frame rates and frame times. When we showed you the FRAPS graph followed by the Observed FPS graph, you should have seen that both the single GTX 680 and the single HD 7970 perform basically the same on both.
Frame time graphs are going to differ because FRAPS and our capture solution measure frame times at different locations in the graphics pipeline, but generally both versions tell a similar story. If there is hitching or stutter found using the FRAPS time stamps, then our at-the-display data will show the same thing, though perhaps at different specific locations. Patterns are the key thing to look for, though, as very few gamers are really just playing a game for 60 seconds at a time, let alone the same 60 seconds over and over.
The overall picture comparing the two cards indicates that the AMD Radeon HD 7970 GHz Edition is a faster card for gaming at 1920x1080, 2560x1440 and 5760x1080 triple-monitor resolutions. In Battlefield 3 the performance gap between the HD 7970 and GTX 680 was small at 19x10 and 25x14 but expanded to a larger margin at 57x10 (19%). AMD’s HD 7970 also shows less frame-to-frame variance in BF3 than the GTX 680. The same pattern is seen in Crysis 3 as well, though at 5760x1080 we are only getting frame rates of 13 and 16 on average, giving the HD 7970 a 23% advantage.
DiRT 3 performed very well on both cards even at the 5760x1080 resolution, though AMD’s HD 7970 maintained a small advantage. Far Cry 3 was much more varied, with the GTX 680 taking the lead at 1920x1080 (20%), but at 2560x1440 and 5760x1080 the cards change places, giving the HD 7970 the lead. Skyrim was another game that saw small performance leads for AMD at higher resolutions, though I did find there to be less frame time variance on the GTX 680 system, which provided a better overall experience for a game that can run on most discrete GPUs on the market today.
Finally, in one of the newest games in our test suite, Sleeping Dogs, the AMD Radeon HD 7970 holds a sizeable advantage across all three tested resolutions. The margins are 34% at 1920x1080, 37% at 2560x1440 and 23% when using triple displays.
While some people might have assumed that this new testing methodology would paint a prettier picture of NVIDIA’s current GPU lineup across the board (due to its involvement in some tools), with single-card configurations nothing much changes in how we view these comparisons. The Radeon HD 7970 GHz Edition and its 3GB frame buffer is still a faster graphics card than a stock GeForce GTX 680 2GB GPU. In my testing there were only a couple of instances in which the experience on the GTX 680 was faster or smoother than on the HD 7970 at 1920x1080, 2560x1440 or even 5760x1080.

AMD CrossFire Performance – A Bridge over Troubled Water?
Where AMD has definite issues is with HD 7970s in CrossFire, and our Frame Rating testing is bringing that to light in a startling fashion. In half of our tested games, the pair of Radeon HD 7970s in CrossFire showed no appreciable measured or observed increase in performance compared to a single HD 7970. I cannot stress that point enough: our results showed that in Battlefield 3, Crysis 3 and Sleeping Dogs, adding in another $400+ Radeon HD 7970 did nothing to improve your gaming experience, and in some cases made it worse by introducing frame time variances that lead to stutter. Take a look at some of our graphs on those game pages and compare the FRAPS FPS result to the Observed FPS result, which calculates an average frame rate per second after removing runts and drops. Clearly the performance of the dual-card configuration is only barely faster than the single card, removing the “scaling” of CrossFire. This occurs at 1920x1080 and 2560x1440 in those three games and actually happens several times in DiRT 3, but only at 2560x1440 (which actually leads me to believe this is a GPU performance issue, not a CPU performance issue).
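For clarity, here is a rough Python sketch of the runt/drop filtering idea behind the Observed FPS metric. The 21-scanline runt threshold, the function name and the per-run averaging are assumptions for illustration only; this is not the actual FCAT Perl extractor.

# Hypothetical sketch of "Observed FPS": count only frames the viewer could
# plausibly have seen. Assumes we already know, for each rendered frame, how
# many scanlines of the captured video it occupied.

RUNT_THRESHOLD_SCANLINES = 21   # assumed cutoff; frames shorter than this count as runts

def observed_vs_fraps_fps(frame_scanlines, capture_seconds=60.0):
    """Compare a FRAPS-style count of all frames against a count that throws
    away frames that never meaningfully reached the display (runts and drops)."""
    drops = sum(1 for s in frame_scanlines if s == 0)                    # never shown at all
    runts = sum(1 for s in frame_scanlines if 0 < s < RUNT_THRESHOLD_SCANLINES)
    useful = len(frame_scanlines) - drops - runts
    return {
        "fraps_style_fps": len(frame_scanlines) / capture_seconds,       # counts every frame
        "observed_fps": useful / capture_seconds,                        # runts and drops removed
        "runts": runts,
        "drops": drops,
    }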
It is worth pointing out that this does not necessarily mean you won’t have a fluid gaming experience on an AMD CrossFire configuration. Sleeping Dogs at 2560x1440 is a perfect example of this: CrossFire shows nearly 50% of the frames as runts, cutting the average frame rate in half, but those non-runt frames are actually delivered in a consistent manner. But a smooth gaming experience at 33 FPS on average on two HD 7970s in CrossFire doesn’t sound that good when you can get the same smooth experience at 33 FPS average with a single HD 7970. Dual GeForce GTX 680s in SLI, on the other hand, produce fluid animation in Sleeping Dogs at 46 FPS.
In Far Cry 3 and Skyrim we did not have this problem with our performance metrics since we didn’t see large numbers of runts or drops in our testing. For Far Cry 3 in particular, the AMD cards had quite a bit more frame time variance (leading to stutter, non-fluid gameplay) with even the single HD 7970 getting higher marks on the International Stutter Units (ISU) graph than the GTX 680s in SLI.
The second major concern for AMD CrossFire users occurs when you enable triple-monitor configurations with Eyefinity. In every single game we tested, even Skyrim, DiRT 3 and Far Cry 3, which didn’t show major runt issues at single-monitor resolutions, just about every other frame of the game was being dropped. Just like the runt frame issue we mentioned above, the Eyefinity drop problem basically means you are running your 5760x1080 configuration at the performance level of a single HD 7970 even though you have invested twice the money AND other performance software (in-game tests, FRAPS) is telling you differently. The results from the recorded video are so bad, in fact, that the FCAT Perl scripts aren’t quite able to decipher them because they assume it is a poor capture; we can assure you that is not the case.
As much as we told you the single-card results continued to favor AMD’s Radeon HD 7970 GHz Edition, the CrossFire results here counter that. As a buyer of a high-end graphics card that will cost you over $400, the assurance of being able to run a multi-GPU solution to improve performance was not just insinuated, but explicitly given. At this point, it is fair to say that AMD is not living up to its promises.

NVIDIA SLI Performance – How we expected multi-GPU to work
The NVIDIA GeForce GTX 680 looks slower than the HD 7970 in our single GPU comparisons, but that all changes when we compare dual-GPU to dual-GPU in this category. While AMD’s solution showed thousands of runt frames on BF3, Crysis 3 and Sleeping Dogs (two of which are AMD Gaming Evolved titles), NVIDIA’s SLI was able to handle scaling without a problem. Battlefield 3 at 2560x1440 goes from an average of 57 FPS on one GTX 680 to 100 FPS on two of them; Crysis 3 at 1920x1080 scales from 31 FPS to 56 FPS; Sleeping Dogs goes from 24 to 46 FPS at 2560x1440. And it is able to do so without massive frame time variance, which means the animations are not only improved by better frame rates but are still nearly as smooth as the single card options.
The secret to NVIDIA’s success lies in the hardware frame metering technology that it has built into the SLI infrastructure since the beginning, but which is only just recently coming to light. Apparently a combination of both hardware on the GPU and software in the driver, the frame metering technology’s sole purpose is to balance the output of frames from the GPU to the display in such a way as to provide the best animation possible while balancing performance and input latency.
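A simplified sketch of what frame metering means in practice is shown below. This is an illustrative model only (the class name, smoothing factor and timing logic are assumptions), not NVIDIA's actual hardware/driver implementation: the idea is that a frame finished unusually early is held back briefly so frames reach the display at roughly even intervals.

# Hypothetical model of frame metering: delay frames that arrive too early so
# presentation intervals stay close to the recent cadence.

import time

class FrameMeter:
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self.avg_interval = None      # running estimate of the time between frames
        self.last_present = None

    def present(self):
        now = time.perf_counter()
        if self.last_present is not None:
            if self.avg_interval is None:
                self.avg_interval = now - self.last_present
            # If this frame finished much earlier than the recent cadence predicts
            # (e.g. the second GPU in AFR delivering back-to-back), delay it slightly.
            target = self.last_present + self.avg_interval
            if now < target:
                time.sleep(target - now)
                now = target
            # Update the running average with the (possibly delayed) interval.
            self.avg_interval = (self.smoothing * self.avg_interval
                                 + (1 - self.smoothing) * (now - self.last_present))
        self.last_present = now
        # ...hand the frame to the display here...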
In my talks with AMD before this article went live, they told us that they were simply doing what the game engine told them to do – displaying frames as soon as they were available. Well, as we can clearly see with the runts in more than half of our tested games, displaying a frame too early can be just as detrimental as displaying it too late. Without the ability to balance the output of the two GPUs (or three or four) you will run into these problems, and in fact we have seen the same thing happen with NVIDIA cards when metering is disabled. We are hoping that NVIDIA will give us the option to disable it and run some more Frame Rating tests to see how they compare in the near future.
In a couple of games, Far Cry 3 and on occasion DiRT 3, CrossFire is working as we would expect it to. Skyrim does not exhibit the runt problem, but it also doesn’t seem to scale at all over a single GPU. The inconsistency of this behavior might be just as troubling if my theory is correct. In Skyrim, Far Cry 3 and DiRT 3 at low resolutions, it would appear that the CPU may be the primary bottleneck for performance, and for Far Cry 3, a game that has numerous other technical issues, this may be why CrossFire is actually working. An artificial limiter in the game engine that helps meter out requests for frames to be rendered would essentially act like the hardware frame metering in NVIDIA’s SLI GPUs, allowing for a better overall experience. In games like BF3, Crysis 3 and Sleeping Dogs, where the GPU is in more demand, the AMD hardware/software combination is the limiting point in the pipeline, and this is where the AMD solution falters.

Vsync – Only a Partial Answer
When I posted my preview of these results during the launch of the GeForce GTX Titan, many of you wanted to know what effect Vsync would have on the runts and frame time variance. As it turns out, Vsync can in fact improve the situation for AMD’s CrossFire pretty dramatically, but it still leaves a lot of problems on the table. By metering the frame output of all GPU combinations, including CrossFire, Vsync removes the runts from our captures and keeps them from affecting performance. Take a look at the results in Crysis 3 at 1920x1080 on the Radeon HD 7970s in CrossFire to see the other emerging issue, though: drastically increased frame time variance. The constant shifting between 16ms and 33ms frame times means that you will often see stuttering animation even when the GPU has the performance to handle higher or more consistent frame rates.
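That 16ms/33ms shuffle follows directly from how Vsync quantizes presentation to refresh boundaries, as this small illustrative Python snippet shows (the 60 Hz refresh rate and the rounding model are assumptions for the example only):

# With Vsync on, a frame can only be swapped on a refresh boundary, so every
# render time is effectively rounded up to a multiple of the refresh interval.

import math

REFRESH_MS = 1000.0 / 60.0   # ~16.7 ms per refresh on a 60 Hz display

def vsynced_frame_time(render_time_ms):
    """Frame time as seen at the display with Vsync enabled."""
    return math.ceil(render_time_ms / REFRESH_MS) * REFRESH_MS

# A GPU rendering frames in 15-18 ms (a raw 55-66 FPS) ends up alternating
# between 16.7 ms and 33.3 ms displayed frames -- the stutter described above.
for t in (15.0, 16.0, 17.0, 18.0):
    print(f"render {t:>4.1f} ms -> displayed every {vsynced_frame_time(t):.1f} ms")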
To be fair, this same effect happens to NVIDIA’s GTX 680s in SLI. The only difference is that NVIDIA has some options to try to fix it, called Adaptive Vsync and Smooth Vsync. Both are activated through the NVIDIA Control Panel, but Smooth Vsync is only available for SLI users (we are hoping this will be added for single-GPU users as well soon). Adaptive Vsync fixes the frame times at your display refresh rate (16ms at 60 Hz, most of the time) any time your frame rate would be higher than 60 FPS, but then allows the engine to essentially “turn off” Vsync under 60 FPS so you don’t get the dramatic stuttering. Smooth Vsync is a little-known feature that attempts to only change the frame rate / frame times when it knows it will have extended periods of available performance.
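As a rough sketch of the Adaptive Vsync behaviour just described (the function name and return strings are illustrative assumptions, not NVIDIA's actual driver logic): sync when the GPU can hit the refresh rate, stop waiting for vblank when it cannot.

# Hypothetical Adaptive-Vsync-style decision for a single finished frame.

REFRESH_MS = 1000.0 / 60.0   # 60 Hz display assumed for the example

def adaptive_vsync_decision(render_time_ms):
    """Decide how a finished frame would be presented under adaptive Vsync."""
    if render_time_ms <= REFRESH_MS:
        # Fast enough for 60 FPS: wait for the next refresh, frame time stays ~16.7 ms.
        return "wait for vblank (16.7 ms frame, no tearing)"
    # Too slow for 60 FPS: present immediately rather than waiting a full extra
    # refresh, avoiding the sudden jump from 16.7 ms to 33.3 ms frames.
    return f"present immediately ({render_time_ms:.1f} ms frame, tearing possible)"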
In some select instances for AMD's CrossFire we can actually see a completely resolved frame variance result, as demonstrated with the Battlefield 3 2560x1440 graphs. But Vsync still introduces other problems for the latency and interactivity of PC games, a topic we are going to dive into again soon.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Wow, what a difference - like the whole subject flip-flops:

http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Tes-11

I wonder if the AMD driver team works with v-sync on from the start.

Since I always use le sync of V, that might be why I never saw these issues (or at least not as many as others do).

Hmmm...sort of makes me more interested in getting my second card, wanna see the riff-raff of V-Sync on versus off.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
The thing with Vsync or frame limiting is that you're trading raw performance for smooth gameplay. Kind of like choosing a single high-end card over two mid-range but "faster together" GPUs. :p
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Wow, AMD did better at single cards using an NVIDIA latency test... How embarrassing :p
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
PCPer has shown very clearly what the issue with vsync is: it causes higher variance, it definitely causes input latency, and it also causes huge steps in the frame rate. If you play a game with this constant 60/30 jumping it's horrible. You need to leave a lot of performance headroom and set your graphics up for the very worst case so you never end up swapping back and forth.

Crossfire is just broken, has been since the day the 7970 was released and I am so glad this level of test has finally been done.
 

Cadarin

Member
Jan 14, 2013
30
0
16
Crossfire is just broken, has been since the day the 7970 was released and I am so glad this level of test has finally been done.

I'm happy too. I own a 7970 and there was always a possibility in the back of my mind that I might add a second at some point. Not now. Crossfire truly is broken, and anyone running that setup has wasted a lot of money. Anyone happy with their Crossfire setup was completely fooled as they could have garnered the same performance with a single card.

On a positive note, I'm not regretting my single 7970 at all. It's performing equal to or better than a 680, and cost me a lot less.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
I wonder if this has been the case with CF for a long time,
I would like to see something like 5970 with old and new drivers being tested.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
I'm happy too. I own a 7970 and there was always a possibility in the back of my mind that I might add a second at some point. Not now. Crossfire truly is broken, and anyone running that setup has wasted a lot of money. Anyone happy with their Crossfire setup was completely fooled as they could have garnered the same performance with a single card.

On a positive note, I'm not regretting my single 7970 at all. It's performing equal to or better than a 680, and cost me a lot less.

Not if they are doing bitcoin mining....LOL
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I wonder if this has been the case with CF for a long time,
I would like to see something like 5970 with old and new drivers being tested.

I started to notice a problem with my 5970 in August 2011. I suspect the problem was introduced somewhere around there. That winter was horrible: most profiles were months late, every game I bought that season was broken, and the microstutter was terrible even when it supposedly had a profile. I didn't notice an issue earlier in the year, however.

The problem was immediately apparent on the 7970 at release and continued through to the moment I pulled the cards out after I had had enough.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Wow. Lonbjerk, that sure made your last zillion posts look stupid lolol :biggrin:
Show me the bit where 7970 doesn't own the 680..what?....you can't?
Ouch.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I have a question for those with dual 7970's or 7950's. In BF3, if you turn the settings up so that you only get 30 fps with a single GPU and then add a second GPU, does the experience subjectively get better?

I find it hard to believe that all those people running dual 7970's never noticed that there was no difference between crossfire working and crossfire not working.

If crossfire truly was broken in BF3, why haven't we heard anyone saying anything about it? It's been over a year. I'm tempted to call this review BS or poorly executed. People with crossfire 7970's haven't been complaining that they can't notice the difference between one 7970 and two 7970s.

Yep, read the conclusion. They do not state anywhere that "maybe the NVIDIA-provided tool is not able to read the competitor's cards correctly", which should really appear in any scientific conclusion. I mean, it does seem rather silly that NVIDIA would go out of their way to make their software work with other drivers, especially when this software was really only developed in their labs to work on their GPUs.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
BrightCandle tends to make a lot of void claims with only neurosis to back them.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Lol.
Must be the biggest placebo effect ever if 2 crossfire cards perform exactly the same as a single card.

Clearly this metric is missing something.
Either that or GPU reviewers are the most useless bunch and might as well just be replaced by trained monkeys.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
You guys are right.. this is an Nvidia ploy to make xfire look horrible and make their own gtx680 look worse as well.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
You guys are right.. this is an Nvidia ploy to make xfire look horrible and make their own gtx680 look worse as well.

I can accept that SLI is smoother.
I've a bigger problem accepting that when xfire is supported by the game it performs at single-card speed and no one noticed, especially when using triple monitor resolutions.

By the way any tri-fire and tri-sli tests?
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
I've a bigger problem accepting that when xfire is supported by the game it performs at single-card speed and no one noticed, especially when using triple monitor resolutions.

You won't notice until you move on to the next generation of single cards that can match the previous dual-card setup. There is no single card outside of Titan that can even get remotely close to the raw performance of 7970 CF.

I went from dual unlocked 6950s to a GTX 670, and the GTX 670 felt better despite putting up less frames. But I already knew I was experiencing micro-stutter before that when 40fps was completely unplayable on the 6950s. I can deal with 40fps on a single card...
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
You won't notice until you move on to the next generation of single cards that can match the previous dual-card setup. There is no single card outside of Titan that can even get remotely close to the raw performance of 7970 CF.

I went from dual unlocked 6950s to a GTX 670, and the GTX 670 felt better despite putting up less frames. But I already knew I was experiencing micro-stutter before that when 40fps was completely unplayable on the 6950s. I can deal with 40fps on a single card...

What they are saying is that you can't see the difference between one 6950 and two 6950s.
That when a new game is out and people notice crossfire doesn't work, and then it gets "fixed", it actually ain't fixed; you just imagined it.

When you got 40 fps, a single 6950 would get 20 fps, and now that is unplayable.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
Point is, 40fps on 2x 6950s was unplayable. If I could have magically overclocked a single 6950 to reach the same 40fps, it would have been playable.

If they are claiming that two cards in CF will feel the same as one, that just sounds a little bit ridiculous. But they certainly don't feel like anything near double, or whatever frame rates they are putting up together. And it's not even close, which is why this is finally being brought up.

There's no imagining anything. ~40fps doesn't go from being subjectively playable over a wide variety of games (single GPU), to completely unplayable (dual GPU), back to playable again (single GPU). I witnessed MS first hand and that was it. I don't need anyone telling me I was imagining things. I know what the reality is, but most simply don't. I didn't pick up on it for a while until one day I was finally just like, "why the hell is 40fps so unplayable? This should be alright..." and then it kind of just hit me, because in general I was experiencing this over a wide variety of games. So instead of needing ~40fps on a single GPU, I found I needed 60 with the 6950 dual-GPU setup. But 60 didn't quite feel like 60. It felt more like 40... playable.

I'm not trying to take any shots at AMD or AMD owners here. It's just that I've only had experience with 6850 CF and 6950 (unlocked to 6970) CF. And they obviously haven't fixed the problem in the 7000 series. Even AMD themselves have come out and admitted there is a problem - there's no denying it any more. Yeah, it's not going to make anyone feel good about having spent a lot on a CF setup, but it is what it is, and at least AMD is working on it.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
After a $200 experiment:

7970 crossfire is indeed broken.
Fraps reports 65 fps, however gameplay feels like 25 fps. Frametime deviation is 25 ms.

You do not have to take my word for it. Go out and buy a pair of 7970s and test them yourself. After you sell that pair of 7970s, do report back how much your experiment cost.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Actually, in some games 30 frames is smooth and in others it is unplayable.
And I only said you were imagining it when crossfire was on, since it was actually off. :)
MS is very subjective, having slowdowns or freezes isn't.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
After a $200 experiment:

7970 crossfire is indeed broken.
Fraps reports 65 fps, however gameplay feels like 25 fps. Frametime deviation is 25 ms.

You do not have to take my word for it. Go out and buy a pair of 7970s and test them yourself. After you sell that pair of 7970s, do report back how much your experiment cost.


Oh, I experienced it, and I literally got a headache after 10 minutes of BF3. My sig no longer has 2x 7970s in it. And wow, the 2nd card had coil whine.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
That's what happens to me too, if it's stuttery I get a headache. I was able to get my 470 to stutter like crazy in Skyrim... I even posted a video showing it, it was simply awful and it gave me a near instant headache.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Even AMD fanbois should like the AMD driver team being held to a higher standard, as it helps them get more out of their cards. It's a win for everybody to use better metrics than the outdated frames-per-second metric.