FCAT: The Evolution of Frame Interval Benchmarking


96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
You guys are making mountains out of molehills about the whole Nvidia/PCPer transparency...

:biggrin:
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
From Article #1, page 1, last paragraph:

Originally Posted by pcper
NVIDIA's Involvement
You may notice that there is a lot of “my” and “our” in this story while also seeing similar results from other websites being released today. While we have done more than a year’s worth of the testing and development on our own tools to help expedite a lot of this time consuming testing, some of the code base and applications were developed with NVIDIA and thus were distributed to other editors recently.
NVIDIA was responsible for developing the color overlay that sits between the game and DirectX (in the same location of the pipeline as FRAPS essentially) as well as the software extractor that reads the captured video file to generate raw information about the lengths of those bars in an XLS file. Obviously, NVIDIA has a lot to gain from this particular testing methodology: its SLI technology looks much better than AMD’s CrossFire when viewed in this light, highlighting the advantages that SLI’s hardware frame metering bring to the table.
The next question from our readers should then be: are there questions about the programs used for this purpose? After having access to the source code and applications for more than 12 months I can only say that I have parsed through it all innumerable times and I have found nothing that NVIDIA has done that is disingenuous. Even better, we are going to be sharing all of our code from the Perl-based parsing scripts (that generate the data in the graphs you’ll see later from the source XLS file) as well as a couple of examples of the output XLS files.
Not only do we NEED to have these tools vetted by other editors, but we also depend on the community to keep us on our toes as well. When we originally talked with NVIDIA about this project the mindset from the beginning was merely to get the ball rolling and let the open source community and enthusiast gamers look at every aspect of the performance measurement. That is still the goal – with only one minor exception: NVIDIA doesn’t want the source code of the overlay to leak out simply because of some potential patent/liability concerns. Instead, we are hoping to have ANOTHER application built to act as the overlay; it may be something that Beepa and the FRAPS team can help us with.
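The extraction step described above (reading the overlay bar heights out of a captured video to recover per-frame screen time) boils down to a run-length scan. Here is a rough Python sketch of the idea; the color palette, the 21-scanline runt cutoff, and the input format are illustrative assumptions, not FCAT's actual code:

```python
# Sketch of the FCAT-style extraction idea: recover per-game-frame
# scanline counts from the overlay color column of one captured
# 60 Hz video frame. A change in overlay color marks a new game frame.

REFRESH_INTERVAL_MS = 1000.0 / 60.0   # one captured frame spans 16.7 ms of display time
RUNT_THRESHOLD = 21                   # scanlines; below this, call it a runt (assumed cutoff)

def extract_frame_times(overlay_column):
    """overlay_column: list of color indices, one per scanline, top to bottom."""
    runs = []                          # run-length encode the column
    for color in overlay_column:
        if runs and runs[-1][0] == color:
            runs[-1][1] += 1           # same game frame continues down the screen
        else:
            runs.append([color, 1])    # color changed -> a new game frame begins
    total = len(overlay_column)
    return [{"color": color,
             "scanlines": height,
             "on_screen_ms": round(REFRESH_INTERVAL_MS * height / total, 2),
             "runt": height < RUNT_THRESHOLD}
            for color, height in runs]

# A 1080-line capture where the middle frame is a 12-scanline sliver (a runt):
column = [0] * 600 + [1] * 12 + [2] * 468
frames = extract_frame_times(column)
for f in frames:
    print(f)
```

The per-run heights are what end up as the bar lengths in the XLS file; everything after that is ordinary spreadsheet/Perl crunching.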
:whiste:

Irrelevant, as that is a quotation from the NDA lift from nvidia for FCAT. They were releasing results long before that point, when no one else was, with zero disclosure. There were hints that nvidia was involved with PCPer's results before the NDA lift, but not even from PCPer; it was alluded to in an article at techreport on what they were doing, referring to nvidia's involvement as 'a big industry player'.

None of this says anything as to the veracity, motives and impetus of their current post-NDA reveal or prior results. That is important to note; it just indicates questionable integrity and methodology in trying to deliver impartial reviews with the customer's best interests as the driving reason for the studies. Further activities, like trying to pass off a review of an unreleased card and drivers using completely different hardware and drivers, just paint more of a shoddy approach to proper review methods.

It is still left in the reader's subjective hands to decide whether those aberrations matter when reading the results. We also have many other sites with the tool in hand now, without the questionable behaviour, so soon we will have more than one source, and sources not as deeply embedded with nvidia as PCPer appears to be. :thumbsup:
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
From Article #1, page 1, last paragraph:


:whiste:

Article #1 was back on Jan 3, no word about NV involvement.
Article #2 was back on Jan 16, no word about NV involvement.
Article #3 was back on Feb 22, no word about NV involvement.

Also, I'm not saying there's anything phony about NV and PCPer. PCPer is just giving themselves more credit than they deserve.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Can anyone else spot the problems with this picture?

[Image: FrameInterval_575px.jpg]


That is not how frames are drawn on the screen at all.

I do understand what you feel is wrong, and it all depends on how you interpret what was shown. It is true that those are the locations where the game frames show up on the display, but it is also true that you see the bottom part of a game frame first when its parts land on two different displayed frames.

The idea is correct; you are just getting tripped up on a specific detail that wasn't meant to be conveyed.
 

willomz

Senior member
Sep 12, 2012
334
0
0
Article #1 was back on Jan 3, no word about NV involvement.
Article #2 was back on Jan 16, no word about NV involvement.
Article #3 was back on Feb 22, no word about NV involvement.

Also, I'm not saying there's anything phony about NV and PCPer. PCPer is just giving themselves more credit than they deserve.

It does sound as though they got the ball rolling, that they gave birth to the idea. That's the big part for me.

But hey if you want to give all the credit to Nvidia that's fine.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
It does sound as though they got the ball rolling, that they gave birth to the idea. That's the big part for me.

But hey if you want to give all the credit to Nvidia that's fine.

Just sitting here trying to figure out what difference it makes anyway. Just a distracting talking point? I dunno. Doesn't really matter. PCPer is providing really nice coverage.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
This is true. Using RadeonPro I can eliminate or vastly reduce Crossfire microstutter/frametime problems. Vsync, Triple Buffering, Flip queue sizes, FPS caps can all be used to work around the problem. Even if you do prefer to use no vsync or FPS caps it is possible to work around the problem using flip queue adjustments.

It does mean CF can be a case of trial and error to get working well, but it can be done. I would still recommend SLI if you were starting from scratch, but if you have an AMD 7xx0 series card and are prepared to tinker, going CF does still give a massive performance boost.

I'm not so sure about this. 1) Is the FPS measured by FRAPS accurate, then, if half the frames are runts? And 2) if you need to limit the frames from 100 down to 60, what is the point of the 2nd GPU?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I'm not so sure about this. 1) Is the FPS measured by FRAPS accurate, then, if half the frames are runts? And 2) if you need to limit the frames from 100 down to 60, what is the point of the 2nd GPU?

I can answer this - a crap ton more of AA and IQ tweaks.

1x7970:
WoW Ultra @ 1440p stuck using 8xMSAA.
60 FPS - 90% of the time, dips into the 40-50FPS during raids/traveling

2x7970s:
WoW Ultra @ 1440p able to switch up to 4xSSAA+EQ + HQ-AFx16/HQ Bitmapping
60 FPS - 90% of the time, dips into the 40-50FPS during raids/traveling


1x7970:
Bioshock: Infinite Ultra + DDoF @ 1440p == 40-50FPS (>90% GPU utilization)
2x7970:
Bioshock: Infinite Ultra + DDoF @ 1440p == 60FPS (with overhead, as GPUs are pegged at about 60-70% usage, but not many IQ options to force, unfortunately.)

1x7970:
Tomb Raider Ultra+TressFX+FXAA @ 1440p == 40-50FPS
or
Tomb Raider Ultra+TressFX+2xSSAA @ 1440p == ~30FPS
2x7970:
Tomb Raider Ultra+TressFX+FXAA @ 1440p == 60FPS
or
Tomb Raider Ultra+TressFX+2xSSAA @ 1440p == >50FPS


For the record, I've always used v-sync since I can't stand tearing. So far only WoW stutters like a bastard, and that seems to be only in the world map; in dungeons/raids it works unbelievably smoothly.

I've only had my CFX setup for about 4 days, and after getting over the Radeon Pro learning curve, I've decided to keep the second card. I'm applying more AA and IQ tweaks to my games without performance issues or stuttering (again, WoW being the only exception), which to me is a huge perk.

Of course everyone is different, which is why PC gaming is still the best - options!
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
I can answer this - a crap ton more of AA and IQ tweaks.

1x7970:
WoW Ultra @ 1440p stuck using 8xMSAA.
60 FPS - 90% of the time, dips into the 40-50FPS during raids/traveling

2x7970s:
WoW Ultra @ 1440p able to switch up to 4xSSAA+EQ + HQ-AFx16/HQ Bitmapping
60 FPS - 90% of the time, dips into the 40-50FPS during raids/traveling


1x7970:
Bioshock: Infinite Ultra + DDoF @ 1440p == 40-50FPS (>90% GPU utilization)
2x7970:
Bioshock: Infinite Ultra + DDoF @ 1440p == 60FPS (with overhead, as GPUs are pegged at about 60-70% usage, but not many IQ options to force, unfortunately.)

1x7970:
Tomb Raider Ultra+TressFX+FXAA @ 1440p == 40-50FPS
or
Tomb Raider Ultra+TressFX+2xSSAA @ 1440p == ~30FPS
2x7970:
Tomb Raider Ultra+TressFX+FXAA @ 1440p == 60FPS
or
Tomb Raider Ultra+TressFX+2xSSAA @ 1440p == >50FPS


For the record, I've always used v-sync since I can't stand tearing. So far only WoW stutters like a bastard, and that seems to be only in the world map; in dungeons/raids it works unbelievably smoothly.

I've only had my CFX setup for about 4 days, and after getting over the Radeon Pro learning curve, I've decided to keep the second card. I'm applying more AA and IQ tweaks to my games without performance issues or stuttering (again, WoW being the only exception), which to me is a huge perk.

Of course everyone is different, which is why PC gaming is still the best - options!

Thanks for the info on your Crossfire setup. Are you using dynamic vsync in Radeon Pro? If so, that isn't going to reduce tearing (or stuttering) when you drop below 60fps. On the flipside, if you just use standard vsync, you'll actually be running at an effective 30fps, because almost all your averages are below 60fps. You'll notice that FRAPS will report an fps between 30 and 60, but the frametimes indicate what's really going on - it will be locked at 30fps. And when I play Tomb Raider on my HD7870 with vsync, it sometimes drops down to an effective 20fps, the next step below 30fps.
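The step behaviour described above is just double-buffered vsync arithmetic: a finished frame still waits for the next 16.7 ms refresh tick, so effective rates quantize to 60, 30, 20, 15... A quick sanity check of that arithmetic (assuming a 60 Hz panel and no triple buffering):

```python
import math

def vsync_effective_fps(render_ms, refresh_hz=60):
    """Double-buffered vsync: a finished frame still waits for the next
    refresh tick, so the effective rate is the refresh rate divided by
    the number of whole refresh intervals each frame spans (60, 30, 20, ...)."""
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(render_ms / interval_ms)

for ms in (15, 17, 25, 40):
    print(f"{ms} ms per frame -> effective {vsync_effective_fps(ms):.0f} fps")
```

So a game averaging a bit under 60 fps unsynced snaps down to a 30 fps lock under plain vsync, and below 33.3 ms per frame it snaps again to 20 fps, as described above.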

By the way, don't forget to update your sig!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I think the motives are clear: nVidia may have a competitive advantage, desires awareness of it, and helped create the tools.

They're being pro-active about it!
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,044
3,831
136
I'm not so sure about this. 1) Is the FPS measured by FRAPS accurate, then, if half the frames are runts? And 2) if you need to limit the frames from 100 down to 60, what is the point of the 2nd GPU?


You realise that 100 FPS with 40 runts is better than 60 FPS with 0 runts. The reason is that all the simulation is still tied to the renderer, so the simulation runs more often.

AMD has stated that they think this is "better" than pacing frames so they come out at an even interval, but in future drivers (June/July) they are going to provide a slider in CCC to change the behaviour.
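To put the trade-off in numbers: if frames that occupy almost no screen time are excluded from the count, which is roughly what a runt-aware "observed FPS" metric does, 100 rendered frames with 40 runts collapse to about the same displayed rate as 60 evenly paced frames. A toy illustration (the 1 ms runt cutoff and the input format are assumptions for the example, not any site's exact methodology):

```python
def observed_fps(frame_screen_times_ms, runt_cutoff_ms=1.0, window_s=1.0):
    """Count only frames that were actually on screen for at least the
    cutoff; runts were rendered (and counted by FRAPS) but barely shown."""
    visible = [t for t in frame_screen_times_ms if t >= runt_cutoff_ms]
    return len(visible) / window_s

# One second of capture: 100 frames rendered, 40 of them near-invisible runts.
times = [0.5] * 40 + [(1000.0 - 40 * 0.5) / 60] * 60
print(len(times), "rendered fps ->", observed_fps(times), "observed fps")
```

Whether the extra simulation ticks from those 40 runts are worth anything is exactly the subjective question being argued here.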
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You realise that 100 FPS with 40 runts is better than 60 FPS with 0 runts. The reason is that all the simulation is still tied to the renderer, so the simulation runs more often.

AMD has stated that they think this is "better" than pacing frames so they come out at an even interval, but in future drivers (June/July) they are going to provide a slider in CCC to change the behaviour.

That is debatable; when a simulation step runs too near the previous one, almost nothing happens between them. Though it may not hurt either.

Ultimately, it would be better to space them out some, which also gives more evenly spaced input points.
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,044
3,831
136
That is debatable; when a simulation step runs too near the previous one, almost nothing happens between them. Though it may not hurt either.

Ultimately, it would be better to space them out some, which also gives more evenly spaced input points.


That's why I said "better", as it is completely subjective. As a rule of thumb I don't go dual-card configs, so I can't speak from direct experience as to which method is "better".
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
you realise that 100 FPS with 40 runts is better then 60 FPS with 0 runts. The reason is all the simulation is still tied to the renderer so you get the simulation run more often.

AMD have stated that they think this is "better" then pacing frames so they come out at an even interval, but in future drivers(june/july) they are going to provide a slider in CCC to change the behaviour.

I guess if that is the only option -- but what if one can receive 90-95 FPS with zero runts?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
It does sound as though they got the ball rolling, that they gave birth to the idea. That's the big part for me.

But hey if you want to give all the credit to Nvidia that's fine.

More like given a ball and told to go play with it in the street where everyone could see. Once better sites have finished all their testing, it will be interesting to see how they reflect on each other's results. Of note, the other sites with the tool have released some results but reserved final judgements or conclusions, unlike pcper, who was rushing to judgement months ago and was so eager to judge unreleased hardware that they created a fantasy review of it.

Credit has to go to nvidia for what they have done here: they found a new metric that they do better at and are pushing to get it used. I've seen some of the fraps frametime test results done on GTX 4XX and 5XX cards, and the results were terrible. This big push may be the result of their having been aware of that problem with the Fermi cards, improving on it hugely with the 6XX Kepler cards, and then wanting recognition for that.

I'm waiting for the full reviews from TR, Tom's and most of all here at AT. The whole notion of the value of the size of each rendered frame, and of pushing frames out as fast as possible versus adding latency to even out when frames are sent to your monitor, and what this means for the experience, seems to be where the subjective/objective (or lack thereof) views come into play.

We need to hear conclusions from more than just one site, let alone a site echoing what may possibly be nvidia mouthpiece-work.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You do have to recognize that Pcper has had the software and hardware the longest and has been working on the project for over a year, so they have spent much more time pondering the results and what they mean.

Though they would be better received if they let the end user make more of the judgements; it is hard not to come to their conclusions anyway, and that approach doesn't upset the ones not ready to accept it.
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,044
3,831
136
I guess if that is the only option -- but what if one can receive 90-95 FPS with zero runts?

On a 60Hz monitor, probably not that much of a difference, so long as the 100 FPS with 40 runts completes a full frame every 16.7ms. On a 120Hz monitor, 90 FPS with zero runts would obviously be better.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Originally Posted by pcper
NVIDIA's Involvement
You may notice that there is a lot of “my” and “our” in this story while also seeing similar results from other websites being released today. While we have done more than a year’s worth of the testing and development on our own tools to help expedite a lot of this time consuming testing, some of the code base and applications were developed with NVIDIA and thus were distributed to other editors recently.
NVIDIA was responsible for developing the color overlay that sits between the game and DirectX (in the same location of the pipeline as FRAPS essentially) as well as the software extractor that reads the captured video file to generate raw information about the lengths of those bars in an XLS file. Obviously, NVIDIA has a lot to gain from this particular testing methodology: its SLI technology looks much better than AMD’s CrossFire when viewed in this light, highlighting the advantages that SLI’s hardware frame metering bring to the table.
The next question from our readers should then be: are there questions about the programs used for this purpose? After having access to the source code and applications for more than 12 months I can only say that I have parsed through it all innumerable times and I have found nothing that NVIDIA has done that is disingenuous. Even better, we are going to be sharing all of our code from the Perl-based parsing scripts (that generate the data in the graphs you’ll see later from the source XLS file) as well as a couple of examples of the output XLS files.
Not only do we NEED to have these tools vetted by other editors, but we also depend on the community to keep us on our toes as well. When we originally talked with NVIDIA about this project the mindset from the beginning was merely to get the ball rolling and let the open source community and enthusiast gamers look at every aspect of the performance measurement. That is still the goal – with only one minor exception: NVIDIA doesn’t want the source code of the overlay to leak out simply because of some potential patent/liability concerns. Instead, we are hoping to have ANOTHER application built to act as the overlay; it may be something that Beepa and the FRAPS team can help us with.
Hmm:sneaky:
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally Posted by pcper
NVIDIA's Involvement
You may notice that there is a lot of “my” and “our” in this story while also seeing similar results from other websites being released today. While we have done more than a year’s worth of the testing and development on our own tools to help expedite a lot of this time consuming testing, some of the code base and applications were developed with NVIDIA and thus were distributed to other editors recently.
NVIDIA was responsible for developing the color overlay that sits between the game and DirectX (in the same location of the pipeline as FRAPS essentially) as well as the software extractor that reads the captured video file to generate raw information about the lengths of those bars in an XLS file. Obviously, NVIDIA has a lot to gain from this particular testing methodology: its SLI technology looks much better than AMD’s CrossFire when viewed in this light, highlighting the advantages that SLI’s hardware frame metering bring to the table.
The next question from our readers should then be: are there questions about the programs used for this purpose? After having access to the source code and applications for more than 12 months I can only say that I have parsed through it all innumerable times and I have found nothing that NVIDIA has done that is disingenuous. Even better, we are going to be sharing all of our code from the Perl-based parsing scripts (that generate the data in the graphs you’ll see later from the source XLS file) as well as a couple of examples of the output XLS files.
Not only do we NEED to have these tools vetted by other editors, but we also depend on the community to keep us on our toes as well. When we originally talked with NVIDIA about this project the mindset from the beginning was merely to get the ball rolling and let the open source community and enthusiast gamers look at every aspect of the performance measurement. That is still the goal – with only one minor exception: NVIDIA doesn’t want the source code of the overlay to leak out simply because of some potential patent/liability concerns. Instead, we are hoping to have ANOTHER application built to act as the overlay; it may be something that Beepa and the FRAPS team can help us with.

I can play too, baiter.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The thing is, the problem was discovered before Nvidia involvement.

Just using FCAT, they were able to record the runt frames and show stuttering as a result. http://www.pcper.com/reviews/Graphi...art-3-First-Results-New-GPU-Performance-Tools

The problem was that, to deliver all the specific details of frame times and create graphs, they needed a script to collect and parse the data. Pcper was unable to create such a piece of software, and Nvidia helped them out, as they realized this could be good marketing for them, given their advantage in this area.

AMD acknowledges the data, though in the one interview I saw they said they had always focused purely on latency and never considered the spacing of frames.
http://www.anandtech.com/show/6857/amd-stuttering-issue

I don't see a reason to doubt the claims. How much it affects stuttering is open for debate. How much crossfire helps is questionable when these runt frames are present.

At current, it appears that any sort of FPS limiter can straighten out the issue. FPS limiters can even include v-sync and CPU bottlenecks. When the GPUs are fed data slower than they can handle, it creates some natural spacing of frames.
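That "natural spacing" effect can be sketched as a simple pacing loop. This is only an illustration of the principle; RadeonPro and driver-level limiters operate at a different layer of the stack:

```python
import time

def run_capped(render_frame, cap_fps=100, frames=5):
    """Naive software FPS cap: sleep off the rest of each frame budget so
    frames are handed to the GPU at roughly even intervals."""
    budget = 1.0 / cap_fps
    deadline = time.perf_counter()
    stamps = []
    for _ in range(frames):
        render_frame()                        # stand-in for the real work
        deadline += budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)             # the GPU is never fed early
        stamps.append(time.perf_counter())
    return stamps

stamps = run_capped(lambda: None)
gaps_ms = [(b - a) * 1000 for a, b in zip(stamps, stamps[1:])]
print([round(g, 1) for g in gaps_ms])         # roughly 10 ms apart at a 100 fps cap
```

The same even feeding happens implicitly under vsync or a CPU bottleneck, which is why those also mask the runt problem.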
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Man, what review sites will be left? [H], PCPer, and TechReport have all been blasted this gen because AMD isn't providing as smooth an experience.

It seems anyone producing any subjective or objective results outside of avg fps is getting blasted!
 

tygeezy

Senior member
Aug 28, 2012
300
14
81
More like given a ball and told to go play with it in the street where everyone could see. Once better sites have finished all their testing, it will be interesting to see how they reflect on each other's results. Of note, the other sites with the tool have released some results but reserved final judgements or conclusions, unlike pcper, who was rushing to judgement months ago and was so eager to judge unreleased hardware that they created a fantasy review of it.

Credit has to go to nvidia for what they have done here: they found a new metric that they do better at and are pushing to get it used. I've seen some of the fraps frametime test results done on GTX 4XX and 5XX cards, and the results were terrible. This big push may be the result of their having been aware of that problem with the Fermi cards, improving on it hugely with the 6XX Kepler cards, and then wanting recognition for that.

I'm waiting for the full reviews from TR, Tom's and most of all here at AT. The whole notion of the value of the size of each rendered frame, and of pushing frames out as fast as possible versus adding latency to even out when frames are sent to your monitor, and what this means for the experience, seems to be where the subjective/objective (or lack thereof) views come into play.

We need to hear conclusions from more than just one site, let alone a site echoing what may possibly be nvidia mouthpiece-work.
Were these in single or dual GPU configs? I can't seem to find any data like this for older cards such as my GeForce 480.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
The thing is, the problem was discovered before Nvidia involvement.

Just using FCAT, they were able to record the runt frames and show stuttering as a result. http://www.pcper.com/reviews/Graphi...art-3-First-Results-New-GPU-Performance-Tools

Again incorrect. That colour overlay software was from nvidia; also, the vernacular 'runt frames' is nvidia terminology, straight from them, to describe a shorter frame.

http://www.tomshardware.com/reviews/graphics-card-benchmarking-frame-rate,3466-2.html

That article was done with nvidia involvement, and that involvement appears to have been there right from the beginning, without disclosure.


It's just good contextual information to receive the results in. I don't think anyone is saying that what they are producing is all a bunch of nonsense; they are just the last site on a long list that I would feel comfortable believing is impartial. Generally you will always have different perspectives. You'll easily find some who take any information as gospel, even if the source was an anonymous poster on Craigslist, so long as it said something that aligned with an opinion that validated their feelings.

Since there is so much subjective opinion tied to these conclusions, and particularly with pcper using an arbitrary metric, with its genesis in themselves and nvidia, to literally discard a % of performance, good context is necessary given some of their suspect approaches to research.

My best guess is that nvidia knew they had done a lot of work to improve this metric and no one was paying attention to it. They worked with a site willing to do so without disclosure, one that would help get the ball rolling on generating awareness for these measurements. I'd never heard of this site until I saw Titan owners complaining on overclock.net about GPU Boost 2.0 on Titan not performing as advertised. They were linking to a youtube video of an nvidia marketing rep describing how it supposedly would work. Turned out it was pcper doing the video with them; I then saw they do a video with an nvidia marketer every time a card launches, in tandem with their always-glowing review of said card...

These kinds of statements will always ruffle the feathers of anyone who just wants to take whatever data they can that validates their personal feelings. But in the small fry world of computer hardware review sites, you have to pay attention to partiality with so many of these smaller sites fighting for air space and page hits/advert revenue.

That said, this is worthwhile information and definitely has value because of the nature of multi-gpu. I want to see more since I exclusively use multi-gpu setups and have for a long time now. One thing I noticed going from 480s in SLI to 680s was a dramatic difference in the feeling of smoothness, so there is something to it from one setup to another. So I look forward to seeing the full gamut of testing from multiple sites, rather than one with poor adherence to good research practices when done for the good of consumers.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Again incorrect. That colour overlay software was from nvidia; also, the vernacular 'runt frames' is nvidia terminology, straight from them, to describe a shorter frame.

http://www.tomshardware.com/reviews/graphics-card-benchmarking-frame-rate,3466-2.html

Did I not say the script to analyze the data was from Nvidia? That includes the coloring used to help view it.

Sorry, but you are just looking at the most recent data. This has been ongoing for a while, and it was not initiated by nvidia. Nvidia did create the tool to highlight the frames with color, as well as the script to analyze the data at a large scale, but the initial project was Pcper's alone, and the data capture, even without the color overlays, shows the runt frames.

Does adding color really make the data unreliable?
 