FCAT: The Evolution of Frame Interval Benchmarking


blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The reasons seem pretty clear and obvious: nVidia may have a competitive advantage and wants awareness of it, so it is being proactive by helping create tools.

What would you do? Not try to offer this information to the gamer?

I think SLI is better than CrossFire in that CrossFire usually needs frame limiters, RadeonPro, etc. to work well, while SLI is better out of the box. So I agree, but to say that CrossFire is the same as a single card is ridiculous. And then you have nvidia "suggesting" to throw certain frames out, and nvidia "nudging" PCPer to state that CrossFire is the same as a single card. That last part is ridiculous... I haven't used a 79xx in about a year, but in my time using CFX, dual cards were definitely perceptibly better, by a large margin.

I agree that tools are necessary. BUT, I don't think nvidia should have anything to do with such benchmarks. It needs to be a third party to eliminate any questions about objectivity - having nvidia, with an obvious interest in the matter, writing the benchmarks raises too many questions on its face. Or perhaps they could release the source code to the public. Ryan at PCPer is not any type of software engineer/developer, so his explanation on this is ridiculous. Anyway, open source would work as well IMO.

So let me summarize. These tools are great. These tests are great. I applaud the effort. However, nvidia providing this benchmark raises far too many questions. In fact, if you look at the comments at PCPer, many of them are questioning the validity due to nvidia creating it. It needs to be either:

A) Written by a third party

or

B) Completely open source to the public.

Period. Nvidia claiming patent issues with the source code is laughable at best. It's a benchmark which isn't being sold; maybe it's being used to help sell a product. Regardless, being fully third party or open source will eliminate any questions about it.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Chill out. Ryan already explained that NV wants there to be an alternate program measuring the same thing, so as to remove the perception of a conflict of interest, even though everyone using the program says they haven't seen any underhandedness in it.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Chill out. Ryan already explained that NV wants there to be an alternate program measuring the same thing, so as to remove the perception of a conflict of interest, even though everyone using the program says they haven't seen any underhandedness in it.

Like I said, Ryan is in no way qualified to make such a statement; he is not a software engineer or developer, nor does he have such knowledge.

Additionally, again: making the tools completely, 100% open source (if not third party) will eliminate any such questions of objectivity. Period. Nvidia should not be providing its own benchmarks; too many questions arise.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
If there is an issue with the tool, the explanation should come from AMD, not forum members. So far AMD has acknowledged no issues with the tool at all.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Ideally, one would like to see third-party tools, but they don't exist yet. Hopefully, this awareness will help with the creation of some.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I find it interesting that they are inundating us with articles that all show the exact same issue. We already know, so what's the point of posting the same thing every two days? None of this is new information.

That, combined with Nvidia providing the tools, is extremely suspect.

Of course it is -- different SKUs, single and multi-GPU configurations.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I think SLI is better than CrossFire in that CrossFire usually needs frame limiters, RadeonPro, etc. to work well, while SLI is better out of the box. So I agree, but to say that CrossFire is the same as a single card is ridiculous. And then you have nvidia "suggesting" to throw certain frames out, and nvidia "nudging" PCPer to state that CrossFire is the same as a single card. That last part is ridiculous... I haven't used a 79xx in about a year, but in my time using CFX, dual cards were definitely perceptibly better, by a large margin.

I agree that tools are necessary. BUT, I don't think nvidia should have anything to do with such benchmarks. It needs to be a third party to eliminate any questions about objectivity - having nvidia, with an obvious interest in the matter, writing the benchmarks raises too many questions on its face. Or perhaps they could release the source code to the public. Ryan at PCPer is not any type of software engineer/developer, so his explanation on this is ridiculous. Anyway, open source would work as well IMO.

So let me summarize. These tools are great. These tests are great. I applaud the effort. However, nvidia providing this benchmark raises far too many questions. In fact, if you look at the comments at PCPer, many of them are questioning the validity due to nvidia creating it. It needs to be either:

A) Written by a third party

or

B) Completely open source to the public.

Period. Nvidia claiming patent issues with the source code is laughable at best. It's a benchmark which isn't being sold; maybe it's being used to help sell a product. Regardless, being fully third party or open source will eliminate any questions about it.

Let's discuss this point about throwing away runts -- how do runts improve the experience?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I think SLI is better than CrossFire in that CrossFire usually needs frame limiters, RadeonPro, etc. to work well, while SLI is better out of the box. So I agree, but to say that CrossFire is the same as a single card is ridiculous. And then you have nvidia "suggesting" to throw certain frames out, and nvidia "nudging" PCPer to state that CrossFire is the same as a single card. That last part is ridiculous... I haven't used a 79xx in about a year, but in my time using CFX, dual cards were definitely perceptibly better, by a large margin.

I agree that tools are necessary. BUT, I don't think nvidia should have anything to do with such benchmarks. It needs to be a third party to eliminate any questions about objectivity - having nvidia, with an obvious interest in the matter, writing the benchmarks raises too many questions on its face. Or perhaps they could release the source code to the public. Ryan at PCPer is not any type of software engineer/developer, so his explanation on this is ridiculous. Anyway, open source would work as well IMO.

So let me summarize. These tools are great. These tests are great. I applaud the effort. However, nvidia providing this benchmark raises far too many questions. In fact, if you look at the comments at PCPer, many of them are questioning the validity due to nvidia creating it. It needs to be either:

A) Written by a third party

or

B) Completely open source to the public.

Period. Nvidia claiming patent issues with the source code is laughable at best. It's a benchmark which isn't being sold; maybe it's being used to help sell a product. Regardless, being fully third party or open source will eliminate any questions about it.

The tools and software are fine, I expect. Otherwise, distributing them and letting third parties use them would end up revealing anything fishy if it were there.

It's the use of nvidia's parameters for what data is valid and what isn't that falls short of good research practice when you are discussing them and their competition. The problem is the end result of the data: how it's chosen to be interpreted, and what value is placed on the different results attained using it.

This is where it becomes noteworthy to take stock of overt nvidia involvement and ponderous review approaches. I think we should get a better, more impartial picture once the next significant video card launches, when the rest of the hardware sites with this tool are up to speed on it and deliver reviews that may contain some data from using it. PCPer is already toeing the nvidia line that this amounts to the sole metric of worth in any video card review, as well as using nvidia's parameters for what portion of a delivered frame is of worth; other sites have said the tool is merely additive rather than all-encompassing. More food for thought on possible bias, via claims that a tool and parameters straight from nvidia should be the paramount method of reviewing video card hardware...
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Like I said, Ryan is in no way qualified to make such a statement; he is not a software engineer or developer, nor does he have such knowledge.

Additionally, again: making the tools completely, 100% open source (if not third party) will eliminate any such questions of objectivity. Period. Nvidia should not be providing its own benchmarks; too many questions arise.

Given that the tool seems to match up to a high degree with FRAPS, which is not an NV product, it's more likely than not legit. Would it be nice if it were open source? Yes, but these are the tools available TODAY. If you are not interested in such tools, then you can basically ignore the articles and wait for someone else you deem sufficiently disinterested to come up with similar software.
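
One way to firm up the "matches FRAPS" observation would be to correlate the two tools' frame times directly. Below is a rough sketch of such a cross-check; the file names and the one-column CSV layout are hypothetical, not the actual output format of either tool.

```python
# Hypothetical cross-check: if FCAT-derived frame times track FRAPS frame
# times closely, gross manipulation in the FCAT toolchain is unlikely.
import csv

def load_ms(path: str) -> list[float]:
    """Read one frame time in milliseconds per row from a CSV file."""
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f)]

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation of the two series, trimmed to equal length."""
    n = min(len(xs), len(ys))
    xs, ys = xs[:n], ys[:n]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

fraps = load_ms("fraps_frametimes.csv")  # hypothetical capture file
fcat = load_ms("fcat_frametimes.csv")    # hypothetical capture file
print(f"correlation: {pearson(fraps, fcat):.3f}")  # near 1.0 => the tools agree
```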
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
As I understand Ryan's most recent articles, AT will start using FCAT, but only after a generic/non-NVIDIA version is out, e.g. from the folks who make FRAPS?

So my first question is: when will this be available? I want to start seeing AT reviews with frame time graphs as the standard metric.

My second question is: what will this mean for the GPU section of anandtech.com/bench, where card comparisons are all about average frame rates?
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Let's discuss this point about throwing away runts -- how do runts improve the experience?

Also, let's agree on a definition of what a runt should be. Note that in the abstract sense the word itself is rather pejorative (a relative term for being smaller and/or weaker than others). "Runt" is therefore a rather subjective word choice.

It would have been nice to use an objective word, or maybe a mathematical definition.
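
The mathematical definition asked for here is easy to write down; the hard part is choosing the threshold, which is exactly the subjective piece. A minimal sketch in Python follows (the names are illustrative; FCAT's actual scripts have not been published):

```python
# One possible objective "runt" definition, given each frame's height in
# scan lines (e.g. as extracted from FCAT's captured video output).
def classify_frame(height_lines: int, screen_lines: int = 1080,
                   runt_fraction: float = 0.02) -> str:
    """Label a frame by how much of the screen it actually occupied."""
    if height_lines == 0:
        return "dropped"  # the frame never reached the screen at all
    if height_lines < runt_fraction * screen_lines:
        return "runt"     # visible, but arguably too small to matter
    return "full"

# A 20-line sliver on a 1080-line screen is a runt at the 2% cutoff
# (20 < 21.6) but "full" at a 1% cutoff (20 > 10.8) -- the label depends
# entirely on where you draw the line.
print(classify_frame(20))                      # -> "runt"
print(classify_frame(20, runt_fraction=0.01))  # -> "full"
```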
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
http://www.tomshardware.com/reviews/graphics-card-benchmarking-frame-rate,3466-13.html

We've known about the existence of Nvidia's frame metering technology for years, and its efforts to quantify the benefits of spacing frames out evenly, minimizing dropped and runt frames, for months. It was only recently, however, that the company was willing or able to show off the fruits of its development efforts. Even today, the tools can be a little finicky. We would have had even more performance data, except that our X79 Express-based benchmark platform was spitting out FCAT data that clearly wasn't right. Switching over to Z77 Express at the last minute gave us the results we were looking for.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
http://www.tomshardware.com/reviews/graphics-card-benchmarking-frame-rate,3466-13.html

We've known about the existence of Nvidia's frame metering technology for years, and its efforts to quantify the benefits of spacing frames out evenly, minimizing dropped and runt frames, for months. It was only recently, however, that the company was willing or able to show off the fruits of its development efforts. Even today, the tools can be a little finicky. We would have had even more performance data, except that our X79 Express-based benchmark platform was spitting out FCAT data that clearly wasn't right. Switching over to Z77 Express at the last minute gave us the results we were looking for.


Interesting, and as it turns out, PCPer is using X79.

Intel Core i7-3960X Sandy Bridge-E
ASUS P9X79 Deluxe
etc.

More good food for thought!
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
You don't think that runt frames are objectively smaller than normal frames?

In a litter of puppies, each puppy is objectively a different size from the others. But where do you draw the line and call a smallish puppy a runt? If he's 5% smaller?

I don't think a definition like "smaller than normal" is consistent with "runt."

So the question remains: *how much* smaller before something is a runt? Surely a frame that is 99.99999999999999% as big as another frame is objectively smaller. But I can't wrap my mind around calling that a runt.

A runt would seem like... well, I don't know how much smaller than a 100% normal frame. Do you know where to draw the line?
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
I agree that the concept of runt frames really needs to be better defined.

Having used both HD5000 and HD7000 Crossfire, I can say that the graphs showing the performance of Crossfire being no better than a single card after discarding runt frames are just plain wrong. I'm sure it's not the 90-100% scaling that can often be found with FRAPS alone, but pinning down the real number is going to take more work.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
From the link GaiaHunter kindly offered:

Having said all of that, we're once again left with questions, even as we uncover a number of answers. While it seems obvious that a runt frame of fewer than 21 scan lines contributes little (or nothing) to the smoothness of a game, would a hardcore gamer see a quality improvement if we split the screen evenly by two, three, or even 50 frames composed of 22 or more lines on the screen? The FCAT tool is built to facilitate user-specified definitions of how large a runt frame can be, and we'll need to play with the script's switches to really dial-in our own recipe for performance evaluation. What we're presenting today is really what FCAT can do out of its proverbial box.


This is a fair point.
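
To make that concrete, here is a hypothetical sketch of the dialing-in THG describes: sweep the runt cutoff and count how many frames each setting would discard. The frame heights are made-up sample data, not real FCAT output.

```python
# Sweep the user-specified runt threshold and see how sensitive the
# verdict is to where the line gets drawn.
sample_heights = [540, 18, 530, 12, 560, 200, 545, 5, 550, 90]  # scan lines per frame (made up)
SCREEN_LINES = 1080

for pct in (0.02, 0.05, 0.10, 0.20):
    cutoff = pct * SCREEN_LINES
    runts = sum(1 for h in sample_heights if 0 < h < cutoff)
    print(f"cutoff {pct:.0%} ({cutoff:.1f} lines): "
          f"{runts} of {len(sample_heights)} frames counted as runts")
```

With this sample, the 2% and 5% cutoffs flag the same three slivers, while 10% and 20% start sweeping up progressively larger frames -- which is why the choice of threshold matters so much.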
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
According to THG, a runt frame is 20 rows or fewer, which is less than 2% of the screen. That's pretty damn small; you could easily say they were erring on the side of caution, and this was the default setting given to them by Nvidia. I might be willing to put the runt cutoff at 5% of the screen, if not higher. THG did say they plan to try different sizes in the future. That should be interesting.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I find it interesting that they are inundating us with articles that all show the exact same issue. We already know, so what's the point of posting the same thing every two days? None of this is new information.

That, combined with Nvidia providing the tools, is extremely suspect.

A lot of people have been complaining in the comments on these articles that they only tested X cards, and that Y cards may not have the problem. Unfortunately, this repetition is needed to show that it is not a problem with just a single card.

They also plan to test latency, since that is what AMD said they were focusing on, so they want to see what advantages there might be from AMD's methods.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
According to THG, a runt frame is 20 rows or fewer, which is less than 2% of the screen. That's pretty damn small; you could easily say they were erring on the side of caution, and this was the default setting given to them by Nvidia. I might be willing to put the runt cutoff at 5% of the screen, if not higher. THG did say they plan to try different sizes in the future. That should be interesting.

Why not 10% or less?

Or 20% or less?

What does that mean to me when I'm playing a game?

And does the process NVIDIA uses to send frames at regular intervals also induce latency? Or does that happen only with Vsync?

Is the latency induced by vsync or a frame limiter actually having an impact?

What happens when a 120Hz screen is used?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Why not 10% or less?

Or 20% or less?

What does that mean to me when I'm playing a game?

And does the process NVIDIA uses to send frames at regular intervals also induce latency? Or does that happen only with Vsync?

Is the latency induced by vsync or a frame limiter actually having an impact?

What happens when a 120Hz screen is used?

You may want to send your questions to Ryan directly if you are serious about having them fine-tune their testing methodology.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Why not 10% or less?

Or 20% or less?

What does that mean to me when I'm playing a game?

And does the process NVIDIA uses to send frames at regular intervals also induce latency? Or does that happen only with Vsync?

Is the latency induced by vsync or a frame limiter actually having an impact?

What happens when a 120Hz screen is used?

As far as the size goes, I think they erred on the side of caution, but at what point do you stop? I don't know. The higher your FPS, the smaller each frame's slice of the screen becomes anyway, but it takes about 3,000 FPS (at 60Hz on a 1080-line screen) before 2% is the normal size.

There obviously is some latency induced by it. All it does, though, is slow a fast frame down to be closer to the frame time of the prior frame, so it shouldn't induce a lot of latency, but we'll have to wait and see. It does add the benefits of evenly spaced frames and input points. Actually, we do see some of the latency in the FRAPS data, and it has been shown to be not that different from AMD's current beta drivers, and better than before their fixes. More study would be good.

Exactly what matters is hard to define. We do see patterns, and we do see people noticing differences, but it'll be a while before we can come up with an exact method of sorting what is bad, acceptable, and good.

There is no way you can convince me that a frame that is 2% of the screen, followed by frames 50% or larger, adds something visually useful. I would say even larger than 2% is suspect as well, but exactly where the cutoff lies is going to be subjective. I do believe the percentage could scale with your FPS as well.

120Hz monitors should either increase the time each frame is on the screen or increase their size, so 2% becomes even smaller in terms of time spent on the screen. Then again, a 120Hz monitor is better able to use v-sync.
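
The 3,000 FPS figure falls straight out of scanout arithmetic. A quick sketch, assuming a 60Hz refresh, a 1080-line screen, evenly spaced frames, and no v-sync:

```python
# With frames arriving evenly during a fixed scanout, each frame's average
# slice of the screen is (refresh_hz * screen_lines) / fps scan lines.
def lines_per_frame(fps: float, refresh_hz: int = 60, screen_lines: int = 1080) -> float:
    return refresh_hz * screen_lines / fps

print(lines_per_frame(3000))  # -> 21.6 lines, exactly 2% of 1080
print(lines_per_frame(60))    # -> 1080.0, one full screen per frame
print(lines_per_frame(120))   # -> 540.0, half the screen per frame
```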
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I really wish a review site would do this with 120Hz monitors. I'm interested to see the higher refresh rate's impact on frame output.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I really wish a review site would do this with 120Hz monitors. I'm interested to see the higher refresh rate's impact on frame output.
I wouldn't mind that either, as I use a 120Hz monitor.

What I'd expect is that without v-sync, the monitor's refresh rate should have no impact on frame rendering times (I'm fairly certain on this point). However, as it displays frames twice as often, it should result in partial frames twice the size and fewer tears in general.
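
That expected doubling follows from the same scanout arithmetic sketched above: doubling the refresh rate doubles the scan lines swept per second, so at a fixed (hypothetical) frame rate each partial frame covers twice as many lines.

```python
# Same assumptions as the earlier sketch: no v-sync, evenly spaced frames.
def lines_per_frame(fps: float, refresh_hz: int, screen_lines: int = 1080) -> float:
    return refresh_hz * screen_lines / fps

fps = 200.0  # hypothetical multi-GPU frame rate
print(lines_per_frame(fps, 60))   # -> 324.0 lines per partial frame at 60Hz
print(lines_per_frame(fps, 120))  # -> 648.0 at 120Hz: twice the size
```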