
A minimum average framerate petition. Edit: HTML graphs!! Anand still might be watching. Keep it going!

Originally posted by: Bovinicus
Woodchuck, no one is suggesting that the average FPS be removed from reviews. How could it possibly hurt things to add this in conjunction with the average framerate? I say that a standard deviation should also be part of the results. This would really give a lot of meaning to the average.
I'm more questioning the definitions given and the conclusions drawn. Extra data is generally a good thing, but if people are drawing nonsensical conclusions from it then it's not.

Edit: BTW, I agree that the SD would have more bearing on consistency than any kind of lowest decile statistic or any kind of 'minimum' FPS score.
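To make the standard-deviation suggestion concrete, here's a minimal sketch (the FPS numbers are made up for illustration) of reporting average plus standard deviation from per-second FPS samples:

```python
import statistics

# Hypothetical per-second FPS samples from one benchmark run
fps_samples = [60, 58, 62, 30, 61, 59, 60, 57]

avg = statistics.mean(fps_samples)   # the usual average framerate
sd = statistics.stdev(fps_samples)   # sample standard deviation

print(f"avg = {avg:.1f} FPS, stddev = {sd:.1f}")
```

Note how a single bad dip (the 30 here) barely moves the average but blows up the standard deviation — which is exactly the consistency signal the average alone hides.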
 
Originally posted by: Darien
Actually... if they ran a standardized benchmark...

FPS vs. time graphs would be nice. From there the average can be calculated, as well as min and max, not to mention standard deviation...

BTW, these should be scatter graphs of the time to render each frame, not running averages of FPS or frame time (running average is the default in most graphing programs, which hides or exaggerates different effects depending on effect duration vs. graph granularity). Scatters give a much truer picture.

True minimum FPS = 1/slowest frame time

Minimum FPS calculated from conventional graph would depend on graph granularity, how many faster frames got captured in the same sample, etc. For example, the data fishtank gave is actually a running average of framerate with a granularity of 1s. If in one of the one second samples, the game pauses for 0.5 seconds, then has a burst showing 120FPS for the next 0.5s, the average over that 1s interval will be 60FPS and the information that one of the frames took 0.5s (basically a slowdown to 2FPS) is lost.
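The 1-second example above can be reproduced directly from made-up frame times — a sketch, with all numbers illustrative:

```python
# One 0.5s stall followed by 0.5s rendered at 120 FPS (60 frames),
# mirroring the example in the post above.
frame_times = [0.5] + [1 / 120] * 60

avg_fps = len(frame_times) / sum(frame_times)   # what a 1s sample reports
true_min_fps = 1 / max(frame_times)             # 1 / slowest frame time

print(avg_fps)       # ~61 FPS: the stall is invisible in the average
print(true_min_fps)  # 2 FPS: the slowdown the 1s average throws away
```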
 
To glugglug: That is the major limitation of FRAPS, and thus, my method. I've e-mailed the creator of FRAPS and requested per-frame resolution.

That would be rather easy to implement. Just make the dumps to the frapssec file record how long it took to draw each frame instead of the average of how many frames per second.

The only disadvantage of such a system is that it suddenly also places an I/O load on the system. That would probably result in a minor performance hit, and I'm not sure that would be entirely acceptable. The solution would be to put the file in a small RAM drive, as I've considered, but that would complicate everything into a big mess.
 
The overhead of writing the time for each frame is absolutely negligible, as long as you don't specifically tell the OS to flush the file after each frame (at least minimal buffering).

Alternatively, the times can be held in an in-memory array that is written to disk at the end of the test. The call to QueryPerformanceCounter after each frame takes less than 1 microsecond, and assuming you get an extremely generous 100 frames per second, it would take almost 22 minutes for the buffer to fill even 1MB.
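A sketch of that in-memory approach — Python standing in for native code, and `time.perf_counter` standing in for `QueryPerformanceCounter`:

```python
import time

frame_stamps = []

def on_frame_presented():
    # One cheap high-resolution timestamp per frame; no disk I/O
    # during the run. At 8 bytes per stamp and 100 FPS, 1 MB of
    # buffer covers roughly 21.8 minutes of capture.
    frame_stamps.append(time.perf_counter())

# Simulated run of a few frames
for _ in range(5):
    on_frame_presented()

# Per-frame times are the deltas between consecutive stamps,
# written to disk only after the benchmark finishes.
frame_times = [b - a for a, b in zip(frame_stamps, frame_stamps[1:])]
```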
 
Yeah.. the key is..

Does this guy, who writes fraps, have time to spend on his program, to get per frame resolution working in the majority of benchmarking suites? I mean, it's freeware, and he even gives permission to reverse engineer his program. I don't think asking *him* is the key.

What I see is a confederation of coders working to reverse engineer fraps and make it dump the result per frame. Then sharing the credit.

The SFRFRAPS: Single Frame Resolution FRAPS. But I don't have the coding talent, or the knowledge, to do such a thing.
 
I agree with FishTankX on this one; minimums/graphs of framerates show the instantaneous behavior of the card, instead of the average behavior of the card.

I like to compare it to calculus: you can get the average slope between two points, or you can get the exact slope at a point by taking the derivative. Sure, if you compare two averages, it will be relatively reflective of the actual slopes, but why not just take the exact slopes and compare those directly?

I like how FishTankX explained that you can see the differences between the graphs, and maybe see an obvious flaw. It's harder to see this when using average FPS.

My vote is a yessir
 
So minimum framerate = 1/t where t is the longest time to render any given frame.
That is the true definition of a minimum but it's not necessarily what someone is after when they ask for minimums. If the minimum is just an excessive dip caused by a random fluctuation it's not going to be terribly useful.

Another measurement would be the sustained minimum where you measure the lowest number of frames over the largest possible time. That's when the '1' in 1/t becomes 'X' because it could be any value.
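One way to pin down that sustained minimum is a sliding window over per-frame times. A sketch with made-up numbers, where `window` plays the role of the X in X/t:

```python
def sustained_min_fps(frame_times, window):
    """Lowest average FPS over any `window` consecutive frames."""
    worst = float("inf")
    for i in range(len(frame_times) - window + 1):
        span = sum(frame_times[i:i + window])  # time to render this run of frames
        worst = min(worst, window / span)
    return worst

# 20 smooth frames, a rough patch, 20 more smooth frames (illustrative)
frame_times = [0.01] * 20 + [0.05] * 9 + [0.2] + [0.01] * 20

print(1 / max(frame_times))                # true minimum: one bad frame dominates
print(sustained_min_fps(frame_times, 10))  # sustained minimum over 10 frames
```

A longer window filters out one-frame flukes (the random fluctuations mentioned above) but reacts more slowly to genuine slowdowns; window length is the knob to tune.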
 
Originally posted by: BFG10K
So minimum framerate = 1/t where t is the longest time to render any given frame.
That is the true definition of a minimum but it's not necessarily what someone is after when they ask for minimums. If the minimum is just an excessive dip caused by a random fluctuation it's not going to be terribly useful.

Another measurement would be the sustained minimum where you measure the lowest number of frames over the largest possible time. That's when the '1' in 1/t becomes 'X' because it could be any value.

So a nice graph would give a nice visual of that.
 
I'm a versatile guy. I can handle 15 FPS in first-person shooters and still have a ball.

Methods seem good so count me in.
 