A minimum average framerate petition. Edit: HTML graphs! Anand still might be watching. Keep it going!

Page 4

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
Ok. I want to see reliable min FPS benchmarks.

But it has to show the timeline along with an explanation of why the FPS went down.

Otherwise people will be grasping at straws.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
I think a better measure would be the 10th percentile frame rate rather than the minimum, i.e., the FPS that the game performs better than 90% of the time. That way, you eliminate situations where the game drops to 2 fps for one single frame and then stays at 100 fps the rest of the time. It's pretty common practice to use the 3rd/5th/10th percentile rather than the absolute minimum.

So, if you were playing a timedemo for 40s and the 10th percentile fps was 30, that means there were 4 seconds where it dropped under 30 fps. I don't quite know how chaotic the frame rates are, so perhaps a 5th percentile would be better.
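Shalmanese's percentile idea is straightforward to compute from a per-frame time log (the kind of dump fraps produces). A minimal Python sketch with invented sample data, using the nearest-rank method:

```python
def percentile_fps(frame_times_ms, pct=10):
    """FPS value the run exceeds (100 - pct)% of the time (nearest-rank)."""
    fps = sorted(1000.0 / t for t in frame_times_ms)  # ascending: slowest first
    idx = max(0, len(fps) * pct // 100 - 1)
    return fps[idx]

# Invented run: 99 frames at 10 ms (100 fps) plus one 500 ms hitch (2 fps).
times = [10.0] * 99 + [500.0]
print(percentile_fps(times, 10))       # 100.0 - the single hitch is ignored
print(min(1000.0 / t for t in times))  # 2.0   - the absolute minimum
```

The gap between the two printed numbers is exactly the situation Shalmanese describes: one pathological frame drags the absolute minimum to 2 fps while the 10th percentile stays at 100 fps.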
 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
No offence, FishtankX, but it seems ludicrous to overlay two FPS/time graphs and start pulling definitive conclusions from that about why cards dip in certain places. How exactly can we deduce that a dip in the FPS is caused by poor occlusion culling or bad prefetch algorithms?

And you still haven't countered the main point that we dissenters are arguing:
Minimum framerate is important in real world gaming. Benchmark scores do not relate directly to real world scores. It is therefore impossible to draw any conclusions about the nature of real world performance from benchmarks.

BTW, why is the thread title "A Minimum Framerate Petition" when you're not asking for minimum framerates to be shown any more?
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Okay, Dug, maybe I didn't explain it well enough last time. I'll try and be *really* clear now, because I'm famous for having to explain things twice. >.<

Here we go.

During huge battles, everyone knows that the framerate is gonna drop. That's a given. Thus, you would see a drop in framerate that is *normal*.

If you put the results of two cards into the same graph, like you used to see with the CPU and Linpack results, you can see where one card might be faltering while the other is puttering along at full speed, at the same spot. That would give you some idea of an anomaly. It's relatively easy to put the results of two cards into the same graph. And it would definitely shed light on expectable minimum framerates, any anomalies (like one card crashing where another is still going strong, which would point to either a driver bug, some Hyper-Z implementation helping out, color compression, or whatever), etc. The key is comparing two cards with graphs on top of one another. That would be very useful.

The way I see it, the results by themselves provide very little useful information. Thus a comparison is necessary. A comparison is what will really make these graphs significantly more useful than numbers. Minimum framerate numbers by themselves, while mildly useful, aren't exactly what I'm looking for. I'm looking for a Linpack-style comparison between graphics cards, CPUs, and maybe even drivers and platforms. This is the goal of this petition: not specifically to provide numerical results (although those would naturally be thrown in, though I believe it would be most useful if the highest and lowest numbers were thrown out, IMHO), but rather to provide numerical results that add to what the graphs already tell us. Min, max, average, mean... (all of which would be trivial to derive from the fraps dump) with a graph, for each card. That way we could get some real in-depth analysis, which wouldn't normally be possible.
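The per-card numbers FishTankX mentions really are trivial to derive once you have the frame times. A rough Python sketch (all data here is invented) of the kind of summary that could sit next to each card's graph:

```python
def summarize(frame_times_ms):
    """Min, max, and average instantaneous FPS from per-frame times in ms."""
    fps = [1000.0 / t for t in frame_times_ms]
    return {"min": min(fps), "max": max(fps), "avg": sum(fps) / len(fps)}

# Two hypothetical cards over the same timedemo:
card_a = [12.5] * 90 + [40.0] * 10   # steady 80 fps, dips to 25 fps
card_b = [8.0] * 90 + [100.0] * 10   # 125 fps most of the time, dips to 10 fps
for name, times in [("Card A", card_a), ("Card B", card_b)]:
    s = summarize(times)
    print(f"{name}: min {s['min']:.1f}  max {s['max']:.1f}  avg {s['avg']:.1f}")
```

In this made-up example Card B wins the average (113.5 vs 74.5 fps) while Card A wins the minimum (25 vs 10 fps), which is exactly the kind of disagreement that overlaid graphs would surface and a lone average would hide.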
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
And as for Woodchuck2000's insistence that benchmarks have no real bearing on real world gameplay, I would think that's what demo recorders are for, no?

And I didn't specifically say that you could pull a conclusion from a drop in framerate. But knowing about anomalies is the key to getting them solved. Further investigation into the issue would narrow things down: if, say, turning down the texture detail eliminated the anomaly, that would automatically point to thrashing on one card. Or if turning down the geometry detail vastly boosted its scores, that would indicate a poor T&L implementation, etc.

Example: in the future, the R300 beats the NV30 in AA benchmarks when you take averages. But the NV30 seems to have a much higher framerate in intense scenes (which would probably be labeled on the graph). This might point to the NV30 having a more effective color compression engine, especially if turning down the texture detail evens it up. It might also point to the R300 texture thrashing. While this isn't anything concrete, it's something to consider, it's there, and it has a relevant impact on your purchase decision. Even if the reasons for the anomaly are not known, it's useful to know, because if you use FSAA a lot, that means something to you.

While I doubt that Anand would take the time to do graphs for every game, minimum framerates (in percentiles, to keep from throwing off results, like Shalmanese said) would be a good addition to average framerates, to see which card pulls the bottom line better. And graphs would help in the analysis of strange card behavior and detect anomalies. A lot of the reason people say that Nvidia used to give a better gaming experience is that ATi's cards used to have erratic framerates due to a bug. That wasn't detected in your run-of-the-mill benchmarks.

Anyways, I'm not gonna argue the point anymore, because if Anand chooses to include them for their worthiness, I'll be happy. If he decides they're worthless, I'll respect his decision. But I think it's valuable to create a petition so that people can voice their ideas about how useful such a thing would be.

P.S. In reply to your question about what peak minimum FPS in a benchmark tells you: it tells you how low a certain card will drop in heavy action. And if the length of the drop is significant, but one card has a higher minimum framerate (possibly due to better Hyper-Z in overdraw-laden areas) and one has a higher maximum framerate (possibly due to geometry engine strengths, or other things that could cause a higher max) and their averages are close, it's easier to pick out which one you would rather have when the action gets rough. I know I would rather have the card with the lower average but the much higher performance in intense situations. Wouldn't you? Right now, we have no way of knowing which one is which.

And in reply to the title: while graphs are useful, they're more time consuming, whereas minimum framerates are easy. It'd be much easier to implement minimum framerates, and I don't know if it's worth it to Anand to implement graphs. But if we can get minimum framerates in every benchmark, and one or two graphs, it'd be worth it to me. And essentially, minimum framerate is the most important part of the graphs. So the goal of this petition hasn't changed too much.
 

TMS

Member
Oct 12, 1999
67
0
0
Count me in for the graphs.

I for one would rather have a card that does steady 60 fps than a card that peaks high and low all the time...
 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
P.S. In reply to your question about what peak minimum FPS in a benchmark tells you: it tells you how low a certain card will drop in heavy action.
You are yet to define minimum frame-rate. If calculated using the lowest decile as Shalmanese suggests, it will have some bearing on the consistency of the card in a specific benchmark.
And if the length of the drop is significant, but one card has a higher minimum framerate (possibly due to better Hyper-Z in overdraw-laden areas)
You're drawing a buzzword-laden conclusion from a hypothetical situation within a hypothetical situation. Without masses of tests eliminating certain factors, there's no way to tell if a 2 second drop in frame rate is due to a system anomaly, a flaw in the game engine, driver issues, or indeed a problem with the card.

and one has a higher maximum framerate (possibly due to geometry engine strengths, or other things that could possibly cause something to have a higher max) and their averages are close, it's easier to pick out which one you would rather have when the action gets rough. I know I would rather have the card that had the lower average but the much higher performance in intense situations. Wouldn't you? Right now, we have no way of knowing which one is which.
(emphasis added)
Or other things indeed.

And in reply to the title: while graphs are useful, they're more time consuming, whereas minimum framerates are easy. It'd be much easier to implement minimum framerates, and I don't know if it's worth it to Anand to implement graphs. But if we can get minimum framerates in every benchmark, and one or two graphs, it'd be worth it to me. And essentially, minimum framerate is the most important part of the graphs. So the goal of this petition hasn't changed too much.
If there's an easy way to get the standard deviation of a set of fraps results, I'd like to see that included.
I still don't feel that my arguments have been answered, so until they are I'll probably give this thread, interesting as it is, a break. I'll be interested to see if The Boss (TM) makes any changes based on this thread and, if so, what he chooses to include.
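For what it's worth, the standard deviation Woodchuck2000 asks for is only a few lines over the same frame-time dump. A sketch with invented data, using the population standard deviation:

```python
import math

def fps_std_dev(frame_times_ms):
    """Population standard deviation of instantaneous FPS from per-frame times (ms)."""
    fps = [1000.0 / t for t in frame_times_ms]
    mean = sum(fps) / len(fps)
    return math.sqrt(sum((f - mean) ** 2 for f in fps) / len(fps))

# Two invented runs with the same 62.5 fps average but very different spreads:
steady = [16.0] * 100                # constant 62.5 fps
erratic = [40.0] * 50 + [10.0] * 50  # swings between 25 and 100 fps
print(fps_std_dev(steady))   # 0.0
print(fps_std_dev(erratic))  # 37.5
```

Two cards with identical averages but standard deviations of 0 and 37.5 fps would feel completely different to play on, which is the whole argument for publishing the number.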
 

fkloster

Diamond Member
Dec 16, 1999
4,171
0
0
Originally posted by: Dug
Just as a general point though;
As I see it, the point of a graphics card review is to show the potential of a card. Tests like 3DMark, Flybys and Timedemos (run on interactive engines such as Q3A and UT2K3) show what a card is capable of in a well-known, repeatable environment. The tests are repeated then averaged (mean) to smooth over any hiccups.

You all seem to be interested in minimum frame rates because when you're playing a game, the frame rate is most noticeable when it dips. The problem is that everyday gaming is not a repeatable or consistent environment.

Even if minimum frame rates were given in reviews, what would that tell you? Since the conditions under which the benchmarks are being run are non-interactive, they would have little bearing on the minimum frame rates you could expect in a game.

A review is designed to give a repeatable indication of the potential of a system/component. Once the component(s) in question are set loose in the real world a whole host of other factors influence things like the FPS (min and max.) Including minimum frame rates in a review would tell us nothing about expected real-world performance and nothing about the potential of the component(s) in question. They aren't included in reviews because they are more work for the tester and reveal little, if anything, about a system's performance.


Couldn't have said it better myself.

Because you can't duplicate real game play in a benchmark.

I agree
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
But what is a minimum framerate? Is it simply 1/t, where t is the longest time taken to render any single frame (i.e. the peak minimum), or if not, what?
Minimum framerate is x/t where x (framerate) is as low as possible and t (time) is as high as possible.

As long as a game feels smooth however hectic the action, I'm not too fussed.
Exactly, and the hectic action is when you're most likely to hit your minimums, which is why both minimums and averages are important during benchmark runs.

Averages show how well your rig does overall in a given situation.
Minimums show how poorly your rig does when things get tough, and whether it's powerful enough to stay above the required threshold of smoothness.

Why does average FPS in a benchmark translate into a real world scenario?
Because it gauges overall performance over a specific situation in actual gameplay. While I agree that benchmarks are only a small window into actual gameplay (case in point: the effects of 64 MB VRAM vs 128 MB VRAM - benchmarks don't show the whole story), it is possible to get really good ones by running very long tests with a wide range of diverse situations and/or by running a wide range of different benchmarks. Serious Sam in particular does this very well because it has six benchmarks to try, and all of them are quite comprehensive.

Also, when testing it's best to aim for the worst case scenarios and deliberately go after the situations that will slow your system down. That way you never delude yourself into thinking your system is running a game better than it really is. A 60 FPS average on an easy map will absolutely blow on a tough map, which is yet another reason why the "you only need X FPS" arguments are flawed.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
As BFG10K said, recent ATi drivers had problems with erratic framerates because of texture upload anomalies.
Those problems existed a long time ago (perhaps even in pre-Catalyst days) and have been fixed for quite some time.

Note to BFG10K: It would be nice if you could also present some graphs of the Radeon 9700 Pro's performance in such games.
I'll try if I have time but I can't promise anything. See the PM.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
I dunno how relevant min FPS measurements are. If I'm pulling 60fps except for a 10 second drop to 30fps once in a while, I think the 60fps number is far more relevant than the 30.

Besides, most of the time the minimum framerate is due to the CPU, not the video card, and even the fastest card in the world can't keep the minimum fps from dropping when paired with a slow CPU.
 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
Minimum framerate is x/t where x (framerate) is as low as possible and t (time) is as high as possible.
From a purely mathematical point of view, x/t, where x is FPS (units frames*s^-1) and t is time (units s), would have units frames*s^-2. Given that framerate already includes a time component (being frames*s^-1), how is dividing FPS by t (without specifying which time, either) going to produce a meaningful result?

I'm yet to receive a decent explanation of what minimum FPS represents, unless it is simply the lowest (peak) value that the FPS reaches, i.e. 1/t (t being the longest time taken to render any single frame.)

PS. This isn't meant to be insulting/inflammatory, I simply don't understand the definition you've given.
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Alright. I'd like to present an example of a useful metric of minimum framerate.

I'd like to present minimum framerate as the average of the lowest 10% (framerate-wise) of the sequence.

Thus, you'd take the lowest 1/10th of the total (the 10% margin) and average it. There you go: a minimum framerate average. While this is still an average, yes, it shows the worst case situation in a given benchmark. The slowest frame [to draw] by itself might be deceptive, so you must have some sort of an average.

This would be a pretty useful metric, given that it's much more important to know how a card will perform in the most hectic of situations than to know the average of the whole run. Honestly, if you can push that number over 50 in the most intense situations, you'll be alright in the rest of the game, because I believe the most intense situations in benchmarks are far worse than what you would encounter in any form of real gameplay.
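FishTankX's metric (average of the slowest 10% of frames) could look something like this in Python; the sample numbers are invented:

```python
def low_decile_avg_fps(frame_times_ms, pct=10):
    """Average instantaneous FPS over the slowest pct% of frames."""
    fps = sorted(1000.0 / t for t in frame_times_ms)  # ascending: slowest first
    n = max(1, len(fps) * pct // 100)
    return sum(fps[:n]) / n

# Invented run: 90 frames at 100 fps and 10 frames at 20 fps.
times = [10.0] * 90 + [50.0] * 10
print(low_decile_avg_fps(times))                    # 20.0 - the worst-case stretch
print(sum(1000.0 / t for t in times) / len(times))  # 92.0 - the overall average
```

Because it averages over a whole decile rather than taking the single slowest frame, one freak hitch can't dominate the number, which addresses the objection Shalmanese raised earlier.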

Do you have any objections, Woodchuck? Hopefully this metric, if included in our benchmarks, would give us some idea of how a card would hold up during the worst of firefights. And frankly, that, along with the average of the whole sequence and your normal statistical mean, mode, etc., would be more than enough for me to judge a card's performance merits.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Here's the thing in a pretty simple way:
-You know average framerates.
-You know they aren't what you'll get.
-On the other hand, you do know how what you'll get compares to what you have now.

Min/Max FPS gives the same info, just more of it. I don't know about most of you, but I'd rather get a range of 40-80 FPS than 10-110, even though they both average to 60.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
How is dividing FPS by T (without specifying which time either) going to produce a meaningful result?
You're not dividing FPS by T, you're dividing the amount of frames rendered by T. After the division the unit of measure turns into frames per second.

Looking back on my post "framerate" was probably not the best word to use in this situation.
 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
You're not dividing FPS by T, you're dividing the amount of frames rendered by T. After the division the unit of measure turns into frames per second.

Looking back on my post "framerate" was probably not the best word to use in this situation.
It was the use of the word framerate that had me confused...
So it should have read:
Minimum framerate is x/t where x (frames rendered) is as low as possible and t (time) is as high as possible.

Now the lowest possible x is 1
So minimum framerate = 1/t where t is the longest time to render any given frame.

:D.
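Woodchuck2000's reduction (the peak minimum is just 1/t for the single longest frame time) is easy to check against a frame-time log; a tiny sketch with invented numbers:

```python
def peak_min_fps(frame_times_ms):
    """The 'peak minimum': 1/t as FPS, where t is the longest single frame time."""
    return 1000.0 / max(frame_times_ms)

# Invented run: 99 frames at 10 ms (100 fps) plus a single 200 ms hitch.
times = [10.0] * 99 + [200.0]
print(peak_min_fps(times))  # 5.0 - set entirely by one frame
```

Which is exactly why a single frame can dominate this definition, and why the percentile variants discussed earlier in the thread are arguably the more robust way to report a "minimum".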
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
Woodchuck, no one is suggesting that the average FPS be removed from reviews. How could it possibly hurt things to add this in conjunction with the average framerate? I say that a standard deviation should also be part of the results. This would really give a lot of meaning to the average.
 

arcenite

Lifer
Dec 9, 2001
10,660
7
81
I think anything that helps people get a better idea of what the review is about, and of the card's performance, is a great idea, no matter how big or small. Count me in :)