A minimum average framerate petition Edit: HTML graphs!! Anand still might be watching. Keep it going!


Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
It doesn't matter. This is because a lot of reviews are comparing the card in question to the competition. The same situation must be repeated if a good comparison is to be drawn between the two. Also, reviewers have the ability to make demos in which the action is more intense than almost anyone will ever come across, which means they can show a pretty good estimate of what the minimum FPS will be. True, there is no exact science to tell people what performance they will get, but when is there? The systems used in reviews are very high-end. Most people do not own systems of that caliber, so their performance will be totally different right from the start. The point of reviews is to compare various pieces of hardware to other pieces of hardware available on the market. This can still be done with minimum FPS taken into account.
Ok, let me tackle this from a different direction: What does the (peak) minimum FPS in a benchmark tell you?
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
It tells you how much lower you can expect one card to drop compared to another. It can show someone how low they can expect their FPS to drop, since frame drops are usually not the result of a CPU bottleneck. Not to say that never happens, but nonetheless, it will be easier to determine through benchmarks whether that is the cause or not. If the minimum FPS on two different cards is near identical, then the CPU is probably the reason for it. If not, then the video card should be what causes the minimum FPS to drop. It tells you the worst-case scenario for FPS that you can "hope" for.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Put my name down too. While I test everything like this thoroughly myself anyway, it'd be nice to have Anandtech doing the same because it'll make the reviews more interesting to read.

Aces and Tech-Report do this as well, and I really like their reviews.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
In addition it'll help to squash the "you only need X FPS" statements from people who don't understand the concept of an average framerate and what it really means.
 

ed21x

Diamond Member
Oct 12, 2001
5,411
8
81
The fact that you guys aren't suggesting this, but rather demanding it, would piss me off if I were Anand.
 

CrazySaint

Platinum Member
May 3, 2002
2,441
0
0
Demand? Who's demanding anything? It's not like we're threatening to stop reading his video card reviews if he doesn't do this or something. This is just a bunch of people requesting a very beneficial addition to the graphics reviews.
 

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
While I test everything like this thoroughly myself anyway,

Curious.. how do you get minimum fps in most benchmarks that translates into a reliable indicator of real game play?

 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
Curious.. how do you get minimum fps in most benchmarks that translates into a reliable indicator of real game play?
My point exactly. I simply don't believe that minimum FPS in benchmarks will bear any relation to minimum FPS in games. I firmly believe that benchmarks are only indicators of the relative potential of two or more components. Minimum FPS would simply be a less meaningful indication of relative performance than average FPS.

In fact, that's got me thinking: why do the (peak) min and max have any significance at all?
 

Darien

Platinum Member
Feb 27, 2002
2,817
1
0
Actually... if they ran a standardized benchmark...

FPS vs time graphs would be nice. Then from there the average can be calculated, as well as min and max, not to mention standard deviation...
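A quick sketch of that post-processing (Python just for illustration; the per-second FPS samples are made up):

```python
import statistics

def summarize(fps_samples):
    """Reduce a per-second FPS trace from one benchmark run to summary stats."""
    return {
        "avg": statistics.mean(fps_samples),
        "min": min(fps_samples),
        "max": max(fps_samples),
        "stdev": statistics.stdev(fps_samples),
    }

# Hypothetical trace: steady ~80 FPS with one short dip during heavy action.
trace = [82, 80, 79, 81, 34, 30, 78, 83, 80, 81]
print(summarize(trace))  # avg 70.8, min 30, max 83 -- the avg alone hides the dip
```

The standard deviation is the interesting extra here: two cards with the same average can have very different spreads.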
 

Trevelyan

Diamond Member
Dec 10, 2000
4,077
0
71
Originally posted by: Darien
Actually... if they ran a standardized benchmark... FPS vs time graphs would be nice. Then from there the average can be calculated, as well as min and max, not to mention standard deviation...

I agree... FPS vs time graphs would rule. That way I could see at what resolution the FPS would always stay above 60...
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Curious.. how do you get minimum fps in most benchmarks that translates into a reliable indicator of real game play?
Well for one thing a lot of game engines now show a minimum framerate (Unreal, UT, Serious Sam, etc.).

If they don't they usually have a framerate display and I use that to test the areas that I know slow down a lot. After that I'll try a wide range of actual gaming situations and keep an eye on the framerate counter while I'm playing.

I especially keep the framerate counter going when I buy new games because it helps me to better understand the dynamics of the engine and to gauge overall performance.
 

Dudd

Platinum Member
Aug 3, 2001
2,865
0
0
Originally posted by: Dug
While I test everything like this thoroughly myself anyway,

Curious.. how do you get minimum fps in most benchmarks that translates into a reliable indicator of real game play?

Why does average FPS in a benchmark translate into a real-world scenario? Seriously, this can go on all day long. Besides, just because the info is there does not mean that you or anyone has to pay attention to it.

 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
Well for one thing a lot of game engines now show a minimum framerate (Unreal, UT, Serious Sam, etc.).
But what is a minimum framerate? Is it simply 1/t, where t is the longest time taken to render any single frame (i.e. the peak minimum), or if not, what? As long as a game feels smooth however hectic the action, I'm not too fussed. If it starts to feel jerky, I'll turn the detail down.
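Taking the first reading of that question (peak minimum = the reciprocal of the single longest frame time), a minimal sketch with made-up frame times:

```python
# Hypothetical frame times from one run, in milliseconds.
frame_times_ms = [12.5, 13.1, 12.8, 50.0, 13.0]

# "Peak minimum FPS" under this definition: 1/t for the longest frame.
worst_ms = max(frame_times_ms)
peak_min_fps = 1000.0 / worst_ms
print(peak_min_fps)  # 20.0 -- one 50 ms frame reads as a momentary 20 FPS
```

Which is exactly why a single-frame spike can make a reported minimum look far worse than the run ever felt.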

Why does average FPS in a benchmark translate into a real-world scenario? Seriously, this can go on all day long. Besides, just because the info is there does not mean that you or anyone has to pay attention to it.
It doesn't translate into a real-world scenario! That's my entire point! The average FPS in a benchmark is simply the best/easiest way of measuring the relative performance of two setups.

There's absolutely no point in trying to equate benchmark scores to real-world performance, whether you take the min, max, standard deviation, variance, mean, mode, median or any combination of the above. That's not what benchmarks are designed to do. I have no objection to the inclusion of extra data, but if poor ol' Anand is gonna have to do extra work to get them then he shouldn't waste his time.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Sign me up. Hopefully this will squash all jokers who think they can run something in 2048x32 with 8x AA and 32-tap aniso because some review said the average fps was 60.


Chiz
 

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
Why does average FPS in a benchmark translate into a real-world scenario?
I never said it did. Neither does minimum or maximum.

What BFG10K said about watching a frame rate counter while playing a game makes more sense than seeing what a benchmark will produce. Because the benchmark will not tell you for how long or why.

This is especially true in UT2K3. Where the benchmark and actual gameplay are completely different.

I guarantee you if BFG10K was watching the FPS during a game and for one second he saw his frame rates drop to 20, but the rest of the time it was at 80, he probably wouldn't think too much about it: #1 because it would be near impossible to duplicate, #2 because it didn't really affect gameplay.

If it dropped to 20fps for a significant amount of time then he would change resolutions or settings to compensate.
BUT AVG FPS WILL TELL YOU THIS. If you really ran at 20fps for a significant amount of time, you wouldn't get 80fps average.


The benchmark is good for telling you the performance difference between two machines. That's what avg FPS does. Min or max FPS doesn't.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I'm in. It doesn't need to be as complicated as some here make it out to be, but some sort of mention of a minimum frame rate, when it's possible, would be nice and helpful.
 

CrazySaint

Platinum Member
May 3, 2002
2,441
0
0
Originally posted by: Dug
Why does average FPS in a benchmark translate into a real-world scenario?
I never said it did. Neither does minimum or maximum.

What BFG10K said about watching a frame rate counter while playing a game makes more sense than seeing what a benchmark will produce. Because the benchmark will not tell you for how long or why.

This is especially true in UT2K3. Where the benchmark and actual gameplay are completely different.

I guarantee you if BFG10K was watching the FPS during a game and for one second he saw his frame rates drop to 20, but the rest of the time it was at 80, he probably wouldn't think too much about it: #1 because it would be near impossible to duplicate, #2 because it didn't really affect gameplay.

If it dropped to 20fps for a significant amount of time then he would change resolutions or settings to compensate.
BUT AVG FPS WILL TELL YOU THIS. If you really ran at 20fps for a significant amount of time, you wouldn't get 80fps average.


The benchmark is good for telling you the performance difference between two machines. That's what avg FPS does. Min or max FPS doesn't.

If you get 120 FPS most of the time but get 5 seconds of <20 FPS, you may well still get a 60-80 average FPS, and it absolutely will affect gameplay in some games; it will also make the game less enjoyable. I like the way Ace's Hardware shows a graph of FPS over the run of the benchmark. This way you can see the high and low FPS and how long they stay high and low.
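To put rough, made-up numbers on that, here's a back-of-the-envelope frame-weighted average; the exact figure depends on the run, but the point is how little a short, severe stutter moves it:

```python
# Hypothetical run: 55 s at 120 FPS plus a 5 s stutter at 20 FPS.
frames = 55 * 120 + 5 * 20   # total frames rendered
seconds = 55 + 5
avg_fps = frames / seconds
print(round(avg_fps, 1))  # 111.7 -- the average barely registers the stutter
```

A number that high tells you nothing about the five seconds a player actually notices, which is the whole argument for graphing the run.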
 

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
If you get 120 FPS most of the time but get 5 seconds of <20 FPS, you may well still get a 60-80 average FPS, and it absolutely will affect gameplay in some games; it will also make the game less enjoyable
Like I said- if he was running at 80FPS and it dipped down to 20FPS for a significant amount of time he wouldn't get an AVG 80FPS.
I never mentioned 120FPS.

There seem to be too many variables for this whole thing to work.

What would you do once you found out the min FPS?
Is it the processor, video card, drivers, motherboard, RAM, the program being benchmarked, or the benchmarking program itself?

I appreciate the work that FishTankX has put into making the graphs. But all they really tell me is an avg FPS.
Sure, you can see highs and lows, but why?

FishTankX has given me an example of the Radeon 8500 drivers performing poorly in Serious Sam even though when it was benchmarked it still showed about 60FPS. In this instance I can see why a minimum fps could be helpful, but I still see it as a flaw in the benchmarking software.

The real reason you guys want a Min FPS is to see if there are any glaring irregularities with the hardware out there. Don't blame ya, but I think someone would have brought it up with regular gameplay.

 

Darien

Platinum Member
Feb 27, 2002
2,817
1
0
Originally posted by: Darien
Actually... if they ran a standardized benchmark...

FPS vs time graphs would be nice. Then from there the average can be calculated as well as min and max. not to mention standard deviation...



Looking at the thread title, looks like someone listened to me...:)

 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
You know Dug, the fact of the matter is that the max/min framerates can vary wildly from card to card.

Mostly it has to do with implementations of occlusion culling and memory-bandwidth-saving techniques. This benchmark would test the effectiveness of such methods, which is what I mentioned in the *first* iteration of this thread. I wanted to know how well Hyper-Z III and LMA 3 helped the bottom lines of the framerates. It's not necessarily an issue of minimum framerates as a number, because as you can see in my graphs, some quirk made some of the benchmarks get minimum framerates of 9 or so.

It's not the minimum framerate, or any other statistical *number*, that counts. But if you overlapped two graphs onto each other, it *would* count for *a lot*, because you can see the behavior of the two graphics cards, side by side, second by second, throughout the benchmark. And for me, a picture means a thousand numbers (words, for analysis, too). Wouldn't it be interesting if you could see anomalies or quirks where a certain graphics card bottomed out while another kept on going at full steam?

From those results, you could discern things like memory thrashing (which would help you discern the usefulness of color compression on memory usage) or texture upload problems (a la Radeon 8500), because obviously the different AGP 2X/4X/8X lines might show drastically different results, which would let you know that some sort of thrashing was going on.

A picture doesn't necessarily allow better understanding of the relative performance of a graphics card. Rather, it allows deep analysis of why it performs that way, and of which would be the better card. I would say the Kyro got much better, more stable framerates than its immediate-mode-renderer brothers, simply because it didn't have wildly fluctuating framerates due to overdraw. When the overdraw changes, with immediate-mode renderers the framerate changes too, but tilers stay the same. Graphs would have shown that, and possibly gotten more people to buy the Kyro series of cards. Average framerate numbers told the story very poorly.

That's why I put so much emphasis on *graphs*. They would allow interesting analysis and conclusions that would otherwise be impossible, because it is impossible to determine framerate behavior from an average framerate number. I know I sure as hell would not buy a card that regularly dropped down to 30 FPS in some benchmarks, like the Radeon 8500LE of mine that regularly dropped down to sub-40 in Serious Sam 2 before they got the problems fixed.

Also, graphs would allow comparisons of *driver* revisions. This would perhaps show very interesting variations between driver revisions (like fixed problems) that would otherwise be reflected as very small differences if done by numbers. This, Dug, is the true advantage of graphs. I don't see how you could possibly say that they wouldn't be useful. They'd be uber useful for driver comparisons, and even more useful for comparing the usefulness of AGP 2X/4X/8X on different chipsets (determined mainly by memory bandwidth) and how they might affect gameplay in ways that would otherwise show up as very small differences if put into numbers.

An example: if you take an AGP 2X card and pit it against an AGP 8X card in a thrashing situation, the AGP 8X card would get a 4X better framerate. Say one was getting 15, the other would get 60. In other words, the AGP 2X card would hiccup very badly, while the AGP 8X card would just keep on going. This wouldn't be reflected in the numbers if it only lasted 1 or 2 seconds (since it's merely uploading new textures), but such things can be very disturbing in real gameplay (which is why I noticed a huge difference in minimum framerates moving from SDR to DDR on an AGP 4X card on an Athlon system), yet would rarely show up in the numbers.

Also, it would show *huge* differences between the Parhelia and the Radeon 8500LE, because one has occlusion culling and one doesn't. This wouldn't be reflected in the numbers, because the spots where overdraw jumps would be short (say some guy sneaks up behind you in a camera quirk and suddenly nothing except the guy's back is visible) and have very little impact on the numbers themselves. In graphs, they would be visible as massive drops in framerate. In such situations, it's possible that even a Kyro 2 might beat a GeForce2 Pro, because of the fact that it's a tiler. These kinds of things I would like to see. That's why I would like to see *graphs* to go *along* with *more* numbers. It would allow much more detailed analysis of the situation. Get my point, Dug?
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
By the way, the highs and lows in ROMP are battles and lulls, while in flyby I've seen that it tends to happen when a lot of textures crash onto the scene at once. For botmatch, there aren't any really incredibly fast variances, just scenes with few people and scenes with many.

And these graphs took a total of 15 minutes to make! Just benchmark, get the framerate dumps, dump the data into Excel, graph, and you're done. Simple! That's why I think it would be such a great addition. Even one or two graphed benchmarks would be enough to see the benefits and quirks of various cards under different conditions.
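That dump-to-graph step can also be sketched in code; the per-frame time dump below is hypothetical, since each game's dump format differs:

```python
def fps_per_second(frame_times_ms):
    """Bucket per-frame render times into wall-clock seconds; each bucket's
    frame count is that second's FPS, ready to graph as FPS vs time."""
    buckets = {}
    start = 0.0
    for ft in frame_times_ms:
        sec = int(start // 1000)          # bucket each frame by its start time
        buckets[sec] = buckets.get(sec, 0) + 1
        start += ft
    return [buckets[s] for s in sorted(buckets)]

# Two seconds at 100 FPS (10 ms frames), then one second at 25 FPS (40 ms frames).
dump = [10.0] * 200 + [40.0] * 25
print(fps_per_second(dump))  # [100, 100, 25]
```

The resulting series is exactly the column you'd paste into a spreadsheet to draw the FPS vs time graph.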