You have yet to define minimum frame rate. If it's calculated using the lowest decile as shalmanese suggests, it will have some bearing on the consistency of the card in a specific benchmark.

P.S. In reply to your question about what peak minimum FPS in a benchmark tells you: it tells you how low a certain card will drop in heavy action.
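For concreteness, the lowest-decile idea could be computed roughly like this. This is only a Python sketch, assuming a hypothetical log of cumulative per-frame timestamps in milliseconds; lowest_decile_fps is an illustrative name, not any tool's actual API:

```python
# Illustrative sketch only: assumes a log of cumulative per-frame timestamps
# in milliseconds (one entry per rendered frame), not any specific tool's format.
def lowest_decile_fps(timestamps_ms):
    # Convert cumulative timestamps into individual frame durations (ms).
    durations = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    # The slowest 10% of frames are those with the longest durations.
    worst = sorted(durations, reverse=True)[:max(1, len(durations) // 10)]
    avg_ms = sum(worst) / len(worst)
    return 1000.0 / avg_ms  # FPS equivalent of the slowest decile of frames

# With mostly 16.7 ms frames and one 50 ms spike, the result is pulled
# toward the spike: lowest_decile_fps([0.0, 16.7, 33.4, 83.4, 100.1]) -> 20.0
```

A number like that says more about consistency than a plain average does, because one long hitch dominates it.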
You're drawing a buzzword-laden conclusion from a hypothetical situation within a hypothetical situation. Without masses of tests eliminating certain factors, there's no way to tell if a two-second drop in frame rate is due to a system anomaly, a flaw in the game engine, driver issues, or indeed a problem with the card.

And if the length of the drop is significant, but one card has a higher minimum framerate (possibly due to better Hyper-Z in overdraw-laden areas) and one has a higher maximum framerate (possibly due to geometry engine strengths, or other things that could possibly cause something to have a higher max) and their averages are close, it's easier to pick out which one you would rather have when the action gets rough. I know I would rather have the card that had the lower average but the much higher performance in intense situations. Wouldn't you? Right now, we have no way of knowing which one is which.
(emphasis added)
Or other things indeed.
If there's an easy way to get the standard deviation of a set of FRAPS results, I'd like to see that included.

And in reply to the thread's title: while graphs are useful, they're more time-consuming, whereas minimum framerates are easy. It'd be much easier to implement minimum framerates, and I don't know if it's worth it to Anand to implement graphs. But if we can get minimum framerates in every benchmark, and one or two graphs, it'd be worth it to me. Essentially, minimum framerate is the most important part of the graphs, so the goal of this petition hasn't changed too much.
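On the standard-deviation idea above: a minimal sketch of what that would involve, assuming the per-second FPS samples from a FRAPS log have already been read into a plain list (all numbers below are made up):

```python
import math

def fps_stddev(samples):
    # Population standard deviation of per-second FPS samples.
    mean = sum(samples) / len(samples)
    variance = sum((s - mean) ** 2 for s in samples) / len(samples)
    return math.sqrt(variance)

# Two made-up runs with the same 60 FPS average but different consistency:
print(fps_stddev([60, 60, 61, 59, 60]))   # ~0.6  -- steady card
print(fps_stddev([100, 20, 95, 25, 60]))  # ~33.6 -- erratic card
```

A small number means the card holds close to its average; a big one means the average is hiding large swings.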
I still don't feel that my arguments have been answered, so until they are I'll probably give this thread, interesting as it is, a break. I'll be interested to see if The Boss (TM) makes any changes based on this thread and, if so, what he chooses to include.
Originally posted by: Dug
Just as a general point, though:
As I see it, the point of a graphics card review is to show the potential of a card. Tests like 3DMark, Flybys and Timedemos (run on interactive engines such as Q3A and UT2K3) show what a card is capable of in a well-known, repeatable environment. The tests are repeated then averaged (mean) to smooth over any hiccups.
You all seem to be interested in minimum frame rates because, when you're playing a game, the frame rate is most noticeable when it dips. The problem is that everyday gaming is not a repeatable or consistent environment.
Even if minimum frame rates were given in reviews, what would that tell you? Since the conditions under which the benchmarks are being run are non-interactive, they would have little bearing on the minimum frame rates you could expect in a game.
A review is designed to give a repeatable indication of the potential of a system/component. Once the component(s) in question are set loose in the real world, a whole host of other factors influence things like the FPS (min and max). Including minimum frame rates in a review would tell us nothing about expected real-world performance and nothing about the potential of the component(s) in question. They aren't included in reviews because they are more work for the tester and reveal little, if anything, about a system's performance.
Couldn't have said it better myself.
Because you can't duplicate real gameplay in a benchmark.
Minimum framerate is x/t where x (framerate) is as low as possible and t (time) is as high as possible.

But what is a minimum framerate? Is it simply 1/t, where t is the longest time taken to render any single frame (i.e. the peak minimum), or if not, what?
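Just to make the two readings concrete, here's a sketch with made-up per-frame render times; neither is claimed to be what any review site actually measures:

```python
# Hypothetical per-frame render times (milliseconds) from one benchmark run.
frame_ms = [16.7, 16.7, 50.0, 16.7, 33.3, 16.7]

# Reading (a), the "peak minimum": 1/t for the single slowest frame.
peak_min_fps = 1000.0 / max(frame_ms)      # 1000/50 = 20 FPS

# Reading (b), a per-second minimum: the lowest of the once-a-second FPS
# samples a tool like FRAPS logs (frames rendered that second / 1 s).
fraps_samples = [62, 58, 24, 61]           # made-up per-second log
per_second_min_fps = min(fraps_samples)    # 24 FPS
```

The two can differ a lot: a single 50 ms hitch drags reading (a) down to 20 FPS even in a second that otherwise averaged far higher.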
Exactly, and the hectic action is when you're most likely to get your minimums, which is why both minimums and averages are important during benchmark runs.

As long as a game feels smooth, however hectic the action, I'm not too fussed.
Because it gauges overall performance over a specific situation in actual gameplay. While I agree that benchmarks are only a small window into actual gameplay (case in point: the effects of 64 MB versus 128 MB of VRAM, where benchmarks don't show the whole story), it is possible to get really good ones by running very long tests with a wide range of diverse situations and/or by running a wide range of different benchmarks. Serious Sam in particular does this very well because it has six benchmarks to try, and all of them are quite comprehensive.

Why does average FPS in a benchmark translate into a real-world scenario?
Those problems existed a long time ago (perhaps even in pre-Catalyst days) and have been fixed for quite some time.

As BFG10K said, recent ATi drivers had problems with erratic framerate because of texture upload anomalies.
I'll try if I have time, but I can't promise anything. See the PM.

Note to BFG10K: It would be nice if you could also present some graphs of the Radeon 9700 Pro's performance in such games.
From a purely mathematical point of view, x/t where x is FPS (units frames·s^-1) and t is time (units s) would have units frames·s^-2. Given that framerate already includes a time component (being frames·s^-1), how is dividing FPS by t (without specifying which time, either) going to produce a meaningful result?

Minimum framerate is x/t where x (framerate) is as low as possible and t (time) is as high as possible.
You're not dividing FPS by t; you're dividing the number of frames rendered by t. After the division, the unit of measure turns into frames per second.

How is dividing FPS by t (without specifying which time, either) going to produce a meaningful result?
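To put made-up numbers on it: if 30 frames are rendered during the slowest two seconds of a run, the minimum framerate over that stretch is 30 frames / 2 s = 15 frames per second. Dividing that 15 FPS by time again is what would produce the meaningless frames·s^-2 figure.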
It was the use of the word framerate that had me confused...

You're not dividing FPS by t; you're dividing the number of frames rendered by t. After the division, the unit of measure turns into frames per second.
Looking back on my post, "framerate" was probably not the best word to use in this situation.