But BFG, in real gameplay, it's these tiny drops in constant frame rate that create jerkiness and ruin the gaming experience completely.
The only thing you can infer from a minimum is that there's one such drop somewhere in the entire benchmark. It doesn't tell you how long the drop lasts, or whether there are more such drops.
The Borderlands example is very likely an outlier for a game that hasn't been driver-optimized for the new GTX 4xx series.
I think you have the mistaken impression that, aside from outlier games, a minimum is totally accurate and repeatable. That isn't the case at all.
Borderlands is only one of many examples that show this. In fact, some games (e.g. UT2003) can never show an accurate minimum because they start measuring while the level data is still loading.
In each of these cases, where the average FPS seems sufficient, you can see that minimum frame rates hamper playability.
You have no idea whether that minimum hampers playability or not. You don't know how long it lasts, or how many times it happens. Only a frame rate graph over time will give you that information.
If, out of 10,000 rendered frames, 9,999 are above 60 FPS but one is at 10 FPS, is that minimum significant?
And if another card gets 20 FPS at the same spot but 40 FPS for the other 9,999 frames, I guess you'd pick the second card because it has a higher minimum?
I sure wouldn't, because for the other 9,999 frames the first card is a lot better, and the minimum is insignificant given it occurs in only 1 of the 10,000 frames in the benchmark run.
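To make that concrete, here's a toy sketch in Python of the hypothetical scenario above. The numbers are purely illustrative, not real benchmark data; it just shows how the two cards compare on average and how little of the run that single dip actually occupies:

# Toy sketch of the hypothetical example above: illustrative numbers only,
# not real benchmark data.

def summarize(fps_samples, label):
    """Print average FPS, minimum FPS, and how much of the run the slowest frame occupies."""
    frame_times = [1.0 / fps for fps in fps_samples]          # seconds per frame
    total_time = sum(frame_times)
    avg_fps = len(fps_samples) / total_time                    # time-weighted average
    min_fps = min(fps_samples)
    worst_frame_share = max(frame_times) / total_time * 100    # % of run spent on slowest frame
    print(f"{label}: avg {avg_fps:.1f} FPS, min {min_fps} FPS, "
          f"slowest frame = {worst_frame_share:.2f}% of the run")

card_a = [60] * 9999 + [10]   # 9,999 frames at 60 FPS, one dip to 10 FPS
card_b = [40] * 9999 + [20]   # 9,999 frames at 40 FPS, one dip to 20 FPS

summarize(card_a, "Card A")
summarize(card_b, "Card B")

Card A ends up with roughly a 60 FPS average against Card B's 40 FPS, while the single dip accounts for well under a tenth of a percent of the run.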
It's one thing to say mins don't matter in cut scenes. But in actual gameplay?
I'm not saying minimums don't matter in gameplay. I'm saying that in order for them to matter, you need to demonstrate that they last for a significant amount of time, enough to impact gameplay. A single minimum value doesn't show you that, because by definition it's a single data point from an entire benchmark run.
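If you want to actually demonstrate it, something like the sketch below is the kind of analysis that does the job. It assumes a plain log with one frame time in milliseconds per line (a hypothetical format, not any specific tool's output) and reports how often and for how long the frame rate stays below a threshold, which is exactly the context a single minimum number leaves out:

# Minimal sketch. Assumption: frame_times_ms is a list of per-frame render
# times in milliseconds, read from whatever frame-time log you have.
# Reports how many times the frame rate dips below a threshold and how long
# each dip lasts.

def dip_report(frame_times_ms, threshold_fps=30.0):
    slow_ms = 1000.0 / threshold_fps           # frames longer than this are below the threshold
    dips = []                                   # (start_time_s, duration_s) for each dip
    elapsed = 0.0
    current = None
    for ft in frame_times_ms:
        if ft > slow_ms:
            if current is None:
                current = [elapsed / 1000.0, 0.0]   # a new dip starts here
            current[1] += ft / 1000.0
        elif current is not None:
            dips.append(tuple(current))             # dip ended
            current = None
        elapsed += ft
    if current is not None:
        dips.append(tuple(current))                 # dip ran to the end of the log
    total_dip = sum(duration for _, duration in dips)
    print(f"{len(dips)} dip(s) below {threshold_fps:.0f} FPS, "
          f"{total_dip:.2f}s total out of {elapsed / 1000.0:.2f}s")
    return dips

# Usage: dip_report([16.7] * 500 + [50.0] * 30 + [16.7] * 500)
# -> 1 dip(s) below 30 FPS, 1.50s total out of 18.20s

That's the kind of information a graph or frame-time log gives you and a single minimum value can't.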