But if you are swapping various CPUs on socket LGA1156, you are keeping all of your other components constant. Legion Hardware clearly shows that with increased CPU clock speed, minimum framerates increase. I also never said minimums are only affected by CPU speed (of course GPU speed affects them as well).
If you are running a benchmark of a game, you aren't going to start FRAPS while the game is still loading, or include cut scenes, and skew your results. This approach eliminates the 0 fps situations you described that occur in less-than-useful real-world scenarios (such as a loading screen, or a cut scene where no gameplay actually occurs). So you can plot a section of a benchmark demo on a graph and see how minimums affect gameplay. This is exactly what HardOCP does. Average framerates are often unable to capture the moments of excessive load on a videocard and CPU where framerates tank and your game chops.
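If you want to see what I mean, here's a rough sketch (Python, assuming a simplified FRAPS-style log with one frame-end timestamp in milliseconds per line; the file name and exact format are just placeholders) that plots FPS over the run the way HardOCP presents it, so the dips are visible instead of being buried in the average:

```python
# Rough sketch: plot per-frame FPS over a benchmark run from a FRAPS-style
# frametimes log (assumed format: one frame-end timestamp in ms per line).
import matplotlib.pyplot as plt

def load_frame_times(path):
    """Return per-frame durations in milliseconds."""
    with open(path) as f:
        stamps = [float(line.strip()) for line in f if line.strip()]
    return [b - a for a, b in zip(stamps, stamps[1:])]

frame_ms = load_frame_times("frametimes.csv")   # hypothetical log file
fps = [1000.0 / ms for ms in frame_ms]          # instantaneous FPS per frame
t = [sum(frame_ms[:i]) / 1000.0 for i in range(len(frame_ms))]  # elapsed seconds

print(f"avg: {len(frame_ms) / (sum(frame_ms) / 1000.0):.1f} fps, "
      f"min: {min(fps):.1f} fps")

plt.plot(t, fps)
plt.xlabel("time (s)")
plt.ylabel("instantaneous FPS")
plt.title("FPS over the benchmark run")
plt.show()
```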
I stated it before, and I'll state it again. Minimum FPS is a poor measurement of how a CPU actually performs. It is too unreliable. Averages taken at lowered graphical settings produce more reproducible results and give a better measurement of how a CPU will perform during drops than a minimum ever could.
Imagine for a moment that you have two CPUs that you just benchmarked. One has a minimum of 18 fps, the other of 20. Which would you say is faster? The fact of the matter is, you can't conclude with any confidence that either is faster, because far too many variables come into play when you pick a single point as your measurement standard. Maybe the OS picked an inopportune time to switch threads, maybe the cache was aligned just right for that scene; who knows, there are literally thousands of variables at work even when the game is seemingly the only thing running.
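To make that concrete, here's a toy simulation (made-up numbers, nothing to do with any real CPU) of the same machine benchmarked several times with rare random hiccups: the average barely moves between runs, while the single minimum jumps all over the place:

```python
# Toy simulation: several runs on the *same* hardware, same game, with random
# one-off hiccups (thread switches, cache misses, background tasks).
# The average barely moves between runs; the single minimum jumps around.
import random
import statistics

def run_benchmark(n_frames=5000, base_fps=60.0, seed=None):
    rng = random.Random(seed)
    fps = []
    for _ in range(n_frames):
        f = rng.gauss(base_fps, 3.0)          # normal frame-to-frame jitter
        if rng.random() < 0.002:              # rare hiccup, ~0.2% of frames
            f *= rng.uniform(0.3, 0.5)        # frame rate tanks briefly
        fps.append(max(f, 1.0))
    return fps

for run in range(1, 6):
    fps = run_benchmark(seed=run)
    print(f"run {run}: avg = {statistics.mean(fps):5.1f}  min = {min(fps):5.1f}")
```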
If your system is GPU bottlenecked, then your average is an equally worthless measure of how fast the CPU is (a fact you seem fine with). You have to adjust settings until your CPU becomes the limiting factor and measure the average then. That will give an accurate report of which CPU is faster, and it will be based on multiple points of measurement, something minimum FPS can't offer.
I've done a lot of code benchmarking (profiling, to be exact) and you can bet your buttons that I NEVER take just one measurement. It is always an average of 100 to 1000 runs (depending on the code's execution time). With the exact same piece of code, I have almost never seen the same benchmark results, and this is with fairly small chunks of code running on the same machine. If you want, I can quickly code this up for you to prove my point.
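Actually, here's roughly what that would look like (a throwaway sketch using Python's perf_counter on an arbitrary chunk of work): repeat the whole experiment a few times and watch how the mean stays stable while any single sample, minimum or maximum, wanders from run to run:

```python
# Throwaway profiling sketch: time the same small chunk of code 1000 times,
# repeat the whole experiment a few times, and compare how stable the mean
# is versus any single sample (min or max).
import statistics
import time

def work():
    # the "chunk of code" being profiled; anything deterministic will do
    return sum(i * i for i in range(2000))

def profile(repeats=1000):
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        work()
        samples.append((time.perf_counter() - start) * 1e6)  # microseconds
    return samples

for run in range(1, 4):
    s = profile()
    print(f"run {run}: mean = {statistics.mean(s):7.1f} us   "
          f"min = {min(s):7.1f} us   max = {max(s):7.1f} us")
```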
A single measurement (which is what minimum FPS is) is an unreliable performance metric.
Tell you what, load the Dirt 2 game demo and run any level you want with 8AA, 4AA and 2AA on your videocard. You will see that your average frames are hardly affected, while your minimums are cut by 50% if not more. Now try playing Dirt 2 with 8AA vs. 2AA: if you have an insufficient videocard, the three brief instances where frames drop to 30fps will be choppy as hell, but you'll still get great 50+ fps averages. In contrast, with 2AA you will still get a 60+ fps average but 48 fps minimums! That's what I am saying. Just like loading a videocard affects minimum framerates, loading a CPU also affects minimum framerates. Ignoring minimum framerates fails to capture AT ALL what I am going to experience playing a game in the real world.
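If you want the arithmetic behind that (made-up frame times, not actual Dirt 2 numbers), here's how little a handful of 30fps-class frames in an otherwise smooth minute dents the average, even though the minimum (and your eyes) catch them immediately:

```python
# Made-up illustration (not real Dirt 2 data): roughly a minute of mostly
# smooth 60 fps gameplay with a few short stretches where the GPU chokes
# to ~30 fps. The average stays high; only the minimum betrays the stutter.
smooth = [16.7] * 3400          # ~16.7 ms frames, i.e. ~60 fps
choppy = [33.3] * 30            # a few ~30 fps frames scattered in
frame_ms = (smooth[:1000] + choppy[:10] + smooth[1000:2000] + choppy[10:20]
            + smooth[2000:3000] + choppy[20:] + smooth[3000:])

total_s = sum(frame_ms) / 1000.0
avg_fps = len(frame_ms) / total_s
min_fps = 1000.0 / max(frame_ms)
print(f"average: {avg_fps:.1f} fps   minimum: {min_fps:.1f} fps")
# -> average still ~59 fps, minimum ~30 fps
```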
Considering minimum FPS is an exercise in futility which reflects what gameplay will be like no better than a synthetic benchmark run on 1990s benchmarking software. There aren't minimums, there is a minimum, singular. One measurement does not reflect reality no matter how you set up your benchmark.
*Trusted* - by whom? Is there an international committee that they are a part of? That's too ambiguous.
Just because most gaming websites don't use minimum framerates in their benchmarks doesn't make that the right methodology. 50 years ago, many disputed the harmful effects of mercury and cigarettes, and those were "trusted" sources. Independent thinking doesn't simply accept the status quo.
Trusted by everyone using this forum. Trusted, meaning it isn't some random name I pulled out of my butt to prove a point. These are websites that people other than myself have heard about and used. If I've never heard of a website, my first reaction is not to trust it. Only after seeing multiple references to said website do I trust it.
Just think about it for yourself: imagine you are a racecar driver moving at 300 km/h over a 10 km straightaway. Over three 300 meter sections you completely lose engine power. Over the 10 km, those 900 meters of lost power will hardly affect your average speed. However, you would be frustrated, I presume? That's like 3 instances of 15fps minimum framerates while playing a game at an otherwise constant 60fps. I guess it then comes down to how susceptible a gamer is to choppiness in gameplay.
And here is where your analogy breaks down. A minimum doesn't report how often the game dips, at all. The real situation is more like this: your race car is driving along, there is a hiccup which causes its output to drop very low. That hiccup may not happen again for the rest of the race, yet you want to look at that single hiccup and say, "Oh, the race car must always slow down to that hiccup speed. We can't use it." If you race it again, the hiccup may never happen, or may be too infrequent to be noticeable. Yet, just because it happened once, you would trash the engine and say it is worthless.
Minimum FPS says nothing about the frequency of drops in the game.
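And if frequency is what actually matters, it's trivial to pull out of a per-frame log instead of staring at a single worst frame; a quick sketch (assuming the same kind of per-frame millisecond durations as a FRAPS log, with made-up example numbers):

```python
# Sketch: from a list of per-frame durations (milliseconds), report how often
# and for how long the run dipped below a chosen FPS floor -- something a
# single "minimum FPS" number cannot tell you.
def dip_report(frame_ms, floor_fps=30.0):
    limit_ms = 1000.0 / floor_fps
    slow = [ms for ms in frame_ms if ms > limit_ms]
    total_s = sum(frame_ms) / 1000.0
    print(f"frames below {floor_fps:.0f} fps: {len(slow)} of {len(frame_ms)} "
          f"({100.0 * len(slow) / len(frame_ms):.2f}%)")
    print(f"time spent below {floor_fps:.0f} fps: "
          f"{sum(slow) / 1000.0:.2f} s of {total_s:.2f} s")

# e.g. with made-up numbers: 990 smooth frames plus 10 slow ones
dip_report([16.7] * 990 + [40.0] * 10)
```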