Many games on that list have a measurable improvement in gameplay when running over 60 FPS. A twitchy game like UT3 or Quake 3 is a slideshow at 60 FPS average. The Stalker games also bog down in the heavier areas if you’re only pulling 60 FPS.
And remember, these are averages, so the minimums will be lower.
First, games differ in how framerate affects their feel, or else I may just have lower standards for what counts as a good framerate. A game like Crysis with motion blur may feel smoother at 30fps than a game like Quake 3 at 45fps, to use one hypothetical comparison. I don't agree that Quake 3 is a "slideshow" at 60fps average, though it may feel jerkier than, say, Crysis at 60fps.
Second, you are right that minimum framerates matter, and I totally agree that an average is meaningless if the game continually spikes down. But if it's an isolated spike here or there, it matters less; a chart of fps over time shows how big the problem really is.
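To make that concrete, here's a toy sketch (made-up numbers, not from any real benchmark) of how an average can hide a stutter problem: a run that averages close to 60fps can still dip to 10fps, which is exactly what a minimum or "1% low" figure exposes and a plain average doesn't.

```python
# Toy frame-time data: mostly a solid 60fps, plus ten isolated 100ms spikes.
frame_times_ms = [16.7] * 990 + [100.0] * 10

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Time-weighted average fps: total frames over total elapsed seconds.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Minimum fps, and the "1% low" (average of the worst 1% of frames).
min_fps = min(fps_per_frame)
worst = sorted(fps_per_frame)[: max(1, len(fps_per_frame) // 100)]
low_1pct = sum(worst) / len(worst)

print(f"average: {avg_fps:.1f} fps")   # ~57 fps -- looks fine on paper
print(f"minimum: {min_fps:.1f} fps")   # 10 fps  -- the spikes
print(f"1% low:  {low_1pct:.1f} fps")  # 10 fps  -- how bad the dips really are
```

Ten bad frames out of a thousand barely move the average, which is the whole point: you need the minimum, a 1% low, or a time plot to see them.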
Third, and this is a bit off-topic for this thread, but I may have lower image-quality standards than other people, so lower graphics settings are fine by me. Imho, there are diminishing marginal returns to image quality: the jump from no MSAA to 2x MSAA is bigger than the jump from 2x to 4x, which in turn is bigger than the jump from 4x to 8x, just to give one example.
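For what it's worth, here's a back-of-the-envelope way to see why those returns diminish (a toy model I'm making up for illustration, not a real perceptual metric): with n coverage samples per pixel, an edge pixel's coverage is quantized to steps of 1/n, so the worst-case error is 1/(2n), and each doubling of sample count buys only half the improvement of the previous doubling.

```python
# Toy model: worst-case edge-coverage quantization error with n MSAA samples.
prev_err = None
for samples in [1, 2, 4, 8]:
    err = 1.0 / (2 * samples)  # coverage in 1/n steps -> max error 1/(2n)
    gain = prev_err - err if prev_err is not None else 0.0
    print(f"{samples}x: error {err:.4f}, improvement over previous {gain:.4f}")
    prev_err = err
```

Under this model, going 1x to 2x cuts the error by 0.25, 2x to 4x by 0.125, and 4x to 8x by only 0.0625, while the sample cost doubles every step.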
Another example is tessellation, which in some games didn't really improve image quality. I recall a lot of discussion over Metro 2033, for instance, with side-by-side and mouseover comparisons, and I agree that tessellation as implemented in that game didn't add much to image quality despite hurting framerates significantly.
Also, many multiplayer games are so fast-paced that you'll hardly notice the slight bump in image quality because you're too busy trying to kill without being killed, or to outrace another car, or whatever. So turning down the eye candy slightly to improve framerates can be a great tradeoff.