Personally I think measuring average improvement across a limited number of games is a bunch of crap. It really doesn't make sense when you're talking about a suite of totally different games. Each game, and the improvement in that game, is what matters. A so-called average improvement isn't a true picture of how the card performs at all. Let all the numbers stand on their own and the potential buyer can decide whether the game/application improves enough to warrant the purchase.
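Quick sketch of what I mean, with completely made-up FPS numbers (not from any review), just to show how an average can describe none of the individual games:

```python
# Hypothetical FPS for an old card vs. a new card in three very different games.
old_fps = {"Game A": 40, "Game B": 90, "Game C": 60}
new_fps = {"Game A": 44, "Game B": 135, "Game C": 61}

# Per-game improvement in percent.
gains = {g: (new_fps[g] - old_fps[g]) / old_fps[g] * 100 for g in old_fps}
average_gain = sum(gains.values()) / len(gains)

for game, gain in gains.items():
    print(f"{game}: {gain:.0f}% faster")      # 10%, 50%, 2% -- wildly different per game
print(f"Average: {average_gain:.0f}% faster")  # ~21%, which matches none of them
```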
I don't know about that, dude. I use my rig to play all sorts of different games. If we're talking about old games, then I'd agree: they'd be useless and would skew the results badly, since the hardware would either be smoking those games or I'd have finished playing them long ago.
If it's a suite of current games then it's perfect. It's a good indicator of what I could expect from the card as newer games come out. I guess if someone is buying a card just to play one game then focusing on that single game would make sense, but I assume most gamers play lots of games, particularly ones spending this sort of money.
Looking at HWC's most recent review, these are the current games they bench:
Batman AC
Crysis 2
Battlefield 3
Deus Ex HR
Metro 2033
Dirt 3
Shogun 2
Skyrim
I've always found overall averages invaluable; they generally play out well for what I can expect from a card in my usage. Particularly computerbase.de's overall charts, which break performance down by resolution (1080p, 1200p, 1600p) and by AA level (4x, 8x, etc.).
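I don't know exactly how they build those overall charts, but conceptually it's something like normalizing each game against a reference card and averaging. A rough sketch with hypothetical FPS numbers (the method and figures here are my assumption, not their actual process):

```python
from statistics import geometric_mean

# Hypothetical per-game FPS for a reference card and the card under test,
# at one resolution/AA setting. Only an illustration of how an overall
# rating *could* be computed.
reference = {"Batman AC": 70, "Battlefield 3": 55, "Skyrim": 80, "Metro 2033": 35}
candidate = {"Batman AC": 84, "Battlefield 3": 66, "Skyrim": 88, "Metro 2033": 42}

# Normalize each game to the reference card, then take the geometric mean
# so a single very high-FPS title can't dominate the overall number.
ratios = [candidate[g] / reference[g] for g in reference]
overall = geometric_mean(ratios)

print(f"Overall rating: {overall * 100:.0f}% of the reference card")
```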