So you guys care more about your GPU's performance at the end of its life cycle (when you're turning down settings anyway and looking to upgrade) than at the beginning?
I think it's specifically that timeframe that is the most interesting one. Or, well, interesting to some at least.
Look at it from this perspective: two months after you bought the new card, you're likely still playing the games that you already knew were performing at x fps thanks to benches on the net. There's little reason to do anything else but look out for confirmation of your purchase, and if you can't find any: "Meh, I don't care."
But two years later you happen to buy that new action-RPG and want to play the snot out of it. So you start it up, disable the settings that you know will hammer performance on any card, put everything else to medium-high, and off you go. Four hours down the line you enter the first major city and...
- you see fps in the low 40s with occasional stutters, don't want to turn settings down now that you've grown accustomed to those visuals, and end up impulse-buying a new card in August, or
- you see fps in the high 40s with no stutters, finish the game on this card, and aim for a good deal on Black Friday, which stands a fair chance of getting you much higher performance for the same money you would otherwise have spent in August.
Just a Gedankenspiel (thought experiment) with absolutely no basis in real events, I assure you.
On a side note, I just looked at the Steam hardware survey stats.
It took me a moment to realize that the HD 7900 series still sits at 1.84%, losing only 0.11% from the May results, while the GTX 670 and GTX 680 have a combined 0.64% + 0.43% = 1.07%, dropping 0.18% since May. Not that those stats are particularly reliable, but it's another data point that seems to support this thread.
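For the curious, here's a quick back-of-the-envelope sketch (plain Python, just my own arithmetic, not anything pulled from Steam) of what those deltas imply in relative terms, assuming the May shares were simply the current shares plus the quoted drops:

    # Rough check of the survey deltas quoted above.
    # May shares are inferred by adding the quoted drops back onto the
    # current shares -- an assumption, since I didn't save the May page.
    hd7900_now, hd7900_drop = 1.84, 0.11     # HD 7900 series, in %
    gtx_now, gtx_drop = 0.64 + 0.43, 0.18    # GTX 670 + GTX 680 combined, in %

    hd7900_may = hd7900_now + hd7900_drop    # ~1.95%
    gtx_may = gtx_now + gtx_drop             # ~1.25%

    # Relative decline since May: about 6% for the HD 7900 series
    # versus about 14% for the GTX 670/680 pair.
    print(f"HD 7900 series: {hd7900_drop / hd7900_may:.1%} relative drop")
    print(f"GTX 670/680:    {gtx_drop / gtx_may:.1%} relative drop")

In relative terms, the GTX pair's share is shrinking at more than twice the rate of the 7900 series'.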