Maybe facts instead for a change?
The cost difference to run isn't even worth talking about.
Still, I would agree that for a single-card setup the 7950 is the better buy. The 580 was never a great card for the price, and at $450 it still isn't; neither is the 7950, though.
Careful, there. That map is a) out of date, and b) due to averaging, it loses information and thus doesn't accurately reflect real-life energy usage scenarios, especially with tiering*, which many states practice. As an example, in CA you can pay something like 15 cents/kWh if you're in the first tier, but it rapidly increases to 30 and then even 50 cents/kWh as you go up the tiers to the fifth tier. This can be especially problematic in the summertime, when waste heat needs to be cooled, oftentimes by air conditioning, which means even more of a surge in electrical load and a higher tiered rate.
Plus, not everyone lives in the continental U.S. What about Europe, Asia, Hawaii, etc.?
I deliberately chose a high number like $80k for a car to acknowledge that fuel efficiency has a marginal effect unless you drive (game) a lot. Frankly, though, the 7950's low idle power (11W vs. 32W for the 580 1.5GB version, and more for the 3GB version) is the bigger effect if you leave your PC on 24/7. The 21W difference is really more like 21/0.8 = 26W at the wall, assuming a standard 80+ efficient PSU, and over the course of an idle year that's roughly a $34 difference at $0.15/kWh. You will be gaming some during the year, though, so the actual operating cost differential over that year will be more like $40. Those hours add up, even if each idle hour only costs a fraction of a cent. There are 8760 hours in a non-leap year.
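If you want to sanity-check that arithmetic, here's the quick back-of-the-envelope I'm using, written out as a bit of Python. The wattages, the 80+ PSU efficiency, and the $0.15/kWh rate are just the assumptions from above, not measurements:

# Back-of-the-envelope: annual idle cost difference, GTX 580 1.5GB vs. HD 7950.
# All of these numbers are the assumptions from the post, not measured values.
IDLE_580_W = 32.0        # GTX 580 1.5GB idle draw, in watts
IDLE_7950_W = 11.0       # HD 7950 idle draw, in watts
PSU_EFFICIENCY = 0.80    # a standard 80+ PSU
RATE_PER_KWH = 0.15      # $/kWh, a low (tier 1-ish) rate
HOURS_PER_YEAR = 8760    # non-leap year

wall_watt_diff = (IDLE_580_W - IDLE_7950_W) / PSU_EFFICIENCY   # ~26 W at the wall
kwh_per_year = wall_watt_diff * HOURS_PER_YEAR / 1000.0        # ~230 kWh
annual_cost_diff = kwh_per_year * RATE_PER_KWH                 # ~$34

print(f"Extra draw at the wall: {wall_watt_diff:.1f} W")
print(f"Extra energy per idle year: {kwh_per_year:.0f} kWh")
print(f"Extra cost per idle year: ${annual_cost_diff:.2f}")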
But even if you feel that the "car sticker price" number should be more like $100k or higher, and you diligently turn off your computer whenever you're not using it, live in a part of the world with cheap electricity, and stay at the lowest tiered rate (so that the operating cost difference is more like $5-10 instead of $40), the point remains: why pay the same money for something that's a little slower and a lot worse in fuel efficiency? I mean, it's the SAME PRICE. And that's assuming we're comparing apples to apples: 3GB vs. 3GB. The value discrepancy only gets worse if you're comparing against the GTX 580 1.5GB versions.
It seems that we agree on that, though.
And I agree with you that none of those cards (580, 79xx) are good values when compared to their little brothers like the 6950 or 560 Ti. But it's all relative... the 7950 doesn't compete for the same pool of buyers as those cards; it competes for the buyers who would otherwise buy a 580.
NV hurry the hell up please, we need the GK104 NOW!
* Tiering, if you don't already know, works like this. (It is an energy conservation/efficiency system that encourages households to use less energy and to buy more energy-efficient appliances.) The exact numbers vary by jurisdiction, so I'm just going to go with hypothetical numbers:
Tier 1 = 100 units
Tier 2 = 100 units
Tier 3 = 100 units
Tier 4 = 100 units
Tier 5 = anything left over
If you use 100 or fewer units of power in your billing month, you pay tier 1 rates, which are the lowest. If you exceed the tier 1 bucket, you go up to tier 2. If you use more than 200 units, you are staring at tier 3 or higher.
Typically tier 1 rates are cheapest, tier 2 is still not that bad, tier 3 is bad, tier 4 is nasty, and tier 5 makes you want to put solar panels on your rooftop, because the rates are so horrible that the panels are actually the cheaper option.
So say you use 700 units of power. 100 units are billed at tier 1, the next 100 billed at tier 2, the next 100 billed at tier 3, the next 100 billed at tier 4, and the leftover 300 units of power are billed at tier 5.
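To make that 700-unit example concrete, here's a tiny Python sketch of how a tiered bill adds up. The tier sizes are the hypothetical 100-unit buckets above; the rates are made up purely for illustration, since the real numbers vary by jurisdiction:

# Hypothetical tiered billing. Tier sizes match the example above; the rates
# are invented for illustration only (real rates vary by jurisdiction).
TIERS = [                  # (units in this tier, rate per unit in $)
    (100, 0.13),           # tier 1
    (100, 0.15),           # tier 2
    (100, 0.20),           # tier 3
    (100, 0.30),           # tier 4
    (float("inf"), 0.50),  # tier 5: anything left over
]

def tiered_bill(units_used):
    """Bill the first 100 units at tier 1, the next 100 at tier 2, and so on."""
    remaining = units_used
    total = 0.0
    for tier_size, rate in TIERS:
        in_this_tier = min(remaining, tier_size)
        total += in_this_tier * rate
        remaining -= in_this_tier
        if remaining <= 0:
            break
    return total

# The 700-unit example: 100 units at each of tiers 1-4, the last 300 at tier 5.
print(f"Bill for 700 units: ${tiered_bill(700):.2f}")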
How jurisdictions set the thresholds for each tier depends on a variety of factors, including historical average usage. In California, it does NOT depend on the square footage of your household or the number of people living in it.
You could argue that the kinds of people who buy high-end GPUs can easily afford the extra $40/year it would take to run a GTX 580, so who cares?

Okay, fair enough, but the actual extra is probably more than $40/year: closer to $60 or even $90/year if you live in Northern California. This is because people who can afford high-end GPUs probably also have a lot of other electronic toys (multiple PCs and TVs, fridge, home entertainment and stereo system, microwaves, etc.), and I find it doubtful that all of that usage is being billed at tier 1 and 2 rates. Thus that outdated map understates the actual marginal cost of operating a GPU.
Out of curiosity, I did some back-of-the-envelope calculations and figured roughly a $90+ cost differential if the usage is billed at tier 4 CA rates. So it's $90+ costlier per year to run a GTX 580 instead of an HD 7950 if you are at tier 4 rates in California and leave your PC idling 24/7 (with occasional bursts of gaming).
http://www.pge.com/yourtiers/
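For what it's worth, here's roughly how my tier 4 back-of-the-envelope goes. The tier 4 rate, the yearly gaming hours, and the extra wall draw while gaming are all rough assumptions on my part (check the PG&E link above for the actual current rates), so treat the output as a ballpark:

# Rough yearly cost differential at an assumed CA tier 4 rate. The rate, the
# gaming hours, and the load-power difference are assumptions, not official numbers.
TIER4_RATE = 0.34        # $/kWh, assumed marginal tier 4 rate
IDLE_DIFF_W = 21 / 0.8   # ~26 W extra at the wall while idle
LOAD_DIFF_W = 70 / 0.8   # assumed ~88 W extra at the wall while gaming
GAMING_HOURS = 600       # assumed gaming hours per year
IDLE_HOURS = 8760 - GAMING_HOURS

extra_kwh = (IDLE_DIFF_W * IDLE_HOURS + LOAD_DIFF_W * GAMING_HOURS) / 1000.0
print(f"Extra energy: {extra_kwh:.0f} kWh/year")                    # ~267 kWh
print(f"Extra cost at tier 4: ${extra_kwh * TIER4_RATE:.2f}/year")  # ~$91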
Yes, $90/year is still chump change to some people. But it is not a negligible percentage of the price of the card itself. Would you ignore a $90 rebate on a shiny new GTX 580 or HD 7950 and not bother to send it in? (And that assumes you keep the card for only 1 year. Keep it for 2 years and it's more like a $180 rebate. Three years? A $270 rebate.)
By the way, there is a reason why Google placed its server clusters in places like the Columbia River area of Oregon, close to some huge hydropower plants that generate relatively cheap electricity.