Sorry, some parts below are a bit off topic.
How many of us have recommended getting a 5850 and OCing it to the moon rather than buying a 5870? I sure did. I can therefore understand the frustration when every thread turns into "Card A is better because it consumes less power and makes less noise." (I am guilty of that myself too btw).
We are also proud of overclocking because we can take a $150-250 CPU and clock it to match the performance of a $500+ CPU. Why not do the same with videocards? Sometimes I have to remind myself that times are changing and people care more and more about factors such as power consumption. So now we are faced with two conflicting objectives: maximizing performance while minimizing power consumption.
Back in the day, I remember the fuss about the inflated 480W+ power supply requirements for 6800GT/Ultra cards. I quote: "The GeForce 6800 Ultra is power-hungry (haha!) and is capable of consuming up to 110W under a peak load. The GeForce FX 5950 Ultra topped out at 80W." Many people on this forum went bonkers because they thought they had to upgrade their 350W power supplies.

Now we consider 500-700W power supplies "mainstream" and a GTS450 at 106W TDP child's play. Constantly increasing power consumption of components is eventually unsustainable, so once in a while both companies should reassess the performance per watt of their parts.
Let's say this together: the 5950 Ultra, the top of the line back then, drew only 80W at load; today the GTX480 is 250W, and the 5870 is not much better at 188W. Should we expect 300-400W in another 10 years? Seems like it.
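Just to put that guess in perspective, here is a quick back-of-the-envelope sketch (Python, using only the two data points mentioned above: ~80W for the 5950 Ultra around 2003 and 250W for the GTX480 in 2010; the projections are naive trend lines for illustration, not predictions):

```python
# Naive extrapolation of flagship GPU load power from two data points
# taken from the post: FX 5950 Ultra ~80W (circa 2003), GTX480 250W (2010).
# Purely illustrative -- real TDPs depend on process nodes, market pressure, etc.

start_year, start_watts = 2003, 80.0
end_year, end_watts = 2010, 250.0
span = end_year - start_year  # 7 years between the two flagships

# Linear trend: a fixed number of watts added per year.
watts_per_year = (end_watts - start_watts) / span        # ~24 W/year
linear_2020 = end_watts + watts_per_year * 10            # ~490 W

# Compound trend: a fixed percentage increase per year.
yearly_growth = (end_watts / start_watts) ** (1 / span)  # ~1.18x per year
compound_2020 = end_watts * yearly_growth ** 10          # ~1270 W

print(f"Linear trend: ~{linear_2020:.0f}W flagship in 2020")
print(f"Compound trend: ~{compound_2020:.0f}W flagship in 2020")
```

Even the gentler linear trend lands near 500W, and the compound trend is absurd, which is exactly the point: this curve cannot keep going the way it has.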
A little interesting tidbit from Anandtech's review of the X800XL vs. 6800GT:
"The X800 XL does consume less power and thus, will run cooler than the 6800GT, which is a plus for ATI. Given that the 6800GT is already a single slot solution, the power/heat advantage isn't one that is entirely noticeable considering that the X800 XL cannot be run fanless."
Six years ago, Anand basically dismissed the 10W power difference between the two cards as immaterial. Today, it's not unusual to see bickering in the forums over 10-20W of idle power consumption between HD5000 and Fermi (for example), or 30W at load. Of course, we know the 6800GT sold extremely well despite the power consumption disadvantage and the $100 higher price tag, simply because it was faster than the X800XL. The same argument isn't made for the GTX470 over the 5850 today, is it? No, because the power consumption difference is more like 90W.
Still, we have even seen arguments that a $100 more expensive 5870 is worth it over, say, a GTX470, often on the basis that it consumes less power and runs cooler. Wait a second: when did we last see a cooler-running, $100 more expensive videocard with lower idle power consumption lose a lot of market share to a hotter-running card with similar performance that cost $100 less at launch? The 4870 at $299 was that card, positioned right against a $399 GTX260. Fast forward to today, and hardly anyone is recommending a 470 over a 5870, or thinks the 5870 should get a $100 price cut.
So what happened? Has price/performance become secondary for enthusiasts? Well, 5870s have been selling for $350-370 for 12 months (and still are)... while NV had to quickly cut the GTX260 to $299...
What about when the X1900XTX replaced the X1800XT? That was an increase of almost 60W at load, about the difference between a GTX470 and a GTX460 today. I bet those who remember the X1800XT would say it was a disappointment and the X1900XTX was far superior. What happened to the 60W of extra power? It didn't matter, because the performance was there in spades. Now, if a faster card consumes 40-60W more power, it's basically dismissed as a hot, inefficient dud. (Not trying to start an argument over which card is better; I'm only looking at power consumption in relative terms.)
I guess the environmentalists are doing a great job, because now we have to juggle (1) price, (2) noise, (3) power consumption, (4) heat, and (5) performance when recommending videocards, with #5 dead last?
Just my 2 cents.