It's the same tired talking point you see the brand-loyal fall back on when their brand of choice isn't winning the benchmarks. 'The card is faster!' 'Well, it uses more power!' 'I just bought one for $600 and it's smoking fast; I don't care that it uses an extra nickel a month in electricity.'
What isn't mentioned is that some people buy 2-3 of these cards. Even if the NV card were to use 30W less power, most of the time these cards are sitting idle. AMD's ZeroCore will make up that 30W at idle on the other 2 cards during the hours of the day when a person is not gaming. It's a wash.
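To put a rough number on the "nickel a month" / "it's a wash" claims, here's a quick sketch. The $0.12/kWh rate, the 3 hours/day of gaming, and the ~20W per card that ZeroCore eliminates at idle are my own assumptions for illustration, not measured figures:

```python
# Rough sketch of the electricity argument above. Rate, gaming hours,
# and per-card ZeroCore idle savings are assumptions, not measurements.

RATE_PER_KWH = 0.12          # USD per kWh (assumed)
GAMING_HOURS_PER_DAY = 3.0   # assumed
IDLE_HOURS_PER_DAY = 21.0    # rest of the day

def monthly_cost(watts, hours_per_day, rate=RATE_PER_KWH):
    """Cost of drawing `watts` for `hours_per_day` over a 30-day month."""
    return watts / 1000.0 * hours_per_day * 30 * rate

# Extra 30W on one card during gaming:
extra_gaming = monthly_cost(30, GAMING_HOURS_PER_DAY)

# ZeroCore parking two secondary cards at idle, assuming ~20W saved per card:
zerocore_savings = monthly_cost(2 * 20, IDLE_HOURS_PER_DAY)

print(f"Extra gaming cost/month: ${extra_gaming:.2f}")     # ~$0.32
print(f"ZeroCore savings/month:  ${zerocore_savings:.2f}") # ~$3.02
```

Under these assumptions the idle savings actually exceed the gaming penalty, which is the point being made.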
What we want to see are Crysis 3, Far Cry 3, Metro Last Light, Tomb Raider and BF4 benchmarks of 780 max overclocked vs. R9 290X max overclocked.
I care. I'm building an SFF system that can only fit a 450W PSU, so even an extra 30W from the 290X might knock it out of contention for me.
That's not logical. If 30W of extra power is the difference between running your PSU at the limit and destroying it, you shouldn't be putting either the 780 or the R9 290X into it.
But let's investigate this anyway. A system with an i7 3770K @ 4.8GHz paired with 3 different 780s:
Based on this, if your PSU is only 450W, it had better be rated to deliver its full 450W at 50°C, or running an OC'd i7 + OC'd 780 becomes very risky. Your best bet looks like overclocking your CPU without any voltage increases, so you leave as much room as possible for 780/R9 290X overclocking.
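If you want to sanity-check your own build, here's a rough headroom calculation. Real PSUs publish their own derating curves; the linear 1%/°C above 40°C used here is just an assumed placeholder, as is the 400W system load figure:

```python
# Hedged sanity check on PSU headroom. Real derating curves vary by
# model; the linear 1%/degC derating above 40C used here is an
# assumption for illustration only -- check your PSU's datasheet.

def derated_capacity(rated_watts, case_temp_c, rated_temp_c=40.0,
                     derate_per_c=0.01):
    """Effective continuous capacity after temperature derating."""
    over = max(0.0, case_temp_c - rated_temp_c)
    return rated_watts * (1.0 - derate_per_c * over)

PSU_RATED = 450.0      # the SFF PSU from the post above
SYSTEM_DRAW = 400.0    # assumed: OC'd i7 + OC'd 780/290X under load

capacity = derated_capacity(PSU_RATED, case_temp_c=50.0)
headroom = capacity - SYSTEM_DRAW
print(f"Derated capacity: {capacity:.0f}W, headroom: {headroom:+.0f}W")
# -> 405W capacity, +5W headroom: effectively zero margin.
```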
Power consumption may not be at the top of the totem pole, but it's on it for me. Efficiency may matter to some for multi-GPU setups as well. Disregarding or downplaying it is odd given that virtually every review investigates it. Personally, I'd let the market ultimately decide instead of vocal posters!
Some of the comments you make read almost as if they came out of a PR handbook. They seem on topic but don't address anything in detail and ignore everything that was discussed prior to your comment.
Multi-GPU efficiency has already been addressed. NV doesn't have ZeroCore Power, which means that during the 18 hours or so you spend sleeping and working, the AMD GPUs will save a ton of power, offsetting the roughly 90W gaming-power gap between a 780 x3 and an R9 290X x3. If you are going to talk about using more power during games, why aren't you discussing using less power when 3-4 of these GPUs are sitting idle?
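Here's the back-of-the-envelope daily energy balance for the 3-card scenario. The 90W gaming delta is from above (30W/card x 3); the 4 hours of daily gaming and the ~20W/card ZeroCore saving are my assumptions:

```python
# Daily energy balance for the 3-card scenario. The 90W gaming delta
# comes from the thread (30W/card x 3); the hours and the ~20W/card
# ZeroCore idle saving are assumptions for illustration.

GAMING_DELTA_W = 90.0      # extra draw of the hungrier trio while gaming
GAMING_HOURS = 4.0         # assumed daily gaming time
IDLE_SAVING_W = 2 * 20.0   # ZeroCore parks 2 secondary cards (~20W each, assumed)
IDLE_HOURS = 18.0          # sleeping + working, per the post above

extra_gaming_wh = GAMING_DELTA_W * GAMING_HOURS   # 360 Wh/day
idle_savings_wh = IDLE_SAVING_W * IDLE_HOURS      # 720 Wh/day

print(f"Extra gaming energy: {extra_gaming_wh:.0f} Wh/day")
print(f"ZeroCore savings:    {idle_savings_wh:.0f} Wh/day")
# The idle savings can cancel (here, exceed) the gaming delta: a wash.
```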
What's more, you keep making blanket statements about power consumption, but the discussion is specifically about 30-40W power consumption differences between flagship products, NOT about whether power consumption is irrelevant as a whole. Various posters have already said that if the difference were 50% / 100W or something similarly significant, it would matter. No one is arguing power consumption itself doesn't matter. You aren't looking at the context of what's being discussed.
Context: a person running an i7 3770K @ 4.8GHz is a PC enthusiast. The type of user who spends $600-700 on a GPU like an MSI Lightning / EVGA Classified / Asus DCUII will most likely overclock that card too; why else pay the premium? Looking at the benchmarks, overclocking the 780 alone without voltage control takes total system power usage into the 400W range. Overclocking the 780 with voltage control takes the system into the 500W range.
You think a 30-40W difference matters at that point, when the total system is drawing 400-500W, against a card that may be 8-10% faster and/or cost less? You are not being realistic.
Someone who is at or near the limit of their PSU will not be overclocking the CPU or the GPU. In that case, any of the 780, R9 290X, or Titan will work in a small form factor system with a 450W PSU.
And with cases such as the SilverStone Sugo SG07 shipping with a 600W PSU, someone building an SFF system now has a solution as well:
http://www.newegg.com/Product/Produc...82E16811163212
If an R9 290 OC can compete with a 780 OC while undercutting it by $100-150, please tell us again how the extra power consumption will matter. What's the break-even point, in years, on that extra power draw?
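Since the question was asked, here's a break-even sketch. The 30W delta, the $0.12/kWh rate, and 3 hours/day of gaming are all assumptions; plug in your own numbers:

```python
# Break-even sketch for the question above. The wattage delta, the
# electricity rate, and the daily gaming hours are assumptions.

PRICE_GAP = 125.0      # midpoint of the $100-150 undercut from the post
DELTA_W = 30.0         # extra gaming draw (assumed)
RATE = 0.12            # USD per kWh (assumed)
HOURS_PER_DAY = 3.0    # assumed

yearly_cost = DELTA_W / 1000.0 * HOURS_PER_DAY * 365 * RATE
years_to_break_even = PRICE_GAP / yearly_cost
print(f"Extra cost/year: ${yearly_cost:.2f}")          # ~$3.94
print(f"Break-even:      {years_to_break_even:.0f} years")  # ~32 years
```

Under these assumptions, it takes roughly three decades of gaming for the extra power draw to eat a $125 price advantage.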