Just because you justify every AMD product with walls of text "proving" their performance per dollar advantage does not mean the consumer has to accept that, or will not use other metrics as well to make their purchase decision.
Maybe you should read my post again and come up with a better rebuttal to my points. The first part of my post has nothing to do with AMD's price/perf; it's clearly about NV's changing marketing strategy over the last 5 years. It's obvious you haven't read my posts, as I have recommended countless NV GPUs over the last 13 years. There have been plenty of AMD GPUs that I think were not worth buying at all, almost regardless of their price/performance. Recently I slammed the 285, and I slammed the 7970 for many months after release. I also skipped the HD2000 and 3000 series and, wait for it... I put my money where my mouth is and bought GTX470s and overclocked them to 750mhz, because they beat overclocked 5850s in DX11 games (with tessellation), which meant their inferior perf/watt was irrelevant. But you must not remember any of that, or me recommending GTX480s 7 months after launch, or GTX460 overclocking, etc.
Getting back to perf/watt.
You seem to forget the facts: AMD had superior perf/watt for ages, from the day the HD4850/4870 launched all the way until the last Kepler card dropped on the desktop (Kepler took 2.5-9 months to launch top-to-bottom). Even though the 670/680 beat the 7950/7970 in perf/watt, NV still used an outdated stack of Fermi cards to compete at the lower end for 6+ months afterward without any problems. If perf/watt were such a critical factor, NV would have been on the ropes ever since AMD's HD4800 series.
From the day the HD4850 launched until the day the GTX680 arrived, AMD did not gain much total market share, if any. I am not going to pull up the charts again, as I've done it 10X. So AMD had both perf/watt and price/perf and still didn't make a dent in overall market share over the last 5-6 years. So is your point that people only started caring about perf/watt now, or is it that NV is in the lead?
When AMD basically beat NV in nearly every metric, from VRAM to price/perf to perf/mm2 to double precision to perf/watt, did NV ever drop to 24% market share?
You also missed my other point. Just like 5 years ago, NV still makes 100W, 200W and 250W GPUs. Get the point? NV never stopped making flagship cards that use 250W+ of power, which means NV isn't some Greenpeace company that makes 150W GPUs and nothing else. So how does NV get people to upgrade more often to a lackluster card like the 960, or market a card barely faster than a 780Ti for $550 when in the past NV always sold that tier in the $200-300 band? Non-stop perf/watt marketing.

People on these forums are still too butt-hurt to accept that the GTX680/980 are GTX460/560Ti successors, because they don't want to admit they are paying double for what was always a mid-range $250-300 card. No one wants to publicly admit they overpaid, so how does one justify the purchase? Had NV purposely delayed the GTX480/580, launched a $499 GTX460 as the next-gen GTX480 flagship, followed up with a $550 GTX560Ti as the flagship after that, and only then released the real flagship GTX580 for $1K, how would that make you feel? Welcome to the Kepler + Maxwell strategies.
Have you ever noticed that no reviewer in the world performs a perf/watt vs. price premium cost-benefit analysis? Why is that? They just say this card uses X fewer watts of power, so its price premium is justified (and they blatantly ignore price/perf). Try pulling that type of analysis in the finance world, where you need to back up your conclusion with financial proof. The comparison is simple:
The premium it costs to buy a more efficient product (say a more power-efficient TV, dryer/washing machine, videocard, hybrid vehicle, etc.)

vs.

the time it would take to break even and start recouping the cost savings relative to a less efficient one.
Just because most of the market thinks the premiums are worth it, doesn't mean squat because most people can't do finance, can't calculate TCO, and buy based on emotions and what makes them feel good. It takes FAR more work to justify your buying decision with mathematics that are free of emotional bias.
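That break-even math is trivial to write down. Here's a minimal sketch in Python; every number in the example (price premium, wattage gap, hours of use, electricity rate) is an illustrative assumption, not real data for any particular card:

```python
# Back-of-the-envelope break-even calculator for an efficiency price premium.
# All the example numbers below are assumptions for illustration only.

def break_even_years(price_premium, watts_saved, hours_per_day, price_per_kwh):
    """Years until electricity savings repay the up-front price premium."""
    kwh_saved_per_year = watts_saved / 1000 * hours_per_day * 365
    savings_per_year = kwh_saved_per_year * price_per_kwh
    return price_premium / savings_per_year

# Hypothetical example: a card that costs $100 more but draws 80W less
# under load, gamed 3 hours/day at $0.12/kWh.
years = break_even_years(price_premium=100, watts_saved=80,
                         hours_per_day=3, price_per_kwh=0.12)
print(f"Break-even: {years:.1f} years")  # roughly 9.5 years
```

With those assumed inputs the premium takes close to a decade of gaming to pay for itself, which is exactly the kind of result reviewers never show next to their perf/watt charts.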
A perfect example is the popularity of the Toyota Prius:
"It turns out the additional $5,473 required for the privilege of owning a Prius instead of an Insight can buy a lot of fuel. At today's fuel prices, the actual monetary savings earned by the Prius' edge in fuel economy is miniscule, working out to a paltry $70 per year. Paying off the Prius' extra tariff in sticker price with the savings in fuel purchases would require more than 75 years."
http://www.edmunds.com/honda/insight/2010/comparison-test1.html
One can easily find hundreds of thousands of consumers buying Toyota Priuses to save on fuel costs, when it'll take 10-20+ years to break even versus a $16,000 Honda Fit. They'll offer all kinds of illogical reasons for how the Prius saves them money on fuel, but they should just say they bought a Prius because they wanted to and it made them feel good inside. From a financial standpoint the Prius is a questionable buy in most cases, but it's a marketing success. The Prius marketing and the feel-good-about-yourself factor work, but for someone who breaks things down mathematically, perf/watt or miles per gallon can ALL be measured to provide an exact monetary benefit. I've done the math to figure out whether perf/watt is financially beneficial enough to warrant the price premiums; have you? I bet most consumers in the world have not.
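The same break-even math applies to miles per gallon. A quick sketch, where the MPG figures, annual mileage, and gas price are made-up illustrative inputs (only the $5,473 premium comes from the Edmunds comparison above):

```python
# Fuel-economy version of the break-even math.
# MPG figures, mileage, and gas price are illustrative assumptions.

def fuel_cost_per_year(miles_per_year, mpg, price_per_gallon):
    """Annual fuel spend for a given economy and mileage."""
    return miles_per_year / mpg * price_per_gallon

def mpg_break_even_years(price_premium, miles_per_year,
                         mpg_efficient, mpg_baseline, price_per_gallon):
    """Years for fuel savings to repay the efficient car's price premium."""
    savings = (fuel_cost_per_year(miles_per_year, mpg_baseline, price_per_gallon)
               - fuel_cost_per_year(miles_per_year, mpg_efficient, price_per_gallon))
    return price_premium / savings

# Hypothetical: a 50 MPG hybrid carrying a $5,473 premium vs. a 35 MPG
# compact, driven 12,000 miles/year at $3.50/gallon.
years = mpg_break_even_years(5473, 12000, 50, 35, 3.50)
print(f"Break-even: {years:.1f} years")  # roughly 15 years
```

Even with a much cheaper baseline car than the Insight in the Edmunds test, the assumed numbers land squarely in that 10-20+ year range.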
Some highly technical people on our forums have done the math and even measured the increase in their room temperature from running 600W of GPUs vs. say 300W of GPUs. Most other folks are probably buying a more efficient card because they are told it's cooler, quieter and overall better by review sites (you know, the same outlets that get marketing dollars for reviewing and pushing sales! duh!), but they haven't got a clue how to prove whether it's really better or not. Also, a card using 150W of power can still run hotter and louder than a card using 500W of power, depending on its cooler.
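The room-temperature point is easy to sanity-check: essentially every watt a GPU draws ends up as heat in the room, so you can convert electrical draw straight into space-heater terms. A tiny sketch (the wattage figures are just the two examples from above):

```python
# Every watt of GPU power draw ends up as heat dumped into the room,
# so electrical draw converts directly to heater-equivalent output.

WATTS_TO_BTU_PER_HOUR = 3.412  # 1 W of continuous draw = 3.412 BTU/hr

for watts in (300, 600):
    btu = watts * WATTS_TO_BTU_PER_HOUR
    print(f"{watts}W of GPUs is a {btu:.0f} BTU/hr heater")
```

600W of GPUs is literally a small space heater running whenever you game, which is why those forum members could measure the difference on a thermometer.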