Not sure if serious. Performance/$ has always been either the most important or one of the most important metrics for GPU buyers, and this isn't the first generation it's been talked about. If it didn't matter, we'd all be buying $1-3K Titan setups. What would even be the point of researching GPUs? We'd all just roll over to the next fastest card out.
Regarding performance/watt, no one seriously cares that a GTX680 draws 40W less in a system that's already pulling nearly 400W, when the GTX680 costs $450 and 1GHz 7970s are going for $380 with free AAA games. Let me know what the break-even point on electricity costs is for that 680 after paying an $80 premium.
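That break-even point is easy to sketch. The $80 premium and 40W saving are from the post above; the electricity rate ($0.12/kWh, roughly a US average at the time) and the 4 hours of gaming per day are assumed figures for illustration:

```python
# Break-even time for a GTX680's $80 price premium, given a ~40W
# power saving under load. Electricity rate and daily usage are
# assumptions, not figures from the post.

PRICE_PREMIUM_USD = 80.0        # $450 GTX680 vs ~$380 1GHz HD7970, per the post
POWER_SAVING_KW = 0.040         # 40W less at the wall
ELECTRICITY_USD_PER_KWH = 0.12  # assumed rate
GAMING_HOURS_PER_DAY = 4        # assumed usage

savings_per_hour = POWER_SAVING_KW * ELECTRICITY_USD_PER_KWH
break_even_hours = PRICE_PREMIUM_USD / savings_per_hour
break_even_years = break_even_hours / (GAMING_HOURS_PER_DAY * 365)

print(f"Break-even after {break_even_hours:,.0f} hours of gaming "
      f"(~{break_even_years:.1f} years at {GAMING_HOURS_PER_DAY} h/day)")
# -> Break-even after 16,667 hours of gaming (~11.4 years at 4 h/day)
```

Under those assumptions the premium never pays for itself within any realistic upgrade cycle, which is the whole point.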
And honestly, if you care that much about heat and power costs, as you keep implying by referring to the HD7970 as a "Fermi" (even though a 1180MHz HD7970 uses just 10-20W more power than a stock GTX580 while being 60% faster), why are you running GTX470s @ 960MHz in Tri-SLI? It's interesting how people buying $500 GPUs suddenly started to care about 40-50W of extra power, yet bought Fermis over the HD5000/6000 series. Performance/watt is a nice metric, but I'd happily take a 500W card if it were 2x faster than the Titan for $500, and under no circumstances can performance/watt justify the GTX680's price premium. An HD7950 OC for $280-290 will match or even beat a GTX680 while using just about 225W of power doing it. Try doing a break-even analysis on power consumption costs with that!