I will not even concede that the GTX 460 uses more watts at idle vs. the 6850 or 6870.
There is going to be some variance between cards and test methods that also helps make this minutiae, cost-of-living argument moot.
At what point do you stop enjoying your hobby? What about the satisfaction of actually having the hardware to game, and not being paralyzed by the fear of paying your electric bill, or maybe even buying, God forbid, a game?
I already anticipated your using these numbers; see my response earlier in this thread. And this thread is about GTX460 vs 6850, which seems to degenerate into a price/perf discussion a lot.
1. This thread is NOT about "is $17 per year important in the grand scheme of things?" It's more like: which card will give me better bang for the buck overall when overclocked, GTX460 or 6850?
2. Although I agree that $17/year isn't much in the grand scheme of things, that is a separate topic that belongs in another forum.
3. And if you follow the same logic, how about this question: "is the marginal increase in performance of the GTX460 important in the grand scheme of things?"
4. So, please be consistent in your reasoning.
5. For the nth time, my conclusion remains the same: it's hard to go wrong either way. If you can't agree with that, then let's agree to disagree.
Edit: I see that He Who Shall Not Be Named twisted what I said. I never said the 6850 is a better buy than the GTX460 overall; in fact, I said earlier in this thread that, all in all, the GTX460 has slightly better price/performance. Just for the record. For my own usage patterns the cost isn't as high as $17, because I am good about turning stuff off when not in use. But for some people, especially those living in higher-electricity-cost areas, the wattage difference can add up when the machine is left on 24/7. New York at almost 20 cents/kWh makes my own rate look good in comparison. Wow. To repeat, my electricity cost analysis is here:
http://forums.anandtech.com/showpost.php?p=30686718&postcount=77 The average across all states, without weighting by how many people live in each one, is 12 cents/kWh, not far from my hypothetical example using 13 cents/kWh. But the population-weighted average is probably higher, because most of the cheap-power states have small populations. And this is in a DOWN economy with relatively cheap fuel inputs (coal/gas). So I think 13 cents is a pretty fair estimate of what one kWh of electricity costs in the U.S. on average right now. Probably higher elsewhere, like in Europe or Africa, of course.
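For anyone who wants to plug in their own numbers, here is the back-of-the-envelope math as a quick Python sketch. The ~15 W figure is just an assumed round number for the average power difference with the machine left on 24/7 (it happens to land near the $17/year figure at 13 cents/kWh); substitute your own delta, hours, and rate:

    watts_delta = 15                    # assumed average power difference in watts (hypothetical)
    hours_per_year = 24 * 365           # machine left on 24/7
    rate_per_kwh = 0.13                 # USD per kWh, the average rate discussed above
    kwh_per_year = watts_delta * hours_per_year / 1000   # ~131 kWh per year
    annual_cost = kwh_per_year * rate_per_kwh            # ~$17 per year
    print(round(kwh_per_year), round(annual_cost, 2))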
And as for heat, that is a complex issue due to different coolers; plus, the heat must flow somewhere no matter how cool the GPU itself stays, so total wattage is what determines the delta in room temperature. Some coolers even blow that hot air right back into the case, where it warms the CPU. Ugh.