Yes, it can be important (e.g. HD6870 vs. GTX470/480). In this particular comparison, GTX460 vs. HD6850, it isn't. I just googled 4-5 websites and they all show nearly identical idle power consumption; if anything it slightly favours the 460 by 2-3W. You can see the links I provided above.
If it turns out that they're tied at idle (even systemwide, not just card), then the load wattage delta still makes the 6850 eat less energy, but it would be a far smaller amount. Instead of $50 extra over three years, it'd be more like $5 (at 13 cents/kWh and assuming a few hours of gaming per day on average).
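For anyone who wants to check the ~$5 figure themselves, here's a quick back-of-envelope sketch. The 15W load delta and 2.5 hours/day are my own assumed inputs for illustration, not numbers from any review; plug in your own:

```python
# Back-of-envelope electricity cost difference (assumed inputs, not measured).
WATT_DELTA = 15        # assumed load power difference in watts (6850 vs. 460)
HOURS_PER_DAY = 2.5    # assumed average gaming hours per day
RATE_PER_KWH = 0.13    # dollars per kWh, as quoted above
YEARS = 3

kwh_saved = WATT_DELTA * HOURS_PER_DAY * 365 * YEARS / 1000
cost_saved = kwh_saved * RATE_PER_KWH
print(f"~{kwh_saved:.0f} kWh saved, ~${cost_saved:.2f} over {YEARS} years")
```

With those inputs it comes out to roughly $5 over three years, which is why the idle tie matters so much less than the load delta would suggest.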
Also, idle system draw isn't the same as idle *card* draw:
HWC shows +14W (or +15W, depending on which card) system
Guru shows +13W system (can't access it right now, but look earlier in the thread for the link)
HardOCP shows +4W system
http://www.hardocp.com/article/2010/10/21/amd_radeon_hd_6870_6850_video_card_review/8
AT shows zero difference at idle:
http://www.anandtech.com/show/3987/...enewing-competition-in-the-midrange-market/20
On the other hand, you do have other sites saying system draw is lower on the GTX460:
http://www.pcper.com/article.php?aid=1022&type=expert&pid=13 (-5W)
I speculated as to why that might be the case already--see my previous post about this (e.g., maybe NV cards require higher CPU utilization or something; or different cards by different AIBs with different circuitry and cooling could impact things).
One additional X-factor that I forgot about until now can easily explain idle discrepancies too: according to reviewers like Anandtech, GTX460s aren't binned to a single fixed voltage! The voltage can therefore vary from card to card; NVIDIA just assigns each chip enough voltage to hit 675MHz with a safety cushion, and calls it a day:
http://www.anandtech.com/show/3809/nvidias-geforce-gtx-460-the-200-king/17
"As we’ve discussed in previous articles, with the Fermi family GPUs no longer are binned for operation at a single voltage, rather they’re assigned whatever level of voltage is required for them to operate at the desired clockspeeds. As a result any two otherwise identical cards can have a different core voltage, which muddies the situation some. All of our GTX 460 cards have an idle voltage of 0.875v, while their load voltage is listed below."
While every card in AT's small sample idled at 0.875v, I can imagine that not always being the case; some cards may need more voltage at idle. Perhaps HWC's GTX460 simply idled at a slightly higher voltage, and therefore a slightly higher wattage?
And because there are almost no stock 6850s, the voltages may vary from AIB to AIB in the 6850's case as well.
If we distrust HWC's wattage numbers, maybe we should distrust its performance results as well. And performance averages vary depending on which games are sampled. Food for thought.
Anyway, I no longer have much interest in this matter because it'd take a miracle for me to waver from the Sapphire 6850 at this point--Amazon started running out of stock last night, which got me to commit to buying one before they ran out entirely. I was going to wait for AT's roundup, but the time pressure made me cave. Sorry ASUS, but I wanted Sapphire's larger shroud (the better to funnel hot air OUT of my case rather than push it back into the case) and am willing to give up a PCB stiffener for it. The $7 premium for the ASUS doesn't help.

(For anyone else making the same ASUS vs. Sapphire comparison: judging by the Rage3D and Vortez reviews, among others, Sapphire and ASUS both used solid caps for their 6850s. They also have the same number of power phases and similar GPU temps, noise, etc. And the cooler designs are very similar, except that ASUS has a smaller shroud and direct-contact heatpipes.)
Of course if Amazon ships me a defective 6850, I may revisit this matter.
