Attic
Diamond Member
- Jan 9, 2010
- 4,282
- 2
- 76
If a review discusses power under the value aspect I'd be fine with that, but it definitely shouldn't be given the same review space as performance, with the possible exception of a "silent" or "green" focused site.
From the value, price/performance context:
US average price per kWh = $0.1297
Hours per week played by "core gamers" = 22
Annual cost per 100W of power for a "core gamer" PC = 22 hours/week * 52 weeks/yr * $0.1297/kWh ≈ $148/yr per kW, * 0.1 (100W) ≈ $15/yr
Total cost of 100W of power for a "core gamer" video card used for 3 years = ~$45
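The arithmetic above can be sketched in a few lines of Python (the rate and hours are the sourced figures from the post; the function name is just for illustration):

```python
# Sketch of the cost math: EIA average rate, "core gamer" play time.
RATE_USD_PER_KWH = 0.1297   # US average price per kWh
HOURS_PER_WEEK = 22         # "core gamer" hours played per week
WEEKS_PER_YEAR = 52

def annual_cost(watts):
    """Yearly electricity cost of `watts` of card power at those hours."""
    kwh_per_year = watts / 1000 * HOURS_PER_WEEK * WEEKS_PER_YEAR
    return kwh_per_year * RATE_USD_PER_KWH

print(round(annual_cost(100), 2))      # ~$14.84/yr per 100W
print(round(annual_cost(100) * 3))     # ~$45 over 3 years
```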
So, for example, if the Nvidia 970 saves an amazing 100W over the 780, and the rumor about their performance being roughly equal is correct, that's $45 in inherent value to consider when comparing pricing.
After crunching the math I'm just going to remember 50 cents per watt (over three years) when comparing value.
Out of curiosity I looked up Techpowerup's power chart from their latest review: http://www.techpowerup.com/reviews/Sapphire/R9_285_Dual-X_OC/23.html
Their Radeon 290 uses 64W more than the GeForce 780, so that's a negative value of $32.00 using that close-enough $0.50/W number. Excluding rebates, the cheapest cluster of 290s is ~$380 and the cheapest cluster of 780s is ~$470, so from a brand/feature-neutral perspective the 290 is offering $58 in value atm. If the theoretical GeForce 970, using 100W less than the 780, dropped into the market at $450, it would be close to value parity with the Radeon 290 at $380 ($7 better value).
So how close to 75W of card power does everyone think the GeForce 970 will get?
Sources:
"core gamers" 22 hours a week - http://bgr.com/2014/05/14/time-spent-playing-video-games/
US Average price per kWh - http://www.eia.gov/electricity/monthly/update/end_use.cfm#tabs_prices-3
Nice post. I expect real power draw while maintaining boost clocks to be 160-190W for the 970 after it has heated up to 68°C or higher.