Originally posted by: taltamir
I mean easily $30+ a year... and if you leave the computer on 24/7, then easily over $100 a year. A 50 W difference adds up to a LOT over time.
EDIT: I couldn't find exact figures for the 2900Pro
But:
http://www.anandtech.com/video/showdoc.aspx?i=3151&p=11
The 2900XT draws 78 watts more at idle and 123 watts more under load than the 3850.
So I will be GENEROUS and say the Pro is only 50 watts more in either case... and use the ABSOLUTELY LOWEST power cost in Texas of 12 cents per kWh (the average being 14+ cents, plus taxes).
@ 2 hours a day of gaming and 5 hours a day of general use:
7 hours/day x 50 W = 0.35 kWh/day
0.35 kWh/day x 365 days/year = 127.75 kWh/year
127.75 kWh/year x $0.12/kWh = $15.33/year
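If it helps, here is the same arithmetic as a tiny Python snippet (the variable names are mine; the 50 W, 7 hours, and $0.12/kWh figures are just the assumptions above, not measurements):
[code]
# Illustrative sketch of the arithmetic above (assumed numbers, not measurements).
watts_extra = 50        # assumed extra draw of the 2900 Pro vs. the 3850
hours_per_day = 7       # 2 h gaming + 5 h general use
price_per_kwh = 0.12    # cheapest quoted Texas rate, $/kWh

kwh_per_day = watts_extra / 1000 * hours_per_day     # 0.35 kWh/day
kwh_per_year = kwh_per_day * 365                     # 127.75 kWh/year
cost_per_year = kwh_per_year * price_per_kwh         # about $15.33/year
print(f"${cost_per_year:.2f} per year")
[/code]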
But it will likely take even more electricity than that. And I DOUBT you only use your computer for 2 hours of gaming and 5 hours of general use on an average day (yeah, I know you sometimes don't use it at all, but then you have the occasional 14 hours in a row of playing a specific game, right?).
Assuming you leave your computer on 24/7, though... well... a quick and dirty calculation would be to divide the result by 7 (the hours per day assumed above) and multiply by 24 (the hours per day assumed now)...
$52.56 per year.
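Same formula, just with 24 hours plugged in (again only a sanity check on my own assumed numbers):
[code]
# 24/7 version of the same assumed numbers.
watts_extra = 50
price_per_kwh = 0.12
cost_per_year = watts_extra / 1000 * 24 * 365 * price_per_kwh
print(f"${cost_per_year:.2f} per year")   # about $52.56
[/code]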
And again, I am being very generous... your electricity might not be the lowest priced, you probably game for more hours than that, and I would bet it draws closer to 100 watts more when gaming, not 50. All of which would make it much more expensive.
EDIT2: From what I have read... the 2900 Pro is an underclocked 2900 XT with some RAM differences... I would hazard an educated guess that it draws probably 10 watts less at both load and idle. That would mean a probable consumption increase over a 3850 of 68 watts at idle and 113 watts under load... Not pretty, and VERY close to the numbers I originally tossed around (I have calculated electricity cost differences for so many components recently that I have gotten experienced at it).
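For what it's worth, plugging those EDIT2 guesses into the same 2 h load / 5 h idle split gives a rough ballpark like this (guesses, not measurements):
[code]
# Ballpark with the guessed idle/load split instead of a flat 50 W.
idle_extra_w = 68      # guessed extra idle draw vs. a 3850
load_extra_w = 113     # guessed extra load draw vs. a 3850
idle_hours, load_hours = 5, 2
price_per_kwh = 0.12   # still the cheapest quoted Texas rate

kwh_per_day = (idle_extra_w * idle_hours + load_extra_w * load_hours) / 1000
cost_per_year = kwh_per_day * 365 * price_per_kwh
print(f"${cost_per_year:.2f} per year")   # roughly $25/year at that cheap rate, more at average rates
[/code]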