Power consumption? Really? Get the numbers here.

Blue Shift

Senior member
Feb 13, 2010
272
0
76
So, lots of people are complaining about the power consumption on the 480. I've seen some reasonable complaints, such as concern about PSU requirements, and some not-so-reasonable complaints, such as the following:

"The GTX 470/480 costs so much more money per year, because it draws so much more power!" Have you really run the numbers?

Here's a calculation of the TOTAL cost per year of running a 480:


...assuming 5000 hours (about 100 per week) idle time,
and 1000 hours (about 20 per week) load time per year...

http://www.anandtech.com/video/showdoc.aspx?i=3783&p=19
5870 idle: 164 watts
5870 load: 319 watts (crysis)
480 idle: 190 watts
480 load: 421 watts (crysis)


5870 total cost calculation:

164 watts idle
* 5000 idle hours/year
= 820 kilowatthours/year

319 watts load
* 1000 load hours/year
= 319 kilowatthours/year

820 + 319 = 1139 kilowatthours/year

* 11 cents/kilowatthour national energy cost average

= $125.29 per year


480 total cost calculation:

190 watts idle
* 5000 idle hours/year
= 950 kilowatthours/year

421 watts load
* 1000 load hours/year
= 421 kilowatthours/year

950 + 421 = 1371 kilowatthours/year

* 11 cents/kilowatthour national energy cost average

= $150.81 per year


Cost difference:

150.81 $/year (480) - 125.29 $/year (5870) =

$25.52 per year additional cost with a 480 over a 5870.



Actually, that's no joke.

Edit: Again, this is assuming about 20 hours/week load, plus 100/week idle for a total of around 120 hrs/week.
Source for power cost numbers: http://www.eia.doe.gov/cneaf/electricity/epm/table5_6_a.html
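For anyone who wants to plug in their own hours and rate, the arithmetic above fits in a few lines of Python (a minimal sketch; the wattages are AnandTech's total-system numbers quoted above and the rate is the 11-cent national average):

```python
# Annual electricity cost from total-system idle/load wattage.
IDLE_HOURS = 5000   # ~100 h/week of idle
LOAD_HOURS = 1000   # ~20 h/week of load
RATE = 0.11         # $/kWh (US national average)

def annual_cost(idle_w, load_w, rate=RATE):
    """Return dollars per year for a system with the given idle/load draw."""
    kwh = (idle_w * IDLE_HOURS + load_w * LOAD_HOURS) / 1000.0
    return kwh * rate

cost_5870 = annual_cost(164, 319)  # ~$125.29/year
cost_480 = annual_cost(190, 421)   # ~$150.81/year
print(round(cost_480 - cost_5870, 2))  # ~25.52
```

Swap in your own RATE (e.g. 0.30 for Hawaii) and the gap scales linearly, to roughly $70/year at 30 cents.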
 
Last edited:

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
And what is the cost of the extra heat affecting the rest of your PC and the very card itself?
 

br0wn

Senior member
Jun 22, 2000
572
0
0
If you live in Hawaii, it costs close to 30 cents per kilowatthour, so it's closer to an additional $70 per year for the electricity alone. Now factor in the bigger case, cooling, and PSU required. Not to mention the noise pollution. Ouch.
 
Last edited:

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
What about people with more than one screen? Two LCDs, or an LCD and an HDTV? The GTX 480 then draws a crazy ~280W at idle... It's enough that the extra screen is plugged into the card; it doesn't even need to be on.

That jumps your numbers for the "green" card to $75 (pun intended). When I got my electricity bill summary last year, when I still had the HD4870 sitting there idle most of the time (LCD + HDTV and music running when I'm home), my category jumped from "typical usage" to "way above average, with more people in the apartment". I had to cough up an extra $350 for going way over the expected usage, and my next year's upfront payment jumped dramatically too... let me tell you, it ain't fun.

The HD48xx series had the same thing... actually, it looks like all cards have it, no idea why. I BIOS-modded my HD4870 to run at 160/225 all the time (after I got the bills...) and didn't encounter a single glitch until I sold it. No idea why we're currently forced to run cards at higher clocks with more screens (stability reasons my ass). Even the HD58xx have the same behavior: their power usage jumps by ~50W.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
No idea why we're currently forced to run cards at higher clocks with more screens (stability reasons my ass). Even the HD58xx have the same behavior: their power usage jumps by ~50W.

The screen does flicker if the clocks go down too low on a multi-monitor setup. You can see it on the 58xx: if you OC the GPU, it has a bug where it will idle back down to 157MHz/300MHz instead of the prescribed 400MHz/1200MHz.

I assume the RAMDAC is tied to the core clock or something, and those are supposed to run at 400MHz. I never see the flicker on my DisplayPort monitor, which is clockless, so maybe it has to do with that.

But yeah, the GTX 480 seems to run at FULL clocks, idling up around 90C with a lot of power draw, just because it's ramping up the RAMDAC and memory to handle those monitors, lol.
 

1h4x4s3x

Senior member
Mar 5, 2010
287
0
76
OP
Electricity cost is the least of GF100's problems. Though in my case it's $0.22/kWh, so about $50 by your calculation.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,800
1,528
136
There's the catch-22. Some people argue that Fermi has more 'future potential', which may be true to an extent, but at the same time Fermi loses its value proposition against the HD 5870 the longer you intend to use it, though a lot of this depends on how much you pay for energy. In any case, if you want Fermi performance without the heat, there is probably a higher-clocked 2GB Radeon SKU in the works.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
There's the catch-22. Some people argue that Fermi has more 'future potential', which may be true to an extent, but at the same time Fermi loses its value proposition against the HD 5870 the longer you intend to use it, though a lot of this depends on how much you pay for energy. In any case, if you want Fermi performance without the heat, there is probably a higher-clocked 2GB Radeon SKU in the works.

The only thing with "future potential" is future hardware.
When the future arrives, sell your 5xxx-series AMD card and buy either a Radeon 6xxx or an nVidia GTX 5xx.

To be honest though, there is no reason for me to upgrade from my GTX260 as of yet...
 

GotNoRice

Senior member
Aug 14, 2000
329
5
81
lol, I have higher power consumption in games than GTX 480 SLI and didn't really notice any difference in my bill from back when I ran 4850 CrossFire. We're almost always below 100kWh per day.
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
The cost of electricity varies widely, AND the quoted rate doesn't always reflect what you actually pay. If you take a look at an electricity bill, there are all these fees added on top of the standard rate (at least that's how it is in Toronto).
 
Last edited:

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
OP: The problem with your calculations is that a lot of people leave their machines on 24/7. If you count 21 hours/week of load (I know you said 20, but just for round numbers), that leaves 7665 hours per year of idle, which is way more than 5000. That's another 69.29 kilowatt-hours per year in difference, or $7.62 at your 11 cents/kWh (which is horribly naive, as many places, such as Hawaii, are WAY more than that). That puts it at over $33 more per year.

Yeah, doesn't sound like a lot until you remember that the card itself is already ~$100 more than the 5870 (and that's not counting the new case and PSU that some will need). At its performance, a $100 premium is a hard enough sell. If you use the card for 3 years, that's ANOTHER $100 premium (Hawaiians are looking at ~$300 here) you're paying for just the card itself (again, not counting case/PSU). If you seriously think it's worth an extra $200 (extra $400 in Hawaii) to get ~10% more performance, well, you must have more money than you know what to do with.
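This 24/7 scenario can be rechecked the same way (a sketch under the assumptions above: 3 hours/day of load, the machine otherwise idling, with the OP's wattages and 11-cent rate):

```python
# 24/7 machine: 3 h/day of load (~21 h/week), idle the rest of the year.
HOURS_PER_YEAR = 24 * 365                  # 8760
LOAD_HOURS = 3 * 365                       # 1095
IDLE_HOURS = HOURS_PER_YEAR - LOAD_HOURS   # 7665
RATE = 0.11                                # $/kWh

def annual_cost(idle_w, load_w):
    return (idle_w * IDLE_HOURS + load_w * LOAD_HOURS) / 1000.0 * RATE

# The extra 2665 idle hours alone add 26 W * 2665 h ~= 69.3 kWh (~$7.62/year);
# with load hours also rising from 1000 to 1095, the total gap lands near $34/year.
diff = annual_cost(190, 421) - annual_cost(164, 319)
print(round(diff, 2))  # ~34.21
```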
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
OP: The problem with your calculations is that a lot of people leave their machines on 24/7. If you count 21 hours/week of load (I know you said 20, but just for round numbers), that leaves 7665 hours per year of idle, which is way more than 5000. That's another 69.29 kilowatt-hours per year in difference, or $7.62 at your 11 cents/kWh (which is horribly naive, as many places, such as Hawaii, are WAY more than that). That puts it at over $33 more per year. Yeah, doesn't sound like a lot until you remember that the card itself is already ~$100 more than the 5870 (and that's not counting the new case and PSU that some will need). At its performance, a $100 premium is a hard enough sell. If you use the card for 3 years, that's ANOTHER $100 premium (Hawaiians are looking at ~$300 here) you're paying for just the card itself (again, not counting case/PSU). If you seriously think it's worth an extra $200 (extra $400 in Hawaii) to get ~10% more performance, well, you must have more money than you know what to do with.

Well, the obvious answer is don't leave them on 24/7, heh. With how good sleep mode is in Vista/Win7, leaving your machine on is a complete waste.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
<3 Win7 sleep mode.

Also, 11 cents is pretty low, but here in Florida it's roughly 11.7, so that's not too far off the mark. Typically, most states are going to be in the high teens.
 

MoMeanMugs

Golden Member
Apr 29, 2001
1,663
2
81
How much more would trying to keep the room cool affect your electricity bill? The cooler you get the card, the more heat you're dumping out into the room. With my 4850 running a game during the summer, the room gets pretty warm; it gets uncomfortable after a while, although I must mention I am a hot body.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I thought the rule of thumb was that each watt costs $1/year if power consumption is 24/7.

EDIT: Oh, I see you are assuming people turn their computers off for the night with those calculations.
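That rule of thumb checks out almost exactly at a rate of about 11.4 cents/kWh (a quick sanity check; the thread's 11-cent figure gives roughly $0.96 instead):

```python
# One watt running 24/7 for a year:
kwh_per_watt_year = 24 * 365 / 1000.0   # 8.76 kWh
print(kwh_per_watt_year * 0.114)        # ~$1.00 per watt per year at 11.4 cents/kWh
print(kwh_per_watt_year * 0.11)         # ~$0.96 at the thread's 11-cent rate
```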
 
Last edited:

MadJackalIto

Junior Member
Mar 27, 2010
19
0
0
Wow, interesting estimates... In performance per price (factoring in power consumption), ATI is much better than nVidia.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Meh, chump change amortized over a year. The real problem is the noise required to dissipate that kind of power, which you have to deal with each and every day, like someone clawing a chalkboard. (You guys know what those are, right?)
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Would you figure it using CrossFire 5850s, which is ~50% faster for almost the same price?
[attached chart: pwrload.gif]
 

clok1966

Golden Member
Jul 6, 2004
1,395
13
76
I'm thinking if you are buying $500-600 video cards that are 10% faster and in 2-6 months will be 1/2 the speed of the next-gen cards, you really don't care about an extra $100 a year.
Bleeding edge is not cheap; never has been, never will be. It's really too bad the bleeding edge is so close to the current big dog. Looks like a wash to me: $350 or $550, with the more expensive card just barely beating the cheaper one.
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
I'm thinking if you are buying $500-600 video cards that are 10% faster and in 2-6 months will be 1/2 the speed of the next-gen cards, you really don't care about an extra $100 a year.
Bleeding edge is not cheap; never has been, never will be. It's really too bad the bleeding edge is so close to the current big dog. Looks like a wash to me: $350 or $550, with the more expensive card just barely beating the cheaper one.

I agree. Anyone who doesn't care though really shouldn't be having these discussions and should just go buy what they like. A lot of people seem to be trying really hard to justify why buying a Fermi is "worth it." Any time "worth it" is a concern, the person DOES care about the cost.

Generally, when people go out of their way to justify a purchase to someone else...

1) The item is not worth buying at that price.
2) Deep down, they are really trying to convince themselves that it's the right choice.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
How much more would trying to keep the room cool affect your electricity bill? The cooler you get the card, the more heat you're dumping out into the room. With my 4850 running a game during the summer, the room gets pretty warm; it gets uncomfortable after a while, although I must mention I am a hot body.

Heat dumped into the room is not affected by how well you cool the card. The card uses however many watts it uses, and they all end up in the room one way or another.