Cost of operation in electricity

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Since I have much experience doing such math, I was asked to compare the cost in electricity of the 4850, 4850 CF, and GTX 280 in this thread (EDIT: oops, he actually asked about the 4870, so I compared the wrong card, but the multi-GPU power wastage is still INSANE!):
http://forums.anandtech.com/me...id=31&threadid=2204464

I ended up spending a couple of hours on it, and ironically, I misread the original post: he wanted to know the difference between the 4870 and 4870 CF, not the 4850. While the 4870's delta is lower, the multi-GPU setup is still significantly more expensive to run. If anyone is willing to do the same math for the 4870, and maybe double all the GTX 280 values to get the cost of SLIing two GTX 280s, please post it here and I will add it to this post.

I figured it was such a tangent from that thread's original purpose that it deserved a thread of its own... so let me cut and paste:

Originally posted by: SteelSix
I keep looking at the 280's power advantages over ATI's winners though. Hell, I was going to CF 4870's until I saw that power graph. I couldn't believe it actually! Talt, you're an expert on power, what's your take when comparing the two in that aspect?

alright.

http://techreport.com/articles.x/14990/15

4850 CF system idle - 4850 system idle = idle wattage of one 4850 = 55 watts
4850 system idle - 4850 card idle = bare system idle wattage = 72 watts
GTX 280 system idle - 72 watts = 50 watts idle for the GTX 280.

Load wattages:
4850 CF system load - 4850 system load = load wattage of one 4850 = 128 watts
4850 system load - 4850 card load = base system wattage under load = 108 watts
Verification: 4850 CF system load - 108 watts = 256 watts, exactly twice that of a single 4850. The math checks out, meaning they measured correctly. (At some sites the math does not check out, which means something was wrong with their measurement methodology.)
GTX 280 system load - base system load = GTX 280 card-only load = 191 watts.

So:
4850: 55 watt idle / 128 watt load
4850 CF: 110 watt idle / 256 watt load
GTX 280: 50 watt idle / 191 watt load
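The subtraction method above can be sketched in a few lines of Python. Note that the whole-system wattages here are back-calculated from the deltas in this post, not quoted directly from the TechReport page:

```python
# Isolating card-only draw from whole-system measurements by subtraction.
# NOTE: these system wattages are reconstructed from the deltas above,
# not copied from the TechReport review itself.
idle = {"4850": 127, "4850 CF": 182, "GTX 280": 122}  # system watts, idle
load = {"4850": 236, "4850 CF": 364, "GTX 280": 299}  # system watts, load

# The second card in CF is the only difference between the two systems
card_idle = idle["4850 CF"] - idle["4850"]   # 55 W per 4850
card_load = load["4850 CF"] - load["4850"]   # 128 W per 4850

# Bare system = single-card system minus the card itself
base_idle = idle["4850"] - card_idle         # 72 W
base_load = load["4850"] - card_load         # 108 W

# Sanity check: CF system minus bare system should be exactly two cards
assert idle["4850 CF"] - base_idle == 2 * card_idle
assert load["4850 CF"] - base_load == 2 * card_load

gtx_idle = idle["GTX 280"] - base_idle       # 50 W
gtx_load = load["GTX 280"] - base_load       # 191 W
```

The sanity-check asserts are the "math checks out" step: if a site's CF number minus the bare system isn't exactly two cards, their measurement methodology was off.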

Three scenarios:
1. A person who leaves the system on 24/7/365, idling all the time, and plays games 2 hours a day on average (averaged over ALL days)
2. A person who uses the system 10 hours a day for 2D work and 2 hours a day for gaming (me), and turns it off the rest of the time.
3. A person who runs folding at home on both GPU and CPU 24/7/365


4850:
2 hours per day of load: 2 hours/day * 365 days/year * 128 watts * 0.001 kW/watt = 93.44 kWh/year
10 hours per day of idle: 10 hours/day * 365 days/year * 55 watts * 0.001 kW/watt = 200.75 kWh/year
22 hours per day of idle: 22 hours/day * 365 days/year * 55 watts * 0.001 kW/watt = 441.65 kWh/year
24 hours per day of load: 24 hours/day * 365 days/year * 128 watts * 0.001 kW/watt = 1121.28 kWh/year

Scenario 1: 2 hours of play + 22 hours of idle = 93.44 kWh/year + 441.65 kWh/year = 535.09 kWh/year
Scenario 2: 2 hours of play + 10 hours of idle = 93.44 kWh/year + 200.75 kWh/year = 294.19 kWh/year
Scenario 3: 1121.28 kWh/year

4850 CF:
Scenario 1: 1070.18 kWh/year
Scenario 2: 588.38 kWh/year
Scenario 3: 2242.56 kWh/year

280GTX:
2 hours per day of load: 2 hours/day * 365 days/year * 191 watts * 0.001 kW/watt = 139.43 kWh/year
10 hours per day of idle: 10 hours/day * 365 days/year * 50 watts * 0.001 kW/watt = 182.5 kWh/year
22 hours per day of idle: 22 hours/day * 365 days/year * 50 watts * 0.001 kW/watt = 401.5 kWh/year
24 hours per day of load: 24 hours/day * 365 days/year * 191 watts * 0.001 kW/watt = 1673.16 kWh/year

Scenario 1: 2 hours of play + 22 hours of idle = 139.43 kWh/year + 401.5 kWh/year = 540.93 kWh/year
Scenario 2: 2 hours of play + 10 hours of idle = 139.43 kWh/year + 182.5 kWh/year = 321.93 kWh/year
Scenario 3: 1673.16 kWh/year
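Anyone who wants to plug in a different usage pattern can reduce all of the above to one small formula; a quick sketch using the card-only wattages derived earlier:

```python
def annual_kwh(watts, hours_per_day):
    """kWh per year for a load drawing `watts` for `hours_per_day`, every day."""
    return watts * hours_per_day * 365 / 1000.0

# Card-only wattages derived above (watts)
cards = {"4850":    {"idle": 55,  "load": 128},
         "4850 CF": {"idle": 110, "load": 256},
         "GTX 280": {"idle": 50,  "load": 191}}

for name, w in cards.items():
    s1 = annual_kwh(w["load"], 2) + annual_kwh(w["idle"], 22)  # on 24/7, 2 h gaming
    s2 = annual_kwh(w["load"], 2) + annual_kwh(w["idle"], 10)  # 10 h 2D + 2 h gaming
    s3 = annual_kwh(w["load"], 24)                             # folding 24/7
    print(f"{name}: {s1:.2f} / {s2:.2f} / {s3:.2f} kWh/year")
```

Running it reproduces the three scenario figures above (e.g. 535.09 / 294.19 / 1121.28 for the single 4850); swap in your own hours to test other usage patterns.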


Electricity prices range from 6 cents per kWh to 24 cents per kWh. The average is supposedly 9. In Texas we pay 14 (after all the hidden charges and taxes, that is).
I will calculate for Texas since this is where I live.
I will start with scenarios 1 and 3, because they are rare; a common user will most likely be scenario 2, which is based on me in real life.

Scenario 1:
4850: 535.09 kWh/year * $0.14/kWh = $74.9126/year
4850 CF: $149.8252/year
GTX 280: 540.93 kWh/year * $0.14/kWh = $75.7302/year


Scenario 3:
4850: 1121.28 kWh/year * $0.14/kWh = $156.9792/year
4850 CF: $313.9584/year
GTX 280: 1673.16 kWh/year * $0.14/kWh = $234.2424/year

Scenario 2 @ 14 cents/kWh (this is me):
4850: 294.19 kWh/year * $0.14/kWh = $41.1866/year
4850 CF: $82.3732/year
GTX 280: 321.93 kWh/year * $0.14/kWh = $45.0702/year

Scenario 2 @ 7 cents/kWh:
4850: 294.19 kWh/year * $0.07/kWh = $20.5933/year
4850 CF: $41.1866/year
GTX 280: 321.93 kWh/year * $0.07/kWh = $22.5351/year

Scenario 2 @ 24 cents/kWh:
4850: 294.19 kWh/year * $0.24/kWh = $70.6056/year
4850 CF: $141.2112/year
GTX 280: 321.93 kWh/year * $0.24/kWh = $77.2632/year


Delta of scenario 2 @ 14 cents/kWh: the GTX 280 is $37.303 per year cheaper than the 4850 CF in direct electricity cost for running the cards themselves.
Delta of scenario 2 @ 7 cents/kWh: the GTX 280 is $18.6515 per year cheaper.
Delta of scenario 2 @ 24 cents/kWh: the GTX 280 is $63.948 per year cheaper.
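The deltas above can be recomputed for any local electricity rate with a couple of lines:

```python
def annual_cost(kwh_per_year, cents_per_kwh):
    """Dollars per year at a given rate in cents/kWh."""
    return kwh_per_year * cents_per_kwh / 100.0

# Scenario 2 annual energy use from above (kWh/year)
scenario2 = {"4850": 294.19, "4850 CF": 588.38, "GTX 280": 321.93}

for rate in (7, 14, 24):  # cents/kWh
    delta = (annual_cost(scenario2["4850 CF"], rate)
             - annual_cost(scenario2["GTX 280"], rate))
    print(f"@{rate} cents/kWh: GTX 280 is ${delta:.2f}/year cheaper than 4850 CF")
```

This matches the $18.65 / $37.30 / $63.95 deltas above; just drop in your own rate.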

This is under normal operation, not under something crazy like 24/7 operation.

The reason I said "in electricity cost for the card itself" is that the cards also generate heat equivalent to an electric heater of similar wattage. Generally speaking, cooling with an AC is extremely inefficient compared to electric heating; it should cost about 3 times the listed dollar amounts to cool the room. (In the winter, it decreases your heating cost by almost that amount, or less if you use something other than electricity to heat your room.)

Delta accounting for AC, for me: I cool for about 9 months of the year and heat for about 3 (Texas is hot as hell, and I have Russian blood). So I cool for 9/12 of the year, aka 0.75, and heat for 0.25 of the year. It costs 3 times as much to cool.
So the AC factor is 0.75 * 3 - 0.25 = 2.
I need to add 1 for the cost of running the card itself, on top of the AC cost increase and heating savings.
End result: the total cost for me is 3x the electricity cost of the card alone. It is probably a bit higher still, because electricity costs more in the summer than in the winter.

But really, no need to get that specific. Let's just take the delta and multiply it by 3.

So for me personally, the GTX would be cheaper than the 4850 CF to operate by at least:
37.303 * 3 = 111.909$/year in electricity for the card and the AC, minus the reduced cost of heating during the winter, and not accounting for the summer/winter price fluctuation of energy (which would make it slightly higher).
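The 3x multiplier works out like this (a sketch using the post's assumptions: AC costs roughly 3x as much per watt-hour removed as electric heating, with 9 months of cooling and 3 months of heating offset):

```python
cooling_fraction = 9 / 12   # fraction of the year running the AC
heating_fraction = 3 / 12   # fraction of the year the card's heat offsets heating
ac_cost_ratio = 3           # assumption: removing a watt-hour costs ~3x making one

# Net climate-control factor: pay 3x during cooled months, save 1x during heated ones
climate_factor = cooling_fraction * ac_cost_ratio - heating_fraction  # 2.0

# Plus 1x for the electricity the card itself draws
total_multiplier = climate_factor + 1                                 # 3.0

delta_card_only = 37.303    # $/year, scenario 2 delta @ 14 cents/kWh
print(delta_card_only * total_multiplier)  # ~111.909
```

Changing `cooling_fraction` or `ac_cost_ratio` for your own climate gives your own multiplier; someone with no AC at all would set the factor to 1 - heating_fraction.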


It will probably be higher still. Since I spend 12 hours a day next to the computer, I feel the heat more strongly; that means that during winter I am less likely to turn on the heating, and during summer I am more likely to turn the AC lower, decreasing the temperature in the entire apartment, because I am getting too hot sitting right next to the heat chucker.

I reduced my overall power bill by $50+ a month once I started using the computer in my underwear, with the doors to the computer room open wide (for air circulation with the rest of the house), and raising the house temp by 4 degrees. (During the winter I wear a coat and heavy clothes instead.)
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
so use it to game when you want to game, and in the meantime donate the otherwise unused cycles to something like F@H so we can find cures to cancer and such, then you won't be wasting your money ;)
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Kudos for all the calculation. This is just gonna be flamed by some people saying "electricity is cheap...don't game if you can't afford it". "Certain people" who said that before will now say the GTX is the clear choice, since power consumption paints the GTX in a better light than the 4850CF. And yet others (like me) will point out that from that TechReport review the 4850CF usually performs on par with or better than the GTX... so more performance = more power used, which is alright in my book. You just have to decide whether you want higher performance or better power consumption.

If you factor in the initial cost difference, which is about $200+ for a GTX 280 vs. 4850CF (going down now of course), it'll take a couple of years for the GTX to pay for itself. Considering that most people who would buy a high-end setup like that will upgrade sooner than 2+ years, the cost increase due to the higher power consumption isn't that big of a deal, so if I were going for that kind of setup I'd want the higher performance and lower initial cost.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: thilan29
Kudos for all the calculation. This is just gonna be flamed by some people saying "electricity is cheap...don't game if you can't afford it". "Certain people" who said that before will now say the GTX is the clear choice, since power consumption paints the GTX in a better light than the 4850CF. And yet others (like me) will point out that from that TechReport review the 4850CF usually performs on par with or better than the GTX... so more performance = more power used, which is alright in my book. You just have to decide whether you want higher performance or better power consumption.

If you factor in the initial cost difference, which is about $200+ for a GTX 280 vs. 4850CF (going down now of course), it'll take a couple of years for the GTX to pay for itself. Considering that most people who would buy a high-end setup like that will upgrade sooner than 2+ years, the cost increase due to the higher power consumption isn't that big of a deal.

All very valid and accurate points. I wanted to do the cost of the GTX 280 in SLI, the GTX 260, 260 SLI, and the 4870 / 4870 CF.
But it took so long just to do these calculations...

Generally speaking, though, two 4850s in CF are indeed more powerful than a single GTX 280, as you said, and they are still cheaper.
Although it all depends on the deals. Some people got them for $145 OTD at Best Buy, but the price is no longer that cheap; the GTX 280 is currently available for $500 with a $40 MIR.

Comparing the 4870 CF to a GTX 280, though... well, again, an odd comparison; while the 4870 takes more power than a 4850, it also delivers more.

There is also the increased cost of a CF/SLI mobo, a PSU, extra fans on the case, etc...

My main point is that power isn't cheap, it is expensive. And should be taken into account.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Originally posted by: taltamir
There is also the increased cost of a CF/SLI mobo, a PSU, extra fans on the case, etc...

Well, in terms of a PSU, nVidia and ATI both recommend a 550W PSU for a GTX 280 and 4850CF respectively. As for a motherboard, P45 boards are very cheap, but they are limited to x8/x8 vs. an X48 motherboard, which is x16/x16 (I myself would be a bit hesitant to use a P45 for CF, so point taken).

Originally posted by: taltamir
My main point is that power isn't cheap, it is expensive. And should be taken into account.
This I completely agree with which is why I hope ATI and nVidia both get their Power Play/Hybrid power tech working. I think we've discussed this in another thread but there will definitely be people who'll say electricity is so cheap that it shouldn't matter (as I've said before I think most people don't factor in all the costs associated with that electricity).
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Thanks much taltamir for doing this! It does help even though you calculated with 4850CF. Really appreciate you taking the time.

Any thoughts on what kind of power the 4870 X2 will need/use? A tougher question to answer and a guess at best; but it will be a factor I consider now that GTX 280 has come down in price. If it proves to be nearly as demanding as 4870CF, it *seems* this would be an important topic for many people.

Thanks!
 

Bill Kunert

Senior member
Oct 9, 1999
793
0
0
Originally posted by: taltamir

I reduced my overall power consumption by 50$+ a month ever since I started using the computer in my underwear and with the doors to the computer room open wide (for air circulation with the rest of the house) and increasing the house temp by 4 degrees. (during the winter i wear a coat and heavy clothes instead).

Do you have a ceiling fan in your computer room? We have 5 ceiling fans in our house and are able to reduce our air conditioning setting nearly 10 degrees for the same comfort level. It really helps in the den where my computer is. They use relatively little electricity.
We are fortunate to live in Kentucky and Kentucky Utilities is supposedly the least expensive utility in the country. After all taxes, etc. are added we pay about 7 cents per kwh.
 

tvdang7

Platinum Member
Jun 4, 2005
2,242
5
81
Wish we could do energy consumption tests with the 4850s with the new BIOS flash running 160/500 as opposed to 500/750. Think it would make a difference in power usage and heat?
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
The NV cards are also using less wattage at idle because their power-saving firmware in the BIOS is working, while the ATI parts' is not. It is my understanding ATI is working on that. In the meantime, you can use tools to UNDERCLOCK your ATI cards while not in a 3D app to bring the wattage costs and heat down to much more acceptable levels. That's what I think most gamers who buy these cards would do.
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
Originally posted by: taltamir

I reduced my overall power consumption by 50$+ a month ever since I started using the computer in my underwear and with the doors to the computer room open wide (for air circulation with the rest of the house) and increasing the house temp by 4 degrees. (during the winter i wear a coat and heavy clothes instead).

Good to know!
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
I'm still not seeing how even an extra 200$ a year is expensive and not cheap.

That's 1 hour of work per month at my crappy internship, and a half hour for someone with a decent wage.

I'm just curious what kind of people consider 10-15 bucks a month expensive.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: tvdang7
Wish we could do energy comsumption tests with the 4850's with the new bios flash running 160/500 as opposed to 500/750 . Think it would make a difference in power usage and heat?

Temps didn't drop much with that new BIOS; makes me wonder if it adjusts clocks only, with voltage staying the same. I read somewhere the PowerPlay feature still needs updating to adjust voltage..

 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
Originally posted by: Insomniator
I'm still not seeing how even an extra 200$ a year is expensive and not cheap.

Thats 1 hour of work per month at my crappy internship, and a half hour for someone with a decent wage.

I'm just curious what kind of people consider 10-15 bucks a month expensive.

Money is money. Little bits add up, whether it pertains to your own financial situation or whether or not it matters to you is your own issue. He is doing a good job of pointing out how the two setups differ and how much they will give in the long run. I am sure most people would love to have a "free" $200.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
How about the rest of the world, who may play 2 hours a night and then hit sleep mode or turn their computers off altogether? I fit into this market, and it appears my playing costs about 25 bucks a year, or 2 bucks a month.

/shrug
 

WelshBloke

Lifer
Jan 12, 2005
33,352
11,503
136
Originally posted by: BassBomb
Originally posted by: Insomniator
I'm still not seeing how even an extra 200$ a year is expensive and not cheap.

Thats 1 hour of work per month at my crappy internship, and a half hour for someone with a decent wage.

I'm just curious what kind of people consider 10-15 bucks a month expensive.

Money is money. Little bits add up, whether it pertains to your own financial situation or whether or not it matters to you is your own issue. He is doing a good job of pointing out how the two setups differ and how much they will give in the long run. I am sure most people would love to have a "free" $200.


Yeah but 'insanely more expensive' is a bit out there.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: BassBomb
Originally posted by: taltamir

I reduced my overall power consumption by 50$+ a month ever since I started using the computer in my underwear and with the doors to the computer room open wide (for air circulation with the rest of the house) and increasing the house temp by 4 degrees. (during the winter i wear a coat and heavy clothes instead).

Good to know!

What? Next thing you will tell me is that you shower with your clothes on.

Originally posted by: Bill Kunert
Originally posted by: taltamir

I reduced my overall power consumption by 50$+ a month ever since I started using the computer in my underwear and with the doors to the computer room open wide (for air circulation with the rest of the house) and increasing the house temp by 4 degrees. (during the winter i wear a coat and heavy clothes instead).

Do you have a ceiling fan in your computer room? We have 5 ceiling fans in our house and are able to reduce our air conditioning setting nearly 10 degrees for the same comfort level. It really helps in the den where my computer is. They use relatively little electricity.
We are fortunate to live in Kentucky and Kentucky Utilities is supposedly the least expensive utility in the country. After all taxes, etc. are added we pay about 7 cents per kwh.

I should probably install one.
And lucky you, 7 cents per kWh after taxes is indeed the cheapest in the USA (it's 6 before hidden charges, right?)
 

Elfear

Diamond Member
May 30, 2004
7,168
826
126
Great write-up OP. Nice of you to take a little time out of your day to crunch some numbers for us.

A couple points I question just a bit. First, like someone else pointed out, a lot of people use sleep mode with their rigs so they don't sit idle for hours on end. I doubt most people have the time to game for 2 hrs straight and then surf the net for 10 hrs every day for a year. I don't know what the typical usage rates are for your average ATer, but I would guess 1-2 hrs of gaming a day and 1-2 hrs of surfing the net (or insert any other activity here that doesn't use up too many CPU cycles). I doubt that happens every single day either.
I know you can't take into account all usage patterns by ATers, but I think what I pointed out may be closer to how most people use their computers. Of course I may be wrong here, and I'd like to hear other users' comments.

The other point I question is the extra cost of AC, strictly looking at the delta between the 4850CF and the GTX280. At load and at idle, the CF setup consumes ~60W more than the GTX. That's just like having an extra 60W bulb turned on in the room. I doubt this would require a big hike in AC costs. The room the user has his computer in may heat up a little more, but that wouldn't require that the whole house be cooled down, unless your thermostat is located in the computer room. As someone else pointed out, a fan helps a lot to cool just the one room.

Your estimations may take into account the absolute worst case scenarios but I think an extra $25-35 a year is probably more reasonable for the extra 60W of power that the 4850CF solution would consume. That is still a valid point on comparing graphics card solutions though and people should realize that costs aren't just limited to the initial purchase, especially with increasing energy prices.

 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Great comparison !
This is why I have two 'modes' for my work pc.
1 - not going to be doing anything hardcore, just browsing, email, some photoshop.
Entire system is underclocked, cpu and gpu. Just don't need the power for simple stuff.

2- going to render out 3d scenes. Overclocked cpu, voltages pushed up, gpu overclocked.


Luckily my bios lets me save profiles that I can switch between easily.
 

heyheybooboo

Diamond Member
Jun 29, 2007
6,278
0
0
mmm ...

Average Retail Price of Electricity: 8.74 cents per kilowatthour ($0.0874/kWh)
(My apologies to Alaska and Hawaii - you were dropped)


Scenario 2
(10 hours a day for 2d work - 2 hours a day gaming)
4850 CF: 588.38 kwh/year
GTX 280: 321.93 kwh/year

@

4850 CF: 110 watt idle / 256 watt load
GTX 280: 50 watt idle / 191 watt load

=

4850 in CF = $51.42
GTX 280 = $28.14
- - - - > > > $23.28

or

$1.94 / month

so ...

With the retail cost of $500 for the GTX 280 and $380 for the HD4850 CF, it will take you over 5.15 years @ $1.94/month to make up the difference in price (not, of course, accounting for the interest which would accrue over 5 years on $120, which would be around $19, or another 9 months or so)

Bottom Line Cost Over 2 Years
(Capital and Operational: 10 hours 2d work / 2 hours gaming per day)


HD4850 CF: $0.6614 / day

GTX 280: $0.7487 / day
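These payback and daily-cost figures can be checked with a quick script. The card prices ($500 GTX 280, $380 4850 CF) and the formula (capital plus two years of electricity, divided by 730 days) are taken from the post above:

```python
RATE = 0.0874                                  # $/kWh, US average used in the post

kwh   = {"4850 CF": 588.38, "GTX 280": 321.93} # scenario 2, kWh/year (from above)
price = {"4850 CF": 380.0, "GTX 280": 500.0}   # retail prices quoted in the post

elec = {k: v * RATE for k, v in kwh.items()}   # $/year in electricity
delta_elec = elec["4850 CF"] - elec["GTX 280"] # roughly $23.29/year

# Years for the GTX 280's lower power bill to cover its higher sticker price
payback_years = (price["GTX 280"] - price["4850 CF"]) / delta_elec  # ~5.15

# Two-year total cost of ownership, per day
for k in kwh:
    daily = (price[k] + 2 * elec[k]) / 730.0
    print(f"{k}: ${daily:.4f}/day")
```

This reproduces the $0.6614/day for the 4850 CF; the GTX 280 comes out around $0.76/day at a $500 card price, so the post's $0.7487 figure appears to assume a slightly lower price (roughly $490).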



 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
@heyheybooboo: that assumes you live somewhere paying only that much per kWh, and that you don't run the AC to cool your house.

@Elfear: I also use sleep mode; in sleep mode the video card is off, which is why I didn't count any watts for those times (notice the hours in each scenario)...
I gave three scenarios: one for F@H users, one for 24/7 users, and one for me personally. I don't know what "an average person" uses, so I couldn't really guess how many hours a day it would idle for an average user. I do think that 2 hours of play and 2 hours of idle is too low, though; realistically, almost everyone I know turns their computer on at some point during the day and turns it off at night before going to sleep. So even if they are watching TV or something, the computer is likely not off or in S3 mode. (I make sure to put mine to sleep, but not everyone does.)

So that is why I think 10 hours of idle is accurate. Timmy gets home from school and turns on the computer, plays some, watches some TV, goes to a friend's, etc... then turns the computer off and goes to sleep. Accurate? Who knows.
Feel free to recalculate at other usage points, such as 4 hours of gaming and 0 hours of idle, and various other options; make up your own scenarios.
Keep in mind that the average takes all days into account, so weekends and holidays, where people can spend 16 hours on the computer instead of say 8 on a normal day, change the figures.

You said $25-35/year... that is a very good estimate on your part; notice that this is about what I got for the card itself as well:
Delta of scenario 2 @ 14 cents/kWh: the GTX 280 is $37.303 per year cheaper than the 4850 CF in direct electricity cost for running the card itself.
I simply accounted for increased AC costs, which tripled that figure.

I want to ask you about the bulb example... you have seen an electric heater, right? It is coils of wire running around a ceramic tube inside a metal casing (so that you don't touch the coils); electricity goes through the metal and generates heat. What is a light bulb? A metal coil inside a vacuum inside some glass... very similar operation: electricity runs through and produces heat and light (heaters do produce light too, not as much as a bulb, but they do), and the light hits objects in the room and is converted back to heat. I think the reason a bulb produces so much light is that the filament is in a vacuum; otherwise it would just burn out. Instead it gets really hot, to the point where the electrons are excited enough to jump to higher energy levels, and then emit light as they fall back down. But it does not combust, due to not having any oxygen to react with.

Of course, this is not a light bulb, it is a GPU. I am not sure how it compares to a space heater in terms of electricity-to-heat conversion. I only know how INEFFICIENT AC is compared to electric heating.
Do we have someone here who does and cares to elaborate?
 

pmv

Lifer
May 30, 2008
15,142
10,043
136
Electricity prices are probably going to go up in the near future, also.
Personally I live in an underheated house in a not particularly warm country with expensive electricity, so no AC, and the computer serves in lieu of having the heating on during winter, as the heating is horribly expensive.

I suppose one could offset the cost of running the computer against the cost of having the TV on, as personally I find one tends to substitute for the other.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: pmv
Electricity prices are probably going to go up in the near future, also.
Personally I live in an underheated house in a not particularly warm country with expensive electricity, so no AC, and the computer serves in lieu of having the heating on during winter, as the heating is horribly expensive.

I suppose one could offset the cost of running the computer against the cost of having the TV on, as personally I find one tends to substitute for the other.

In Texas, the minimum price for electricity went up from 11 cents per kWh to 14 cents per kWh in the past year. There was a power company here that had contracts with people providing a set price of 11 cents per kWh (I was using it, actually), and it went out of business.

I am now paying 15.3 cents per kWh (I add another 1.3 cents per kWh to make it 100% renewable. Forget skimping and savings: zero pollution, and the more I waste, the more I subsidize the renewable industry... I still like savings, though, because I don't like tossing money away).
 

dadach

Senior member
Nov 27, 2005
204
0
76
Nice to know the calculation, but it shouldn't really be a factor at all in deciding which gfx card one should buy.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Well... this is more about comparing multi-GPU to single GPU. If you are going with multiple GPUs of a cheap card for "greater value", it might not be a value at all once you account for the higher operating costs, so it is important to keep that in mind. As such, I would multi-GPU a 4870, but not a 4850 or an 8800GT.