
How much is your power bill?

dorky82

Senior member
I didn't worry about my power bill because I live in military quarters. I have about 1 year left on my contract, so after it ends I'll be living off base and will have to pay my own utilities.

currently running:
Athlon II X635
3 HDs
1 DVD-RW
GTX 260s in SLI
3 x 21.5 inch monitors, 45 W per monitor

Right now my computer and monitors are on 24/7.
How much of a power bill should I expect if I play games 4-5 hours per day?

I might have to sell everything and get a gaming laptop if that would help.


Moved to General Hardware.

Super Moderator BFG10K.
 
Last edited by a moderator:
Not sure on idle/load wattage, but if it uses an average of 0.2 kW, which is probably high for an average, you are looking at about $17/month just for the comp.
0.2 kW x 24 h x 30 days x $0.12/kWh ≈ $17
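That back-of-the-envelope estimate as a quick sketch (the 0.2 kW constant draw and $0.12/kWh rate are the assumed numbers from the post; the product works out to $17.28):

```python
# Back-of-the-envelope monthly cost for a PC drawing a constant load.
# 0.2 kW average draw and $0.12/kWh are the post's assumed numbers.

def monthly_cost(avg_kw, rate_per_kwh, hours_per_day=24.0, days=30):
    """kWh used in a month times the price per kWh."""
    return avg_kw * hours_per_day * days * rate_per_kwh

print(round(monthly_cost(0.2, 0.12), 2))  # prints 17.28
```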
 
Just checked my last bill.

First 561 kWh: $0.111644/kWh
Above 561 kWh: $0.125733/kWh

That's with delivery charges and other fees.

This is in the Detroit metro area.
 
Wow, only $15 or so for the PC?
So I'm guessing around $25 with the monitors.
I thought it was going to be around $100-150.
I haven't paid a utility bill in 5 years.
 
Where are you going to be living? I've been to some countries where electricity has been very expensive and some where it is relatively 'free'. I assume you're in the USA, and relatively speaking electricity there is damn cheap. If you can afford 4-5 hours per day of sitting down and playing video games you should be able to afford 100 bucks a month in utilities (in general). If not, I'd restructure my day to earn a bit more and spend a bit less.
 
In the winter our entire electric bill is only around 100 bucks, so PC gaming must not have much of an impact in my case. And that's a 3-story house with 3 people and 3 computers. My computer alone is on 12-16 hours a day, with probably about 1-2 hours of gaming per day.
 
Mine's under $100 for a typical 3/2 subdivision home. I have no idea what share the PC is, but I doubt it's much. If I'm on the PC I'm not watching TV; it's going to be one thing or the other for home entertainment since I'm not much of a book reader. /shrug I do think the impact of a PC (and video cards in general) on your electric bill is way overblown.
 
My bill is around $600 in summer and about $180 in winter. That's for a 3/2 house in Florida.
600 bucks? Good lord, I would find a new place to live. Our rented house is horribly insulated and the AC runs about 16-20 hours out of the day, and our power bill is usually 250-275 bucks in the summer. In fact we have 2 central air main units for the house and even a window unit for the third floor.
 
I have a question that's kinda related to this and was going to make a new thread, but I hope someone here can answer it.


At sites like Tom's Hardware and AnandTech, when they do a power consumption test for graphics cards they give numbers such as 200 watts under active idle and 250 watts under full load.

Does this mean the graphics card consumes 200 watts per hour when idle and 250 watts per hour under full load?


My hydro bill is $120 a month for a family of 4.
 
Thanks for the interesting question, dorky, even though I'm not sure why it's in the Video forum... I guess because video cards are the big power hogs in high-end PCs? 🙂

My answer is ~$20/month on average, since most of my electricity usage is cooking/fridge/PC/a few CFL lamps, and I don't leave my stuff on 24/7 and rarely turn on the AC/heater.

Your answer depends on if your utility does tiering or not and if you're close to getting bumped up to the next tier.

For instance, PG&E has tiering in order to promote energy efficiency by rewarding low usage and punishing high usage. Everyone starts off using Tier 1 allotments of energy. The allotment size is calculated depending on where you live in the state and how large of a family you have and stuff like that. As long as you stay within that allotment, you pay something like $0.12/kWh (including taxes). Any usage after your Tier 1 allotment bumps you to Tier 2, where you pay more, and after you use up your Tier 2 allotment, you pay even more, etc. until you use up your entire Tier 1, 2, 3, and 4 allotment, in which case you are at Tier 5 and get to pay something like $0.48/kWh.
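The tier mechanics described above can be sketched like this; the allotment sizes and the Tier 2-4 rates below are made-up placeholders (only the ~$0.12/kWh Tier 1 and ~$0.48/kWh Tier 5 figures come from the post):

```python
# Sketch of tiered residential billing as described above. Allotments
# and middle-tier rates are hypothetical illustrations, not real PG&E
# tariff numbers.

TIERS = [
    (300, 0.12),            # first 300 kWh at Tier 1 (hypothetical)
    (100, 0.14),            # next 100 kWh at Tier 2
    (100, 0.18),            # next 100 kWh at Tier 3
    (100, 0.30),            # next 100 kWh at Tier 4
    (float("inf"), 0.48),   # everything beyond that at Tier 5
]

def tiered_bill(kwh):
    """Walk down the tiers, billing each slice at its own rate."""
    total = 0.0
    for allotment, rate in TIERS:
        slice_kwh = min(kwh, allotment)
        total += slice_kwh * rate
        kwh -= slice_kwh
        if kwh <= 0:
            break
    return total

print(round(tiered_bill(250), 2))  # stays inside Tier 1
print(round(tiered_bill(450), 2))  # spills into Tiers 2 and 3
```

The point of the structure is visible in the two calls: the second household uses less than twice the energy of the first but pays nearly twice as much.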

In addition to the cost of power (and the taxes) above, PG&E charges flat rate connection fees that you would have to pay even if you used zero energy that month. But it's not a big amount.

As a rule of thumb, then, for PG&E customers that stay within Tier 1, every watt at idle costs ~$1.05 per year to run (at-the-wall watts; so more like ~$1.31 for 80+). So if a system draws 100 watts idle, expect to pay $105/year for it. Keep in mind this assumes 24/7 usage, and only idle load, not anything else.

If you want a finer-grained analysis:

1. Take the idle power of your system in watts, divide by 1000, and multiply by 19.5. This is your idle load each day in kWh.

2. Take the full-load power of your system in watts, divide by 1000, and multiply by 4.5. This is your gaming load each day in kWh.

3. Add the two numbers together. This is your total kWh usage each day.

4. Multiply this number by the cost per kWh you pay. (In my example with PG&E Tier 1 rates, it's $0.12/kWh.) This is your daily electrical cost for your PC.

5. You can multiply this by 365 if you want to figure out your annual bill.

NOTE: All calculations above assume at-the-wall watts; if you only know your PC component wattage and have an 80+ rated PSU, then just multiply by 1.25. If you have an 80+ Bronze PSU, multiply by 1.2 instead of 1.25. If you have 80+ Silver, multiply by 1.15. If you have 80+ Gold, multiply by 1.11. Why? Because PSUs are not 100% efficient, so your computer components may only be using, say, 100 watts, but an 80%-efficient PSU needs to pull 125 watts from the wall in order to feed your computer. (0.80 * 125 = 100)
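Putting the five steps and the PSU multiplier together, a minimal sketch (the 19.5 h / 4.5 h split and the $0.12/kWh Tier 1 rate are the assumptions above; the example wattages are made up):

```python
# Steps 1-5 above as one function. psu_factor converts component watts
# to at-the-wall watts (1.0 if you measured at the wall, 1.25 for a
# plain 80+ PSU, 1.2 Bronze, 1.15 Silver, 1.11 Gold, per the NOTE).

def daily_pc_cost(idle_w, load_w, rate=0.12,
                  idle_h=19.5, load_h=4.5, psu_factor=1.0):
    idle_kwh = idle_w * psu_factor / 1000 * idle_h   # step 1
    load_kwh = load_w * psu_factor / 1000 * load_h   # step 2
    total_kwh = idle_kwh + load_kwh                  # step 3
    return total_kwh * rate                          # step 4

# Hypothetical 100 W idle / 300 W gaming box behind a plain 80+ PSU:
per_day = daily_pc_cost(100, 300, psu_factor=1.25)
per_year = per_day * 365                             # step 5
```

With those made-up wattages the box comes out to roughly $0.50 a day, or about $180 a year.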

Yes I know this is an oversimplification because PSUs are more efficient at 50% load and less efficient at lower or higher loads. But most people probably spend most of their PC time idling (surfing the web, posting on forums, etc.) which is way the hell down there in terms of power draw, meaning likely only ~80% efficiency. And even at 50% load, most 80+ PSUs are no more than ~85% efficient. It gets better as you go from bronze to silver to gold: http://en.wikipedia.org/wiki/80_PLUS

Highly efficient PSUs cost more, but pay for themselves over time for heavy users. Ditto with more-efficient CPUs, GPUs, etc. 🙂 If you are interested in seeing a list of 80+/bronze/silver/gold PSUs, try this link: http://www.80plus.org/manu/psu/psu_join.aspx

Frankly I am surprised there is not a sticky for this topic.

I didn't worry about my power bill because I live in military quarters. I have about 1 year left on my contract, so after it ends I'll be living off base and will have to pay my own utilities.

currently running:
Athlon II X635
3 HDs
1 DVD-RW
GTX 260s in SLI
3 x 21.5 inch monitors, 45 W per monitor

Right now my computer and monitors are on 24/7.
How much of a power bill should I expect if I play games 4-5 hours per day?

I might have to sell everything and get a gaming laptop if that would help.
 
Last edited:
OK, going by AnandTech's Bench, a single GTX 260 system uses 170 watts at idle (I forget what CPU they use, but it probably doesn't make much difference here). SLI is probably more like 200 W.
At load, a single GTX 260 system pulls 300 W, so SLI is probably in the range of 400 W (this part doesn't make a big difference).

So with 4 hours of gaming per day and idle the rest, that would be an average of 233 watts.
45 W * 3 monitors = 135 W.
368 W total.

At $0.11 per kWh, that's $0.04048 per hour, $0.97 per day, $29.15 per 30-day month, and $354.60 per year.

Of course if you were smart you'd set your monitors to turn off after 15 minutes and save yourself $100 a year. Setting your computer to go into standby would save another ~$150.
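Those numbers can be checked with a short script (the ~200 W idle / ~400 W load SLI figures and $0.11/kWh rate are the estimates above):

```python
# Re-running the estimate above: GTX 260 SLI box (~200 W idle, ~400 W
# gaming, 4 h of gaming a day) plus three 45 W monitors at $0.11/kWh.

RATE = 0.11  # dollars per kWh

def avg_watts(idle_w, load_w, load_h, day_h=24):
    """Time-weighted average draw over a full day."""
    return (idle_w * (day_h - load_h) + load_w * load_h) / day_h

pc = avg_watts(200, 400, 4)        # ~233 W for the PC itself
total = pc + 3 * 45                # ~368 W with the monitors

per_day = total / 1000 * 24 * RATE
per_month = per_day * 30
per_year = per_day * 365

# Savings if the monitors sleep outside the 4 gaming hours:
monitor_savings = 3 * 45 / 1000 * (24 - 4) * RATE * 365
```

This lands at roughly $0.97 a day and $29 a month, matching the figures above, and the monitor-sleep savings come out to about $108 a year, in line with the "$100 a year" estimate.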
 
AC is fine, but we have the old Miami roll-out windows, and that screws us royally during the summer.

I've got crank-out side windows too. With a 12,000 BTU box AC running, and nothing else but the refrigerator, my electric bill was $136. With my quad-9600GSO GPU F@H cruncher running 24/7, my overclocked E2140 dual-core at 2.8 GHz running 24/7, and my laptop and HTPC on 24/7, my electric bill was $180.
 
It depends on the climate where you live. Here in AZ, summer averages over 100 degrees F. Another big user is the pool equipment - runs a 3/4 HP motor 8 hours a day. Summer average is about $250. Winter, less than $100. In that context, the PC running 24/7 is really insignificant.

Further, in the A/C cooling season, every degree runs about $5 a month. I keep my summer temp at a constant 80 F. When I go on a trip, I make it 85. Whatever the computers use is in the noise level. 🙂
 
AC is fine, but we have the old Miami roll-out windows, and that screws us royally during the summer.

Dude, wouldn't it be good to get those replaced then? You would save soooo much money, and window replacement is what, probably $200 a window max?
 