
How much electricity does a typical computer use?

TNTrulez

Assuming that the monitor is on w/ decent speakers.

How about when the monitor and speakers are off but the PC is still running, with the hard drive spinning but the CD-ROMs idle?

Answers in Watts please.
 
Originally posted by: Rahminator
Typical computer with monitor and speakers off - up to 300 watts/sec.

You mean monitor and speakers off, or on? The monitor is a much greater power hog than the PC itself. The PC only hits its peak power usage at boot.
 
Originally posted by: vi_edit
Originally posted by: Rahminator
Typical computer with monitor and speakers off - up to 300 watts/sec.

You mean monitor and speakers off, or on? The monitor is a much greater power hog than the PC itself. The PC only hits its peak power usage at boot.

I meant both monitor and speakers off. I know that the PC hits its peak power usage at boot; that's why I said up to 300 watts. You can't be more precise when such a general question is asked.
 
what do you mean watts/second? Watt/second is something typically used for strobe/flash photography lighting.
 
Originally posted by: Lucky
what do you mean watts/second? Watt/second is something typically used for strobe/flash photography lighting.

Watt is the SI unit of power (joule/sec, or in other words, the rate of doing work) that's used anywhere from mechanics (physics) to electric stuff. For example, I can tell you how many watts (or kW, or hp; 1 hp is 746 watts, I think) is required to push a 20 kg crate across the floor in 10 secs, or how many watts is required to run an electric heater that takes 50 V across 8 ohms of resistance.

Or so I learned in my physics class.
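The heater example in the post above can actually be worked out with the standard resistive-load formula P = V²/R (the 746 W/hp figure is the usual mechanical-horsepower conversion; strictly it is 745.7 W). A quick sketch of the arithmetic, using only the numbers quoted in the post:

```python
# Power dissipated by a resistive load: P = V^2 / R
voltage = 50.0     # volts, from the heater example above
resistance = 8.0   # ohms, from the heater example above

power_watts = voltage ** 2 / resistance
print(power_watts)  # 312.5 W

# Horsepower conversion (1 mechanical hp is defined as 745.7 W,
# commonly rounded to 746 W as in the post).
HP_IN_WATTS = 745.7
print(round(power_watts / HP_IN_WATTS, 3))  # about 0.419 hp
```

So that 50 V / 8 ohm heater draws a little over 300 W, or roughly four tenths of a horsepower.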
 
Originally posted by: Rahminator
Originally posted by: Lucky
what do you mean watts/second? Watt/second is something typically used for strobe/flash photography lighting.

Watt is the SI unit of power (joule/sec, or in other words, the rate of doing work) that's used anywhere from mechanics (physics) to electric stuff. For example, I can tell you how many watts (or kW, or hp; 1 hp is 746 watts, I think) is required to push a 20 kg crate across the floor in 10 secs, or how many watts is required to run an electric heater that takes 50 V across 8 ohms of resistance.

Or so I learned in my physics class.

Yeah, we know what a watt is, what we're wondering about is where in the hell you came up with the "per second" part.
 
Originally posted by: notfred
Originally posted by: Rahminator
Originally posted by: Lucky
what do you mean watts/second? Watt/second is something typically used for strobe/flash photography lighting.

Watt is the SI unit of power (joule/sec, or in other words, the rate of doing work) that's used anywhere from mechanics (physics) to electric stuff. For example, I can tell you how many watts (or kW, or hp; 1 hp is 746 watts, I think) is required to push a 20 kg crate across the floor in 10 secs, or how many watts is required to run an electric heater that takes 50 V across 8 ohms of resistance.

Or so I learned in my physics class.

Yeah, we know what a watt is, what we're wondering about is where in the hell you came up with the "per second" part.

Don't be so anal. Technically, it's not supposed to be per sec, but the original poster clearly didn't make the connection between his PSU wattage and how much power it "guzzles" in a set amount of time (a second, for example). Sometimes you must forego technicalities and strive for clarity.
 
Well, seeing as no one actually gave the original poster the answer he was looking for, I guess I will. The average computer uses about 100-125W. The average monitor (CRT) uses about 75-100W. I was kinda surprised, but every monitor I've tried has been about in that range. Size didn't matter (I tried 15"-19"). Together they use about 200-225W.

I don't know about speakers but I don't think they usually use that much (I could be wrong). I just looked and my speakers transformer is rated for 15W input. Hope this helps.
 
Back on topic and on the right track... it depends on the hardware in your computer and whether or not it's under load.
For example, a 500MHz Celeron only uses ~10W (per hour) under load, while an OC'ed P4 at 3.0+GHz could use as much as 100W per hour under load. While idle, both CPUs would draw much less power. Drives usually use about 10W each (at max usage), monitors (depending on size and model) about 100W, vid cards anywhere from 4W (GF2MX) to 24W (GF4Ti). Sound cards and NICs only use about 4-5W. Speakers vary a lot, depending on rated output. Then of course there are the fans.
Basically, a computer at idle with monitor and speakers off should use less than 100W per hour. Usually quite a bit less, but of course this once again varies a lot depending on the computer. With monitor and speakers on, about 200 or so.
And then a high-end system in the middle of full-tilt online gaming could be using more than 500W (and that's not counting external cable/DSL modems, routers, etc).
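The per-component figures in the post above lend themselves to a simple additive power budget. A rough sketch, using only the ballpark numbers quoted in this thread (these are estimates of that era's hardware, not measurements):

```python
# Rough desktop power budget, summed from the per-component
# ballpark figures quoted in this thread (estimates, not measurements).
components_watts = {
    "cpu_under_load": 100,  # high-end P4-class CPU
    "hard_drive": 10,       # at max usage
    "optical_drive": 10,
    "video_card": 24,       # GF4 Ti class
    "sound_card": 5,
    "nic": 4,
    "fans": 10,             # a few case fans combined
}

system_watts = sum(components_watts.values())
print(system_watts)  # 163

monitor_watts = 100  # typical CRT, per the figures above
print(system_watts + monitor_watts)  # 263
```

Which lands in the same 200-300 W neighborhood the thread keeps circling around for a loaded box plus CRT.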
 
Originally posted by: notfred
What's with all this wattage per hour crap? Didn't we just agree that it's incorrect?

I'm assuming he's trying to figure out his power bill (or how a new system would impact his power bill).
Electric companies bill in KW/h
 
One watt-hour is not 1 watt per hour. It's the usage of one watt for a time period of one hour.

Analogy:

My car makes 276 horsepower. It does not make 276 horsepower per hour.

If I run it at peak output for 1 hour, then yes, that would be 276 horsepower-hours, but it's not horsepower per hour. If you're going to say "horsepower per hour," then my car makes "4.6 horsepower/minute."

It just doesn't work that way.
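The distinction in the post above (power is a rate; energy is power times time) is easy to show with numbers. A minimal sketch, assuming a steady 100 W load as an example figure:

```python
# Energy = power x time. Watts are already a rate (joules per second),
# so "watts per hour" is meaningless for a steady load; watt-HOURS are
# what accumulates over time.
power_watts = 100.0  # assumed steady draw, e.g. an idle-ish PC
hours = 5.0

energy_wh = power_watts * hours
print(energy_wh)         # 500.0 watt-hours
print(energy_wh / 1000)  # 0.5 kWh
```

Whether the load runs for a minute or a day, it is still a 100 W load; only the accumulated watt-hours change.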
 
Take the rating of your PSU; that's about how much your computer alone uses continuously. How much your speakers take up depends on the make/model/volume/# of speakers/etc. How much your monitor takes up depends on your make/model/size/resolution. I don't think it's worth figuring it all out 😉

e.g. running your computer for about 3 hours (300-350W PSU) uses about 1 kWh
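Turning that into a power-bill estimate is one multiplication away. A back-of-the-envelope sketch; the 300 W average draw, 8 hours/day, and $0.10/kWh rate are all placeholder assumptions, not figures from the thread:

```python
# Hypothetical electricity-cost estimate. All three inputs are
# placeholder assumptions; substitute your own numbers.
avg_draw_watts = 300.0  # assumed average system draw
hours_per_day = 8.0     # assumed daily usage
rate_per_kwh = 0.10     # assumed utility rate, $/kWh

kwh_per_day = avg_draw_watts * hours_per_day / 1000
monthly_cost = kwh_per_day * 30 * rate_per_kwh
print(round(kwh_per_day, 2))   # 2.4 kWh/day
print(round(monthly_cost, 2))  # 7.2 dollars/month
```

Note the bill is computed in kilowatt-hours (energy), not "kW/h," which is the slip corrected a few posts down.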
 
where's jerboy when you need him? 😀

All I know is that whoever was talking in watt/seconds is incorrect... 😉
 
Originally posted by: PipBoy
Originally posted by: PSYWVic
Electric companies bill in KW/h

No, electric companies bill in kilowatt hours (kWh). You pay mostly for total usage, not rate of usage.

D'oh!

edit: bah, let this be a warning to you all to not PWI 😉
 
MWink,

The reason all the monitors used about the same power (roughly 300 watts) is that the power supply in all CRT-based tubes is very similar: it has to step the voltage up to around 25,000 volts. Most computers under 1 GHz use about 100 watts, while the machines running around 2 GHz are now up around 150 watts. The reason you need the 300-watt or better power supply is the inrush current needed when first firing up the computer.
 