What is the measurement of wattage?

BehindEnemyLines

Senior member
Jul 24, 2000
979
0
76
Since I don't know anything about electricity, I hope you all can share some thoughts. I'm just wondering how much power (in watts?) I could save if the devices below were completely disconnected from the outlets for 12 hours per day.

I have a computer system that uses the following and is mostly off for 12 hours a day.
-Monitor uses 8 watts when off
-Printer uses 2 watts when off
-Speakers adapter: Input - 110VAC, 60Hz, 24W; Output - 9VDC, 1000mA
-External modem adapter: Input - 120VAC, 60Hz, 18W; Output - 9VDC, 600mA
-PC uses 8 watts when off and an unknown amount in standby.

-TV uses 1 watt when off
-DVD player uses 5 watts when off

I know that electricity is measured in watts or kilowatts, but what is the unit of measure over time? Like miles per hour. Say a monitor is rated at 8 watts when off; how many watts has it "used" in, like, an hour?
 

InverseOfNeo

Diamond Member
Nov 17, 2000
3,719
0
0
Watts are amps multiplied by volts, or I*V. What the electric company charges you for is watt-hours... or rather kilowatt-hours (1 kWh = 1,000 watt-hours).
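To answer the original question directly, here's a toy sketch of the idea (using the monitor numbers from the first post):

```python
# Energy = power * time. A monitor drawing 8 W while "off"
# uses 8 watt-hours over one hour.
power_w = 8
hours = 1
energy_wh = power_w * hours
print(energy_wh)         # 8 Wh
print(energy_wh / 1000)  # 0.008 kWh, the unit on your electric bill
```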
 

Stojakapimp

Platinum Member
Jun 28, 2002
2,184
0
0
Watts are the amount of electricity going through a point at a certain time.
At least I think that's what I learned in Electrical Engineering.

Oops, I was thinking of current.
 

Colt45

Lifer
Apr 18, 2001
19,720
1
0
Usually kWh.

Small stuff would be in watt-hours, though...

so...

Energy (Wh) = (V * I) * T, with T in hours; divide by 1,000 for kWh


10V @ 10A for 1 hour = 100 Wh
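In code form, the same arithmetic (just a sketch of the formula above):

```python
def energy_wh(volts, amps, hours):
    """Energy in watt-hours: power (V * I) times time in hours."""
    return volts * amps * hours

print(energy_wh(10, 10, 1))         # 100 Wh
print(energy_wh(10, 10, 1) / 1000)  # 0.1 kWh
```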
 

PrincessGuard

Golden Member
Feb 5, 2001
1,435
0
0
Current is measured in amps (e.g. 600mA).

Watt and volt-amp are both units of power (energy per unit time), and in purely resistive circuits they are exactly the same. They differ when the voltage and current are out of phase, but you probably don't care :)
 

PeeluckyDuckee

Diamond Member
Feb 21, 2001
4,464
0
0
I'm confused as to how a PS's wattage rating relates to the cost of keeping the PS going.

Would the cost of running a 500W PS be more than that of a 350W PS, given that they're both powering the same system with identical power requirements? Or does it run at its peak only when required by the system, sort of like SpeedStep in Intel's mobile CPUs?

 

rgwalt

Diamond Member
Apr 22, 2000
7,393
0
0
OK, you are charged for power by the kilowatt-hour. A kilowatt-hour (kWh) is a unit of energy (not power... two different things). You find the number of kilowatt-hours used by figuring out how much power a device draws in kilowatts, then multiplying by the number of hours it draws that power. So a device that draws 1000 W continuously uses one kilowatt-hour of energy every hour.

It looks like all those devices use a total of 66 watts, i.e. 66 watt-hours per hour, or 0.066 kWh per hour. If these were unplugged completely for 12 hours a day for one month, you would save 23.76 kWh per month. Now, my power company charges me 7.2 cents per kWh used. This varies from area to area, of course. Let's say your company charges you 10 cents per kWh. If that were the case, you would save about $2.38 per month by completely unplugging your devices. In other words, you won't gain much at all by unplugging your stuff.
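Here's that math as a quick Python sketch (the 10 cents/kWh rate is just an assumed example; check your own bill):

```python
# Nameplate standby draws from the original post, in watts:
# monitor, printer, speaker adapter, modem adapter, PC, TV, DVD.
standby_watts = [8, 2, 24, 18, 8, 1, 5]

total_w = sum(standby_watts)              # 66 W
kwh_per_month = total_w / 1000 * 12 * 30  # 12 h/day, 30 days/month
cost_per_month = kwh_per_month * 0.10     # assumed $0.10 per kWh
print(total_w, kwh_per_month, round(cost_per_month, 2))  # 66 23.76 2.38
```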

What you CAN do to lower your power bills is to be sure that you don't keep your place colder than necessary in the summer, and make sure your fridge isn't colder than it needs to be. Be sure to dust the cooling coils on your fridge regularly. Your biggest consumers of electricity in your house are devices that use compressors.

Ryan
 

rgwalt

Diamond Member
Apr 22, 2000
7,393
0
0
Originally posted by: PeeluckyDuckee
I'm confused as to how a PS' wattage rating relates to cost of keeping the PS going.

Would the cost of running a 500w PS be more than that of a 350w PS, given they're both powering the same system with identical power requirements? Or does it run at its peak only when required by the system, sort of working like SpeedStep in Intel's mobile CPUs?

Think about it like this... if they are both powering the same components, they should consume about the same amount of power in a given amount of time. If the 500W were using 150W more than the smaller PS at all times, that energy can't just disappear. It would have to either go to running the system devices or be wasted as heat, and 150W (540 kJ per hour) is a lot of wasted heat.
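Sanity-checking that conversion:

```python
watts = 150
joules_per_hour = watts * 3600  # 1 W = 1 J/s, and 3600 s in an hour
print(joules_per_hour / 1000)   # 540.0 kJ per hour
```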

Does that make sense?

Ryan
 

PeeluckyDuckee

Diamond Member
Feb 21, 2001
4,464
0
0
OK, I think I get it. So that 150W gets expended one way or another, whether the system needs it or not? The difference between the 350W and the 500W PS would be output as heat?
 

CountZero

Golden Member
Jul 10, 2001
1,796
36
86
The 350W and 500W PS will be using the same amount of power; the only difference is that, if pushed, the 500W could still run while the 350W would probably burn out, blow a fuse, or something similar.

They will both be producing the same voltages, and with an identical setup the same current will be drawn, and since P = V * I the power will be equal.

Note: This is the ideal case; in fact, two different PS designs will draw different amounts of power from the outlet even with the same load.
 

rgwalt

Diamond Member
Apr 22, 2000
7,393
0
0
Originally posted by: PeeluckyDuckee
OK, I think I get it. So that 150W gets expended one way or another, whether the system needs it or not? The difference between the 350W and the 500W PS would be output as heat?

No, the system would consume the same amount of power no matter which PS you used.

Ryan
 

Evadman

Administrator Emeritus / Elite Member
Feb 18, 2001
30,990
5
81
Originally posted by: rgwalt
Originally posted by: PeeluckyDuckee
OK, I think I get it. So that 150W gets expended one way or another, whether the system needs it or not? The difference between the 350W and the 500W PS would be output as heat?

No, the system would consume the same amount of power no matter which PS you used.

Ryan

Depends on how you look at it.

Generally, higher-wattage PSUs have higher efficiency, so the 350 PSU will waste more power when it is running than the 500 one will. The components will still use the same power, but the current drawn from the wall will change.
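As a rough sketch of what that means (the efficiency figures here are made-up examples, not measured numbers):

```python
# Wall draw = DC power delivered to the components / PSU efficiency.
# Efficiencies below are hypothetical, just to illustrate the point.
dc_load_w = 200                          # what the components actually use

wall_350 = dc_load_w / 0.70              # a less efficient 350W unit
wall_500 = dc_load_w / 0.80              # a more efficient 500W unit
print(round(wall_350), round(wall_500))  # 286 250 -- watts from the outlet
```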

 

mithrandir2001

Diamond Member
May 1, 2001
6,545
1
0
Originally posted by: Evadman
Originally posted by: rgwalt
Originally posted by: PeeluckyDuckee
OK, I think I get it. So that 150W gets expended one way or another, whether the system needs it or not? The difference between the 350W and the 500W PS would be output as heat?

No, the system would consume the same amount of power no matter which PS you used.

Ryan

Depends on how you look at it.

Generally, higher-wattage PSUs have higher efficiency, so the 350 PSU will waste more power when it is running than the 500 one will. The components will still use the same power, but the current drawn from the wall will change.
So it doesn't hurt to get a larger PSU than necessary?