How Does Electricity Work in Equipment When Left On or Off?

Dari

Lifer
While it is obvious that the equipment uses more power when it's in use, what happens when it is off or dormant? Is it better to turn the item off completely (by unplugging it) to save on your electricity bill, or can you leave it plugged in? I've heard that the startup surge your electrical equipment draws is large compared to what it uses just being left on. Or is this assessment wrong? Which is better?
 
I don't know the technical answer to this, but I'm going to venture a guess that in the grand scheme of things (years down the road)... the difference in longevity/reliability/cost/etc. is very negligible.

In other words, it's not even something to worry over. That is, unless you turn the thing on and off 20 times a day.
 
Originally posted by: Dari
While it is obvious that the equipment uses more power when it's in use, what happens when it is off or dormant? Is it better to turn the item off completely (by unplugging it) to save on your electricity bill, or can you leave it plugged in? I've heard that the startup surge your electrical equipment draws is large compared to what it uses just being left on. Or is this assessment wrong? Which is better?

If it's truly off, it's not using any electricity. I've heard the "surge uses more electricity" reasoning for leaving things running rather than stopping and starting them. It's true that some things use more energy as they start (electric motors, fluorescent light ballasts), but it's not worth leaving them on. In the case of motors, the slower they're running, the more current they draw, because a slow-turning motor generates less back-EMF to oppose the supply - that's why when things such as circular saws get bogged down, the lights may dim or they may trip a breaker or blow a fuse.
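To put rough numbers on that: even if a motor briefly draws several times its running current at startup, the extra energy is tiny next to what it uses by staying on. A minimal sketch, assuming an illustrative 6x inrush lasting 2 seconds on a 500 W load (none of these figures are from the thread):

```python
# Rough comparison: extra energy from one startup "surge" vs. leaving the
# motor running. The inrush multiplier, duration, and wattage are assumptions.

RUNNING_WATTS = 500      # illustrative mid-sized motor/appliance
INRUSH_MULTIPLIER = 6    # startup current is often several times running current
INRUSH_SECONDS = 2       # the surge lasts only a moment

# Extra energy of one startup beyond normal running draw, in watt-hours
surge_wh = RUNNING_WATTS * (INRUSH_MULTIPLIER - 1) * INRUSH_SECONDS / 3600

# Energy of simply leaving it on for one hour, in watt-hours
hour_on_wh = RUNNING_WATTS * 1.0

print(f"One startup surge: {surge_wh:.2f} Wh extra")  # ~1.39 Wh
print(f"One hour left on:  {hour_on_wh:.0f} Wh")      # 500 Wh
```

Under those assumptions, one startup costs about as much energy as ten seconds of running, so switching off wins for anything but very rapid on/off cycling.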
 
Originally posted by: rh71
I don't know the technical answer to this, but I'm going to venture a guess that in the grand scheme of things (years down the road)... the difference in longevity/reliability/cost/etc. is very negligible.

In other words, it's not even something to worry over. That is, unless you turn the thing on and off 20 times a day.

I'm asking because some people do it to save money on their electricity bill. Are you saying it doesn't make a difference?
 
It's sort of an urban legend that it saves money on electricity bills. It doesn't. It doesn't use any electricity.
But, don't take my word for it.
If you wish to verify this, turn off absolutely everything in the house. If it can't be turned off (refrigerator), you'll have to unplug it. Look at the electric meter... it won't be turning. (If it is, there's something still on in the house.) Turn on just one light, and the meter will start turning.

If the meter continues to run, but you can't figure out what's making it run, then unplug all the things you can, and the meter will still be running at the same speed (unless you discover that it's a plug-in Glade air freshener or night light or something).
 
But the people who insist that it saves money are usually the type of people who are stuck in their ways - there's no reasoning with them. If you convince them to stop unplugging things this month, and January turns out to be a colder month so their furnace runs more often, they'll use the higher electric bill as proof that you were wrong.
 
The surge savings depend a lot on how often it's on and how long it's off. If you're turning something on and off like a chicken on PCP, then just leave it on. But if you use it and then have it off for 20 hours, just turn it all the way off.
 
All transformers draw some power whenever they're plugged in, even when the device is off. Your computer probably pulls 20 watts or so when plugged in even when off.
 
'Standby' power - power drawn by equipment which is not in use - is a big problem: it accounts for between 5 and 15% of a domestic electricity bill, and has a total power draw equal to the output of several large power stations.

With the increasing use of electronic control systems, more and more systems are using low-voltage control circuits, some of which remain switched on all the time. E.g. a video recorder, even when switched 'off' at the front panel, uses some power for the clock/timer, and some is wasted in the transformer. Same thing with your PC - on old AT PCs the power button disconnected the PSU from the mains, but on ATX the PSU, and some circuits of the motherboard, remain energised (a typical PC will use about 10 W when switched 'off').

Same thing with many stereo systems, cordless phones, battery chargers, microwave ovens, washing machines - almost anything that uses an electronic control system. My washing machine (which I thought was all mechanical - but I never looked inside) uses 10W when plugged in, even when switched off - it now gets switched off at the socket when not in use. The collection of wall warts under my computer desk (for printer, USB hub, modem, etc.) use 30W for themselves with the relevant peripherals switched off - start turning the printer, etc. on and things start increasing further.

Modern electronic switching PSUs are much more efficient than the old transformer based PSUs. E.g. the charger for one of my old Nokia phones would use 5W when plugged into the mains, and 8W when charging the phone - however, my new one uses <1 W (I can't measure it) when plugged in and 4W when charging. There are several groups that are lobbying for such PSUs to be standard with new equipment - it's possible that it may even become a legal requirement in some countries soon.
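To get a feel for what those standby draws cost over a year, here's a small sketch converting a constant wattage into kWh and dollars. The device wattages are the figures from the post above; the $0.13/kWh tariff is an assumption (it's the rate mentioned later in the thread), so substitute your own:

```python
# Convert a constant standby draw into energy and cost per year.
# The $0.13/kWh price is an assumed tariff; the wattages are from the post.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost(standby_watts, price_per_kwh=0.13):
    kwh_per_year = standby_watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year, kwh_per_year * price_per_kwh

for name, watts in [("PC switched 'off' (ATX)", 10),
                    ("washing machine", 10),
                    ("desk full of wall warts", 30)]:
    kwh, dollars = annual_cost(watts)
    print(f"{name}: {watts} W -> {kwh:.0f} kWh/yr -> ${dollars:.2f}/yr")
```

So a single 10 W 'vampire' runs on the order of $10 a year - small per device, but it adds up across a whole house.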
 
Originally posted by: Mark R
'Standby' power - power drawn by equipment which is not in use - is a big problem: it accounts for between 5 and 15% of a domestic electricity bill, and has a total power draw equal to the output of several large power stations.

With the increasing use of electronic control systems, more and more systems are using low-voltage control circuits, some of which remain switched on all the time. E.g. a video recorder, even when switched 'off' at the front panel, uses some power for the clock/timer, and some is wasted in the transformer. Same thing with your PC - on old AT PCs the power button disconnected the PSU from the mains, but on ATX the PSU, and some circuits of the motherboard, remain energised (a typical PC will use about 10 W when switched 'off').

Same thing with many stereo systems, cordless phones, battery chargers, microwave ovens, washing machines - almost anything that uses an electronic control system. My washing machine (which I thought was all mechanical - but I never looked inside) uses 10W when plugged in, even when switched off - it now gets switched off at the socket when not in use. The collection of wall warts under my computer desk (for printer, USB hub, modem, etc.) use 30W for themselves with the relevant peripherals switched off - start turning the printer, etc. on and things start increasing further.

Modern electronic switching PSUs are much more efficient than the old transformer based PSUs. E.g. the charger for one of my old Nokia phones would use 5W when plugged into the mains, and 8W when charging the phone - however, my new one uses <1 W (I can't measure it) when plugged in and 4W when charging. There are several groups that are lobbying for such PSUs to be standard with new equipment - it's possible that it may even become a legal requirement in some countries soon.

So I guess off is better when not in use for over 20 hours, right? Thanks for the info.
 
Bah, the amount of power used is negligible compared to when the unit is on. I wouldn't worry about it - when you pay 13 cents per kilowatt-hour, are you really concerned with a <10 W constant draw? Who wants to be bothered to unplug and plug things in all the time?

The food I eat that gives me the energy to plug and unplug things is more expensive than the electricity to keep it running on its low standby power. Have you seen the prices on a good ole rib eye steak lately? 😉
 
10 watts * maybe 20 devices in a house = 0.2 kilowatts * $0.13/kWh = $0.026/hour * 24 hours/day = ~$0.62/day * 30 days/month = $18.72 a month.
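That arithmetic checks out. Here it is as a small parametric sketch, so the device count and tariff can be swapped - the 20 devices at 10 W each and $0.13/kWh are the post's assumptions:

```python
# The estimate above as a function: a constant standby load -> monthly cost.

def monthly_cost(devices=20, watts_each=10.0, price_per_kwh=0.13, days=30):
    kw = devices * watts_each / 1000       # total standby load in kilowatts
    return kw * price_per_kwh * 24 * days  # cost per hour * hours in the month

print(f"${monthly_cost():.2f}/mo")            # $18.72
print(f"${monthly_cost(devices=10):.2f}/mo")  # $9.36 with half the devices
```

The result scales linearly with every parameter, so the estimate is only as good as the 10-W-per-device guess.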
 
The thing is, most modern appliances never turn off; they just go into standby when you think you're turning them off. Until Energy Star came around, some VCRs never turned off at all - they just stopped sending their signal. Computers still draw even when they're turned off (that's why the power cable sometimes sparks when you plug it in, and how the machine is able to start without a physical switch being thrown). I'd estimate a house full of all the latest gizmos, with their standbys, power adapters, etc., is still drawing 50 watts even when everything is off.

Think about your computer station: the computer is still drawing, the printer is drawing with its full interface running so that it can turn on "automatically" when you go to print, the scanner is running its interface for the same reason, your monitor is drawing a good 5 W or so just sitting there, plus your transformers and adapters are all dumping heat into the room. Anything that uses a remote is running its remote sensor full time. Your fridge is running the heated strip around the door seal. Your automatic sprinklers are going. Etc., etc.

But do you really need to worry about it? No. It isn't worth the time to shut these things off, because the savings really aren't that great.
 
Originally posted by: Syringer
10 watts * maybe 20 devices in a house = 0.2 kilowatts * $0.13/kWh = $0.026/hour * 24 hours/day = ~$0.62/day * 30 days/month = $18.72 a month.

I'd say that's way overestimated. The last place I lived, my entire electric bill was about $18 a month.
 
Originally posted by: thomsbrain
Originally posted by: Syringer
10 watts * maybe 20 devices in a house = 0.2 kilowatts * $0.13/kWh = $0.026/hour * 24 hours/day = ~$0.62/day * 30 days/month = $18.72 a month.

I'd say that's way overestimated. The last place I lived, my entire electric bill was about $18 a month.
.... Uh, and is that supposed to be the norm or something?

Let's assume it's only 10 devices and $0.075/kWh (about what it is per kWh around here).

10 watts * 10 devices = 0.1 kW; 0.1 kW * 24 hours * $0.075/kWh = $0.18/day; * 30 = $5.40/mo.

Still pretty significant, especially if your bill is only $18. 😛

It would be interesting to unplug everything and see how much our bill drops. It hovers around $160/mo.
 
Originally posted by: Dari
While it is obvious that the equipment uses more power when it's in use, what happens when it is off or dormant? Is it better to turn the item off completely (by unplugging it) to save on your electricity bill, or can you leave it plugged in? I've heard that the startup surge your electrical equipment draws is large compared to what it uses just being left on. Or is this assessment wrong? Which is better?

Many devices are really still "on" when you think they are off. For instance, your TV cannot be totally off; otherwise the remote would never turn it back on. Some electronics need to be active to receive the signal and process it.

As far as the surge suggestion goes, devices that require a great deal of current upon startup are usually inductive, such as a motor, transformer, etc. The power surge that you mention is largely 'reactive power', which is not considered 'real' power. From a vector perspective, reactive power is one leg of a triangle, the other leg is real power, in watts. The vector sum is volt-amps. Your meter measures watts, so for the most part, this reactive power consumption is not registered on the meter. Is it significant? Over millions of households, yes. But the greatest use of inductive power is, not surprisingly, industrial. The power company has to compensate for this at the power plant, but believe me, we all pay for this in the end - i.e. your electrical bill has this factored in. HTH 🙂
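That triangle is the standard power triangle: apparent power S (volt-amps) is the vector sum of real power P (watts) and reactive power Q (volt-amps reactive), so S = sqrt(P^2 + Q^2) and power factor = P/S. A minimal sketch with assumed load numbers (the 800 W / 600 VAR figures are illustrative, not from the thread):

```python
# The power triangle: apparent power S (VA) is the vector sum of real
# power P (W) and reactive power Q (VAR); power factor = P / S.
import math

def power_triangle(real_watts, reactive_var):
    apparent_va = math.hypot(real_watts, reactive_var)  # sqrt(P^2 + Q^2)
    power_factor = real_watts / apparent_va
    return apparent_va, power_factor

# Illustrative inductive load: 800 W real, 600 VAR reactive (assumed numbers)
s, pf = power_triangle(800, 600)
print(f"Apparent power: {s:.0f} VA, power factor: {pf:.2f}")  # 1000 VA, 0.80
```

A residential meter bills for P alone, which is why the reactive part doesn't show up directly on the bill; industrial customers, by contrast, are often billed or penalized for low power factor - the compensation cost the post alludes to.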
 
Originally posted by: yellowfiero
Originally posted by: Dari
While it is obvious that the equipment uses more power when it's in use, what happens when it is off or dormant? Is it better to turn the item off completely (by unplugging it) to save on your electricity bill, or can you leave it plugged in? I've heard that the startup surge your electrical equipment draws is large compared to what it uses just being left on. Or is this assessment wrong? Which is better?

Many devices are really still "on" when you think they are off. For instance, your TV cannot be totally off; otherwise the remote would never turn it back on. Some electronics need to be active to receive the signal and process it.

Well, you can turn it off with the remote, which puts it in standby and it will still use a little power, or you can press the power button on the front which will turn it off totally and it won't draw any power, or you can unplug it.


10 watts * maybe 20 devices in a house = 0.2 kilowatts * $0.13/kWh = $0.026/hour * 24 hours/day = ~$0.62/day * 30 days/month = $18.72 a month.
I have a stereo that says 0.6 W in standby. Not all devices are going to use 10 W in standby.
 
Originally posted by: yellowfiero
Originally posted by: Dari
While it is obvious that the equipment uses more power when it's in use, what happens when it is off or dormant? Is it better to turn the item off completely (by unplugging it) to save on your electricity bill, or can you leave it plugged in? I've heard that the startup surge your electrical equipment draws is large compared to what it uses just being left on. Or is this assessment wrong? Which is better?

Many devices are really still "on" when you think they are off. For instance, your TV cannot be totally off; otherwise the remote would never turn it back on. Some electronics need to be active to receive the signal and process it.

As far as the surge suggestion goes, devices that require a great deal of current upon startup are usually inductive, such as a motor, transformer, etc. The power surge that you mention is largely 'reactive power', which is not considered 'real' power. From a vector perspective, reactive power is one leg of a triangle, the other leg is real power, in watts. The vector sum is volt-amps. Your meter measures watts, so for the most part, this reactive power consumption is not registered on the meter. Is it significant? Over millions of households, yes. But the greatest use of inductive power is, not surprisingly, industrial. The power company has to compensate for this at the power plant, but believe me, we all pay for this in the end - i.e. your electrical bill has this factored in. HTH 🙂
Hmm.. that is interesting.

I must say, though, that when I turn, say, the dryer on - I'm lucky enough to be able to watch the meter at the same time - it really does jump super fast for the moment the lights go dim as the dryer starts, and then settles to a rate about a third of the speed of that jump.

I did this a few times, just to watch it.. and came to the conclusion that it would be silly to try and save money by leaving something on to avoid turning it off & on.
 
Originally posted by: Lonyo

Well, you can turn it off with the remote, which puts it in standby and it will still use a little power, or you can press the power button on the front which will turn it off totally and it won't draw any power, or you can unplug it.

Hmm... No? If it was using no power, the remote wouldn't be able to turn it back on. 😉
 
Originally posted by: Dari
Originally posted by: rh71
I don't know the technical answer to this, but I'm going to venture a guess that in the grand scheme of things (years down the road)... the difference in longevity/reliability/cost/etc. is very negligible.

In other words, it's not even something to worry over. That is, unless you turn the thing on and off 20 times a day.

I'm asking because some people do it to save money on their electricity bill. Are you saying it doesn't make a difference?

Even if the device is turned off, it still consumes power. For example, a DVD player uses power to keep the remote receiver available.
 