!Updated w/ Poll. Vampire Energy - Put them on a power strip for easy on/off

taltos1

Senior member
Nov 15, 2001
892
0
0
!!Added Poll. Please vote above.


Hello,
I have put my electronics (video game systems, VCR, DVD, TV, etc.) on a power strip as an easy way to turn them all off and on. My question is: does turning them off and on damage them in some way? In particular, I have a plasma TV, and I think I read somewhere that this can hurt them, since "standby" energy is used to smooth out warming the TV up. I only turn things on every other day or so.

Thanks a lot
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
The main issue I have with using a power strip to turn on lots of devices at one time is that it causes a fast drop in voltage when you flip the switch.

If it's a quality strip, it will not be a problem; if it's a poorly designed strip, then every time you flip the switch you're creating a brownout-type situation that could be bad for the attached devices. They have to draw more current to make up for the voltage difference, and that can strain power supplies.

Curious as to why you're cutting off the DVD, VCR, etc.
The power draw from all those devices in standby is not significant, despite what the media may say.

 

taltos1

Senior member
Nov 15, 2001
892
0
0
I am trying to save as much electricity as possible, so I figured I would set them all up on a power strip. But I do not want to damage my electronics... Is this "brownout" situation still a factor if I turn off my TV, DVD, and video game systems, wait 5 minutes, and then flip the power strip switch?

 

sdifox

No Lifer
Sep 30, 2005
99,445
17,581
126
Originally posted by: Modelworks
The main issue I have with using a power strip to turn on lots of devices at one time is that it causes a fast drop in voltage when you flip the switch.

If it's a quality strip, it will not be a problem; if it's a poorly designed strip, then every time you flip the switch you're creating a brownout-type situation that could be bad for the attached devices. They have to draw more current to make up for the voltage difference, and that can strain power supplies.

Curious as to why you're cutting off the DVD, VCR, etc.
The power draw from all those devices in standby is not significant, despite what the media may say.

Why should that be an issue? You turn off the machines first, then turn off the power strip. The only draw that comes back on is the standby-mode power draw.
 

PurdueRy

Lifer
Nov 12, 2004
13,837
4
0
Originally posted by: taltos1
I am trying to save as much electricity as possible, so I figured I would set them all up on a power strip. But I do not want to damage my electronics... Is this "brownout" situation still a factor if I turn off my TV, DVD, and video game systems, wait 5 minutes, and then flip the power strip switch?

You would be better served by trying to catch each and every light you leave on rather than spending money on surge protectors just to turn off standby current. If you are doing it for money purposes...the trickle current is so insignificant that it will amount to basically nothing. If you are doing it for the environment...then go 5 MPH slower on the highway and you'll help the world much more.
 

Slick5150

Diamond Member
Nov 10, 2001
8,760
3
81
I was actually pretty surprised how much electricity my home theater system was drawing even when everything was "off". It's not a LOT, but it's more than negligible.

I wound up getting a "smart strip" so that when my receiver turns off, it kills the power to my TV, subwoofer, DVD player, and media streamer. It's worked well.

Smart Strip
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: Slick5150
I wound up getting a "smart strip" so that when my receiver turns off, it kills the power to my TV, subwoofer, DVD player, and media streamer. It's worked well.
Neat product, but I always worry about the stress that power-cycling puts on equipment.
 

alpineranger

Senior member
Feb 3, 2001
701
0
76
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.
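The arithmetic behind that estimate, as a quick sketch (the 10 W and 50 W figures are the examples above, not measurements of any particular device):

```python
# Convert a continuous standby draw in watts to energy used per year.
HOURS_PER_YEAR = 24 * 365  # 8760 hours

def standby_kwh_per_year(watts):
    """Annual energy (kWh) consumed by a constant draw of `watts`."""
    return watts * HOURS_PER_YEAR / 1000  # Wh -> kWh

print(standby_kwh_per_year(10))  # one TV at 10 W -> 87.6 kWh/year
print(standby_kwh_per_year(50))  # 50 W of assorted gear -> 438.0 kWh/year
```

So a steady 50 W of standby draw really does work out to "hundreds of kWh per year" (about 438).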
 

taltos1

Senior member
Nov 15, 2001
892
0
0
Hmm. So is it true that "power-cycling" puts too much stress on electronics?

Plus, if this chart is correct, using this technique can save you quite a bit of $ while also helping the environment.

Vampire Energy Chart
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

The Kill A Watt is not reliable when measuring small amounts of current.

If all your devices together only use 50W in standby, that to me is insignificant.
It would cost me a whole $1.23 a month.

You're better off putting a blanket on your hot water heater, installing a programmable thermostat, and replacing old light bulbs with fluorescents than worrying about standby current usage.
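For reference, the dollar figure scales directly with your electric rate; a sketch with illustrative rates (none of these is anyone's actual bill):

```python
# Monthly cost of a constant standby draw at a given electric rate.
HOURS_PER_MONTH = 24 * 365 / 12  # 730 hours

def monthly_cost(watts, rate_per_kwh):
    """Dollars per month for a constant draw of `watts` at `rate_per_kwh`."""
    kwh = watts * HOURS_PER_MONTH / 1000
    return kwh * rate_per_kwh

# 50 W of standby draw at a few example rates ($/kWh):
for rate in (0.034, 0.10, 0.15):
    print(f"${monthly_cost(50, rate):.2f}/month at ${rate}/kWh")
```

A $1.23/month figure implies an electric rate around 3.4 cents/kWh; at a more typical 10 to 15 cents/kWh, 50 W of standby runs roughly $3.65 to $5.48 a month.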
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: taltos1
Hmm. So is it true that "power-cycling" puts too much stress on electronics?

Plus, if this chart is correct, using this technique can save you quite a bit of $ while also helping the environment.

Vampire Energy Chart

That chart has to be taken with a grain of salt.
It's not going to be accurate for any house except one with the exact same devices it lists.

Example:
They show a computer using 311 kWh for a year.
That works out to about 850 watt-hours per day, or an average continuous draw of roughly 35 watts.
In fact, the most that any modern PC can draw when off but still plugged in, according to the ATX spec, is 12.5 watts.
That is the maximum; most draw 6 watts or less.
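The unit conversion behind that check, spelled out (311 kWh/year is the chart's figure; 12.5 W is the ATX standby limit cited above):

```python
# Reverse a chart's kWh-per-year figure into the implied average wattage.
HOURS_PER_YEAR = 24 * 365  # 8760 hours

def implied_average_watts(kwh_per_year):
    """Constant draw (W) that would consume `kwh_per_year` in a year."""
    return kwh_per_year * 1000 / HOURS_PER_YEAR

avg_w = implied_average_watts(311)  # the chart's "off" PC figure
print(f"{avg_w:.1f} W average")     # 35.5 W
print(avg_w <= 12.5)                # consistent with the ATX standby cap? False
```

An average of ~35.5 W is nearly three times the 12.5 W maximum the spec allows for a powered-off PC, which is the inconsistency the post is pointing at.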
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: sdifox
Originally posted by: Modelworks
The main issue I have with using a power strip to turn on lots of devices at one time is that it causes a fast drop in voltage when you flip the switch.

If it's a quality strip, it will not be a problem; if it's a poorly designed strip, then every time you flip the switch you're creating a brownout-type situation that could be bad for the attached devices. They have to draw more current to make up for the voltage difference, and that can strain power supplies.

Curious as to why you're cutting off the DVD, VCR, etc.
The power draw from all those devices in standby is not significant, despite what the media may say.

Why should that be an issue? You turn off the machines first, then turn off the power strip. The only draw that comes back on is the standby-mode power draw.

Some devices initialize each time power is restored.
Examples are home theater receivers, cable and satellite boxes, and DVRs like TiVo.
They all go to a full-on state when power is restored, then settle down and drop back to standby.
 

nakedfrog

No Lifer
Apr 3, 2001
61,955
17,721
136
I saw an ad for a Belkin (maybe) power strip that has a remote control... it has two always on outlets, but you can actually cut power to the rest from the remote control.
 

BigJ

Lifer
Nov 18, 2001
21,330
1
81
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

You're looking at about 400 kWh a year.

I have a feeling that remembering to turn off lights when you leave a room or go out would save far more energy and money.

Are all your light fixtures currently using CFLs? Replace a single bulb and you've almost made up those 50W already.
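To put rough numbers on the bulb comparison (the 60 W incandescent, 13 W CFL, and 4 hours/day of use here are illustrative assumptions, not anyone's measurements):

```python
# Compare annual savings: killing standby draw vs. swapping one bulb for a CFL.
def annual_kwh(watts, hours_per_day):
    """Annual energy (kWh) for a draw of `watts` running `hours_per_day`."""
    return watts * hours_per_day * 365 / 1000

standby_savings = annual_kwh(50, 24)   # eliminating 50 W of standby, 24/7
bulb_savings = annual_kwh(60 - 13, 4)  # one 60 W -> 13 W CFL swap, 4 h/day

print(standby_savings)  # 438.0 kWh/year
print(bulb_savings)     # ~68.6 kWh/year
```

Watt for watt, one swapped bulb (~47 W saved while lit) nearly matches the 50 W of standby; on an annual-energy basis, though, the comparison depends heavily on how many hours a day the bulb is actually on, since standby draw runs around the clock.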
 

Slick5150

Diamond Member
Nov 10, 2001
8,760
3
81
Originally posted by: BigJ
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

You're looking at about 400 kWh a year.

I have a feeling that remembering to turn off lights when you leave a room or go out would save far more energy and money.

Are all your light fixtures currently using CFLs? Replace a single bulb and you've almost made up those 50W already.

Why is everyone presenting this as an either/or scenario? You realize that you can replace your light bulbs AND try to cut down on standby power from your electronics too, right?

I've replaced all my bulbs with CFLs, used a smart strip, and taken a few other steps, and there is a huge difference between my electricity bills before and after.
 

PurdueRy

Lifer
Nov 12, 2004
13,837
4
0
Originally posted by: Slick5150
Originally posted by: BigJ
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

You're looking at about 400 kWh a year.

I have a feeling that remembering to turn off lights when you leave a room or go out would save far more energy and money.

Are all your light fixtures currently using CFLs? Replace a single bulb and you've almost made up those 50W already.

Why is everyone presenting this as an either/or scenario? You realize that you can replace your light bulbs AND try to cut down on standby power from your electronics too, right?

I've replaced all my bulbs with CFLs, used a smart strip, and taken a few other steps, and there is a huge difference between my electricity bills before and after.

Multiple reasons:

1. Because the media is making this out to seem like it's going to save you tons of money. Provided you put your devices at least into standby mode, the power savings between that and completely removing power at the wall is insignificant compared to other ways you can save energy. However, they are using distorted numbers to scare people into doing this.

2. If people go out and buy a power strip for their outlets, that's at least $10 right there. So, just to break even, you have to save $10 of electricity, which isn't that easy to do considering how little power most electronic devices pull when turned off (or in standby).

3. Some people are doing this "for the environment". There are much better, and easier, ways to help the environment if someone really wants to do that. As I said earlier, slowing down on the highway and not tailgating will both have a much more significant impact on the energy resources you use. But that would be much more of a hassle, right? Imagine having to slow down from 85 on a highway to 75... crazy, I know.

So yes, you can do both. However, people shouldn't be obsessing over "vampire" energy. It's like a company that works hard to save every penny, and then one worker forgets to add a resistor to a circuit and you have a recall, completely offsetting any gains from the effort to save every little bit. Same thing for homeowners: you can fully turn off every outlet, but the one time you leave a light on when you go to work or, even worse, on vacation will completely offset the gains.

 

Slick5150

Diamond Member
Nov 10, 2001
8,760
3
81
Originally posted by: PurdueRy
Originally posted by: Slick5150
Originally posted by: BigJ
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

You're looking at about 400 kWh a year.

I have a feeling that remembering to turn off lights when you leave a room or go out would save far more energy and money.

Are all your light fixtures currently using CFLs? Replace a single bulb and you've almost made up those 50W already.

Why is everyone presenting this as an either/or scenario? You realize that you can replace your light bulbs AND try to cut down on standby power from your electronics too, right?

I've replaced all my bulbs with CFLs, used a smart strip, and taken a few other steps, and there is a huge difference between my electricity bills before and after.

Multiple reasons:

1. Because the media is making this out to seem like it's going to save you tons of money. Provided you put your devices at least into standby mode, the power savings between that and completely removing power at the wall is insignificant compared to other ways you can save energy. However, they are using distorted numbers to scare people into doing this.

2. If people go out and buy a power strip for their outlets, that's at least $10 right there. So, just to break even, you have to save $10 of electricity, which isn't that easy to do considering how little power most electronic devices pull when turned off (or in standby).

3. Some people are doing this "for the environment". There are much better, and easier, ways to help the environment if someone really wants to do that. As I said earlier, slowing down on the highway and not tailgating will both have a much more significant impact on the energy resources you use. But that would be much more of a hassle, right? Imagine having to slow down from 85 on a highway to 75... crazy, I know.

So yes, you can do both. However, people shouldn't be obsessing over "vampire" energy. It's like a company that works hard to save every penny, and then one worker forgets to add a resistor to a circuit and you have a recall, completely offsetting any gains from the effort to save every little bit. Same thing for homeowners: you can fully turn off every outlet, but the one time you leave a light on when you go to work or, even worse, on vacation will completely offset the gains.

1. I'm using my own numbers, based on measuring energy use as well as looking at my electric bills. The numbers are NOT insignificant. I'd like to know what numbers you're drawing your broad assumptions from.

2. Again, I recouped the cost of changing out light bulbs and buying power strips very quickly (and it wasn't much to begin with). So, not sure where you're getting that from.

3. Of course there are "better" ways, but this is "another" way. Again, you're using an either/or argument, which just isn't the case. You can drive slower AND use less electricity at home. Crazy, I know.

I agree you don't need to obsess over standby energy use, but if someone wants to take the effort to do something about it, I just can't figure out why you're attacking them over it or telling them they're wasting their time. If you don't think it's worthwhile for your own home, then don't do it. If someone else does, that's their decision. The question the OP (who is clearly interested in doing it) asked is whether doing so has any negative effects on the electronics themselves, from having the power cut, and I would say that there most likely aren't, but I don't know that to be 100% true.


 

PurdueRy

Lifer
Nov 12, 2004
13,837
4
0
Originally posted by: Slick5150
Originally posted by: PurdueRy
Originally posted by: Slick5150
Originally posted by: BigJ
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

You're looking at about 400 kWh a year.

I have a feeling that remembering to turn off lights when you leave a room or go out would save far more energy and money.

Are all your light fixtures currently using CFLs? Replace a single bulb and you've almost made up those 50W already.

Why is everyone presenting this as an either/or scenario? You realize that you can replace your light bulbs AND try to cut down on standby power from your electronics too, right?

I've replaced all my bulbs with CFLs, used a smart strip, and taken a few other steps, and there is a huge difference between my electricity bills before and after.

Multiple reasons:

1. Because the media is making this out to seem like it's going to save you tons of money. Provided you put your devices at least into standby mode, the power savings between that and completely removing power at the wall is insignificant compared to other ways you can save energy. However, they are using distorted numbers to scare people into doing this.

2. If people go out and buy a power strip for their outlets, that's at least $10 right there. So, just to break even, you have to save $10 of electricity, which isn't that easy to do considering how little power most electronic devices pull when turned off (or in standby).

3. Some people are doing this "for the environment". There are much better, and easier, ways to help the environment if someone really wants to do that. As I said earlier, slowing down on the highway and not tailgating will both have a much more significant impact on the energy resources you use. But that would be much more of a hassle, right? Imagine having to slow down from 85 on a highway to 75... crazy, I know.

So yes, you can do both. However, people shouldn't be obsessing over "vampire" energy. It's like a company that works hard to save every penny, and then one worker forgets to add a resistor to a circuit and you have a recall, completely offsetting any gains from the effort to save every little bit. Same thing for homeowners: you can fully turn off every outlet, but the one time you leave a light on when you go to work or, even worse, on vacation will completely offset the gains.

1. I'm using my own numbers, based on measuring energy use as well as looking at my electric bills. The numbers are NOT insignificant. I'd like to know what numbers you're drawing your broad assumptions from.

2. Again, I recouped the cost of changing out light bulbs and buying power strips very quickly (and it wasn't much to begin with). So, not sure where you're getting that from.

3. Of course there are "better" ways, but this is "another" way. Again, you're using an either/or argument, which just isn't the case. You can drive slower AND use less electricity at home. Crazy, I know.

I agree you don't need to obsess over standby energy use, but if someone wants to take the effort to do something about it, I just can't figure out why you're attacking them over it or telling them they're wasting their time. If you don't think it's worthwhile for your own home, then don't do it. If someone else does, that's their decision. The question the OP (who is clearly interested in doing it) asked is whether doing so has any negative effects on the electronics themselves, from having the power cut, and I would say that there most likely aren't, but I don't know that to be 100% true.

Obviously changing light bulbs will make a large difference. You are saving upwards of 50W per bulb. I am not disputing that your electric bill is different.

However, most electronic devices do not draw anywhere near 50W when off. In addition, most of our homes have many more light bulbs than electronic devices that pull significant power when off.

Again, if he wants to do it, that's fine. However, if you actually add up the cost savings from fully turning off devices like DVD players, VCRs, etc., it is not as significant as, for example, the bulbs you changed. If someone decides the extra savings is worth the hassle, that's fine. I'm just making people aware that the current draw from devices that are "off" is not as big a deal as the news reporters are making it out to be.

I'm not attacking you. I am just stating my opinion on the subject.

As a direct answer to the OP's question, though: for most devices it won't matter, as they are designed to handle being unplugged "hot". However, it is always a good idea to turn off the device first and then turn off the surge protector.
 

BigJ

Lifer
Nov 18, 2001
21,330
1
81
Originally posted by: Slick5150
Originally posted by: BigJ
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

You're looking at about 400 kWh a year.

I have a feeling that remembering to turn off lights when you leave a room or go out would save far more energy and money.

Are all your light fixtures currently using CFLs? Replace a single bulb and you've almost made up those 50W already.

Why is everyone presenting this as an either/or scenario? You realize that you can replace your light bulbs AND try to cut down on standby power from your electronics too, right?

I've replaced all my bulbs with CFLs, used a smart strip, and taken a few other steps, and there is a huge difference between my electricity bills before and after.

I don't think you want to start playing this game. There are a substantial number of saving measures you can take that will reduce your electricity use by much more than 400 kWh per year and don't require unplugging your power strip every time you use your electronics. A few that come to mind are proper insulation, a programmable thermostat, window and door flashing, attic fans with sensors, black-out shades, plumbing insulation, timers for general lighting, photocell sensors, properly sizing your AC, cleaning your AC, changing your cooking and cleaning habits, changing how you do your laundry, and ceiling fans. I can guarantee you all of these, and the many other steps that can be taken, haven't been exhausted yet.

If I really wanted to be an ass, I would've posted the above. Instead it was merely a suggestion that you could probably make up the energy without inconveniencing yourself every time. So chill out.
 

Slick5150

Diamond Member
Nov 10, 2001
8,760
3
81
Originally posted by: BigJ
Originally posted by: Slick5150
Originally posted by: BigJ
Originally posted by: alpineranger
I have a kill-a-watt and use it to measure standby power draw of many devices. It is often not insignificant (like a 10W draw on a tv). If you have lots of electronics you could be experiencing over 50W of standby power draw. That's hundreds of kWh per year.

You're looking at about 400 kWh a year.

I have a feeling that remembering to turn off lights when you leave a room or go out would save far more energy and money.

Are all your light fixtures currently using CFLs? Replace a single bulb and you've almost made up those 50W already.

Why is everyone presenting this as an either/or scenario? You realize that you can replace your light bulbs AND try to cut down on standby power from your electronics too, right?

I've replaced all my bulbs with CFLs, used a smart strip, and taken a few other steps, and there is a huge difference between my electricity bills before and after.

I don't think you want to start playing this game. There are a substantial number of saving measures you can take that will reduce your electricity use by much more than 400 kWh per year and don't require unplugging your power strip every time you use your electronics. A few that come to mind are proper insulation, a programmable thermostat, window and door flashing, attic fans with sensors, black-out shades, plumbing insulation, timers for general lighting, photocell sensors, properly sizing your AC, cleaning your AC, changing your cooking and cleaning habits, changing how you do your laundry, and ceiling fans. I can guarantee you all of these, and the many other steps that can be taken, haven't been exhausted yet.

If I really wanted to be an ass, I would've posted the above. Instead it was merely a suggestion that you could probably make up the energy without inconveniencing yourself every time. So chill out.

And my point was that I'm not inconveniencing myself. I bought the "smart strip" I linked to above. It does everything I need it to do automatically. My receiver goes off, and it kills the power to the rest of my stuff; receiver goes on, and everything else gets power back. Works great for me. If it's not for you, then so be it.

 

SlickSnake

Diamond Member
May 29, 2007
5,235
2
0
The best reason to turn everything off when not in use is that it spares the electronics inside units that otherwise go into standby mode when "off", like TVs, VCRs, recordable DVD players, receivers, and other HT components. In standby mode, there is still current going through most of the unit, including the power supply, which slowly burns out over time.

I have seen the displays slowly dim over the years on VCRs of various brands left plugged in all the time, to the point where the VCR just stops working one day and the display won't come on. I have one in one of my HT setups that is in its death throes as we speak and needs to be disconnected. And it ate my Led Zeppelin The Song Remains the Same VHS tape, just to spite me one last time.

Of course, turning off the HT gear might make it lose some settings, like the clock, which might be a pain for some people. But that sure beats buying a new component just from leaving it in standby mode all the time. And this standby mode is also one of the reasons TVs frequently burn out in the power regulation circuit. I had several different older TVs have this problem. What's odd is that the set just won't come on one day; none of them stopped working while on. That points to standby mode burning out the power circuit, just like on my VCRs.

Even PCs can burn up from being left on all the time, including in standby mode. For example, the capacitors will swell up after a while and burst or stop working from continuous current running through the electronics.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: SlickSnake


Even PCs can burn up from being left on all the time, including in standby mode. For example, the capacitors will swell up after a while and burst or stop working from continuous current running through the electronics.



Capacitors burst if their voltage rating is exceeded. It's actually better for a capacitor to be kept charged than to sit empty.

Electronics that are poorly designed can wear down over time, mostly due to heat damage. But leaving something plugged in for long periods of time does not shorten its life span.

Most devices that use standby setups do not have power going through most of the unit. The supply runs in standby mode, feeding just 3V or 5V to the device's micro, which waits for a button press or IR signal.

The hardest time for electronics is power-up. When you first apply power to any device there is a surge: the empty capacitors have to charge up.

If it's a transformer-based supply, the windings have to energize, then the capacitors, then the bridge rectifier, and any voltage converters down the line.

If it's switch-mode, the capacitors have to charge, the circuits have to energize, and the output capacitors have to charge.

There is no way around that; it happens every time the device gets power.

The problem with doing it on a power strip is that all those devices produce that surge at the same time. If the power strip isn't designed properly, the voltage will drop and the connected devices will have to draw more current during startup; that is what can damage electronics. If you live in an older home with older wiring, the effect can be worse.
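To illustrate why simultaneous power-up matters, here is a rough, hypothetical inrush estimate. The 5-ohm NTC inrush limiter and 120 V mains are assumed values for a generic switch-mode supply, not figures for any real device:

```python
# Rough worst-case inrush when several switch-mode supplies see mains at once.
# At the instant of switch-on, an empty bulk capacitor looks like a near-short;
# current is limited mainly by the series resistance (the NTC inrush limiter).
# The surge lasts only a fraction of an AC cycle, but the strip and house
# wiring still have to supply it.
import math

V_RMS = 120.0
V_PEAK = V_RMS * math.sqrt(2)  # ~170 V if switched at the waveform peak

def peak_inrush(series_ohms):
    """Worst-case instantaneous inrush current for one supply (amps)."""
    return V_PEAK / series_ohms

per_device = peak_inrush(5.0)  # one supply with an assumed 5-ohm NTC
print(f"{per_device:.0f} A per device, ~{5 * per_device:.0f} A for five at once")
```

The point is the scaling: five devices flipping on together momentarily demand five times the single-device surge, which is where a flimsy strip or old wiring sags.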
 

taltos1

Senior member
Nov 15, 2001
892
0
0
Wow. Thank you for all the replies, but I am still a bit in the dark. I understand that there are two elements at play here: energy savings AND longevity of the electronics.

Energy Savings:
It seems the "power strip" method will save some energy, but perhaps not as much as some are touting.

Longevity of Electronics:
Some say the power strip method is fine, others say it is bad. Both sides have made what seem to be valid points (IMHO; I am no electronics guru), but I am still not sure what to do...

 

SlickSnake

Diamond Member
May 29, 2007
5,235
2
0
Modelworks said:

Electronics that are poorly designed can wear down over time, mostly due to heat damage.
But leaving something plugged in for long periods of time does not shorten its life span.

That's just wrong on so many levels, I won't even go there. I could give you too many examples to prove you wrong, like the repair bills for three TVs that burned out while in standby and never turned on again. One TV did this multiple times, until I threw it out. You could feel the heat from the power circuit through the back of the case when these TVs were off, too, unless they were unplugged.

Most devices that use standby setups do not have power going through most of the unit.

That's what I said. But power is still running through the power circuit and the voltage regulator that keeps it operating, which is why power-circuit failure rates are so high in units that stand by when plugged in, like a TV or VCR. The only reason a tube TV stands by when off is so you don't have to wait a long time for the tube to warm up. If you are old enough to remember, a tube-type TV (one also using tubes elsewhere) might take 3 to 5 minutes to fully warm up, and leaving a tube-operated device on for extended periods shortens its life span considerably, so those sets didn't stay fully on either.

Standby isn't needed for that reason on a newer-technology TV, like an LCD. On my Sharp LCD, you can hear the power relay audibly click off about a minute after the set is turned off. It does not stand by.

 

Slick5150

Diamond Member
Nov 10, 2001
8,760
3
81
Originally posted by: taltos1
Wow. Thank you for all the replies, but I am still a bit in the dark. I understand that there are two elements at play here: energy savings AND longevity of the electronics.

Energy Savings:
It seems the "power strip" method will save some energy, but perhaps not as much as some are touting.

Longevity of Electronics:
Some say the power strip method is fine, others say it is bad. Both sides have made what seem to be valid points (IMHO; I am no electronics guru), but I am still not sure what to do...

It seems you're in the same boat as everyone else here. I guess the lesson is that there is no "right" answer to your question. It's doubtful it will harm your electronics, but you never know. You will save electricity, but it may or may not be as much as you're hoping for.