Surge Protector BEEEEEP...

Anomaly1964

Platinum Member
Nov 21, 2010
2,465
8
81
I have an Nvidia GTX 570. Just within the last couple of days, whenever I play a game, it makes my battery backup/surge protector BEEEEEEP. Not on the desktop or websurfing, just when playing games. Any ideas?

Thanks!
 

Anomaly1964

Platinum Member
Nov 21, 2010
2,465
8
81
what is the surge protector's declared output in Watts? Hopefully more than 350W?


It is 625v...

I plug my machine into a 2nd surge/backup, 550v and no beeeep, is it possible the first one took a hit or something...?

Thru EVGA precision my temp and volts are good on the card...
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
battery might be going. I have one I need to change at home. Getting a sealed lead acid at Interstate is a lot cheaper than buying a new UPS, if it was a quality UPS to start with.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
It is 625v...

I plug my machine into a 2nd surge/backup, 550v and no beeeep, is it possible the first one took a hit or something...?

Thru EVGA precision my temp and volts are good on the card...

You mean VA, and VA is only equivalent to Watts when the load's power factor is 1 (true for DC, or a purely resistive AC load).

With a typical PC's AC load, those battery-backup units are more like 350W for those VA ratings.

Truth be told, you need to purchase bigger-capacity, more-expensive battery backup units for a gaming rig of that caliber.

I bought a 450W UPS for my HTPC (granted, it's a quad-core, but it has no discrete GPU).

For a real gaming rig, I recommend a UPS with 500-650W or more capacity. Which means 1000VA or higher ratings. None of this low-end 550VA crap.
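To illustrate the VA-versus-watts sizing logic above, here is a minimal sketch. The ~0.6 power factor is an assumed typical value for small consumer UPS units, not a number from this thread; the exact factor varies by model.

```python
def ups_watt_capacity(va_rating, power_factor=0.6):
    """Approximate real-power (watt) capacity of a UPS from its VA rating."""
    return va_rating * power_factor

def is_ups_adequate(va_rating, load_watts, power_factor=0.6):
    """True if the UPS's estimated watt capacity covers the load."""
    return ups_watt_capacity(va_rating, power_factor) >= load_watts

print(ups_watt_capacity(625))       # 375.0 W for a 625 VA unit like the OP's
print(is_ups_adequate(625, 450))    # False: a 450 W gaming load overloads it
print(is_ups_adequate(1000, 450))   # True: a 1000 VA unit leaves headroom
```

This is only a sizing heuristic; the real watt rating is printed on the unit's spec label and should be used when available.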
 

westom

Senior member
Apr 25, 2009
517
0
71
For a real gaming rig, I recommend a UPS with 500-650W or more capacity. Which means 1000VA or higher ratings. None of this low-end 550VA crap.
Because the computer must consume so much power that it also toasts bread? In reality, that 600 watt UPS is more than enough power for any PC, because PCs do not toast bread.

Start by reading (or posting here) the UPS instruction book. BTW, that UPS is not a surge protector.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Because the computer must consume so much power that it also toasts bread? In reality, that 600 watt UPS is more than enough power for any PC, because PCs do not toast bread.

Start by reading (or posting here) the UPS instruction book. BTW, that UPS is not a surge protector.
Is your reading comprehension off? Someone asked the OP to list the watts of the UPS, and he said v. Which I took to mean that he was reading off the VA rating of the UPS (since that is what is usually listed on the package).

My comment is that the VA rating on UPSes is NOT equivalent to watts.

If he really has a 600W UPS, then yes, it should be enough for his PC. But if it's 600VA, which it most likely is, then it is nowhere near big enough for a gaming rig.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
If the computer is consuming 600 watts or 600 VA, it is still toasting bread. No PC is consuming that much power:
What is your desktop power usage while browsing these forums?

If we look again at the original poster's starting post, we see:

I have an Nvidia GTX 570, just within the last couple of days, when ever I play a game, it makes my battery backup/surge protector BEEEEEEP. Not on the desktop or websurfing but just when playing games, any ideas?

Thanks!

The thing is, the link you referred to is about PCs just surfing the web. That usually leaves any gaming graphics card(s) at "idle" power consumption, so the overall PC power draw can be quite low (the CPU is usually not doing much while surfing the web, either).

But the original poster was PLAYING GAMES. This usually puts a huge strain on the CPU (especially if it's overclocked) and will tend to massively increase its power consumption, but more importantly, the graphics card can also use a huge number of watts, especially if it is a high end card and/or there are multiple graphics cards fitted.

There are sources on the internet which say how much a gaming PC uses while gaming, and 600 watts or more is quite easily achievable, depending on the PC's spec.
Don't forget that if the PC is trying to use 600 watts, that power has to come through the power supply, which will draw MORE electricity from the wall (because it has a finite efficiency of, say, 82%), so a 600 watt PC would use something like 730 Watts at the wall socket.
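That wall-socket arithmetic can be sketched in a couple of lines. The 82% efficiency is the example figure from this post, not a measured value for any particular PSU.

```python
def wall_draw_watts(dc_load_watts, efficiency=0.82):
    """AC watts drawn at the wall to deliver a given DC load through the PSU."""
    return dc_load_watts / efficiency

print(round(wall_draw_watts(600)))  # 732 W at the wall for a 600 W DC load
```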

For the GTX570 I don't know what it uses, without looking it up.

EDIT: Clarification: I am in no way saying the OP's PC is using 600 watts. I am saying a hugely overclocked FX8350 (or whatever you have), with dual very high end graphics cards, while playing some games, can use lots of watts: 300W, 400W, 500W, 600W, or whatever it would be.

EDIT2: Example high end gaming PC which uses about 500 watts at the wall socket
500 Watt gaming Example

But my testing tool reports a maximum consumption of somewhat higher than 500W.
 
Last edited:

westom

Senior member
Apr 25, 2009
517
0
71
A UPS rated at 600VA is only 300-350W. A gaming PC with a high-end VGA can easily draw 350W at the wall.
Good luck finding the system that actually draws that much. The highest gaming systems only peak at 350 watts; in one rare case, 400 watts was reported. A 500 watt system would mean IC temperatures of what? Temperatures that easily burn bread, with air coming out as hot as that from a toaster.

Why do so many gamers just *know* they need that much power? We are selling graphics cards to consumers with no electrical knowledge. Informed computer assemblers know that only the current on each DC rail is important - not wattage. But assemblers select only by wattage. Therefore one rail can output too many amps and another too few.

This problem is easily solved. Tell the computer assembler a wattage number that is twice what he needs. A gaming computer that may peak at 350 watts - we tell him he needs a 700 watt PSU. Then he just "knows" his system consumes 700 watts. IOW, fans blowing air so hot as to toast bread.

Since the PSU's rated DC capacity is then double what the system actually consumes, we waste no time in tech support teaching computer assemblers what they should have known.

Replies to the OP's problem are only speculation. What does the beep report? Surge protectors don't beep. Why do so many just "know" what the beep means when it is not even described from the manufacturer's instructions?

Step one: What does the manufacturer say that beep is reporting? And is it a UPS or a surge protector - two completely different devices performing completely different functions.
 
Last edited:

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
Good luck finding the system that actually draws that much.

My (getting old now) gaming PC got so hot, you could feel the heat radiate from it by putting your hand near it. The room got hotter when I used it.

My (previous) xbox 360 got so hot that the room was noticeably hotter when I used it (until the day it red ring of deathed).

Even BEFORE gaming, the latest and hottest AMD FX has a 220W (max) TDP; add the rest of the computer and it is using 250+ watts of system consumption (if heavily CPU loaded), BEFORE the gaming card.

I'm NOT convinced by your arguments, my experience of gaming machines is that they can use LOTS of power.
 
Last edited:

westom

Senior member
Apr 25, 2009
517
0
71
I'm NOT convinced by your arguments, my experience of gaming machines is that they can use LOTS of power.
It remains a common problem. It even explains why so many 'knew' Saddam had WMDs when quantitative facts said otherwise. Or why no engineer would recommend launching the Challenger - people who ignored the numbers killed seven astronauts. Subjective reasoning is why (in the 1950s) the majority of Americans 'knew' smoking cigarettes was harmless. A majority do not know that numbers are always required for an honest (and believable) answer.

Most conclude using subjective reasoning and their emotions. Most completely forget what was taught even in junior high science - how to know something. Honest answers always include perspective. That means numbers.

If a computer were consuming that many watts, you would leave skin behind when touching its CPU heatsink. As demonstrated by so many in another thread who actually measured their computers with a Kill-A-Watt: consumption was mostly between 100 and 200 watts.

Even an Xbox 360 consumes many times less power than speculated. Somehow you know otherwise because it feels hot? That is a conclusion based in feelings, not tempered by what must always exist: hard numbers.

Computers designed by engineers have a much smaller PSU. Naive computer assemblers will even insult brand name manufacturers rather than learn how easily they have been manipulated by PSU myths ... about gaming computers needing 600 and 1000 watts.

So again, back to the OP's question. Why is the OP getting a beep from his surge protector when protectors do not have beepers? What exactly do the manufacturer's instructions say about that beep? Solving problems means hard facts. Such as: which surge protector has a beeper?

Another poster suggested a probable (reasonable) answer but did not elaborate. With answers to those questions, a reply could confirm his reasonable suspicion. But first the OP must provide some basic information - and show an interest in having an answer.
 
Last edited:

Torn Mind

Lifer
Nov 25, 2012
12,078
2,772
136
OP, if you want to approximate how much your rig is pulling while gaming, the Kill-a-Watt is the one with the best combo of convenience and lack of impact on your funds, at the cost of some accuracy.

The GTX 570 has a max board TDP of 219 watts. I am not sure if overclocking can get you over this TDP, but it makes it easier to reach it. The i5 could be consuming 80 watts or so while gaming.

Is the TV connected to the same UPS?
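As a back-of-the-envelope sketch of that estimate: only the 219 W GPU board TDP and the ~80 W CPU figure come from this post; the "rest of system" allowance and the 85% PSU efficiency are assumptions for illustration.

```python
# Per-component draw estimates in watts. 219 W is the GTX 570's board TDP
# quoted above; the CPU figure is the post's rough guess, and the
# "rest of system" allowance and 85% efficiency are assumptions.
estimates = {
    "gtx_570_gpu": 219,
    "i5_cpu": 80,
    "board_ram_drives_fans": 60,  # assumed allowance, not from the thread
}

dc_load = sum(estimates.values())   # watts the PSU must deliver
wall_load = dc_load / 0.85          # wall draw at an assumed 85% efficiency

print(dc_load)           # 359
print(round(wall_load))  # 422
```

A Kill-a-Watt reading at the wall would replace all of these guesses with one measured number.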
 

bryanl

Golden Member
Oct 15, 2006
1,157
8
81
Good luck finding the system that actually draws that much. The highest gaming systems only peak at 350 watts; in one rare case, 400 watts was reported. A 500 watt system would mean IC temperatures of what? Temperatures that easily burn bread, with air coming out as hot as that from a toaster.
For the same reason an 1800W hair dryer won't burn bread or melt solder while a 1000W heat gun or 20W soldering iron can: what matters is where the heat is concentrated, not just the total watts.

The following graph from PC Perspectives shows an ATI Radeon HD 6990 card drawing up to 404W, and a newer generation but roughly similar HD 7970 up to 273W. Therefore your 350W estimate for total system power could be a bit low, especially when 2-4 video cards are run together with SLI or Crossfire.

[Image: power_maximum.gif - maximum video card power consumption chart]
 
Last edited:

westom

Senior member
Apr 25, 2009
517
0
71
The following graph from PC Perspectives shows an ATI Radeon HD 6990 card drawing up to 404W, a newer generation similar HD 7970 drawing 273W.
Average, maximum, or peak? Even a Pentium rated at a maximum of 70 watts might sometimes draw well in excess of 100, and average less than 50 - well in excess of what its power supply is spec'd for. But then even power supplies can output significantly higher power for that short period.

Where do those numbers come from? Are they numbers we tell computer assemblers so they will buy PSUs that are double what is needed? If those numbers are correct (not doubled), then they were confirmed by a physical measurement (ie a Kill-A-Watt). In that other thread, people did actual measurements with a Kill-A-Watt to discover power consumption was typically half what they had been told. IOW they did what we were all taught to do in junior high science.

Even a 100 watt light bulb can draw well over 600 watts for a short period. Does that mean the circuit with five 100 watt light bulbs must be on a >25 amp circuit breaker?

More reasons why we need numbers. A 300 watt PSU in a brand name computer is electrically equivalent to one sold to computer assemblers rated at 425 watts. Did they lie? Of course not. Understanding why two equivalent PSUs with different watt numbers are the same means always doing the numbers to learn what those numbers really mean. Another example of why informed computer assemblers ignore watts and instead read the current for each rail.

OP's questions could be answered with relevant numbers IF additional information is provided. A 600 watt UPS should be more than sufficient for a computer. Of course, another asked another important question. What else was powered by a UPS? Or is it something completely different - a surge protector?
 

mfenn

Elite Member
Jan 17, 2010
22,400
5
71
www.mfenn.com
Pot:

It's remains a common problem. Even says why so many 'knew' Saddam had WMDs when quantitative facts said otherwise. Or why no engineer would recommend launching the Challenger. So people who ignore numbers murdered seven astronauts. Subjective reasoning said why (in the 1950s) the majority of Americans knew smoking cigarettes increases health. A majority do not know that numbers are always required for an honest (and believable) answer.

Most conclude using subjective reasoning and their emotions. Most completely forget what was taught even in junior high science - how to know something. Honest answers always include perspective. That means numbers.

Meet kettle:

Where do those numbers come from? Are they numbers we tell computer assemblers so they will buy PSUs that are double what is needed? If those numbers are correct (not doubled), then they were confirmed by a physical measurement (ie a Kill-A-Watt). In that other thread, people did actual measurements with a Kill-A-Watt to discover power consumption was typically half what they had been told. IOW they did what we were all taught to do in junior high science.

I think you two will get along swimmingly.

In case you need more numbers to disprove your "Good luck finding the system that actually draws that much" claim: Here ya go. Power consumption for a 7970 system while playing BF3 as measured by a KAW is 360W. Hardly an uncommon setup or workload. A KAW doesn't report sub-second transients, so this is a realistic number that the PSU will have to output.
http://www.anandtech.com/bench/GPU13/598
 

bryanl

Golden Member
Oct 15, 2006
1,157
8
81
Average, maximum, or peak? Even a Pentium rated at a maximum of 70 watts might sometimes draw well in excess of 100, and average less than 50 - well in excess of what its power supply is spec'd for. But then even power supplies can output significantly higher power for that short period.

Where do those numbers come from? Are they numbers we tell computer assemblers so they will buy PSUs that are double what is needed? If those numbers are correct (not doubled), then they were confirmed by a physical measurement (ie a Kill-A-Watt). In that other thread, people did actual measurements with a Kill-A-Watt to discover power consumption was typically half what they had been told. IOW they did what we were all taught to do in junior high science.
I found the video power consumption chart at PC Perspective (not PC Perspectives as I wrote earlier), but those and similar charts are available at several other review sites, including Tech Power Up:

http://www.techpowerup.com/reviews/AMD/HD_7990/23.html

The definitions of peak power and maximum power in those tests are unconventional, since their peak is never as high as their maximum. They measure maximum power, likely averaged over at least a few seconds, by running Furmark, a program known for making video cards run their hottest.

Even a 100 watt light bulb can draw well over 600 watts for a short period. Does that mean the circuit with five 100 watt light bulbs must be on a >25 amp circuit breaker?

More reasons why we need numbers. A 300 watt PSU in a brand name computer is electrically equivalent to one sold to computer assemblers rated at 425 watts. Did they lie? Of course not. Understanding why two equivalent PSUs with different watt numbers are the same means always doing the numbers to learn what those numbers really mean. Another example of why informed computer assemblers ignore watts and instead read the current for each rail.

OP's questions could be answered with relevant numbers IF additional information is provided. A 600 watt UPS should be more than sufficient for a computer. Of course, another asked another important question. What else was powered by a UPS? Or is it something completely different - a surge protector?
The power numbers show that you're probably wrong about a 350W high quality power supply being adequate for all PCs, but you're likely right about the 600W UPS.
 

westom

Senior member
Apr 25, 2009
517
0
71
The power numbers show that you're probably wrong about a 350W high quality power supply being adequate for all PCs, but you're likely right about the 600W UPS.
Even those worst case numbers say a 600 watt UPS provides sufficient power. That remains the relevant point. To say more means the OP must provide additional facts. Those facts may identify one poster as having answered correctly, and would explain why a 500 watt UPS also provided sufficient power. But nobody can answer definitively without facts from the OP.

We know a 600 watt UPS should be sufficient. A 600 watt surge protector would violate UL and other safety codes.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
a 600 watt UPS provides sufficient power.

As VirtualLarry has been trying to explain to you, the OP does NOT have a 600W UPS (based on the OP's description).

(I did NOT know this, until I looked it up for this thread).
Apparently, the VA rating is the apparent-power output capacity, which combines real output power and power factor.

Explanation

Quotes below are from above link:

What is the difference between Voltage-Amps (VA) and watts and how do I properly size my UPS?

The power in Watts is the real power drawn by the equipment. Volt-Amps are called the “apparent power” and are the product of the voltage applied to the equipment times the current drawn by the equipment. Both Watt and VA ratings have a use and purpose. The Watt rating determines the actual power purchased from the utility company and the heat loading generated by the equipment.
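To make the quoted distinction concrete, here is a minimal sketch. The 0.65 power factor is an assumed figure for a PC supply without power factor correction, not a number from the linked explanation.

```python
def apparent_power_va(volts, amps):
    """Apparent power: voltage applied times the current drawn."""
    return volts * amps

def real_power_watts(volts, amps, power_factor):
    """Real power: apparent power scaled by the load's power factor."""
    return volts * amps * power_factor

print(apparent_power_va(120, 5))       # 600 VA
print(real_power_watts(120, 5, 1.0))   # 600.0 W: resistive load, VA == W
print(real_power_watts(120, 5, 0.65))  # 390.0 W: assumed non-PFC PC load
```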

I'm stopping arguing with you, and leaving the OP thread alone.
TL;DR I'm the terminator, really. "I'm Back!".
 
Last edited:

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
Correction to my post above. If the OP's power supply has power factor correction, then that moves the VA and watts ratings closer together.

The better PSUs can achieve (from memory) a power factor of something like 92%, but I have no idea what PSU the OP has, so I don't know the value. Possible UPS output waveform distortion could worsen the power factor, in theory anyway, since the PSU is expecting a nice, clean, relatively perfect sine wave (AC) waveform.
 

westom

Senior member
Apr 25, 2009
517
0
71
As VirtualLarry has been trying to explain to you, the OP does NOT have a 600W UPS
He is playing number games with what was obviously irrelevant, since a 600 watt system would toast bread. VA is a function of the load - not the UPS. He should have known that; a second reason to ignore his posts. It does not matter if a battery backup is 600 VA or 600 watts - it still provides sufficient power to his computer, as did a 500 watt or VA battery backup. Just a third reason why his posts about 600+ watts were obviously bogus, and best ignored.

Why post the same irrelevant arguments rather than address the OP's problem? But that requires information he has not provided. As you admit, those recommendations exist without relevant facts. We know a 600 battery backup should be sufficient - for three reasons. In every post I keep returning to another relevant fact: we need more information from the OP - and quiet from the naysayers who only create fear and confusion.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
We need more information from the OP

I definitely agree on THIS!

Also, depending on the OP's exact specifications, I agree the VA and Watts could be very close; without knowing more, it is difficult to argue against. Or they could be somewhat different, but without the OP's info, we are stuck.
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Even a 100 watt light bulb can draw well over 600 watts for a short period. Does that mean the circuit with five 100 watt light bulbs must be on a >25 amp circuit breaker?

A 600 watt surge protector would violate UL and other safety codes.

You talk a lot about power, but you seem to be missing out on some fundamentals.

600W AC, at 120V, would be only 5 A. Not 25 A. (Edit: Nevermind, now I get your point, if there were five bulbs, and each took 600W (5A), then all five would take 25A peak.)
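In code, that arithmetic is just I = P / V, assuming a 120 V line and a resistive load:

```python
def current_amps(watts, volts=120.0):
    """Current drawn by a resistive AC load: I = P / V."""
    return watts / volts

print(current_amps(600))      # 5.0 A for one 600 W inrush peak
print(5 * current_amps(600))  # 25.0 A if five such bulbs peaked at once
```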

And yes, plenty of basic model surge protectors handle 600W just fine while still being UL-listed and not bursting into flames. (I still stand by my statement on this; I know someone running an air conditioner on a surge bar. Most are rated for 15A @ 120V.)

I feel that you are unqualified to comment on the subject matter at hand, given those statements of yours.

And yes, I've measured PC power consumption with a Kill-A-Watt, far in excess of 300W. And no, it doesn't toast bread, nor take my skin off when touching the heatsink.

Why? Because of this little thing called physics, and effective heat dissipation.

Perhaps you should consider studying it some time. And possibly learning more about gaming PCs, video cards, power consumption, and UPSes.
 
Last edited: