
Hospital had generator and UPS failure

I will speak from personal experience at our 3 major Data Centers in CA.

All generators at each data center are run once a month with simulated load tests.

Two of the Data Centers are in the San Diego area. One has true A-B power, with dual UPSs and dual generators, so each generator can take the full load, even in the event of a full UPS failure. If you plugged your equipment in correctly, you should be OK. The second Data Center has 2 generators but only 1 UPS. The primary generator (1 MW) will take the full load of the building; it is a call center, and they want the building up and running even in power outages. We have a second, smaller generator that will only power the Data Center in the event the primary fails.

The importance of regular testing was drilled home several months ago at this Data Center. They were testing the primary generator, had allowed it to run to warm up, and were ramping the load up to 95%. The generator tech heard an awful racket and went to press the emergency shutdown, but it was too late: one of the pistons had basically exploded inside a cylinder, doing fatal damage to the generator. We had to replace the entire thing ($200k), but at least it happened during regular maintenance instead of us discovering the problem when we needed it. We got a rental delivered within 6 hours and cabled it up; a new one is on order. As our manager told us, no one wants to pay for emergency generators until they lose power, and then they are very important.

I think with hospitals it just might be an older mindset about maintenance and maintaining generators. It is looked at as an expense with no benefit. They could probably learn something from Data Center engineers about power infrastructure; they seem to have a much better feel for emergency power systems, and better redundant backups, than hospital facility staff. Designing a system where you have to impact the business in order to properly test a generator is not very smart, because it pushes back regular generator testing.

Does anyone know how many days of run time of fuel a hospital usually keeps on hand for its generators? I have also thought that this could be a weakness in a major event, since fuel could be hard to come by.
 
This sort of happened at my work. A UPS tech put the system in bypass for maintenance and forgot to put it back. Power went out, the generator kicked in, and fried some switches and a server. Cisco made some money that Sunday.

Did you guys implement MOPs (Methods of Procedure) for all work after this event?
 
No, and the UPS tech said he didn't leave it in bypass, even though he had it in bypass when he was working on it two days earlier, on Friday afternoon.

We started implementing MOPs after an incident earlier this year in the San Diego DC. Somehow a tech got ahead of himself while performing maintenance on the UPS and caused the B-side UPS to go offline. Most equipment wasn't affected (A-B power), however some equipment was. After that we started requiring MOPs before any work is approved. I find it helpful to keep everyone on task as to what is supposed to happen when. We even started implementing it now for all change work: servers, network, etc. Originally we kind of grumbled about it, but I find it useful now. I can list out all my steps ahead of time and what is happening when, so at 2am I don't have to try to remember what is supposed to happen when. I just follow the MOP, and it is kind of like a recipe. It really helps prevent mistakes.
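For anyone who hasn't used one: a MOP is basically a pre-approved, step-by-step script for the work. Here's a minimal sketch of the shape of one; every step, name, and time below is a made-up example, not an actual procedure:

```python
# Minimal, illustrative sketch of a MOP (Method of Procedure) as data.
# Every step, name, and maintenance window here is a hypothetical example.
from dataclasses import dataclass, field

@dataclass
class Step:
    number: int
    action: str
    verify: str      # what confirms the step succeeded
    rollback: str    # what to do if it fails

@dataclass
class MOP:
    title: str
    window: str
    steps: list = field(default_factory=list)

    def walk(self):
        # At 2am you just read this out loud, top to bottom.
        for s in self.steps:
            print(f"Step {s.number}: {s.action}")
            print(f"   verify:   {s.verify}")
            print(f"   rollback: {s.rollback}")

mop = MOP(
    title="UPS B-side preventive maintenance",
    window="Sat 02:00-06:00",
    steps=[
        Step(1, "Confirm A-side UPS healthy and carrying load",
             "A-side status normal, no alarms", "Abort the work"),
        Step(2, "Place B-side UPS in maintenance bypass",
             "Bypass indicator ON, load unbroken", "Return switch to normal"),
        Step(3, "Perform vendor maintenance on B-side UPS",
             "Vendor checklist complete", "Close up unit, document"),
        Step(4, "Return B-side UPS to normal operation",
             "Bypass indicator OFF, output voltage normal", "Escalate to vendor"),
    ],
)
mop.walk()
```

Note that step 4's verify line is exactly the check that would have caught the "left in bypass" incident above.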
 
Does anyone know how many days of run time of fuel a hospital usually keeps on hand for its generators? I have also thought that this could be a weakness in a major event, since fuel could be hard to come by.

I don't know how many days it would last, but this hospital had a 400-gallon tank.
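For a rough feel of what 400 gallons buys you: a diesel genset burns very roughly 7 gal/hr per 100 kW at full load. Both that burn rate and the 250 kW load below are assumptions for illustration, not figures from this hospital:

```python
# Back-of-the-envelope runtime for a diesel generator from tank size.
# The burn rate is a rough rule-of-thumb assumption (~7 gal/hr per
# 100 kW at full load), and the load is hypothetical.
tank_gallons = 400
load_kw = 250                      # hypothetical emergency load
gal_per_hr_per_100kw = 7.0         # assumed full-load burn rate

burn_rate = (load_kw / 100.0) * gal_per_hr_per_100kw   # gal/hr
runtime_hours = tank_gallons / burn_rate

print(f"Burn rate: {burn_rate:.1f} gal/hr")
print(f"Runtime:   {runtime_hours:.1f} hours (~{runtime_hours/24:.1f} days)")
# -> about 23 hours on 400 gallons at a 250 kW load, under these assumptions,
#    which is why refueling logistics matter in any multi-day event.
```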
 
The worst night I ever had on the job was when I got the 11 pm call that the generator had failed at our data center and we had 3,000 servers offline.

It turned out that right as grid power failed, lightning struck the generator.

Was this "The Planet" by chance? :biggrin: They had an explosion at one of their data centres years back and had a couple thousand servers offline for like a week. I think it was a switch gear explosion if I recall. Completely random though, I don't think it was a storm or anything.

I can't imagine the stress of dealing with that many servers offline, and customers calling in not too happy (and I don't really blame them).

Though it just goes to show how important multi-location redundancy is if your stuff is that important. Even big SAS 70 Type II certified, whole-nine-yards data centres can go down.
 
Thought I'd post an update to this.

The problem was eventually traced to the generator transfer switch. The manufacturer had just noticed a problem and had issued a recall, but the paperwork was still going through; although the regional health authority had been alerted, it was still preparing to cascade the alert down to individual hospitals after checking whether this particular model of switch was in use.

There was a delay in implementing a fix, as there were multiple problems with the electrical system - notably, that demand was running at 99.7% of the rated capacity of the primary voltage feed to the campus.

The original plan had been simply to ask the power company to upgrade the feed, but management decided to get a specialist EE consulting firm in to advise before agreeing to the $500k price the power company wanted to upgrade the campus feed from 22 kV to 66 kV and upgrade the power transformers.

Anyway, the consulting engineers' report was interesting. The main conclusions were:
1. Single primary voltage feed to the hospital transformer, tee'd off the campus supply. Risk: single point of failure; damage to the feeder cable would result in loss of power to the hospital until repaired.
2. Single feed transformer. Failure of the transformer would result in loss of power until repaired.
3. Single generator transfer switch without bypass. Failure would result in loss of power. The non-bypass design meant that maintenance on the switch could not be undertaken without cutting off all hospital power (including generator-protected circuits).
4. Generator undersized. The generator could only supply 20% of the hospital load. This would be sufficient for lighting, ERs, ICUs, etc., but would not be adequate to run heavy equipment such as CT or MRI scanners, or to provide an adequate supply to the server room (a rough sizing sketch follows after this list).
5. Individual sub-distribution panels had no redundancy. Each panel had a single feed from the main panel, and each region of the hospital had a single panel. In the event of a panel failure, or damage to the feed cable, there was no cross-feed capability, and an affected region would have no power until the panel/cable was replaced.
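To put finding #4 in numbers, here's a back-of-the-envelope sketch of the triage a 20%-sized generator forces. The total load and the per-department kW figures are hypothetical, chosen only to illustrate the shape of the problem:

```python
# Rough illustration of finding #4: a generator sized at 20% of building
# load forces triage of what stays up. All numbers are hypothetical.
total_load_kw = 2000                     # assumed total hospital demand
generator_kw = 0.20 * total_load_kw      # the undersized unit: 400 kW

# Hypothetical priority list: (name, kW), highest priority first.
loads = [
    ("Life safety / egress lighting", 60),
    ("ICU",                           120),
    ("ER / ORs",                      180),
    ("Server room",                   90),
    ("MRI / CT scanners",             350),
    ("HVAC chillers",                 600),
]

remaining = generator_kw
for name, kw in loads:
    if kw <= remaining:
        remaining -= kw
        print(f"KEEP  {name:32s} {kw:4d} kW")
    else:
        print(f"SHED  {name:32s} {kw:4d} kW")
print(f"Headroom: {remaining:.0f} kW of {generator_kw:.0f} kW")
# Under these assumptions, the server room and scanners get shed, which
# matches the consultants' conclusion.
```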

Apparently, this caused quite a bit of discussion at the next board meeting.

They are now planning to rewire the entire building, install a second transformer, have the power company bring in a second feed to the campus, and put full cross-connect dual redundancy in place from the transformers through to the sub-panels. Apparently, the cost is going to be astronomical.

A new 100% load capacity generator is also coming, with the 20% one staying as critical area "ultimate backup".
 
Thanks for the update. Amazing to think redundancy wasn't even thought of (or not thought important enough) for such critical operations.
 
We do weekly generator tests... more than anything, they are tested just so they get run time. Nothing cooks an engine faster than not being used (parts seize). We don't have it rigged to do a transfer, though... My old datacenter used to actually flip the transfer switch during the test. That was fine, but the power did blink every week when it happened. The test was timed on a manual relay that was patched into the system... the only problem was, it actually flipped one time when maintenance had the battery unhooked from the generator for charging. That resulted in everything switching to UPS power and me scrambling to shut down everything I could before the battery power ran out. I didn't have access to override the test. 🙁

For what it's worth, generators and large UPS units don't cost THAT much money as far as infrastructure goes. They're actually pretty affordable if you're not getting ripped off on the installation or contractor overhead. Ordering the temp generator probably costs a large chunk of what replacing the old one would, if you could wait the few weeks/months for it to be delivered.
 
I work for a NOC/central station and we are a UL-certified facility, which means we have to run weekly generator tests and keep logs of them. Every week I throw open the breaker that connects the site to utility street power. The building is on a UPS, which instantly supports the main load. Everything is oversized, and in the event of a generator failure we can run on UPS battery power for 1.5 days. Our generator takes about 40 seconds to start and spin up before it can accept the load. It is a big badass thing too: a straight-six engine straight out of a tractor trailer, something like almost 400 horses, and it is LOUD. In the event of a generator failure, we can support the connection of an emergency roll-up generator that arrives on a flatbed truck; the permanent generator can be isolated and the roll-up gen can power the building.

I'm surprised that for something as critical as a hospital, the generator is not exercised as frequently.
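Two quick sanity checks on the 1.5-day figure and the 40-second start gap from the post above. The load and usable battery capacity below are assumptions for illustration; only the 40 seconds comes from the post:

```python
# Two quick checks on a UPS-bridged generator start, with assumed numbers
# (only the 40-second start time comes from the post above).
load_kw = 30.0                    # hypothetical protected load
usable_battery_kwh = 1200.0       # hypothetical usable bank capacity

# 1) How long can the bank carry the load if the generator never starts?
runtime_hours = usable_battery_kwh / load_kw
print(f"Full-outage runtime: {runtime_hours:.0f} h (~{runtime_hours/24:.1f} days)")

# 2) Energy consumed just bridging a normal 40-second generator start:
start_gap_kwh = load_kw * (40 / 3600)
print(f"Energy to bridge a 40 s start: {start_gap_kwh:.2f} kWh "
      f"({100 * start_gap_kwh / usable_battery_kwh:.3f}% of the bank)")
# Under these assumptions the bank rides out ~1.7 days, and a normal
# transfer barely dents it - which is what "everything is oversized" buys.
```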
 
I'm surprised that for something as critical as a hospital, the generator is not exercised as frequently.

Hospitals are required by CMS and the Joint Commission to run the genny every 30 days, with semi-annual load tests. If there was even a tiny bit of media coverage, I'm betting the hospital in question is almost certainly being reviewed by regulators as I type.

Also, any decent accredited hospital is going to have 96 hours of off-the-grid capability for water, food, meds, and power. Part of basic emergency ops.
 
I'm surprised that for something as critical as a hospital, the generator is not exercised as frequently.

It used to be load tested every 2 months. However, because the transfer to generator power takes 30 seconds, testing proved to be quite disruptive.

Because the generator transfer switch was at the main panel, the only way to switch the generator in was to turn off the main building switch. Anything that wasn't on a UPS (which was basically everything except the ORs/ICU/some servers) had to be shut down in advance. MRI scanners had to be shut down for the test to avoid data loss. Elevators had to be brought down to the ground floor and locked out. Clinics wouldn't have access to IT services, as office PCs/networks were not a "generator priority".

Effectively, it meant the hospital had to shut down everything non-emergency, and management couldn't tolerate that half-day of lost productivity every 2 months.
 
It used to be load tested every 2 months. However, because the transfer to generator power takes 30 seconds, testing proved to be quite disruptive.

Because the generator transfer switch was at the main panel, the only way to switch the generator in was to turn off the main building switch. Anything that wasn't on a UPS (which was basically everything except the ORs/ICU/some servers) had to be shut down in advance. MRI scanners had to be shut down for the test to avoid data loss. Elevators had to be brought down to the ground floor and locked out. Clinics wouldn't have access to IT services, as office PCs/networks were not a "generator priority".

Effectively, it meant the hospital had to shut down everything non-emergency, and management couldn't tolerate that half-day of lost productivity every 2 months.

We are in the same situation as the hospital, except that to maintain our UL certification we needed to do a hard cut of utility power and run that weekly generator test under load.

Basically everything critical is connected to a huge UPS - the entire building, even power-hungry items like the AC compressors. We run generator tests every week and have added individual UPS devices to units that do not draw power from the protected electrical circuits.

We also run a UPS-only test that is not disruptive: it shuts off the rectifier in the UPS, which stops charging the batteries and powers the load from the battery bank. We do this test every week as well, usually after a generator test.
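For the curious, the shape of that rectifier-off test looks roughly like this. The read_bank_voltage() and set_rectifier() functions are hypothetical stand-ins for whatever your UPS's management interface actually exposes, and the thresholds are assumptions:

```python
# Illustrative shape of a rectifier-off UPS test: power the load from the
# battery bank, watch bank voltage, and abort before it sags too far.
import time

LOW_VOLTAGE_ABORT = 430.0   # assumed abort threshold for a ~480 V bank
TEST_MINUTES = 10

def read_bank_voltage():
    return 470.0            # stub; a real test reads the UPS here

def set_rectifier(on: bool):
    print(f"Rectifier {'ON' if on else 'OFF'}")   # stub for real controls

set_rectifier(False)                 # load now runs from the battery bank
try:
    for minute in range(TEST_MINUTES):
        v = read_bank_voltage()
        print(f"t+{minute:02d} min: bank at {v:.1f} V")
        if v < LOW_VOLTAGE_ABORT:
            print("Voltage sagging; aborting test early")
            break
        time.sleep(1)                # 1 s per 'minute' so the sketch runs fast
finally:
    set_rectifier(True)              # always restore charging
```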
 
Another update: rewiring work is now well underway, the new substation building is being built, contracts with the power company have been signed, and the power co has ordered the new service transformers.

However, the redesign work has added a significant delay.

As the temperature hit 90 degrees today, and load on the sole service transformer hit 110% of maximum, management had all the AC turned off at the panels and banned the use of non-essential electrical equipment such as ventilation fans, in order to prevent the transformer from melting down.

It's going to be a long, hot Summer.
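For scale, here's what 110% of nameplate means in numbers. The transformer rating below is an assumption; only the 110% figure comes from the update:

```python
# What running at 110% of nameplate looks like in kVA terms. Only the
# 110% figure comes from the update; the rating below is an assumption.
nameplate_kva = 2500.0                 # hypothetical service transformer
load_kva = 1.10 * nameplate_kva        # today's peak

excess_kva = load_kva - nameplate_kva  # how much must be shed to reach 100%
print(f"Load: {load_kva:.0f} kVA on a {nameplate_kva:.0f} kVA transformer")
print(f"Must shed {excess_kva:.0f} kVA just to get back to nameplate")
# AC and ventilation are the obvious levers: cooling is usually the
# largest block of discretionary summer load.
```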
 
No shit. I work for a small company and we test our generators weekly. You would think that a hospital would have its shit together.

This. Every place I know tests them at regular intervals; a hospital with an ICU should definitely do it... I think it's negligence, and heads would roll.
 
Another update: rewiring work is now well underway, the new substation building is being built, contracts with the power company have been signed, and the power co has ordered the new service transformers.

However, the redesign work has added a significant delay.

As the temperature hit 90 degrees today, and load on the sole service transformer hit 110% of maximum, management had all the AC turned off at the panels and banned the use of non-essential electrical equipment such as ventilation fans, in order to prevent the transformer from melting down.

It's going to be a long, hot Summer.

Thanks for the update.

With the new 100% capacity generator, are they planning on getting an artificial load bank tester to go with it? We have one at our IT Data Center that generates 200 kW of simulated load on the generator, so we can verify the generator's ability to take a load without having to cut over to generator power.

http://www.csemag.com/home/single-article/load-bank-testing-ensures-performance-reliability/78a658bfd0e96c9542373bf983f028d6.html
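A typical load bank test just steps the bank up in stages and holds while logging engine and electrical readings. A sketch of that profile follows; step_load_bank() is a hypothetical stand-in for the bank's controls, and the hold times are compressed so the sketch runs quickly:

```python
# Illustrative load-bank test profile: step the resistive bank up in
# stages and hold, logging what a real test would record.
import time

GEN_RATED_KW = 200
STAGES = [0.25, 0.50, 0.75, 1.00]     # fraction of rated load per stage
HOLD_SECONDS = 2                       # a real test holds 15-30 min per stage

def step_load_bank(kw):
    print(f"Load bank set to {kw:.0f} kW")   # stub for real controls

for frac in STAGES:
    step_load_bank(frac * GEN_RATED_KW)
    time.sleep(HOLD_SECONDS)
    # A real log captures voltage, frequency, coolant temp, oil pressure here.
step_load_bank(0)
print("Test complete; generator never had to carry the building load.")
```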
 
We have consciously chosen to augment centralized power backup systems with local power backup systems. Laptops are chosen over desktops in many cases because of their built-in battery backup; those are augmented by under-the-desk UPS systems; the switches/routers in many closets in critical areas have their own battery backup systems; those are fed by a large, central UPS; and the whole thing is backed up by generator.

I've learned a few hard lessons over the years, but the most critical is that a generator typically has a battery of its own to crank the engine. That battery needs to be checked regularly: even if you run a generator test at regular intervals, it WILL die when you need it most if you haven't checked that critical sub-component.
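The check itself can be as simple as trending the cranking battery's resting voltage against a float threshold. A sketch, assuming a 12 V lead-acid start battery; read_battery_volts() and the thresholds are illustrative stand-ins:

```python
# Sketch of the start-battery check described above: compare the measured
# cranking-battery voltage against rough float/alarm thresholds.
FLOAT_MIN_V = 12.6     # healthy 12 V lead-acid battery at rest/float
ALARM_V     = 12.0     # below this, cranking on a cold start is doubtful

def read_battery_volts():
    return 12.2        # stub; a real check reads a meter or the charger

v = read_battery_volts()
if v >= FLOAT_MIN_V:
    print(f"{v:.2f} V: battery OK")
elif v >= ALARM_V:
    print(f"{v:.2f} V: degraded; schedule replacement")
else:
    print(f"{v:.2f} V: FAIL; generator may not crank when needed")
```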
 
Another update: rewiring work is now well underway, the new substation building is being built, contracts with the power company have been signed, and the power co has ordered the new service transformers.

However, the redesign work has added a significant delay.

As the temperature hit 90 degrees today, and load on the sole service transformer hit 110% of maximum, management had all the AC turned off at the panels and banned the use of non-essential electrical equipment such as ventilation fans, in order to prevent the transformer from melting down.

It's going to be a long, hot Summer.

How the hell is a hospital built w/o redundancies?!
A single electrical feed?!?
And no one did anything when usage went over 90%?! 99.7% usage... geez!

Wouldn't it be cheaper to build a new hospital with the recommended redundancies, and leave the old hospital for non-critical care?

And why are you still working there? Sounds like mgmt is being penny wise, pound foolish.

Has anyone been fired for this? Has the hospital been fined? Have regulators gotten on them?
 
I did some contract work at a corporate HQ in the Detroit area. They related that when the big blackout of 2003 hit, they did have backup power, but if I remember right, the water system in the area took a massive dump, which basically doomed them for cooling purposes.

That was after they spent a day or two ferrying in diesel fuel to keep the generator running (and yes, it was the IT guys ferrying it in personally 😀).
 