A night from hell in the ER last night


TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
Originally posted by: preslove
Do us all a favor and tell us what hospital you work at so we can avoid it. Not that you're not a great nurse, but your bosses must be stupid :D

Hey, we provide damn good care despite the little computer hiccup. :p

Anyway, the system works quite well; the only times it's gone down were planned downtimes, plus this one power failure. In 2 years of operation, that's not too bad.


And just a note on screw-ups: they happen everywhere.

Hell, July of last year the hospital behind mine (my medical center is 5 hospitals built and interconnected with tunnels) lost power when the electric company physically cut the main line to the hospital by accident. The backup generators kicked in as planned, and then the electric company, while trying to fix the main power line, shorted the backup generators too. That was a hellish 8 hours or so. My hospital had to mobilize to move ALL the patients on life support from that hospital to mine, without elevators, through the tunnels, over half a mile, manually hand-bagging/ventilating the patients. But we didn't lose any patients, so it all turned out good in the end!
 

Danman

Lifer
Nov 9, 1999
13,134
0
0
Originally posted by: NuroMancer
Originally posted by: slag
Originally posted by: NuroMancer
Originally posted by: slag
I find it funny that the majority of people here probably haven't set foot in a datacenter, let alone had a role in helping plan, engineer, or implement a redundant datacenter, RTSC, or even a lab for HVAC/network/power, and yet they still know everything there is to know about everything.

A job I worked at had a major power failure one night during a systems test. It seems the circuit on the backup grid blew during failover. This circuit had been tested by an engineering firm and signed off as OK before being put into use, and it still failed. I don't remember all the specifics, but all the T's were crossed, all the I's dotted, and still things didn't work as planned.

My point is, while you depend on your equipment to be ready to work, when it isn't, sometimes it's no one's fault.

Everything fails eventually and sometimes shit just happens.

That's why you build redundancy, and you test it...

That way, when shit blows up, it's no one's fault, and the DC still doesn't go down.

Again, my point is, we could have 10 or even 100 successful tests, and the one time it's needed to work, something can fail. Everything mechanical fails eventually.

You're right, I agree. My point is that you are calling people out over criticism of the design of the DC, which apparently, probably due to budget, lacked redundancy.

The reason for dual-redundant or greater systems is simple: you reduce the chance that it will fail when you need it.

You are absolutely right, but you still don't understand how IT works. We get our projects approved by people who don't have a clue what IT is. Even though we may put it in layman's terms and stress how important it may be, a project still may not get approved or funded.

People have the general mentality that it's always the IT folks screwing something up.
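To put rough numbers on the redundancy point above - a quick back-of-the-envelope sketch in Python, using a made-up 5% failure-to-start rate, not figures from any real datacenter:

# Back-of-the-envelope: chance that backup power fails when called on.
# Assumption: each generator independently fails to start on demand 5%
# of the time. Real failures are often correlated (shared fuel, shared
# switchgear, one utility crew shorting everything), so treat the
# multiplied-out numbers as optimistic lower bounds.

p_fail = 0.05          # assumed per-generator failure-to-start probability

single = p_fail        # one generator, no redundancy
dual = p_fail ** 2     # two independent generators
triple = p_fail ** 3   # three independent generators

print(f"single generator : {single:.4%} chance of an outage on demand")
print(f"dual redundant   : {dual:.4%}")
print(f"triple redundant : {triple:.4%}")

Even that naive model shows why the second generator is worth arguing for in the budget meeting: 5% on demand becomes 0.25%.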
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
I've seen all sorts of weird IT decisions in hospitals - at least they seem weird to me, as someone who knows a bit about IT but doesn't work in the field.

I once visited a hospital, looking for a job, which boasted the 'biggest and most sophisticated PACS installation in Europe', with hundreds of PC terminals. PACS is a system for digital X-rays/scans/etc.: instead of film, the images are stored and viewed digitally. As part of the tour before I applied, they thought it would be a good idea to show it off, and this included a quick tour of the 'datacenter' - a large closet which contained a massive RAID array (dozens of TB, easily), a massive DVD-R jukebox the size of a filing cabinet but nearly 6 feet tall, and a huge rack of servers and network infrastructure to handle all the queries, real-time compression, and data flow. It was immediately obvious that the entire infrastructure was in one room, including the backup system. As this held the only copy of X-ray images, etc. for hundreds of thousands of people, I thought it was a little presumptuous not to house the backup separately, or at least have redundant servers at another site.

I've also worked at hospitals where, even 2 years ago, simple things like power glitches or routine generator tests would shut the entire IT system down for hours. I remember one Friday morning when a planned generator test was performed - everywhere in the hospital experienced a 10-second power drop, and the problem was that when things came back on, there was no IT. You couldn't even log into desktop PCs for nearly 6 hours - 'Unable to authenticate with the domain' or some such error. Then, once the domain had come back up, all the 'floor-facing' servers that provided things like laboratory results to the doctors/nurses had lost their databases. It took another 24 hours for the web servers to resync with the individual laboratory servers. No idea what actually happened, but if there were UPS systems, they didn't work.

Of course, this is the same hospital that doesn't connect its elevators to the generator, because the generator is too underpowered - and it's a 20-floor tower block! (Well, the elevators descend to the ground safely in the event of a power outage, but that's all they do.) Want to transfer a patient to the OR urgently? Nope. Need an urgent X-ray with a portable machine? Nope. No elevators to bring the X-ray unit to the patient.

 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Well, a major IT system crashed at my hospital today. It's the radiology department's computer system, which handles the booking and results of X-ray, ultrasound, CT, and MRI examinations, etc.

Got in at 8:30 and found that it wasn't working. You couldn't record any details for a request or, more importantly, record the results, although most of the rest of the functions worked. As you might imagine, it was chaos. Thankfully, we still had some old dictation machines and cassette tapes put away gathering dust, so they were rescued so that urgent results could go to the secretaries for typing. Other people had to resort to handwriting results or giving verbal results over the telephone.

No idea what the problem was, but it took about 8 hours to fix. Just guessing, I think it might have been some form of file system corruption - if you tried to save a result, it would just give some error message like 'Unable to save because the directory does not exist' or something like that.
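For what it's worth, that kind of error is exactly what you see when software blindly writes into a directory it assumes exists. A defensive save path checks first - a generic Python sketch, purely illustrative and nothing to do with whatever the actual radiology system runs:

import os

def save_result(base_dir: str, filename: str, text: str) -> None:
    # Recreate the target directory if it has gone missing (e.g. after
    # file system corruption or a bad restore); without this, the save
    # fails with a 'directory does not exist' style error.
    os.makedirs(base_dir, exist_ok=True)
    with open(os.path.join(base_dir, filename), "w", encoding="utf-8") as f:
        f.write(text)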
 

miniMUNCH

Diamond Member
Nov 16, 2000
4,159
0
0
Anything on Windows is bound to crash someday...

When I was working on control systems, we ran everything we could on QNX on mil-spec industrial PCs... our control system was nothing short of bombproof.
 

OdiN

Banned
Mar 1, 2000
16,430
3
0
Originally posted by: homercles337
Look, IT people SUCK. You would not have had a "night of hell" if IT did their fucking job. Sounds like you have a bunch of fuck-ups in IT if you ask me. I'm a scientist, so this may mean nothing.

Get lost.

I'm an IT person. We had a power outage here; the backup generator kicked on and worked great. Only a few computers even rebooted between the time the power went out and the time the generator was on and providing enough power for the whole building. 95% of the computers stayed on, and all connections stayed active. We did not drop calls and did not lose data.

You don't know what you're talking about.
 

rh71

No Lifer
Aug 28, 2001
52,844
1,049
126
It's basically the one thing the guy was hired to do, and they failed at it.
 

Chadder007

Diamond Member
Oct 10, 1999
7,560
0
0
Originally posted by: DeathBUA
Originally posted by: Xanis
Originally posted by: bwatson283
Originally posted by: Xanis
Your entire infrastructure was fucked because the server room lost power? That seems pretty absurd. Shouldn't the server room have been set up with UPSs and generators?

learn to read

FYI, I can read very well. What exactly are you talking about?

I'll type it again... our entire hospital infrastructure is computerized: EMR, or electronic medical records. We are about 97% paperless, meaning nearly all charting and record-keeping for a patient is kept on a computer (an ER visit before EMR generated about 100-150 pieces of paper charting; now, with computerized charting, an ER visit will generate about 5-8 sheets). Last night the server room lost power, and according to an e-mail I got, the backup generator didn't fire. That probably means the problem was within the server room itself, since the hospital didn't lose power. When the servers went down, so did our infrastructure, since the servers are literally the beating heart of the hospital.

I wish we would go paperless, but they love killing trees too much around here; their poor little eyes can't take looking at a screen.
 

Chadder007

Diamond Member
Oct 10, 1999
7,560
0
0
Originally posted by: homercles337
Look, IT people SUCK. You would not have had a "night of hell" if IT did their fucking job. Sounds like you have a bunch of fuck-ups in IT if you ask me. I'm a scientist, so this may mean nothing.

Ban... there is no need for that language here.
 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
My wife does patient admissions at MUSC, and their servers go apeshit every so often, requiring them to handwrite everything and manually enter it later.

She feels your pain...
 

DrPizza

Administrator, Elite Member, Goat Whisperer
Mar 5, 2001
49,601
167
111
www.slatebrookfarm.com
Wow, you're an optimist. I can almost see it now, the next meeting in the board room:

"All hell broke loose."
"The nurses were going nuts, it was chaos."
"Thanks to our good nurses, though, we made it through the problem unscathed - no one was injured, no one died."
"How much is it going to cost to put in a fail-safe system?"
"A million dollars??!"
"How much did this whole problem cast us to fix, plus overtime for the additional staffing?"
"That's it??"
"Well, it's probably a one time thing. But, if it even happens again, we know our nurses can handle the situation. "
 

GeekDrew

Diamond Member
Jun 7, 2000
9,099
19
81
Originally posted by: RadiclDreamer
Water sensors in the floor have done it for us, they shutdown the ups and take all the servers with them as a safeguard

That's also poor design, unless they detected massive amounts of water. A small leak shouldn't trigger a full-scale shutdown, but it should trigger environmental alarms all over the place. I've worked at several places that have had non-fatal environmental conditions cause premature device shutdowns. Temperature is a problem for all data centers, and little shops generally have more difficulty with it than large ones -- few consider that alarms should sound when the room hits 75 or 80 degrees, but that emergency shutdown shouldn't commence until 90 or so, and only then if an authorized user has not bypassed the alarm.
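Something like this tiered policy is what I mean - a hypothetical Python sketch using the thresholds above, not the logic of any real monitoring product:

ALARM_F = 80      # sound alarms / page staff at this room temperature
SHUTDOWN_F = 90   # begin emergency shutdown here, unless bypassed

def check_temperature(temp_f: float, bypass_active: bool) -> str:
    # Tiered response: alarm early, shut down only as a last resort.
    if temp_f >= SHUTDOWN_F and not bypass_active:
        return "emergency_shutdown"  # gracefully stop servers, then the UPS
    if temp_f >= ALARM_F:
        return "alarm"               # notify operators; no shutdown yet
    return "ok"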

There are tons of reasons that something of this scope could happen, many of which are beyond the control of IT, as has been said previously. I don't think I've seen anyone mention sabotage yet, though. The right disgruntled employee could shut down a datacenter with the push of a button or two.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Originally posted by: RadiclDreamer

Water sensors in the floor have done it for us, they shutdown the ups and take all the servers with them as a safeguard

How much of an issue is water in datacenters?

Someone at my hospital left the door to the datacenter open one day, so I wandered in to take a look. I'd swear that the server racks were water-cooled.

There were huge insulated pipes running down the walls, and behind each rack were some big valves and huge hoses running into the back of the rack, alongside big power and data cables.
 

JEDIYoda

Lifer
Jul 13, 2005
33,986
3,321
126
Originally posted by: Xanis
Originally posted by: bwatson283
Originally posted by: Xanis
Your entire infrastructure was fucked because the server room lost power? That seems pretty absurd. Shouldn't the server room have been set up with UPSs and generators?

learn to read

FYI, I can read very well. What exactly are you talking about?

things that make you go hmmmmmm
 

MichaelD

Lifer
Jan 16, 2001
31,528
3
76
Originally posted by: DaveSimmons
I'd be wondering why the servers had no UPS systems with alarms, and why there wasn't a working backup generator.

My thoughts exactly. Any kind of mission-critical infrastructure and backbone should be on both a UPS and a backup generator. Especially in a hospital!