
so, entire Interweb network was down at work

brainhulk

Diamond Member
450-bed hospital, down for about 4 hours. Some monkeys were preparing for scheduled maintenance over the weekend when they effed up the entire network.

Is it possible to have a backup network connection that can take over in case of emergencies like this? For example, we have backup generators for power failures. Do other hospitals have redundant networks in place? Is there such a thing as having a redundant backup network?

It was pretty much chaos for those 4 hours because there were no hard-copy backups of the documents and protocols stored on network drives.
 

I mean, I guess you could, but it would probably be a giant, expensive PITA. I know one of the hospitals around here has a paper-based process for when the network goes down. Not sure how charts and things like that work, but they have carts with procedures/paperwork/medicine that they break open if the network goes down. (They rotate the medicine out before its expiration date so it doesn't have to be disposed of.)

Side note - a fairly large organization lost power to their data center this weekend when someone threw the wrong breaker during power maintenance. It was only off for 45 seconds or so, but a ton of stuff that doesn't like being powered off suddenly was powered off suddenly. Lots of money spent on a robust power system, supposedly to avoid the heat and dollar cost of UPSes...
 
yeah, no backup hard copies. I was searching every drawer in my department to find hard copies of protocols. Nothing! I don't know what the nurses did, because the entire patient chart is online now. They were calling me asking if I knew the last time something was given. I was like, wtf, how the hell would I know?

I could only imagine the IT guys sweating out those 4 hours. Wow. Will heads roll?

yeah, I have a friend who works in the IT department, and I came to visit him just to see wtf happened. As we were joking around saying everybody is fucked... the VP called at 1 am. My other friend put her on speakerphone to update her while we were joking and cussing around. The first guy said, "did you just put the VP on speakerphone while we were talking shit?" lol... I walked out after that.
 
The Dept of Justice's internet, email, special programs on servers, etc., were down for two days this week.
May help explain why there is no justice.
 
lol.
 
I would think that hard copies for offline use would be a required part of hospital regulation per JCAHO? At my facility, we have hard copies of protocols in case the network goes down. Additionally, we have a small separate "downtime" network that houses some read-only patient chart information and can provide lab data, etc.
 
yeah, I wrote on my report that hard copies of our protocols were needed and not available.
I would think a dedicated workstation that automatically copies an image of the network drive to its local drive would be something simple to set up. Let's see if they can figure this out.

as for the patient chart, idk. I know they print hard copies for scheduled downtime. Since this was unexpected, I'm not sure what they had available. As soon as my relief came in, I said, "shit hit the fan, good luck!" and then I got out of dodge.
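The "dedicated workstation that mirrors the network drive" idea is genuinely simple to script. A minimal sketch of what that scheduled job might look like, assuming a mapped share and a local folder (both paths hypothetical) and using only the Python standard library:

```python
# Hedged sketch: a job (run hourly via Task Scheduler or cron) that mirrors
# a network share to a local drive so protocols stay readable during an
# outage. The source/destination paths are placeholders, not real ones.
import filecmp
import shutil
from pathlib import Path

def mirror_share(src: str, dst: str) -> list[str]:
    """Copy every file under src into dst, preserving the directory tree.

    Returns the relative paths actually copied; files that are already
    identical on the destination are skipped, so repeat runs stay cheap.
    """
    copied = []
    src_root, dst_root = Path(src), Path(dst)
    for f in src_root.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(src_root)
        target = dst_root / rel
        target.parent.mkdir(parents=True, exist_ok=True)
        # Deep-compare so a changed protocol document gets re-copied even
        # if its size happens to match the stale local copy.
        if not target.exists() or not filecmp.cmp(f, target, shallow=False):
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(str(rel))
    return copied
```

For example, `mirror_share(r"\\fileserver\protocols", r"C:\offline\protocols")` on a schedule would keep a local snapshot current; a real deployment would also want logging and an alert when the share is unreachable.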
 
Even when we have scheduled downtime (generally overnight), the hospital basically stops functioning. This is an 800-bed community facility, just for reference.
 
D: Someone could have died from that. That's seriously not acceptable.
 
You'd be amazed at how many incompetent IT people are put in positions to do stuff that's way above their pay grade and expertise. Then again, lots of companies try to cut corners by hiring someone less qualified for the job just to save a few grand, at the expense of hours of downtime that could cost said company hundreds of thousands of dollars.
 
I'm not overwhelmingly surprised by the IT situation; hell, I've personally caused outages that long by accident, and recovered from far worse. But all patient records, including currently administered medications, not having an offline copy? That scares the shit out of me.
 
Yeah, I would think something like that needs to be accounted for during their BCP (business continuity planning) meetings.
 
Used to work IT in a hospital; I've seen all sorts of interesting stuff like that. One issue with hospital networks is that there are too many hands in the pot, and too many legacy systems, at least at ours. You've got IT, vendors, etc. In our case, even doctors could add stuff as they pleased, which caused switching loops several times, because they only knew enough to be dangerous. Lots of NT4 stuff too, and SCO Unix; you just hope it never goes down, because nobody knows anything about it and the company that made it is long gone.

Then there is the fact that IT was not allowed to make any changes or do maintenance on production even when it needed to be done, and we had no say about network config, security, etc. It was constant walking on eggshells, and when something minor goes wrong you have to bandage it instead of fixing it properly, so you eventually end up creating a big problem. Everything is a time bomb that you're just trying to delay, not remove. "We should do this," "we should do that" - the answer is always no, even when it's what makes sense. That place was a mess. The asshole IT manager was the primary reason I left; the secondary was that it was too much of a mess and I felt it was too big of a liability.

Funny thing is I make more money now with WAY less stress.

As for a backup network, it's not really feasible; you'd pretty much need two NICs in every PC, and that would just add more complications.
 
Not if it is government unless the IT services are done by contractors. More likely they get promoted. Our network goes out for entire days and there is no consequence to anyone responsible.
 
I'm a retard and figured out how to back up our network drive to the computer at my desk.

This isn't allowed, and everyone seems to wonder how I can keep working while they're scurrying around in a panic once a week when everything breaks.
 
One big barrier in corporate IT is red tape. Something that should be super simple usually turns into something super complicated; the complexity causes it to fail and get even more complicated, and it's a snowball effect.
 
I have some friends who work in hospital IT, and every single hospital sounds like an absolute mess of insanity, both for IT and for power. Not a job I would ever, ever want.

If places were willing to modernize: VDI is pretty good these days, SAN tech is great, virtualization blades with load balancing are great. Buy a bunch of Tesla Powerwalls for local emergency backup alongside the diesel generators that never get properly tested anyway, get something like a ShoreTel system that has local boxes for backup calling, etc. The tech is there, and it's fairly easy to implement; it just costs money. They could make the problem go away if they'd just invest, but generally they don't.
 
Sure, but how cost-effective it is depends on which single point of failure you want to eliminate.

For example, if your data is stored in the cloud somewhere, you'd need your building to have redundant WAN connections. That's fairly straightforward, and only really costs the ISP charge. (You jack it into your existing router and set up WAN failover. Easy-peasy.)

But if your data is stored in the building and your outage was due to, say, a core switch failing or a router getting hosed? Not so easy. (You can have largely redundant switching fabrics, but unless you've got dual NICs and dual wiring to each workstation, at some point there's going to be a device that will put 24-48 of your workstations offline if it dies.)

If you're just talking about a network drive being unavailable, then it's quite possible somebody just fucked up your file server or a Windows Server thing (somebody mentioned SSO upthread.) Redundant file servers are actually pretty doable.
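Router vendors ship that WAN-failover behavior natively, but the decision logic itself is tiny. A toy sketch of it, with made-up gateway addresses (documentation ranges, not anyone's real ISP) and the probe made injectable so the logic can be exercised without a network:

```python
# Hedged sketch of dual-WAN failover: probe the primary uplink, and fall
# back to the secondary only when the primary stops answering. Gateway
# addresses are placeholders; a real router does this in firmware.
import subprocess

PRIMARY_GW = "203.0.113.1"     # hypothetical primary ISP gateway
SECONDARY_GW = "198.51.100.1"  # hypothetical backup ISP gateway

def gateway_alive(gw: str, probe=None) -> bool:
    """One reachability probe; `probe` is injectable for testing."""
    if probe is None:
        # Default: a single ping with a 2-second timeout (Linux flags).
        probe = lambda g: subprocess.run(
            ["ping", "-c", "1", "-W", "2", g],
            stdout=subprocess.DEVNULL,
        ).returncode == 0
    return probe(gw)

def pick_gateway(probe=None) -> str:
    """Prefer the primary link; fail over only when it is unreachable."""
    return PRIMARY_GW if gateway_alive(PRIMARY_GW, probe) else SECONDARY_GW
```

A real implementation would also flap-dampen (require several consecutive failures before switching, and several successes before switching back) so a single dropped ping doesn't bounce the office between ISPs.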
 
Simple rebuild.

Some millionaire chick hired me as a network manager when I was 18... I crashed that whole database real nice. Deleted a few drive paths, wrekt all the network IPs, sent that whole company to shit for at least 3 weeks 😎
 