Saw an interesting program today. Does HIPAA extend to programmers?

Mark R

Diamond Member
Oct 9, 1999
I came across an interesting application recently that seemed to break every rule in the book when it came to security.

I first noticed something was up when I was waiting for the app to run a basic database search (which could take several minutes, during which time the app froze and became unresponsive, with the window not redrawing). I was idly clicking through Explorer windows on the client PC and came to the C:\ directory. There was something interesting there: a file called 'usercache.txt'.

I loaded it up. It was a CSV file that was fairly obviously a cached copy of the 'users' database table for the business app. It included everything: user role, admin privileges, login name, real name, password ("encrypted"), entire password history, etc.

Examining the passwords, it was painfully obvious that the encryption was crude. Just eyeballing the file, and my own password history, it was fairly obvious it was some type of polyalphabetic substitution cipher (a.k.a. Vigenère cipher, but extended to include numerals). As a lot of people had the 'default' first password (password) at the start of their history, it soon became apparent that the encryption 'key' was the username plus a fixed offset.
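For the curious, here's roughly why a scheme like that falls apart the moment a single plaintext is known. This is just a minimal Python sketch under my own assumptions - the alphabet, the fixed offset and the exact key derivation are guesses, not the product's actual code:

Code:
# Minimal sketch of an extended-Vigenere scheme like the one described above.
# The alphabet, the offset and the key derivation are assumptions, not the real code.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789"

def shift_text(text, key, direction):
    # add (direction=+1) or subtract (direction=-1) the key stream, mod alphabet size
    out = []
    for i, ch in enumerate(text):
        k = ALPHABET.index(key[i % len(key)])
        out.append(ALPHABET[(ALPHABET.index(ch) + direction * k) % len(ALPHABET)])
    return "".join(out)

def encrypt(password, username, offset=3):        # offset=3 is a made-up value
    key = "".join(ALPHABET[(ALPHABET.index(c) + offset) % len(ALPHABET)]
                  for c in username)
    return shift_text(password, key, +1)

# Because so many password histories start with the default password, an attacker
# can recover the key stream simply by subtracting the known plaintext:
cipher = encrypt("password", "jsmith")
key_stream = shift_text(cipher, "password", -1)   # ciphertext minus plaintext = key
print(cipher, key_stream)                         # key stream is 'jsmith' shifted by the offset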

Why this was cached locally instead of just running an SQL query against the Oracle server, I don't know. Maybe it was for performance reasons.

Anyway, I tinkered around further. This time I just browsed the network in Explorer and found the app server. Clicked on it and opened the shared drive. (Initially, I was somewhat surprised to find that I could log on freely as guest and see lots of apparently interesting files. But then, the app ran fine under the Windows guest account, so I guess this was deliberate.)

The files on the totally open app server share were quite interesting:
Text files containing conversations from the app's integrated private-messaging function, and .wav files containing everyone's voice messages and voice memos.

However, much more interesting than this was the [applicationname].ini file on the root of the file share.
This contained the very interesting segment:
Code:
[ClientDatabaseConnectionString]
Data Source=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=EPRORA)(PORT=1521)))(CONNECT_DATA=(SERVER=DEDICATED)));User Id=SYSTEM;Password=D7g8Yj2-+f;

Hmm. I wonder what that could possibly do.
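In case it isn't obvious why that's so bad: SYSTEM is the Oracle DBA account, and the string sits in cleartext on a share the guest account can read. A quick sketch of what any machine on the network could then do with it (Python with the python-oracledb driver assumed; the service name is my guess, and the password is of course the changed one from above):

Code:
# Hedged sketch only: assumes the python-oracledb driver is available; the service
# name "EPR" is invented, and the credentials are the altered ones quoted above.
import oracledb

conn = oracledb.connect(user="SYSTEM", password="D7g8Yj2-+f",
                        dsn="EPRORA:1521/EPR")
cur = conn.cursor()
cur.execute("SELECT username FROM all_users")   # ...or GRANT, ALTER, DROP anything
print(cur.fetchall())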

Anyway, my interest was piqued at this point, so I thought I'd try a more directed approach. I popped in a USB stick and copied the app .exe file to it. Took it home and threw it at some decompilers.

The findings here were just as interesting:
1. All the database queries were strings in the application source. There were no views and no stored procedures.
2. The query strings were built programmatically. Although prepared statements were used appropriately for user-supplied fields, many of the parameters in the .ini file were used to construct the query strings without any escaping (see the sketch after this list).
3. There was no 'runtime' checking of user credentials/rights/etc.; the only check was at the login procedure.
4. The password encryption was confirmed to be a polyalphabetic substitution cipher using the username as the key.

5. Examining the code confirmed the reason for some peculiar data corruption and database inconsistencies that I'd noticed following client crashes. There were no transactions used anywhere in the app. Command batches that updated multiple tables relied purely on serendipity to keep the data consistent.
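To illustrate point 2, here's the difference in a nutshell (a Python/DB-API-style sketch; the table and parameter names are invented, not lifted from the app):

Code:
# Illustrative sketch only - table and parameter names are invented.
def find_patient_safe(cur, surname):
    # user-supplied value passed as a bind variable: the driver handles escaping
    cur.execute("SELECT * FROM patients WHERE surname = :s", {"s": surname})
    return cur.fetchall()

def find_patient_unsafe(cur, ini_table_name, surname):
    # a value read from the .ini file is glued straight into the SQL text;
    # anyone who can edit that world-readable .ini can now inject arbitrary SQL
    sql = "SELECT * FROM " + ini_table_name + " WHERE surname = :s"
    cur.execute(sql, {"s": surname})
    return cur.fetchall()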

And you know the best thing about this system:
It's an electronic patient record system at a major hospital - and it was installed just a few weeks ago!

N.B. Passwords have been changed to protect the guilty.
 

Cogman

Lifer
Sep 19, 2000
:rolleyes:

Wow, I wonder what was (or wasn't) going through the programmers' minds when they wrote this. This is the sort of system that is just BEGGING to be abused.
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
Does HIPAA extend to programmers
Absolutely. I did contract work at an insurance company once years ago.

But I think you're asking if it applies to programs. I'm not sure about that.
 

Modelworks

Lifer
Feb 22, 2007
HIPAA applies to the organization as a whole. It is up to them to make sure that whatever they are doing complies with the rules. A program can reveal everything about a patient in log form on the server as long as that log has safeguards to prevent anyone who uses the software on a daily basis from reading the patient information without permission. Even a simple password like "1234" would be allowed if the only terminal where it could be entered was in a locked room.

HIPAA does little to secure software, unfortunately.
 

beginner99

Diamond Member
Jun 2, 2009
This contained the very interesting segment:
Code:
[ClientDatabaseConnectionString]
Data Source=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=EPRORA)(PORT=1521)))(CONNECT_DATA=(SERVER=DEDICATED)));User Id=SYSTEM;Password=D7g8Yj2-+f;

Hmm. I wonder what that could possibly do.
I assume you actually know very well what you could do with it?

Wow. Besides the fact that the voice memos and so forth probably contain rather confidential stuff, basically any half-witted user could just drop the database. You don't even have to be a hacker...
But first I would take a little tour of that database. You never know what else interesting you might find. Maybe adjust your salary? Just don't get too greedy or they will notice. ;)
 

911paramedic

Diamond Member
Jan 7, 2002
Strange question. I don't think it would actually pertain to the programmer, but if somebody chose to use the program and it left the information unsecured, the client using the software would be answering a lot of questions.

It is up to the provider to ensure that patient info is secure, period. The programmer may have no idea what HIPAA even means or entails.

(I hope I read enough of these posts to understand what's being asked, I think I did.)
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
www.markbetz.net
Strange question. I don't think it would actually pertain to the programmer, but if somebody chose to use the program and it left the information unsecured, the client using the software would be answering a lot of questions.

It is up to the provider to ensure that patient info is secure, period. The programmer may have no idea what HIPAA even means or entails.

(I hope I read enough of these posts to understand what's being asked, I think I did.)

There are some people who are personally liable for HIPAA violations, such as nurses and doctors, but I have no idea whether that extends to hospital IT.
 

EvilManagedCare

Senior member
Nov 6, 2004
Strange question. I don't think it would actually pertain to the programmer, but if somebody chose to use the program and it left the information unsecured, the client using the software would be answering a lot of questions.

It is up to the provider to ensure that patient info is secure, period. The programmer may have no idea what HIPAA even means or entails.

(I hope I read enough of these posts to understand what's being asked, I think I did.)

My guess is the developing company could eventually be held accountable for an app that somehow breaches HIPAA, and that would probably not happen until the problem is traced to the software. The hapless user would likely be dragged over the coals first. I also doubt it pertains to the specific programmer.
 

brandonb

Diamond Member
Oct 17, 2006
My software interacts with hospital systems. What is the name of the software? EPIC by chance?
 

Spaces

Junior Member
Feb 18, 2011
My software interacts with hospital systems. What is the name of the software? EPIC by chance?

I don't think it would be a good idea to release that information to the public. Maybe request a PM instead?
 

Cogman

Lifer
Sep 19, 2000
I don't think it would be a good idea to release that information to the public. Maybe request a PM instead?

There is no problem with someone telling us about shitty software to avoid. I can't see the OP getting in trouble over confirming who the provider is (hopefully he will help some poor hospital IT admin avoid this software company in the future).
 

MrChad

Lifer
Aug 22, 2001
There are some people who are personally liable for HIPAA violations, such as nurses and doctors, but I have no idea whether that extends to hospital IT.

I don't know if individuals would be personally liable, but the organization using the software might be. As someone who develops for clients in the healthcare industry, I can attest to the fact that the big players take personal health information VERY seriously and generally have rigorous standards in place to protect and secure that data. Their practices are audited on a regular basis (which is required by Medicare, I believe), and they can face severe penalties for violations. As developers we have no access to production data at all and all test data has to go through an obfuscation process before it is entered in our system.
 

KIAman

Diamond Member
Mar 7, 2001
I don't know if individuals would be personally liable, but the organization using the software might be. As someone who develops for clients in the healthcare industry, I can attest to the fact that the big players take personal health information VERY seriously and generally have rigorous standards in place to protect and secure that data. Their practices are audited on a regular basis (which is required by Medicare, I believe), and they can face severe penalties for violations. As developers we have no access to production data at all and all test data has to go through an obfuscation process before it is entered in our system.

Exactly. Frankly, I'm surprised this software made it through RA/QA and the FDA. Hell, the validation process should have weeded this out if it somehow passed RA/QA. There are entire companies devoted to medical software validation, and it's a huge business just because of this need.

Something sounds fishy.

Typical medical/healthcare software is 6+ years behind the industry average because the heavy regulation process leads to a severe increase in development time.
 

Cogman

Lifer
Sep 19, 2000
Exactly. Frankly, I'm surprised this software made it through RA/QA and the FDA. Hell, the validation process should have weeded this out if it somehow passed RA/QA. There are entire companies devoted to medical software validation, and it's a huge business just because of this need.

Something sounds fishy.

Typical medical/healthcare software is 6+ years behind the industry average because the heavy regulation process leads to a severe increase in development time.

Small-time agencies with small-time clients usually get away with a lot of crap. Even though someone is SUPPOSED to be correctly audited, that doesn't mean the auditor they used actually knows what he is supposed to do.

May I point you here:
http://serverfault.com/questions/29...ot-how-do-i-give-him-the-information-he-wants

(A good WTF read)
 

Check

Senior member
Nov 6, 2000
Believe it or not:
This is not a HIPAA violation.

Shocking right?

If the system used electronic signatures and was therefore subject to 21 CFR Part 11, the FDA MAY be unhappy, but a case could be made that it is compliant with that regulation.

To answer your original question though:
"Does HIPAA extend to programmers?"

HIPAA applies to anyone who has access to the system. If you are not authorized to see a patient's information, then you are not allowed to see two or more pieces of identifying information together. For the most part this isn't a problem, since hospitals are full of people who are authorized to see this information. However, with IT there are two ways to deal with it:
1) When patients sign their waiver, make sure it extends to ALL hospital employees and people who work in the hospital in some fashion.
2) Program IT accounts so that they cannot access patient information (sketched below).
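A rough sketch of what option 2 can look like in the data-access layer (Python; the roles, table and function names are invented for illustration, not taken from any real system):

Code:
# Hypothetical sketch of option 2 - all names invented.
CLINICAL_ROLES = {"physician", "nurse", "clerk"}

class AccessDenied(Exception):
    pass

def fetch_patient_record(cur, user_role, patient_id):
    # deny reads of clinical data to accounts whose role is purely IT/maintenance
    if user_role not in CLINICAL_ROLES:
        raise AccessDenied("IT/service accounts may not read patient records")
    cur.execute("SELECT * FROM patient_records WHERE patient_id = :p",
                {"p": patient_id})
    return cur.fetchall()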

It gets a little dicey when you are talking about a system that is going to be serviced by a Field Service Tech or a device that stores records and gets sent back for a refurb.
In this case the Field Service tech is a super user, and even though their account might not be allowed to see patient information there is nothing blocking them from making a physician log in.
This is accepted as a risk and it's on the super user not to go out of their way to look at patient information.
 

beginner99

Diamond Member
Jun 2, 2009

UK... no wonder.

But it was a good laugh, or rather shocking. I mean, I do buy stuff online now and then...

EDIT:

BTW: Systems on which you have to regularly change your password are IMHO less secure, for the simple reason that people start to write down their passwords. It's possible to remember a strong password after a certain time, but not if it changes every month or so... And for anything that has to be really secure, a password alone is not good enough anyway.
I just remembered that story on TDWTF about the IT security head who wrote his PIN on his smartcard... so we all know where the weakest link is... That probably explains the hype around biometric authentication.
 

Mark R

Diamond Member
Oct 9, 1999
Exactly. Frankly, I'm surprised this software made it through RA/QA and the FDA. Hell, the validation process should have weeded this out if it somehow passed RA/QA. There are entire companies devoted to medical software validation, and it's a huge business just because of this need.

Something sounds fishy.

I have no idea how this thing got approved. The HIPAA reference was a bit 'tongue in cheek' as this isn't in the US, so the FDA aren't involved - but even so, I'd have thought that there would be some oversight. After all, there are draconian data protection laws here in the UK.

I think the reason the system is so bad is that it is a 3rd-party GUI that interfaces with a very old, and no longer supported, records system. The old interface is a classic example of how not to design data-entry software. However, this 3rd-party update only provides part of the functionality (functionality for doctors to write and access notes, but not for clerical and admin staff - e.g. for booking tests, appointments, updating contact details, etc.).

So, the new system has to somehow interact with the old system without breaking compatibility. I guess they decided to just connect directly to the old Oracle database.

Having performed some more investigations:
1. The password 'cache' file is maintained by the old client software, not the new one (the new one caches nothing locally, except under error conditions, where it will dump work-in-progress into a text file on the desktop before exiting).
2. For compatibility, both systems have to use the same inept password storage scheme.
3. The server SMB share is completely open ('Everyone' has read/write access to pretty much everything client-related - so graphics files, scanned documents, sound files, etc., which are too big to go into a database field, are stored in a big directory tree which anyone can browse). This is likely an installation failure, due to incompetent management of user rights and the foolish decision to let the client application run under the Windows guest account. Thankfully, the database files, etc. don't appear to be accessible.
4. The connection string I quoted above doesn't actually appear to be the real deal. I think it must have been a sample file for the old client software. I did eventually find the real connection string in the new software (it was encrypted with 256-bit AES, although the key was stored in cleartext in the .exe!)
5. The new client software had a reasonably well designed licensing check system - using public key cryptography and digital signatures. Interesting that the effort got made there, and not into protecting the real data. It was probably the best designed component in the whole thing.
6. I found the reason the software was preposterously slow (taking minutes for a simple task-list search - e.g. show all incomplete tasks, oldest first). The client runs an SQL statement to retrieve the tasks for the list. However, for some reason, it doesn't populate all the fields it needs. So it then goes and runs a new SQL query for each field in each row that it needs. So if you have 2000 incomplete tasks in the list, each of those tasks will generate about 10 more SQL queries - meaning the search will actually end up running 20k queries (dynamic text, not prepared statements or stored procedures) on the server! A sketch of the pattern is below.
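Here's roughly what that looks like, versus what it should be doing. A Python/DB-API-style sketch; the table and column names are invented, not the real schema:

Code:
# Sketch of the anti-pattern from point 6 - names invented, not the real schema.
FIELDS = ("title", "patient_id", "priority", "owner", "ward",
          "status", "created", "due", "notes", "category")

def load_tasks_like_the_app(cur):
    # one query for the IDs, then one query per field per row:
    # 2,000 tasks x 10 fields -> roughly 20,000 round trips to the server
    cur.execute("SELECT task_id FROM tasks WHERE complete = 0 ORDER BY created")
    tasks = []
    for (task_id,) in cur.fetchall():
        row = {"task_id": task_id}
        for col in FIELDS:
            cur.execute("SELECT " + col + " FROM tasks WHERE task_id = :id",
                        {"id": task_id})
            row[col] = cur.fetchone()[0]
        tasks.append(row)
    return tasks

def load_tasks_sanely(cur):
    # one round trip that already returns every column the task list needs
    cur.execute("SELECT task_id, " + ", ".join(FIELDS) +
                " FROM tasks WHERE complete = 0 ORDER BY created")
    return cur.fetchall()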

Moral of the story: if you're going to upgrade a system, get one that's been properly designed from the ground up. Don't try to patch an obsolete and unsupported system with 3rd-party tools! This 3rd-party add-on wasn't cheap at all - but I suppose it was cheaper than buying a brand-new system. And that's all the managers were interested in - the lowest bidder. Never mind productivity, reliability or security.
 

Mark R

Diamond Member
Oct 9, 1999
What the hell? I want to know what company that auditor works for so I can make sure my company never does business with them, or anyone that the company audits.

That story is so crazy, I don't even know whether to believe it. However, based on my own experiences, it probably is real.

Hell, I've even seen someone promoted to 'system administrator' for a medical records system who couldn't understand that a 'system service' was different to a 'server'.