I'd suggest you work with a company that has a lot of experience in this area before you inadvertently find yourself fined (or sued) into bankruptcy.
On the positive side: I've learned to love the compliance. Most of it is common sense (like not giving out info over the phone to any random person) and has made our organization more efficient.
i think some companies go for the quick buck and, in the name of cost savings, run a shoddy operation. ymmv.
ps - do not try making said quick buck in health care. contrary to VC bets otherwise, it doesn't exist. plan for a very, very long haul.
- Physical server isolation: you cannot have other instances sniffing around in your deallocated garbage memory.
- Encrypted data stores: physical theft of the server should not provide access to your data.
- Server providers who can sign a Business Associate Agreement: many hospitals and firms with medical data require this in their stipulations.
- Audit trails for database modifications, access, etc. Basically, log everything, and this has to be encrypted too if it contains protected health information (PHI).
- All PHI over HTTPS if you have a webapp. NO PHI OVER EMAIL OR HTTP.
- "Soft" guidelines such as password complexity measures, auto session expiration, disallowed multi-sessions.
Again, this is not an exhaustive list. You really need to check with a lawyer who knows this stuff. The fines are enormous (read: business-ending) if you break the rules.
How do you work to implement these? Well, find a host who is willing to sign a BAA. Here are the two major contenders I'm aware of:
- Use Amazon AWS; they're willing to sign a BAA with you and provide you the physical server isolation you need. However, this doesn't come cheap. Expect >$2,000/mo in costs to keep this configuration. Also, you'd better be a network pro or willing to learn how to manage VPCs correctly to provide proper network-level isolation for the databases.
- Use aptible.com (they happen to be a YC company, and I don't know of anyone else doing this). Frank & Chas (the founders) are very responsive and aim to provide a comprehensive package, including backups, audit trails, and even employee training. The Docker-based and heroku-like interface is very appealing:
This option is still expensive. They host on AWS as well, so you're paying for the server costs + premium. However, this will still be a lot cheaper than hiring a competent sysadmin to make sure the execution is flawless.
AFAICT the whole "can't do HIPAA in the cloud" meme arose from the reluctance of cloud services to sign BAAs; Google only got on board earlier this year.
The HIPAA Security Rule requires that you take "reasonable and appropriate" measures to safeguard the confidentiality, integrity, and availability of electronic regulated health data.
Physical server isolation is not prima facie required, meaning there is no requirement that literally states you must isolate servers down to bare metal. Your customers' judgments about what constitutes "reasonable and appropriate" safeguards may vary, though. That decision should be driven by your risk assessment.
Encryption is also not prima facie required, actually. I can't imagine a case in which it would be reasonable for a cloud SaaS provider not to implement it, so I'd say it is de facto required. MFA may be moving into the same category for most web services.
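On the MFA point: time-based one-time passwords (RFC 6238, the scheme apps like Google Authenticator use) are simple enough to sketch with only the standard library. This is illustrative; a production system should use a vetted library rather than hand-rolled crypto code:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, now=None):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time step."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((now if now is not None else time.time()) // interval)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)
```

The server and the user's device share the base32 secret; both compute the same 6-digit code for the current time window, so no second channel is needed at login time.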
I am not a lawyer.
However, you may want to have a look at TrueVault which has been featured on HN.
I'm just surprised at how few resources there are that explain what it takes, and I hope that someday soon, healthcare startup CTOs will be referred to clearly documented open source solutions that are fairly fool-proof, rather than paid-for services (@sebst). Amazon's compliance page is unfortunately uninformative (@byoung2).
The only reason I have been able to navigate HIPAA/HITECH is because I've worked many years previously as a software engineer and then a senior technology manager for a very large hospital network and dealt with these requirements daily.
As noted in other comments, most of HIPAA is not technical. Most of the requirements relate to risk assessment, policies, training, incident response, etc.
With that in mind, I'm going to quickly run down all of the major moving parts and then cover some of the technical considerations for setting up a server.
HIPAA has three main rules you need to comply with:
1. The Privacy Rule - Governs the use and disclosure of PHI (protected health information). Applies to all forms of PHI (verbal, written, electronic, etc.).
2. The Security Rule - Governs safeguards for electronic PHI
3. The Breach Notification Rule - Governs your responsibilities during a security or privacy incident
The Security Rule has a general security standard, some documentation/retention rules, and three sections of safeguards. They are:
1. Administrative Safeguards
2. Physical Safeguards
3. Technical Safeguards
Some of the safeguards are mandatory. Some are "addressable," meaning if you don't implement them you must document why you chose not to and what other safeguards you applied instead.
Most likely, you're going to start with something like the following for your servers:
1. Sign a BAA with any service provider who is going to touch PHI for you.
2. Restrict physical and logical server access to authorized individuals. Document how you restrict access and why the methods chosen are reasonable and appropriate given the risk posture of your organization. (There's a LOT packed into this step.)
3. Log all access and data modification events. If you use a logging service that isn't HIPAA-compliant, make sure you're not including PHI data you send them.
4. Encrypt data at rest and in transit, including inside the network perimeter. Document your network topology and access points.
5. Implement backups according to your organization's HIPAA contingency/disaster recovery plan. Document the backup scheme.
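On point 3, one way to keep PHI out of a logging service that won't sign a BAA is a scrubbing filter in front of your logger. A rough sketch using the standard library; the SSN/MRN patterns here are illustrative, and a real deployment would cover every PHI field it could conceivably log:

```python
import logging
import re

# Illustrative patterns only; enumerate your own PHI fields in practice.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
MRN_RE = re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE)

class PHIScrubFilter(logging.Filter):
    """Redact obvious PHI before records leave for a non-BAA log service."""

    def filter(self, record):
        msg = record.getMessage()
        msg = SSN_RE.sub("[REDACTED-SSN]", msg)
        msg = MRN_RE.sub("[REDACTED-MRN]", msg)
        record.msg, record.args = msg, None  # replace the formatted message
        return True

audit_logger = logging.getLogger("audit")
audit_logger.addFilter(PHIScrubFilter())
```

Scrubbing is a safety net, not a substitute for simply never logging PHI fields to third-party services in the first place.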
A few caveats:
- I haven't addressed application-level security. The same requirements apply, but the implementation differs.
- Your customers will demand additional safeguards that aren't in HIPAA.
At Aptible, we help with all of this, plus all of the other requirements (risk assessment, policies, training, etc.), so you can get a complete handle on your compliance status.
> $0.10/hour for additional app/database containers; $0.40/GB/month for additional storage.
Thanks for at least giving me a source to cite in grant applications!
We've been building applications with both startups and large healthcare organizations like the VA. I'd love to talk with you more to see what your needs are. Feel free to hit me with an email (firstname.lastname@example.org) with any questions.
Part of what makes HIPAA compliance challenging from a techie's perspective is that there are very few prescriptive rules. A lot of the implementation is left up to the provider to allow flexibility, but the justification for all of those decisions needs to be defensible.
A couple last items I would add:
Not only do you need a BAA with any service provider you use, you will also need one for any contractor who has access to PHI you are responsible for. As of the latest set of rules, this also applies to any subcontractors your contractors may use.
You will also need named privacy and security officers who are responsible for the overall program and will be the first ones HHS and OCR will ask for should you be audited.
One should also note that the Breach Notification Rule, through its definition of what constitutes unsecured PHI, effectively sneaks in technical requirements: entities dealing with PHI should treat encryption and the like as near-mandatory, since, even though they aren't strictly required, they significantly affect the likelihood of a reportable breach.
My reading says "neither". Conservative move is "encrypt everything" but curious if others have passed/failed a HIPAA audit with a standard MySQL or SQL Server system (assuming you have individual ID access & logging).
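For what it's worth, one common pattern with a standard MySQL or SQL Server setup is application-level encryption of sensitive columns, so the database only ever sees ciphertext. A sketch using the third-party `cryptography` library; the key handling here is deliberately naive (in production the key would live in a KMS/HSM, never next to the data it protects):

```python
from cryptography.fernet import Fernet

# Hypothetical key management for illustration only: real deployments
# fetch the key from a KMS/HSM, never generate it inline like this.
key = Fernet.generate_key()
fernet = Fernet(key)

def encrypt_field(plaintext):
    """Encrypt one PHI column value before it is written to the database."""
    return fernet.encrypt(plaintext.encode("utf-8"))

def decrypt_field(ciphertext):
    """Decrypt a PHI column value after it is read back."""
    return fernet.decrypt(ciphertext).decode("utf-8")
```

The trade-off is that encrypted columns can't be indexed or queried directly, so you typically keep identifiers you need to search on out of the encrypted fields (or store keyed hashes of them alongside).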
the rest is really basic good practice for any professional-grade service that is entrusted with customer data
i respect that someone is trying to market a prepackaged solution, but i would advise against shortcuts. you (the developer/owner/cto) should take the time to understand your stack from bare metal up, and be able to explain the risks that are real vs academic, the financial impact of attacks or internal employee mistakes/theft, and the real cost and benefits of devops done right.
you owe it to your customers, your shareholders, your employees and yourself to invest the mental energy to really understand the nuances.
otherwise, don't go into health care markets.
(i like to think we take our work as seriously as the best MDs, so would you want your surgeon taking shortcuts or trying to build an MVP?)
How are y'all? What's the latest?
Options are to go with a service company like Aptible or TrueVault, or fumble through vast amounts of obtuse technical and legal documentation, then hire a security expert to audit your homemade system and hope that everything goes OK. Both options, as they currently exist, require a fair amount of $$$.
If you have minimal security expertise, then you're supposed to be treated as noncompliant. There are two valid options: have the security expertise (and be prepared to legally vouch that you do, and get insurance in case it turns out your expertise wasn't enough), or get the security expertise from someone else. Oh, and the third option is to stop handling any sensitive data at all if you aren't equipped to handle it.
The reason why HIPAA (and similar compliance issues in other domains) exist is to try to eliminate what you're proposing. Random companies with minimal security expertise shouldn't be handling such data themselves, period.
If you aren't a trained, competent security analyst/engineer (or working with one) then don't accept sensitive information. PHI, credit card numbers, even contact lists: all of these are considered "sensitive" for a reason, and you are being reckless in the extreme if you accept them from your customers without the proper protections in place.
I have seen and even been responsible for (many years ago) the implementation of some horrifyingly bad "security controls" by people who simply didn't understand the field.
It's not just crypto, or disabling root logins via SSH, or preventing XSS on your web forms; you need to plan for a multitude of possible attack vectors, internal and external. Many of those aren't likely to be intuitive without some real training and exposure to best practices.
I believe that rackspace has a pretty good program around compliance.
training-hipaa.net provides a Server Disaster Recovery Plan template as part of HIPAA compliance.
This Server Recovery Plan documents the strategies, personnel, procedures, and resources necessary to recover the server following any type of short- or long-term disruption. You can find more information about this over here: http://www.training-hipaa.net/template_suite/Server_recovery...
That said, the thread does have some great safeguards and industry best practices you should look at.
We sell cloud but focused on security, compliance, and performance. Check us out.
* 5 X VPN's (no apostrophe needed; what does this even mean?)
* 1 Dedicated IP Addresse
* Header/footer links open in a new tab.
* Some numbers are listed as 800, some as 1-800.
* Data center addresses have a comma at the end of the first line. That's not how US addresses are written.
* Headquarters address is all on one line.
Please don't go and fix the specific things I listed and think you're done. You need to significantly rethink the impression you're giving.
And we've gone through a HITRUST audit to validate these claims:
Check out TrueVault - HIPAA-compliant data store that is a YC grad.