

Big cyber hack of health records is 'only a matter of time' - Futurebot
http://www.politico.com/story/2014/07/cyber-hack-health-records-matter-time-108486.html#.U7QAut2eyvE.twitter

======
throwaway725
It's hard to describe the extent to which _no one is even trying_ outside of
Silicon Valley and very large corporations.

You guys have actual graybeard sysadmins. There are actually people who
understand security research and exploit development on your staffs. Even if
they don't have the power they need, _someone_ in the room is sufficiently
paranoid to at least _want_ to practice better security.

Come to the Midwest sometime. It's _dark_ out here. For starters, it's all
Windows Server all the time - not even by choice, but just because nothing
else ever occurred to anyone. No one has more than played with Linux at home.
Security consists of buying magical appliances and antivirus software (from
your VAR's salesperson on a site visit, through your VAR, through their
distributors, through their distributors), and that's if you're lucky.
"Security consulting" and PCI auditing consists of selling Nessus scans
(again, if you're lucky, more likely some no-name proprietary tool) at
$100,000 a pop. You're lucky if the Network Admin has ever even _heard_ of the
myth of the paranoid graybeard UNIX sysadmin. I personally know at least 60%
of the people in my city who are qualified to configure high-end Cisco gear -
I can count them on my hands - and they're nowhere near experts.

A "security expert" has maybe been to a sales event marketed as training and
knows, at most, that if you absolutely need security, you should install
Norton or Symantec anti-virus software. No one ever inquires into the
horrendous security engineering of the proprietary LOB applications they're
completely locked into. Encryption is on the "yeah, it would be nice, but let's
not get our heads in the clouds" list, even in healthcare.

No one thinks it's a problem that we all share the same code for the doors and
it hasn't changed in living memory despite extensive staff turnover, that the
cameras haven't worked in years, that the key to the server closet is kept in
a neighboring desk, that everyone's passwords are on post-it notes
under their keyboards, that all administration accounts have pretty much the
same password. No one thinks it's a problem that half the switching equipment
is in an unlocked closet in a patient-accessible hallway. No one thinks it's a
problem that we don't even bother with badging. The "security policy" was
copied and pasted out of a kit without even having been read.

Don't bother with companies who employ someone who's heard of Matasano or
Bruce Schneier. Don't bother with companies whose IT contractors employ
someone who's heard of Matasano or Bruce Schneier. Don't bother with people
who know what pf is, who understand that VPN is not a Cisco product, or who
might ever have bothered setting up appropriate ACLs on their network shares.
You don't need to find a place where a skilled adversary overlooked something.
There are hundreds of thousands of businesses that pay top dollar to IT firms
that _don't even try._
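
To be concrete about how low the bar is: even a default-deny pf ruleset of a few lines - a hypothetical sketch with made-up addresses, not a vetted policy - is more than most of these shops ever attempt:

```
# /etc/pf.conf -- hypothetical minimal sketch, not a production policy.
# Default-deny everything, then allow only what's actually needed:
# SSH from the admin subnet (10.0.1.0/24 here is made up) and HTTPS in.
block all
pass in on egress proto tcp from 10.0.1.0/24 to any port 22
pass in on egress proto tcp to any port 443
pass out on egress keep state
```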

Hit a small business in the Midwest, medical or otherwise. They won't even
notice.

~~~
georgemcbay
>It's hard to describe the extent to which no one is even trying outside of
Silicon Valley and very large corporations.

As someone who has worked for multiple Silicon Valley companies and a couple
of fairly large corporations, I think you are being overly optimistic about
how much (on average) they are trying wrt/security.

The actual process tends to be: put out something that coders with little
sense of real security think seems kind of secure, wait to get hacked, fix
attack vector, repeat.

This process is probably not much different from the one at the Midwest places
you are thinking about, except that those places just aren't targeted as
much - they aren't sexy targets - so the release->hack->fix cycle never
gets fully churning.

------
siculars
I actually work for one of the institutions cited in the article; yes, we have
recently received the largest HIPAA-related fine to date. It seems to me, as
indicated in the article, that the government's agenda is to enforce
compliance/increase security by way of increasing penalties. If you look at
penalties levied to date, there is little correlation to the number of people
affected, but rather to the institution's ability to pay. I actually think
this is a reasonable solution for the government to implement. Too often we
have seen various industries scoff at low fine amounts. A sliding scale is
more likely to get executives' attention.

In truth, the institution is taking this very, very seriously. There have been
and will be many policy changes coming down from management. Outside
consultants/auditors have been brought in (KPMG). Mandatory reporting and
audit for three years by HHS. No solo (read: rogue) IT/developer personnel will
be allowed to deploy changes to production without oversight, i.e. change
management oversight by a central board of peers. Complete endpoint encryption
campaign. Employee education. Re-evaluation of business associate agreements.
And on and on.

All this will, of course, drive costs up, but hopefully necessarily so - if
implemented properly. The idea is that on the other side of this the
institution will run a much tighter ship, with fewer leaks and fewer
opportunities for leaks throughout. It will also act as an opportunity to educate
executives as to the importance of security. The downside is that process can
be implemented with a heavy hand, grinding all progress to a screeching halt.
If they can implement process while maintaining progress it should work itself
out.

My personal hope is that the institution invests not only in process but
people. As others have pointed out, there has been a complete underinvestment
in technical resources (read: people) to run the enterprise in healthcare, and
that has to change.

~~~
specialist
Ambiguous rules, ruthlessly enforced.

When I got started, there was a pretty big HIPAA case (locally), which had
recently settled. I contacted both sides to find out the terms. Hoped to glean
some actionable advice.

The agreement for the hospital was "to try harder".

------
MattGrommes
I think I'm honestly more afraid of what insurance companies and employers
would do with unfettered access to the same data. Identity theft protections /
responses will continue to get better but I can't say the laws will.

------
chton
Plenty of people in the tech world have been warning about this for a long
time. Healthcare is lax with security on all levels, from electronic
pacemakers to healthcare.gov. Manufacturers and providers need to act instead
of waiting for it all to go wrong.

~~~
FLUX-YOU
Physical security as well. Put on a pair of scrubs and maybe fake a badge
(sometimes not required) and you can get pretty far. Usually far enough to get
access to at least one machine. Hospitals are big and the uniform grants a lot
of trust.

~~~
Mandatum
People need to realize this isn't an issue confined to a single area or
industry - it's across the board. Banks with remote SQL access, finance firms
and patent offices with executable rights... The only solution is to create a
separate environment for those inclined to operate within.

------
specialist
Demographic data are used to match patients across systems. That data must be
plaintext, otherwise you cannot match.

There are only two solutions to protecting patient privacy.

#1 Issue globally unique identifiers (a la RealID or Medicare For All). Then
you can hide the demographic data, use translucent database techniques to
encrypt records, etc.

#2 Patients data goes with the patient. Either on a thumb drive or data vault.
Impractical.

There is no other way.
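
For what it's worth, once #1's unique identifier exists, the translucent-database idea is only a few lines; here's a hypothetical sketch (the field choices and salt handling are made up for illustration, not a deployment design):

```python
import hashlib

def match_key(national_id: str, salt: bytes) -> str:
    """Translucent-database sketch: store only a keyed hash of the
    identifier, so records can be matched across systems without
    keeping the demographic data in plaintext."""
    return hashlib.sha256(salt + national_id.encode()).hexdigest()

# Two systems holding the same salt derive the same opaque key,
# so their records link up without either exposing the raw ID.
salt = b"shared-secret-salt"  # made-up value for illustration
print(match_key("123-45-6789", salt) == match_key("123-45-6789", salt))  # True
```

Matching then happens on the opaque keys; the plaintext demographics never need to leave the issuing system.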

About the "hack" attack... I implemented 5 RHIOs with a couple hundred
participants. Is copying files via FTP considered a hack? Any attacker wouldn't
have to be very clever.

~~~
siculars
I work in medical informatics. Currently working on an interesting data
integration project. Drop me a line, would love to talk more and hear about
your RHIO experience.

------
cordite
One thing to note: many of the companies that run EMR software are
constrained by someone internally to stay on '80s-era Unix servers.

I do not know about the security and network walling that happens, though I
know a lot of them use Windows servers to run the client code, in a (likely
secured) environment, over Citrix sessions.

------
angersock
So, here's the thing.

We're going to be letting big data and whatnot into health. Every heartbeat
you have, every milliliter of blood drawn, even your base pairs will be
cataloged meticulously and shoved into a database. So, that amount of
privacy/care is gone, never to return.

There are two things that are really worth asking:

Are patients or practitioners the ones who own collected data?

What is the utility of data leaks of non-billing information, truly?

For the first, I rather think we're doomed if we don't legislate that patients
own all of their own data and license it back to care providers--basically,
treat labwork and everything as works-for-hire with copyright belonging to the
patient.

For the second, I think we're going to continue to treat all the health data
as super mission critical, which will in turn only cause "security
consultants" to line their pockets further and justify the miserable
busybodies in hospital IT that get in the way of progress under claims of "but
but but the terror^H^H^H^H^HHIPAA".

We already give away gigabytes of data every year about everything else, so
it's kind of silly/sad that we're so protective about arguably the least
harmful/most useful data we have.

~~~
orf
>> ..which will in turn only cause "security consultants" to line their
pockets further and justify the miserable busybodies..

You obviously don't understand how utterly appalling the security of a lot of
software is. My job title is "security consultant" and the last test I did was
on a medical related application that tracks medication administered to
patients in care facilities. All the sensitive patient data can be read and
updated completely unauthenticated. They implemented ALL of their permission
checks in the thick client, which simply talked to a web service - anyone with a
decompiler and half a brain could alter patients' drug doses with no record.
This is a large application used in a large number of places.

Maybe we need more security consultants in the medical industry.
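
The fix for that class of bug is as old as client-server computing: every permission check has to happen in the service, not the client. A minimal sketch (all names, tokens, and roles here are hypothetical, not the tested application's actual API):

```python
# Hypothetical sketch: server-side authorization for a dose-update call.
# The thick client's local role checks are irrelevant; the service must
# verify the caller itself, because anyone can hit the web service directly.

SESSIONS = {"tok-abc": "prescriber"}  # made-up session store

def update_dose(session_token: str, patient_id: int, dose: str):
    role = SESSIONS.get(session_token)   # None for forged/absent tokens
    if role != "prescriber":
        return ("403 Forbidden", None)   # rejected before touching any data
    # (real code would also write an audit log entry here)
    return ("200 OK", {"patient": patient_id, "dose": dose})

print(update_dose("tok-abc", 42, "5mg")[0])   # 200 OK
print(update_dose("forged", 42, "500mg")[0])  # 403 Forbidden
```

With the check server-side, decompiling the client buys an attacker nothing: the forged call is refused no matter what the client claims about its user.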

~~~
angersock
There's a lot of really weird systems design that goes on in hospitals, but
that doesn't necessarily mean that it is wrong.

The main problem I see is that the security is going to focus on things which
won't matter, at the cost of creating _still more_ barriers to entry for new
techniques (I may or may not be fighting this battle at my current company).

~~~
chton
If your data can be reached by everyone, completely unsecured, that's not a
weird design. That is just plain wrong. Security does matter for these things,
because human lives and identities are at stake. Health data is far more
important than your facebook profile, but it's woefully undersecured. If
fixing security problems is "creating more barriers", then so be it, it has to
be done.

~~~
angersock
It's not available to _everyone_ - for example, a lot of institutions don't
have wireless (lol) and have pretty strict controls on devices that plug into
the wired networks.

~~~
chton
What if I break into the network through a web server? Or happen to find an
open network port in my hospital room?

Security is done in layers. Physical access is 1 layer, network security is
another, requiring proper authentication/authorization for your software is
yet another, etc. Each layer stops a percentage of hackers. If you skip
important layers like auth, you're allowing a larger number of people in,
simple as that.

It's the equivalent of saying "but of course I leave a large pile of cash in
the middle of my dining room, my front door is locked, that's secure, right?
Putting it in a safe or even a locked closet is just a burden." It ignores
basic security principles.

------
Mz
I worked for an insurance company for five years. I sometimes tried to suggest
some IT type upgrades for some of the data they had. It all got shot down. And
this was just in hopes of getting better organized internally. I think they
really don't even know what info they have. It was a pretty crazy-making
situation.

------
jqm
The root of the problem in my opinion?

Management often doesn't know what it doesn't know, and is afraid that if it
tries to find out, or listens to someone, it will be "out of control of the
situation".

The other part of the problem? Tons of turf-protecting MS-type admins who know
just enough to blow smoke in the ears of management and keep out people who
would address the problem and (in their view) upset the apple cart.

This self-serving, turf-protecting lack of openness is always a disaster
waiting to happen. But not to worry. The two above-mentioned parties are
experts at deflecting blame when an issue occurs. So at least we will have
someone to throw under the bus when the big breach finally does happen.

