
2.5M Medical Records Leaked by AI Company - wglb
https://securethoughts.com/medical-data-of-auto-accident-victims-exposed-online/
======
maybelsyrup
I'm a medical researcher, and I've worked with exactly this sort of data. The
number of hoops I had to jump through to even be allowed on the project was
insane. Background checks, legal forms of all kinds, trainings on how to keep
the data secure. Fingerprint locked rooms. Access control. And most
importantly: the data themselves were air-gapped.

If I fucked up on this scale, jail would be the least of my worries. I'd never
work again. I'd be professionally shunned forever, with no hope of redemption
once my name is attached to the incident in some Google search. No more
grants, no more collaborations, no nothing. I'd be ruined. Frankly I'd be
begging to be let into prison; where else am I gonna be able to eat?

Nothing will happen to these people.

~~~
sillysaurusx
This is a wild exaggeration. First of all, everyone is insecure. Work as a
pentester, and you'll see that the whole world is insecure. After the 30th
project with an SQL injection vuln or misconfiguration, you'll never be fazed
again.

Secondly, this seems to be a straightforward database misconfiguration. The
database was set to public with no restrictions. It likely wasn't accessible
to anyone who wasn't explicitly looking for it, i.e. security researchers or
black hats. Yes, this was horrible, but nothing bad happened here. The
database was simply set back to private.

Third, my friend, chillax a bit. Yes, it's extraordinarily important to
protect people's medical data. But no, you're not personally liable if a
fuckup happens. _Everyone_ fucks up. That's why the fuckups are covered by
legal protections. You take your job seriously. Good! But you also take your
job way too seriously. Bad. You're gonna burn yourself out within a decade
with this mindset.

None of this is to say that what happened didn't matter. Of course it matters.
But you are literally saying you'd be begging to be let into prison. That's
not proportionate, and feels like a reflection of the current social climate
of retribution-as-forgiveness.

~~~
mirashii
> This is a wild exaggeration.

It isn't. I get that experience as a pentester can make you jaded to security
practices, but the healthcare space is extremely regulated and much of what
the grandparent said is a direct result of that.

> Secondly, this seems to be a straightforward database misconfiguration. The
> database was set to public with no restrictions. It likely wasn't accessible
> to anyone who wasn't explicitly looking for it, i.e. security researchers or
> black hats. Yes, this was horrible, but nothing bad happened here. The
> database was simply set to private.

This isn't just a minor misconfiguration. Again, this is a regulated space,
with rules. There should be multiple layers in place to make sure that this is
not opened to the public. There are risk assessments that need to be carried
out when a database like this is launched, and on a regular cadence afterwards. "It
wasn't accessible to anyone who wasn't explicitly looking for it" isn't
reassuring. Look at access logs on any device with a public IP address and
it's clear that people are _constantly_ looking.
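
As a rough illustration of what "constantly looking" means in practice, here is a minimal sketch that counts scanner probes in an access log. The log lines and probe paths below are fabricated illustrative examples of the request patterns typically seen on any public IP, not data from this incident:

```python
import re

# Fabricated sample access-log lines, illustrating typical scanner traffic.
SAMPLE_LOG = """\
203.0.113.7 - - [10/Aug/2020:04:12:01] "GET /phpmyadmin/index.php HTTP/1.1" 404
203.0.113.7 - - [10/Aug/2020:04:12:02] "GET /.env HTTP/1.1" 404
198.51.100.23 - - [10/Aug/2020:04:13:44] "GET /wp-login.php HTTP/1.1" 404
192.0.2.55 - - [10/Aug/2020:04:15:09] "GET /index.html HTTP/1.1" 200
"""

# Paths that essentially only automated scanners request.
PROBE_PATHS = ("/phpmyadmin", "/.env", "/wp-login.php", "/admin")

def count_probes(log_text):
    """Count requests for well-known paths that scanners probe for."""
    hits = 0
    for line in log_text.splitlines():
        m = re.search(r'"(?:GET|POST) (\S+)', line)
        if m and m.group(1).startswith(PROBE_PATHS):
            hits += 1
    return hits

print(count_probes(SAMPLE_LOG))  # prints 3
```

Run something like this over a real log from any internet-facing host and the probe count climbs within hours of the host going live.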

> But no, you're not personally liable if a fuckup happens.

Under HIPAA, you can be held personally liable. This includes jail time for
certain classes of violations.

> But you also take your job way too seriously. Bad.

People working in this space are required to undergo regular training, audits
of their activity and of what information they access, risk assessments on
how it is stored, and long, drawn-out compliance processes just to get
access.

~~~
kspacewalk2
>Under HIPAA, you can be held personally liable. This includes jail time for
certain classes of violations.

Has anyone ever been actually sent to jail under HIPAA for something
comparable to an honest DB config mistake? Has anyone been credibly threatened
with it in a court of law? Has anyone come remotely close?

~~~
parliament32
Not even close. The only case where I could see it happening is if you
purposefully exfiltrated data and sold it, or got some sort of personal gain
out of it.

It's the same thing with PCI. They say you can go to jail for violations but
exactly 0 people have ever been prosecuted, despite constant negligence in the
payment industry.

------
eggsmediumrare
This happens because execs are cheap, farm out everything to dev shops, and
the beleaguered, underqualified devs churn out nonsense code to keep up with
the arbitrary point quota for the sprint. No one knows how the whole system
works, reqs change constantly so poorly-planned updates degrade the whole
thing, and all the while actual patient data gets mixed into the big soupy pot
of bullshit. Anyone who raises concerns about the situation gets to be the
"compliance officer," thus passing responsibility from the people benefitting
from the situation to the people who just want to keep their job.

I wonder what would happen if devs could say to their boss "no, this won't be
released until we're confident in it" with legally enforced immunity to
consequences.

~~~
iecheruo
What you're describing is a professional licensed engineer, who has to, among
other things, take out insurance because they are ultimately responsible for
their decisions.

~~~
pnathan
Software development is long overdue for a legally professional software
engineer license and certification process. I imagine it's going to happen
down the road after something important gets burnt.

~~~
qchris
Every time this comes up, people (some of whom are in this thread) end up
talking about how this can't/shouldn't happen for software. After all, is
every high-schooler or green college grad that ever wants to code their own
app for a startup going to have to get a professional certification?

I guess I'd argue that those people shouldn't be legally allowed near this
kind of thing without that kind of a certification. Looking into all of the
other engineering disciplines, that's exactly the kind of thing you see. I
have a BSME, but I haven't taken the Fundamentals of Engineering exam to get
my FE cert, in part because getting a PE certification requires working
underneath a licensed PE for a certain number of years, which isn't the case
for my current job.

I also know that by not doing so, there are certain projects that I simply
can't work on. I have to imagine that there's a way to create a legally
enforceable framework that falls into the same category for software
engineers. Want to build a company that creates a digitally-synced notepad?
Have at. Want to touch personally-identifiable medical data? Better have a
licensed engineer working on that project to sign off, else your company is
wide-open to liability claims _with teeth_. If something unreasonable gets by
the signed-off engineer, they're on the hook too.

Obviously, it's a complicated problem, and a first-order solution is rarely a
catch-all, but there _has_ to be some more
professional/personal responsibility taken by the individuals building these
systems, and a requirement of licensure is a way of empowering engineers in
those positions to the point where it actually matters.

~~~
raxxorrax
I think this is a bad idea.

I developed software for medical devices, and you have to do a risk analysis,
formalize the software development process, declare the qualifications of the
people involved, make it revision-proof, have a formal testing process, and
so on. Everything is already accounted for.

Notified bodies ensure compliance. Of course, they have the problem that they
cannot really evaluate the work of software engineers. Not even another
software engineer could do that within feasible time limits. No software
engineer can make sure there aren't exploits that could endanger user data.
You can at most test whether due diligence was exercised.

The manufacturer is responsible for ensuring safe operations of devices and
yes, that includes keeping personal data safe.

But again, the problem wasn't the engineer at all; the problem is the wish to
amass data like this. Paper license or not, it rarely ensures competency and
wouldn't have solved this problem.

Aside from legislation ensuring that user data belongs to the user the data
is about, and ensuring that companies don't sell and share medical data with
"friends and family", licensing is probably the last step, if it is even
required at all, which I would dispute. There are no guarantees if you amass
data like it was done here.

~~~
rapind
Make it prohibitively expensive to leak data (compliance fines, lawsuits) and
the problem will solve itself. Companies that collect data will then be
begging for certification and regulation.

~~~
dependenttypes
It would be even better if people learned to refuse to give data irrelevant to
the service that they are seeking and/or if there was some sort of regulation
about this (I should not have to give my name and address when returning a
product for example).

------
divbzero
> _Medical data is the most valuable and it is bought and sold daily on the
> Dark Web. The infosec company Trustwave published a report that valued
> medical records at $250 per record on the black market, while credit cards
> sold for $5.40 per record._

Why can medical records be sold for so much? What value do they provide to the
buyers?

~~~
nitwit005
You can bill insurance companies for fake treatments.

~~~
octoberfranklin
And when you get caught, you lose your medical license.

Can you really fake bill enough before getting caught to be worth burning an
M.D.?

I never really understood this angle. Say the insurers use a phone call to
verify 1% of treatments with the patient. Say 20% of those calls are answered
and produce a coherent reply. That ought to be enough to make the expected
payoff negative.

Either way, this problem isn't unique to the medical world. Every other
industry has to prevent fraudulent invoicing.
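
A back-of-envelope check of the argument above. All numbers are the commenter's illustrative assumptions (1% verification rate, 20% answer rate) plus hypothetical payout and penalty figures, not real insurer statistics:

```python
def expected_payoff(n_claims, payout_per_claim, penalty,
                    verify_rate=0.01, answer_rate=0.20):
    """Expected value of submitting n_claims fraudulent claims.

    Crude all-or-nothing model: with probability p_caught at least one claim
    is verified with the patient, you keep nothing and pay the penalty;
    otherwise every claim pays out.
    """
    p_detect_per_claim = verify_rate * answer_rate  # 0.2% per claim
    p_caught = 1 - (1 - p_detect_per_claim) ** n_claims
    return n_claims * payout_per_claim * (1 - p_caught) - penalty * p_caught

# With a hypothetical $500 payout per fake claim and a $1M penalty
# (fines, restitution, a burned license), expected value is negative
# even at modest claim volumes.
for n in (100, 500, 1000):
    print(n, round(expected_payoff(n, 500, 1_000_000)))
```

At 100 claims the per-claim detection rate compounds to roughly an 18% chance of being caught, which already swamps the upside under these assumptions; the exact break-even point depends entirely on the penalty and payout figures chosen.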

~~~
hobofan
I assume that if you do fake billing, you wouldn't do that with your own
identity, so the risk of losing a medical license isn't there.

~~~
octoberfranklin
Nobody accepts anonymous medical invoices.

~~~
sushshshsh
But they do accept ones with forged credentials and redirected bank payments
^.^

~~~
octoberfranklin
If you can "redirect" somebody's money you don't need to bother stealing
patient medical records.

~~~
sushshshsh
Then you'd be surprised to hear about the income tax return scam back in the
day where even lay people were able to achieve just that, for 2 million bucks
specifically ^.^

[https://www.google.com/amp/s/www.nydailynews.com/news/nation...](https://www.google.com/amp/s/www.nydailynews.com/news/national/woman-arrested-falsely-claiming-2-1-million-tax-refund-article-1.1093171%3foutputType=amp)

~~~
octoberfranklin
And they didn't need any medical records to do it!

... which was pretty much my point.

If they can steal money out of doctors' bank accounts, or redirect it on its
way in, they don't need patient data. The doctors already have patients. And
payments coming in. This isn't like income tax refunds where the payer pays
each individual.

In fact, if they have patient data, it's stupid to use it. What insurer isn't
going to notice some doctor suddenly getting a thousand new patients all in
the course of one week?

Sorry man, it still doesn't add up. If you're equipped to do this at all
you're equipped to do it without needing patient data.

------
est31
The issue with such highly sensitive data is that it's just data. Probably a
few GB. Many AAA game downloads nowadays are much bigger than that. Even if
they protected their database (the article says they didn't), it's easier to
break into one place that has all the data collected than to collect it
yourself from doctors all over the country. There's an easy solution to this:
don't concentrate highly sensitive data.

With nuclear material, the IAEA has set up a sophisticated surveillance regime
to make sure that countries use nuclear technology for peaceful purposes.
Everything that contains nuclear material is video surveilled. They collect
samples and check their composition to prevent someone from taking, say, 1%,
replacing it with filler material, and then mixing it again.

If we concentrate data, we should employ such surveillance regimes for highly
enriched sensitive data as well.

~~~
sebmellen
Decentralization and self-sovereign identity are the only way to solve this,
IMO. I'll sing the praises of Estonia on this one: their X-Road system for
storing sensitive data is brilliant.

I'm working on a team writing an in-depth whitepaper on this topic right now,
I'd love to link it. Perhaps when it's published.

~~~
adminprof
Please do. Would love to hear success stories for data security.

~~~
sebmellen
Would you mind sending me an email at s@assembl.net? I can't find any contact
info in your bio.

------
hn_throwaway_99
Wow, let's just say that from their website I'm impressed they got anyone to
give them data at all. That site just screams "Take some programming tasks
that require a minimum of data analysis and call it AI." From their product
page I'd take the easy bet that they're just wrapping some cloud provider's
bot SDK and calling it their chatbot. Given the lack of professionalism on
the site, I am unsurprised about their security issues.

~~~
twodave
I'm not surprised in the least. I've sat in multiple meetings with execs
whose boards have told them, "We need to get into this AI game, it could
solve all our scaling problems!" Often both the board and the exec are
clueless as to what AI is capable of or can offer their business
specifically, but they begin talking with vendors anyway.

The execs feel they need to show they at least made an effort, and these
companies branding themselves AI are easily able to take advantage. They
market their products as these complex decision-making/analytical systems, but
most of them are either glorified reporting dashboards or just an integration
point/pipeline builder for a set of services.

The businesses buying this stuff don't need any of it to become more
efficient; they are effectively spending the money to be able to generate the
hype of saying, "We're in the AI space".

------
tmpz22
HIPAA authorities are going to fine them into oblivion, right? Right?

~~~
leetrout
Jail is on the table for HIPAA violations. I am always afraid to mess with
medical data.

~~~
hn_throwaway_99
Jail is on the table if you purposefully steal the data. I was interested so
searched for examples of actual HIPAA jail time, and in all the cases I could
find the conduct was egregious and with clear intent (e.g.
[https://mazarsusa.com/ledger/jail-time-for-a-hipaa-violation...](https://mazarsusa.com/ledger/jail-time-for-a-hipaa-violation/)
). Misconfiguring your S3 buckets isn't going to hit that level, though that
could still result in a hefty fine. This of course all assumes that the AI
company got access to the data legally to begin with.

~~~
peteretep
Is there any personal liability for the directors or investors short of jail?
It's going to be a shame if the company is able to just bankrupt its way out
of this.

------
reustle
> This database was set to open and visible in any browser (publicly
> accessible) and anyone could edit, download, or even delete data without
> administrative credentials.

> there were multiple references to an artificial intelligence company called
> Cense. The records were labeled as staging data and we can only speculate
> that this was a storage repository intended to hold the data temporarily
> while it is loaded into the AI Bot or Cense’s management system.

------
fermienrico
We need some kind of monetary punishment for not protecting user data.
Simultaneously, we need to give tax breaks for companies that have had a
streak of many years of taking security seriously without leaks.

Increasing bug bounties doesn't happen when the executives do not have a
culture of security in their company. That's a pure cost center to them. We
need different incentives that go beyond just rewarding hackers with bug
bounties.

~~~
colechristensen
Mandatory data breach insurance, with prescribed penalties paid to affected
parties by the insurance companies.

Insurance is a great aligner of financial incentives.

------
Santosh83
And in this scenario of technical and political incompetence, more so in some
places than others, the Indian government has recently proposed to compile the
medical data of ALL its 1.3 billion citizens into a single database.

It's only a matter of time before the whole thing gets leaked out, not to
mention that compiling such data in the first place ought to be none of the
govt's business. Such data is simply too powerful to be entrusted into the
hands of any single party, political or corporate.

------
ferros
I wonder what it will take for data security to ever be taken seriously.

~~~
totetsu
Well, somehow PCI compliance seems to work well enough for credit card
payment processing. But what happens is that the payment handling is passed
off to a vendor, so not every company has to get things right in-house. I
wonder if that could be a solution here: medical data handled by external
vendors.

~~~
colechristensen
PCI compliance works because of the aligned financial interests of the actors.

Most kinds of compliance rely on legal costs as the ultimate source of
consequences; not so for PCI.

The ultimate costs for failing to comply with PCI are the actual costs of card
fraud which don’t depend on anything in the legal system. When your
regulations are designed and enforced by the entity that actually loses money
when they aren’t followed, motivation lines up and they work better.

~~~
totetsu
Good point. It seems like it might be hard for regulations to be designed and
enforced by the losers of personal medical data leaks.

------
abhisuri97
It's insane that medical data has become so highly valued on the black market
(I guess because of the amount of information stored with a single patient
such as SSNs, maybe billing info, addresses, etc). Yet, it seems that the
healthcare industry is not well prepared to deal with a huge amount of
sensitive data (and doesn't seem to be obfuscating it when sharing with
vendors). Ironically, I feel that the most secure way of doing patient record
keeping is keeping everything physical.

~~~
colechristensen
The healthcare industry is prepared just fine. Regulations exist for how you
can share data with vendors, when you are required to remove identifiable
information, and how to share unredacted information with legal agreements,
audits, etc.

When this doesn’t happen right, there are penalties and notifications. If the
regulations are applied correctly, a breach of this magnitude is quite
possibly a death penalty for the businesses involved, plus criminal
liability.

If it isn’t, then it isn’t the business sector being unprepared or the
regulations being missing but the legal system failing to follow through.

------
kamyarg
Does anyone know of a "shame wall" for people/companies responsible for PII
leaks?

I think that if there are at least social consequences, some managers will
start taking this stuff more seriously.

Just asking; it's probably not even possible with all the blanket "right to
be forgotten" laws.

------
stmw
The significance of medical records - and the importance of maintaining their
privacy - is often overlooked. This is far different from leaked password
hashes or addresses or credit card numbers - it's information that cannot be
changed or effectively de-identified. Any company that accumulates medical
records on that scale must be very careful. I'm biased (as a co-founder of a
major healthcare secure cloud software company), but it is hard work, and
all-too-often neglected work.

------
noahmbarr
It’s my understanding that you can go to jail for HIPAA related breaches.

------
noyesno
Question for devs working in the medical industry: what are the regulations
and standards that you have to follow when storing and handling patient data
in your country/region?

------
cwhiz
Send the executives to prison and shut down the company.

------
yoaviram
If this sort of incompetence bothers you, I suggest sending a CCPA (if you're
in California) or GDPR (EU) deletion request to this company. This will
protect you the next time it happens (and it will), as well as impose a
not-insignificant request-handling cost on them.

Here is a simple way to do this:
[https://yourdigitalrights.org/d/cense.ai](https://yourdigitalrights.org/d/cense.ai)
(disclaimer I'm the co-founder of this free, nonprofit service).

~~~
watermelon0
Minor nitpick: it does not actually guarantee that the data is deleted, but
they could face a very high fine if it turns out that they didn't respect
your request.

~~~
yoaviram
You're right, though I still think it's worth the effort.

------
mlang23
Incidents like these are the reason why I opted out of our e-health record
effort.

------
ycombonator
Cense is an Indian outsourcing outfit.

------
yalogin
This could be somewhat prevented if companies acquiring the data were legally
barred from sharing it with other entities without getting a written
statement from that entity that it has a secure mechanism to store and use
it. Right now, the company that did the shitty job is going to get a slap on
the wrist at best, but the company that shared the data with it is going free
and even gets to keep the money it was paid.

~~~
cabaalis
What you are referring to is a Business Associate Agreement, and is indeed
already required by law.

