
Google DeepMind's first deal with the NHS was illegal, UK data regulator rules
http://www.businessinsider.com/ico-deepmind-first-nhs-deal-illegal-2017-6
======
jostmey
As someone in the biomedical sciences in the United States, I wish the barrier
to accessing patient information were much lower. The pendulum has swung too
far in the other direction--toward too much patient protection at the expense
of medical progress.

Where I work, we've spent nearly 2 years trying to get approval to collect
samples (with patient consent) that are being thrown away. No one on the
ethics board wants to sign off on it (maybe because they are afraid of being
blamed if something bad happens?). We agreed to pay ridiculous fees to de-
identify samples and are crossing our fingers that we will finally get the
samples.

I am glad to see that Google got access to this data and is not being
punished. There has to be balance so that medical science can move forward at
a reasonable rate.

~~~
k-mcgrady
>> There has to be balance so that medical science can move forward at a
reasonable rate.

No there doesn't. If I don't want my personal information being provided to
Google by the NHS it shouldn't happen. In a country where private healthcare
isn't an option for most, the only option (the NHS) shouldn't be giving
personal data to private, foreign companies, in breach of the law. 'It'll make
my job easier' is awful reasoning for violating privacy and the law.

~~~
beisner
I think I instinctively agree with your argument, although the particulars of
this topic have made me wonder a bit about absolute privacy in the human
health space. Specifically, we generally consider privacy not to be an
absolute right, but to be one moderated by the rights of the public (i.e. if
there is reasonable suspicion that you have committed a crime, police are
allowed to violate your privacy and conduct searches with a warrant). I wonder
at what point this might extend to issues of medical research/public health.
We already know that (in the US at least) cases of certain illnesses must be
reported to state/federal authorities, which to some degree violates the
privacy of the patient. The selection of what sorts of data are shared is
somewhat arbitrary; where is the right place to draw the line? If increasing
ease of access (and perhaps violating privacy at some level) to millions of
health records could bring about the discovery of more-effective predictors
for heart disease 10 years faster than without this access, should this trade-
off be made? I'm not saying it should, but it makes me wonder where the line
should be drawn.

I also recognize that this might be a false binary, that streamlining the
process might be orthogonal to maintaining privacy, but for the sake of
argument let's assume they are not totally orthogonal.

~~~
k-mcgrady
I agree balancing rights is important. I would argue that the link between
more data and curing a disease is too weak to swing the scales. Another major
issue is that you'd be violating rights on a massive scale. It would be very
easy to make a similar argument that, for example, recording everyone's
communications would stop or significantly reduce crime.

I also don't think it's a debate we really need. More and more people are
wearing health tracking devices. There have been research studies done through
mobile apps. People are tracking health data themselves more and more and a
lot of those people are going to be willing to share it. Let it happen
naturally instead of opening the floodgates and leaving people without a
choice.

~~~
entee
Re. the link between more data and curing disease being weak: I'd argue that's
clearly false, at the very least regarding how we evaluate disease treatments.

Imagine being able to have 10x more patients for a drug effectiveness study,
or being able to look at the entire patient population for adverse drug
reactions/interactions. All the datasets we have for this sort of thing are
currently pretty poor. Despite this, the FDA mines datasets like the Adverse
Events Reports database [1] to identify drugs that are performing poorly. With
better, more complete data, the FDA could far more quickly identify adverse
drug interactions and drugs that are harming patients more than they are
helping.

Medicine is inherently about rare events. Most of us are mostly healthy most
of the time. More data by definition helps you find those rare events when
someone is sick or at risk. As we start to ask more difficult questions about
co-occurring events, cancer detection, or genome associations, our statistical
tools become powerless to find the weak signals in the data. The only real
solution is vastly more data.

[1]
[https://www.fda.gov/drugs/guidancecomplianceregulatoryinform...](https://www.fda.gov/drugs/guidancecomplianceregulatoryinformation/surveillance/adversedrugeffects/ucm082193.htm)
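The rare-events point can be made concrete with a little arithmetic (an illustrative sketch, not from the comment itself): if an adverse reaction hits 1 in 10,000 patients, the chance that a study sees even one case grows sharply with sample size.

```python
# Probability of observing at least one case of a rare adverse event,
# assuming each patient independently carries the same per-patient risk.
def p_at_least_one(rate: float, n_patients: int) -> float:
    """P(>= 1 event) among n_patients, each with probability `rate`."""
    return 1.0 - (1.0 - rate) ** n_patients

rate = 1e-4  # a 1-in-10,000 adverse reaction
for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} patients -> P(see it) = {p_at_least_one(rate, n):.3f}")
```

With 1,000 patients the event is seen less than 10% of the time; with 100,000 patients it is all but guaranteed, which is why population-scale datasets matter so much for pharmacovigilance.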

------
Eridrus
The outrage people feel about this sort of thing always feels misplaced, since
it assumes that their data is currently being safeguarded properly and that
DeepMind/Google having it would somehow lead to worse outcomes.

Hospitals routinely get hacked by the most bottom of the barrel mass malware.
I doubt small practices are any better, though I guess they may still have
paper files.

I had a work-experience gig for a week when I was ~14 in a hospital, and in an
effort to keep me occupied they gave me an Access database on 50k people,
including things like HIV status, to mess around with.

~~~
lumberjack
I think there is a difference between having your medical data accessed by
some Bulgarian cyber mafia and having it accessed by Google.

Maybe I'm crazy, but I think my data in the hands of Google would be more
harmful to me. I mean, next thing you know, my health insurer in some not very
developed country that I have to move to for work finds out about my past
medical history and raises my premium and deductible. I find it more plausible
that they would get the information from Google than from some shady Bulgarian
cyber mafia.

~~~
mtgx
> I think there is a difference between having your medical data accessed by
> some Bulgarian cyber mafia and having it accessed by Google.

Not if the Bulgarian cyber mafia hacks Google to steal that data. Okay, say
Google has super-ultra security and has never been hacked (a false statement),
but even so, how many companies have Google's security? I think you can barely
count them on one hand. I completely expect drug cartels, for instance, to
hack into organizations holding user data any time they want. If Google is
easily allowed access to this data, then a dozen other companies with much
weaker security will also be allowed access to it.

------
peoplewindow
The UK ICO is one of the better regulators out there, and it seems the
punishment (nothing) is proportionate to the actual harm done here (apparently
none). But this is still a sad outcome.

The message that tech workers will take away from this sort of thing is "don't
try to work with the NHS or do anything innovative in healthcare". DeepMind
didn't appear to have any actual business plan in mind for Streams - all the
stuff I read from them on this project implied it was driven simply by a
desire to help the UK's national treasure/obsession. Being owned by Google
does let important execs do pet projects like this so I don't see any need for
conspiracy theories here.

Both the ICO and the NHS are part of the government, and yet they reached
different decisions about the same laws. I expect DeepMind had their own legal
advice which also concluded it was OK. It is unlikely any other company or
startup could get a hard-and-fast, 100% guaranteed correct deal with the NHS
to do things with patient data.

It's doubly unfortunate because the NHS has deeply dysfunctional IT and
_badly_ needs the assistance of data specialists. It faces huge challenges to
become more efficient and shows little ability to do so. Healthcare records
are still passed around on paper, the biggest IT failure in history was an
attempt to digitise the NHS, medical data leaks all over the place and is
guarded far more loosely than Google would ever consider, many genetic
diseases are fairly rare and need a lot of data analysis to make progress on,
and so on. Healthcare and the tech industry _should_ be a match made in
heaven. Instead the very, very few attempts to dance together inevitably seem
to end badly.

~~~
DanBC
> The message that tech workers will take away from this sort of thing is
> "don't try to work with the NHS or do anything innovative in healthcare"

They shouldn't, because ICO has been clear that this decision should not deter
people from this type of project, and that the modifications needed are
relatively minor.

[https://iconewsblog.wordpress.com/2017/07/03/four-lessons-
nh...](https://iconewsblog.wordpress.com/2017/07/03/four-lessons-nhs-trusts-
can-learn-from-the-royal-free-case/)

[https://ico.org.uk/about-the-ico/news-and-events/news-and-
bl...](https://ico.org.uk/about-the-ico/news-and-events/news-and-
blogs/2017/07/royal-free-google-deepmind-trial-failed-to-comply-with-data-
protection-law/)

> and the NHS

It wasn't "The NHS" (eg NHS England) but a single hospital trust. Many other
NHS trusts and Caldicott Guardians expressed concern.

------
Jyaif
This is the kind of limitation that I imagine does not exist in China, and
that will allow/is allowing them to leapfrog the western world in terms of
healthcare and science.

Of course, in a hundred years or so China will itself be paralyzed by
regulations and some other part of the world will take over.

~~~
mtgx
On the other hand, China is using that data to develop a "perfect citizen"
score system and use AI to censor, spy on, and deny citizens services for "bad
behavior" (like say criticizing the president - but you would never do that in
the U.S. anyway, _right_?!)

So I'm not sure we _want_ to go in that direction anyway...

[http://www.ibtimes.com/china-use-big-data-rate-citizens-
new-...](http://www.ibtimes.com/china-use-big-data-rate-citizens-new-social-
credit-system-1898711)

[http://boingboing.net/2017/02/16/chinas-citizen-scores-
us.ht...](http://boingboing.net/2017/02/16/chinas-citizen-scores-us.html)

------
rscho
I am in the healthcare industry (public services) and I actually agree with
both sides of the argument. Freeing healthcare data from its current
ethical/legal shackles would definitely allow for great things in science. It
is just as true that it would destroy privacy at the population level. There
is no doubt that the data would also be put to evil ends, restricting
everyone's right to self-determination (destabilizing politicians,
discrimination in hiring, etc.), so the question is simply "do we, as the whole population, want
to take that risk?". Some will say yes, others no. This should be subject to a
vote at the national level. Progress has often impinged on individual freedom
in the past, and I am pretty confident we will solve this the usual way:
without asking the public's opinion.

~~~
benchaney
>"do we, as the whole population, want to take that risk?". Some will say yes,
others no. This should be subject to a vote at the national level.

No it shouldn't. It should be a personal decision for each individual.

------
mattcoles
> While the ICO found the deal to be illegal, it has no plans to punish the
> Royal Free or DeepMind.

What the hell? Why do we let Google get away with this?

~~~
patrickaljord
> What the hell? Why do we let Google get away with this?

So Google signed a deal with a government agency that was approved by said
agency and they should be fined for that? Wow.

~~~
detaro
Both sides in a contract are responsible for ensuring it complies with the
law, yes. (One-sided punishment would be unfair, though, so I think in this
case requiring proper follow-up and doing better next time is a good
resolution, unless it caused actual damage.)

------
kiasta
I'm curious what would happen if this breach occurred in the US. We have severe
penalties for these kinds of breaches outlined in HIPAA. I work in the
healthcare industry as a software developer and know full well the costs of a
single breach for the company. The thing about it is, there doesn't even have
to be a breach per se. There just needs to be the potential for a breach for
it to count. For instance, someone steals a laptop that has remote
access to hospital records. That is a breach. Not because data was stolen, but
because of the potential for data to be stolen. I don't know what kinds of
privacy laws you have in the UK, but if they are even remotely similar to ours
here in the US, I will gladly watch Google burn for this.

Edit: Also, the fact that people are actually OK with giving a corporate
entity free access to our medical records is appalling. Especially on a
website such as this. I can't believe people are actually defending this.

~~~
DanBC
The ICO site lists plenty of fines handed out for that type of lost data.

HEALTH: [https://ico.org.uk/action-weve-
taken/enforcement/?facet_type...](https://ico.org.uk/action-weve-
taken/enforcement/?facet_type=&facet_sector=Health&facet_date=&date_from=&date_to=)

LOCAL GOVERNMENT: [https://ico.org.uk/action-weve-
taken/enforcement/?facet_type...](https://ico.org.uk/action-weve-
taken/enforcement/?facet_type=&facet_sector=Local+government&facet_date=&date_from=&date_to=)

------
ocdtrekkie
This was one of those things that truly blew my mind: That the hospital could
hand confidential patient data over to Google without their consent. My
assumption was that it was just an example of the U.K.'s weaker privacy laws,
but I'm glad to see that isn't the case.

~~~
7Z7
>U.K.'s weaker privacy laws

Weaker than whose?

------
mtgx
> “We were almost exclusively focused on building tools that nurses and
> doctors wanted, and thought of our work as technology for clinicians rather
> than something that needed to be accountable to and shaped by patients, the
> public and the NHS as a whole. We got that wrong, and we need to do better.”

Oh wow. How can any Google executive still be ignorant about the privacy
implications of its data mining and tracking technologies? That seems
unbelievable to me.

[https://www.theguardian.com/technology/2017/jul/03/google-
de...](https://www.theguardian.com/technology/2017/jul/03/google-
deepmind-16m-patient-royal-free-deal-data-protection-act)

With such a mentality, people should really question just how seriously
Google takes the privacy of their email, Drive docs, and so on, if they don't
even _consider_ the privacy implications of using your data when building
their data mining tools.

------
taivokasper
Who gave the UK regulator the authority to determine whether this data is OK
to share or not!?

~~~
DanBC
The ICO operates under a framework of legislation.

The list of legislation is here: [https://ico.org.uk/about-the-ico/what-we-
do/](https://ico.org.uk/about-the-ico/what-we-do/)

------
plesiv
Is there any research done on attempting to use crypto for some smart scheme
of anonymizing patient data bundles to:

\- allow for cryptographically safe usage of the bundles for research (with
some limitations?)

\- make exact mapping of the health data bundle to a particular person
impractical?

Considering the miracles of public-key cryptography and blockchains, it would
seem to me that there could be a scheme that would allow for something like
this.
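One building block that existing anonymization research does use is keyed pseudonymization: a trusted custodian (say, the hospital) replaces direct identifiers with an HMAC tag, so researchers can link records belonging to the same patient without the key ever leaving the custodian. A minimal sketch in Python, where the key and identifier format are made up for illustration:

```python
import hashlib
import hmac

# Hypothetical secret held only by the data custodian (e.g. the hospital).
SECRET_KEY = b"replace-with-a-real-random-key"

def pseudonym(patient_id: str) -> str:
    """Deterministic keyed tag: links records for the same patient,
    but without the key it can't be reversed to the identifier."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

# Researchers would see only the tag, never the raw identifier.
record = {"patient": pseudonym("NHS-123-456"), "diagnosis": "AKI"}
```

This only hides direct identifiers, though; mapping a data bundle back to a person via quasi-identifiers (age, postcode, rare diagnoses) is the harder problem, which is what techniques like k-anonymity and differential privacy try to address.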

------
shif
Can't they fine strawberrynet for its blatant disregard for privacy? It's
based in the UK, you can access the personal information of anyone registered
on the site with only their email address, and they say it's not a big deal.

------
vonnik
People tell me the irony is that the NHS data was crap anyway. Too messy to
work with.

------
unwind
Admins: please fix the typo in the title, the second word should be
"regulator" not "regualtor". It's spelled correctly in the actual article
title, so probably just a typo in the submission. Thanks.

