
Google’s Role In Woodland Child Pornography Arrest Raises Privacy Concerns - danso
http://sacramento.cbslocal.com/2013/11/21/googles-role-in-woodland-child-pornography-arrest-raises-privacy-concerns/
======
kanamekun
According to US law, any online service provider that receives actual
knowledge that child porn exists on their services is required by law to
report it to the National Center for Missing and Exploited Children... or face
a fine of $150k for a first offense, or $300k for subsequent offenses:

[http://www.law.cornell.edu/uscode/text/18/2258A](http://www.law.cornell.edu/uscode/text/18/2258A)

To my knowledge, Google wasn't required to check its hosted files for specific
image hashes: "Nothing in this section shall be construed to require an
electronic communication service provider or a remote computing service
provider to— (1) monitor any user, subscriber, or customer of that provider;
(2) monitor the content of any communication of any person described in
paragraph (1); or (3) affirmatively seek facts or circumstances described in
sections (a) and (b)."

But once they did, they were required to report the results. Any privacy
concerns are waived as an "[exception] for disclosure of communications" as
part of 18 USC § 2702: "Voluntary disclosure of customer communications or
records".
[http://www.law.cornell.edu/uscode/text/18/2702](http://www.law.cornell.edu/uscode/text/18/2702)

It's not surprising that Google is affirmatively scanning its servers for
child porn, as they have a long history of providing assistance to NCMEC:

[http://googleblog.blogspot.com/2008/04/building-software-too...](http://googleblog.blogspot.com/2008/04/building-software-tools-to-find-child.html)

[http://googlepublicpolicy.blogspot.com/2012/01/crowdsourcing...](http://googlepublicpolicy.blogspot.com/2012/01/crowdsourcing-to-protect-ncmecs-newly.html)

"$1 Million from Google to Fight Child Sexual Exploitation Received by the
National Center for Missing & Exploited Children"
[http://www.missingkids.com/news/page/4898](http://www.missingkids.com/news/page/4898)

~~~
mtgx
The question is _what else_ are they scanning for, to give to the FBI/police.

I remember Facebook announced a while ago that it would scan Facebook chats
for "criminal activity" and give the info to the police.

[http://news.cnet.com/8301-1023_3-57471570-93/facebook-scans-...](http://news.cnet.com/8301-1023_3-57471570-93/facebook-scans-chats-and-posts-for-criminal-activity/)

Where is the line drawn? And will we know where it's drawn? Stuff like this
makes me trust "cloud computing" less and less. I hope Google and others
realize this, before it's too late (for them) to change course.

I can take a guess where this is going, though, if NSA's actions are any
indication - there won't be a line anymore, if they (government/corporations)
are left alone to continue with stuff like this. Every single crime, no matter
how small and insignificant, will be reported to the police, thanks to the
mass surveillance capabilities of the alliance between NSA/government and
corporations. After all, when "safety" is the ultimate goal, above all else,
that outcome is all but inevitable.

It's already started:

[http://falkvinge.net/2013/11/24/nsa-mass-surveillance-has-al...](http://falkvinge.net/2013/11/24/nsa-mass-surveillance-has-already-been-used-for-ordinary-police-work/)

~~~
alfiejohn_
This isn't where the line was crossed. Far from it. Even though eavesdropping
on the everyday citizen is a pretty damning thing to do, in their eyes it
isn't illegal so they can't understand why everyone is complaining.

The line was crossed a long time ago, when the Law was treated as an opinion.
This cultivated a "let's see what we can get away with" attitude which is now
starting to unfold on the world stage.

For more info, see:

    https://en.wikipedia.org/wiki/John_Yoo

------
Houshalter
I don't know how I feel about this. Merely looking at CP doesn't actually hurt
anyone, and pedophiles are not necessarily attracted to what they are by
choice. But there is an argument that its existence encourages or can lead to
child abuse in the real world, especially the production of more CP.

This seems to be demonstrated by this case. They caught the guy through CP but
it turns out he actually abused a kid. Maybe he wouldn't have been caught
otherwise.

On the other hand is punishing them and throwing them in jail for years
actually helpful? As opposed to a lesser sentence or rehabilitation of some
kind maybe?

Another problem is that it's incredibly easy to frame people for. Just put
some CP on someone's computer or Google account and you can have them sent off
to jail with little effort and no way for them to prove they are innocent.

~~~
samstave
_Child porn_ should not exist for people to look at.

I take great issue with your first sentence.

Who the hell do you think was harmed in the creation of said CP? If you are in
any way sympathetic to those who would "harmlessly" watch it, then you are
deluded as to the reality of such a situation.

Seriously, take a moment to think about what you said and try to not pretend
to be so "academic" about your position.

I'll assume you have no children.

~~~
Houshalter
I take issue with its creation, but once it exists, looking at it is a
victimless crime whether or not you find the act objectionable.

Which is another reason to be skeptical of the laws: the main reason they
exist is that people find it objectionable, not any rational argument about
what actually prevents child abuse.

~~~
marquis
>looking at it is a victimless crime

It creates a market. Is there no empathy for the subject of the photo? Would
you like your 6-year-old son or daughter to be looked at with a sexual gaze?

~~~
greenyoda
It would only create a market if the recipient paid for the porn or exchanged
some other porn for it. I agree that participating in a market for child porn
should be a criminal act. However, I don't see how merely possessing an image
that you happened to find on the web can create a market (or cause harm to the
person depicted in the image).

------
pstuart
This is a tricky subject because all of us (excluding the pedophiles) can
agree that sexual abuse of children in all forms is _a bad thing_.

That said, the erosion of privacy in the name of "think of the children!" is
equally _a bad thing_.

In this case it appears to be a legitimate case of a sexual predator sharing
his digital trophies and rightfully being caught.

But "protecting the children" doesn't have to mean giving up our expectations
of privacy in the normal course of affairs.

~~~
nawitus
>This is a tricky subject because all of us (excluding the pedophiles) can
agree that sexual abuse of children in all forms is a bad thing.

More like society doesn't allow discussion of these topics. Any opinion
which is not perfectly in line with the "official truth" is unacceptable, so
there's no point in talking about it.

~~~
arrrg
So, uhm, you think there are large numbers of people who think that sexual
abuse of children in all forms is a good thing? And that they don’t dare
speak?

I sure hope not! Or what was your point again?

~~~
nawitus
The point was that no points can be made.

------
_nullandnull_
Law enforcement has been using the hashes of known child pornography images
to find pedophiles for probably a decade. This technique is no different from
what anti-virus vendors do to detect malicious software. I'm starting to hate
Google in light of their new campaign against privacy and anonymity, but I
don't see the privacy concerns here. Hashing images server-side and comparing
them to a database isn't exactly handing metadata to the NSA. I think it's
good that Google is doing this. Odds are more image hosting providers should
be doing it. It's low-hanging fruit, but at least it could protect a child.
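The server-side hash check described above can be sketched in a few lines. This is a toy illustration: the hash set below is a stand-in for the NCMEC-supplied database (its one demo entry is just the MD5 of the string "hello"), and the function names are made up.

```python
import hashlib

# Stand-in for the database of hashes of known illegal images
# (in practice supplied by NCMEC, not built by the provider).
# The demo entry is simply MD5("hello").
KNOWN_BAD_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",
}

def file_md5(data: bytes) -> str:
    """Return the MD5 hex digest of an uploaded file's bytes."""
    return hashlib.md5(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """Check an upload against the known-hash set."""
    return file_md5(data) in KNOWN_BAD_HASHES

print(is_known_match(b"hello"))  # True
print(is_known_match(b"other"))  # False
```

Note that exact hashing fails on any re-encoded, resized, or cropped copy, which is presumably why fuzzy systems like PhotoDNA exist alongside it.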

~~~
decasteve
Is a matching hash enough to get a warrant, or is a warrant even necessary? Or
is it probable cause to raid the user's house at that point? I'm curious to
know the process once a matching image fingerprint is found.

Or will we soon be hearing of a police raid on a person's house because of an
errant MD5 collision?

------
pwnna
Slightly offtopic:

"Microsoft also developed Photo DNA to match pictures using just a pixel."

What. So if I give Microsoft a pixel with value #F00 it can determine if
that's from a specific photo?

Great reporting with great insights and sources.

~~~
Houshalter
Yeah, that's ridiculous, but it would be hypothetically possible if you had a
sufficiently large color depth. Then the chance of two images having a pixel
with exactly the same R, G, and B values would be vanishingly small.

You could possibly do this by averaging all the pixels together, for example.
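To make the weakness of that idea concrete, here is a toy sketch of the averaging approach (purely illustrative, and nothing like what PhotoDNA actually does): two visibly different images can collapse to the same average pixel.

```python
def average_pixel(pixels):
    """Collapse an image, given as a list of (r, g, b) tuples,
    into a single average pixel."""
    n = len(pixels)
    return tuple(round(sum(p[c] for p in pixels) / n) for c in range(3))

# Two different two-pixel "images" with identical averages:
img_a = [(255, 0, 0), (0, 0, 0)]
img_b = [(128, 0, 0), (127, 0, 0)]
print(average_pixel(img_a))  # (128, 0, 0)
print(average_pixel(img_b))  # (128, 0, 0) -- a collision
```

With 24-bit color there are only ~16.7 million possible average pixels, so collisions among billions of images are guaranteed; a single pixel is far too little information to be a fingerprint.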

------
mtgx
“The state must declare the child to be the most precious treasure of the
people. As long as the government is perceived as working for the benefit of
the children, the people will happily endure almost any curtailment of liberty
and almost any deprivation.” -Mein Kampf, Adolf Hitler, Publ. Houghton
Mifflin, 1943, Page 403

------
sitharus
There's a mild concern here, but it does rely on two things: having a known
image, and having it uploaded to a web service. Judging by the information in
the article, an original image wouldn't be matched, and it is in the ToS (good
luck reading all of that).

The organised groups that make money off this are much wiser and harder to
catch.

More interesting is the quote "Microsoft also developed Photo DNA to match
pictures using just a pixel". I'm not sure it works quite how they suggest.

------
increment_i
Considering this person uploaded explicit pictures of children to a cloud
server, is anyone actually surprised that the FBI showed up at his door soon
after?

In addition to what federal and international law says on the matter,
basically every web service on the face of the earth explicitly forbids these
kinds of images in their terms of service. Once he hit the upload button, this
guy's 'privacy concerns' were kaput.

------
wdvh
The least you should expect when you upload something onto a cloud service is
that your data will be subject to all kinds of algorithmic analysis.

And it's just not tenable for Google to not do this kind of matching. Imagine
the flak they would get if they refused to match against a database of child
porn citing "privacy concerns." Such a refusal would be neither pragmatic nor
ethical.

------
sigzero
The Picasa policy spells out that they don't allow those kinds of things. I
would assume one way they would check would be some kind of algorithm. I see
"no evil" here on Google's part.

~~~
venomsnake
Yeah. But a companion question should also be: if they use a potent
technology for something I approve of (and we all approve of that use), can it
be (ab)used for something we dislike?

~~~
Karunamon
Sure, that's only reasonable. But you're at that point only one step away from
a slippery slope argument. (How meta..)

Problem is it's not always a valid question given the context. Automagically
scanning images with an algorithm to find known sexual abuse pictures is in a
completely different world than scanning images with an algorithm to find
copyrighted content, for instance.

------
khawkins
This doesn't make me significantly worry about my privacy using Google
services. Google knows that it needs to maintain trust with its users and
that's why it's only tackling a crime which has little controversy with
respect to its heinousness and is almost trivial to detect, from a systems
perspective. I'd far rather Google show good faith and help fight crime on
their terms, rather than the government come in and start imposing safety
regulations and make it obligatory.

------
dreamdu5t
Seems incredibly easy to frame someone for child porn.

------
kmfrk
From how I read it: Google matches all images against a database of file hashes.

I wonder whether they do it for redundancy purposes like Dropbox, or whether
it's entirely to crack down on child pornography.

~~~
duskwuff
See:
[http://www.missingkids.com/Exploitation/Industry](http://www.missingkids.com/Exploitation/Industry)

There is apparently both a database of MD5 hashes of "known bad" image files,
as well as a "PhotoDNA" system for fuzzy image matching.
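PhotoDNA itself is proprietary, but the flavor of fuzzy matching can be illustrated with a difference hash (dHash), a well-known perceptual hashing technique: small edits that completely change an MD5 leave the perceptual hash nearly unchanged. A toy sketch, not Microsoft's algorithm (real implementations first shrink the image to something like 9x8 grayscale):

```python
def dhash_bits(gray, width, height):
    """Difference hash: one bit per horizontal neighbor comparison
    on a row-major grayscale image of size width x height."""
    return [
        1 if gray[y * width + x] > gray[y * width + x + 1] else 0
        for y in range(height)
        for x in range(width - 1)
    ]

def hamming(a, b):
    """Count of differing bits; a small distance means a likely match."""
    return sum(x != y for x, y in zip(a, b))

img     = [10, 20, 30, 40, 40, 30, 20, 10, 5, 5, 5, 5]   # 4x3 toy image
tweaked = [12, 22, 32, 42, 40, 30, 20, 10, 5, 5, 5, 5]   # brightness-edited copy
print(hamming(dhash_bits(img, 4, 3), dhash_bits(tweaked, 4, 3)))  # 0
```

The edited copy's bytes differ, so its MD5 would differ entirely, yet its difference hash is identical: matching on hash distance below a threshold is what lets these systems survive re-encoding and minor edits.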

------
nemof
Wow, there are some fucking dire opinions being expressed in this thread. How
fucking removed from reality are some of you?

