
Facebook asks users for nude photos in project to combat revenge porn - trueduke
https://www.theguardian.com/technology/2017/nov/07/facebook-revenge-porn-nude-photos
======
wallace_f
Zuck: Yeah so if you ever need info about anyone at Harvard

Zuck: Just ask.

Zuck: I have over 4,000 emails, pictures, addresses, SNS

[Redacted Friend's Name]: What? How'd you manage that one?

Zuck: People just submitted it.

Zuck: I don't know why.

Zuck: They "trust me"

Zuck: Dumb fucks.

~~~
BenchRouter
Wasn't he like 19 when he said this?

I sure as hell said some edgy stuff like this when I was 19.

~~~
tyingq
~~His actions after it leaked sort of indicate that things haven't changed
much:~~
[https://www.recode.net/2017/1/5/13987714/mark-zuckerberg-fac...](https://www.recode.net/2017/1/5/13987714/mark-zuckerberg-facebook-qa-weekly)

Edit: Ugh. The witch hunt above was unrelated to those IM messages. Apologies
for the misstep.

~~~
ethbro
I mean, his worries then are still worries now. Except instead of arbitrary
actions from a 19-year-old, I face profit-focused actions from a giant ad
company.

Not sure which is better.

~~~
danso
I agree. I don't want to sound like a Zuckerberg apologist, but it's
simplistic to view the Zuckerberg and Facebook of today the way we would a
19-year-old Zuckerberg and his elaborate PHP script.

As a trivial example: In 2005, when Facebook was a non-trivial company,
Zuckerberg [0] guest lectured at a Harvard CS50 class. When asked if Facebook
would contribute to open-source, he said that he didn't foresee it being worth
the trouble (can't find the exact timestamp, so this is all IIRC with a grain
of salt). Now of course, open-source is a substantial part of Facebook. Is it
because Zuckerberg in the following years had a Road to Damascus experience
with Richard Stallman? Maybe, but it's more likely that Facebook evolved into
the type of organization where OSS became a benefit to the bottom line, and it
was a decision made by people lower than Zuckerberg at that point.

Even if Zuckerberg is still as much of a creep as he was in private IM
messages as a 19-year-old, he's no longer the sole captain of his tiny boat.
Him breaking the law means that many people end up getting in legal trouble;
i.e., it doesn't really matter much what he alone thinks is moral when he has
dozens of people/potential whistleblowers looking over his shoulder with
greater moral concerns.

[0]
[https://www.youtube.com/watch?v=xFFs9UgOAlE](https://www.youtube.com/watch?v=xFFs9UgOAlE)

------
alxlaz
Does anyone have a first-hand source detailing this project, and _especially_
its privacy and accountability principles? This is almost surreal to read:

> A community operations analyst will access the image and hash it to prevent
> future instances from being uploaded or shared.

How long before the first community operations analyst begins selling the
images they access? How long before the first one uses it to blackmail an
already terrified victim? And _when_ -- not if, _when_ -- one of these things
happens, what price will Facebook pay? (I mean, with corporate accountability
being what it is, we all know what price Facebook will pay, but it would be
useful to at least have the fine print...)

~~~
mseebach
There are some relatively easy countermeasures you can put in place to avoid
this. Never display identifying information with the images. Never allow
analysts to freely lookup images from specific profiles. Make sure the
terminals that these analysts work on are blocked from internet access, and
have no accessible external drives or USB ports (or run a locked down virtual
desktop on a thin client). Maintain a no-camera policy in the workspace, and a
policy of always having more than one person on duty in the workspace at the
same time.

Considering that we haven't seen any stories of employees abusing backend
access to the Facebook servers, it feels reasonably safe to assume that they
have countermeasures like these in place.

~~~
JoshMnem
That is a disaster waiting to happen.

~~~
mseebach
Not really. This is a well-understood problem with well-understood solutions.
There is no reason to believe that Facebook doesn't have both the resources
and the motivation to get this right.

~~~
JoshMnem
I'm highly skeptical about it. Everyone has motivation to avoid getting hacked
or having employees/contractors do bad things, but it happens anyway.

------
rishabhsagar
Wait, what? I submit my nude pics to FB so that they can hash them and
identify copies online? That sounds good, but is that the only way of doing
this?

Why not use facial recognition plus a nudity detector to find potentially
compromising videos online, on demand? Maybe even send me a Facebook
notification that says, "Hey, we found what appears to be a compromising video
with you in it shared to a wide group of people, are you cool with it?"

Am I misunderstanding the concept being proposed?

~~~
danso
What facial recognition technology do you have in mind that would not face the
burden of frequent false positives (nevermind false negatives) and could scale
to do this efficiently on an on-demand basis?

Then there's the issue of how your proposed system requires FB to show users
the porn that other users are privately viewing.

~~~
lucaspiller
Facebook already uses facial recognition to suggest tags on photos, so I don't
think false positives are that much of an issue:

[https://www.facebook.com/help/122175507864081](https://www.facebook.com/help/122175507864081)

Your other points are true though, and I guess many of these types of photos
don't contain faces anyway.

~~~
danso
I'm not as heavy a FB user as I was 5 years ago, but my impression was that
false positives were at an acceptable threshold, but not nearly a solved
problem. And the face recognition tech was heavily augmented by social data,
such as the fact that a friend and I were both partying at the same place on
the same night and had posted and tagged photos of each other just 3 days
earlier.

That secondary data does not exist reliably in the context of revenge porn
videos that go viral. Nevermind that the penalty of a false positive -- which
would presumably involve exposing the user to a bit of random porn -- is
substantially higher than it ever is with mistagging friend photos.

------
m_eiman
The goal is good, but I'd never under any circumstances entrust photos I don't
want shared with any of the cloud companies. At the very least, the
fingerprinting should be done on-device.

------
amelius
Why not just send the hash?

Anyway, does this mean that I can take down other people's legitimate photos
by submitting them to this service?

~~~
danso
If the system allowed users to send the hash, then yes, it would allow users
the power to arbitrarily take down photos. The point of the users having to
post these images from their own account is so that a FB employee can confirm
that the content breaks FB rules and that the content is associated with the
user.

~~~
zAy0LfpBZLC8mAC
In other words: The solution to preventing strangers from looking at your nude
photos is to have strangers look at your nude photos? Brilliant!

~~~
danso
I've only read the posted article so I don't know if "revenge porn" is the
primary or originating motivation for this initiative (as opposed to child
porn victims, etc).

But my perception is that victims of revenge porn -- the ones who feel the
need to make a complaint (and risk drawing publicity to themselves) -- have
had (or perceive) their images disseminated so widely, to such psychologically
devastating effect [0], that the thought of a FB employee seeing them is among
the least of their worries. Probably because FB employees _have_ already seen
them if the user has submitted an abuse complaint.

[0]
[https://www.theguardian.com/lifeandstyle/2015/feb/06/experie...](https://www.theguardian.com/lifeandstyle/2015/feb/06/experience-i-was-victim-of-revenge-porn)

------
black_puppydog
I just love the idea of putting the very things you want nobody to see into
the hands of the one company that already knows everything else about you.
Better hope you never, ever try to engage in a political campaign that fb
doesn't approve of... m(

------
norswap
Sounds almost like a joke, doesn't it?

I feel like many people won't be super comfortable with Facebook having their
nudes on top of everything else.

------
wlll
I wonder if they're doing anything to take into account tiny modifications
causing a completely different hash, or if they are relying on users not
knowing that changing an image would get around a naive hashing strategy.
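
This is the difference between a cryptographic hash and a perceptual hash. As
a toy sketch (my own illustration; the real PhotoDNA algorithm is not public),
here's an "average hash" that block-averages an image down to 8x8 and
thresholds against the mean. A one-pixel tweak flips the SHA-256 of the raw
bytes completely, but leaves the perceptual hash unchanged:

```python
import hashlib

def average_hash(pixels, size=8):
    """Toy perceptual hash: block-average down to size x size, then threshold
    each cell against the overall mean to get a 64-bit string."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return ''.join('1' if v >= mean else '0' for v in cells)

def hamming(a, b):
    """Number of differing bits between two hash strings."""
    return sum(x != y for x, y in zip(a, b))

# Synthetic 64x64 grayscale "image": a bright square on a dark background.
img = [[200 if 16 <= x < 48 and 16 <= y < 48 else 30
        for x in range(64)] for y in range(64)]

# Slightly altered copy, e.g. a small mark added in one corner.
altered = [row[:] for row in img]
altered[0][0] = 35

# The exact (cryptographic) hashes differ completely...
sha_orig = hashlib.sha256(bytes(v for row in img for v in row)).hexdigest()
sha_alt = hashlib.sha256(bytes(v for row in altered for v in row)).hexdigest()
print(sha_orig != sha_alt)  # True

# ...while the perceptual hashes are still identical.
print(hamming(average_hash(img), average_hash(altered)))  # 0
```

A real system would use something far more robust (PhotoDNA, pHash) and match
within a distance threshold rather than requiring exact equality.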

~~~
andreareina
FTA (emphasis mine):

> The technology was first developed in 2009 by Microsoft, working closely
> with Dartmouth and the National Center for Missing and Exploited Children to
> clamp down on the same images of sexually abused children being circulated
> over and over again on the internet. There was technology that could find
> exact matches of images, but abusers could get around this by slightly
> altering the files – either by changing their size or adding a small mark.

> PhotoDNA’s “hash” matching technology made it possible to identify known
> illegal images _even if someone had altered them_. Facebook, Twitter and
> Google all use the same hash database to identify and remove illegal
> images.

------
jstanley
To prevent your nude photos from getting on Facebook, send your nude photos to
Facebook.

~~~
fredley

        # Hypothetical blackmailer's workflow:
        post = facebook.post(image)
        if post.is_deleted():
           image.is_blackmail = True
           otherplatform.post(image)

------
v4tab
Hey Zuck, are you concerned that your nude photos might be used against you at
some point? Please post them here so we can know to take them down if someone
ever uses them against you. Gee, I guess that doesn't quite make sense, does
it?

------
ryandrake
I've got One Weird Trick that Facebook hates, to keep nude pictures of myself
off the Internet. You're never going to believe it. Ready for it? It's two
easy steps:

1. Don't make nude pictures of yourself or let someone else make one.

If you manage to screw up step #1, then:

2. Don't let other people have those nude pictures or upload them to the
Internet.

That's it! Can you believe that this has actually worked for me for a good 20+
years so far? I know it seems too good to be true, but trust me it totally
works. I should blog about it.

------
DarkCrusader2
Submitting your nude photos to a cloud company -- what could go wrong?

------
notyourday
Facebook? The company that maintains shadow profiles and does not delete
anything that its users "delete"? That Facebook?!

Every time I think I've met Peak Stupid, the Internet proves me wrong.

------
ajaimk
Assuming they are hashing the images or videos and only comparing the hash in
practice, why doesn't Facebook implement a client-side JS method for the hash
generation? The only thing uploaded would be the hash.

Use standard ML algorithms to validate that the image includes nudity and
confirm the hash overlap.

We also need to be hesitant, since people at FB will be able to see these
images (as recently mentioned in another post on HN, where some random folks
ended up viewing a Google Doc shared via Messenger).
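
The client-side idea could be sketched like this (all names hypothetical; the
real PhotoDNA hashes and matching thresholds are not public): the device
computes a perceptual hash locally, only that hash is uploaded, and the server
flags any future upload whose hash falls within a small Hamming distance of a
registered one.

```python
# Hypothetical sketch: only a 64-bit perceptual hash leaves the device,
# and the server matches uploads within a Hamming-distance threshold.

def hamming64(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

class BannedHashRegistry:
    def __init__(self, threshold: int = 10):
        self.threshold = threshold
        self.banned: set[int] = set()

    def register(self, client_hash: int) -> None:
        """Called when a user submits a hash of an image they want blocked."""
        self.banned.add(client_hash)

    def is_blocked(self, upload_hash: int) -> bool:
        """Flag an upload if its hash is near any registered hash."""
        return any(hamming64(upload_hash, b) <= self.threshold
                   for b in self.banned)

registry = BannedHashRegistry()
registry.register(0x8F3A_21C4_9B5E_07D2)  # hash computed on the user's device

print(registry.is_blocked(0x8F3A_21C4_9B5E_07D2))  # True: exact re-upload
print(registry.is_blocked(0x8F3A_21C4_9B5E_07D3))  # True: 1 bit off (slight edit)
print(registry.is_blocked(0x0000_0000_0000_0000))  # False: unrelated image
```

The obvious downside, as others in the thread note, is that accepting raw
hashes from clients would let anyone block arbitrary images, which is
presumably why FB wants a human to verify the submitted content first.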

------
fredley
Another way to frame this: photos are _required_ to be nudes in order to be
blocked from the platform. I'm assuming that there are checks on uploaded
images to see if they are actually nudes before they are added to the dataset
of banned images. Otherwise I could use the system to prevent any photo at all
from being uploaded to Facebook by submitting it to the system. That might
actually be a good thing for me, but not for Facebook.

------
JoshMnem
This doesn't really make sense. Isn't machine learning already good enough to
detect nude photos? If nude photos are not allowed on Facebook, why do they
need hashes of your specific photos? They could identify all nude photos with
machine learning and remove all of them.

Or have a checkbox in the profile settings: Do you permit other people to
upload nude photos of you? [ ] Yes [x] No

~~~
m_eiman
It'll be used for private messages, I assume. I guess the rules aren't as
strict there.

~~~
JoshMnem
They have facial recognition data though. If the uploading user is not the
user in the nude photo, and the user in the nude photo has not checked a box
that says "I permit other users to upload nude photos of me" then block the
upload and flag for review.

~~~
m_eiman
It's hard to identify a person when the set of possible values is "every
person on the planet" and not "one of these few persons". They could detect
whether the uploaded photo looks like one of the uploader's Facebook friends,
I suppose, and that would probably be useful.

What would they do if they detect a possible match, though? Ask the person
they think is in the photo "is this you, are you ok with John Doe sending this
photo to Jane?". That could lead to trouble when the system gives false
positives.

------
Pharylon
Couldn't you just take a screenshot of the original pic to get around this?
The new image would have a completely different hash. Or just resize it. Or do
some color correction. Or a million other ways to alter the picture.

------
danso
As absurd as the idea sounds on its face, it seems well-intentioned, with the
potential to result in a net positive for victims. The main concern is whether
FB is to be trusted. But what's the alternative? Is there a way for the user
to hash the image on their own machine while still allowing FB to discern
whether it qualifies as off-limits FB content? Without FB verification, the
system is likely to become a decentralized and easily-abused version of
YouTube's Content-ID.

edit: Downvotes without suggestions or rebuttals? That's disappointing. I
didn't say that this system is without flaws (or worse, unintended
consequences). If you think the status quo (whatever that status quo is) is
better for the anticipated victims, then let's hear it.

~~~
Santosh83
My solution is for people to be careful with their nude photos, or, if they
weren't, to face the consequences. Either accept it (nude photos have power
over you only as long as you're vulnerable to shame and guilt over them), or
go through the established LE process to get them taken down and the offender
actually punished.

------
incompatible
Duplicate of
[https://news.ycombinator.com/item?id=15648080](https://news.ycombinator.com/item?id=15648080)
and others.

------
milansuk
Why don't they just compute the hash on the user's side?

------
ajcodez
Data mining for VR content generation?

------
ionwake
Got to admire their nerve, though.

------
nelsonic
What could possibly go wrong?

------
thewhitetulip
Complicated solution for a simple problem.

Don't let your partners take photos when you are intimate. End of story.

~~~
hawski
And how is this gonna prevent them from taking your photos via hidden camera?

~~~
falcolas
At that point they are committing a crime with little grey area for them to
hide in, compared to a picture taken with consent.

~~~
hawski
But publishing either is a crime AFAIK. Now it's two crimes.

~~~
falcolas
US: Only in some states. A casual search puts it at around 30 states. Most of
these laws are new and relatively untested, legally. Not something I'd want to
depend upon to protect me.

On the other hand, unauthorized photos taken of you in a place where you'd
expect privacy are covered by a fairly well-explored body of law; basically,
unless you're a celebrity figure, it's a no-no.

IANAL, of course.

