
Facebook criticised over handling of reports of child exploitation content - ComputerGuru
http://www.bbc.com/news/technology-39187929
======
camtarn
"The BBC first asked Facebook for an interview about its moderation system in
late 2015, and repeated the request following this follow-up investigation.
The social network's director of policy Simon Milner agreed to be interviewed
last week, on condition the BBC provided examples of the material that it had
reported, but had not been removed by moderators. The BBC did so, but was
reported to the UK's National Crime Agency as a consequence."

That's either Kafka-esque levels of bureaucratic crazy, or a very well-
concocted trap...

~~~
gggy
I always thought that the very notion that information can be illegal is a
contradictory premise that logically leads to crazy conclusions.

~~~
matthewmacleod
I don't think information is generally ever illegal, but rather the
distribution of it is.

~~~
dublinben
In this case, mere possession of this information is a major crime. No
evidence or proof of distribution is necessary. In the US at least, every
distinct image is its own violation as well, so you can easily end up with
potential sentences reaching the thousands of years.

~~~
cr0sh
Let's say someone sent you a file, that was encrypted in such a manner
(4096-bit RSA, maybe?) that you had no way - indeed, no entity on the planet
had the means - to decrypt it.

Yet - unbeknownst to you - that file contained something so illegal, it would
cost you your life if you knew about it.

Is possession of that file still illegal? Or does it only become illegal when
you have knowledge of its contents? Does having that knowledge then entail a
potential for "thoughtcrime"? What if you (somehow) imagined the exact
information that was contained in that file - before seeing the file? You now
have the knowledge, but you never saw the information - is it now illegal? Or
- what if you created a very large number that matched the unencrypted value
of the file? Does possession of that number become illegal?

Illegal enough to forfeit your life?

~~~
sathackr
I am not a lawyer and you should trust what I say as much as you would trust
graffiti scribbled on a wall in the bathroom.

That being said,

In Florida, it is treated much like drugs.

Most cases are constructive possession, and in such cases the prosecution has
to show:

1. You had knowledge of the presence of the contraband.

2. You had knowledge of the contraband's illicit nature.

3. You had the ability to exercise dominion and control over the contraband.

I read a case in which a person had, on their person, an over-the-counter
Tylenol bottle. Inside the bottle was Tylenol w/codeine, for which they did
not have a prescription. They were convicted of possession of a controlled
substance. Their conviction was overturned on appeal because there was no
evidence showing that they knew the substance in the bottle was actually
Tylenol w/codeine as opposed to just Tylenol.

Proving those three is a giant rabbit hole. Researching this was extremely
eye-opening for me, and if you live in a similar state, I would suggest you
research it as well.

In your example, if there was no evidence that you knew what the encrypted
file contained, and no evidence that you knew the password/key to the file,
then only the first bar would be met -- knowledge of its presence, and thus
your case could never even go to trial -- even an okay lawyer would get it
dismissed for lack of evidence.

Now that 'evidence' word is very squirrelly. If a detective _says_ you told
him you knew what was in the file, that counts and the case goes to trial and
a jury gets to decide whether you actually knew what was there. With drug
cases, they might listen, weigh the facts, and be sympathetic. With a child
porn case, not so much. Yes, the detective would be perjuring himself if you
did not actually say that, but how do you prove that? That's one of the
reasons any lawyer will tell you NEVER TALK TO THE POLICE. If you exercise
your fifth amendment and do not talk to them, they can't put words in your
mouth.

A site with a ton of information on this is
[http://www.ejdirga.com/criminal_offenses/drugs/florida-drug-case-law/drug-case-law-constructive-possession.html](http://www.ejdirga.com/criminal_offenses/drugs/florida-drug-case-law/drug-case-law-constructive-possession.html)

Edit: add IANAL disclaimer at the top

------
tristor
"Facebook's rules forbid convicted sex offenders from having accounts.

But the BBC found five convicted paedophiles with profiles, and reported them
to Facebook via its own system. None of them were taken down."

This tidbit in the article didn't so much surprise me as pop out as something
that seems unreasonable upon consideration, even if it looks like a good idea
on its face.

The key point for me is that recidivism rates are greatly affected by
successful reintegration into the community after serving time, and sex
offenders already have the most difficulty integrating. While there are
certainly reasonable concerns about the risk of these people accessing
Facebook, by the company's own admission it's a new type of influential
global community. By banning sex offenders outright, they're being othered in
a way that is a significant impediment to being reintegrated into society.

That aside, at least in the US, you can end up a sex offender for life for
things as innocuous as stopping on the side of a long stretch of highway to
empty your bladder. There's also no unified worldwide or national database of
sex offenders to check against, nor are there uniform standards between states
and nations on what qualifies. This means it's both difficult for Facebook to
enforce this rule at all and practically impossible to do so in a way which
would only apply to people convicted of offenses we can agree qualify
(molestation, rape, sexual assault).

I find it extremely odd that Facebook would willingly put itself in the
position to police accounts this way and that there'd be public support for
the idea of banning people outright without any basis in their activity on
Facebook.

~~~
aaron-lebo
 _While there's certainly reasonable concerns around the risk of these people
accessing Facebook, by the admission of the company itself it's a new type of
influential global community. By banning sex offenders outright they're being
othered in a way which is a significant impediment to being reintegrated into
society._

What do you mean? What does Facebook's business speak about being a new global
community actually mean? People aren't being "othered" because they can't have
Facebook accounts. Having a Facebook account has nothing to do with getting
back into society - lots of people abstain from Facebook voluntarily. On the
other hand, the access a predator would have to children is a very real
danger, and presumably a real risk for someone who has problems in that area.

 _That aside, at least in the US, you can end up a sex offender for life for
things as innocuous as stopping on the side of a long stretch of highway to
empty your bladder. There's also no unified worldwide or national database of
sex offenders to check against, nor are their uniform standards between states
and nations on what qualifies. This means it's both difficult for Facebook to
enforce this rule at all and practically impossible to do so in a way which
would only apply to people convicted of offenses we can agree qualify
(molestation, rape, sexual assault)._

Do you have any stats for this? This gets thrown up from time to time, but how
many people are sex offenders because they took a piss on the side of the
highway?

 _This means it's both difficult for Facebook to enforce this rule at all and
practically impossible to do so in a way which would only apply to people
convicted of offenses we can agree qualify (molestation, rape, sexual
assault)._

This information is readily available. Presumably Facebook can run the same
check the BBC did. If they've been unfairly convicted that's not Facebook's
issue, it is the courts, but Facebook does a have a very simple qualifier in
most places.

Sorry if I come off as a little aggressive... it's just that in many of these
cases there are real victims, and they aren't the sex offenders - they're the
innocent children (or others) whose lives have been destroyed. There are cases
where people are accused unfairly, but most cases are not that way (it is a
major charge that is taken seriously). Sometimes people do things which are
destructive to others, and there's only so much empathy when rights are
restricted as a result.

Just don't personally feel that sorry for a sex offender who can't get on
Facebook. If they've been wrongly convicted, they've got bigger problems; if
not, then it's a privilege, not a right.

~~~
ue_
>Just don't personally feel that sorry for a sex offender who can't get on
Facebook.

How do you feel about other criminals who have been convicted of a crime, then
let out? Sure, Facebook isn't really the issue here, but why are they facing
impediment to being included in society? Are murderers OK to do it? How about
people who steal things?

Facebook's policy is clear; the question is whether it's a just policy,
especially with their aim of building a 'global community' and the almost
necessity of having a social life that has some online component nowadays in
order to lead a happy life.

Facebook on its own is no big issue, but if other providers follow suit, I
can't help but think it is unjust or at least unfair.

~~~
aaron-lebo
I think it's very debatable whether people having online lives is beneficial
to them or society.

People lived for tens of thousands of years without the Internet and were
happy doing so. I don't see why it is unfair that someone who has taken
someone's innocence (and perhaps their ability to have a happy life) has
certain privileges taken away.

~~~
ue_
>someone who has taken someone's innocence (and perhaps their ability to have
a happy life) has certain privileges taken away.

They should have already been served the 'punishment' or retribution within
prison, and hopefully have been reformed. Why do you still want to punish
them? Do you not think the punishment is adequate?

One of the aims of the prison system is rehabilitation into society. You're
actively trying to stop that out of revenge?

------
todayiamme
One of the more Orwellian moments I've experienced on Facebook happened when
it didn't allow me to send an image link to my sister in a private chat
because the string had sex written in it.

We're both adults and have been using Messenger to talk to each other every
day for a few years now, but nope, Facebook wouldn't relent. It even froze my
ability to message her from my smartphone app for a short time.

This amuses me more than it annoys me, because someone at Facebook decided
that it's a good idea for a messaging product to censor private conversations
between adults. What on Earth led them to believe that my messaging app should
be my nanny?

 _EDIT:_ Here's a screenshot of the dialog:
[http://oi65.tinypic.com/33wt9br.jpg](http://oi65.tinypic.com/33wt9br.jpg)

~~~
MaxfordAndSons
That's bonkers. When was this, and what country or countries were you both in?

FWIW though, it's also possible she unknowingly had some sort of anti-
obscenity filter turned on in her settings.

~~~
todayiamme
She's an American citizen and was in the US at the time. I was in Cambodia.

Nope, pretty sure that wasn't the case. We aren't aware of any such setting.
What's really funny is that the context wasn't sexual at all. It was a medical
conversation with a family member. I'd gotten the HPV vaccine earlier in the
day and we were talking about it.

The image in question was from an online medical resource, and it didn't flag
the image itself as I ended up sharing it directly with her anyway. I think it
got flagged because the string had the letters sex in it. Or, if someone else
had flagged it for some reason elsewhere... Who knows?

------
6stringmerc
Wow, this is quite an ugly story for Facebook, especially hot on the heels of
Zuckerberg championing the platform as the key to building global community.
I'll stand by my assertion that not all communities are good ones worth
supporting. Thus, it puts a lot more responsibility on Facebook to arbitrate
than I generally feel is logical in a capitalistic system. There are flaws.
Like this for instance:

> _When provided with examples of the images, Facebook reported the BBC
> journalists involved to the police and cancelled plans for an interview. It
> subsequently issued a statement: "It is against the law for anyone to
> distribute images of child exploitation."_

I much more expected Facebook to blame their AI - which, as Zuckerberg
claimed, is going to clean up Facebook just you wait and see - rather than
admit outright they should go to jail for hosting the content that the
reporters found.

Between this and the terrible use of Facebook to exploit Marine Corps females,
it's going to take a lot more gardening in South Dallas before I believe
Zuckerberg is actually interested in ethical behavior more than perceived
image. YMMV.

~~~
dublinben
I wonder if Facebook will report themselves to the police as well, since
they're the only ones in this situation who are distributing these images.

~~~
TallGuyShort
That's not true. The owners of the routers and switches between Facebook and
its users are also distributing child porn.

------
roywiggins
And Facebook reported the BBC reporters to the police for sending Facebook
examples of the content that the reporters found- on Facebook- which Facebook
wouldn't remove.

~~~
Latty
As mentioned earlier, this seems like Facebook were asking for links to the
offending content, got sent the actual content itself, and were then bound by
law to report it. It also seems unlikely there is an overlap between the
things they reported to the police and the things they wouldn't remove.

~~~
megous
It shouldn't matter. If the BBC had the links, they obviously had the content
too. The BBC's computers downloaded the CP and stored it. Therefore Facebook
should report them either way.

The distinction between sending links and sending images + links makes no
sense in this case. Both prove the BBC downloaded CP.

~~~
Veratyr
It's not about downloading the CP, it's about distributing it. Sending a link
may not be distribution, while sending the image itself is.

~~~
megous
Possession is a crime too. And if you have indication of someone possessing
it, and don't report, then what?

------
akerro
Just post a few pictures of breastfeeding and it will be taken down immediately.

~~~
cr0sh
But a murder scene absolutely drenched in blood and body parts would be a-ok.

~~~
akerro
Yes, a video of a woman being decapitated in Mexico was allowed to stay public
for months.

------
ikeboy
The title is misleading. It's not entirely clear from the article what kinds
of images were not removed, but the article does make clear some of the ones
reported were not child abuse images.

It sounds like some of them were regular images of children, but with
suggestive comments attached. Now, that's still against Facebook's TOS, but
not illegal.

Edit: title has been changed to reflect source title better.

~~~
djsumdog
> pages explicitly for men with a sexual interest in children

> images of under-16s in highly sexualised poses, with obscene comments posted
> beside them

> groups with names such as "hot xxxx schoolgirls" containing stolen images of
> real children

> an image that appeared to be a still from a video of child abuse, with a
> request below it to share "child pornography"

That's what I thought. None of the above things are illegal (well, maybe the
last one. IANAL). The 2nd one was common on Reddit's /r/jailbait subs before
they shut them down.

They most likely do violate Facebook's terms of service, but with their
massive userbase, I bet these pop up and get removed faster than their staff
can deal with them.

------
ploggingdev
Assuming all the images reported by the BBC were what they say they were,
Facebook handled the case very poorly, especially the part where they reported
BBC journalists to the police. The least Facebook could have done was to
acknowledge the journalists' role in being proactive about reporting abuse,
have a human review the images and delete whatever was illegal or against
their ToS.

This incident raises the question of how a company like Facebook should deal
with people reporting content. There is no concept of community moderation in
FB where specific users are moderators and can deal with reports of abuse.
Maybe facebook should consider community led moderation efforts until the bots
are ready to take over. Community led moderation efforts (not specifically
dealing with reports of abuse) generally work very well for large communities
where the company building the product does not have the bandwidth to deal
with moderation. Some examples that come to mind: Reddit, Stack Overflow.

~~~
jauer
> have a human review the images and delete whatever was illegal or against
> their ToS.

Except the BBC didn't "report" it (provide a link to it), they "distributed"
it (spread the actual content). Online hosts/services have to report
distribution or they are liable: 18 U.S.C. § 2258A /
[https://www.law.cornell.edu/uscode/text/18/2258A](https://www.law.cornell.edu/uscode/text/18/2258A)

------
kajecounterhack
I work on spam and abuse (but not at Facebook). Also IANAL.

CP is radioactive / considered contraband. You cannot download or resend it to
anyone under any circumstances. It sounds crazy but it breaks the law -- BBC
should have sent links instead / used platform reporting. By downloading to
their hard drive and emailing it, they engage in proliferation.

"But didn't their browser already download it? When they saw it?" Yeah. But
they had to define sharing such images as being illegal, so...I don't know how
laws work.

When you detect CP, AFAIK you must report it (...ironically sending this copy
is OK though I'm not sure what the full mechanism looks like) and immediately
delete it from your servers. You can keep an irreversible hash to do future
detection.

Also fwiw our org has some special contractor who has to identify this stuff.
None of eng has to look at it, thank heavens.

------
oliwarner
A lot of people here seem to be focusing on the fallout, not the problem. Have
you ever reported something to Facebook?

A while ago, a random guy friended my sister and started posting nonsense on
her wall. I checked his profile and immediately found a dozen groups full of
5-10yos in swimming costumes interlaced with softcore and Photoshop'd merges
of the two. Unpleasant, illegal and against the community rules. I reported
it.

In another case a friend liked a video of a chap visibly getting a blowjob at
the top of a mountain. Sure, why not... But that's hardcore content that's
just started auto-playing at my workplace. Reported.

In both cases I got messages back saying the moderators actively disagreed
with my reports and that the content would stay.

The problem seems to be that Facebook hired teenage boys and paedophiles to
moderate.

Not shocked the BBC got the same results.

------
ccrush
They're often under orders not to remove terror or child abuse content so the
government can track the perpetrators.

~~~
TallGuyShort
What kind of orders? Do you have a citation?

~~~
Neliquat
While I seriously doubt this is the case here, there is precedent for such
things. See the FBI led TOR sting for example. However 'darknet'=/=Facebook.

------
artursapek
How hard would it be for Facebook to automatically detect this kind of stuff
by keywords? BBC shouldn't have to be doing this. They have how many thousands
of employees? Why on earth are they complicit in this kind of stuff staying
online?

I wonder if we'll see the #PizzaGate tin foil hat people being vindicated the
same way that Vault 7 has vindicated the "1984" people.

~~~
Latty
How hard? Extremely. It turns out that people work around keywords extremely
quickly, creating new euphemisms, acronyms and misspellings to avoid them.
Eventually (or already) they will use such common terms that you get such a
high false positive rate it makes the rest of the site useless.

People always make out like this is an easy problem; it's not. The best we can
do without false positives is probably hashes - something already done (there
are databases of hashes of known images, and I think I remember a hash
collision that got tons of sites reported for some normal image at some point,
so even that isn't foolproof), and those are trivial to work around by
modifying images in tiny ways.
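A minimal sketch of what exact-hash matching looks like, and why it's so
fragile (the digest set here is purely illustrative; real systems such as
Microsoft's PhotoDNA use perceptual hashes that survive resizing and
re-encoding, which a cryptographic hash does not):

```python
import hashlib

# Hypothetical database of hex digests of known-bad files.
# (This one is just sha256(b"test"), for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Exact-match check against the hash database."""
    return file_digest(data) in KNOWN_HASHES

# Changing even one byte produces a completely different digest,
# which is exactly the trivial workaround described above.
print(is_known(b"test"))   # matches the database entry
print(is_known(b"test!"))  # one extra byte, no match
```

This is why the databases in actual deployment are built on perceptual
hashing rather than cryptographic hashing: the match has to tolerate small
pixel-level edits, at the cost of occasional false positives like the
collision incident mentioned above.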

Human moderation has huge cost and means that you have to give humans eyes on
private content, which can be abused in itself, so you have a catch-22 there.

------
owly
Facebook is obviously not policing the content on their platform very well.
Though it does sound like the FBI could easily run a sting operation and take
down the perpetrators who are sharing. Then force Facebook to be text only. ;)

~~~
freehunter
The crime happening on Facebook should make it _far_ easier for the FBI to
track down the real people behind it.

This is something I don't understand... the NSA is listening to everything we
do, watching everything we do, but when it comes time to find criminals we
still struggle. Why? If they have all of our phone conversations and emails
and social media profiles and the CIA owns Facebook and the FBI doesn't need
warrants and all these doomsday scenarios about how the US is a police state,
why do crimes even happen? And when they do, why does it take good old
fashioned police work to solve them? And why do some still go unsolved?

If Facebook is tracking everything you do across the entire web, if they know
who you are even if you don't have a profile, if they're invading your privacy
_that much_ , and if they're hosting illegal content, why can't they find the
real names of these people? Why can't the FBI just issue a warrant for the
data and Facebook hands over the people's names and home address?

Either law enforcement doesn't want to enforce the law, or the agencies don't
actually have the capabilities we've been told they do. If they had the
capability, you'd think they would actually use it. But when someone is
threatening to shoot up a school on Twitter, it's not NSA wiretaps that catch
him, it's users reporting him to Twitter and Twitter reporting it to the
police. Where's the breakdown here? What happened to ECHELON?

