
Snapchat Has a Child-Porn Problem - rayuela
https://www.bloomberg.com/news/articles/2017-11-08/snapchat-has-a-child-porn-problem
======
falcolas
Photographic chat application aimed at teens has teens putting up nude
photographs. Anonymous person claiming to be not a dog turns out to be a dog.

Say it isn't so!

------
rock_hard
I guess that depends on the definition of child porn!

I personally don’t think teenagers sending nudes to each other qualifies as
such.

~~~
chatmasta
A user who “sends nudes” is actually sending them to Snapchat, the company,
who then sends them to the user’s friends. Snapchat also stores the nudes on
its servers, and likely even has code that processes (reads!) those images for
analytics purposes.

It’s not like Snapchat’s servers are some black box, either. Any engineer at
the company with the proper access rights could theoretically view any user’s
uploads.

So I’m not sure “they’re sending them to each other” can really be a defense
here, unless the media is truly e2e encrypted (it’s not).

~~~
late2part
By your definition, the postal service is the recipient of letters containing
child porn.

~~~
someemptyspace
No, because the postal service doesn't have access to read the contents of
letters and parcels in transit. Not only do they not have access, but it is
illegal for them to access the contents without a warrant.

------
Derbasti
Stop saying child porn. You are talking about documented abuse of children.
Not porn.

~~~
jack1243star
On one hand I agree with you, as pedophiles are too often conflated with child
molesters. On the other hand, there are people who do consider such images
porn. Pornographic content is simply impossible to classify cleanly.

------
jrs95
Could easily substitute Snapchat with Facebook, Instagram, any cell provider,
etc. I don't see how this really ought to be news.

~~~
pwinnski
On the contrary, the exploitation of children ought to be news every day that
it's a problem.

Snapchat is the focus here because, by its design and marketing, it's better
suited to exploitation than the others you name. But yes, there are clearly
issues outside of Snapchat as well.

~~~
tcd
How do you propose fixing the problem? Not giving teens/kids smartphones is a
sensible start, but that's not realistic, and they haven't yet developed the
awareness (or the inner voice that screams "don't do this"). They're not
technically aware of how the internet works (caches, archives, downloaded
media) or that images can stick around forever.

People think Snapchat images are 'deleted', but that's not actually true, and
the images can be intercepted or recorded anyway.

~~~
spike021
>Not giving teens/kids smartphones is a sensible start, but that's not
realistic

I don't know, I got by just fine with a dumbphone through high school
(beginning of the iPhone era). I don't think most teens need smartphones,
honestly.

But we've crossed the point where that matters.

~~~
jrs95
Even dumbphones can do MMS, though.

~~~
ashark
No MMS without a data plan. Unless something's changed.

~~~
spike021
Yep, this. At least when I was that age.

------
totalZero
Wait, so is the point that nobody should have security because a few people
commit crimes under its cover?

------
top256
Even though this story is horrible, it reminds me of a silicon valley episode

[https://www.theverge.com/2017/5/1/15504692/hbo-silicon-valle...](https://www.theverge.com/2017/5/1/15504692/hbo-silicon-valley-season-4-episode-2-terms-of-service-recap)

Are they prescient, or do things never change?

~~~
hosh
I don't think this is prescience, though I often find that authors and artists
speak up for things repressed in our collective subconscious. This kind of
thing is becoming increasingly difficult to bury, and it's the kind of thing
our civilization has never adequately addressed.

I think it can be changed, but not if we keep trying to bury it.

------
throwaway0255
You guys can defend Snap all you want, but the fact is there is no single
company on the planet or at any point in human history that has benefitted
from child pornography more than Snap has.

They became popular as the “exploding image” app. The primary consumer use
case for this is transmitting nude photos. Their valuation is due in large
part to the youth of their demographic (younger users than just about any
social app).

I don’t think it’s a stretch to say they probably owe most of their success to
solving the problem of the consequences associated with sending nudes online
for people in junior high and high school (or at least, these people _think_
it’s solving that problem for them).

Comparing Snap to email or texts on this issue is a false equivalency. Snap is
specifically intended for this, and owes a ton of their valuation to children
using it and spreading it for this purpose.

------
air7
This page loaded with an auto-playing, unstoppable, unmuteable video...
That's a new low for me.

~~~
jerrycruncher
FWIW you can disable the autoplay of all media, including animated gifs, in FF
by going to about:config and setting media.autoplay.enabled to false.

------
stevew20
Burn the witches! Down with Flickr and Pinterest while you're at it!

------
drraid0
Snap should hire Craig Newmark. He was able to get away with pimping
prostitutes on his site for a decade+.

------
trhway
An AI filter trained to recognize such images could theoretically help here.
It's a kind of catch-22, though: training such an AI isn't practically
possible (except by specially assigned/cleared FBI agents on the FBI database
of child porn). Even if trained, deploying such a neural net would also be
problematic, as the kernels in the deep layers would bear some resemblance to
parts of the images it was trained to identify.

There does seem to be a partial solution, though, and I wonder why Snapchat
still doesn't appear to have it: train two nets, one to recognize porn/erotica
using completely legal adult images, and another to distinguish children and
young adults from adults using completely innocent, legal images. While this
obviously wouldn't catch all illegal images, a positive signal from both nets
could probably catch a significant share of them.
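The two-net idea above reduces to a simple decision rule at inference time. A
minimal sketch, assuming each net emits a confidence score in [0, 1]; the
function name, score names, and thresholds are all hypothetical illustrations,
not any company's actual pipeline:

```python
def flag_for_review(nsfw_score: float, minor_score: float,
                    nsfw_threshold: float = 0.9,
                    minor_threshold: float = 0.9) -> bool:
    """Flag an image for human review only when BOTH classifiers fire:
    one trained to detect porn/erotica on legal adult images, and one
    trained to estimate apparent age group on innocent images.

    The key property: neither net ever needs to be trained on, or have
    weights derived from, illegal material.
    """
    return nsfw_score >= nsfw_threshold and minor_score >= minor_threshold


# Only the combination of both positive signals triggers a flag:
print(flag_for_review(0.97, 0.95))  # both nets fire -> True
print(flag_for_review(0.97, 0.10))  # adult content, adult subject -> False
print(flag_for_review(0.05, 0.95))  # innocent photo of a minor -> False
```

In practice the thresholds would be tuned to trade recall against the
false-positive load on human reviewers, but the AND-combination is what lets
each net be trained on legal data alone.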

