
Ask HN: How do I keep child porn out of my site? - VexedSiteOwner
(Pardon this disturbing subject interfering with your Friday night rest, and my (very necessary) throwaway account.)

A year or two ago I started an image sharing site that's been modestly successful in terms of traffic (a blessing). No money or fame, but it's nice to see movement.

I try to filter user uploads to at least classify the sexual stuff (80% of it) as NSFW and feature good stuff on the homepage. This is excruciatingly time-consuming with 6,000 galleries posted per day, but I suffer through it as best I can.

Sadly, I've noticed a huge amount of extremely taboo photos on the site: from rape and BDSM, which I can just about tolerate, all the way to extreme child porn. The latter is extremely disturbing.

Amazingly, these people post this openly.

I never see the press talking about the NSFW side of YouTube, Tumblr, Reddit, Imgur, and others. How do those sites deal with this problem? What kind of content filtering systems do they use to keep the visible parts of the site clean? How many interns are flagging photos all day long? Is it wise to allow these pages to be indexed? What's my legal burden under safe harbor?

And, more importantly: how does the organic traffic in the NSFW sections play into the strategy of these huge user-generated content sites?

NB: I've attempted to build user profiles and a kind of self-moderation system, akin to how Reddit flagging works, but my users seem to be mostly interested in "one thing," and no community-focused members have emerged so far. I still have hope, but I need a solution I can use now.
======
VieElm
If you're in the United States you should call the National Center for Missing
& Exploited Children[1]. They already work with internet service providers to
help identify unencrypted images depicting abuse transported over their
network. They do this, I think, at an automated level. They should have the
information you need. You should probably also call the FBI.

[http://www.missingkids.com/Contact](http://www.missingkids.com/Contact)

~~~
unclebucknasty
But, should he/she contact legal counsel prior to contacting the FBI or anyone
else? Personally, I think I would want to understand my potential culpability
and other factors here.

~~~
ender7
You should definitely consult legal counsel before and during talking to the
authorities (which you should also do). The laws surrounding CP in particular
are outdated and do not fit well into the digital world. For example, simply
looking at CP can be a crime, which can make it difficult to report unless you
know the right words to say. Always consult counsel in these cases.

~~~
unclebucknasty
Exactly. Beyond my own liability, the other question I would want answered is
whether I could potentially be compelled to cooperate in some long-term
investigation. If so, then what could that mean in terms of time and expense,
and is it worth it?

~~~
DasIch
You are not seriously asking whether it's worth it to help law enforcement
stop child abuse?

~~~
praptak
Talking to cops is a bad idea. I'd only do that if I had to and even then I'd
minimize the exposure: [https://medium.com/human-parts/good-samaritan-
backfire-9f53e...](https://medium.com/human-parts/good-samaritan-
backfire-9f53ef6a1c10)

Also, the abuse already happened, you are only stopping the dumber CP
collectors from sharing images of it.

~~~
DasIch
Sure, that abuse has already happened, but it will probably continue. You want
to follow any trace you can find to suppliers. Shutting down demand might also
help in eliminating any economic incentives that might exist on the supply
side.

------
Elepsis
Microsoft made an automated system (PhotoDNA) for detecting known child
pornography images available to the public a few years ago and it's probably a
good starting point: [http://www.microsoft.com/en-
us/PhotoDNA/](http://www.microsoft.com/en-us/PhotoDNA/)

Hopefully this can help you.

(Disclosure: I work at Microsoft but not on PhotoDNA.)

~~~
kyledrake
PhotoDNA is the gold standard for this. I tried to get access to this via the
NCMEC to use with Neocities, but the process was, frankly, very convoluted. I
signed at least 10 forms and still didn't end up getting what I needed.

I'm happy that Microsoft is providing this as a free service. It's going to be
a lot less painful for me to use it than to figure out how to run my own (or
in this case, figure out how to even get it).

~~~
kyledrake
Update: I just tried to get it to work and surprise! It doesn't work.

Somebody please just give me access to the PhotoDNA code, the hashes, and a
little funding. I'll make an API anybody can use for this. It's ridiculous how
hard it is to do this. It's still easier for people to get spam IP lists than
to see if CP is being uploaded to their servers. You can't just have it
available for Facebook and Google or it doesn't work, you need to make it
available to everybody in an easy, simple way.

Seriously, if you are connected with this at all or want to fund this work
please email me, I am more than happy to work on improving this:
kyle@neocities.org.

~~~
contingencies
On the other hand, if everyone uses the same non-transparent list of magic
hashes to ban hosting images then censorship potentially becomes a concern.

~~~
Reef
If non-CP images start being blocked by this system, some gallery author is
going to notice it pretty quickly and report it to the website owner. That
kind of censorship would quickly destroy trust in the CP filter, and I doubt
the people at Microsoft failed to predict this.

------
eli
Are you based in the US? There's a good chance you are _required by law_ to
report images of apparent child pornography. You should talk to a lawyer.

[https://www.law.cornell.edu/uscode/text/18/2258A](https://www.law.cornell.edu/uscode/text/18/2258A)

[http://www.ncsl.org/research/telecommunications-and-
informat...](http://www.ncsl.org/research/telecommunications-and-information-
technology/child-pornography-reporting-requirements.aspx)

------
eli
You may be interested in this article:

The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed
[http://www.wired.com/2014/10/content-
moderation/](http://www.wired.com/2014/10/content-moderation/)

~~~
klunger
I came here to share this article.

Anyway, I think your best bet is to outsource this kind of work to the sort of
company described in the article. It seems to be a regrettable necessity for
any sizable user-generated content site.

Also, of course, please try and get in touch with the relevant authorities
mentioned in other comments and assist their efforts in tracking users who try
distributing that kind of... content.

------
brudgers
By default, any site that allows users to share content will devolve toward an
attractive nuisance [0]. Like any security issue, passive measures are a
Maginot Line awaiting blitzkrieg. Even all the resources of a Google or
Facebook aren't enough to automate all these things; they depend on
communities to report issues [e.g. webmasters for Google]. And that's the only
defense in depth: community.

"Everybody who signs up" isn't a community. There has to be some higher order
interest...and what you're finding is that unfortunately the higher order
interest of the community for your site is child porn.

There's no fixing DNS. If child porn is not what you want, your site is
broken. Shut it down. The sort of users you want either don't care enough to
keep out the bad stuff or are overwhelmed by its volume, just as you are.
They are or will be moving on. You have my sympathies.

Yeah it sucks but you have learned some things:

    
    
      1. Community is the hard part.
      2. Technology is necessary but not sufficient.
      3. You can build something that scales to the point where
         it becomes useful to a community.
    

Consider this version 0.1. You've gotten feedback and that says that the
product (not the code) has failed by _your_ definition of "fail" because it
has not attracted the market segment you want. You have a platform from which
to relaunch.

Good luck.

[0]
[https://en.wikipedia.org/wiki/Attractive_nuisance_doctrine](https://en.wikipedia.org/wiki/Attractive_nuisance_doctrine)

------
orionblastar
Until there is a Machine Learning algorithm that can detect CP, you'll have to
have human beings flag it and then other human beings view it and remove it.

Someone brought it to my attention that Bing's cache is full of CP; after the
offending websites are taken down, Bing keeps the images for a long time. The
Rapidshare-style sites are also full of it, and uploaders password-protect RAR
files so admins cannot peek inside. It is a major problem with no solution
yet. People run WordPress blogs, and spambots leave comments that link to CP
sites.

This has become a hot-topic issue because a manager of Jared from Subway's
foundation was found with CP, and when they raided Jared's computers they
found more evidence.

My ethics and morals won't allow me to look at porn, but it is a big industry.
There are all kinds of porn out there. The CP is the worst of it, and a lot of
children are trafficked as sex slaves for it. They grow up with a criminal
record and sex offender record, and by the time they expunge the record they
are in their 40s and can't find work. I was contacted by a woman who was in
that situation on Github during the Opal CoC debates. She is trying to get out
of her situation by programming and cannot find work because of it.

This CP stuff ruins the lives of the children who suffer abuses for it. Once
they grow up they have a hard time in life trying to make ends meet. Some have
serious psychological problems that are hard to treat and deal with.

I remember that in some cases a website has been found responsible for the
content users post on it. Laws in your nation may vary on that. If you find
illegal content you should remove it, lest you be found liable for it. Make
sure to report the poster's IP address to the government or a non-government
agency that handles such reports.

~~~
VexedSiteOwner
Are there any good ML algorithms for detecting porn at all? I tried to
implement the standard "pink detector" with mixed results.
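
For reference, the classic "pink detector" is just a per-pixel skin-tone heuristic plus a threshold. A minimal sketch (the RGB rule and the 0.35 cutoff are illustrative assumptions, and this approach famously misfires on sand, wood, and perfectly SFW portraits, which explains the mixed results):

```python
def is_skin(r, g, b):
    """Classic RGB skin-tone heuristic; crude by design."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_ratio(pixels):
    """Fraction of (r, g, b) pixels matching the skin heuristic."""
    pixels = list(pixels)
    if not pixels:
        return 0.0
    return sum(is_skin(*p) for p in pixels) / len(pixels)

def looks_nsfw(pixels, threshold=0.35):
    """Route to human review if a large share of pixels are skin-toned.
    The threshold is an arbitrary assumption to be tuned on real data."""
    return skin_ratio(pixels) >= threshold
```

At best this is a pre-filter to prioritise the human review queue, not a classifier you could act on automatically.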

~~~
jliptzin
No. I looked into this a lot for a dating app I ran, and no algorithm came
close to human moderation, even for images you'd consider obviously
pornographic. Human moderation, though, can get expensive.

A funny idea I had was to reverse the whole system - feed UGC content of a
site that's supposed to be SFW into a porn site which is definitely NSFW, one
that has lots of thumbnails. The ones that don't get any clicks to enlarge
probably aren't porn and can pass the test :)

~~~
chrisbennet
That's brilliant!

~~~
jliptzin
Until you go searching for porn and every other picture is corn on the cob or
doorknobs.

------
mirimir
Your safest bet is running a system where you have no way of knowing what
users upload. Depending on jurisdiction, reviewing and moderating content may
increase your civil and/or criminal liability. There's typically a "safe
harbor" for service providers. You just need to respond to LEA and DMCA
takedown requests.

Edit: Other advantages: 1) you never risk viewing stuff that you can't unsee;
and 2) you outsource content review to concerned users and other third
parties.

~~~
frigg
Doesn't there need to be a "report abuse" button on each image? Users could
report it, and only then would he run something like Microsoft's PhotoDNA
(which a user mentioned above).

~~~
mirimir
Only a qualified lawyer can say.

But then what? Look at positives, and decide? Or just forward all positives to
LEA, let them decide, and nuke what they indicate? LEA probably wouldn't like
that.

~~~
frigg
A user above mentioned Microsoft's PhotoDNA. He could run the flagged images
through that first and, if it returns nothing, maybe look at them.

Or don't look at them until a certain number of users have flagged them, but
still run them through PhotoDNA. Now I am curious how Imgur handles this problem.
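
A sketch of the threshold gate being described here: count distinct reporters per image, and only escalate to hash matching or human review once enough have flagged it, so a single troll can't force a review (the threshold of 3 and all names are illustrative assumptions):

```python
from collections import defaultdict

class ReportQueue:
    """Escalate an image only after several distinct users flag it."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.reports = defaultdict(set)  # image_id -> set of reporter ids

    def report(self, image_id, reporter_id):
        """Record one report; return True once the image should be escalated."""
        self.reports[image_id].add(reporter_id)
        return len(self.reports[image_id]) >= self.threshold
```

Using a set of reporter IDs means repeat reports from the same account don't move an image closer to escalation.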

~~~
mirimir
Or use all of those CP databases, and just report images flagged by some
subset of them.

I am curious how this gets handled. Having been goatse'd a few times with CP,
I cannot imagine reviewing that crap on a regular basis.

------
tacostakohashi
I used to work for an also-ran social network (20m users), and this was a big
problem for them too, particularly when they found that they were a popular
option for sharing CP. When I say a big problem, it's really an existential
threat for any kind of user generated content sharing site.

Yes, you need a way of finding and flagging this stuff. Algorithms help, but
people always need to be involved, and that's problematic. It can be hard to
find people that want to be exposed to this material as their full-time job,
and it's a liability headache. Even if _some_ employees are ok with being
exposed to it as part of their jobs, other employees might have a legitimate
expectation of not having to be exposed at their workpace, and it's difficult
to contain.

Yes, you will need to develop a relationship with law enforcement. They have a
number of programs for submitting evidence, they're actually quite easy-to-
use, and they are cooperative if you follow their rules. Even so, it's time
consuming, and if you don't maintain a good relationship and comply fully,
then you can become a target for enforcement.

You say you've become moderately successful in terms of traffic, but there's a
big proportion of dubious content. Frankly, this means that certain people
have noticed that your site is not as good at identifying, flagging, and
reporting this content, so they're gravitating to you, having been kicked out
of facebook, etc. That's fine in the short term, in the long term it's
unsustainable from a business and legal perspective. Either you'll need to
devote more resources to fighting this (instead of development, marketing,
more interesting things), and find a way to attract more legitimate users, or
you will become the next attractive target for legal issues.

This is not a simple problem that can be solved with mechanical turk, an
algorithm, etc. It's a never-ending game of cat and mouse, walls and ladders,
and a fundamental problem to be dealt with on any site that allows sharing.
It's not just sexual stuff, there's also copyright - the music and movie
industries are pretty keen about finding targets too.

It might be feasible to compete with facebook on product, or popularity with
niche audiences, but competing with them on their ability to keep bad content
off their site so that it's palatable for a wide audience is a lot harder.
That's their core business, and they employ a _lot_ of humans to make it work.

------
subb
This is just an idea, since I've never built such a filter, but you could
automate a large part of the NSFW filtering. A quick Google search led to this
paper: [http://cs229.stanford.edu/proj2005/HabisKrsmanovic-
ExplicitI...](http://cs229.stanford.edu/proj2005/HabisKrsmanovic-
ExplicitImageFilter.pdf) Once you have that in place, I'd guess it's better to
make it aggressive and accept false positives being marked NSFW.

Google's "safe image search" has the additional advantage of analysing the
content of the page where the image appears. You might be able to do the same,
up to a point, by checking the HTTP Referer header to learn where requests are
coming from, and scanning the referring page for keywords. This might give you
a better idea of the context in which the image is used. Note that this could
be tricky, since you probably don't want traffic going out from your server to
some child porn site.

That said, those are just some ideas. YouTube has a good community that flags
videos, but also an army of reviewers who look at the flagged content.

[http://mobile.nytimes.com/2010/07/19/technology/19screen.htm...](http://mobile.nytimes.com/2010/07/19/technology/19screen.html)

Another way to look at it would be to try to manually select some images as
"front page worthy", instead of trying to filter the bad stuff.

------
kanamekun
Other posters are correct; you are obligated under US law to report child porn
to the NCMEC CyberTipline.

The fine for not complying started off fairly low, but has been increased in
subsequent legislation. In my experience though, NCMEC is mostly just
interested in getting regular reports uploaded to their system. I met with
them once, and they have a rough sense for how many reports should be sent
over for a site of a certain activity/traffic level, and if the number of
reports is zero... then they know you're not in compliance.

Their reporting interface is beyond awful though. Maybe they've improved it in
recent years; when I last saw it, everything had to be uploaded and reported
manually.

~~~
mirimir
Would it be enough to report content flagged by other users as CP? Is there a
requirement to review content before submission? That's not something I'd ever
want to do. I don't think that I'd want to pay someone else to do it either.

~~~
kanamekun
As I understand it, there's no requirement to review content before
submission. My understanding is that once someone has flagged or reported it
to you, you're mandated to report it.

------
BorisMelnik
As a father this horrifies me. If this were my site, hobby or not I would
spend a great deal of time implementing a system to report to the authorities.
Just imagine if you did manage to help/save just one kid how good that would
feel.

The PhotoDNA API looks absolutely brilliant. That is one reason why I love
Bill Gates he always (or a lot of the time) gets involved with projects that
truly help people.

Don't think of it as a problem, it is an opportunity to help a child or a
parent that may not know a relative, teacher, or stranger is hurting their
child.

Many times these crimes are committed by loved ones and the children are not
abducted, they are lured / tricked by people near and dear to them.

~~~
Houshalter
In all likelihood the photos are just stuff people copied from around the
internet and not from the original abusers.

~~~
linkregister
What are you attempting to say, that the images shouldn't be filtered because
they aren't from the original source?

~~~
Joeri
I think the implication was that the uploaders are not the perpetrators, so
reporting them is unlikely to reduce abuse. I'm not sure I buy that line of
thinking; I suspect plenty of perpetrators upload to public forums.

------
AnotherWebmster
Speaking from experience running a similar site:

1\. Monitor only the most-viewed pages, since 99% of images will never be seen
again by anyone, neither the uploader nor law enforcement. A page needs some
traffic to be discovered. Just make a "top 200 today" page and look at it from
time to time.

2\. A "report nsfw" button does not work. The pedophiles do not report, and
the rest never land on the pedo-page in the first place.

3\. Almost all the pedo-uploaders use Tor. Check how many non-pedophiles use
Tor and consider blocking exit-node IPs (or making Tor-uploaded images hidden
until reviewed).

4\. Own your IP address, or set up a relationship with the owner of your
site's IP. Law enforcement emails them (or you, with them CC'd). If "whois
$YOURIP" shows not your email but, for example, abuse@digitalocean.com or
abuse@azure.com, then your server has a good chance of being disconnected
hours before you find out why.

5\. About the big players: at least Twitter has a lot of pedo-content (my
service is screenshot-oriented and I have seen many screenshots of Twitter
pages with CP). "How does the organic traffic in the nsfw sections play into
the strategy of these huge user-generated content sites" is a very good
question; I would like to know as well.

6\. About the advice in the comments to check images in a porn context: it
does not work. SFW images are very clickable when surrounded by NSFW ones
(think of a celebrity portrait in that context).

PS. My advice may look like half-measures, but it provides the same level of
quality as machine learning or Mechanical Turk solutions (which are not
complete solutions either) at a lower price.

------
daenz
If there isn't already, maybe there should be some kind of public perceptual
hash database ([http://www.phash.org/](http://www.phash.org/)) for this kind
of stuff.

~~~
rendx
Such databases exist, but I don't think there's any that is publicly
available.
[https://wiki.openrightsgroup.org/wiki/Indecent_image_identif...](https://wiki.openrightsgroup.org/wiki/Indecent_image_identification)
lists some, there's more.

------
jagermo
No need to apologize, this is not only an interesting topic and problem, but
also a very good discussion.

Thanks for bringing this up. Would it be OK if I used your question as a
starting point to look at what this is like for non-US websites?

------
nness
There are already some interesting solutions posted here. If you wanted to
tackle the issue with a stop-gap in the meantime, you could add an image
hashing step to the upload process to identify images that have already been
flagged as NSFW or worse.

dHash is fairly simple to implement, and you might even be able to offload the
hash checking to the database level. Comparing dHashes is just a matter of
XOR'ing the two hashes and counting the differing bits (the Hamming distance).
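
A minimal sketch of dHash and the comparison step. Note the distance comes from XOR'ing the hashes and counting set bits; the 9x8 grayscale grid is assumed to come from a prior downscale step (e.g. with Pillow), which is omitted here:

```python
def dhash(grid):
    """dHash of a 9x8 grayscale grid (8 rows of 9 values): each bit
    records whether a pixel is brighter than its right neighbour,
    yielding a 64-bit hash."""
    bits = 0
    for row in grid:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(h1, h2):
    """Number of differing bits: XOR the hashes, then popcount."""
    return bin(h1 ^ h2).count("1")
```

In practice you'd treat two images as near-duplicates when the Hamming distance falls below a small threshold (often around 10 of the 64 bits).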

Obviously as the sample size increases so will the computation time. You could
help the process by prioritising checks against new accounts, certain IP
ranges (if you're seeing more or less content of a certain type from different
countries or VPN providers) or if an account has a history of uploads in the
past.

It's a horrible problem to have. Best of luck!

------
michaelmior
Not sure how useful this would be, but the first thing that came to mind is
CrowdFlower[0].

[0] [https://www.crowdflower.com/type-content-
moderation](https://www.crowdflower.com/type-content-moderation)

------
hayksaakian
At least with Reddit, there's community moderation (read: free labor) which
enforces the rules of each section.

is there any incentive to participate in your community?

With moderators you feed the "power tripper".

With karma you feed people obsessed with points.

This is a bit complicated, but: what if you had some sort of captcha that
required users to classify images as nsfw/sfw/illegal?

~~~
Houshalter
Reddit doesn't allow users to upload images. Just links. And they ban
problematic domains.

~~~
toomuchtodo
Linking to certain NSFW domains in a comment will prohibit the comment from
being posted by Reddit's internal logic.

------
chmike
Another possibility is to add a service to label/tag images.

See if you could turn it into a game where people gain karma points by
properly labelling/tagging images. Make it people's choice to participate
instead of forcing them into it with a captcha.

Your image stack would gain significant value by being labelled and searchable
by label.

See
[https://www.cs.cmu.edu/~biglou/ESP.pdf](https://www.cs.cmu.edu/~biglou/ESP.pdf)

------
aurizon
If you give people a way to send passworded links to cash subscribers, then
none of a seller's subscribers will rat him out. If you make all images open
to view, you can appeal to people to flag items for removal, or just
auto-remove them with, say, 2 or 3 flags, and trust your nicer clients to
police the site. If your clients all sign up with throwaways and then load a
huge block of images, each with its own password, they can sit far away and
sell passwords all day without ever emerging to be caught; if they want to add
more images, a new throwaway account every day will do. Full accountability is
the answer, so all images can be tracked back to a real address and name.
Sadly, only the good people comply with that, but it might be a way to thin
the crowd. A secret, untrackable photo site will also soon attract the police
as they hunt for child porn sellers, so sooner or later they will come
knocking on your door.

One way is to make contact with the police and get permission to list the
names of the police agencies that are allowed to inspect the site via a
backdoor, etc. Of course this might enrage some users, so a middle ground
might be to quietly approach the police for advice.

------
pessimizer
Paid moderation. Putting up an open image sharing site with no security is
akin to opening a nightclub with nobody checking IDs at the door and no
security.

I can't tell you what to do to meet the minimum legal standard of covering
your ass, but that's going to vary by jurisdiction and current whims over
time. I can tell you, though, that by the time somebody has stumbled over a
terrible image and reported it, 1) they will be horrified by your site and
never use it again, and 2) the poster will already have shared the URL with
everyone they wanted to, so the image will have been distributed as far as it
was intended to be. If the number of terrible galleries is increasing, you're
probably becoming well known within tiny circles as a convenient place to
share the stuff.

------
INTPenis
People are recommending very exotic solutions of image detection. But
unfortunately that's an ongoing arms race between two groups of very smart
people. One driven by the moral justification to stop abuse against children
and the other driven by their unfathomable lusts.

So I would rather just K.I.S.S. and put your time into maintaining a reporting
system where any anonymous user can report violations.

Then put emphasis into making this system as easy as possible for you or any
potential moderators. The ideal would be to have an app that would notify you
of any new reports and allow you to just swipe from image to image pressing an
icon if you want to delete, ban or whatever.

Edit: Such a system could eventually form the basis of a bayesian filtering
system.
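
One minimal version of that idea: score uploaders from past moderation outcomes with Laplace smoothing, and send uploads from risky accounts straight to the review queue. Everything here (the class name, the +1/+2 pseudo-counts, the 0.4 threshold) is an illustrative assumption, not a real filter:

```python
class UploaderScore:
    """Laplace-smoothed estimate of how likely an uploader's next
    image is bad, based on past moderation decisions."""

    def __init__(self):
        self.total = {}  # uploader -> images moderated
        self.bad = {}    # uploader -> images confirmed bad

    def record(self, uploader, was_bad):
        """Feed back one moderation decision."""
        self.total[uploader] = self.total.get(uploader, 0) + 1
        if was_bad:
            self.bad[uploader] = self.bad.get(uploader, 0) + 1

    def p_bad(self, uploader):
        """(bad + 1) / (total + 2): unseen uploaders score 0.5,
        i.e. a flat prior; tune the pseudo-counts to taste."""
        return (self.bad.get(uploader, 0) + 1) / (self.total.get(uploader, 0) + 2)

    def needs_review(self, uploader, threshold=0.4):
        """Route uploads from risky or unknown accounts to human review."""
        return self.p_bad(uploader) >= threshold
```

With a flat prior, brand-new accounts start in the review queue and earn their way out as clean uploads accumulate, which matches the moderation workflow described above.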

~~~
mirimir
> ... swipe from image to image ...

I would _not_ want that job!

------
rendx
A "flag content" link should be good enough. It sounds like you're US-based;
you're not required to manually check all user-contributed content. Set up a
DMCA section and link to it at the bottom. If law enforcement contacts you,
make sure you react fast, and maybe you want to offer them an automated
takedown mechanism if this really happens that often.

For the future: Asking legal questions without stating your jurisdiction is...
not helpful. :)

Yes, big sites employ a lot of people to clean content. I remember reading an
article about poor people in $third_world_country that do this all day long.

~~~
VexedSiteOwner
If you can locate the article, I'd love to give it a read.

I'd rather keep my money inside US borders, but it ain't cheap, and staring at
the sickening contents of the internet's collective wet dream ain't exactly a
high-profile career path for young folks.

~~~
rendx
Here's one: [https://www.wired.com/2014/10/content-
moderation/](https://www.wired.com/2014/10/content-moderation/)

"Hemanshu Nigam, the former chief security officer of MySpace who now runs
online safety consultancy SSP Blue, estimates that the number of content
moderators scrubbing the world’s social media sites, mobile apps, and cloud
storage services runs to “well over 100,000”—that is, about twice the total
head count of Google and nearly 14 times that of Facebook."

------
CHaro
Why not report their IP addresses to the police? Hopefully people will learn
that by posting CP on your site they will get flagged.

~~~
VexedSiteOwner
Supposing I was ok with directly alerting the authorities, which for some
reason seems to be perhaps a bridge too far, how would I even go about that?
Where's the "report pedophiles" REST API?

~~~
rhino369
Why is it a bridge too far? I'd argue it's a civic duty. I'm sure the FBI has
a tip line / email.

Also, post on your site that you report child porn. That should scare people
off.

~~~
tbrownaw
Because making _pictures_ \-- however contrary to present sensibilities --
illegal is a bit too close to thoughtcrime?

~~~
openasocket
Producers and consumers of CP are not prosecuted for their thoughts, they are
prosecuted for their _actions_. The subjects of these images suffer real and
lasting psychological and physical damage.

~~~
vonklaus
Once again: these images don't contain metadata stating the person's age, ages
of consent differ around the world, and someone can accidentally click
something they didn't mean to, or reasonably expected was not CP, since almost
the entire internet actively sanitizes these images. I wouldn't go as far as
to call it "thought crime," but it can be non-obvious what is illegal. I
assume the GP meant this, and not that the images are acceptable because the
abuse already happened and the fantasy is only in one's head, as that is not a
reasonable excuse.

~~~
meowface
That argument doesn't apply here though. If someone is manually uploading an
image of a child being raped to an image hosting website, there's no accident
about it. Reporting someone who uploaded that image (i.e. someone who is
distributing child pornography) is the only ethical and sensible option.

Now, if someone is uploading a nude picture that their 16-year-old girlfriend
took of herself, that's still illegal but more of an ethical gray area, in
which case I could understand hesitation about reporting to the authorities.
But based on OP's description, it's the real bad stuff, so reporting seems to
be a no-brainer.

~~~
vonklaus
I am not crusading to uphold CP on the internet. It is interesting that you
probably think my above comment was distasteful. But then you say:

> if someone is uploading an image of their 16-year-old girlfriend

That isn't a grey area; that is wrong unless you have consent[0]. My point is
that it is hard to tell what is and isn't child porn (hence this thread in
general) and that an 18-year-old could look younger. What I think is possible
is that someone could upload photos of THEMSELVES, either in the US or in a
country where that is legal, and it is non-obvious what their age is.

Child Porn is fucking disgusting, I am not here to defend it. I am just
pointing out how hard it is to identify it and that some people upload their
own photos online. If you had a database that magically let you know if an
image was illegal then of course I would say take it down.

edit: [0]I meant this generally, not that it would be OK to do this if she was
16, but if your gf was of legal age and you posted pictures of her or the two
of you together.

~~~
meowface
Let's say the person did have consent. It's still illegal, but is it still
wrong?

Let's say they didn't have consent. It's wrong in the sense of it being
"revenge porn", but should it really be considered as child pornography in the
way it's classically viewed?

It's a gray area in that regard.

If someone is uploading things to his website which he can clearly identify as
child porn without further analysis (if the child depicted is under 13, it's
probably pretty easy to tell, unfortunately), I don't think he should hesitate
to report it to law enforcement.

And if law enforcement investigates and learns it's not actually child porn...
then they almost definitely won't charge anyone so long as no other crimes
were committed. There are no downsides to reporting it and many downsides to
keeping it to yourself.

------
verelo
Like most forms of human behaviour, if you add enough resistance, people
generally start looking for alternatives. With that in mind, I would suggest
you consider a service similar to Mechanical Turk to perform random checks on
images, and ban any accounts that violate your terms.

Using a service like this (perhaps not MTurk exactly; I'm not sure how they'd
feel about this content type being reviewed on their platform), you could work
out how long it takes before you can trust an account, to reduce the expense
of filtering.

Sadly, this is not a free option (probably 1c per image plus MTurk
commissions), but I imagine you can eventually optimize the process to the
point where either you spend very little reviewing images, or you hit a
critical mass of users and the site can moderate itself.

------
RexRollman
Honestly, I am not sure what you can do about this, other than request users
report it when noticed.

------
pm24601
After you follow the suggestions about using PhotoDNA, I would also announce
to uploaders that you will be processing their images through the service,
and that you log their IP address, etc.

That might help with just the volume of bad images.

But I do have some suggestions, and a question:

1) Considering the headache, maybe shut the site down for maintenance while
implementing the service.

2) Process all images currently on the site.

3) Consider that some images will inevitably be CP but not in the database.

... therefore, at what point do you decide to just shut down the service?

It would suck to have to spend a lot on a lawyer to stay out of jail.

It's nice to pretend that the FBI would understand that you are an innocent
victim, but what happens when those images end up on your machine (browser
cache) and a federal prosecutor sees things differently?

------
true_religion
I hope I'm not reiterating anything people have said... but generally, sites
with a community focus have volunteer moderators who monitor for this stuff
and help ban users who post it.

If you're doing anonymous posting, and the images are meant for sharing on
other sites (a la Imgur), then you'll have a harder time of it. You can ban the
images themselves and save the hash of each banned image to compare against
future uploads, automatically rejecting matches.
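
The hash-and-reject idea can be sketched in a few lines. This is a minimal illustration, not production code: an exact SHA-256 match only catches byte-for-byte re-uploads, and any re-encode or resize defeats it (which is why perceptual hashes like PhotoDNA exist):

```python
import hashlib

# Store the SHA-256 of every banned image; reject any future upload
# whose bytes hash to the same value.
banned_hashes = set()

def image_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def ban_image(data: bytes) -> None:
    banned_hashes.add(image_hash(data))

def is_allowed(data: bytes) -> bool:
    # Only catches exact duplicates of previously banned files.
    return image_hash(data) not in banned_hashes
```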

I can probably tell you more about how we do this on my own sites and my newer
project, 9Cloud.us, which allows site owners to outsource all these concerns.
Just email me at admin@9cloud.us

------
patrickmclaren
It would be interesting to see if this kind of content is amenable to
classification. Maybe it would be worth looking into something like Caffe [1];
it may even help you with managing the site in general. I can't search right
now, but I think a quick search on Google Scholar could yield a few
different approaches in this direction.

[1] [http://caffe.berkeleyvision.org/](http://caffe.berkeleyvision.org/)

------
kybernetyk
Visibly watermark NSFW images with the uploader's IP and the exact time the
image was uploaded. This should deter the most casual child porn
uploaders.

~~~
joshstrange
And then no one (including the CP people) would use the site... Sure you don't
have CP on your site but you don't have any users either. There are other ways
to handle this.

~~~
lowmagnet
They could do it silently as IPTC metadata. I don't know why this isn't already
a thing.

------
zxv
One could use a hash set to identify and exclude known child porn images.
Having done that, the law may require one to report it to law enforcement.

See the 'Child Exploitation Hash Sets' available here:
[http://www.nist.gov/oles/forensics/forensic-database-tech-
di...](http://www.nist.gov/oles/forensics/forensic-database-tech-digital-
evidence-table.cfm)
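
Note that hash sets like these use exact cryptographic hashes, which only match byte-identical files. For intuition, here is a toy perceptual hash (average hash, or aHash) that survives small pixel changes; real systems use PhotoDNA or pHash, and this sketch is purely illustrative:

```python
# Toy average-hash: reduce the image to an 8x8 grayscale thumbnail
# (64 values), then set one bit per pixel depending on whether it is
# above the mean. Similar images give hashes a small Hamming distance apart.

def average_hash(gray_8x8):
    """gray_8x8: 64 grayscale values (0-255), row-major."""
    avg = sum(gray_8x8) / len(gray_8x8)
    bits = 0
    for v in gray_8x8:
        bits = (bits << 1) | (1 if v > avg else 0)
    return bits

def hamming(a, b):
    # Count differing bits between two hashes.
    return bin(a ^ b).count("1")

def is_near_match(h1, h2, max_dist=5):
    # A small Hamming distance suggests the same underlying image.
    return hamming(h1, h2) <= max_dist
```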

------
bkovacev
I believe machine learning is the way to go. It might require quite a bit of
coding and time, but in my humble opinion it is your best bet. Also, someone
posted some JS code on HN a while ago that would detect nudity in a given
photo. I believe you could automatically tag images NSFW if the script alerts
for nudity. I will edit for link as soon as I find it.
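
Skin-detection scripts of that kind typically work along these lines: count pixels falling in a rough skin-tone range and queue the image for human review when the ratio is high. The rule and threshold below are illustrative assumptions, not a vetted classifier:

```python
# Hypothetical, very crude NSFW pre-filter based on skin-tone pixel ratio.

def is_skin(r, g, b):
    # Classic rule-of-thumb RGB skin heuristic.
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_ratio(pixels):
    """pixels: iterable of (r, g, b) tuples."""
    pixels = list(pixels)
    if not pixels:
        return 0.0
    return sum(is_skin(*p) for p in pixels) / len(pixels)

def needs_review(pixels, threshold=0.4):
    # Flag for human review; do NOT auto-classify on this alone.
    return skin_ratio(pixels) >= threshold
```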

------
petilon
Machine Learning is the best way to detect illegal images. Here's a story on
this topic: [http://abcnews.go.com/Technology/detect-digital-child-
pornog...](http://abcnews.go.com/Technology/detect-digital-child-
pornography/story?id=10362616)

------
TACIXAT
Have you blocked known tor exit nodes? That would be my first course of
action. Blacklist IPs for any offence.

------
mgh2
Here is how Facebook and Youtube do it:
[http://www.dediced.com/links/the_laborers_who_keep_dick_pics...](http://www.dediced.com/links/the_laborers_who_keep_dick_pics_and_beheadings_out_of_your_facebook_feed)

------
zitterbewegung
Look, there is no 100% reliable way to keep child pornography off of your site.
The only way to attempt to prevent it is to have a reporting button on all
images and pay someone to work through the moderation queue, or to have an
automated system that flags images and removes them for you.

------
evolve2k
HN today (7 days later): MS have released their PhotoDNA publicly, so it's
now easy to use for small sites:

[https://news.ycombinator.com/item?id=9903263](https://news.ycombinator.com/item?id=9903263)

------
mirimir
Several third-party services for reviewing uploaded images have been
suggested. I don't argue with their value. But how can one review
effectiveness without viewing images? There is stuff that just can't be
unseen.

------
rfrey
"From rape and bdsm, which I can kind of tolerate"

I hope you mean simulated rape (although I'm not sure how people know the
difference). Victims don't suddenly become unworthy of your concern because
they pass 18.

------
pbreit
First, be careful because the law on this is pretty tough, even though the
content is user generated.

But I'm wondering how much messaging you present indicating that illegal
content is forbidden and will be prosecuted?

------
antaviana
1 - Charge a one-time fee of USD 0.01 to your users, payable only via PayPal.

2 - Report any child pornography to the police, handing them the PayPal
details.

~~~
wut42
3 - Lose all your users?

I wouldn't pay USD 0.01 to access a simple image upload site.

------
bane
Maybe Mechanical Turk?

------
505
[http://queue.acm.org/detail.cfm?id=2721993](http://queue.acm.org/detail.cfm?id=2721993)
perhaps?

------
nailer
There are APIs to find naked people. Some are terrible; pifilter, which is
designed specifically to recognise body parts, is pretty good.

------
im3w1l
You should look into automatic classification of the images. IANAL, but I
imagine you probably need some kind of permit if you DIY.

------
Raed667
For CP, if you're US based: log it, take it down, and use one of the many
government/NGO reporting systems available.

------
MalcolmDiggs
Seems like a good use for MechanicalTurk (mturk.com), at least until a
free/open-source API becomes available that can replicate the abilities of a
human.

I'd recommend sending new photos to a queue, and only letting them get indexed
after someone from mturk has marked them as acceptable.
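
That pending-until-approved flow is simple to wire up. A hypothetical sketch (the state names and class are illustrative):

```python
# Pre-moderation queue: uploads start as "pending" and only become
# publicly indexable after a human reviewer approves them.

PENDING, APPROVED, REJECTED = "pending", "approved", "rejected"

class ModerationQueue:
    def __init__(self):
        self.items = {}  # image_id -> state

    def submit(self, image_id):
        # New uploads are never visible until reviewed.
        self.items[image_id] = PENDING

    def review(self, image_id, ok):
        self.items[image_id] = APPROVED if ok else REJECTED

    def indexable(self, image_id):
        # Only approved images are served or indexed.
        return self.items.get(image_id) == APPROVED
```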

If the workers find something illegal, seek counsel and do as the counsel
advises.

~~~
iLoch
Definitely not!!! Asking people to look at child porn for you is a terrible
idea, and may be illegal in many places.

~~~
true_religion
You're not asking them to look at anything that you've identified as illegal
material. If you already knew it was illegal, you'd have deleted it or
reported it yourself.

You're asking them to moderate _unknown_ material from _new_ users you know
nothing about. That's something that every volunteer moderator from Reddit
does.

------
transfire
what site is this?

------
blazespin
How-old.net

------
mahouse
What's your problem with BDSM, as long as it's consensual?

~~~
linuxydave
Images of BDSM are illegal in some countries, like the UK, IIRC.

~~~
mahouse
Oh, such a weird argument coming from Hacker News, the libertarian haven.

~~~
linuxydave
Well, to clarify (seeing how I can't edit my post) - if I was in OP's position
that's what my concern would be. As I live in the UK I'd have to be careful
about hosting things like BDSM pictures as the porn laws here are rather
retarded.

~~~
mahouse
I figured. I was just wondering about the downvotes with no clarification. (I
don't care about the votes themselves, but I wanted to know what other people
thought.)

