
Ask HN: good idea to create api for detecting porn? - wsieroci
Hi,

I am wondering what you think of the idea of creating a service that would detect porn for web 2.0 services using human brains (the reviewers would be employees of this service). You would point to an image URL using an API, and after some time (let's say 5 minutes) the service would answer (GET/POST) to a website URL given by you with the answer (YES/NO). The service would charge 1 cent per image uploaded.

What do you think of this idea?

Regards,
Wiktor
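
The submit-then-callback flow described above could be sketched roughly like this. Everything here is hypothetical: the class, method names, and in-memory queue are made up for illustration, and `notify` stands in for the HTTP GET/POST the service would make back to the client's site.

```python
import queue

PRICE_PER_IMAGE_CENTS = 1  # the 1-cent fee proposed in the post

class PornCheckService:
    def __init__(self, notify):
        # notify(callback_url, verdict) stands in for the HTTP request
        # the service would send back to the client's URL.
        self.pending = queue.Queue()
        self.notify = notify
        self.revenue_cents = 0

    def submit(self, image_url, callback_url):
        # Client-facing API call: charge 1 cent and queue for human review.
        self.revenue_cents += PRICE_PER_IMAGE_CENTS
        self.pending.put((image_url, callback_url))

    def review_next(self, human_verdict):
        # A human employee reviews the image (the verdict is passed in
        # directly here) and the service answers "YES" or "NO".
        image_url, callback_url = self.pending.get()
        self.notify(callback_url, "YES" if human_verdict else "NO")

# Simulated round trip: collect callbacks in a list instead of over HTTP.
results = []
svc = PornCheckService(notify=lambda url, verdict: results.append((url, verdict)))
svc.submit("http://example.com/img1.jpg", "http://client.example/hook")
svc.review_next(human_verdict=False)
print(results)            # [('http://client.example/hook', 'NO')]
print(svc.revenue_cents)  # 1
```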
======
1123581321
If you're going to offer a service like this, you want a few tiers of
verification. You should have a lightning fast level 1 service that returns an
automatic response. Perhaps level 2 would be a more computationally expensive
automatic response. Level 3 might be human analysis, but the image would be
just one of several shown on a screen at once. Level 4 would actually get a
human's full attention for a few seconds. And Level 5 would be to handle
strange cases, do CP reports and handle disputes.
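
A minimal sketch of that escalation, assuming each tier is a function that returns "YES"/"NO" when confident or None to escalate to the next (slower, more expensive) tier; the tier implementations below are stubs, not real classifiers:

```python
def classify(image, tiers):
    # Try each tier in order; escalate on None.
    for check in tiers:
        verdict = check(image)
        if verdict is not None:
            return verdict
    return "DISPUTED"  # fell through to manual dispute handling (level 5)

# Stub tiers: a fast hash/name blacklist, a slower automatic classifier,
# and batched human review.
blacklist = {"known_bad.jpg"}

def level1_fast(image):
    return "YES" if image in blacklist else None  # answers only on hits

def level2_auto(image):
    return None  # pretend the automatic classifier is unsure

def level3_batch_human(image):
    return "NO"  # a human glancing at a grid of images

tiers = [level1_fast, level2_auto, level3_batch_human]
print(classify("known_bad.jpg", tiers))  # YES
print(classify("cat.jpg", tiers))        # NO
```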

~~~
wsieroci
Yes, such a process could perhaps be optimized like that, but the most
problematic step is step 1.

~~~
proexploit
Sure, it would be hard, but there are some projects that might help out,
e.g.
[https://github.com/pa7/nude.js](https://github.com/pa7/nude.js). It's not an
easy problem to solve, but it's possible. You could also start the service with
levels 2-5 of 1123581321's example, or just 2-4, and wait to see what your
customers request.
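
For a sense of what a level-1 automatic check looks like, here is a crude skin-tone heuristic in the spirit of nude.js: count pixels whose RGB values fall in a rough "skin" range and flag the image if the ratio is high. The thresholds are illustrative guesses, not nude.js's actual rules.

```python
def looks_skin(r, g, b):
    # Very rough RGB skin test: reddish, not too green or blue.
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b and (r - min(g, b)) > 15)

def skin_ratio(pixels):
    # pixels: list of (r, g, b) tuples.
    skin = sum(1 for p in pixels if looks_skin(*p))
    return skin / len(pixels)

def maybe_nude(pixels, threshold=0.4):
    # Flag images where skin-toned pixels dominate.
    return skin_ratio(pixels) > threshold

# Toy "images": one mostly skin-toned, one mostly dark.
skin_img = [(200, 120, 90)] * 80 + [(10, 10, 10)] * 20
dark_img = [(10, 10, 10)] * 100
print(maybe_nude(skin_img))  # True
print(maybe_nude(dark_img))  # False
```

A real level 1 would also need known-image hash matching and would produce many false positives on faces and beaches, which is exactly why the escalation tiers above exist.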

------
smartwater
Existing flagging systems are adequate and 1 cent per use doesn't seem like a
sustainable business model for anyone involved. It seems like mostly social
networks would be your customer, but 5 minutes is a long wait and eliminates
many use cases. A quick script hooked up to Mechanical Turk (or TaskArmy) does
the same thing and could provide results in a minute or less.

You'd also be associating yourself with porn, which comes with all sorts of
weird consequences.

What makes you think that this is a problem that needs solving? And if it
needs solving, what makes it worth it to you? Are you uniquely qualified to
solve this problem? What is your end goal?

~~~
wsieroci
Yes, I think this problem needs solving, because I do not see any good,
reliable API service like this on the web right now.

~~~
stevekemp
I'd suggest the reverse.

Because there is no existing API, this cannot be in demand; otherwise it
would already exist.

Sure companies like OKCupid, Facebook, POF, etc, all want to detect porn in
their user-uploaded images, but they all handle it via user-reporting. That
costs them nothing, and is "fast".

I suspect not many people would be prepared to wait five minutes for a real
result.

------
krapp
Five minutes is a long time for said web services to have to wait for an image
to be validated.

Also, you might (I don't know, IANAL) have some legal issues when, inevitably,
child porn shows up, because then your employees are being paid to look at
child porn. Which, if they're not cops, might be a problem.

Although, if I'm running a web service, better you than my own mods, I guess.

~~~
wsieroci
I am not saying that you have to wait for an image to be validated. The image
could be validated after you have uploaded it, for example once someone has
flagged it.

------
DanBC
So long as you have clear definitions for what is or isn't porn, and some
protections and easy reporting mechanisms for images of child sexual abuse
(as smartwater says, that's going to be a tricky legal area).

But how much are you going to pay the people looking at the images?

~~~
wsieroci
Based on how many images they have processed.

------
martey
The other comments seem to suggest that this is unworkable, but services like
this already exist:

[http://www.nytimes.com/2010/07/19/technology/19screen.html](http://www.nytimes.com/2010/07/19/technology/19screen.html)

~~~
wsieroci
In this article I do not see a company which offers such an API.

~~~
martey
The article makes clear that large technology companies currently outsource
moderation to companies that use human employees to detect porn and other
inappropriate content. This is basically similar to your idea, regardless of
the fact that the article does not go into detail about how the outsourced
employees get access to content that needs to be moderated.

When I wrote my comment, the other comments suggested that having humans
review content would be unworkable. They did not suggest that creating or
providing an API for moderation would be the hard part of the business.

------
timmm
Bad idea; humans are obsolete for this task.

A program can easily scan a picture and detect nudity with a high degree of
accuracy.

