
How AI/ML algorithms from Google/Msft/AWS see nudity in images - mohi13
https://dataturks.com/blog/image-moderation-api-comparison.php
======
mohi13
There still seems to be a ~10% gap where these ML APIs mis-classify images;
it might be that they have been trained very conservatively to filter out
even a little bit of exposed skin. I wonder how others are moderating their
UGC images? (Social networks like FB seem to do little moderation; I've been
embarrassed many times when an explicit image shows up on my wall :P )

------
gajju3588
Looking at the report, I guess it's a good time to replace those millions of
manual moderators.

~~~
mohi13
I would strongly disagree. Moderators, apart from filtering nudity, also apply
a lot of business rules while filtering content. Even the report says the best
recall is around 90%, which leaves a fair bit of stuff that still needs to
pass human evaluation. Maybe that's why companies like YouTube employ tens of
thousands of moderators in countries like India or the Philippines.

~~~
gajju3588
I think the number of employees required might be way smaller now, just
enough to verify near-threshold cases.
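The routing scheme implied above can be sketched as a simple score-banding
policy: auto-handle confident predictions and queue only the ambiguous middle
band for human review. This is a minimal illustrative sketch; the function
name and threshold values are assumptions, not part of any of the compared
APIs.

```python
# Hypothetical sketch: route images by a model's NSFW confidence score.
# The thresholds (0.9 / 0.1) are illustrative assumptions, not values
# documented by Google/Microsoft/AWS moderation APIs.

def route(nsfw_score, reject_above=0.9, approve_below=0.1):
    """Auto-reject high scores, auto-approve low scores,
    and send everything in between to a human moderator."""
    if nsfw_score >= reject_above:
        return "auto_reject"
    if nsfw_score <= approve_below:
        return "auto_approve"
    return "human_review"

# Only the ambiguous middle band reaches moderators:
decisions = [route(s) for s in (0.02, 0.55, 0.97)]
# → ["auto_approve", "human_review", "auto_reject"]
```

Under a policy like this, the human workload shrinks to the fraction of
images whose scores land between the two thresholds, which is consistent
with the comment that far fewer moderators would be needed.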

