

The laborers of content moderation on Facebook - InternetGiant
http://www.wired.com/2014/10/content-moderation/

======
bashinator
What does it say that there's no commentary here? Collective guilt over the
human cost of maintaining these services? I also wonder how a site like Ello
would deal with this, if they became big enough to need this kind of content
moderation.

I can't imagine having to look at the worst humanity has to offer for eight
hours a day. It should merit hazard pay and unlimited psychological counseling
at the very least.

~~~
jpatokal
Do we feel collective guilt over having paramedics who need to scrape dead
toddlers run over by trucks off the road, or hospice nurses who spend their
days changing the diapers of dementia patients? It's a shit job, but they're
stopping everybody else from seeing the worst of humanity.

~~~
johnchristopher
In Europe (and I am pretty sure in the US as well) paramedics and hospice
nurses have access to psychological help and support if something on the job
demands it.

------
larrybrin
_The Arab Spring was in full swing, and activists were using YouTube to show
the world the government crackdowns that resulted. Moderators were instructed
to leave such “newsworthy” videos up with a warning, even if they violated the
content guidelines._

This seems related to the other front page article on google and geopolitics.

------
jpatokal
I wonder how hard it would be to implement automatic face masking for these
videos, to increase the viewer's emotional distance from what they're exposed
to.
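The masking step itself is not the hard part; a minimal sketch, assuming face
bounding boxes already come from some upstream detector (the detector and the
`pixelate_regions` name here are hypothetical, not anything the article
describes):

```python
import numpy as np

def pixelate_regions(frame, boxes, block=16):
    """Return a copy of `frame` (H x W x 3 uint8 array) with each
    (x, y, w, h) box pixelated by averaging block x block tiles."""
    out = frame.copy()
    for x, y, w, h in boxes:
        region = out[y:y+h, x:x+w].astype(np.float64)
        for by in range(0, h, block):
            for bx in range(0, w, block):
                tile = region[by:by+block, bx:bx+block]
                # Replace every pixel in the tile with the tile's mean color.
                tile[...] = tile.mean(axis=(0, 1))
        out[y:y+h, x:x+w] = region.astype(np.uint8)
    return out
```

The hard part is the detection: faces in these videos are often occluded,
in profile, or low-resolution, which is exactly where simple detectors fail.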

~~~
iamsalman
Google did this with Street View for faces, house numbers, and vehicle
plates. But masking the face alone won't eliminate the negative effect.

------
shenanigoat
This was an incredibly sad article. Traumatizing thousands of third-world
(developing?) citizens to keep our Facebooks and YouTubes free from the
depravity and horrors of humankind... and, um, sex. If there ever was a need
to automate something effectively, it is this, for the sake of these poor
workers if nothing else. Replacing tens of thousands of workers with an
algorithm would be insanely valuable as well.

~~~
SamReidHughes
Here's a way for you to solve the problem. Offer those workers better pay and
conditions than what Facebook is offering. So far nobody's stepped up to the
plate.

~~~
thiagoharry
That's as absurd as telling someone to solve the problem of slavery by
starting to pay the workers. The problem is that in a market where products
are cheaper because someone isn't paying for the labor, anyone who does pay
fair salaries is at an economic disadvantage. The logic is the same if you
substitute "slavery" with "bad conditions and poor salaries". When a problem
is in the system, you can't solve it just by changing individual attitudes.

~~~
SamReidHughes
You can by all means tell us how to solve it. Right now Facebook's the best
job offer they've got; you can't just wish that fact away. I guess you could
do what the GP wanted and create a competing product that drives them out of
the job?

> The logic is the same if you substitute "slavery" with "bad conditions and
> poor salaries".

No, it's not the same. Slavery is an entirely different kind of bargaining
position.

------
nness
I would hope that these big services, like Facebook, use content matching
techniques to quickly filter out repeat submissions...
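For exact re-uploads this can be as cheap as a byte-level hash; catching
near-duplicates (re-encodes, crops, brightness tweaks) needs a perceptual
hash instead. A toy sketch of both ideas, with illustrative function names
(real platforms use far more robust systems such as Microsoft's PhotoDNA):

```python
import hashlib

def exact_fingerprint(data: bytes) -> str:
    """Exact-match fingerprint: any re-upload of identical bytes collides."""
    return hashlib.sha256(data).hexdigest()

def average_hash(gray) -> int:
    """Tiny perceptual hash over an 8x8 grayscale grid (list of lists):
    each bit records whether a pixel is above the grid's mean, so small
    brightness or encoding changes still yield a nearby hash."""
    flat = [p for row in gray for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a: int, b: int) -> int:
    """Bit distance between two hashes; small means likely the same image."""
    return bin(a ^ b).count("1")
```

A moderation queue could then drop any submission whose perceptual hash is
within a few bits of something already reviewed, so humans only ever see a
given piece of content once.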

------
jboynyc
Along similar lines, an excellent article on the gendered nature of content
moderation work from earlier this year:
[http://thenewinquiry.com/essays/hate-sinks/](http://thenewinquiry.com/essays/hate-sinks/)

------
maurycy
If anything, the upside is that if we come up with better techniques for
detecting unwanted content, we'll free these people for more productive
endeavours.

------
kbart
Off topic, but I would kill to have a view like the one in the headline
picture outside my office window.

------
4oh9do
Instead of discussing alternative ways the "moderation" could be done, how
about a different idea...stop censoring the content in the first place.

