
Three months of content moderation for Facebook in Berlin - imartin2k
http://sz-magazin.sueddeutsche.de/texte/anzeigen/46820/Three-months-in-hell
======
Barrin92
Okay, the speech issue aside (it's frankly been debated to death), what I
find really crazy about the situation is how much Facebook is treating this
as a 'mechanical turk'-like task.

The author describes having only a few seconds for a pretty complex task and
being exposed to disturbing material for essentially their entire workday,
without any adequate psychological training at all.

This is not how content moderation should be done at all, and I'm not even
sure that this should be treated like a routine job.

~~~
twblalock
At Facebook's scale, there is not a good way to do it. Even if they were able
to scale it, it would not be good enough anyway.

You can automate it, but clearly that doesn't catch everything and automated
systems can result in just as much bias as humans doing the job.

You can hire humans to do it and make them work fast, because the number of
posts per day is extremely high. You can use some automation to help them, but
that will combine the problems of human moderation and automated moderation.

You can hire lots of humans to work at a normal pace and give them
psychological counseling, and spend so much on their salaries that you'll go
out of business, and even then they won't catch everything and will be
frequently accused of bias.

It's a no-win situation for Facebook because even if they were able to
objectively and fairly moderate every single post, they would be accused of
bias by politically motivated people anyway.

This is not a technology problem, it's a social and political problem and
Facebook will never be able to please everyone. Twitter is in the same boat.

~~~
mschuster91
> You can hire lots of humans to work at a normal pace and give them
> psychological counseling, and spend so much on their salaries that you'll go
> out of business, and even then they won't catch everything and will be
> frequently accused of bias.

Then you should not fucking be in that business at all! What FB does is
externalize the cost onto society and skim the profit!

The cost here is two things: first, the cost of mentally fucking up the
moderators (so an extensive amount of psychological healthcare has to be
shouldered by society), and second, the cost to society that arises from
letting Nazis and jihadists spread their propaganda (including the cost of
real-world violence enabled by said propaganda).

~~~
twblalock
Any online open forum would have the same problems no matter how hard they
tried to prevent it. We can either accept that some level of hate speech is
inevitably going to slip past the moderators, or we can give up on having open
forums altogether. There is no perfect world with perfect moderation of online
content.

I wish Facebook had the courage to say this themselves.

~~~
mschuster91
> Any online open forum would have the same problems no matter how hard they
> tried to prevent it.

Facebook tries to get by with as few moderators as it can, paying them as
little as possible and generally trying to teflon away anything negative from
its service (i.e. it denies responsibility for the fake news/hate speech on
its platform).

I believe there would certainly be a way to provide fair moderation: for
example, a ratio of 1 full-time CSR (customer _services_ representative) per
1000 users, not the 1 per 20k or worse they are currently trying to get by
with.

~~~
twblalock
> a ratio of 1 full-time CSR (customer services representative) per 1000 users

That would require 2 million full-time CSRs, because Facebook has a little
over 2 billion active users per month.

~~~
mschuster91
If that is the price to avoid externalizing the cost of a lack of moderation
to society, then it is 2 million CSRs.

I'm sick and tired of Silicon Valley companies thinking they can save money on
customer support and have societies pick up the tab.

And it's not like Facebook (or Twitter) couldn't get the money: €3 per user
per month should be enough to cover the cost of a properly salaried CSR, and
at the same time it would create a barrier for bots and Nazi trolls.

~~~
ropiku
> And it's not like Facebook (or Twitter) couldn't get the money

What makes you say that? Facebook barely makes $5 per user per quarter (3
months) in _revenue_. They couldn't even afford $1/month/user. Meanwhile,
Twitter has never turned a profit.
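
A rough back-of-envelope check of those numbers (the salary figure here is
an assumption for illustration, not something from the article):

    # Back-of-envelope: cost of 1 CSR per 1000 users vs. Facebook's revenue.
    users = 2_000_000_000      # ~2 billion monthly active users, as cited above
    users_per_csr = 1000       # ratio proposed upthread
    csr_annual_cost = 40_000   # assumed fully loaded cost per CSR, $/year

    csrs_needed = users // users_per_csr                 # 2,000,000
    annual_cost = csrs_needed * csr_annual_cost          # $80 billion/year
    per_user_per_quarter = csr_annual_cost / users_per_csr / 4

    print(f"${per_user_per_quarter:.2f} per user per quarter")  # $10.00 vs ~$5 revenue

At any plausible salary, the proposed ratio costs more per user than
Facebook's entire per-user revenue.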

~~~
eli_gottlieb
If you can't make more than $5 per quarter per user, maybe you shouldn't be in
business. If you have never, ever turned a profit, then what are you, a
charity?

~~~
evv
Nope, you're a venture-backed startup!

------
jacquesm
This sounds very much like the kind of exposure we had moderating
ww.com/camarades.com and files.ww.com. I've had a lot of contact with the
authorities over the years about some of the more disturbing things we came
across on the service. Some of those images will haunt me for life. I have
nothing but good things to say about the people who do this for a living in
an official capacity; that's got to be a very hard job to keep up for an
extended period. I ended up shutting down files.ww.com because there was no
way to run that part of the business profitably with the amount of oversight
it required, and it seemed to attract the worst kind of content, but I'm
frankly also not too unhappy to be out of the webcam community business.

Content moderation is a pretty depressing activity to engage in for any
amount of time. My biggest stroke of luck was to find a couple of community
members who really enjoyed doing this and kept a pretty fair hand; otherwise
I would have shut the whole thing down a decade ago instead of a year ago.

~~~
lovemenot
>> My biggest stroke of luck was to find a couple of community members who
really enjoyed doing this ...

That must be a rare type of person. Without casting aspersions on those two
individuals in particular, I wonder whether part of a solution might be to
invite the posters of objectionable content to moderate others' content.
Poacher turned gamekeeper.

~~~
flycaliguy
Not a bad idea. Pay psychopaths a good wage, keep them out of trouble and keep
the web clean.

~~~
jacquesm
It's a terrible idea. It is putting the inmates in charge of the asylum.

------
newshorts
It makes me so sad that we are employing people (for very little money) to
harm themselves psychologically. This isn't good for anyone; this is a case
where machine learning is compulsory.

We as a society have decided to hide the terrible aspects of our culture
instead of dealing with them. It's why we have public and private schools;
it's why we give homeless people bus tickets out of our cities. We want to
take the easy road and push our problems out of sight and out of mind.

I'm not saying I have an answer, but hiding from things that disturb us
can't be a long-term solution.

~~~
austenallred
As someone who used to run a news company, I'm not sure watching all the
beheadings and burnings that ISIS posts is healthy or necessary, and I would
advise anyone I love to avoid watching them. I can handle a lot, but that
stuff messes you up mentally, at least temporarily.

~~~
matte_black
I think watching videos where normal people die in some sudden unexpected way
has taught me how to be more perceptive of potential danger, even when a
situation seems perfectly normal. I would say it serves a purpose, and if
people must watch a gruesome video, I’d say those are the ones to watch.

~~~
mclightning
Check out the "Active Self Protection" YouTube channel; it serves exactly
this purpose. I also felt the same way you describe after watching a few of
these.

But I don't think there is any pragmatic reason at all to watch the torture
videos that terrorists make. It is not a defensive confrontation; it is the
aftermath. It serves no purpose, and there is nothing to learn there.

~~~
matte_black
Active Self Protection is some good stuff, I've seen a few.

------
printf_kek0
I am probably among the tiny minority of users who _strongly_ object to any
form of content moderation whatsoever.

Regulatory requirements notwithstanding, a policy that dictates what content
gets filtered for users is analogous to a parent forbidding a child from
watching an age-restricted movie.

Although I could present a slippery-slope argument here, the more salient
argument is that content moderation is essentially a form of social
engineering. If you think I am exaggerating but have never seen video footage
of what _real war_ does to real human beings, I would encourage you to do so;
consider then whether you still experience the same apathy you did whenever
"Suicide bomber in <place_in_middle_east> kills x" appeared in your feed.

IMHO, people should at least be presented with the option to see what is
being filtered, rather than having objectionable material selectively
suppressed, lest society remain indifferent.

~~~
cpncrunch
While it would be nice to have online discussion without moderation, in
practice it simply doesn't work. There are just too many trolls out there who
wreck it for everyone else. (I've moderated online chat for over 20 years, so
I have some experience with trolling.)

The other issue, which is perhaps more relevant here, is illegal content. Are
you saying, for example, that it shouldn't be illegal for someone to put up
posters on every street corner in their town saying that Mexicans are all
child molesters and should be shot? Where should the line be drawn?

~~~
briandear
Actually, in the US, hate speech is not illegal and the hypothetical posters
you mention would be protected speech. “Should be shot” is protected, “should
be shot at 11am today at the soccer field on Hicks Road” would not be
protected. There’s the concept of “true threat.”

Here is an interesting article related to this:
[https://www.heritage.org/the-constitution/report/true-threats-and-the-limits-first-amendment-protection](https://www.heritage.org/the-constitution/report/true-threats-and-the-limits-first-amendment-protection)

There is a mistaken assumption that hate speech is illegal. In the US, it
isn’t illegal. Speech that contains a specific threat of violence would be
illegal — but such threats have to be specific.

There is a song by the band Type O Negative called “Kill all the white people”
and the first line of the song is “kill all the white people and then we’ll be
free.”

That song is hate speech and promotes violence against white people — but it
isn’t illegal.

Even speech promoting the overthrow of the government is protected, as is
speech calling for killing police or raping people.

Disgusting stuff, sure — but it is not illegal speech.

------
Maarten88
> The only power that the content moderator has, is to delete a post.

This surprises me, but it is consistent with Facebook's viewpoint, where the
user is the product. You can fire a customer, but you can't fire your product.

So maybe it is to be expected that Facebook implemented moderation as a QA
process. It would probably be against their interests to implement policies
that would police their community in a way that makes sense, with warnings,
temporary and permanent bans etc.

~~~
ZenoArrow
> "You can fire a customer, but you can't fire your product."

On some level, money is morally neutral, but it's not like Facebook wouldn't
have enough "products" to sell if it dealt more decisively with those who
violate its TOS.

------
ryanmarsh
_I had to quit as I was particularly disturbed by what I saw as signs of
professional deformation in me: a kind of hypervigilance (especially about
the risks for my family). I was dreaming about the job and my own perception
of reality shifted in a most concerning way. The terrible Las Vegas shooting
suddenly seemed entirely normal to me._

I know we like to believe otherwise, but observation of human behavior en
masse shows that we are preternaturally vicious. Extreme cruelty and violence
are as much the default as our need for connection and love. Those of us
who've had the opportunity to observe or take part in humanity in its most
primal form, in the unsafe gaps between "civilization" and a well-functioning
state, often come away with our reality shattered. I know mine was.

Lastly, the author shows signs of PTSD, though granted, this is an armchair
assessment.

~~~
discordance
I remember a similar case in 2015-16 where Microsoft employees also suffered
PTSD after moderating content -

[https://news.ycombinator.com/item?id=13378346](https://news.ycombinator.com/item?id=13378346)

------
DanielBMarkham
_"...At the end of the ramp-up process, a moderator should handle
approximately 1300 reports every day which let him/her in average only a few
seconds to reach a decision for each report..."_

If you are doing something 1300 times a day, you are effectively not doing it
at all. I don't know what you call it ("monkey in a can" comes to mind), but
you are not adding any sort of human oversight or judgment aside from simple
image recognition -- and probably not doing that good a job at recognition,
at that.

I suppose these jobs exist so that the social media giants have somebody to
point to (and fire?) when things go wrong. That's fine. Sounds necessary to
me. But not at that scale. To attribute any sort of meaningful behavior to
clicking images at that rate is a joke. Most normal people would be burnt out
before they got to their first 100.
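
For scale, a rough time budget (the eight-hour shift with no breaks is my
assumption, not a figure from the article):

    # Rough time budget per report, assuming an 8-hour shift with no breaks.
    reports_per_day = 1300
    shift_seconds = 8 * 60 * 60                 # 28,800 seconds

    print(f"{shift_seconds / reports_per_day:.1f} seconds per report")  # ~22.2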

~~~
Mithaldu
A translator usually does 1000 words per day. Are they "not doing it"?

------
kleer001
It's almost as if a better solution would be to leave mass media to a few
distributors per city and a few national ones.

You know, like newspapers.

What do we really gain by having direct and immediate access to the uncensored
thoughts of casual acquaintances and relatives and all their casual
acquaintances?

~~~
Buge
What do we really gain by Hacker News comments?

What do we really gain by blog posts? I personally have learned a lot about
technology from people's blog posts and Stack Overflow posts. Maybe not as
much from Hacker News posts, but I've still learned some. And by reading
reddit posts, such as the stuff in /r/askreddit, I've learned a lot about
people with backgrounds quite different from mine, and I believe I've gained
the ability to better empathize with them.

~~~
kleer001
All of your examples involve moderate to high self-selection for
thoughtfulness and have a culture of self-critique. Reddit has it
historically and HN has it in daily practice.

However, Facebook is nearly a wasteland of relatives and acquaintances yanked
into the network, sometimes against their better judgement. Once they're in
and fluent in the technical aspects, they feel free to be themselves, not
necessarily thoughtful members of the network.

------
MikeGale
1. I got the impression that there may be no long-term safety net for the
enduring impact of this (like PTSD support for policemen). That's not good
enough.

2. Reading this piece, I don't imagine the system can actually "work" in any
reasonable way. This job should be done by the individual user. In FB you can
"block" other people. I imagine that would sort out much of this issue.

3. If FB and others are forced to police and censor like this, maybe they
should simply take themselves out of business, lest they become inclined to
exterminate our species for the actions of what, I hope, are a few.

~~~
MikeGale
4. Other people should not be viewing private posts anyway. (Assuming these
are limited-distribution posts, not public ones.)

------
banku_brougham
No one in this thread has yet discussed a design concept the community at
large could use to help moderate: a dislike button.

Hacker News comments are a pleasure to read (A) because of the community,
which (B) is curated by the ability to downvote horrible posts.

Facebook places value on these horrors, inadvertently I guess, but it is not
empowering the users to shut down bad stuff. Imagine the possibilities.

~~~
tobtoh
Isn't that what the 'Hide Post', 'Hide All Posts from X' and 'Report Post'
options are for?

~~~
banku_brougham
I don't think those actions propagate information to other users, though. The
HN downvote button, for instance, removes visibility for everyone.

------
kriro
Interestingly, the original German title is "Drei Monate Hölle" ("three
months of hell"). I wonder why the English title focuses on the learning
while the German one focuses on the suffering.

Note that it was translated from English to German and not the other way
around, despite being about someone who works for a German company that does
the "content moderation".

------
stevenicr
I wish there were more transparency about the divisions of these networks
that make these kinds of decisions. How do we contact a higher-level manager
at Facebook and talk about something that was removed or deleted?

The person in this article made themselves seem very objective and desiring
to be fair, considering many angles. However, I do not think someone can
click to remove or not remove thousands of posts/accounts a day and stay
objective with all of them.

Projects I worked on have had Facebook and tumblr accounts pulled, and I have
found no way to speak with someone at either company about the issue. Given
how similar the content looked to some standard spam, I can see how a
moderator would quickly click to remove it when they see hundreds of similar
things a day that are indeed quickly made spam. However, these particular
projects were not affiliate spam; they were original works. Yet no amount of
emails or @mentions on Twitter ever brought anyone from these companies to
discuss the issue.

I had similar issues with Google.

When it's just a fake account being used for trolling, the harm in removing
the account is likely minimal. However, there is little chance these
low-level mods know whether the poster was a troll or a legitimate group
trying to do right.

If presence on social networks affects search engine rankings, getting a
low-level, overworked, overstressed person to delete your competition's
accounts could easily reduce their sales by 80%.

As Facebook seems to want to "be the internet" and monopolize what is okay to
be discussed and seen, it certainly has poor customer service for righting
mistakes made in removing things. This is also true of tumblr and others.

Wasn't there something on Hacker News a while back pointing out that many of
these mod jobs are pushed out to countries like the Philippines as well?

With the bias that can be created by culture, and the amount of content these
people are forced to see, there should be better options for contacting these
companies and getting things reinstated.

In some cases, deleting an account hardly affects anyone. In other cases,
deleting an account creates a snowball of removing speech that may be
important to many, not just on Facebook but on the other portals as well.

------
lifeisstillgood
This is fascinating, and disturbing. A few thoughts: UK police, when
cataloguing child pornography for courts, were given regular breaks and
counselling and, what stuck in my mind, watched BBC children's programming on
TV at the same time, just to counter the overwhelming nature of it.

This being overwhelmed is human - we extrapolate from our experience what
the world is "really" like - so a flood of beheading videos teaches you the
world is fucked up. We as a society urgently need to get a grip on this: it's
not just the extreme end, but how skewed away from BBC-normal is the viewing
diet of the average YouTube kid?

I think there are solutions to the scale problem. The Facebook brute-force
approach is interesting, but at 6.5M reports per week it is unmanageable.
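
As a quick sanity check on that volume (the 1300 reports/day quota is from
the article; the five-day work week is my assumption):

    # How many moderators does 6.5M reports/week imply at the quoted quota?
    reports_per_week = 6_500_000
    quota_per_moderator_per_day = 1300   # per-moderator quota from the article
    workdays_per_week = 5                # assumption

    moderators = reports_per_week / (quota_per_moderator_per_day * workdays_per_week)
    print(f"{moderators:.0f} moderators")  # ~1000, about the headcount mentioned below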

I think this needs to be dealt with the way social media deals with
everything - farmed out to the community. What we have is a free-to-publish
environment, but we as a society have really no insight into what is
published - whereas previously there were few enough newspapers that one
could read it all.

So why not take an approach like reCAPTCHA - every day or so you get a
suspicious post in your feed and are asked to comment. Feedback could be
something like "95 of your friends think that post was offensive enough you
should be jailed" - this would be valuable human-level feedback that is not
available in "likes". One can easily see it as a way to understand the
different universes of people online - the fox-hunting lobby generally
approved of your post showing a bloodying, but people who like superhero
movies were revolted.

I can see the Apple Face ID thing here being an interesting measure of
revulsion...

Ultimately, social media has been a consequence-free zone for publishers
(posters). By adding in feedback from friends and wider society, we can get
much-needed insight into what is happening around us, and the feedback will
often be a useful control rod for people posting online.

Long unfocused rant but interesting thoughts.

One final thing: Berlin has a sufficient immigrant population that it can
recruit 1000 multilingual, multicultural immigrants (plus churn; the author
lasted 3 months) with enough tech savvy to be moderators. I just want to
mention that to the 52% of the UK who think we can be a tech hub after
telling all the immigrants to piss off.

------
wonderbear
"I showed empathy only when I found something connecting me with the world of
the living beings, these small details that I tried not to notice that would
humanize the corpse and overcome my reflex of repulsion."

This part really stood out to me. It made me think about the difference in my
own reaction between reading about some number of deaths and reading about how
the victims ended up where they did.

------
gre
Reminds me of this article about censoring content at Google from 2012:
[https://www.buzzfeed.com/reyhan/tech-confessional-the-googler-who-looks-at-the-wo?utm_term=.jdjgm3ymE#.wekWJ39JE](https://www.buzzfeed.com/reyhan/tech-confessional-the-googler-who-looks-at-the-wo?utm_term=.jdjgm3ymE#.wekWJ39JE)

------
jwilk
If you have JS disabled, disable CSS as well to read the article.

~~~
lsmod
If you use Firefox, you can just enter "Reader View".

------
da02
What are the job requirements and qualifications to obtain this position?

~~~
Mithaldu
Arvato is a call center company. They probably require only a lower German
secondary-school diploma.

~~~
da02
Thanks very much.

------
banned1
And all of this human cost just so we can post about our avocado toast! :-(

