
Facebook not protecting content moderators from mental trauma: lawsuit - ccnafr
https://www.reuters.com/article/us-facebook-lawsuit/facebook-not-protecting-content-moderators-from-mental-trauma-lawsuit-idUSKCN1M423Q
======
jandrese
I'm curious what they expect Facebook to do in this situation. I get that
these moderators have to look at disturbing imagery, but that's the job they
signed up for. It's not like Facebook can only show them nice images, it's
their job to look at those horrible images. I don't think Facebook was sugar
coating the job description or anything.

This feels like a class action lawsuit brought by garbage workers because they
have to handle trash all day long.

~~~
3pt14159
There is a simple way to deal with shitty users. The problem is that it runs
counter to the interests of the organization that hosts them. Require every
user on the platform to use an invitation code from another user. They don't
have to use their real names, but if they do something fraudulent or awful
both the inviter and the invitee get banned.
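The scheme above can be sketched as a small invite graph. This is a hypothetical illustration, not anything Dribbble or HN actually runs; all names and structure are made up:

```python
# Hypothetical sketch of the invite-tree moderation scheme described above:
# every account records who invited it, and a ban for abuse takes out both
# the offender and the account that vouched for them.

class InviteGraph:
    def __init__(self):
        self.inviter = {}   # user -> who invited them (None for founders)
        self.banned = set()

    def join(self, new_user, invited_by):
        if invited_by is not None and invited_by in self.banned:
            raise ValueError("invite code from a banned account")
        self.inviter[new_user] = invited_by

    def ban_for_abuse(self, user):
        """Ban the offender and the inviter who vouched for them."""
        self.banned.add(user)
        sponsor = self.inviter.get(user)
        if sponsor is not None:
            self.banned.add(sponsor)

g = InviteGraph()
g.join("alice", None)       # founding member, no inviter
g.join("bob", "alice")      # alice vouches for bob
g.ban_for_abuse("bob")      # bob does something awful
print(g.banned)             # alice is banned alongside bob
```

The design choice is the whole point of the comment: because vouching carries real risk, invitations become scarce and users police whom they invite.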

This is why Dribbble is still nice and most other places turn to garbage.

HN is a rare exception. It's likely because the subject matter requires a
certain degree of intelligence, which tends to negatively correlate with
trashiness.

~~~
dsfyu404ed
>It's likely because the subject matter requires a certain degree of
intelligence, which tends to negatively correlate with trashiness.

I think you're confusing "more homogeneous" with "less trashy".

Trashy is just whatever the local minority does that the local majority finds
repulsive.

A lot of the opinions conveyed in the commentary on HN that are reasonably
well accepted would go over like a lead balloon in some other communities.

~~~
neuroptera
While I see where you are coming from, I think we can all unanimously and
irrevocably agree that posting child porn and snuff videos to social media is
pretty damn trashy.

~~~
antepodius
I think you're sort of missing the point. Posting gore/CP on, say, your public
feed on a mainstream social media site is trashy by the common idea of
civil/moral behaviour, but it would be par for the course on a LiveLeak-style
site or a Tor hidden-service pedophile network. It can be taken for granted
that most people think CP is bad, and that sending pictures of dismembered
kids to thousands of strangers is rude. Anyone who doesn't agree will hide
anyway, or act anonymously. What's the point in a post like this?

------
DavidVoid
[https://www.wired.com/2014/10/content-moderation/](https://www.wired.com/2014/10/content-moderation/)

You'd figure that four years after this article came out the larger social
media companies would have solved this issue.

A section that stood out to me:

 _Eight years after the fact, Jake Swearingen can still recall the video that
made him quit. He was 24 years old and between jobs in the Bay Area when he
got a gig as a moderator for a then-new startup called VideoEgg. Three days
in, a video of an apparent beheading came across his queue._

 _"Oh fuck! I've got a beheading!" he blurted out. A slightly older colleague
in a black hoodie casually turned around in his chair. "Oh," he said, "which
one?" At that moment Swearingen decided he did not want to become a
connoisseur of beheading videos. "I didn't want to look back and say I became
so blasé to watching people have these really horrible things happen to them
that I'm ironic or jokey about it," says Swearingen, now the social media
editor at Atlantic Media._

------
lawnchair_larry
Microsoft had a similar lawsuit last year. It sounds like their moderators had
to deal with even worse.

[https://www.theguardian.com/technology/2017/jan/11/microsoft...](https://www.theguardian.com/technology/2017/jan/11/microsoft-employees-child-abuse-lawsuit-ptsd)

~~~
brador
Could Reddit moderators have a case?

~~~
lobotryas
The ones who are actually employees? Maybe.

The ones who are regular users who choose to moderate a community? No. There's
no business relationship.

------
neuroptera
> Facebook not protecting content moderators from mental trauma

> Facebook in the past has said all of its content reviewers have access to
> mental health resources, including trained professionals onsite for both
> individual and group counseling, and they receive full health care benefits.

...which is it?

~~~
klodolph
You don’t get a free pass for injuring your workers just because you give them
access to doctors, why would this be any different?

~~~
jasonmaydie
But that's _the_ job. It's not like they were doing something else and this is
a side effect.

If I employ you to clean human feces, you can sue me for not providing
adequate protection etc., but you can't sue me because you had to deal with
human feces. That is the job.

~~~
morvita
You've hit exactly what this lawsuit is about. They aren't saying "when I got
a job as a content moderator I wasn't expecting to see traumatizing things",
they're saying "my employer isn't providing adequate protections from the
things I have to see as part of my job".

------
beauzero
After reading the comments and thoughts about the job, I think this is just
new territory. I am not sure that any company can prepare an employee for this
kind of mental barrage, or counter that constant consumption of evil.

------
harlanji
Let’s all just switch to RSS, social readers, and QR stickers and call it a
day.

------
jandrese
I wonder if maybe tighter integration with law enforcement would help the
situation? Part of the problem may be that people feel so helpless when all
they can do is ban the image and maybe send a copy up into the black hole of
law enforcement.

It has to be incredibly demoralizing to see yet another child sex slave image
posted from that Kremlin IP range taken in what is clearly the same room
they've been seeing for months. You can forward all of the pictures to the
FBI, but they can't do anything about it and you're seeing victim after victim
paraded around the Internet. Facebook's jurisdiction stops at the uplink from
their datacenter.

~~~
lawnchair_larry
Where does it say anything about the Kremlin IP range posting child sex slave
images?

------
Timmah
How is it any different from ER doctor, coroner, social worker, or
investigator?

~~~
asveikau
> How is it any different from ER doctor, coroner, social worker, or
> investigator?

Yes. All those people should have support too. And many more.

What is it with people and saying we need to race to the bottom? Like when
people react to other people's high pay and benefits not by saying, "I should
be up there with you", but instead, "why aren't you down here with me?"

The implication being, I guess, we need to do more to spread the suffering and
not lift people.

~~~
Timmah
What is it with people saying we need to sugar coat, bubble wrap, and insulate
our tender egos from the reality of the world? Psychologists have a term for
that. It's called denial.

~~~
asveikau
If you happen to have a psychologist who trivializes trauma or PTSD-type
symptoms in these terms, I'd say you need to find a different psychologist.

------
bluetidepro
Alt Onion Title: "City not protecting coroners from seeing dead bodies:
lawsuit"

------
zantana
I'm pretty sure a traumatized content moderator was exacting revenge on the
people misbehaving on social media in a CSI: Cyber episode. Was it on Black
Mirror too?

