
Facebook moderators break NDAs to expose working conditions - notinversed
https://www.theverge.com/2019/6/19/18681845/facebook-moderator-interviews-video-trauma-ptsd-cognizant-tampa
======
cldellow
This was a valuable article to read.

Facebook is enormously valuable. They made something like $15B in net income
in the last four quarters.

Content moderators are a necessary condition for that profit. If kiddie porn,
gore and animal cruelty flooded the network, it would cease to be a
destination visited by people that advertisers will pay to reach.

And yet, there are two sets of entry-level knowledge workers at Facebook:
engineers ($150k/year, benefits, upward career trajectory) and content
moderators ($30k/year, no benefits, likely going to acquire mental illnesses).

I understand the arguments about supply and demand of labour, but I'd have
more respect for Facebook if they demonstrated awareness of this issue. The
article talks about moderators re-evaluating the same piece of distressing
content that they've already flagged. Why? I suspect because moderators are
cheap, and so Facebook isn't putting in the effort to ensure that each
judgment is made only the minimum number of times.

More so than salary, I suspect Facebook considers the moderator cheap in terms
of reputation risk. By outsourcing to contractors located offsite from main
campus, engineers aren't thinking daily about the absolutely horrible stuff
moderators are seeing, and so the one group doesn't impact Facebook's ability
to hire engineers. This is a guess - can anyone at Facebook speak to whether
engineers are aware of the working conditions of moderators, and agitate to
improve their lot?

~~~
ummwhat
I've talked about this somewhere else. What you're seeing is just a
manifestation of what I call "the fundamental problem of user-created
content." Said problem is that warehousing and distributing content scales
insanely well but curation does not. Until we have strong AI, curation is a
manual process. Moderators just aren't efficient enough at processing content
for their output to pay for a full time salary. You can cut costs by making
end users into moderators (the Reddit model) but results may vary.

This problem applies to other forms of sites and content as well. The app
store gets hundreds of submissions in the game category per day. Hosting and
distributing that content is easy. Only a tiny fraction of those games are
going to be played and rated enough times to show up in a recommendation
engine. The bulk of the incoming content stream isn't being matched to
interested people at all (sorting by interesting is the same curation issue as
filtering by offensive).

Literally every content platform is going to have some problem similar to
Facebook's. We can blame Facebook, but the reality is no one has a good
solution. Not even me.

~~~
hashkb
> We can blame Facebook, but the reality is no one has a good solution.

We can take a step backwards and eliminate Facebook, Insta, etc from our
societies and enact legislation that will govern the second generation of
social networks.

~~~
Jedi72
What legislation? Specifically? What do you want to make law, that they have
to hire moderators (which they already do?) or do you want to make it illegal
to post "bad things" on the internet?

~~~
hashkb
Let me start by saying that I am not running for elected office because it's a
really hard job; even though I might believe I would do it with more integrity
and rationality than the next person, I'm also smart enough to realize that
everyone thinks that way and I'd probably end up within a standard deviation
of average corruption.

What I'm saying, seriously, is that we need a society-wide blameless post-
mortem on social networks. It needs to be a slow, careful discussion, where
all the stakeholders have their voices heard, and we decide what is good for
us all. I don't know that we need them at all, I'm not sure they provide ANY
value to ANYONE, but as a non-user I'd of course be open to persuasion by
current victims. Erm, users. In the meantime, it should be illegal to operate
a social network. Nobody needs to go to prison yet, even though I think it's
horrifyingly clear at this point that the ethics are out the window.

One obvious one: advertising and public conversation must always be separated;
the same way no public schoolteacher may read scripture in class, it should be
illegal for an internet service that hosts public conversation (e.g. Twitter)
to allow sponsored content.

We should also establish guidelines for addiction. After cigarettes, drugs,
sugar, etc., we should as a society be prepared to understand the various forms
that addictive products can take and regulate them aggressively before they
become serious problems. UX patterns like infinite scroll, pull-to-refresh,
and push notifications are particularly suspect.

I also think we should close the apparent loophole in COPPA that allows
parents to post photos of their children on social networks.

> that they have to hire moderators (which they already do?) or do you want to
> make it illegal to post "bad things" on the internet?

I tend to come down on the "free speech" side of these issues as much as
possible; I would prefer unregulated public fora that (perhaps by requiring
identity verification) encouraged good-faith participation in substantial
discussions. I think if you are just fooling around, maybe you should head
down to the bar and get drunk with your friends and do your shit talking
there.

~~~
Macross8299
>In the meantime, it should be illegal to operate a social network

So what about the countless number of people whose livelihoods depend on
social media in some capacity? They're supposed to just be fine with being
irrevocably fucked over until the moral panic about social media subsides?
(or, in your words, "we decide what is good for us all")

Where does the small business owner who depends on a social network factor in
as a "stakeholder"?

Drugs, sugar, cigarettes are measurably and objectively harmful to your health
and that's why they are regulated (or should be).

There isn't comparable scientific evidence that social media is nearly as
harmful, aside from questionable, non-reproducible psychology studies, so I
don't think the comparison holds at all.

~~~
hashkb
I hear you saying you don't believe social media is addictive. Will you please
kindly open the "digital wellbeing" app (or whatever it's called on iOS) on
your phone and share the number of hours you've used social networks over the
last week?

~~~
reciprocity
I couldn't help but notice that your comment doesn't address the other parts
of his argument, nor did you choose to respond to the earlier reply you
received (by dragonwriter) outlining the legitimate flaws in the reasoning you
provided with your initial opinion.

Social media has a lot of problems - even this article on just Facebook
outlines a number of them [0] but I agree with the above poster in saying that
you can't have a societal post-mortem analysis of the effects of social media
given its very much _not dead_ state. Advocating that society just presses a
'shut down' button until "we collectively decide how we should proceed" is an
entirely unrealistic scenario.

[0]
[https://en.wikipedia.org/wiki/Criticism_of_Facebook](https://en.wikipedia.org/wiki/Criticism_of_Facebook)

~~~
hashkb
> nor did you choose to respond to the earlier reply you received

HN's posting frequency limits made that choice for me. The system is
unfortunately biased towards drive-by comments and against engaging with
feedback on your own comments.

Edit:

> Advocating that society just presses a 'shut down' button until "we
> collectively decide how we should proceed" is an entirely unrealistic
> scenario

But that's what I'm advocating. I don't think I'm obligated to respond to
people who only came by to reject the premise of my argument. More interesting
discussions are available to anyone who shows up with an open mind.

------
anon029102
Guy Rosen and other execs within the Integrity team continually skirt their
responsibilities here. They claim they're doing better, but the second-order
effects of crappy work conditions and demands keep cropping up. Zuck says one
day we will hopefully be able to AI-away this integrity work (especially the
most traumatizing), but he doesn't say a word about improving working
conditions or pay while the work still needs to be done by humans. And I bet
Zuck wouldn't be able to handle the content that these people have to view.
Sheryl does not care. She keeps referencing the same standard spiel about how
contracting companies have to abide by a strict set of standards, and how
they're ahead of the market in terms of pay and wellbeing. But it's still
awful. The divide between contractors and full-time workers at Facebook is
truly disgusting.

People who work at Facebook should be pushing for change. But they're numb to
the spiel. They're cushy and looked after and don't want to create a fuss.

Rosen doesn't care. Zuck doesn't care. Sheryl doesn't care. What DO they care
about? Perception. Sit in any high-up integrity meeting and you'll see the
only thing they seem to talk about is how "x" would be received by users at
scale. There's no comment as to the ethics or corporate responsibility. You
can be talking about something pretty out there like how human rights
intersect with takedown decisions and all you've got is a bunch of people
umming-and-ahhing about lossy metrics and how Zuck wants this or that so we
better hurry up. Or how awesome it'll look on our PSCs if we ship this thing.

Broken company.

~~~
deusofnull
You're right, Facebook is a broken company. Along that point, we should break
it up.

~~~
allthecybers
I agree and in a more enlightened future I hope we can assess companies like
this on their net benefit to society and apply penalties when they act in a
way that negatively affects society.

------
SolaceQuantum
" _Conditions at the Phoenix site have not improved significantly since I
visited. Last week, some employees were sent home after an infestation of bed
bugs was discovered in the office — the second time bed bugs have been found
there this year. Employees who contacted me worried that the infestation would
spread to their own homes, and said managers told them Cognizant would not pay
to clean their homes._ "

This is utterly nightmarish, given how costly bedbugs are to one's life
(clearing out a home, including replacing all mattresses and couches, and
bagging or hot-cleaning all clothing, sheets, towels, rugs...).

 _" A manager saw that she was not feeling well, and brought a trash can to
her desk so she could vomit in it. So she did."_

This particular manager put their employees in danger of catching an illness,
especially given what appears to be an open office floorplan where airborne
sicknesses can travel the entire room. I'm shocked and appalled, and this is
_the stuff I'm comfortable quoting from the article_ to be shocked and
appalled by. The other stuff has convinced me to actively help friends and
family get off Facebook, rather than just quietly removing myself from it.

~~~
awakeasleep
This is industry standard in businesses with limited sick day or leave
policies. Especially in call center type environments.

Once an employee has used their allotment, they receive a write up if they
take off again no matter how ill. Too many write ups and you're fired.
Managers have no discretion in this process, so they can only mitigate the
impact by doing something like bringing the trash can over or buying hospital
masks.

Not saying it’s acceptable! But this isn’t a facebook problem specifically.

~~~
organsnyder
A local healthcare system has a similar policy: PTO (paid time off) is not
differentiated between sick days vs. other days off. They don't offer any sort
of maternity/paternity leave, aside from what the US Family Medical Leave Act
(FMLA) requires, which is six weeks unpaid time off. Even worse, they require
employees to burn up all of their accrued PTO days before taking any _unpaid_
FMLA time. So you have new parents returning to work sleep-deprived, picking
up new viruses from the petri dish that is their kid's new daycare, with no
time off. And most of these workers are HEALTHCARE PROVIDERS.

How stupid are we as a nation that we don't mandate more humane/non-stupid
policies, even for people in environments where coming to work unhealthy can
often result in death?

~~~
lotsofpulp
FMLA is 12 weeks unpaid.

~~~
omegaworks
Unless your spouse works at the same company. :D

>If a father and mother have the same employer, they must share their leave,
in effect halving each person's rights, if the employer so chooses.

[https://en.wikipedia.org/wiki/Family_and_Medical_Leave_Act_o...](https://en.wikipedia.org/wiki/Family_and_Medical_Leave_Act_of_1993)

------
_bxg1
I think the most pertinent question is _why don't they quit_?

It says a great deal about how broken the United States' job market and social
safety net are. If minimum wage were $15, they could find another job that
paid their basic living expenses. If health care weren't left up to your
employer, they wouldn't be out of luck while looking for a different job. If
there were _any alternative_ , they wouldn't stay in this hellscape.

 _They stay because this is the best deal they could find._ Think about the
kind of society that makes that the case.

~~~
bena
That is true, but a common theme among many of the stories is how much more
they pay compared to other places.

These people are literally trading dollars for their mental health. That is
the choice they are making.

~~~
jedimastert
Yeah, $28,000 was barely a livable wage in the incredibly low-cost city I used
to live in. I can't even imagine trying to live on it in the cost-of-living
hellscape that is Silicon Valley.

~~~
_bxg1
Technically the offices are in Phoenix, Austin, and Tampa. Still, those are
major U.S. cities.

------
aboru
I am surprised after reading a lot of comments here (not all), that I have not
seen any discussion of Cognizant and their role. I am no fan of Facebook and I
believe that they have significant responsibility here, but the contractor is,
imo, the party directly responsible.

These people do not work for Facebook, and we don't know the nature of the
contract in play. Are they paying per person, or a lump sum for some capacity
at some accuracy rate? If Cognizant automated all of this, would it be
accepted under the contract?

Anyway, I don't want to shift focus away from Facebook so much as to
recognize the contracted companies like Cognizant (which is what the whole
article is about, btw, with some comments referring to Facebook). Accenture
and Cognizant really shouldn't escape scrutiny just for being overshadowed by
a bigger name.

~~~
danso
It's true that Cognizant has direct power to change things, but ultimately,
the buck stops with Facebook, since they seem to be the vast majority of
Cognizant's work and thus essentially have direct control of the purse
strings. FB has the ability to change how Cognizant treats its workforce, and
it's Facebook's choice to take a stand or to wash its hands of it. FB also has
indirect say in demanding a certain standard ("98% accuracy target") for a
given amount of money ($200M) -- though obviously if FB were to simply pay
Cognizant more for the contract, there's no guarantee Cognizant would use that
money for better worker pay/benefits (as opposed to giving bigger bonuses to
executives, for example).

In the article, one of the contractors says that Cognizant puts up a "dog-and-
pony show" whenever FB executives visit. Again, it's ultimately up to FB to
decide how much they want to push past the facade.

~~~
MockObject
Why wouldn't the buck actually stop with Cognizant management? FB isn't
demanding these horrible labor practices, Cognizant is.

~~~
empath75
They’re paying rates that more or less require it.

~~~
Panini_Jones
Facebook is? What rate does Facebook pay and how much are these Cognizant
employees getting paid?

~~~
chris11
The assumption is that Facebook is selecting the contracting agencies based on
performance or cost. If Cognizant's performance doesn't meet Facebook's
standards, they will get dropped; the same thing will happen if Cognizant
isn't competitive on price.

This downward pressure ends up directly impacting moderators. Cognizant needs
to keep payroll costs low so they don't lose the contract, and the contracted
accuracy target of 98% seems unrealistic. So moderators end up fearing for
their jobs when they don't meet accuracy targets.

------
arethuza
Just a warning - I found even a short description of some of the videos they
had to watch fairly disturbing.

I don't think I could do that job for very long - let alone in a badly run,
high pressure environment with low wages.

~~~
Pigo
I don't see how an average person could be expected to witness some of the
things mentioned in the article. I didn't have time to read the entire
article, but do they have counselors on staff or something?

That one paragraph about organs was enough to ruin my day, and it was just
text. I'm surprised such a "rash of videos" wasn't in the news somewhere.

~~~
arethuza
From the article:

_He sought out the on-site counselor for support, but found him unhelpful._

_“He just flat-out told me: ‘I don’t really know how to help you guys.’”_

~~~
colpabar
So the counselors were probably an afterthought. I'd even bet the decision was
made not as a way to keep the moderators from going insane, but to protect the
company's image.

~~~
enraged_camel
I mean, a counselor can only do so much in this type of situation, since the
patient is not in a position to remove the negative stimuli from their lives.

------
arethuza
Maybe Facebook should make all of their own employees do 15 minutes of
moderation per day - just to share the pain out a bit....

~~~
asark
Make the _users_ do a little every so often. Like Slashdot's "hey, go moderate
some posts" thing but forced to before they can use the platform anymore.
That'd be fun.

Of course the real answer is that this sort of site is a bad idea and
shouldn't exist. Wide-open signup and public visibility of content, or ease of
sharing it with strangers. Bad combo, don't care how much money it's making
them (and other, similar sites).

~~~
commandlinefan
> Make the users do a little every so often.

Oh, dear god, no - have you ever used Reddit? This is what happens when you
outsource moderation to the sorts of users who enjoy moderating other people.

~~~
nvrspyx
Have you ever used Facebook? It’s much worse and this is specifically in
reference to media that are against Facebook’s guidelines (e.g. nudity, gore,
etc), not all content as a whole. Users doing some moderating in this context
would simply be presenting flagged media randomly to users and asking if it
breaks the guidelines.

Two very different things. Subreddits would be the equivalent of Facebook
Pages, and they already have moderation tools for those that run the page.

------
r3vrse
Dipping into whimsical analogies: this is a digital abattoir where the meat =
content.

Now, as before, no-one wants to see how the sausage gets made. Especially
those selling it.

Can't kill demand or bear the visceral truth. So instead we'll pretend the
seedy underbelly doesn't exist. Paper over dissonance with ethical codes and
platitudes.

Not new. Just a context shift in production of sustenance for the collective,
insatiable gaping maw.

~~~
Cthulhu_
Yup; bear in mind that pretty much ALL major user-generated content websites
have to deal with this - think Google (and I now wonder if G+, Picasa, etc
were shut down because they couldn't handle the inappropriate content
anymore?), Dropbox (who may do a passive one where they only investigate if
reported by law enforcement), Youtube (also Google), Discord, Slack, etc. It's
a problem everywhere.

I also believe this is one of the reasons where Facebook's real name policy
comes in - it discourages people from posting the worst of it. Animal
brutality could just be kids fooling around on their phone, accidental, but
produced child pornography is not, and the people making that shit know full
well that they shouldn't put it on sites that require them to use their real
names.

------
dalore
If they made engineers take turns working in content moderation, you would
soon see all sorts of improvements (like the aforementioned ability to
recognize duplicate content, for starters).

They would hate it so much that they would build better tooling. But right now
they don't have to know about it; they just send images to the moderation team
over and over as if the moderators were robots.

~~~
Nasrudith
Couldn't it just be done in a more pleasant and efficient way by tasking them
with building moderation-assistance tools, instead of misplaced retribution?

While putting themselves in the moderators' shoes may prove helpful, the two
roles aren't comparable in skillset. If moderators were given a command-line
interface, for instance, most would be puzzled. "Produced for self" and
"produced for content moderators" are differing needs.

~~~
oarsinsync
Dogfooding isn't really retribution, so much as a reasonably effective way to
better understand and capture requirements.

Using the things you produce generally helps motivate ironing out bugs and/or
improving the product overall.

~~~
michaelt
People in this discussion are misunderstanding one another because no-one's
making a clear distinction between "Engineers/managers responsible for content
moderation and with the power to improve the tools should have to do some
content moderation themselves" and "All engineers/managers should do content
moderation"

Andy the content moderation tools developer could certainly benefit from
experience using the tools he's working on.

On the other hand Bob the Unix admin who keeps the CI cluster online can't
iron out bugs in another team's product no matter how hard you motivate him.

Needless to say, if someone writes about Andy but a reader thinks about Bob,
the gap in whether the explanation makes sense might leave the reader
sceptical about the writer's claims, or even their true motivation.

------
james_pm
There were many reasons that led to me deleting my Facebook account in May,
2018. The fact that the platform is a magnet for the absolute worst of
humanity and then employs and exploits people in such an inhumane and cavalier
way to filter the garbage out was high on the list. Get Zuckerberg to do the
job for a few hours and see what he thinks.

~~~
lallysingh
I find that I haven't missed anything outside of people I haven't actually
talked to in 10-20 years. Which, frankly, is easier. It's nice to let my
social past stay in the past.

------
Verdex
So ... we all need to start flagging beautiful nature scenes, humorous comics,
lists of health tips, and job postings for less stressful jobs that you could
do if you're already qualified for being a facebook moderator?

At least that way they get a bit of a break from all the other horrible stuff.

~~~
panic
This is a great idea if you could get people to do it at scale -- it raises
awareness, improves the lives of the moderators, and forces Facebook to deal
with the increased volume of reports all at the same time.

------
ljm
I think this is also a symptom of the US's shocking attitude to workers'
rights.

The article says staff were constantly reminded of how easily replaced they
were, which is a euphemism for “you’re lucky we gave you a job.”

Facebook and Cognizant are majorly dropping the ball but US government and
legislation strongly enables that. As do individual states.

From a European perspective this is a sad article to read about work
conditions in “the greatest nation in the world.”

------
neuro
Social Media Content Moderation Team Lead

Cognizant Technology Solutions

Tampa, Florida

• This is an exempt position, requiring day, evening, weekend, and holiday
shifts, as this delivery center is operational 24/7 and 365 days a year.

Cognizant is seeking a team of strong Team Leads to manage a team of social
media content moderators for a global social media organization.

The Team Lead will be responsible not only for managing the day-to-day
operations of the team, people management, and performance management, but
also for helping the client determine gaps in processes, identify innovative
ways to solve problems upstream, and scale our operations.

Ideal candidates will be comfortable understanding social media, have an
appetite for research and gathering data insights, a high level of comfort
working with cross-functional partners, and a strong analytical mindset.
Successful team members have a passion for business success, strong attention
to detail, analytical problem-solving abilities, a high level of team
motivation, and a keen eye for operational inefficiencies.

Responsibilities:

• Provide mentorship, guidance and career development to members of your team

• Lead a high-performing team through an exciting transition to build problem
solving, critical thinking, analytical and technical capabilities which will
enable the department to develop deeper, more scalable solutions

• Team management responsibilities for a market team, whilst also serving as a
cross-functional and a global liaison in developed areas of expertise

• Establish team goals and work with direct reports on strategies for
executing, measuring progress and sharing results

• Deliver projects involving quantitative analysis, industry research, and
strategy development, working directly with global cross-functional teams to
problem solve analytical approaches and develop solutions

• Identify actionable insights, suggest recommendations, and influence team
strategy through effective communication

• Advocate for users within their market, partnering with global and
cross-functional teams to develop global solutions

------
bpyne
From the article, we don't know how much a moderator "can take" daily. Perhaps
studies haven't been done; perhaps it's too complicated a subject for good
tests. But they could check with organizations that already have experience
with employees having to watch horrific crimes.

The chief information security officer in my organization came from our state
police where he headed the internet crimes against children division. People
on his team had to watch videos like the ones described in the article.
Despite having all hardened officers on his team, team members regularly cried
at their desks while watching videos. His team members had to be taken off-
duty for weeks every six months to recover. They had mandatory counseling even
when off-duty.

I would think FB and other large companies could look at the measures taken by
police forces with similarly disturbing jobs as a guideline for the
contractors.

------
arzeth
I have a hotkey in i3wm:

bindsym $mod+backslash exec "xcalib -i -a"

which inverts all colors (btw, I have to `killall redshift` because of a bug).
When colors are inverted, I feel almost undisturbed at any gruesome content
(for me it's like seeing screenshots of Quake 2 with buggy GPU drivers), yet I
can still easily recognise whether the content is disturbing. And when it's a
video, I play it at ≥2x speed so that it wouldn't feel realistic for my brain.

I wonder whether those moderators use these two lifehacks.

~~~
bamboozled
It seems like moderating audio is also part of the job and equally disturbing
as the video.

Not sure how you get around that.

~~~
knd775
Chipmunk voices? Might make it feel less real

------
deogeo
Yet another case where NDAs are abused to cover up corporate misdeeds. I think
their enforceability should be _severely_ restricted, with penalties if a
lawyer includes them in a contract despite knowing they are invalid, so that
invalid terms can't be used to scare workers not familiar with the law.

------
spunker540
“But had his managers asked, they would have learned that Speagle had a
history of anxiety and depression“

Should employers really be asking about mental health history during the
hiring process?

~~~
mpclark
Of course, if the job involves risks to mental health!

~~~
cloakandswagger
Cue up the next Verge article in 6 months: "Facebook discriminates on the
basis of mental health"

~~~
SomeOldThrow
I mean they really should be in this case. This is not a normal job.

~~~
bena
The real solution is that they present the reality of the job and let the
candidate make the decision to proceed or not.

That way the company presented the job as is. And if the candidate takes the
job, they own some of the responsibility for the consequences.

~~~
SomeOldThrow
I can certainly agree with that.

------
comboy
> But as the weeks went on, the video continued to reappear in his queue (..)
> They kept reposting it again and again and again.

Seems like it should be pretty "easy" to spot similar videos and audio (even
after some modifications), given what a great dataset all those moderators are
providing.

I'm pretty sure there are some pretty smart folks working at fb, so it would
mean that given accuracy standards it's still cheaper for them to hire humans
to do the job.

~~~
Buttons840
That part of the article says they allowed the video of animal abuse to remain
visible so that law enforcement could do something about it. Of course, that's
bullshit for many reasons.

Ban the user and the video; if the same video is posted again, autoban it. If
the user is law enforcement, allow them to see the details of all related
content that was banned.

There are neat things called if-statements that could do that.
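The autoban flow described above can be sketched in a few lines. This is a
hypothetical toy, not Facebook's actual pipeline: it fingerprints uploads with
an exact hash, so it only catches byte-identical reposts; a real system would
need perceptual hashing (PhotoDNA/PDQ-style) to survive re-encoding or
cropping.

```python
import hashlib

# Toy sketch of hash-based autoban (hypothetical, not Facebook's pipeline).
# Exact SHA-256 matching only catches byte-identical re-uploads; a real
# system would use perceptual hashes to catch re-encoded or cropped copies.

banned_hashes = set()

def fingerprint(video_bytes: bytes) -> str:
    """Exact-match fingerprint of an upload."""
    return hashlib.sha256(video_bytes).hexdigest()

def human_ban(video_bytes: bytes) -> None:
    """A moderator's one-time judgment: remember this content as banned."""
    banned_hashes.add(fingerprint(video_bytes))

def moderate_upload(video_bytes: bytes) -> str:
    """Reposts of already-banned content never reach a human again."""
    if fingerprint(video_bytes) in banned_hashes:
        return "auto-banned"
    return "queue-for-review"
```

With something like this, a distressing clip is judged once; every identical
repost is handled by a set lookup instead of another moderator's queue.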

~~~
Cthulhu_
Yeah, I mean Facebook can open up a portal for all local and national police
precincts and law enforcement that goes "This is the content that our systems
and content moderators marked as illegal, this is the exact location, date,
time it was recorded, our facial recognition software identified these and
these individuals", etc.

However, I just also described how Facebook could easily be used in a
dystopian police state fashion. I am fairly sure they aren't allowed to
proactively report crimes. And that law enforcement would be overwhelmed by
the sheer amount of reports coming in all at once.

~~~
Buttons840
Allegedly they are already altering their behavior to assist law enforcement.
I'm not suggesting they put a lot of effort into helping, but being able to
remove a video from the general public without stopping law enforcement
investigations seems reasonable.

------
the_duke
Couldn't violence (both towards animals and humans) be detected in an
automated fashion?

Screams in the audio, especially, should be fairly easy to find.

Then block those videos by default, with a manual appeals process that sends
it to a moderator, combined with a big warning that submitting videos against
the TOS will get you suspended. Of course this could lead to people submitting
videos without audio, but this will always be a cat and mouse game.
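As a purely illustrative sketch of that "block by default, appeal to a human"
flow: the loudness-spike check below is a made-up stand-in for a real
classifier (actual scream detection would need a trained audio model), and all
function names here are hypothetical.

```python
# Toy triage flow: auto-block suspicious audio; humans review only on appeal.
# The loudness-spike heuristic is a stand-in for a real ML classifier.

def rms(window):
    """Root-mean-square energy of a window of audio samples."""
    return (sum(s * s for s in window) / len(window)) ** 0.5

def has_loudness_spike(samples, window_size=4, ratio=5.0):
    """Flag clips where one window is far louder than the clip's average."""
    windows = [samples[i:i + window_size]
               for i in range(0, len(samples) - window_size + 1, window_size)]
    energies = [rms(w) for w in windows]
    baseline = sum(energies) / len(energies)
    return any(e > ratio * baseline for e in energies)

def triage_upload(samples):
    # Blocked videos only reach a moderator if the uploader appeals.
    if has_loudness_spike(samples):
        return "blocked-pending-appeal"
    return "published"
```

The point of the design is the default: false positives cost the uploader an
appeal, rather than costing a moderator a viewing.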

Or is FB under legal duty to review potentially criminal content?

~~~
dexen
_> Couldn't violence (both towards animals and humans) be detected in an
automated fashion?_

Even if it could (and parallel posters argue it cannot), there is a meta-
problem: "Who is allowed to post violent content?". Violence is contextual,
and also legality of posting it is contextual. Same goes for other
objectionable materials (nudity, extreme ideologies etc.).

Case in point: if _all_ users were subject to the same criteria, the big names
in media would quickly get banned. And/or they would go on the warpath against
Facebook, accusing it of heavy-handed censorship against the freedom of the
press.

Same for reportage of popular events with any degree of nudity. A news piece
about a Femen protest, a Pride event, a World Naked Bike Ride, maybe even the
No Pants Subway Ride would cause a ban.

Likewise, anybody documenting & discussing historical events, and anybody
documenting & discussing present-day uprising, revolutions, civil wars,
persecutions, etc., would quickly get banned, essentially sweeping a lot of
abuse under the rug. Probably even discussion & documentation of domestic
violence would have this problem.

As a lighter aside, cartoonish violence (video games & movies) could also
easily fall prey to automated violence take-downs.

All in all, Facebook really really wants to give certain users (mostly press &
historians) broad, sweeping exceptions to the general rules.

~~~
lallysingh
Train with signals about the sender?

~~~
the_duke
Also my first thought, but thinking it through, this can easily lead to
(potentially valid) criticism of intrusive profiling and censorship.

~~~
raxxorrax
And that would be bad because...?

------
hirundo
> Nobody’s prepared to see a little girl have her organs taken out while she’s
> still alive and screaming.

On the one hand, of course you don't want to see this and of course you want
it removed from your social media feed.

On the other hand, if it's hidden, it horrifies fewer people, and instead of
rising social pressure to take action against it, it can fester in the dark.

So if we manage to replace a significant chunk of centralized, moderated
social media with decentralized, unmoderated alternatives, many of us will be
exposed to more of this kind of evil. But as a result more of us will be aware
of it and motivated to fight it.

I'd rather participate in unmoderated media, even at a greater risk of being
assaulted by this kind of crap. But at this extreme I can sympathize with
those that want straight up censorship, even if sunlight is a better long term
disinfectant.

~~~
pjc50
Context matters: it's this kind of thing that was used to promote the
massacres of the Rohingya. Or consider the various things that are posted
about Syria. Pictures of atrocities without clear attribution of those
responsible just make the situation worse and provoke reprisals against the
wrong people.

------
solidsnack9000
_I asked him what he thought needed to change.

“I think Facebook needs to shut down,” he said._

------
neuro
This seems to be the place

Cognizant Technology Solutions, Woodlands 2, 7725 Woodland Center Blvd, Tampa,
FL 33614

There's another big story here, and it may get more horrific: from browsing
their phone directory, most of their "employees" appear to be of South Asian
nationality. Given the circumstances and their predatory behavior, I imagine
they are also taking advantage of H-1Bs.

~~~
dRaBoQ
Where do you see their phone directory ?

------
martin1b
The issue is not so much Cognizant or FB as it is the state of the public. The
horrible acts posted online for entertainment by the public are the reason
companies like Cognizant exist. I hope these posts can be forwarded to local
law enforcement and used as evidence against the poster.

~~~
cldellow
The article implies strongly that Facebook prefers to focus on the short term
task of blocking offensive content and only pays lip service to the long term
task of pursuing criminal charges.

I'd definitely believe that that is the case, as there's little economic
incentive to play the long term game with its uncertain payout if you can just
churn through low-wage content moderators instead.

Users of Facebook, but more realistically, employees of Facebook are the ones
best situated to pressure Facebook to change that behavior.

~~~
britch
There's also the possibility of passing a law around this issue. It doesn't
have to be entirely market forces.

------
ycombonator
Cognizant is a ‘win at all costs’ major outsourcer. Their entire operations
and staff are based in India and they are registered in US to soften the
regulatory hurdles. It’s no surprise they didn’t have a defibrillator in the
building.

------
jokoon
I would have thought they would have the best filtering AI and dataset the
world has to offer, to avoid having those moderators work like this.

~~~
Avshalom
a lot of "AI" is just poor people.

in general response to this article there was a thread on twitter with some
good links
[https://twitter.com/max_read/status/1141336189800239107](https://twitter.com/max_read/status/1141336189800239107)

------
Balgair
I feel that we're going to view these places and these practices the same way
we view children in coal mines or young women working with radium paint. I've
got a feeling that though we know very little about this now, our children and
grandchildren are going to know a lot about this.

Dear Lord, what a horrible thing.

------
luckylion
There seems to be a very distinct difference between employee classes.
Management and high value tech employees are treated very well while
moderators, gig workers etc are treated as "human resources", quite literally.

It seems to me that they put companies like Cognizant between them and the
exploited workers to deflect criticism as "we didn't know, we've been told by
our partners that everything is great".

And on a technical level: if true, how can it be that FB needs the same videos
and pictures to be moderated over and over? Are they not using any form of
content id? Is developing such a system more expensive than running content
moderation sites and then swiftly firing the burnt out moderators?

At what point do you become complicit if you're working for Facebook?

~~~
jsgo
The only problem I see is that someone could potentially upload a lower-
resolution version of the video a moderator has seen and perhaps bypass the
content-ID system (I don't know how tolerant those systems tend to be of
situations like that).

There's also the aspect that if we make it search for distinct frames (like
"scene change" type things), there's the possibility that some safe things
could be auto-moderated, which would be another controversy. Perhaps they
could locate start/stop times for inappropriate elements of the source video
and then check later uploads for any of those segments, but then instead of
asking human moderators to discern "is this video good or bad," we'd be
forcing them to record timestamps of when the bad elements take place, which
would force them to _really_ watch the video.

Dunno, I'm of the belief that it's just not an easily solved problem. I hope
I'm wrong and they can toss money at the problem to fix it, but I genuinely
don't know how they could.

~~~
luckylion
Even with lower resolutions, fingerprinting (on a frame level) should still
work relatively well, shouldn't it? If you get a certain level of confidence
on each frame - say 80% of the frames match $someVideo with 80% confidence -
that ought to be good enough overall to avoid false positives while still
catching re-uploads. How many 30s videos will randomly share 80% of their
frames with a 90s video of an animal being abused?

I assume that most of these videos aren't manipulated by professionals to
escape a ban by looking completely different (which actually would make it a
different video), but that they are mirrored, cut, have extra material
appended or, as you suggested, have their resolution changed. I'd guess that
videos are easier to ID than, say, image data, because you have plenty of
material to work with.

Are there well-organized troll armies attacking Facebook? Because trying to
sneak a video in that shows an animal being abused and doing sophisticated
changes to evade detection sounds weird on a platform where there isn't even
(as far as I know) monetary gain connected to posting a video (as it might be
on YT or for Email spam).
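
That frame-confidence scheme can be sketched with a toy perceptual hash. Everything here is illustrative (the hash, the distance threshold, and the match fraction are made-up parameters, not Facebook's actual content-ID system): each frame reduces to a 64-bit difference hash, and an upload is flagged when enough of its frame hashes sit close to a known-bad video's.

```python
# Toy frame-level fingerprinting: a "difference hash" (dHash) per frame,
# Hamming distance between hashes, and a match fraction over all frames.

def dhash(pixels):
    """pixels: 8 rows x 9 columns of grayscale values (0-255).
    Returns a 64-bit integer built from left-to-right brightness gradients,
    which survives rescaling better than exact pixel comparison."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left < right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def frame_match_fraction(known_hashes, upload_hashes, max_dist=10):
    """Fraction of uploaded frames whose hash is within max_dist bits of
    some frame of the known-bad video."""
    if not upload_hashes:
        return 0.0
    hits = sum(
        1 for h in upload_hashes
        if any(hamming(h, k) <= max_dist for k in known_hashes)
    )
    return hits / len(upload_hashes)
```

With something like this, "80% of frames at 80% confidence" becomes a simple threshold on the returned fraction, and a re-encoded or rescaled copy still matches because the per-frame gradients barely change.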

~~~
jsgo
I started playing FFXIV a bit over a month ago. During that time, someone in
the Novice Network (the new-player advice channel in game) posted a "does this
look like an okay PLD build?" message with a URL. I clicked the URL and it was
a video of something being held down and attacked with a hatchet or something,
and it sounded like a child. (As soon as I realized what was going on, after
like 1-2 seconds, I closed the video, then reported the user and blocked them.
It was a bit much and not the type of content I'd seek out, so that's probably
why the details are hazy.)

Reason I give that anecdote is for the last part: I don't think it is about
monetizing this type of content and that they're only after the shock factor
of it.

Again, maybe they could automate safeguarding against it, but my guess is
they'd find a way to get through. Like take a 5 minute YouTuber video and
inject a slightly modified version of the shock video in it with hopes it
bypasses the filter. Don't know, but wouldn't put it past anyone that would do
this sort of thing.

~~~
luckylion
That sounds tough, and you were lucky that you reacted quickly.

> Reason I give that anecdote is for the last part: I don't think it is about
> monetizing this type of content and that they're only after the shock factor
> of it.

Possibly, I assume so as well, but I believe there's a difference in
commitment. If I'm making money off of something, I'm going to put a lot of
effort (at least until I approach $0) into evading filters, and can easily pay
for advanced technical solutions (possibly hand-crafted for my purpose). If
it's to mess with people, my effort and monetary investment will be much
smaller. If I need to put significant time into each video upload only for it
to be blocked after a handful of views, it doesn't scale. Trolls are an issue,
of course, but video seems to be so hard to create/manipulate at scale and
easier (I think) to check (compared to text, for example), at least until
there is something like GPT2 for video.

------
dsfyu404ed
I've worked some shit jobs over the years, and I can't say I wouldn't be very
tempted to try working there for $15/hr if I were still in the market for that
kind of job and everything else around paid closer to $9. I know HN likes to
grandstand about how important workplace conditions are, but when you're doing
unskilled jobs you don't have that luxury. If you get the opportunity to do a
worse job for 66% more money, you take it and try to do it well enough to keep
cashing that check.

I know that a lot of people here are probably bothered by the fact that the
workplace sounds like an impersonal zoo, but most minimum-wage workplaces are
on that spectrum. Dirty bathrooms, inflexible policies for everything, etc.
are normal. It's just how workplaces like this are. If you want a shitty job
that's more personal and flexible, you need to work for a small employer, but
that can have its downsides too.

Sure, the content itself is particularly awful, but that's why they pay bigger
bucks than everyone else. If you can cope, good for you; if you can't, you
leave. It's just like a call center, but more extreme.

Don't mistake me as defending Facebook or Cognizant here; I'm not. I'm just
saying that, all things considered, this doesn't seem like a particularly bad
deal as far as shitty unskilled jobs go. They all have their pros and cons,
and you've got to find something that works for your personal preferences. I'd
take digging ditches over anything with rigid corporate policy about every
facet of my job, but that's just my preference.

Personally, something I think would help a lot would be if they'd over-staff
the place, schedule people for few enough hours that they can easily hold
another job, and make efforts to accommodate people's other commitments. Doing
content moderation every day is probably where a lot of the mental and
physical health issues come from. In my experience, the kinds of jobs that
really grind you down grind you down a lot less when they're your side gig to
some other slightly less shit job.

------
almost_usual
Facebook really doesn’t need to exist.

------
mengmeng
Dear Facebook Employees and Zuckerberg: More than the other megas, you claim
moral highground. You have a passion culture where people `actually believe`
they are doing something great. To this day, even with the retarded masses and
corrupt congress thinking something is wrong, you `still believe` this. So
where is the fruit of your labor? Where have the blessings of your existence
improved the lives of the unfortunate? where? where? im asking you right now,
reading this, looking into your eyes. where?

well we've heard the line that the tech isn't there yet. or how you have
learned from past attempts to help and are taking more targeted approaches.

ok, that works for the retarded masses and corrupt congress.

but I call bullshit. it's actually more than that; it's malicious, intentional
lying.

you know how you can help? here's a really easy way: take some of your money
and give it to these slave/content-moderators.

100,000 to you is not 100,000 to them.

how much would you need to pay me to watch that shit? zuck man, you can't give
me your salary to watch that. that shit damages you for life. and you think
it's worth whatever?

yall walk around (figuratively, on the internet) like you have something to be
proud of. donald abraham is confident as shit.

but you know, yall are just like any other mega. sure, each mega has its own
demographic, but the entire reason for the existence of a mega ensures 100%
that you will engage in unethical behavior. and you will recreate the
extremely oppressive society we live in every day. Furthermore, you will
recreate the belief in the opposite: things aren't so bad and we `good
peoples`

------
b3lvedere
"He watched videos of people playing with human fetuses, and says he learned
that they are allowed on Facebook “as long as the skin is translucent.”

Wow. Just Wow.

~~~
lazugod
This sounds like a fantastic example of basing moral guidelines around what is
easy for a computer to detect rather than what is actually moral.

We know that FB has nudity detection systems, due to their past problems with
banning breastfeeding discussion groups.

~~~
LucasLarson
Unfortunately, it’s human work. The paragraph reads “For the six months after
he was hired, Speagle would moderate 100 to 200 posts a day. He watched people
throw puppies into a raging river, and put lit fireworks in dogs’ mouths. He
watched people mutilate the genitals of a live mouse, and chop off a cat’s
face with a hatchet. He watched videos of people playing with human fetuses,
and says he learned that they are allowed on Facebook ‘as long as the skin is
translucent.’ He found that he could no longer sleep for more than two or
three hours a night. He would frequently wake up in a cold sweat, crying.”

------
doctorRetro
“I think Facebook needs to shut down."

As I've spent the last few years pissing away absurd amounts of time on the
platform, gotten in countless fruitless arguments, and seen the truly vile and
toxic elements of my communities exposed and worn like a badge, this is an
idea I've been thinking an awful lot about. After reading this article, I've
never been more certain of that statement.

~~~
hvs
"We have met the enemy and he is us."
[https://upload.wikimedia.org/wikipedia/en/4/49/Pogo_-_Earth_Day_1971_poster.jpg](https://upload.wikimedia.org/wikipedia/en/4/49/Pogo_-_Earth_Day_1971_poster.jpg)

Shutting down FB will just create this problem in the next popular social
media platform. This is a problem that needs to be solved, and shutting
companies down won't do it.

~~~
50656E6973
That's like saying there's no point in putting out house fires, because
there's always going to be somewhere else that catches on fire.

When a social gathering (whether its a party at a club or a riot in the
streets) exceeds capacity and becomes dangerous, destructive, and out of
control the police shut it down for the good of public safety.

~~~
hvs
That's the opposite of what I'm saying. I'm saying you need to put out the
fire not just let it burn down and build a new house. Houses burn, find a way
to put them out, don't just plan on creating new ones.

~~~
50656E6973
There comes a point when a fire gets so massive that it's foolish or
impossible to try to put it out, as doing so would just endanger more lives.
Human lives are more important than buildings.

------
firefocus
Basically, you get paid $15 an hour to poison your subconscious mind.

Absolutely unethical. Facebook should not be allowed to hide behind
subcontractors.

------
prirun
The article mentioned that "In May, Facebook announced that it will raise
contractor wages by $3 an hour". If Facebook is dictating the wages of content
moderators, then it is an employer, and all of these "contractors" should be
reclassified as such, with Facebook paying all back wages, benefits, etc.

Here are some other ideas:

Instead of having content moderators watch 200 of these videos per day,
distribute the load over the entire Facebook workforce. Or, when a content
moderator flags a video for removal, verify that by making a regular Facebook
employee watch the video to confirm the removal. This should increase the
accuracy rate above the 98% threshold Facebook has set. That should put a
finer point on the problem.

I could barely read descriptions of these videos. There's no way I could watch
one, let alone 200 a day. This work is like being in a war, and they should be
paid like a soldier who is risking their life / mental health.

------
apanloco
800 workers, 1 bathroom. Really?

~~~
dillonmckay
Sounds like my current HQ.

I have worked for too many startups with inadequate bathroom setups.

It is a big red-flag for me now.

~~~
joezydeco
I've used this as a hidden criteria for judging a company during an interview
cycle. I always ask to use the bathroom and then see how clean it is, and if
it looks like it's been maintained.

I visited one private company where I talked to the owner - his office was
plastered with pictures of his racing Porsche but the bathroom had rusted
metal and peeling paint. Took me about 5 seconds to see where his priorities
lay and I bugged out of there.

~~~
dillonmckay
The other red-flag was seeing all the framed photos on the CEO’s desk, of his
family.

All the pictures were facing away from him, so it seemed like he was
advertising his family to anybody sitting in front of his desk.

Maybe he didn’t like looking at them?

Very bizarre.

~~~
joezydeco
Wow, that's messed up.

All I can figure is that he's trying to soften or deflect anger at him when
people come into the room. How can you be mad at him when he's just a simple
family man?

------
thtthings
All this content reprograms our brain. Even a sane person doing content
moderation for FB will be screwed up pretty soon.

I am so glad I haven't been on Facebook since 2008. If you think it's not
messing with you, THINK AGAIN. Facebook is the worst thing that happened to
us, except for Mark and its employees.

------
negamax
This is done for YouTube as well, btw. Not an FB-only problem.

------
cronix
Don't fret, FB mods. You'll soon be replaced by algorithms so you won't have
to do this much longer. You know, by those people making a min of 4x what you
do and look down upon you. They'll tell ya to learn to code as they chuckle
under their breath.

------
close04
> Florida law does not require employers to offer sick leave, and so Cognizant
> workers who feel ill must instead use personal leave time.

It seems that Facebook and Cognizant are not the only ones completely failing
at protecting their people (employees, contractors, citizens).

~~~
save_ferris
Lobbying plays a huge role in how legislative decisions like this are made.

The city of Austin recently passed a mandatory sick leave policy, only to be
struck down at the state level after lobbying by employers.

It’s not that companies like Facebook simply fail to protect their people,
they’re financially incentivized to undermine their rights. And since money ==
free speech in the US, it’s perfectly legal for them to do so.

~~~
close04
> they’re financially incentivized to undermine their rights

In the end most failings can be traced back to financial interest (getting
cheap, staying cheap, aiming for cheaper). But the whole article was about how
Cognizant and Facebook fail those poor souls who have to go through a
"peaceful war", with all the gruesome tragedies but none of the preparation.

The state doesn't seem to try harder either. This particular detail related to
sick leave was surprising to me, I hadn't expected this.

------
aerovistae
Where are the organ harvesting videos coming from, I wonder? Anyone have any
thoughts on that?

------
gerbilly
This might be an opportunity for facebook to do a lot of real good for the
world.

They should have an investigative unit that tracks down the source of this
material and partners with law enforcement so we can catch the bastards doing
and posting this horrible stuff.

------
skizm
They should pay users small amounts of Libra for accurately moderating
content. It completely solves the problem and they get to pay their new, more
willing, contractors in what basically amounts to monopoly money (in the near
term at least).

------
ralphstodomingo
I'm sure such scale requires heavy moderation. It pains me to realize the cost
of enjoying the benefits of social media (if there are any at all); it makes
me wonder whether we should have it in the first place.

I can imagine a world without Facebook.

~~~
overthemoon
This is what I keep coming back to. I don't think Facebook is worth it. I
would personally extend this to most, if not all, social media.

------
s_dev
I live in East Wall, Dublin -- beside the Facebook moderator (The Beckett)
building.

I wonder if this is true for them as well or if it's just N. America -- we've
better employment protections and minimum wage in Ireland compared to the US.

~~~
anon029102
AFAIK Content reviewers in the Dublin office work for Facebook as perm
employees (are not contractors), and mostly deal with escalated cases from
offsite contractors.

~~~
disgruntledphd2
Nope, that's not true (unfortunately). Most of the moderators in Dublin
(approx. 75%) are contractors, but there are a lot more full-time moderators
in Dublin, as global moderation (ex-US) is run from that office.

------
ryanmarsh
I read the bit about the children having their organs harvested while alive
and didn’t believe it so I did some googling and now I’m done for the day. I’m
gonna go hug my kids.

~~~
kilroy123
I'll take your word and skip on the googling myself. Seriously disturbing
stuff.

------
starpilot
How much of this originates with FB/Cognizant? This seems mostly
characteristic of low-paying work environments in general. You're not going to
attract the cultural elite at $15/hour. You get desperate people with patterns
of maladaptive behavior and dysfunctions. You get horseplay, various illnesses
that come from bad home environments, and overall balls of stress that are
just looking for a job, any job.

------
gopher2
Would be interested in seeing some kind of legislation where content
moderation jobs that deal with XYZ categories of content must be compensated
with at least some percentage of downtime/recovery. So e.g. in an 8-hour day
you can only spend 4 hours doing moderation, with the other 4 hours as 'paid
mental preparation' for doing the moderation work.

------
atishay811
I imagine Facebook could use its users to moderate content from others: say
you sign up for moderation, then have to moderate 20 posts from unrelated
users to get 20 days of ad-free Facebook. Facebook could employ a system like
the one reCAPTCHA uses to identify users faking it.

Win-win?

~~~
dRaBoQ
Facebook won't be able to install the necessary measures to prevent the
moderators from saving the videos onto their own machines.

A lot of the bad content is highly desired by criminals and can even be
profitable to sell on the dark web (e.g. child abuse).

------
toss1
>> "Speagle vividly recalls the first video he saw in his new assignment. Two
teenagers spot an iguana on the ground, and one picks it up by the tail. A
third teenager films what happens next: the teen holding the iguana begins
smashing it onto the street. “They beat the living shit out of this thing,”
Speagle told me, as tears welled up in his eyes. “The iguana was screaming and
crying. And they didn’t stop until the thing was a bloody pulp.”

>>"Under the policy, the video was allowed to remain on Facebook. A manager
told him that by leaving the video online, authorities would be able to catch
the perpetrators."

Utterly disgusting and inhumane policy on facebook's part.

Continuing to display abject animal cruelty only lowers the bar for would-be
imitators.

There have also been shown to be strong links between animal cruelty and human
cruelty, including murder.

To be clear, the ONLY proper way to handle this is to immediately take it down
and file a report with the relevant police agencies. Expecting the local
police agencies to maintain the same staff of 30,000 people to monitor FB
posts for crime is stupidly absurd.

This is at best depraved indifference on FB's part, and more likely a
deliberate dishonest rationalization to keep posted something extreme that
will get lots of 'views' and 'engagement'.

This is only one of millions of examples, including cooperating with the
Russian / Cambridge Analytica / etc groups to corrupt elections in the US,
England, and elsewhere.

Simply put, Facebook is deliberately poisoning society for profit, and far
worse than any tobacco company ever did.

They need to be shut down. Now. There are far better ways to do everything FB
claims to do.

(edit: add police report paragraph)

~~~
iBasher
After trying to find the video, I've found that iguanas are an invasive
species in Florida, and blunt-force trauma to the head is the legal "humane"
way to kill them. Not sure if that's what's going on in the video, and if it
took multiple swings it certainly wasn't done correctly, but maybe look into
it before assuming things.

~~~
toss1
There are plenty of invasive species that I'm 100% in support of eradicating
from their invaded territories. This is no justification for cruelty of any
kind, nor promoting cruelty, which this video and Facebook does by leaving it
posted. No assumptions needed.

Assume that you are 100% correct, the species is undesirable invasive and that
is the proper procedure.

Repeatedly bashing their head while the creature is obviously screaming in
pain is blatant cruelty. Not only do all the previously stated reasons to pull
the video & report it to the police still stand, you provide yet another
reason to pull it -- it shows a horribly bad example of how to do it,
providing negative instruction and promoting bad & cruel technique.

Moreover, based on this and numerous other articles and events (e.g., leaving
posted a doctored video defaming the person third in line for the US
Presidency), this sort of behavior is fully consistent with Facebook's modus
operandi.

Simply, FB are happy to poison society for profit, and disingenuously justify
it in the name of 'free discourse' and by claiming to have no editorial
control.

We must at the very least hold them responsible for their content.

~~~
igravious
User's account is new and the name is "iBasher" (iguana basher?). Proceed with
caution.

~~~
toss1
Oops, yup, I don't even usually check user names, not accustomed to trolls on
here.

Some of the responses and voting patterns are indeed starting to indicate that
an infestation of trolls is starting...

------
Havoc
Even second hand in article format that’s disturbing.

A classic sweatshop operation, except it also destroys the slaves' psyches on
top of it.

Two parties weren't really mentioned: the executives who set up the operation,
and the people posting the content. Both must be pretty twisted.

------
saagarjha
> Work stopped while we were there to ensure we did not see any Facebook
> user’s personal information.

Do users know that their personal information is being looked through by
employees who don't work for Facebook?

------
nonwifehaver3
I haven't used Facebook since 2012 so maybe I'm missing something. When
someone posts a video of them torturing a dog under their own name, or some
other sick thing, does that not cause some sort of problem with their friends
and local community? Many would permanently block and ostracize an
acquaintance for such a thing. Why does nobody call the police on someone like
that, especially when they know their work/address/etc in real life?

I guess I don't understand the key change in the medium or the culture that
requires Facebook's ponderous rule tome and 10000 anonymous content moderators
in Manila or Phoenix. Is this just about the risk of some corporate ad being
next to some undesired content for a few page views?

------
save_ferris
It’s strange how quiet the pro-Facebookers become on threads like these.
Yesterday, we saw a lot of energy around Libra on both sides of the Facebook
spectrum.

I’d love to hear an argument from someone defending this company that isn’t
“everybody does it.”

~~~
holidaygoose
It seems like content moderation is just a crazy wicked problem. And the
company is doing the best it can with a lot of constraints. Better
communication with the workers sounds like it would help, but what else should
they be doing realistically?

~~~
anon029102
> what else should they be doing realistically?

\- Pay them better

\- Give them proper healthcare

\- Don't isolate/silo them

These things would enable them to better deal with the inevitable stress and
secondary PTSD that comes with their work, and would help FB perms to observe
difficulties and quickly effect change.

~~~
bilbo0s
Well, they should be elevated above the other departments to handle the
obvious security and conflict of interest issues that naturally exist in any
company like this. So their salaries should be much higher. And actually, more
important than healthcare, which the employee takes advantage of at the
employee's own discretion, there should be mandatory mental health counseling
and screening. I don't know what the frequency should be, I'm not a clinician.
But my layman's guess would be a minimum of 3 times a year for each employee.

But I disagree about siloing them. I mean, I'm sure there are some pretty good
security reasons for siloing content moderation off from other parts of the
company. Not saying that anyone from FB would necessarily do the following,
but imagine ad sales bonuses start going away because clicks are down. You
just can't have ad sales cooperating with content moderation in any way shape
or form to get more clickable content through. There should just be a content
policy, and content moderation zaps whatever they please. End of story. That's
how it should work. If ad sales wants input, they should have to convince
legal to change the content policy.

In other words, if this thing were structured correctly, content moderation
would be _above_ most everything else. (Everything other than legal.) And
completely untouchable via any mechanism other than an official change of the
acceptable use and content policies.

------
phosphophyllite
Maybe censoring is not effective?

Why not embrace any content and just have police raids when someone uploads
questionable content?

Does censorship solve anything?

~~~
bilbo0s
Um...

police busting down your door for uploading a video kind of _is_ censorship.

But yeah, I'm now starting to see why more and more people are just wanting to
go to the censor and police raid system. A lot of this stuff people are
uploading just doesn't belong in a civilized society. What started off as
maybe just content on making non violent jokes about blacks or gays has
morphed into showing children being disemboweled and videos of little old
black or jewish ladies being gunned down in their place of worship. It's just
gone too far.

Probably just have to file it under:

"This is why we can't have nice things"

~~~
pjc50
> maybe just content on making non violent jokes about blacks or gays

The nonviolent "jokes" have a habit of escalating into real violence. If you
let enough people post N-word rants without consequence and with validation
from their hateful peers often enough they will egg each other on until
eventually someone burns down or shoots up a church.

~~~
bilbo0s
Yeah, I guess that's what I'm starting to see.

Like I said, "This is why we can't have nice things."

~~~
saalweachter
Ultimately you can only have one absolute principle or right; if you have two,
eventually they will come into conflict and you have to decide which is
_really_ absolute and which is just _mostly_ absolute.

------
neilv
Where is the Facebook walkout, to demand humane working conditions for
everyone at Facebook and its contractors?

------
mythrwy
Why are the same videos coming up again and again?

Of everything, that seems like an easy problem to solve.
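For exact re-uploads, it really is: hash the file bytes and look the digest up in a table of prior verdicts, so no human ever has to re-watch a file that has already been judged. (Re-encoded or cropped copies need a perceptual hash such as pHash or PhotoDNA instead; byte hashing is just the simplest case.) A minimal sketch, with hypothetical names:

```python
import hashlib

# Hypothetical verdict cache: digest of the uploaded file -> prior
# moderation decision. Exact re-uploads hash identically, so they are
# auto-actioned without a second human review.
verdicts: dict[str, str] = {}

def digest(data: bytes) -> str:
    """SHA-256 digest of the raw upload bytes."""
    return hashlib.sha256(data).hexdigest()

def moderate(data: bytes, human_review) -> str:
    """Return the cached verdict if this exact file was seen before;
    otherwise escalate to a human once and cache their decision."""
    key = digest(data)
    if key in verdicts:
        return verdicts[key]  # no moderator has to watch it again
    verdicts[key] = human_review(data)
    return verdicts[key]
```

The hard part in practice is matching near-duplicates robustly, not the lookup itself, which may be part of why the easy version apparently isn't deployed aggressively.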

------
pfortuny
I cannot believe they require signing an NDA for this job. You cannot JUST
VENT?

------
wrongdonf
PSA: I watched a video and I got PTSD.

I used to casually browse a subreddit called Watch People Die. In 2016 I
watched a video on that subreddit that gave me PTSD. At that point in time I
had watched probably hundreds of intensely graphic videos, and the total
pieces of intensely violent media I had consumed probably numbered in the
thousands. I had been into it since 2008. I did it because of morbid
curiosity.

At first I scrolled through the comments and noticed something very unusual:
very emphatic comments warning people that videos can give you ptsd. Most
videos have comments where people talk about how “I couldn’t even finish it”
or whatever, so I brushed it off. After watching the video I immediately knew
something was wrong. My body felt strange. My mind was in a state of hyper-
tension or vigilance. It’s very difficult to describe. I also noticed that my
libido would come and go in waves. I would go from not feeling any sexual
feelings to being more horny than I’ve ever been in my life. I knew that
something deep inside of me had been deeply affected. I went to sleep without
much trouble. When I woke up I went into the bathroom to brush my teeth. I
felt something coming over me. A sensation of panic. It surprised me because
it was out of the blue and I’ve never felt something like that before. I then
entered a full blown panic attack, which rocked me so hard that I fled the
bathroom and threw myself on the couch. It passed, but I was drowning in
anxiety and a sensation of doom. At this point I knew that I may have
permanently fucked myself. I was scared. but I still had to go to work. I
spent the next few weeks forcing myself through each workday while being
suffocated by an overwhelming sensation of doom, anxiety and panic. It was the
toughest thing I’ve ever done. But I got through it.

I noticed many things that I later learned are indicative of PTSD: having your
eyes lock up, feeling ready to fly off the handle at the slightest
provocation, an intense desire to subdue the anxiety with alcohol and drugs. I
would walk down the street like an insane person, ready to rage on anyone who
even looked at me wrong, and I had no history of anything like this.

I never had bad nightmares or trouble getting to or staying asleep, so I think
I had some kind of light-beer PTSD. But it was hell on earth. Everything I had
heard about veterans losing their jobs and killing themselves all of a sudden
made so much sense. Take it from me: PTSD is one of the worst things that can
happen to you. And I didn't even have full-blown PTSD.

I got help from a few therapists, and they informed me that if my symptoms
persisted more than a month or something, I would technically have PTSD. Other
than that, the therapists were basically of no help whatsoever. The symptoms
lasted well beyond three months.

As time went on, the symptoms got better. They seem to have stabilized now. If
I’m distracted, I feel normal. But if my mind is idle then my thoughts always
go back to it and with those thoughts comes the anxiety. Long drives can be
uncomfortable. I’m at a state now where I’m in the clear: the symptoms are
weak enough that they don’t threaten my ability to work and bathe and etc. and
my ability to recognize and cope with the symptoms has increased a lot too.
But it still bothers me sometimes and I am keeping my eye out for breakthrough
treatments. SGB and MDMA look promising.

A thought can either be in your mind or not. When I’m feeling symptoms, it’s
almost like the memories are somewhere in my mind, lurking. But other times
they aren’t around. It’s like they have a life of their own. It’s something
you don’t have control over.

The best coping mechanism I’ve found so far is a sort of meditation. I think
that part of PTSD is that your mind is fighting to block the memories and
their emotional consequences. So when I feel symptoms I let my mind be open to
any and all thoughts or memories. I totally relinquish control of my own
thoughts and whatever comes into my mind, I allow it to come and then I watch
it pass on. Opening the mind and simply observing the thoughts. This
dramatically reduces the severity of my symptoms and often leads my mind to
organically become preoccupied with something else.

It’s strange to think that a video can be so dangerous. But they can be. I was
a grizzled veteran of gore videos and I thought surely that if they damaged
the mind, I would have noticed a long time ago. Some videos, especially high
definition ones, can for sure fuck you up. If you have children, don’t allow
them unfiltered access to the internet. I saw this video on Reddit, for
Christ’s sake.

------
unicornherder
I didn’t even realize that they had human mods. Interesting!

------
AnaniasAnanas
NDAs and non-compete agreements should not ever be considered as valid
contracts by the government.

~~~
skybrian
Some NDAs can be too broad, but this is a bad take. It needs to be possible
to hire people you trust not to disclose all your secrets, and your
customers' secrets. This is what privacy regulations are all about. (At
Facebook in particular, disclosing stuff about users is pretty bad, see lots
of news stories over the last few years.)

The balance between protecting privacy and making abuses public is pretty
nuanced and doesn't lend itself to one-bit thinking.

~~~
dustingetz
> needs to be

Nothing needs to be anything, though the world order would certainly look
different and reflect the interests of different classes of people than today

~~~
ghaff
No. But, in the absence of NDAs and other agreements for both employees and
external partners, you'd see a great deal more limits on sharing information
both within and without companies to a strictly need to know basis. Certainly
those limits exist today to a degree because NDAs basically just allow for
consequences. But if you _can't_ keep someone from turning around and sharing
anything you tell them other than through some sort of mutual trust, you'll be
less inclined to share it.

------
anentropic
sounds like the workers need a union

------
DannyB2
> there are two sets of entry-level knowledge workers at Facebook:

> engineers ($150k/year, benefits, upward career trajectory) and

> content moderators ($30k/year, no benefits, likely going to acquire mental
> illnesses).

If Facebook could get away with paying engineers $30K/year, no benefits,
believe me, they would.

~~~
meruru
>likely going to acquire mental illnesses

I find it hard to believe humans are so sensitive that some visual
stimulation is going to give them mental illnesses. We evolved in an
environment where violence was part of real life and a real threat to our own
well-being. Seeing it on a screen may elicit emotions we aren't used to in
modern life, but causing mental illness is a stretch.

~~~
jstarfish
The job is to view an endless parade of the most creatively horrible things
people have ever said to each other or captured on film, all day every day.

The job itself requires you to actually scrutinize the details-- averting your
eyes is not an option. The job is to look at and mentally process each
artifact.

Violence in our origins at least had a point. Kill for survival. Kill for
food. Killing _meant_ something.

Raping children never served any biological purpose, nor does watching actual
people explode. There are many reasons we try to make the world such that
neither is a part of anybody's life.

(ed) Put it this way-- if it's so harmless for people to be exposed to, why
moderate anything at all?

~~~
meruru
Yeah, to be honest I wrote that before reading the article, and I was
painting a much rosier picture than what was described.

I do wonder what the distribution of content they have to moderate looks
like. Is it a continuous stream of violence, or something that appears once
or twice a day?

>if it's so harmless for people to be exposed to, why moderate anything at
all?

Obviously if the public doesn't like it enough, you're going to moderate it
away. Some places don't even allow cleavage, so it's not like something has to
be harmful to be moderated.

------
bksenior
You ever read Upton Sinclair's _The Jungle_? It's a tale as old as time. Just
publicly shame the company and ultimately they change or Congress passes laws.
This is a major reason the media works this way; it's part of American
democracy.

~~~
president
> Just publicly shame the company and ultimately they change or Congress
> passes laws.

This just doesn't work in today's world. If it did, Equifax wouldn't be
thriving the way it still does today.

~~~
bksenior
The effectiveness of types of persuasion change with time.

Eventually there will be something that moves people if it becomes important
enough.

~~~
emn13
Relying on congress to respond to public issues is fundamentally weaker now,
and weakening by the year due to the duopoly in politics: any issue one side
takes up is immediately something that - if possible - is criticized and
turned into a point of differentiation; and media are growing into that too
(online media already are highly polarized).

Whatever small forces of reasonable consensus remain aren't enough to address
issues like this: because let's be honest, it's not really all that clear
exactly what the problem is (there are lots of aspects), how serious it is,
let alone how to address it (if that's even possible). You're not going to get
proper consensus overnight on an issue like that in the best of times, and
it's hopeless now. At best - and I don't think even that's likely - you'll see
some hyper-targeted no worker-abuse kind of laws, but nothing addressing the
underlying dynamic that creates the situation, and something that a creative
business is likely to be able to route around. Essentially a base-pleasing
legislative patch that punts the problem until it reemerges. Given the topic
that's only going to happen if the democrats win both houses.

~~~
crooked-v
> any issue one side takes up is immediately something that - if possible - is
> criticized and turned into a point of differentiation

This kind of "bothsame" response willfully overlooks that one party has gone
to substantial lengths to offer compromises on bills, appointees, and policy
goals, while the other very much hasn't.

~~~
emn13
You're totally right to point out that Republicans are much more shameless in
this regard. I didn't intend to imply the two parties are the same here, but
that's how it reads.

But even if the Republican party and electorate are at "fault" here, I don't
think that fact is going to help anyone, nor do I think Democrats are acting
like little angels anymore (if they ever did). They don't have the power to
stop this dynamic, but they sure are participating pretty wholeheartedly in it
now.

More fundamentally, the idea of open markets and in general an open society
(i.e. the anti-Trump) was in some sense fallacious, and Democrats didn't admit
this quick enough. A rising tide does _not_ lift all boats - not
automatically; at least. And there was precious little care for ensuring
people had the negotiating position to actually profit from the changing
times. And what rebalancing there was didn't actually serve to reduce
inequality, merely to make it sting a little less - but people don't like
being treated as inferiors needing a paternalistic pat on the head.

I think of it a bit like a prisoner's dilemma: sure, it's utterly self-
destructive to do what Trump (and by extension his voters) are doing. But it's
partly a reaction borne of lack of other options: if they're going to get
screwed, then screw everyone; time for a reboot. Only: that reboot isn't in
sight, and all kinds of other nasty social habits are coming along for the
ride.

I'm exaggerating a little as a point of debate, but you might say that the
liberal elite (say) 15-30 years ago was being a little deceptive, and were
doing so in a somewhat condescending, abrasive way, and that was plain
_dumb_, because that helped _cause_ something like Trump. For the playground
analogy: maybe they started, but this fight was avoidable.

~~~
rapind
The ideological polarization of the parties was really kicked into full gear
by Gingrich. Trump is very clever at manipulating media to take this
polarization to new heights, but the playbook is Newt's playbook.

It certainly doesn't help the Democrats (and reasonable republicans?) that
they appear weak when they try to be reasonable or compromise.

------
sonnyblarney
Funny how Zuck and even Bezos have this worker problem coming up all the time.

They're making a lot of money; can they not at least make working conditions
decent? Too much to ask?

------
dx7tnt
I can't wait until my bank has to pay teams of people to keep child porn and
beheadings off its website!

------
redm
This story reads like a journalist trying to find some dirt on Facebook
because it's "hot" right now. It's the inverse of writing fluff stories when
everyone's excited about a startup. This story isn't even about Facebook.

I worked tech support for Acer in the 90s and it's a similar setup. If they
took the time to have "relaxation" rooms, I doubt it's as bad as the story
makes it out to be.

~~~
rawrmaan
You obviously didn't read the article.

~~~
redm
Actually, I did, obviously. I feel I have a much better picture of contracted
centers like the one described in the article than a sensationalizing story
from The Verge could give me, though.

~~~
rootlocus
You said:

> If they took the time to have “relaxation” rooms, i doubt its as bad as the
> story makes it out to be.

The article said:

> Facebook contractors are required to use a browser extension to report every
> time they use the restroom, but during a recent illness, Lola quickly took
> all her allotted breaks. She had previously been written up for going to the
> bathroom too many times, she said, and so she felt afraid to get up from her
> desk. A manager saw that she was not feeling well, and brought a trash can
> to her desk so she could vomit in it. So she did.

