
Facebook moderators in America - ajay-d
https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
======
scoutt
I always wondered why they don't have a system where the videos or images are
shown filtered first, like blurred, and then at the discretion of the
operator, they can take "several levels of blurring" off until they see the
original image. That way, for some images/videos you can tell if it's
violent/inappropriate right away, and the operator may see "less" of whatever
is disturbing to them.

I've also wondered how they deal with long videos that may seem fine for the
first few minutes, but where the real inappropriate content is spliced in for
a few frames in between or later in the video.
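The progressive-unblurring idea above can be sketched in a few lines. This is purely illustrative, nothing like Facebook's actual tooling: a toy box blur over a grayscale pixel grid, with `box_blur` and `review_pass` as hypothetical names.

```python
def box_blur(image, level):
    """Return a copy of a grayscale image (list of rows of 0-255 ints),
    blurred by averaging each pixel with its 3x3 neighborhood `level`
    times. Level 0 returns the image unchanged."""
    h, w = len(image), len(image[0])
    result = [row[:] for row in image]
    for _ in range(level):
        src = result
        result = []
        for y in range(h):
            row = []
            for x in range(w):
                # Average the 3x3 neighborhood, clamped at the borders.
                total, count = 0, 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            total += src[ny][nx]
                            count += 1
                row.append(total // count)
            result.append(row)
    return result

def review_pass(image, max_level=3):
    """Yield (level, frame) pairs from most to least blurred, so the
    operator only reaches the original (level 0) if they choose to
    keep stepping down."""
    for level in range(max_level, -1, -1):
        yield level, box_blur(image, level)
```

The point of the design is that the reviewer controls exposure: an obviously violating frame can often be flagged at a high blur level without ever viewing the original.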

~~~
perbu
It's low-value work. Facebook doesn't value these people enough to employ them
so they likely don't have access to properly engineered tools.

~~~
dymk
What makes you think that? How can you say that with such confidence? Are you
letting your pre-existing negative feelings color your assumptions about what
people actually think at the company?

I used to work there, and I built tools for these workers. Every one of the
other engineers that I knew had nothing but respect for them, and everybody
recognized that it was important work.

~~~
agentdrtran
So important their bathroom breaks are timed, they get inadequate counseling,
and they get paid $28k a year!

~~~
tjkrusinski
You have to remember that those are policies put in place by the contracting
company, not Facebook or the engineers that make the tools that these people
use.

~~~
thebradbain
And yet if Facebook really valued these contractors as much as their regular
employees, they would be hired as regular employees and not contractors. The
fact that such crucial staff are contractors shows how much Facebook values
them.

~~~
tjkrusinski
Yeah I don't disagree, ultimately though, it's a scaling problem. It's
unrealistic to hire thousands of people in a high churn context. Cognizant,
Accenture, et al are companies designed to do that and they do it well.

~~~
cmiles74
If they were treated better and had access to better tools, it might not be
such a "high-churn context."

Facebook makes literally billions of dollars, they could easily pay more and
provide better support to these people. They simply do not wish to do so.

------
duxup
I remember when "user generated content" was a common phrase. It was great,
users generate the content for the site and you just have to provide a
platform for it. It seemed like a positive thing.

Now it seems clear that if you open yourself to host something for your users
you also need to be prepared to accept that you're going to be hosting the
worst of humanity, exposing others to it, including people who work for you.

It seems like such an extreme contrast from what it seemed like user content
could be and what it is.

~~~
Bahamut
This isn't particularly surprising to anyone who has been online for a while -
sometimes you'd see gruesome stuff even on forums, and then you also had the
culture on sites like 4chan.

Moderation isn't something that one could really escape from even about 20
years ago, just now we're more cognizant of the effects of poor moderation and
users are more aware of how people behave when moderators/admins are pushed to
the boundaries of the set rules.

~~~
013a
The scale of the platform is the differentiator. If you're moderating a
few hundred or thousand people on a forum, especially a site with a
constrained vision and purpose, you're simply not encountering content and
behavior like this on a regular basis, if ever.

Now, look at Facebook. Billions of users. No constrained vision or purpose to
being there; if you're human, you're a customer, and human behavior can be
pretty gross at the fringes. And maybe most importantly, they can't just ban
whatever people or content they want, due to both a profit motive and the
public outrage their preeminent position attracts.

This isn't a problem that has existed before, because the problem is with the
scale, not with the behavior.

~~~
jachee
> if you're human, you're a customer

No. If you're human, you're a product. Can't have the product spoiled, now,
can we?

------
_bxg1
I get a certain amount of secrecy for their own protection, but this:

> They are pressured not to discuss the emotional toll that their job takes on
> them, even with loved ones, leading to increased feelings of isolation and
> anxiety.

Just doesn't make any sense. Not only does it not protect the moderators, I
can't even figure out what the cynical corporate interest would be.

~~~
duxup
I'm guessing these moderators are outsourced through another company and that
allows for a lot of unreasonable policies.

I think it is generally understood that if you outsource you can get another
company to enforce policies you'd otherwise be embarrassed by / legally
responsible for if they worked direct for the company. So if something goes
wrong "oh that wasn't us, we told that crazy outsourced company not to do it".
The other company doesn't care as they're not selling a product and it doesn't
hurt them.

I worked for an outsourced customer service company when I was in college. The
policies for their in house service and outsourced were explicitly different
but customers were exactly the same. This was no mystery to anyone in the
company. Policies for the outsourced companies were much much less generous to
the customers. So much so at one point they retroactively changed their
warranty policy that was enforced by outsourced customer service. In house
stuck to the original rules, outsourced declared X, Y, Z wasn't covered. When
they got caught after a year of saving money they declared that the company
they outsourced service to did it wrong. Then after a time they'd switch
back....

~~~
solotronics
I actually know a Facebook moderator who lives in Austin. I found it very
interesting that they do a lot of moderating of illegal political content in
countries that have strict rules about it. She works directly for Facebook and
gets the same perks as the other employees in ATX.

~~~
cvwright
Whoa. Are you implying that Facebook employees in Austin are helping, say, the
Chinese government censor pro-democracy groups?

I'm not sure the employee's perks (or lack thereof) are the biggest story
there.

~~~
duxup
Could be more benign, such as a country that has rules about no political ads
between X and Y times.

Still, your question is valid.

------
WisNorCan
I would be impressed if Zuck and Sheryl decided to spend a day per year doing
this job.

I suspect that as a billionaire and the CXO of one of the largest companies in
the world, you can get disconnected from the details.

It would be valuable for them to see a different reality of their platform.

~~~
Cthulhu_
Zuck has been around long enough to have experienced goatse, rotten.com and
all the other shock sites; back then, I'd argue, it was easier to find and be
'accidentally' exposed to than today.

But, that's the odd experience. Having to deal with new levels of depravity
every day is different.

Anyway, what do you believe would change if Zuck did it for a day? They need
moderators, as long as the technology for detecting it automatically isn't
good enough yet. He knows it's a problem.

~~~
will_brown
>Anyway, what do you believe would change if Zuck did it for a day?...He knows
it's a problem.

Well, for starters he might stop outsourcing the work through a contractor who
pays its employees 1/8 of what the average Facebook employee is paid. Maybe
the powers that be would accept and acknowledge that these low/underpaid
contractors are developing PTSD-like symptoms and are clearly not getting
services internally, much less the financial compensation to get such services
externally... not to mention the contractual arrangement which seems to keep
the contractors from seeking help externally.

~~~
rock_hard
Companies like Facebook pay local market rates.

People that work at FB in London also don’t get the average $240k Silicon
Valley compensation.

~~~
will_brown
I don’t think the point OP is making about Zuckerberg spending a day doing
the work of one of these positions is to determine the fair market wage of
this position in Arizona...

After all what is the local market rate for a job that has shown the tendency
to trigger PTSD without sufficient benefits to treat said PTSD? You would hope
at least enough to cover PTSD treatments...Would you continue to use Facebook
or allow your kids to use Facebook if there was a good chance of them
developing PTSD symptoms? Would you take a job where there was a good chance
you would develop PTSD and the job wouldn’t cover it and didn’t pay enough for
you to cover it?

------
aboutruby
For those used to shock sites from 10-20 years ago, this is very different,
I'm not going to link to anything but the disturbing content is on a few whole
new levels.

I can't even imagine doing this job. This should be 100% automated, with only
a 0.01% error rate for the first-pass filter (i.e. before it reaches humans).

They are willing to spare their users the horrors but not their own
employees/contractors...

~~~
intended
Facebook tried the automated system by using employees to train the machine.
It failed.

They returned to non-Facebook-employed humans doing the moderation.

~~~
fencepost
One thing I hope they're doing is advanced detection of duplicate
inappropriate content - e.g. by splitting a confirmed properly blocked video
into frames and identifying matching videos based on frames matching (or even
key parts of frames).

You'd still have to investigate/review partial matches, but something like
that could cut down a lot on duplicate effort and could auto-identify a lot of
things that would need manual review.
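A toy version of that frame-matching idea, using a simple average hash plus Hamming distance. This is illustrative only: real systems use robust perceptual hashes (e.g. PhotoDNA or pHash) over normalized thumbnails, and all names here are made up.

```python
def average_hash(frame):
    """Hash a grayscale frame (list of rows of 0-255 ints) to a bit
    string: one bit per pixel, set if that pixel is brighter than the
    frame's mean brightness."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def match_fraction(video_frames, blocked_hashes, max_distance=3):
    """Fraction of a video's frames whose hash is within `max_distance`
    bits of any hash from a confirmed-blocked video. A high fraction
    could auto-flag the upload; a partial match goes to human review."""
    hits = sum(
        any(hamming(average_hash(f), h) <= max_distance for h in blocked_hashes)
        for f in video_frames
    )
    return hits / len(video_frames)
```

Using near-match distance rather than exact hash equality is what lets this catch re-encoded or slightly cropped duplicates, which is exactly the "partial matches still need review" case the comment describes.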

------
intended
Remember that Reddit quarantines /r/watchpeopledie.

These mods may randomly see gang videos (go look for them if you want to see
how tough you are) on any day.

Viewing them would make many people I know retch, worse than if they saw a
gory horror movie.

And then some ideas, if presented without an antidote (creationism, flat
earth, conspiracies), will subvert the people watching them.

And then comes the worst part - the damage to society itself.

If violent images are removed depending on what people at Facebook, and their
friends, consider safe, then they are also deciding what is newsworthy.

And Facebook mods are not in a position to decide if an image of a burning
child will stop a war or awaken the conscience of a people.

All they know is that it breaks “rule 4- death: graphic wounding, no gore.”

~~~
casefields
The only one that got to me was Funkytown. I've got to imagine if they
increased pay and hunted through those users, they could find people that can
withstand sifting through this stuff.

------
jrochkind1
> Every moderator I spoke with took great pride in their work, and talked
> about the job with profound seriousness. They wished only that Facebook
> employees would think of them as peers, and to treat them with something
> resembling equality.

> “If we weren’t there doing that job, Facebook would be so ugly,” Li says.
> “We’re seeing all that stuff on their behalf. And hell yeah, we make some
> wrong calls. But people don’t know that there’s actually human beings behind
> those seats.”

Some respect and acknowledgement, and more than $15/hour, seems like it would
be a good start.

~~~
dessant
> Some respect and acknowledgement, and more than $15/hour, seems like it
> would be a good start.

If it isn't already the case in the US, psychologically hazardous jobs should
come with wage premiums.

~~~
jrochkind1
It is most definitely not already the case in the US.

------
iambateman
Question: is there a legal obligation for Facebook to proactively moderate
content in the US?

Don't get me wrong: the article is incredibly hard to read. When humans become
untethered from social mores, they are capable of shocking and evil behavior.

But...can't people mostly police this themselves? If I had a "friend" posting
video of a dog being stabbed, I wouldn't be their friend for much longer. It
seems that the result would be relatively small cesspools of highly antisocial
behavior. Cesspools which will continue to exist regardless of any moderating
effort.

Alternative to moderation: introduce "public content ratings" for the content
users post _publicly_. Machines automatically assess content and rate the user
based on the maturity level of their public discussion. I would be "teen-
mature" because I sometimes say "shit" online and perhaps another user would
be "highly mature - disturbing violence and sexual content". Their ratings
would weight groups.

Then, it's up to me to decide who I associate with online.

Caveat: nothing is a panacea. We're all trying to figure out how to handle the
fact that we gave every maniac a megaphone to the masses.

~~~
Despegar
If HN can't do without moderators why do you think Facebook could?

~~~
iambateman
I don't think Facebook can do without mods entirely. Maybe they don't have
enough mods or strict enough rules. I honestly don't know.

But Facebook has a more user-curated thread, whereas HN is public. In a sense,
I am my own mod on FB.

FWIW...I definitely don't use FB on a regular basis. Haven't since Oct 2017.

------
rhema
Is it just me, or should we expect that some small population will be
naturally equipped to deal with this job? We don't expect everyone to want to
be a soldier or medical doctor. These people deal with situations that would
make most people throw up. If you can't handle an hour of browsing 4chan,
maybe you should know that content moderation is not a good fit for you.

~~~
dallashoxton
Which might be a good argument for the gigification of these types of content
moderation positions, since the machine learning tech simply isn't there yet
and might never be in some cases.

Someone with a strong constitution/stomach (i.e. the people who can browse
4chan while eating) might find this work relatively leisurely/tame for decent
pay and flexible hours if adapted to a gig-economy model. I mean, there's a
whole class of sick fucks who watch the stuff on reddit's /r/watchpeopledie
for no compensation.

Might end up with fewer mental health problems if this type of desensitized
personality was leveraged for this nasty-but-necessary job, instead of
recruiting naive 19-year-old kids or whoever in between retail jobs and
exploiting them in atrocious work conditions until they get PTSD.

~~~
kilpikaarna
"relatively leisurely/tame for decent pay and flexible hours"

Uhh, this doesn't sound at all like the stories of driving Uber that I've
heard.

I don't think any of the people in the article would be doing this job if they
had other options that paid as well, so I don't exactly see how your idea
would improve their lives.

------
lazugod
> When I ask about the risks of contractors developing PTSD, a counselor I’ll
> call Logan tells me about a different psychological phenomenon: “post-
> traumatic growth,” an effect whereby some trauma victims emerge from the
> experience feeling stronger than before.

Wow. Wow. Is this really a thing, excusing forced trauma by claiming it makes
people better?

~~~
dsfyu404ed
People respond in varying manners to traumatic stress. Post traumatic growth
is simply the other extreme.

For example, some people take traumatic stress really poorly (PTSD). Some
people develop a world outlook to the tune of "well, no matter what happens,
life is looking up from here" (post-traumatic growth).

The counselor is looking at the glass half full. You kind of need to be a
glass half full person to persist in that line of work.

~~~
watwut
PTSD is not an issue of attitude, while your description of post-traumatic
growth is nothing but attitude. The two are incomparable.

The actual benefit of what you describe as growth is far too low to offset
clinical PTSD, which affects life quite a lot.

~~~
darkpuma
PTSD isn't universal. Not everybody that experiences trauma will develop PTSD.
So it's not a matter of "you have PTSD but at least your outlook on life has
improved." Rather, sometimes with some people, trauma won't induce PTSD but
_will_ cause post-traumatic growth. Research suggests there is a correlation
between PTSD and post-traumatic growth, but it's not a hard and fast rule.

Furthermore there are many factors in play. The nature of the trauma and the
predisposition of the person experiencing the trauma both seem to play a big
role in whether or not post-traumatic growth is likely to occur. People with
social support networks or spirituality are more likely to experience post-
traumatic growth. Perhaps for related reasons, trauma that is systematic or
collective (such as being a prisoner of war) is more likely to induce post-
traumatic growth than trauma which is personal or individual (e.g. sexual
assault.)

(Furthermore, no matter how distasteful the possibility may seem, it is
possible that post-traumatic growth _can_ take people past whatever their
baseline was prior to the inducement of PTSD.)

~~~
watwut
The point is, people with PTSD suffer and their lives are affected
significantly. They have a harder time keeping jobs due to symptoms, their
relationships suffer as they become harder to be around, they are more likely
to abuse alcohol or drugs, and they generally need help.

Other people experiencing post-traumatic growth does not offset all that. It
does not make up for the losses, and the answer to "is it overall beneficial
for people to go through that" is still no.

The counselor's answer suggests yes, and that is what people take issue with.
Whether one can have PTSD and post-traumatic growth at the same time is a
different question.

------
aeling
If I think this is unacceptable[1] and I'm skeptical of the nebulous "AI will
do this job in the future" claims, are there conclusions to be drawn other
than "UGC isn't sustainable on a global, public platform"? That is, are there
serious alternative options, or anybody working on ideas in this space?

I think it's readily apparent that "just show everything" doesn't work if you
want to attract a mainstream audience, but I'm reluctant to just give up on
the global public platform that FB was originally idealized as.

[1] I think I'd still find it unacceptable if the moderators were being paid 6
figures, had extensive 1:1 counseling, or any other perks - selling mental
health for money is something I'm happy saying a utopian society wouldn't
include.

~~~
0x54D5
The public platform needs to be decentralized. We need a social protocol where
the data lives in the protocol and not in some corporate server where it is
subject to their whims.

We currently live in a dystopia where your Twitter or Facebook could be banned
at their whim leaving you a digital outcast.

~~~
gipp
That's pretty much the opposite problem of what the GP is asking about.

------
tareqak
It looks like Facebook has responded to this article here:
[https://newsroom.fb.com/news/2019/02/commitment-to-content-reviewers/](https://newsroom.fb.com/news/2019/02/commitment-to-content-reviewers/)

~~~
gcb0
What a joke this response is. The article mentions A, B and C, with pictures
and first-hand anecdotes. FB's press release issues a statement that they have
contracts that do not allow for A, B and C.

They don't even try anymore.

------
ohiovr
This job sounds more horrific than a crime scene cleaner. 4500 foot soldiers
to clean up a billion users. Murder rape assault fraud. Burn it all with
brimstone.

~~~
canttestthis
Are you saying that there aren't enough people cleaning up or that cleaning up
should be done differently?

~~~
ohiovr
I honestly don’t know how moderators work and I actually don’t use Facebook.
But 1,750,000,000 / 4500 is around 390,000 users per moderator. I think they
need five armies of moderators instead of just a brigade trying to find the
worst. They need to step in with private messages or public posts to keep the
discussion from going rancid fast. I see it as being logistically impossible,
unfortunately, but to keep discussions from going to that depth of evil they
need to step up moderation pressure by a couple orders of magnitude. I’ve said
things here that moderators told me weren’t allowed (actually said things
here that were hurtful and unnecessary blanket statements) and I gripe to
myself and limp back. If this pressure weren’t there, YC would turn into
kuro5hin in short order. I’ve seen discussion groups fall into the toxic slime
pit more than a few times, never to return. I have been told that at least
Facebook groups can self-moderate. I really doubt good behavior can be coded
into a forum.

Now that pit is billions of people not thousands. I have little hope it’s
going to ever get better. Is this the destiny of all internet discussions? I
worry about this a lot.

------
walrus01
A huge amount of content moderation has been offshored to the Philippines,
where salaries are considerably less than $15/hour. Combination of access to
good internet connections, modern office space, an English speaking workforce,
and low salary levels.

I have not seen this documentary, but compared to the verge article, I think a
major difference is that the filmmakers did not get cooperation from Facebook
or the third-party content moderation company. In fact their photos were
posted and employees were told to stay away from them.

[https://www.vice.com/en_us/article/ywe7gb/the-companies-cleaning-the-deepest-darkest-parts-of-social-media](https://www.vice.com/en_us/article/ywe7gb/the-companies-cleaning-the-deepest-darkest-parts-of-social-media)

------
intended
This sounds about right

>dark humor and camaraderie

Yes. No one can know what happens if you start moderating, except unusually
inquisitive users or other mods.

> believing conspiracy theories

Not all ideas are equal - and not all brains/people are equipped to deal with
them equally.

Your mod team will soon get polarized or even subverted if faced with the kind
of crazy work these Facebook mods have to do.

If you are an actual community mod, life is slightly better, since there’s
more context.

------
jrochkind1
See also this upcoming book by Dr. Sarah T. Roberts, the product of several
years of research.

[https://www.amazon.com/Behind-Screen-Content-Moderation-Shadows/dp/0300235887](https://www.amazon.com/Behind-Screen-Content-Moderation-Shadows/dp/0300235887)

Here's a presentation Dr. Roberts gave at a conference last week on the topic
too:

[https://www.youtube.com/watch?v=7mdMtukvtxc#t=15m50s](https://www.youtube.com/watch?v=7mdMtukvtxc#t=15m50s)

(Until recently at least, most of this content moderation work was done by
contractors overseas, largely in the Philippines. Can you imagine what the
working conditions are like there?)

~~~
2sk21
The presentation you linked to by Sarah Roberts was very interesting to say
the least. Highly recommended. She adds a lot of information about the
condition of content moderators in other countries.

------
endofcapital
If I was running a marketing team or PR firm working on this problem I'd keep
pushing the nonsense of AI magically solving everything someday. The actual
nuts and bolts of content is pretty horrific, and they are obviously going to
want to hide this reality.

At least pay these people more than a dishwasher, please.

------
crankylinuxuser
Yep. And this is what you get when collectively "we" demand moderation. It's
the other side of the fence... Usenet was the unmoderated wasteland, and
Facebook is the moderated hell-landscape of "ok not ok".

~~~
intopieces
Usenet still exists and is moderated in a similar way against exploitative
content in partnership with The Internet Watch Foundation
([https://www.iwf.org.uk](https://www.iwf.org.uk)), depending on your
provider. Giganews, the largest Usenet provider and owner of other affiliated
providers, is a member of the IWF: [https://www.iwf.org.uk/member/giganews-inc](https://www.iwf.org.uk/member/giganews-inc).

~~~
crankylinuxuser
But due to its decentralized system, subscribing to a block list from IWF is
just that. It's the equivalent of a spam-block list.

~~~
intopieces
Not really. Usenet providers voluntarily subscribe to the list for you,
leaving you without the choice.

------
NoblePublius
The only thing remarkable here is that these people are not hired by Facebook
directly. Someone needs to ask Zuck why he thinks these hires aren’t worthy of
catered lunch and @fb.com emails.

~~~
mmagin
Then there would be the risk that the rest of fb's employees might be exposed
to the reality of this.

------
raverbashing
Oh so "in the future" she will be able to watch it without sound or to pause
the video?

Seriously, who thought blocking this was a good idea? Do you really need to
see an entire video to tell it's against the rules? It would also save (a lot
of) time.

"Oh but then they can just do a shallow evaluation of the video" - yeah, I
don't think that's the case with most of the videos.

------
wespiser_2018
I wonder how far we are from automating this with image recognition. The
interesting thing is the policy aspect of the learning problem: does a
post/image/video violate a collective set of rules? They are certainly
building a huge enough labelled dataset of images, decisions, and policies. I
wouldn't be surprised if this were automated in 5-10 years!

------
homemadejam
Wow, that was eye-opening. Maybe a suggestion would be to have certain
filters. Anyone below the age of 18 wouldn't be able to view posts by
groups/people that have been flagged as "NSFW" posters? I know Tumblr
implemented a similar idea a while ago.

------
daenz
>If we weren’t there doing that job, Facebook would be so ugly

It would be 4chan, minus the anonymity.

~~~
faissaloo
That would be many times worse than all the chans combined; imagine a chan
exclusively used by middle-aged mothers where anyone you talk to is doxable at
the click of a button.

------
unicornann
These contractors come from different agencies. Sometimes they become full-
time if they don't end up quitting their job first. To make things worse,
toxic culture like workplace harassment is something they also experience.

------
throwaway-1283
I feel like selling moderation tools would be a huge biz for any co with
UGC...

~~~
tjkrusinski
Yes, it already is.

------
elchupanebre
What's going to happen next: FB will quietly layoff all US moderators after
moving the operations offshore.

------
statictype
Interesting that the moderators were provided by Cognizant - essentially a
software outsourcing company.

------
faissaloo
The monolithic social network model just does not scale, this is exactly what
we need federation for.

~~~
agentdrtran
How does federation solve the problem of underpaid moderators?

~~~
faissaloo
Underpaid moderators are a side effect, not the main issue. If the network
were federated, each node wouldn't grow to such an unmanageable size, at least
not without other nodes severing ties.

------
TorKlingberg
This part is scary:

 _The moderators told me it’s a place where the conspiracy videos and memes
that they see each day gradually lead them to embrace fringe views. One
auditor walks the floor promoting the idea that the Earth is flat. A former
employee told me he has begun to question certain aspects of the Holocaust.
Another former employee, who told me he has mapped every escape route out of
his house and sleeps with a gun at his side, said: “I no longer believe 9/11
was a terrorist attack.”_

I have a hope that people will eventually develop immunity to conspiracy
theories after falling for enough of them and eventually seeing them
disproved. But with things like pizzagate/Qanon it seems some people just keep
going deeper.

~~~
monetus
When it comes to neuroplasticity and change, all it takes is time. If you
spend more time in the shoes of those flagged for moderation than their
opposites, maybe it's inevitable that you attain a form of Stockholm syndrome.

~~~
megous
It's a bit cult-like. Also add isolation from family/friends, tight control
of what you can and can't do, and having no trusted person (the therapist
there works for the company).

It doesn't surprise me that it can lead to similar outcomes as being in a
cult.

------
paul_milovanov
I think that's called "censors"

------
_bxg1
The internet is one great big libertarian experiment. It illustrates all of
the creativity, and all of the depravity, that result when you just put
millions of people together and tell them to do whatever they want without
consequences.

------
Mr_Shiba
This is a great opportunity to encourage everyone to delete your facebook &
Instagram.

~~~
tjkrusinski
Also stop using Google, Twitter, YouTube, etc etc because they all work
similarly.

~~~
Mr_Shiba
Agreed, duckduckgo ftw.

------
hnaccy
If seeing it makes you cry it seems like you should be looking for a new job.

There is a portion of population that would be generally unfazed by the
content, it would be best for everyone if they worked these jobs.

~~~
untog
You really think the people being paid $28k a year to watch videos of literal
murder haven't thought that maybe they'd rather have a different job?

~~~
Nadya
Tons of people watch these kinds of videos by choice over on /r/watchpeopledie
(Warning: Very NSFW Subreddit if the name of it doesn't make that immediately
obvious. Contains many videos of accidental deaths and murders, including
those of children.)

I'm sure some of them are jobless and would love to be paid $28k a year to do
something they already do as a bit of a hobby.

~~~
bendotero
That's sick. What you just described is someone who is unhealthy, not a person
who needs a full-time job to further indulge in something deviant.

~~~
hnaccy
Even if we concede that they have some dysfunction why is it an issue they do
this sort of work?

It's like having someone with poor sense of smell take out the garbage.

~~~
untog
> It's like having someone with poor sense of smell take out the garbage.

No, it's like having someone with a poor sense of smell detect which things
smell and which things don't.

Not to mention, having them do this work is taking advantage of a dysfunction
rather than helping to address it. In fact it would be likely to make the
dysfunction worse.

~~~
hnaccy
>No, it's like having someone with a poor sense of smell detect which things
smell and which things don't.

I'd guess most can identify murder and gore films even if they don't mind
them. They're pretty obvious.

I don't understand why taking advantage of this "dysfunction" is a bad thing.

~~~
bendotero
So watching the destruction of human life and/or incredible suffering of other
people is not problematic to you? If someone has a dysfunction is it not cruel
to take advantage of the dysfunction, whatever benefit might be derived by
others? Seems like the very definition of dehumanizing rather than caring
whether or not your fellow man is thriving.

~~~
hnaccy
I'm not saying force these people to work, simply that they're people who lean
more to the callous side of the emotional spectrum and they're better suited
for this kind of work.

What you see as dysfunction, their apathy when viewing bad content, is their
normal state of being.

~~~
anigbrowl
Rather than argue, I'd like to suggest two questions for consideration.

Why do you think some folk are drawn to habitually view such content?

What other things do you think satisfy that impulse?

~~~
darkpuma
> _" Why do you think some folk are drawn to habitually view such content?"_

It's paradoxically unusual but at the same time incredibly relatable.
Everybody dies, so there will be broad spectrum curiosity for the subject,
inhibited only by natural squeamishness.

Years ago, when high schools still had shop classes, the teacher showed my class
a binder full of color photographs of what a lathe accident looks like. The
lesson was that lathes aren't toys. I'd never seen anything like those
pictures before. They were disgusting, and fascinating. Probably
psychologically damaging, but not as damaging as getting caught in a lathe.
Did I mention the pictures were fascinating? For most people, seeing that sort
of thing is rare. Some people are drawn to novelty, particularly when they can
relate to it. Very little in life is as relatable as death; death is even more
universally relatable than eating.

Is the suppression of squeamishness a form of mental illness, or a form of
psychological damage? I think it definitely can be. But I'm far from convinced
it _necessarily_ is.

~~~
anigbrowl
Almost everyone has a degree of morbid curiosity, or there wouldn't be a
market for horror movies and true crime media. But my question was
specifically about why that would develop into a habitual preference for
specifically gruesome ends. I'm gonna go out on a limb and guess
/watchpeopledie hasn't recently been taken over by video of people in comas
flatlining.

~~~
darkpuma
> But my question was specifically about why that would develop into a
> habitual preference for specifically gruesome ends.

Because their squeamishness is suppressed to an extent that is unusual among
the general public, and what remains is the now uninhibited fascination.

------
dsfyu404ed
Meh. Sounds like a run of the mill shitty job. You can destroy your body
laying bricks and unloading trucks or you can destroy your mental health doing
something like this. Of course they're not paid well enough to compensate for
the long term damage. Nobody ever is.

There's all sorts of shit you might have to put up with in a shitty job: the
work, the environment, the customers. This job is shitty, but some people are
bothered less by this particular kind of shitty than they are by angry
customers yelling at you or laying bricks in the Arizona weather. Sure, you
might have to see some really screwed up shit among the sea of nip-slips and
racial slurs but you get to do it in an air conditioned office without killing
your body like the guys in the Amazon warehouse. Some people prefer that.

Different people have different degrees to which they'll tolerate the various
ways shitty jobs are shitty. At least one person the author interviewed said
"well, it sucks less than Walmart". Back at the point in my life when I was
doing shitty jobs I would have preferred brick laying or custodial work but I
wouldn't have turned down this job if I needed it. Of course it's terrible for
you and turnover is high but all the equally shitty jobs are this way. There
really is no winning in the "minimum wage or thereabouts" income bracket.

Edit: Since apparently this opinion is unpopular can anyone tell me why this
particular implementation of shitty job is worse than all the other
implementations?

~~~
untog
Disregarding for a second that you're dismissing this entirely without having
done so much as one minute of the job yourself...

A lot of manual labour jobs are protected by unions, which establish safety
standards, limits to shift length and healthcare to help employees when they
need it. Sounds like these employees could do with the same.

~~~
dsfyu404ed
>A lot of manual labour jobs are protected by unions, which establish safety
standards, limits to shift length and healthcare to help employees when they
need it. Sounds like these employees could do with the same.

I'm not comparing to those jobs. The union dishwasher in some public
university cafeteria gets time off, healthcare, etc, etc. that makes his or
her job much less shitty than the infinite-term temp that works beside them.
I'm comparing to the temp.

You can make a job better/worse by adding/subtracting pay, benefits or working
conditions. If I could get tech pay and benefits for construction work I'd be
doing construction.

If we're gonna compare this job to others then we should compare to other jobs
that are "equally shitty" not jobs that are similar but actually compensated
more highly if you roll the benefits into total comp.

------
b1r6
I would never be able to do that job.

Not because of the exposure to "bad things" like the article pushes, but
because I've always felt totally fine (even bored) looking at that stuff, and
I don't agree with censoring it.

It's just "meh", another day of reality on Earth; the Internet is just a
mirror held up next to it.

Will we ever collectively realize this? It seems younger crowds are more
normalized to this stuff because they grew up on the Internet. But add in the
hypersensitivity in today's public social network sphere, and we're all
freaking out over anything even potentially inflammatory.

~~~
pjc50
I think you're on the far side of the "desensitization" process. It's not
entirely normal to have a flat emotional response to everything, including
images of human suffering.

~~~
b1r6
Certainly a good point. I'd add that when it comes to suffering, I certainly
feel discontent with that happening:

In the article, the example of a stabbing video is given. I would strongly
feel the need for retribution or justice for that victim. Same for all content
that shows someone being hurt.

I guess what I mean is that it's wrong to try and purge all this stuff as if
it just doesn't exist. This is real content, real people, being hurt in
reality. I'd feel disgusting trying to censor it, not disgusted by the
content.

~~~
opT52ifk
Is not publicizing (in a system like Facebook, _designed_ to proactively
promote and spread attention-grabbing content to any audience it can) the same
thing as censorship?

In other words - I sympathize with the hard-ACLU/EFF stance on free speech.
But there's a difference between government censorship and societal moderation
- the latter has _always_ existed, and the scale and automation of modern
platforms in publicizing content that normally wouldn't spread so far is
what's new.

Should people be aware of bad things that happen in the world? Absolutely. Is
broadening the audience for disturbing videos the right way to raise
awareness? Maybe not.

And if folks really feel such content needs to be published, they still have
more options and reach than they did in years past even if "mainstream" places
like Facebook moderate them. Granted this last point is getting a bit
trickier, as people go after registrars and web hosts themselves for political
reasons occasionally (i.e. if this was Cloudflare we were talking about I'd be
in full agreement with you - then again Cloudflare doesn't run a
recommendation service that automatically causes new unexpected content to
appear in front of billions of people).

------
mruts
Maybe Facebook should just let all videos be posted, except when it’s illegal
(the video, not the act). Pretty much the only thing that would fall in that
category would be child porn.

I’m not sure I really understand why big companies feel it’s their job to
censor the world. Just let the floodgates open: allow violence, allow porn,
allow whatever else. Who cares? And maybe this might actually be good for
people to see. I feel like we are living in this outrage culture that wants to
censor everything. It’s not healthy for individuals or society.

~~~
duxup
Including allowing the very first thing mentioned in the article?

~~~
cf141q5325
The alternative is paying people to moderate it all day every day, with the
mental damage this causes to people who are desperate for a job.

Just blocking it yourself seems like a better solution. You don't have to be
fair in what you block yourself and (surprise surprise) you don't need to watch
it in the first place.

~~~
duxup
"Just blocking it yourself seems like a better solution. You don't have to be
fair in what you block yourself and (surprise surprise) you don't need to watch
it in the first place."

How does that work if nobody watches it, how do you know what to block?

~~~
cf141q5325
Close a video if it's a shaky video of some guys with machetes. You don't have
to figure out how it ends.

You only have to watch it through if you have to make sure it's actually gore.
Which an individual, unlike an employee, doesn't have to do.

Or simply collect reports automatically, and if a high enough ratio of reports
to views comes in, mark it NSFW.
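The report-ratio idea above could be sketched roughly like this. Everything here is illustrative: the function name, the 1% threshold, and the minimum-views guard are all assumptions, not anything Facebook actually uses.

```python
# Hypothetical sketch of threshold-based auto-flagging: mark a video NSFW
# once user reports exceed a fixed share of its views. The threshold and
# min_views values are made-up parameters for illustration only.

def should_mark_nsfw(views: int, reports: int,
                     threshold: float = 0.01, min_views: int = 100) -> bool:
    """Return True when reports make up at least `threshold` of views.

    `min_views` prevents one or two early reports from flagging a video
    before it has a meaningful sample of viewers.
    """
    if views < min_views:
        return False
    return reports / views >= threshold
```

A real system would also need to cope with brigading (coordinated false reports) and with weighting reporters by past accuracy, which a plain ratio ignores.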

~~~
duxup
"You only have to watch it through if you have to make sure it's actually
gore."

I'm not sure that is any different than what the article describes.

~~~
cf498
>She knows that section 13 of the Facebook community standards prohibits
videos that depict the murder of one or more people

>It’s a place where employees can be fired for making just a few errors a week

As I understand it, an employee has to make sure it's extremely likely that it
is a murder. She is told by the psychiatrist that she can pause it, not close
it. You as an individual don't need to find out. Just close it immediately if
you assume it's gore.

~~~
anigbrowl
Not everyone processes things the same way as you and many people (children,
poorly educated folk) don't necessarily have the self-awareness to make such
decisions reliably. And we both know there are people who delight in
propagating such material and causing discomfort to others. Perhaps consider
that the problem is not as simple as it appears to you.

~~~
cf141q5325
>Perhaps consider that the problem is not as simple as it appears to you.

I don't think it's a simple problem. It is one without an optimal solution. I
just think the downsides of having the stuff around are preferable to having
people ruin their mental health out of financial desperation.

I also believe that reliability would go up with time; people are able to
learn quite a lot.

