
YouTube moderators forced to sign statement acknowledging job can give them PTSD - untog
https://www.theverge.com/2020/1/24/21075830/youtube-moderators-ptsd-accenture-statement-lawsuits-mental-health
======
Shish2k
"Forced" is a weirdly charged word - like "garbage disposal workers forced to
sign statement acknowledging that their work will involve handling garbage".
What's the alternative here?

Yeah, it's awful, dirty work, and we should give them as much support as
possible (definitely more support than they are getting right now) - but at
the end of the day, "dealing with things most people don't want to touch" is
literally the reason that the job exists...

TBH I'd be more concerned if YouTube was signing up moderators WITHOUT telling
them that they're in for a life of emotional torture :/

~~~
dredmorbius
Signing a nonnegotiable disclaimer, without full knowledge of impacts, under
economic duress, under an extreme power and negotiation disadvantage, as a
third-party contract employee, without collective bargaining, is not free and
fully-informed consent.

[https://news.ycombinator.com/item?id=22139444](https://news.ycombinator.com/item?id=22139444)

~~~
bagacrap
What do you mean without full knowledge? This article is about adults being
informed of the risks of PTSD up front.

Obviously everyone who seeks a job is under some form of "economic duress" for
if they had no need for income they'd either not work, or work pro bono.

What's the alternative here? A YT moderators union isn't going to negotiate
away the intrinsic shittiness of YT moderation.

~~~
FireBeyond
Up front, you mean, except for those employees who were already working there.

The more nefarious part of it is that at no point does Accenture suggest that
they have any responsibility or will assist with said PTSD. Hell, they even
imply that they'll most likely fire you if they find you do have it (one,
because the document requires you to tell them, and two, because their
supervisors repeatedly pressure therapists at the in-house WeCare program to
disclose that information).

~~~
mbostleman
All the more reason to find work elsewhere, which will make those available to
do the work scarcer, which in turn will push the compensation for the work up.
For those under economic duress, as it were, there's time. This isn't an
imminent danger.

~~~
wpietri
Nope. We should not sacrifice poor people on the altar of quarterly profit
goals. Doubly so for companies like Google and Facebook, which are hugely
profitable.

There is a sufficient supply of desperate and/or naive people that the market
equilibrium is significant and lasting harm to humans. If companies won't
solve that problem on their own, then the alternative is regulation. Given the
history of physically dangerous jobs over the last century or so, I expect
regulation is the likely outcome.

~~~
mbostleman
But these are not poor people. At $37k annually they are in the middle third
of US income and substantially above the poverty level.

The article linked below profiles a man who "worries that he will not be able
to find another job that pays as well as this one does". This indicates to me
that, all else being equal, they are already offering a premium.

Nor does it sound like he is either desperate or naive. He is well aware of
market compensation and he is struggling to decide on the tradeoffs.

~~~
wpietri
Oh? What's that relative to your income? Because my point isn't about absolute
dollars. It's about America's long-running tendency toward exploitation of
people with less money.

------
dredmorbius
There are risks that people can understand and appreciate up front.

There are risks that simply are not fully consciously appreciated until they've
been experienced directly.

There are risks whose cumulative effects only build with time, often taking
years, even decades, to fully manifest.

And there are risks _which fundamentally change the capacity of the affected
to even recognise or admit their existence or severity._

PTSD is all of these.

Self-monitoring, encouragement to seek help, and provision of a nonclinical
"wellness coach" is grossly insufficient.

The companies providing such services, and in this case contractors
(Accenture), as well as outside oversight entities such as government
regulators, physical and mental health services, unions, and insurers, know
these risks (or damned well should), and can monitor them and impose
regulations, limits, risk premiums, compensation and settlement systems, and
collective bargaining powers for such work.

Business is a risk-externalising engine. Some risks absolutely need to be
fully internalised. This is a prime case.

NB: I've worked at various times as a moderator, on spam-detecting and
reporting systems which entail seeing some unpleasantness. And on a large
social media network where I was tasked with removing flagged images from the
storage network. The flagging process hadn't been explained in any detail, and
as I would be deleting millions of items of user content, I spot-checked a
small handful to confirm the flagging was accurate.

After no more than a half dozen (and probably fewer) I simply didn't _want_ to
see any more -- the image, from over a decade ago, of a young girl still lodges
in my mind, though I've never wiped, nuked, and shredded files on disk harder.

I'm satisfied with the fact that the scripts I created wiped all that content
from our and our CDN's storage not in the weeks or months of the initial
estimate (our CDN vendor apparently had never considered widespread deletions,
or experienced them), but in less than two days, with verification. I never
heard any reports of unwarranted deletion either.

Somewhere else within the organisation, unknown to me, others _had_ seen and
verified those images. I think of who they might have been and the impacts on
them.

~~~
anon9001
I really don't understand how images or even videos are causing PTSD. Maybe
it's because I grew up on 4chan in the age of goatse, but shocking content is
a tiny fraction of how disturbing real life can be.

I understand that looking at disturbing content is going to have some impact
on people, and that they may not be able to evaluate that risk properly, but
it's nowhere near what every nurse, doctor, firefighter, police officer,
soldier, and even teacher will live through over the course of their career.

I've been diagnosed with PTSD after seeing some deeply disturbing medical
shit. I'm definitely impacted for life, but there were doctors and nurses
there too, and they're impacted too.

Maybe I've been on the internet too long, but there's a massive delta, at
least in my view, between seeing a disturbing video and living through real-
life trauma. I'd take the video moderation job over being an ER nurse any day.

~~~
FireBeyond
Forgive the bluntness, and speaking as someone who knows 4chan all too well,
and is a paramedic:

goatse is one thing. We're more talking high definition videos of people being
beheaded by Mexican drug cartels, people being held down while dogs eat their
genitals. Videos of toddlers being raped.

You can't really compare that to a fairly mundane, if explicit, naked man
showing his anus.

Also, as a paramedic, and speaking for many that I know (though not all, of
course) - physical trauma is rarely (or a lot less often) PTSD-inducing.
Gruesome, gory, sure, but in the end it's all blood and tissue. What gets to
most of the people I know, what takes the toll, is the emotional violence -
being called to child sexual assault cases, accidental deaths, things like
that.

~~~
anon9001
Thank you for your service to the community as a paramedic. I have no idea how
you people do it.

As someone who has exposure to that world, how do you deal with it? Is
there some kind of training or protocol or therapy that's built in to your job
that's different than that of the moderators? Does any of the mitigation even
work?

For every one of those beheading videos, someone has to actually go collect
the head. I would think that must be orders of magnitude more traumatizing
than skipping through the video of it happening enough to flag it.

~~~
FireBeyond
It's a very good question, because for the longest time,
paramedics/EMTs/firefighters _were_ expected to just "suck it up".

Now, with improvements in the protection of our PPE (bunker gear, etc.) and
other knowledge, there are fewer line-of-duty deaths due to accident or illness
(typically cancer, though that's still a big one) - now the biggest cause is
suicide, mostly as a result of PTSD.

There's a documentary that was funded in part by Denis Leary called "Burn: A
Year in the Frontlines in the Battle to Save Detroit", talking about fire
departments there. One of the veterans says "I wish my mind could forget what
my eyes have seen".

Around here, the PNW, at least, there's a huge movement toward handling it
proactively, access to counseling, therapy, hotlines, and as importantly as
anything else, active efforts to remove the stigma associated with things.

We used to do CISDs (critical incident stress debriefings), which are now
largely discredited - essentially "put everyone in a room and 'make' them talk
about how they feel after a bad call, whether they want to or not", but now,
more and more departments are hiring full-time mental health professionals.
One near me has someone who specializes in PTSD, and another who works with
sleep regulation (all those alarms in the middle of the night), and
alcohol/drug use.

------
sparker72678
This feels like an externality that needs to be internalized by these
platforms.

It shouldn't be legal to destroy people's minds, then drop them back off into
society with no help.

If you want to run a massive platform where anyone can upload anything, then
you should have to pay for it, at least insofar as financially supporting the
potentially life-long therapy someone is going to need after doing a
moderation job.

One of many, many, externalities these platforms create, I realize.

~~~
gfo
Isn't this the same risk anyone enlisting in the military also runs?

Still, I agree - it would be a lot easier to avoid these awful posts if people
had to pay to be on the platform. It wouldn't be perfect of course, but it at
least provides a barrier to entry and a slightly harsher consequence if you
forfeit your membership fee when banned for inappropriate content.

~~~
tremon
_Isn't this the same risk anyone enlisting in the military also runs?_

Yes, and we've been slowly forcing the military to acknowledge and own up to
that. When PTSD was first recognized (as shell shock in WWI), the initial
response of the (UK) military was to declare these people physically healthy
and return them to the front, or execute them for desertion. I'd say the armed
forces have come a long way since.

------
remote_phone
Companies like Google, Facebook etc should force their executives, from VP to
CEO, to do a week of content moderation every quarter.

Then we would see how sympathetic they would be to the plight of these workers
and how quickly the problem would be solved.

~~~
jrimbault
It's basically reinventing
[https://en.wikipedia.org/wiki/Worker_cooperative](https://en.wikipedia.org/wiki/Worker_cooperative)
and rotation of responsibilities.

But yeah. That would be good.

~~~
AndrewUnmuted
To me, a fairer comparison would be how Amazon requires all office workers
to shadow a customer support representative for one week out of each year.

I don't think the OP's suggestion does anything close to reinventing the
concept of a worker cooperative.

~~~
mewpmewp2
Amazon really is not a good example of successful employee management. Their
attrition is horrible, they have the worst work-life-balance culture of the
FAANGs, their reputation is poor, and working there is widely disliked. They
cast a wide net, but lose employees fast.

~~~
AndrewUnmuted
Have you worked there? I have quite a different experience than this.

I found Amazon's hiring practices and employee management to be very effective
compared to other places I've worked. In general, I found their whole dogma
really worked in practice, which surprised me at first.

~~~
mewpmewp2
I have not worked there. It is what I hear from other people's experiences. Of
course it will vary between teams, but the reputation is still low and the
attrition rate very high.

------
PaulHoule
Making repetitive decisions over a long period of time can make your mind get
weird. This holds for the "normal" development of training sets, but it gets
much worse when the detailed semantics are challenging and hairs must be split
carefully.

For instance I was filtering images on my tablet while lying in bed in a dark
room and by the time I got to the 2000th image my visual system started
getting weird. (e.g. I would see an image that was photocomposited and feel
strong cognitive dissonance... I would see the photocompositing and not see
the image I was supposed to see)

When you put a group of people in a room and have them label things that
require careful discriminations, some of which are critically important,
others of which are necessary for the process but not necessary for the
product, you will see people get emotionally disturbed. I haven't seen punches
flying but I think I could make it happen if the group was compressed enough
in space and the stakes were high.

I also had a toxic data set derived from Wikipedia where, if you started
clicking on things from the first record, you would quickly come to various
phrases involving the "F-word". Just about everyone who looked at this data set
would soon be sitting in a daze in front of the computer repeating the
"F-word". I'd tell them that they had the "F---s" and should get a grip on
themselves.

------
ma2rten
_The PTSD form describes various support services available to moderators who
are suffering, including a “wellness coach,” a hotline, and the human
resources department. (“The wellness coach is not a medical doctor and cannot
diagnose or treat mental health disorders,” the document adds.)_

They don't even provide a licensed therapist? Most tech companies have an
employee assistance program where employees can talk to a licensed mental
health professional 24/7. It seems like Accenture didn't even provide that
here.

------
AndrewDucker
I wonder what a good approach would be for jobs which need to be done, but are
harmful to the people doing them?

They talk about "the industry-leading wellness program and comprehensive
support services we provide" - but should there be a basic standard of support
which they have to meet?

I'm not sure of what a good answer would look like.

~~~
pgrote
>jobs which need to be done, but are harmful to the people doing them?

Does this job need to be done? Perhaps we have learned that opening a platform
for anyone to upload anything in an anonymous state is not good for society.

Maybe it is time to require anyone uploading to YouTube to be approved
beforehand.

All the hand-wringing over online services, and the belief that they should
exist since everyone seems to like them, is something society has to deal with.
Practically no one is looking at the harm the services are doing to the world.

Why should anyone get PTSD to ensure YouTube can continue to offer videos?
Wow.

~~~
samatman
Approved... by content moderators, perhaps?

Rearranging the deck chairs doesn't solve the fundamental dilemma.

~~~
goatinaboat
_Rearranging the deck chairs doesn't solve the fundamental dilemma._

Charging $10 or $100 per upload would kill this material stone dead. And
ensure that everything could be traced to the person responsible if it did
slip through.

------
superbrane
This is just to protect the corporation from potential legal action. It's like
a disclaimer on a police officer's job that you might get shot while doing your
work. The problem is when you don't care enough to take care of the ones who
are affected by the stress.

------
fapjacks
No amount of technology will be able to stop all people from being exposed to
all forms of content at all times which may one day trigger PTSD symptoms.
Social media shitholes use blacklists when they "should be" crowdsource-
whitelisting content and not moderating anything at all. And then users decide
for themselves if they wish to turn off the whitelist content filter and
expose themselves to the content that may one day trigger PTSD symptoms.

Prepare your child for the path, not the path for your child.

------
moomin
Mr Burns is currently getting Homer to sign something saying that his job may
give him inoperable cancer and a third arm.

------
Mangalor
Google reps should be forced to say it out loud to new employees. Some random
sentence in a contract is far too casual.

~~~
James_Henry
I get your point, but it doesn't seem to be a random sentence in a contract.
For starters, "The PTSD statement comes at the end of the two-page
acknowledgment form, and it is surrounded by a thick black border to signify
its importance."

------
thinkingemote
Using AI to replace human moderators doing this stressful work must be morally
good, right?

~~~
fapjacks
There's mountains of money to be made if you can fulfill this political
promise Google and Facebook have so far failed to make good on. Go for it.

------
oceanghost
I have PTSD. Believe me, you don't want it.

------
chooseaname
How sad is it that moderators are even needed. Why can't people just be nice?

~~~
LookAtThatUgGuy
Max Woolf (@minimaxir on twitter and here) has downvoted you, screenshotted
your comment, and posted it on twitter for public shaming.

~~~
minimaxir
I do not regret downvoting the comment since it is an _unproductive_ take in
2020, and there have been more than enough incidents in the past _decades_ to
show why content moderation is necessary and inherent human virtue is not
sufficient. I've deleted the tweet (the intent wasn't public shaming, it was
more highlighting an unusual take).

That said, creating a throwaway for snitch-tagging isn't a moral high ground.

~~~
chooseaname
> That said, creating a throwaway for snitch-tagging isn't a moral high ground.

Yeah, that wasn't me. I don't know who you are, nor do I follow you.

~~~
minimaxir
I wasn't accusing you. (I'm more-or-less curious who did; it's the first time
I've seen snitch-tagging on HN)

------
ronilan
If this was up to me, each and every employee of YouTube (and Google, for as
long as YouTube is not set free) would be trained as a content moderator and
would be required to do so for a couple of weeks (or as long as needed to
complete the task) every year.

Edit: yes. Every single one.

~~~
mewpmewp2
Most of their top engineers would quit and go to work for another company that
values their time. Quality of service would fall because the top people have
left. They would have a much harder time finding any engineers to work for
them.

~~~
ronilan
_“Most of their top engineers would quit and go to work for another company
that values their time.”_

Maybe. But I doubt it.

First because people don’t quit that fast, and second because I think something
else might happen that is worth the risk of losing a few “top” quitters.

I think top people will realize there is an opportunity here, and after seeing
how effective it can be for a team to work toward a common goal, will request
other such tasks be added to the list.

Eventually, people at that company can choose where to use their “20% service
job” time.

But, then again, don’t worry, it’s all theoretical anyway, it’s obviously not
up to me.

~~~
mewpmewp2
Engineers already switch companies every few years on average. All companies
have trouble keeping top engineers as it is, with all the competition going on.
Look at the perks and benefits being offered to them. If those perks can sway
engineers, then imagine how being forced to do something they are not good at,
that might cause them PTSD, that might be a waste of time for their life's
progression, and that is in general just a much more unpleasant activity, can
affect their choice among companies to work for. All of those extra activities,
on top of the actual work, will affect productivity and achieving goals. There
is enough work and enough goals to achieve; I am not going to spend a week
every quarter doing something that is not going to help me towards the career
goals I have. I do not think I am superior or that others are beneath me, but
it is logical for me to choose a company that aligns best with what I want to
achieve, and if I have the option to do so, why not?

~~~
ronilan
“ _I am not going to choose a week every quarter to do something that is not
going to help me towards career goals I have._ ”

In which case, if it was up to me, and it is not, obviously, I’d open a short
discussion about goals, and if we couldn’t find a common ground, would write
down “culture fit” and wish you better happiness somewhere else.

~~~
mewpmewp2
If there are enough engineers who do not want to be forced to do chores
unrelated to what they excel at, that they don't see benefiting their lives,
and that pose dangers to their mental health, it is likely you won't have many
left to build your video platform.

Why should an engineer choose this culture over other cultures where they can
just build things and provide value doing what they are best at?

Knowing other engineers, most would avoid such activities where they can't use
their skills valuably like the plague.

Also, you would lose goal- and results-oriented people. The ones who are best
at reaching those goals are the ones who cannot stand doing anything that does
not align with those goals.

~~~
ronilan
“ _you won 't have many to build your video platform left._”

I’m not building a video platform. It’s already there. I’m just, totally
hypothetically, fixing the company that operates it.

