
The Worst Job in Technology: Staring at Human Depravity to Keep It Off Facebook - mef
https://www.wsj.com/articles/the-worst-job-in-technology-staring-at-human-depravity-to-keep-it-off-facebook-1514398398
======
Waterluvian
I found this juxtaposition unsettling:

“I was watching the content of deranged psychos in the woods somewhere who
don’t have a conscience for the texture or feel of human connection,”

"...If the managers noticed a few minutes of inactivity, they would ping him
on workplace messaging tool Slack to ask why he wasn’t working."

The texture of human connection is severely diminished when you are managing
what is essentially a drip-fed trauma survivor remotely using a metric of
trauma exposure per minute.

~~~
harryf
Likewise this part...

> Whisper no longer employs U.S.-based moderators. It uses a team in the
> Philippines along with machine-learning technology.

Great that they moved the problem to a place with even fewer protections for
workers.

The underlying issue is seeing content moderation purely as a cost that
should be minimized. Taking YouTube Kids as an example, I'm sure you could
sell parents products built around moderated content that educates their
children, supports their morals and beliefs, etc. In other words, find ways
to generate enough income from moderated content to treat people fairly.

~~~
oscargrouch
> Great that they moved the problem to a place with even fewer protections for
> workers.

Another big problem with this is the lack of cultural background for doing
it.

Only if you belong to a given culture can you understand its subtleties, and
what is tolerated varies from culture to culture.

This is one of the things that should make us think about how bad the
situation with centralized social networks is, when the values of one culture
can be arbitrarily imposed on another.

People and cultures in general should have the right to organize according to
their own cultural values. So the only right way to do this, at least for a
centrally managed network, is to have natives doing it for each country where
the network has a presence.

If this is too expensive in the beginning, then maybe hire not just cheaper
labor, but labor from countries (cultures) with a good sensibility for
foreign cultures, and select people with a talent for understanding cultural
nuances.

I feel particularly outraged when, according to some foreign culture,
something that locally is not a big deal gets censored, and then you think
about how close we are to Big Brother (by the way, there's a certain country
already going down this path).

But in the western hemisphere, if nothing changes, we are slowly
consolidating a regime of digital feudalism where corporations are free to
dictate the rules for our lives.

------
quantummkv
The real problem is not that such content exists; the problem is that
Facebook makes it horrifically easy to distribute such content. Back when
Facebook was more about interacting with people, rather than liking and
subscribing to whatever these content and joke pages shoved at you, these
concerns were not as prominent. Clickbait, etc., is just a natural
progression of like-and-subscribe culture.

My Facebook feed about a year ago, when I last opened Facebook, was filled
with the same memes and "inspirational messages" pushed by a handful of pages
and shared by dumb idiots. I had to search and go directly to a person's
profile to see what they were up to.

The real problem is that Facebook started behaving like a TV channel rather
than a social network.

If Facebook restricted itself to profiles of real persons and removed any
and all of these pages, blogs, news agencies, and the lot, a lot of its woes
with regard to content would be solved.

These pages give a sense of anonymity to the people behind them. Take that
anonymity away. Once you know that your personal image will be directly tied
to whatever you post and held responsible by everyone in your friend list, you
will begin to curb your tendencies in public.

~~~
Viliam1234
I believe that the "one account = one real person" policy actually makes
things worse.

Regardless of what Zuckerberg claims to believe, in real life I have multiple
"profiles". For example, I don't debate Java programming with my family
members, I don't debate politics or religion (or generally any opinions,
because no one knows what will become a hot political topic tomorrow) with my
colleagues, etc.

Similarly, I have different rules for reading stuff. I hate memes, but if my
cousins post some, I am not going to block them like I would most other
people; I will just sigh and scroll down. More importantly, I do not want to
read stuff from all my social circles at the same time; sometimes I am in a
mood for family stuff, sometimes programming, sometimes other things.

Even better would be to have advanced tools (but with a simple user interface)
to specify what I want to read; e.g. "what my cousin wrote, but not what
someone else wrote and my cousin shared"; to filter posts containing certain
words, etc.
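Rules like these boil down to per-reader predicates over post metadata. A
minimal sketch in Python (the data shape and every field name here are made
up purely for illustration, not anything Facebook actually exposes):

```python
# Sketch of per-reader feed filter rules over post metadata.
# All field names (author, shared_by, text) are hypothetical.

def make_filter(allow_authors, blocked_words):
    """Keep a post only if it was written (not merely reshared) by an
    allowed author and contains none of the blocked words."""
    def keep(post):
        if post["author"] not in allow_authors:
            return False
        if post["shared_by"] is not None:  # drop reshared content
            return False
        text = post["text"].lower()
        return not any(word in text for word in blocked_words)
    return keep

posts = [
    {"author": "cousin", "shared_by": None, "text": "Family dinner pics"},
    {"author": "stranger", "shared_by": "cousin", "text": "Top 10 memes"},
    {"author": "cousin", "shared_by": None, "text": "Funny meme compilation"},
]

# "What my cousin wrote, but not what my cousin shared, and no memes."
keep = make_filter(allow_authors={"cousin"}, blocked_words={"meme"})
feed = [p for p in posts if keep(p)]
# Only the first post survives: written by the cousin, not a reshare,
# and free of blocked words.
```

The point is that the filtering logic itself is trivial; what platforms lack
is the user-facing surface for composing such rules.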

Merely removing anonymity will simply divide users into two groups: those who
care about not getting fired tomorrow, so they only post completely
unobjectionable, politically correct, vanilla stuff; and those who are not
very smart, or are young and inexperienced, or are too rich to care what
other people think, so they post whatever comes to mind.

------
mbrumlow
This sort of stuff was at one point handled by hosting providers. It now
seems that the internet is "facebook/google", so it is no wonder they are
getting the brunt of this sort of work.

I can tell you back in the day when I worked for Rackshack I never envied the
abuse department. They always looked stressed out.

8k posts in a day is just too many for one person. It's not about how much
work they are doing; it's about the content they are subjected to while
reviewing it. To do that you have to actually think about it and make a
decision -- and that takes a toll on people -- you don't get to forget.

I am not one for regulation, but if there is one place in tech that should be
considered for regulation, then I think this is a good place to start.

These workers need to be paid more, given access to therapy, and given much
more time off. I also think there are technical solutions to help ease the
work, but those cost money, and nobody seems to want to pay, so human workers
are left holding the bag. This would also include stricter rules to make
filtering easier.

Good job, you guys can put a dancing hotdog on the screen; why not use that
talent to build back-end systems that automate this sort of horrid work away?
I know it will be hard, but if NN and deep learning are all they're cracked
up to be, then it should be within the realm of the possible.

Also, when it comes to the law, I am for the notion that it is okay if 10 bad
people get away if it means not falsely convicting 1 good person. However, on
the internet, with regard to what normally amounts to pointless shit that
people post, I am okay with 100 good posts being automatically removed if it
means 1 bad post is also removed.

~~~
dbg31415
> Facebook will have 7,500 content reviewers by the end of December, up from
> 4,500, and it plans to double the number of employees and contractors who
> handle safety and security issues to 20,000 by the end of 2018.

Seems like the bulk of these roles are contract. "$24/hour" was mentioned in
the article. Probably no health benefits. Probably no counseling.

251 working days * 8 hours * $24 = $48,192. Less taxes. Less sick days. Safe
to assume these workers aren't all going to work out of the main office in SF.

Seems like Facebook would have to set up some sort of work from home system,
like people who transcribe medical records, or some sort of off-shore system,
to make this work for workers. But... spreading out the people who do this
work is likely to increase security risks that sensitive data will be shared,
and reduce workers' ability to find comfort with peers.

There's no way to look at this where automation isn't going to be better.

~~~
spaceseaman
This definitely seems like a good place for either regulation or unionization.

Unionization would likely be difficult, though, due to the job's low barrier
to entry - artificial barriers would have to be created by the union, and
those break down pretty easily due to human factors (see Walmart).

I'm all for regulation in this case. Require that folks who review content for
illegal posts be limited in the amount of content they review in a weekday,
must have certain benefits like mental health counseling, and get some
vacation / sabbatical from the work.

Risks might be reduced innovation in the field, but the big boys like Facebook
and Google have already found excellent ways to snuff out their competition
within the existing market.

~~~
adventured
Unionization doesn't work very well when you can trivially outsource the
job/task. It'd be necessary to entirely remake the US economy, because you'd
have to enable union powers that could directly control all actions by the
parent corporation, including setting up overseas / outsourcing.

------
rpmcmurphy
The internet really is an incredible cesspool. It would be interesting to see
how the public reacted if YouTube and Facebook turned off their content
moderation for a week. It would make the goatse meme look like a Sunday school
picnic.

~~~
saas_co_de
Yes, but really, humanity is the incredible cesspool. Facebook is just a
mirror.

The interesting thing is that instead of trying to prosecute the criminals
behind this content we would rather just censor it so we can pretend it
doesn't exist.

~~~
JKCalhoun
I disagree. 99% of us are good people. The cesspool loves a stage/soapbox
though.

------
Santosh83
Why are we surprised or shocked? Hasn't society always used servants, police,
soldiers, miners, loggers, garbage men, wardens, and such to do tasks that the
rest of us are loath to do, and to keep certain stuff away from 'civilised'
society?

Why would online be different suddenly? Analogues of all the above are needed
online too. And somebody who has no other option will be unfortunate enough to
fill these roles.

~~~
jstanley
> Why would online be different?

Because you don't have to look at anything you don't want. Because it's not
bounded by physical space limitations, so everyone can have as much "space" as
they want. Because you can travel from any point to any other point almost
instantaneously. Because you can be in as many different places simultaneously
as you have the capacity to keep track of. Because you can come and go as you
please without anyone else even knowing.

The internet is _nothing_ like the physical world.

------
gonzo41
What they don't tell you is that you can get PTSD from witnessing trauma. It
happens to cops and lawyers investigating child abuse all the time.

------
radmarshallb
I would expect that social media websites whose content is largely decided
democratically (via votes, shares, or the like) would relegate the majority of
this content to a place where it is not seen by many. I would argue that the
best way to handle this issue is to let the site's mechanisms deal with the
content accordingly and then focus efforts on developing processes that will
be able to detect and remove it automatically.

The article implies that they are forcing moderators to view the content at a
high clip. Why, so as to get false positives back online as quickly as
possible? Maybe moderators should only review content that reaches a certain
threshold of complaint, and other content is left as is?
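The complaint-threshold idea sketches out very simply: only surface content
to human reviewers once enough independent reports accumulate. A toy version
in Python (the threshold value and data shapes are illustrative assumptions,
not how any real platform works):

```python
# Sketch of a report-threshold review queue: content reaches human
# moderators only after its report count crosses a threshold, so
# reviewers see the worst-flagged items first and everything else
# is left as is.

REVIEW_THRESHOLD = 5  # illustrative; a real system would tune this

def review_queue(posts):
    """Return posts with enough reports to warrant human review,
    most-reported first."""
    flagged = [p for p in posts if p["reports"] >= REVIEW_THRESHOLD]
    return sorted(flagged, key=lambda p: p["reports"], reverse=True)

posts = [
    {"id": 1, "reports": 0},
    {"id": 2, "reports": 12},
    {"id": 3, "reports": 5},
    {"id": 4, "reports": 3},
]

queue = review_queue(posts)
# Posts 2 and 3 reach the queue, in that order; posts 1 and 4 never
# consume a moderator's attention.
```

The tradeoff is that under-reported bad content lingers longer, which is
presumably why platforms push for fast proactive review instead.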

~~~
legulere
Reddit is such a website, and the general consensus there is that voting
alone does not work.

~~~
wlesieutre
Subreddits are pretty heavily moderated via the "report" buttons. Moderators
are basically whatever community member created the community plus the other
volunteer users that they picked, and if you have a problem with a subreddit's
moderation the usual response is "It's my sub, you can leave."

I'm not sure how that would work for facebook.

~~~
ben_w
Reddit recently deleted a whole bunch of subreddits because they decided that
wasn’t good enough.

Anecdotally, I know someone who was interested in one of those topics
(zoophilia), who thought the subreddit in question was “a dumpster fire” yet
was still upset about it being banned.

------
tyingq
Can't help but being reminded of this Silicon Valley episode:
[https://youtu.be/dvn-hpZdElo](https://youtu.be/dvn-hpZdElo)

------
wglb
In the very early days of the internet being used at a very large corporation,
I had the task of reviewing proxy logs to monitor what was euphemistically
called "non-business use of the internet". I started by scanning for URLs with
"XXX" in them, then pivoted to make a more extensive list.

I never looked at the content itself. Just seeing the URLs was corrosive
enough.

------
versteegen
A Kaggle competition just started a week ago, hosted by Jigsaw (part of
Alphabet) for classifying toxic content (insults, threats, vulgarity, etc) in
online comments. $35,000 prize pool. [https://www.kaggle.com/c/jigsaw-toxic-comment-classification...](https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge)

Also interesting, they already have an API for doing this sort of
classification: [https://perspectiveapi.com/](https://perspectiveapi.com/)
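To give a feel for the kind of classifier the competition asks for, here is a
toy bag-of-words Naive Bayes in pure Python. This is not how Perspective or
the winning Kaggle entries work (those use far larger corpora and models);
the training examples and everything else here are made up for illustration:

```python
# Toy toxic-comment classifier: bag-of-words Naive Bayes with Laplace
# smoothing, trained on a handful of hand-labeled example comments.
import math
from collections import Counter

def train(examples):
    """examples: list of (text, label) pairs, label in {'toxic', 'ok'}."""
    counts = {"toxic": Counter(), "ok": Counter()}  # word counts per class
    totals = Counter()                              # document counts per class
    for text, label in examples:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(counts, totals, text):
    """Pick the class with the higher log-probability for the text."""
    vocab = set(counts["toxic"]) | set(counts["ok"])
    best, best_lp = None, -math.inf
    for label in ("toxic", "ok"):
        lp = math.log(totals[label] / sum(totals.values()))  # class prior
        denom = sum(counts[label].values()) + len(vocab)
        for word in text.lower().split():
            lp += math.log((counts[label][word] + 1) / denom)  # smoothed
        if lp > best_lp:
            best, best_lp = label, lp
    return best

examples = [
    ("you are an idiot", "toxic"),
    ("shut up you moron", "toxic"),
    ("thanks for the helpful answer", "ok"),
    ("great writeup very helpful", "ok"),
]
counts, totals = train(examples)
print(classify(counts, totals, "what a helpful answer"))  # prints: ok
```

Even a toy like this shows why classifiers only ease, not replace, the job:
borderline and novel content still falls to human reviewers.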

------
mef
Non-paywall link: [http://archive.is/tmFps](http://archive.is/tmFps)

------
guuz
It's a good space for regulation. The treatment these workers are subjected to
is ignominious.

------
lyra_comms
One of our team members used to do this job; luckily, she managed to do it
with deep learning, so she didn't have to spend too much time looking at
unpleasant images.

This experience is one of the main drivers that pushes our team to develop an
open, nonprofit conversation platform on which harassment is difficult by
design.

www.hellolyra.com/introduction

------
mikehines
I wonder what an AI will think of us, and then do to us, once we have one.

------
aglavine
The problem is more the job conditions than the job itself.

------
meritt
It's not a popular opinion but remove the anonymity/fake accounts and you
eliminate a very significant portion of these issues.

------
megamindbrian2
I want to do this job.

------
katastic
South Park did an episode on this, where everyone was so afraid of "reality"
that they elected someone to censor it all and keep their tweets only
positive.

Butters had to see every depraved thing. And he ended up trying to kill
himself.

And in the end, when he almost died, they blamed _him_ for failing to be the
perfect filter.

~~~
creaghpatr
Great episode

[http://www.esquire.com/entertainment/tv/videos/a39070/south-...](http://www.esquire.com/entertainment/tv/videos/a39070/south-park-season-19-episode-5-shaming/)

------
Pica_soO
If there ever was an ISIS recruiting pool, this is it.

------
jorgec
It starts with censoring topics that are illegal, then topics that aren't
family friendly, and finally it will end with censoring points of view and
free speech.

It's 1984 all over again.

I'm not an anarchist; if something is illegal, then go ahead and cut it, but
censoring without a legal cause is a crime.

~~~
stuntkite
Social media platforms are owned by privately held companies. Their moderation
is up to their discretion. Sure there's a larger conversation about their
ubiquity and how social media posts relate to free speech, but as things work
now in the US, Facebook isn't breaking any rules by censoring anything for the
most part.

If you want to collect pictures of horrible crap and start a website, as long
as it's not illegal you can host it on your own.

