Facebook moderator: ‘Every day was a nightmare’ (bbc.com)
86 points by punnerud on May 13, 2021 | 62 comments



Honestly, I think it is high time people uploading content to the web were subject to standard laws, like any other citizen trying to share something in the public square:

You are posting graphic content in the square? Try it and see how soon the police come after you, and you'll at least have to answer before a judge.

You are uploading graphic content to the web? We call the police, nice and simple. And let the judiciary deal with it.

Otherwise what you have is this: people with absolutely no training, no real support, and bad salaries doing something they should not be doing: policing.


> You are uploading graphic content to the web? We call the police, nice and simple. And let the judiciary deal with it.

This is fantasy, even if it were desirable. In most places the judiciary struggles to enforce the laws we already have. In the UK something like 90%+ of burglaries and robberies go unsolved. The figures for rape are even worse. And those are crimes where there is a certainty they were committed by someone in the same country as the person reporting the crime (at least at the time) and where there is broad agreement they are serious crimes, worthy of police attention.

If you hand out really graphic leaflets in the street you might eventually be arrested and you might eventually end up before a judge (but this is in no way certain), but you'd really have to go out of your way to annoy people. You're more likely to just be told to stop it and have your leaflets taken away after a fight occurs. If you were handing out child porn then sure, you'll get arrested, but that's because you're standing right there and it requires very little police time to investigate the matter.


The problem isn't just people sharing illicit content. The problem is those who facilitate that.

The 'town square' notion isn't apt because that's a public space. A better analogy is a private party, as each and every platform is, at the end of the day, owned and managed by private legal entities.

If publishers were to verbatim publish whatever crazy, illicit content was mailed to them by their subscribers/readers, they'd be prosecuted within a fortnight. There's plenty of jurisprudence in the U.S. alone.

Online platforms escape legal liability because of Section 230 which states:

> No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

The big dilemma is that Section 230 needs to be reconsidered if you want to do something about the platforms that facilitate sharing graphic content, hate speech and so on. That's problematic because Section 230 has, arguably, allowed the Internet to become a free haven for whatever anyone wants to share and build.

The other option is passing the buck to the free market. That is: if you don't like what you see, move to a different platform. Or: don't use, or work for, a platform that doesn't align with your own morals and values.

While that's a nice sentiment, the reality clearly doesn't work like that. Why does someone decide to work for Facebook as a moderator in the first place? Is it because they genuinely want to make a career out of this? Or is it because social/economic circumstances push them in that direction (e.g. lack of other career options, tradeoffs preventing them from moving toward other career options, ...)?

The same is true of the rise of social media. Billions of people are connected online, and they might seemingly have a ton of agency. But the reality is that the vast majority of ordinary people spend the vast majority of their time on a handful of social media platforms, posting and sharing content, building an online identity... on a third-party platform which they don't own. Without hyperbole: they can't readily migrate lest they lose connection with the digital communities which are inextricably hosted on these platforms. That's just a matter of network effects.

All in all, in closing this diatribe: law enforcement and justice can't do much unless there's a clear, modern legal framework which can be readily applied to the middlemen.


It _is_ illegal, and they can and do call the police.

The problem isn't that it's not illegal, it's that they still need to delete the content.

It's like saying "just call the police" to someone who's just been mugged: they still need to get to a phone and cancel all their cards. You haven't solved the mugging problem, and they were probably going to call the police anyway.


If I ever need the help of law enforcement, I'd be quite happy if they weren't all tangled up in investigating people uploading "graphic content" on Facebook.

Especially given that in many countries Facebook doesn't contribute to public infrastructure all that much - e.g. their effective tax rate in the UK is below 2% [1]. So the deal is that I (as a taxpayer) support the infrastructure, while they outsource their workload onto it, "calling the police" for the pettiest things around the clock. Thanks!

[1] https://www.bbc.com/news/business-50011441


People uploading to the web are subject to standard laws - graphic content is already illegal in many cases, as well as against the terms of service of many websites. "You are uploading graphic content to the web? We call the police, nice and simple. And let the judiciary deal with it" is already what moderators do.

The problem here isn't that there aren't enough laws; it's that the scale of illegal content being uploaded is vast, and moderators are unprepared, untrained and underpaid because their jobs are considered unskilled, low-value labor. That is what needs to change.


Yes: what I am trying to convey is: "Facebook must call the authorities whenever anyone attempts to upload such content to their site." Yes, it is troublesome and costly, but it is the lawful thing to do.


Are law enforcement agencies even prepared for this? I do not think they would be able to handle the load anyway.


No. But we don't adjust laws or tell people not to report crimes just because the legal system can't keep up. It's better to have a record of it being filed even if it's dismissed right away.


What makes you think they are not already doing that?


As long as there are protected places where you can still legally post or distribute such material (there are legitimate reasons for doing so), I'm happy for it to be illegal in "most" places.


It only partly solves the problem, because good luck calling the cops on people in countries like Russia.


Well, if it is something like phishing US citizens, then yeah, Russian police have zero motivation to do anything about it.

Child porn though? Or something else that is an easy statistics boost? Russian police would be __very__ glad to act upon such information :)


Do you really want to involve the police and waste taxpayers' money because someone posted a photo with half a nipple visible, and someone got offended and reported it?

The police currently don't even solve most of the cases where there is actual damage (stolen items, violence, ...).


It's not a public square.

It's a private house open to the public.


Basically we need to accept that places like Facebook are effectively public spaces now. That means two things: "disturbing the peace" type laws should apply, and freedom of speech laws should apply.

These platforms don't want this, though. They don't want the public to own their platforms and they don't want freedom of speech to apply.

What will probably happen is the government will police the platforms for them, while they maintain ownership. We see this pattern a lot.


Is it time for the internet to get something like maritime law?

I could see each power controlling their own section in that case


Facebook is not the public square. And plenty of uploading is private.

Do you think sites like old Liveleak should be illegal?


No, but they should be under the law like any other citizen. Just that.


Sites like liveleak and similar are already under the law.

What Facebook wants to do is curate content beyond the legal minimum and that means they have to sift through a lot of legal but objectionable content.


We do not allow air traffic controllers to work full 40h work weeks because the job is taxing. Moderators should get the same treatment: the same 40h pay for a much more limited exposure time. Have psychologists weigh in, and legislate to protect workers.


We do not allow air traffic controllers to work full 40h work weeks because the job is taxing and they're directly responsible for the lives of thousands of people.

I'm not sure the analogy really holds even though I agree that it is not reasonable to expect these people to perform this role full time.


I'd further add to this: "We do not allow air traffic controllers to work full 40h work weeks because the job is taxing and they have a strong union".

Many other jobs (e.g. in healthcare) are just as taxing and equally responsible for lives, but fail to enjoy these protections.


I would argue that there is enough crap going on on social media that the people moderating are at least indirectly responsible for actual lives.

Bullying, threats, the list goes on.


I think it's a good analogy.

Ideally something like this should make it into a law or something similar, because trusting FB to do the right thing is probably not the way to go. (Having said that, a law that would apply to only a handful of companies might be difficult to put in place; and then they'd outsource the job to a "friendlier" country...)


We don't need to trust Facebook. You need to 'trust' individual employees to look for a better job, or not sign up in the first place.


> We do not allow air traffic controllers to work full 40h work weeks because the job is taxing.

I suspect the proper reason is because they might endanger others.

Outside of France, the law generally doesn't protect you too much from taxing yourself on the job.

Moderators should look for a different job, if they don't like it.


Job pay is dictated by the free market. If they work fewer hours, they will make less money.


Every time I imagine what it must be like to watch countless highly disturbing photos and videos, with the only goal being to prevent the public from seeing what you’ve seen (and not even being able to privately vent about it), I shudder. Timelines with your friends’ baby photos come at a cost.

I wonder whether content moderation jobs at Facebook et al. include sufficiently stringent pre-screening of candidates for mental fitness.


The idea that _any_ humans could be screened to do this job is questionable. Otherwise mentally fit people can become mentally unfit when constantly bombarded with images of child abuse, or worse.

It is the least bad solution we have to the most bad problem. I'm not saying you're wrong, but the discourse around how we can find the "right" people to do this job assumes that such people exist and aren't just desperate for a job.


I used to read my daily news on bestgore.com when it was still alive.

Some people would be perfectly fine doing this kind of job.


I used “mentally fit” here to mean not so much “normal” (whatever that means), and more like “fit for the job”.


You probably don't want to employ the type of people who have no problems dealing with the worst type of content the web has to offer.


Perhaps some of the people you’re referring to are otherwise unemployable for related reasons—by giving them paid work, FB could do public good.

Of course, that is assuming they truly have “no problems dealing with” that content, and won’t be triggered by it to do something terrible in real life. Not sure if mental health care is/can be advanced enough to tell precisely.


Like clinical psychopaths? I mean, I guess psychopaths don't have to be bad people.

I read a story about a cop who watched all seized child pornography to catalogue the victims and perpetrators. It probably requires a certain kind of person.


I don't understand why the hate for Facebook is so disproportionate to the hate for Twitter.

One is a joke platform where no one really takes each other seriously, and the other is a group of people who take themselves too seriously but really communicate in the same patterns, pretending to solve the problems of the world.

bunch of morons.


Not sure much can be added here that was not already discussed before:

https://news.ycombinator.com/item?id=20222594

If you care about your own well-being, then don't become a Facebook moderator.


> If you care about your own well-being, then don't become a Facebook moderator.

Do you realise that not everyone has a choice of which jobs they can take?

We require health and safety assessments in physically hazardous jobs to make them safer for workers. You aren't allowed to work in a nuclear power station without a dosimeter - we don't say "if you care about not getting cancer, then don't work with nuclear."


The argument here is to refuse to join as long as you're not supplied with a trustworthy dosimeter.


Some people are (unfortunately) desperate enough for work that they won't refuse a job despite the lack of a dosimeter. Which is why you should legislate that power plants must give dosimeters to their employees -- to ensure that those who cannot refuse cannot be placed in that position in the first place.

And this is entirely common-sense -- if a company doesn't provide protections for its workers, "just don't work there" won't solve the problem of the company not protecting the workers it does have. The workers who can refuse to work will (probably) do so, the company will then hire more desperate people, and the steady-state of this system is that such companies will be full of people who were unable to refuse -- thus the company can just continue not protecting its employees. This is basically the fundamental argument behind all employment protections -- substitute "dosimeters" for "lunch breaks", "minimum wage", "paid leave", or "overtime pay" and the same argument applies.



But what is the equivalent of a dosimeter for this job?


I knew a woman who worked with soldiers suffering post-traumatic stress, who was hired by Facebook to counsel some of these people. On the one hand a good sign they’re at least trying to look after them, but still a pretty grim indictment of the work.


I can help Facebook with this. Stop using traditional recruiters to find people for this. Advertise exclusively on 4chan. Yes, this will come with other risks but the moderators will not be traumatized. This will just be another day on the internet. Have the moderators utilize a group voting and incentivizing system to remove content so that those too desensitized or into the content won't just ignore it. Better yet, add an achievement system for those finding the content worth the most points. Let them earn ranks and titles. What should the monthly winner with the most achievement points get?


They do have an army of people to moderate content; the problem is that it is simply too overwhelming. Honestly, doing this job really hurts mental health in the long term.


TBH, of course I know that moderators exist, but I've never consciously thought about how awful what they see can be (child abuse, suicide, etc.).


"The Net interprets censorship as damage and routes around it" -- John Gilmore

"On the internet, nobody knows you are a dog" -- Peter Steiner

I think this is the pointy end of a deeper problem in the design of the internet.

If a priest or paediatric oncologist knows how to use Tor and can maintain anonymity skilfully enough that it would be too effortful for the law to track him down... then his ability to upload graphic content will scale faster than the ability to censor it.

I do not have a solution.


Is the content she viewed every day worse than an average page on 4chan? I am just wondering whether, because of filters, she is exposed to a higher level of bad material, or if it's the same kind of environment.


Does 4chan have absolutely zero filters, including for illegal content?


Illegal content, such as child pornography, goes against the rules and is removed. But graphic content (gore) is allowed on the random board.


Viewing 4chan all day is also mentally damaging.


Web 2.0 was a mistake, I guess.


The ability to upload illegal content to the web long predates "Web 2.0".


Back in the day, few people had the capacity to broadcast.

Even on a college campus with a decent network, you ran the risk of getting kicked off if you misused the broadcast address. Why? Because there were software and hardware limitations that meant such traffic could bring down or cripple the network.
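
(For anyone who never ran into it, the failure mode looked roughly like this - a minimal Node sketch, with the port and payload made up for illustration.)

    import * as dgram from "node:dgram";

    // One UDP send to the limited-broadcast address reaches every host
    // on the local segment at once; hosts abusing this could cripple an
    // older campus network.
    const sock = dgram.createSocket("udp4");
    sock.bind(() => {
      sock.setBroadcast(true); // broadcast must be enabled explicitly
      const msg = Buffer.from("hello, entire subnet");
      sock.send(msg, 0, msg.length, 9999, "255.255.255.255", () => sock.close());
    });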

All that has been overcome. And now everyone on the planet can broadcast 24x7, almost for free.

This is the big issue.

If every neuron in the brain had broadcast capability, how would such a brain work?

Some food for thought - https://www.edge.org/response-detail/10464


How about the ability to post anything on almost any website?

Forums, social media, meme sites, YouTube, comment sections everywhere, HN or its brother Reddit, etc...


It's never been possible to post anything onto almost any website. It's always been possible to post anything to any website that allows user uploaded content, and it's always been necessary for moderators to moderate that content.

"Web 2.0" changed nothing in that regard except making asynchronous requests possible with AJAX, allowing sites to become more complex and allow for realtime updating. But yeah, people have been posting gore, porn and CP everywhere since the dark ages.


Garbage collector: ‘Every day stunk’ (vaguely reputable source desperate for clicks)


As a hard-of-smelling person, I enjoy cleaning gross things and collecting garbage.

I'd hate to be one of these moderators.


I would hate to be a garbage collector or a Facebook moderator so I leave those jobs to people who can stomach them.

And also to people who can't, apparently, and "go public" about it.


Shadow ban + fake likes = ez win


Doesn't work with illegal content, because you'd have to keep hosting it to keep up the facade of the shadowban.
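
The visibility logic makes the problem concrete (a minimal sketch; all names are hypothetical):

    interface Post {
      id: string;
      authorId: string;
      shadowBanned: boolean;
    }

    // A shadowbanned post stays hidden from everyone except its author,
    // which means the server must keep storing and serving the content.
    function postsVisibleTo(posts: Post[], viewerId: string): Post[] {
      return posts.filter(p => !p.shadowBanned || p.authorId === viewerId);
    }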


Why do so many devs use React if it's the spawn of an evil empire? Rails has gotten so much shit talk lately because of Basecamp, yet React keeps churning out new devs.



