
The Impossible Job: Inside Facebook’s Struggle to Moderate Two Billion People - dsr12
https://motherboard.vice.com/amp/en_us/article/xwk9zd/how-facebook-content-moderation-works
======
smt88
Facebook didn't need much moderation until it started allowing strangers to
enrage each other through public comments.

My experience on FB is that 100% of the value of the platform comes from
parties I choose to engage with (friends, local businesses, groups), and
almost all of that value is wiped out by having to see comments from trolls.

~~~
bufferoverflow
What do you mean by "trolls"? Someone who disagrees with you? Like half the
country is "trolls"?

And what stops you from simply unfriending these trolls, if they are so toxic?

I'm not buying that narrative at all. What I think actually happened is that
groups of people always had irreconcilable differences, but these groups were
mostly physically separated, because people tend to form bubbles of like-
minded people. But then FB came, and everyone's opinions became visible. Half
of your coworkers that you went out drinking with turned out to be Trump
supporters or antivaxxers, and that just makes you and them angry.

The solution is to stop demonizing the other side, and start having rational
conversations. But good luck with that.

P.S. Calling them "trolls" will only drive them further away from your
politics; you're helping Trump get reelected.

~~~
komali2
I completely disagree. I grew up as the outsider in the south - liberal
ideology, atheist, etc. Through the Clinton, Bush, and Obama eras it was No
Big Deal. I'd have spats with friends and neighbors, and they'd end friendly.
In South Carolina I went from being called the "n-lover whiteboy" to
convincing my white friends that my black friends are just as cool, and have
good BBQs. We'd all disagree with each other, but it'd be respectful, online
and off. This was when Facebook/Myspace were just coming out, and "normies" were
learning about interacting online, and about how easy it is to forget about
the face on the other side of the screen. Even there it wouldn't flare out too
horribly.

Something's different, these last few years. Sometime around the tail end of
the Obama era is when I began noticing it, that it wasn't a "political
disagreement," it was a fundamental inability to reconcile values. It was
"freedom of speech vs political correctness," it was "nazis vs antifa," it was
"black lives matter vs all lives matter." People getting mass-defriended on
facebook, me being mocked by my southern friends for my liberal values (that
are the same as they've always been). Things have gotten _far_ more divisive,
and if I had to make a guess, I blame the political organizations paying
trolls to disseminate divisive fake information across the internet, and
prominent politicians engaging in disgusting divisive language (Duterte, Le
Pen, Trump, Netanyahu, Erdoğan, ISIS high-production-videos).

~~~
castlecrasher2
Maybe you're right. I wonder how much of it is non-stop political articles in
our faces whenever we consume social media, combined with the fact that we rely
so little on our neighbors and friends (because we don't have to) that our
understanding and sympathy skills have plummeted.

On the latter point, I wonder if lower human social contact and the ease of
escape online mean lower tolerance towards differing opinions, or even things we
deem wrong. It feels more and more like typed opinions are people shouting
into the sky, and when disagreed with they get offended and lash out or they
say "you're not who I was referring to," almost like that racist grandma
stereotype.

~~~
komali2
I'm not sure. I remember having the realization in ~2015 that you aren't
allowed to be a moderate, in ANY discussion, on reddit (back when that was the
only forum I used regularly). Be the discussion about racism, police
brutality, or even whether or not sourdough should be steamed when baked. You
are RED or you are BLUE. And if you're BLUE, you're FULL BLUE or you're a
nazi.

So like I'd come in talking about being a 3rd party voter and get shit on. Or
about how there may be middle ground, on, say, gun control, and get downvoted
to oblivion.

I don't know, polarization is powerful, I guess. I mean, it seems absurd to me
that the 300 million people in this country are literally split black and
white down the middle on all issues. Surely not. It defies logic that 150
million all perfectly support total access to guns while also opposing all
forms of abortion while also hating all forms of welfare (this definitely
can't be possible because shitloads of republican voters are on welfare). So,
something's up. My guess is that it behooves those in power to keep the
population divided? No idea.

------
j-c-hewitt
The risk with this is that eventually it will erode Facebook's legal argument
that it isn't responsible for the content that its users post to the network
or share through Facebook Groups.

As it is, there are tens of thousands or even more Facebook groups dedicated to
some kind of either illegal, rights-infringing, or grey market activity in the
US alone. The more that Facebook holds itself responsible for moderating and
controlling that content, the less weight that Facebook's arguments that it is
just a dumb pipe will hold. This has gone under-remarked in discussions of
this cluster of issues.

Strongly doubt that FB or any similar company will successfully navigate this
issue in the long run.

~~~
telltruth
They definitely screwed up how they handled the crises. Zuck & Sandberg's
apology tour has made them responsible for any future liability or slip-ups,
which are inevitable given the size of the operation.

There were 4 separate issues:

1. People posting fake news

2. Recommendations serving fake news due to similarity

3. Apps stealing data

4. Advertisers able to target vulnerable audiences

They should have kept these issues absolutely separate, taking
responsibility for all but #1. They should have stayed very firm on not
accepting blame for what people decide to share on their platform. Instead
they ended up accepting blame for everything and now they are screwed.

~~~
j-c-hewitt
Right. This is a giant multi-billion dollar target that is increasingly
portrayed as a machine for the mass production of liability. People seem to be
under the impression that internet companies putting themselves under the
cloak of free speech came purely from 'idealism.'

The real reason is to avoid being Napster'd. How can FB turn around to a
copyright holder and say "Ha ha, this content is totally not something we take
responsibility for -- you should take it up with the user" if it is actively
policing content on the platform with an army of staffers and actively shaping
what people can and cannot write on the platform? That looks a lot more like
editorial behavior than 'moderation'.

It's different when it's just automated systems that are dinging people for
potentially posting copyrighted material or a snuff film. The system doesn't
exercise human judgment on an active, day-to-day basis. To the extent that
services like FB took a hands-off approach, it had much more to do with
avoiding liability than with any iron moral commitment to letting people share
flat-earth and lizardman-related content.

------
EGreg
Why do social networks have to be so big and centralized?

Because we don’t have good open source software, like WordPress for blogs,
that people can install to run their own networks on their own servers. And
why not? Because a social network is far more complex than a blog, including
contacts integration, permissions, realtime notifications, and so on.

[https://m.youtube.com/watch?v=pZ1O_gmPneI](https://m.youtube.com/watch?v=pZ1O_gmPneI)

~~~
superkuh
Notabug.io is doing good work in this area. It's a reddit-alike with a
distributed architecture. Federated nodes host the website, and the in-browser
client also distributes posts peer-to-peer between nodes in the federation.
Additionally, voting is proof-of-work based.

It's early days, and the lens/filter system for individual users hasn't
rolled out yet, but the idea and implementation are far ahead of anything else
I know about.
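
A minimal sketch of how proof-of-work voting can work in principle; this is a
generic illustration, not notabug.io's actual scheme, and the difficulty value
and hash layout are assumptions:

```python
import hashlib
from itertools import count

DIFFICULTY = 20  # leading zero bits required; an assumed parameter

def pow_hash(post_id: str, voter: str, nonce: int) -> int:
    digest = hashlib.sha256(f"{post_id}:{voter}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big")

def mine_vote(post_id: str, voter: str) -> int:
    """Search for a nonce whose hash clears the difficulty target.
    Expected cost is ~2^DIFFICULTY hashes, which rate-limits vote spam
    without needing a central identity authority."""
    target = 1 << (256 - DIFFICULTY)
    for nonce in count():
        if pow_hash(post_id, voter, nonce) < target:
            return nonce

def verify_vote(post_id: str, voter: str, nonce: int) -> bool:
    """Any peer can verify a vote with a single hash computation."""
    return pow_hash(post_id, voter, nonce) < (1 << (256 - DIFFICULTY))
```

The asymmetry (expensive to mine, cheap to verify) is what lets untrusted
peers accept votes from each other without a central tally.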

------
jimnotgym
> The Impossible Job

Leads me to two possible conclusions

1) Facebook gets a free pass to ignore the law, because 2bn members requires x
million moderators and that doesn't make economic sense, is hard, etc.

Or

2) Facebook has to abide by the law, and since it doesn't make economic
sense...Facebook closes.

~~~
notriddle
The question is whether Facebook is a communications company or a media
company. Are they like GMail, where they aren't responsible for everything
that goes through their pipes, or are they like Vice, where they clearly are?

Facebook obviously wants the publicity of a media company _and_ the legal
protections of a communications company. Which is obviously stupid, but that's
the weird situation we're in.

~~~
WorldMaker
Facebook also obviously thinks of themselves as a tech company first and
foremost, so the question of moderation, and of scaling moderation, is considered
almost solely from the standpoint of it being a technical problem.

The problem is a social and human one. Newspapers have traditionally handled
it with editors and people and human policies. Facebook isn't inclined to
explore editors and people and human policies, not because they don't scale or
we don't know how to scale them (Wikipedia has scaled relatively okay, for
what that is worth as an example; Newspapers have used editors and researchers
for a long time), but because it's not an interesting technical problem to
scale people.

Facebook even had people hired in some of these moderation roles for a while,
but it was sexier for them to use essentially blackbox algorithms that turned
out to be easy for malevolent entities to game. Now they only seem to consider
"fixing" the blackboxes rather than finding solutions that use smart or
trained people.

------
tvh
>For example, its revenge porn policy and recently created software tool—which
asks people to preemptively upload nude photos of themselves so that they can
be used to block anyone from uploading that photo in the future—was widely
mocked.

This is hilarious. I understand the logic, but to believe that users will
willingly submit nudes as a preemptive measure seems to wildly underestimate
the ramifications in the user's decision-making process, and to overestimate
Facebook's reliability when it comes to privacy protection. Hey, let me upload
my nudes to a Facebook tool, they've been handling privacy really well so far,
said no one ever.
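
For what it's worth, the design as described doesn't have to keep the photos
around: the image is fingerprinted, and future uploads are matched against the
fingerprint. A toy sketch of the matching idea (the class and method names are
invented; real systems reportedly use perceptual hashes rather than the exact
cryptographic hash shown here):

```python
import hashlib

class ImageBlocklist:
    """Toy sketch of hash-based upload blocking. Production systems use
    perceptual hashes (PhotoDNA-style fingerprints) that survive resizing
    and re-encoding; the exact cryptographic hash used here only catches
    byte-identical copies."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def register(self, image_bytes: bytes) -> None:
        # Only the fingerprint needs to be retained; the submitted
        # image itself can be discarded after hashing.
        self._hashes.add(hashlib.sha256(image_bytes).hexdigest())

    def is_blocked(self, image_bytes: bytes) -> bool:
        # Checked at upload time for every new photo.
        return hashlib.sha256(image_bytes).hexdigest() in self._hashes
```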

------
donmcronald
Part of the problem is Facebook doesn’t want anyone to have a negative
experience because those people disconnect, at least temporarily.

I can go on Facebook, praise Hitler, and get likes. I think that skews
people’s perception of what’s acceptable and what’s not. Maybe Facebook needs
a way to give negative feedback.

It’s such a hard problem though. The quantity-over-quality friend groups
everyone has now make it difficult to trust others. When I was growing up, if
one of my good friends disagreed with me it was a good indication I needed to
reevaluate my opinion. I think that’s been lost.

~~~
eksemplar
I’m not sure that is really Facebook’s problem. If I saw you praising Hitler,
I’d unfriend you, so it’s not like there aren’t social consequences on
Facebook.

I think the real issue is the users. I mean, Facebook makes it easier to find
people who are like you and believe the things you do, but Facebook didn’t
make you believe the earth was flat.

The fact that people believe the earth is flat shouldn’t be blamed on social
media, because it’s a much bigger problem than that. Those people have managed
to grow up wrong, with terrible education and horrible life choices, and maybe
they find each other in groups on Facebook but they sure weren’t made this way
by the internet alone.

~~~
donmcronald
I think Facebook has some responsibility because they're such a large
platform. The problem is constant positive reinforcement of everything, even
things that are unacceptable. Part of the reason those people grow up "wrong"
is that they're growing up on social media, where the feedback they get is
unnaturally skewed to agree with them.

I don't think it's intentional, but I think it's going to be impossible for
Facebook to solve with watered down feedback like the "empathy" button. They
need a way for people to express disagreement, but it's going to be difficult
to do that without reducing engagement, so it probably won't happen.

------
coldcode
If Facebook's revenue weren't tied to keeping people on the service long
enough to see the ads that prop up the stock price, they would have more
options. Maybe they should consider charging instead of serving ads. But it
would likely reduce their overall revenue.

------
siruncledrew
Think about the scale of 2.2B users. Imagine how difficult it would be to
moderate just America. Now imagine handling 6.8 x America. Or 1.6 x China or
India. That is a massive undertaking for any centralized entity, probably
bigger than many countries' national security efforts. Is
it really worth it to Facebook when their primary concern as a corporation is
their bottom line?

~~~
dralley
In which case the argument becomes

"If a platform/service provided by a corporation becomes large enough to
credibly threaten the fabric of society and/or government, and said
corporation won't tamp down on said problems, should they be allowed to
exist?"

------
twblalock
The impossibility is not just a problem of scale.

Even if Facebook does everything right, and moderates every single thing they
aim to moderate, they will still be blamed by conservatives and/or the alt-
right if they censor too much, and by liberals if they censor too little.
There is no type of moderation that will make all sides happy.

~~~
Verdex_3
Wait, so the _conservatives_ want less censorship and the _liberals_ want more
censorship?

Is this one of those mirror universe things?

~~~
firemancoder
To be fair it has flipped considerably, if you consider the extremists in each
group.

In the 2000s the extremist conservatives wanted to censor everything (mostly
due to religion) and liberals wanted freedom of expression.

Now it seems that extreme liberals want to censor "hate speech", which is
absolutely valid; however, the lines are moving or getting blurry to the point
where they want to censor conservative ideas altogether, and conservatives
want "freedom of expression".

I've seen this shift and I'm sure many others have as well.

------
wpietri
Impossible only because they have underpriced the service. The notion that
only good things will happen when you create a context and let humans interact
is at best naive. In this case, I think it's willfully blind. I was talking
recently with people who set up the first web-based chat; abuse was almost
instant, and they saw what we'd now call a doxxing within the first year. And they
were hardly the first people to see abuse happen online.

In the real world, if you want to have something as anodyne as a parade, you
have to provide for sufficient security to keep things safe. There's no reason
that companies pocketing billions in profits should be exempt from this
principle.

------
fertilityrspctr
The thing that makes it hard is differing cultural standards between parts of
the world. You either acknowledge them and push moderation decisions down as
far as possible to local authorities, users, and group moderators or (what
they seem to be doing) build an authoritarian, culturally-imperialistic global
AI censorship algorithm that makes people mad and (hopefully) causes them to
leave.

------
pimmen
The smartest engineers in the world have been figuring out how to make people
click on ads. If they have to figure out how to quell genocidal movements, I
think that's an improvement.

------
arcanus
Not impossible, just impossible for hordes of humans.

This is going to be a major use-case for AI, which can scale to billions of
people.

~~~
CharlesW
The real solution is probably hybrid (i.e. AI-assisted human moderation).

I've assisted with moderation for a bunch of communities over the years, and
IMO the problem isn't that there aren't enough humans (for every bad actor in
a community there are _many_ good ones) but that Facebook doesn't provide
good actors and moderators with anything beyond the most basic tools
imaginable. Had Facebook invested even 1% of their resources into this, I
don't believe that Facebook would be the raging culture infection that it
currently is.
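
A minimal sketch of what AI-assisted human moderation could look like; the
thresholds and names below are illustrative assumptions, not anything Facebook
has described:

```python
AUTO_REMOVE = 0.95  # assumed threshold: near-certain violation
AUTO_ALLOW = 0.05   # assumed threshold: near-certain benign

def triage(post: str, score: float, review_queue: list) -> str:
    """Route a post by model confidence: act automatically only at the
    extremes, and spend human attention on the ambiguous middle."""
    if score >= AUTO_REMOVE:
        return "removed"          # model is confident enough to act alone
    if score <= AUTO_ALLOW:
        return "published"
    review_queue.append(post)     # humans make the judgment call
    return "pending_review"
```

The design choice is that the classifier's job is not to decide everything but
to concentrate scarce human judgment on the cases where it is actually needed.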

~~~
SllX
So I've never managed a subreddit or a Facebook page/group. I once managed a
forum where my only options were to Delete or Edit comments; I only used
Delete, and would message the user to explain why. It was a small forum,
though, so if it were, say, 3x or 5x larger, I'm not entirely sure I could
have moderated the entire thing by myself.

I'm explaining my ignorance on this subject matter because I want to ask you,
and you seem like someone who might know, how do Reddit's and Facebook's tools
compare? From my point of view, Reddit is pretty well moderated, each sub
having its own rules that, at least in the subs I frequent, are actually fairly
well enforced. They have tools like automoderator, and I've heard of subs I
don't frequent preemptively banning users for having too much karma in subs
they dislike. Probably overkill, but given the types of communities they are
it seems to work for their purposes regardless of whether I agree with them or
not.
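
For reference, the kind of declarative rule tooling being described can be
sketched roughly like this (a hypothetical, simplified rule format; this is
not Reddit's actual AutoModerator, which is configured in YAML, and the field
names are invented for illustration):

```python
# Hypothetical automoderator-style rules: declarative conditions
# mapped to automatic actions.
RULES = [
    # Hold comments from brand-new accounts for manual review.
    {"field": "account_age_days", "test": lambda v: v < 2,
     "action": "hold_for_review"},
    # Remove comments linking to a known spam domain.
    {"field": "body", "test": lambda v: "spam.example" in v,
     "action": "remove"},
]

def apply_rules(comment: dict) -> str:
    """Return the action of the first matching rule, else approve."""
    for rule in RULES:
        if rule["test"](comment[rule["field"]]):
            return rule["action"]
    return "approve"

# e.g. apply_rules({"account_age_days": 1, "body": "hi"}) == "hold_for_review"
```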

If I look at a comment thread on Facebook, there doesn't really seem to be any
kind of conversation happening, just "conversation". There's some basic
threading now, but even with that a Facebook "conversation" seems to just be
one long stream of human consciousness.

Now, there is a difference in scale: Facebook has over a quarter of the Earth's
population on its platform. That said, there is also a difference in scale
between Facebook's resources and Reddit's, so is there some reason they choose
not to develop new and better mod tools? Or are their mod tools cutting edge
already and the percentage of humanity on their platform is just so large that
even with cutting edge mod tools it just isn't enough and throwing more money
at it doesn't make it any less _hard_?

