
Facebook to ban white nationalist content - anigbrowl
https://newsroom.fb.com/news/2019/03/standing-against-hate/
======
talmand
At what point do such things change into a public discourse problem?

Do we wait until it can be shown that companies such as Facebook, Twitter,
Google, etc. influenced elections because they refused to serve information
from a candidate they didn't like? If we aren't already there, it won't be
long before we are.

What do you think would happen if these "private companies" with such deep
hooks into our communication infrastructure suddenly decided to remove all
data associated to the Republican Party? For that matter, the Democratic
Party?

It seems to me that Facebook and Twitter are trying to have it both ways. They
can choose to police the content provided by their users but can't be held
responsible for said content? Are they a publisher or a platform?

I don't think old thinking, based around old methods of communication,
applies to what we have today; it requires new thinking. These aren't like
newspapers sold by kids on the corner in a city that can have dozens of
newspapers countering each other. Imagine if there were only three newspapers
in the entire country, soon the world, controlled by a small group of people
who wish to use their publishing for their own agendas.

Tim Pool is right: at this rate, sooner or later, the Feds will come knocking
and shut that party down.

~~~
sagichmal
> What do you think would happen if these "private companies" with such deep
> hooks into our communication infrastructure suddenly decided to remove all
> data associated to the Republican Party? For that matter, the Democratic
> Party?

They wouldn't do that, because there would be justified public outcry.
Luckily, as moral agents, we humans are capable of differentiating "general
political party" from "white nationalists", and can target the latter and not
the former.

~~~
hcurtiss
Are you sure? Do we even know, for sure, what "white nationalist" means? I'm
white, and I favor a government that's principally oriented toward advancing
this nation's interests. Am I a white nationalist?

~~~
clucas
You're right that some of these terms have ambiguity, but your example is
silly - it's well-accepted that the term "white nationalist" stands on its own
and means something different from "white and a nationalist."

But even setting aside your specific example, I don't believe that ambiguity
should paralyze us into inaction. I think it's fair to say "OK, we'll ban
anyone who advocates distributing political power based on race, with the
white 'race' getting the most power... when people cross that line won't
always be clear, but we'll do our best." For private action especially, we
should not let the perfect be the enemy of the good.

~~~
NotAnEconomist
> it's well-accepted that the term "white nationalist" stands on its own and
> means something different from "white and a nationalist."

Many right-wing people claim that in practice there's no such distinction:
they're accused of being "white nationalists" simply for being "white and a
nationalist".

Edit:

If people downvoting me think I'm wrong, explain why people like Jordan
Peterson, who merely espouse non-leftist positions and happen to be white,
routinely get accused of supporting the alt-right and neo-Nazis.

For the people who doubt what I'm saying -- here's a video of Jordan Peterson.
This is who the leftists routinely call "neo-Nazis" or "alt-right": moderates
who refute their positions and calmly assert values like personal
responsibility over collectivist victimhood culture.

[https://www.youtube.com/watch?v=o2bFzK2EdIo](https://www.youtube.com/watch?v=o2bFzK2EdIo)

Edit 2:

I'd originally written "conservative"; I'm not sure Mr Peterson would describe
himself in such terms. I've made it more neutral.

~~~
manfredo
I concur that terms like "Nazi", "white nationalist", "alt-right" and similar
have essentially morphed into "person I don't like".

And this applies to both ends of the political spectrum:
[https://www.youtube.com/watch?v=nYlZiWK2Iy8](https://www.youtube.com/watch?v=nYlZiWK2Iy8)

~~~
josteink
> have essentially morphed into "person I don't like"

I’d say it’s even closer to “people I disagree with”.

~~~
kgwgk
For many people the latter implies the former.

------
btilly
In principle, who could be opposed?

In practice, how will this actually work out?

True story. I have a friend who got a temporary ban from Facebook (I think 90
days?) for commenting on a story about a Texas billionaire paying to hunt
endangered animals, "How much would it cost to hunt Texas billionaires?" That
was considered a violation of their anti-bullying stance. Knowing her, it
wasn't. It was sarcasm.

Moving on to this policy, I'm happy to see neonazis and the KKK have a hard
time. But there are people in the UK and EU who would see Brexit as being a
separatist cause motivated by racial animosity. Will we see discussion of a
future such measure banned by Facebook on such grounds? And remaining banned
even for people who support it on economic grounds because they think that
having to follow EU policies on GDPR, copyright, and so on will be a net
negative for the UK?

Facebook is going to have some difficult conversations ahead. And the more of
these lines that they draw, the more difficult boundary cases they will run
into.

~~~
godshatter
In principle, I am opposed. I don't think hiding unwanted dialogue is very
helpful in the long run. It just makes it fester somewhere else that isn't as
visible. If Facebook feels they have to do something about it, then I'd
suggest just flagging it. That gives people more information that they can use
however they see fit. There is also the risk of collateral damage, like
your friend ran into.

I also oppose it on general grounds in that I'm a big proponent of freedom of
expression. Facebook can do what they want, we're not talking about the
government here and I understand that, but I would rather they didn't go this
route. I don't want to live in a society where what can and can't be said is
strictly regulated, either by force or by general consent. I seem to be one of
the few who still feels that way, though.

There is also the practical concern of what happens when the ideas of what is
acceptable and what isn't change over time. Who knows which opinions you hold
now might become anathema at a later date? For example, the opinion I'm
expressing now used to be a lot more common than it is today.

~~~
joshuamorton
> In principle, I am opposed. I don't think hiding unwanted dialogue is very
> helpful in the long run. It just makes it fester somewhere else that isn't
> as visible.

Let's prod at this:

\- Should ISIS recruitment videos and propaganda be allowed (nay, encouraged)
because deplatforming them will make them put their recruitment pamphlets
elsewhere? Where else do they put their recruiting materials?

\- Should we publicly and loudly encourage people to self-harm, ideate
suicide, etc., because if we don't, they'll just find secret places to do it?

\- Should we consider white supremacist recruiting materials "dialogue" at
all? If we're talking about dialogue, as in a formal debate between Richard
Spencer and pretty much anyone else, one on one, that's potentially
interesting (also embarrassing for Spencer). On the other hand, a video posted
by a white supremacist isn't dialogue. It's even less dialogue when they can
delete comments they can't aptly respond to, and when the people there are
already interested (Richard Spencer videos weren't ever going to cross _my_
feed). You're calling this dialogue, but it's really a one sided dog and pony
show with maybe some unlucky sacrifices. Perhaps we shouldn't ban dialogue
between white supremacists and normal people, but you're not advocating for
dialogue; you're advocating for Facebook (and whomever else) to support
(distribute, platform, etc.) white supremacist propaganda and theater.

>I don't want to live in a society where what can and can't be said is
strictly regulated either by force or by general consent.

How do you propose to create a society where people aren't allowed to dislike
you? That's what you're asking for, essentially. "Freedom from consequences"
is the common way of putting this, but really what you're asking for is an
infringement on my freedom of association. If you piss enough people off,
that'll come back to bite you. What's the alternative? That people can't hold
you accountable for your previous words? That quickly devolves into a society
of 4chan, which, well, I'm not sure why you'd want to live in that.

>For example, the opinion I'm expressing now used to be a lot more common than
it is today.

How certain of this are you? Did black people or women have the freedoms you
suggest 100-150 years ago in the US? Was _anyone_ advocating for that?

~~~
orblivion
What about this for an idea (and I'm not sure I love it, but it's just
spitballing): We allow expression of any _ideas_, but we put a limit on
_recruitment_. I.e., anybody gets to say "White people need a home country" or
"the Caliphate shall reign supreme", but nobody gets to say "come to the
meeting at 10th and Main 5 o'clock". That way we at least know what's on
people's mind so we can engage them but we're putting a limit on helping them
grow the movement (until and unless we decide they have some sort of valid
point, which is part of the point of letting them speak).

~~~
KozmoNau7
The really radical stuff doesn't happen on open platforms; they're more of a
tool for early exposure.

The actual planning and such happens on private sites with secret forums.

As a real-world example, a Danish right-wing nationalist network called ORG
was exposed in 2011, after having allegedly been active since the mid-80s. Its
members counted a number of well-to-do individuals, as well as members of the
police and known violent white nationalists. The group ran and had control
over several sports clubs, including martial arts and gun clubs, which they
used for training members in street fighting and tactics.

They ran a database of basically every politician and public figure who ever
espoused left-wing views, and thousands of ostensibly left-wing citizens as
well, labeling them as "traitors" to be "dealt with".

At least one of their members was from the armed forces in some security-
related capacity, and they had reasonably good opsec procedures in place.

I have no doubt that even after being exposed and having its members publicly
named, the group or a new equivalent still exists, with new secret sites and a
heightened level of paranoia towards possible infiltrators.

By deplatforming these people and driving them further underground and into
paranoia, we limit their ability to attract people through social media, for
fear of exposure.

~~~
orblivion
Do you have a way of going about this that isn't authoritarian? (Extending the
definition of "authoritarian" to Facebook acting within its own platform).
EDIT: Sorry this was really vague. By authoritarian in this case I mean that
some authority has to decide what is and isn't an acceptable opinion (as
opposed to, say, deciding on an impartial process that somehow weeds out bad
opinions).

And somewhat related, does this mean you have as dismal a view of humanity as
it seems, that you don't want people exposed to dangerous ideas? This all
seems necessarily paternalistic. EDIT I can't imagine, for instance, trusting
in democracy with such a view.

~~~
Vraxx
I agree with KozmoNau7 here. The many proposals to construct some system that
will somehow weed out the "bad" ideas seem misguided to me. The reason is that
these systems necessarily treat all ideas as equally valid inputs by requiring
that _anything_ be allowed. This is done when we have bountiful
evidence to the contrary, evidence that fascism and racism lead to horrendous
consequences if left to spread through disingenuous tactics and deception.

In fact, we humans already form a decent system for weeding out bad
ideas/opinions, we just don't listen to our own past experiences on the
matter. We found out that fascism is putrid and yet now we try to come up with
a new system that will weed it out instead of just chucking it into the
garbage bin and moving on.

It's not paternalistic or a dismal view of humanity or that "humanity cannot
be trusted with such a dangerous idea", it's because the people advocating for
it constantly lie about it and intentionally trick/indoctrinate others into
following. Under different circumstances growing up, I fully believe
someone could have deceived me into believing it, so I don't think I'm better
than anybody who got sucked in. We can trust in democracy as long as we take
proper precaution against things that prey on the freedom of expression and
association in order to remove those rights from others.

Put another way, what real benefit is there to "freedom of speech, except for
advocating fascism" as opposed to "freedom of speech, no exceptions"? The US
already doesn't have pure unrestricted free speech because there are
exceptions made for outlier situations where free speech is not protected in
efforts to secure the safety of others. That hasn't led to total collapse or
censorship.

~~~
nilskidoo
I completely agree with what you wrote. I think too many people erroneously
presume that free speech somehow guarantees an audience and acceptance, and
when neither happens, they feel their rights are being violated.

Personally, I feel there should be legal protections for all speech, but not
protections from social ramifications. By this logic, I am totally fine with
the idea of punching fascists, which itself could draw assault charges;
but then, how many juries would disagree with the reasoning for the violence?
Let the masses decide for themselves, basically. It's what bothered me about
the firing of James Gunn and Roseanne both despite their polar opposite
politics. The companies that employed them cared so much about what the public
thinks or feels that they couldn't be bothered to let the public decide for
themselves to support either celeb or not. (And I know Gunn has since been
rehired, but this happens so often, my point stands.)

------
duxup
I suspect there is some value in pushing such content off of large sites to
reduce the surface area available for recruitment.

I used to think that pushing such content off a site or forum was futile, and
that calling it out for what it is had more value. I'm not sure that's the
case anymore.

If you look at platforms like Reddit, where you have groups in subreddits
calling for race wars, general hatred, and violence, and of course the lesser
efforts to just normalize hate, there's no discussion anyway. In their own
subs, any hint of anything less than absolute commitment to their talking
points of the day, ever-changing theories, and general hatred is an instant
ban from their subreddit(s). Then they venture out, spam their fake news
articles, take over other subs, and push their message with endless streams
of new accounts. There is no discussion, just constant spam, memes, etc.

We've seen topics like Gamergate, incels and such (while not that tasteful to
begin with) quickly pulled to the extreme and go from something about "gaming
journalism" or general angst to an identity movement steeped in fear, hate,
harassment, and, for some, actual violence.

I suspect there may be some value in ushering these people away, if only to
prevent recruitment of others who might otherwise absorb their message that
fear, dissatisfaction with life, and other problems can be explained by hate.

~~~
seventytwo
I spend most of my time on Reddit, and you’re spot on here. Deplatforming is
the most effective way to fight the spread of hate. It’s literally a cancer,
and it’s going to take the chemotherapy of stepping on the toes of free
speech to protect our country and society.

Sucks, but that’s what it takes - especially when they wrap themselves up in
the flag and hide behind the first amendment to spew their hate.

~~~
starik36
You are not forced to look at subreddits that offend you.

> hide behind the first amendment

I think people are entitled to invoke the First Amendment; you misunderstand
what it is.

~~~
seventytwo
Not if your speech is advocating for the erosion of the rights of others’
speech. Just as there is a paradox of tolerance, there is a paradox of free
speech, and the hate groups use it to great effect in this country.

------
dsfyu404ed
>Our efforts to combat hate don’t stop here. As part of today’s announcement,
we’ll also start connecting people who search for terms associated with white
supremacy to resources focused on helping people leave behind hate groups.
People searching for these terms will be directed to Life After Hate, an
organization founded by former violent extremists that provides crisis
intervention, education, support groups and outreach.

The fact that they're readily going down the slippery slope of propaganda by
identifying people with Wrong(TM) beliefs and directing them at the Right(TM)
beliefs does not sit well with me. Censorship aside, this is a different issue
and a very dangerous precedent for a "platform" to be setting IMO.

~~~
ratling
There is no slippery slope argument when you're talking about literal Nazis.

We went to war on the subject.

~~~
dmix
This comment perfectly encompasses _why_ it's a slippery slope. The term
"literal Nazi" is used so broadly in political discourse. Politicians have
been called Hitler and Nazi ever since Hitler and the Nazis became a thing.

The massive danger is going to be the countless false positives this thinking
generates, like the craziness of arresting the guy whose dog made a Nazi
salute in a video.

He would be denied access to social media because of stupid interpretations of
a joke.

This isn't about defending or persecuting Nazis, it's about protecting the
public from systems of abuse.

Not to mention the ineffectiveness of online censorship. The only thing worse
than Nazis organizing on public platforms is forcing them to congregate only
with other Nazis on radical websites where no one will challenge their
worldview. But that's a totally different discussion.

~~~
ratling
That's a lot of words to type to say, "I support Nazis being able to openly
operate and recruit on Facebook."

~~~
dmix
Yes, that's exactly what my comment said. Better ban my FB account! (/s)

If they promote violence, or brigade, harass, or attack other members
directly on FB, then yes, they should have their pages shut down. That is
more than sufficient, rather than banning any language that falls into a very
broad and vague category.

The simple fact is that if you asked various people in America to define a
"literal Nazi", you'd get a hundred different answers about who should fit
into that category.

~~~
qualipetv
Being a Nazi is literally promoting violence, and in any case it amounts to
an endgame involving genocide. I agree there should be no censorship. But
then you should be very vigilant not to let Nazis control the discourse as
they do now. Nazis are not to be talked with; they need to be scared out of
the public space so that either they reevaluate their positions, change, and
can be reaccepted into society, or they live out pitiable lives in solitude.

And as for definitions: I think it's very easy to identify fascist thought.
Favorable views on ethnic cleansing, and unprovable stories about mighty
groups of people, should never be tolerated.

------
andrewla
I am also nearly a free speech absolutist, but I feel that platforms like
YouTube and FB are nearly in scope. I can't say specifically what the
criteria are, but there's some notion of how much responsibility they take
for the content.

Something like the notion of a DMCA shield, where a platform can expose
third-party content without consequences so long as it responds to take-down
requests, should mean that the platform does not get to do filtering. If they
get to do filtering and curation, then they are also directly responsible for
copyright violations or libelous use of their platform; the shield shouldn't
apply unless it cuts both ways.

That is, by accepting some immunity from the consequences of free speech, they
are then subject to free speech restrictions on their enforcement of their own
opinions. A newspaper, for example, is not subject to the same restriction --
they are responsible for all content they publish, so can choose to publish or
not publish whatever they want.

This isn't a precise framing, but I'm trying to capture some of my general
thoughts on why I find it unappealing that YouTube and Facebook can filter out
content that they deem offensive.

Facebook has a natural filtering mechanism through the ability for a user to
restrict what they see. If you find content offensive, unfriend or block the
person and you're done. YouTube can exercise some discretion in auto-play and
recommendations, but they shouldn't be able to ban content unless they are
required to do so by inheriting legal requirements.

~~~
wvenable
Should anyone be forced to _pay_ for someone else's speech that you don't
agree with? Because that is what you're asking these companies to do.

I think the explosion of hate speech on the Internet is entirely because
someone else is basically paying for people to post whatever they want. They
don't need to create their own newspaper and get advertisers. They don't need
to print signs. It's all free for them. You can consume as many services as
you possibly can given your free time, and those with more free time
therefore get more free (as in beer) speech.

~~~
andrewla
> Should anyone be forced to pay for someone else's speech that you don't
> agree with? Because that is what you're asking these companies to do.

If they are going to get the benefit of _not paying_ for speech that they
would ordinarily be penalized for (libel, copyright infringement) then that's
the cost, yes.

I think they should be able to impose arbitrary limits on the cost of serving
content, either the size of the content or the bandwidth consumed by serving
it, but by invoking any sort of copyright/libel safe harbor, they should lose
their ability to curate based on content rather than metadata.

Sort of like being able to opt in to being a common carrier -- it carries both
benefits (immunity from most criminal and civil prosecution for content
served) and obligations (complying with takedowns and not discriminating based
on content).

~~~
wvenable
> If they are going to get the benefit of _not paying_ for speech that they
> would ordinarily be penalized for (libel, copyright infringement) then
> that's the cost, yes.

Why? That's an interesting statement, but I don't see how one follows from the
other.

If you create a community online, should you not have some control over that
community? Hacker News frequently censors and bans people in order to maintain
a certain community. What if you created EverybodyBeNice.com that wants only
"nice" posts? Is it not fair that these companies can enforce their own
standards while also getting protection from libel and copyright infringement
committed by their users? I don't think there is a good argument for why one
should be tied to the other.

I think arguing that every single website should be a common carrier takes
that concept way too far.

~~~
andrewla
> Why? That's an interesting statement, but I don't see how one follows from
> the other.

You're right, of course, I don't think there's a direct implication between
the two. There's a difference of degree, not kind, to some extent. My
intuition is that the value that YouTube derives from being a place where
people can post videos (and, somewhat importantly, monetize them) is much
higher than the value that HN derives from people posting content.

EverybodyBeNice.com is an interesting thought experiment. It wouldn't be very
convincing, though, as a shield, if there was no effort put into active
moderation not only for "niceness" but also for copyrighted or libelous
content. The idea lives in that grey area between "completely curated content"
and "just a place where people can post content". I guess my feeling, roughly,
is that at some point during EverybodyBeNice.com's rise to serving billions of
videos, they can no longer credibly claim that they are preserving the
"niceness" unless there is some sort of review process that by its nature must
also police for other kinds of illegitimate content.

~~~
wvenable
> My intuition is that the value that YouTube derives from being a place where
> people can post videos (and, somewhat importantly, monetize them) is much
> higher than the value that HN derives from people posting content.

I don't see how "value" contributes to the argument. YouTube makes money
showing videos to people that other people have posted. Hacker News makes no
money linking to articles that other people have written and showing comments
other people have written. But in neither case is there a moral or legal
imperative to allow _any_ video or post on either platform based on any
concept of value. Value doesn't (and shouldn't) matter.

> It wouldn't be very convincing, though, as a shield, if there was no effort
> put into active moderation not only for "niceness" but also for copyrighted
> or libelous content.

Whether or not such a site moderates for niceness, copyright, and libel, I
don't think it matters. You can't assume perfect moderation, and some mean
comment, copyrighted content, or libel might get through. As long as the site
makes an effort to remove copyrighted and libelous content when it's
discovered, they should be shielded by the law. Furthermore, I think that's
morally acceptable. It's incredibly difficult to filter for niceness or
copyright.

> The idea lives in that grey area between "completely curated content" and
> "just a place where people can post content".

There is no such thing as "just a place where people can post content" in the
sense that you are going for. Every place is owned and controlled by someone.
They have no moral obligation to allow any content. It's also perfectly
reasonable for them to also be shielded from copyright and libel if they make
a good faith attempt to remove it.

~~~
andrewla
> There is no such thing as "just a place where people can post content" in
> the sense that you are going for.

It's a bit contrived, but the internet itself is such a place. I think it's
widely accepted that the internet itself, and to some extent DNS, is a common
carrier in the sense that it shouldn't discriminate based on content, just on
bandwidth. I mean, I don't want to start a whole net
neutrality thing here, but a router or the operator of a router does not face
any consequences for routing traffic, no matter how offensive or illegal that
traffic is.

Web sites that are principally public repositories, where people can upload
their content, seem to be a borderline case of that -- if they want to offload
the risk of hosting the content, then they can do so, but it seems to me that
they have to provide a service to society in return, namely, some sort of
neutrality about the content posted.

I think it's equally reasonable to say that they should have no shielding at
all -- society has no vested interest, necessarily, in allowing such a site to
exist that has absolutely no say over what is posted on it. Allowing users to
contribute to a site is fine so long as the site, in the end, bears the
responsibilities for societal negatives associated with posting content
(primarily libel and copyright, the two unambiguous restrictions on free
speech).

~~~
wvenable
> It's a bit contrived, but the internet itself is such a place.

The Internet isn't a place and that's what makes it a carrier. Just like the
telephone system, the Internet connects you to other services.

Websites are _not_ principally public repositories; they are privately owned
services. If they allow you to post, that doesn't make them public in an
ownership sense, or make them a carrier.

> If they want to offload the risk of hosting the content.

But they aren't. The law doesn't allow companies or individuals to get away
with hosting copyright infringement. It simply makes it possible for
companies that accept user-uploaded content to function at all, as the
alternative would make that impossible. That's all. Implying this requires a
moral imperative to allow any content is a stretch at best.

------
function_seven
I consider myself a free speech absolutist. Or very nearly so.

But that runs both ways. Not only do I think that almost all (to borrow a
mathematical term) speech should be permitted, but also that the onus is on
the speaker to make themselves heard or to find an audience. No company or
platform owes them anything.

In earlier times, that meant that no newspaper or printing press was required
to print everything sent their way. Today it's the centralized websites that
also don't have this obligation.

If FB decides to ban bigots, or YouTube wants to kick anti-vaxxers off their
site, so be it. You can host your own video if you'd like, or maintain a
personal blog site.* You are _not_ entitled to widespread distribution. Never
have been.

* Which leads to a place where I'll agree on obligations: root infrastructure like DNS or network connectivity. Those are the common carriers of our Internet.

~~~
Inu
"Goebbels was in favor of free speech for views he liked. So was Stalin. If
you're really in favor of free speech, then you're in favor of freedom of
speech for precisely the views you despise. Otherwise, you're not in favor of
free speech." \- Noam Chomsky

~~~
bjt2n3904
Thank you. Free speech isn't just an amendment for the government, it's a
principle for society.

If Facebook were a television station with limited air time for a single
track, it would be perfectly valid for them to decide (reasonably) who gets
air time. However, Facebook, Twitter, and YouTube have infinite tracks and
time. I don't think the OP's argument is as strong with that in mind.

~~~
thieving_magpie
Well now you're violating my free speech by telling me I can't decide what
content I host on my website.

~~~
vokep
The difference is that YouTube and Facebook are not publications themselves,
but platforms that allow you to publish. So you already aren't deciding what
to host; as a user-content-driven site, you by default allow anything, then,
hopefully for moral reasons but necessarily for legal ones, you ban illegal
content. Beyond that, it still is a privately hosted thing, so I'd still
agree that your self-run user-content-based site may ban whatever it likes.
_However_, when you reach a general-use size, where people rely on your
platform for free speech, things certainly feel different. I'm not sure what
the right move is, but I don't think it's as simple as "my private site is my
private site, that's the end of it".

~~~
dr-detroit
What about the legal risk associated with letting the Frog Brigade post their
terrorist cookbook on your platform? Does Zuckerberg have to come to their
house and load the bullets into the gun for them so they can make a statement
about how angry they are? Because he made an app for people to cheat? I fail
to see the connection.

~~~
vokep
>What about the legal risk associated with letting the Frog Brigade post their
terrorist cookbook on your platform?

Like I said, banning illegal content is the first step that anyone ought to
take. Generally this is done via moderation staff of the site + user submitted
reports. If you're providing a free platform and terrorists start using it as
their communication channel, then you're only at legal risk if you do nothing
to stop them. However, I'm imagining things like the NZ shooter's 8chan post,
where a direct threat is made or a direct implication of violent acts to come.
Posting a "terrorist cookbook" actually is a good example, as it's sort of
right on the line. It seems to be the sort of information that only has
negative uses; on the other hand, it's just information, as opposed to an
actual threat. I'm not really sure if such things should be allowed... it
could be helpful to terrorists, which is bad, but censoring information is
nearly impossible and almost always seems the wrong way to go.

>Does Zuckerberg have to come to their house and load the bullets into the gun
for them so they can make a statement about how angry they are? Because he
made an app for people to cheat? I fail to see the connection.

As do I... I really don't get what you're trying to say here. Can you restate?

------
rdl
This seems absurd and unreasonable once you look at it more closely. If any
group of people say they want to go live apart from others (white, gay, men,
linux users, whatever), I'm fine with that. I wouldn't want to live in a
racial separatist region myself, but I have no problem with other people
choosing to do so. Treating black nationalists as heroes (which in a
historical sense they were) while making white nationalists a target of scorn
seems...unsustainable.

If FB wanted to redirect funding from "bad" things they don't like to a decent
group, Life After Hate is about as bad as SPLC. There have to be better groups
focused on helping people of various types.

~~~
thundergolfer
Those identity categories are not all equivalent. Wanting to live among only
"white" people is not a coherent or rational preference. It can really only be
grounded in racism.

~~~
lyxsus1
Do you realise that any person has a right to have a racist point of view,
exchange it in conversations with like-minded people and discuss it with
others, unless it doesn't result in real discrimination?

~~~
thundergolfer
> unless it doesn't result in real discrimination?

It's incredibly naive to think that "hav[ing] a racist point of view" wouldn't
result in "real discrimination" out in the world.

Here's something relevant from William Clifford's "The Ethics of Belief":

> I shall surround myself with a thick atmosphere of falsehood and fraud, and
> in that I must live. It may matter little to me, in my cloud-castle of sweet
> illusions and darling lies; but it matters much to Man that I have made my
> neighbours ready to deceive. No belief held by one man, however seemingly
> trivial the belief, and however obscure the believer, is ever actually
> insignificant or without its effect on the fate of mankind, we have no
> choice but to extend our judgment to all cases of belief whatever. Belief,
> that sacred faculty which prompts the decisions of our will, and knits into
> harmonious working all the compacted energies of our being, is ours not for
> ourselves, but for humanity.

~~~
lyxsus1
Sorry, I don't think a misused quote can justify a thought crime. It would be
funny and educational if we could temporarily reanimate Mr. Clifford to give
him a chance to appreciate the twisted logic required to apply this particular
quote in this particular context. But we can't, so I suggest we leave the dead
in peace.

I never said that such points of view don't have consequences. But there is a
law, and means to enforce it when discrimination is realised. And you can
indeed define some reasonable limits on speech. But as soon as you go after
personal opinions, you do more harm than good.

By doing so, we steal from people the chance to make up their own minds and
choose their own actions. It's an important part of the human experience for
many. And when that freedom is threatened, they often resist as if it were
almost an existential threat to them.

We know for a fact now that humans are evolutionarily wired in favour of
tribalism and prejudice, and overcoming our own nature is a hard and delicate
process. And a slow one: conflicts can last for hundreds or thousands of
years. History says that even if you manage to keep them dormant using state
force, it's not forever. The state will eventually fail and people will
continue killing each other with the same amount of enthusiasm, if not more.
Just have a look at the pogroms that took place the moment the USSR collapsed
in the brand-new national states; an excellent illustration.

There must be a solution to those problems that relies on force as little as
possible, otherwise it's not very sustainable. But instead I see justification
of mob justice, thought policing and global censorship. The last one is
especially infuriating. Tools that are required to implement such censorship
should not exist and will certainly be exploited.

After all, people we disagree with are just regular people, not monsters.
Likely troubled, needing a group they can identify with, lost in a rapidly
changing world, unneeded, susceptible to human flaws we all have to a certain
degree. And yet it's not uncommon to believe that it's a good idea to just
dismiss and alienate them, and at the same time somehow pretend to hold the
higher moral ground. I like those opinions no more than you do. But I see
hate, arrogance, and cruelty just shapeshifting, and nothing good coming out
of it.

------
JPKab
My question is this: Are they also going to ban other ethno-nationalist
movements? Last I checked, the Hebrew Israelites (a black nationalist and
anti-semitic group) are on FB. They believe that the people who currently
identify as Jews are evil impostors.

~~~
dragonwriter
And the last time there was a “Hebrew Israelite”-motivated mass killing was...
when, exactly?

~~~
wyclif
Since when was triggering "mass killing" the standard for censorship on the
internet?

~~~
geofft
At least since Facebook's current community standards were written, probably
earlier:

 _We aim to prevent potential real-world harm that may be related to content
on Facebook. [...] We remove content, disable accounts, and work with law
enforcement when we believe there is a genuine risk of physical harm or direct
threats to public safety._

 _In an effort to prevent and disrupt real-world harm, we do not allow any
organizations or individuals that proclaim a violent mission or are engaged in
violence, from having a presence on Facebook. This includes organizations or
individuals involved in the following:_

 _- Terrorist activity_

 _- Organized hate_

 _- Mass or serial murder_

 _- Human trafficking_

 _- Organized violence or criminal activity_

 _We also remove content that expresses support or praise for groups, leaders,
or individuals involved in these activities._

That seems like a pretty clear statement that people who are engaged in /
advocating for / recruiting for groups that engage in mass killing will be
censored.

It would surprise me if Facebook were the first internet platform with rules
like this.

------
charliesharding
As a non-white person living in America, this is scary to see. It's a policy
targeting a specific race of people's content... How is that not racist? It
will only foment more resentment and retaliatory sentiment while serving to
make "certain kinds of racism" socially acceptable as long as it's the "right
race" to discriminate against. Ban ethno-nationalism if you feel so inclined,
not {{insertRaceHere}} nationalism. Otherwise, this is a slippery slope to
head down.

~~~
chapium
I don't see why facebook would be obliged to host content they don't agree
with. These people will always find a platform on the internet. Facebook
doesn't need to make their recruitment more convenient.

~~~
charliesharding
I get what you're saying but if they're going to curate what content that they
put forth, then they cease to be a platform and they become a publisher.
Publishers are liable for much more under the eyes of the law. They can't
cherry pick the advantages from the two systems without any of the drawbacks.

~~~
NoodleIncident
Platforms choose what to remove, and Publishers choose what to include. No
matter how much a Platform removes, it never becomes a Publisher.

~~~
shrimpx
This seems a rather pedantic difference -- "let it be posted and remove it
later" versus "reject it upon submission."

------
jelliclesfarm
I am not a ‘white person’ (whatever that means) ...but I don’t want to see
‘white nationalist content’ banned.

When people communicate on a public platform unself consciously, they reveal
themselves openly and freely. There is no greater protection to vulnerable
populations than transparency. Freedom of speech is a gift and a valuable
tool. This move is dumb beyond words.

~~~
WhompingWindows
How do you transparently protect those in Christchurch NZ or at one of the
sites of nationalist, extremist violence? Transparency does nothing to deter
the mentally ill and psychotic individuals who perpetrate mass murder.

~~~
anigbrowl
Just going to add a caveat that most terrorists are not, in fact, mentally
ill, and arguments of the form 'you'd have to be mentally ill to commit this
sort of violence' stigmatize mentally ill people without having any foundation
in fact. They're often advanced by people who are unfamiliar with political
violence and don't like discussing it.

Here's a decent and current introductory article on the topic:
[https://medicalxpress.com/news/2018-11-link-terrorism-
mental...](https://medicalxpress.com/news/2018-11-link-terrorism-mental-
illness-complicated.html)

~~~
WhompingWindows
Then let me amend my statement: I did not intend to claim that those who are
mentally ill are more likely to be violent; rather, I'm interested in whether
those who do indeed commit mass murder have problems with mental health.
Would you disagree with that interest? Is it the act of a healthy mind to kill
dozens of people? If we allow that someone who can willfully kill 48 peaceful
individuals may have something unhealthy in their mind, then maybe we should
look into the treatment of that specific subset of mental illness.

I'm not part of the group who says the tail wags the dog here. I don't think
mental illness leads to violence, but that violent people may be more likely
to have certain kinds of mental illness.

~~~
anigbrowl
The thing is that empirical inquiry already suggests that mental illness is a
much less important factor than personal associations. Considering that
psychiatry has a severe etiological problem already, I don't think it's such a
productive line of inquiry.

I mean, a pacifist could argue that choosing to join the military or obey
orders that might result in the deaths of civilians is pathological, or that
states themselves suffer a sort of institutional pathology. But if one accepts
a _casus belli_ as a valid premise for military action, such questions are
immediately collapsible into a cost-benefit analysis - which is, to some
extent, how military forces actually operate.

[https://psycnet.apa.org/fulltext/2014-33751-001.html](https://psycnet.apa.org/fulltext/2014-33751-001.html)

------
rmbryan
I don't know as much about hate as I should, perhaps, but it seems arbitrary
to make rules about what's hate and what's not.

This part, especially, from the article:

 _We also need to get better and faster at finding and removing hate from our
platforms. Over the past few years we have improved our ability to use machine
learning and artificial intelligence to find material from terrorist groups.
Last fall, we started using similar tools to extend our efforts to a range of
hate groups globally, including white supremacists. We're making progress, but
we know we have a lot more work to do._

My question is, how are you going to know when you're done? What would
Facebook look like if they didn't have any more work to do on hate? I'm
honestly asking this.

~~~
99052882514569
>I don't know as much about hate as I should, perhaps, but it seems arbitrary
to make rules about what's hate and what's not.

It is arbitrary. This is a closed for-profit platform, run by decree. In the
interest of shareholders, it can (and ought to) make whatever rules it wants,
as long as they don't violate any laws.

~~~
rmbryan
Yes, well said. That's a fair point. If I think about my response to an
announcement that said, "This kind of content is being banned because it
reduces our shareholder value," I can't argue with that. I think, "Sure,
you're running a business. It's a business decision. Good on ya." It's the
moralist explanation that confuses me, and your comment helped me see the real
decision free of the spin and justification that came with it.

------
root_axis
I don't see what's so complicated about it. Facebook will set some standards
and they will mete out moderation action as they feel is necessary. Will it
be selectively applied in some cases? Yes. Will they get it wrong sometimes?
Yes. So what? People who don't like Facebook's approach to moderation can
quit, protest or otherwise boycott Facebook. This is the same problem of
moderation that has always existed in online communities, the only difference
between today and the past is that today the masses feel like they are
entitled to use these platforms.

~~~
gfodor
the other difference today is that 'the masses' used to mean ~millions of
Internet-connected people staring at screens on their desks, now it means the
whole damned planet, at all times.

as these networks have grown to global scale we need to assess them for what
they are, not for abstractions that may be less relevant now that these
systems represent unprecedented "public squares"

~~~
root_axis
> _millions of Internet-connected people staring at screens on their desks,
> now it means the whole damned planet, at all times._

So what? Those people use Facebook because they like to do so. If you feel
like Facebook treats you unfairly then stop using it. It is permissible to
communicate with people that use Facebook even if you don't use it. If you
fundamentally disagree with the way Facebook operates then stop using it. Just
stop using it.

~~~
gfodor
I deactivated my account years ago. My point wasn't about me, but about the
fact that these platforms are intertwined enough into society they are
blurring the lines between optional service customers can switch off of, and
global-scale utilities that are necessary to operate in society.

~~~
root_axis
> _these platforms are intertwined enough into society they are blurring the
> lines between optional service customers can switch off of, and global-scale
> utilities that are necessary to operate in society._

That supposed blurring is illusory. There are no impactful consequences for
not using Facebook; it is wholly unimportant and an entirely superficial
distraction, in fact, _less_ use of Facebook correlates positively with mental
health. You most certainly do not need Facebook to operate in society.

~~~
f1refly
>There are no impactful consequences for not using Facebook

Yes there are. I decided not to use Facebook years ago. I also don't use any
of their other services. Consequences range from being a social outsider to
being outright excluded from university/school/work events, because they're
organized solely on Facebook and nobody told you. Everybody just assumes
you're on there. When
you enter a new peer group of 20+ people you can't just tell them to
completely mix up their habits because you happen to not like what they're
doing. You might be able to do that when a network is reasonably small. When a
third of the world population is on there, that doesn't work anymore. The
thesis you're defending is wrong.

~~~
root_axis
Sorry but Facebook is not a social necessity, _period_.

> _When you enter a new peer group of 20+ people you can 't just tell them to
> completely mix up their habits because you happen to not like what they're
> doing. _

You're constructing an image of reality that isn't correct in order to support
your position. If you encounter a gathering of your peers and someone says,
"hey, you on Facebook?" and you say "no, I don't really use Facebook" in
reality, the conversation just moves forward with a discussion of a different
communication medium like SMS or snapchat or twitter or e-mail or whatever.
The suggestion that lacking a Facebook is social suicide is obviously wrong.

Perhaps in your particular friend group people only communicate through
Facebook, but that is definitely not typical and it still doesn't mean you
_need_ Facebook, it means that you find the Facebook network convenient for
communicating with a particular group of people. Participating in a group-chat
is not a necessity. What if your entire friend group communicates on Discord,
does that mean you _need_ discord? Obviously you don't need it, you just find
it convenient. If you refuse to use Facebook and none of your friends care to
adjust their habits to communicate with you that really has nothing to do with
Facebook and certainly doesn't imply that Facebook is now a social _need_.

> _The thesis you 're defending is wrong._

My thesis is just common sense. The idea that Facebook is a necessity is
laughably absurd.

~~~
gfodor
The overarching point is arguable even if you don't think Facebook is
necessary -- this is subjective and is why I used the term "blurring" and not
"blurred." It's certainly the case that platforms like Facebook and Twitter
provide examples of what could be on track, if they are not currently, towards
becoming privately owned, unregulated, Internet-based, global scale
communication networks that anchor our daily lives.

Even if those two platforms die off before they reach the tipping point where
most would agree membership on the platforms are necessary to operate in
society, we should still realize this is a potential eventuality. If you
disagree with the core point of this potential outcome, then I'd be interested
in the argument, but it certainly seems possible (and I'd argue, likely) that
Facebook or a site like Facebook could quickly evolve into something that you
_need_ to be connected to in order to conduct business, interface with the
government, get a job, get into school, or in general be connected to society
in a way that doesn't make you "sub-human" in a future where the whole world
is continually connected via the Internet at all times.

------
gatherhunterer
There is merit to maintaining an open environment. Censorship makes the
censored content seem alluringly dangerous; it looks like the powers that be
are afraid of what some people are saying. I think it is best to let bad
arguments fail on their own merit. The problem is that most people are unable
to recognize a bad argument.

~~~
ianleeclark
The problem is that an open environment cannot work when there are bad faith
interlocutors actively seeking to undermine that open environment. There's a
famous Sartre quote that deals with the paradox of tolerance in regards to
anti-semites.

~~~
pathseeker
Either they are presenting valid arguments or they aren't. If you have the
mental tools to analyze and quickly dismiss arguments it doesn't matter if
they aren't acting in good faith.

~~~
DoofusOfDeath
> Either they are presenting valid arguments or they aren't.

I agree. But one problem with bad-faith actors is they can bring productive
discussion to a near standstill, due to the time / attention / emotional-
energy costs.

------
insickness
Marsh v. Alabama, 326 U.S. 501 (1946), was a case decided by the United States
Supreme Court, in which it ruled that a state trespassing statute could not be
used to prevent the distribution of religious materials on a town's sidewalk,
even though the sidewalk was part of a privately owned company town. The Court
based its ruling on the provisions of the First Amendment and Fourteenth
Amendment.

[https://en.wikipedia.org/wiki/Marsh_v._Alabama](https://en.wikipedia.org/wiki/Marsh_v._Alabama)

> "Ownership does not always mean absolute dominion. The more an owner, for
> his advantage, opens up his property for use by the public in general, the
> more do his rights become circumscribed by the statutory and constitutional
> rights of those who use it."

~~~
tedivm
This case was about a "company town", where there was very much a blending of
public and private.

> The Court initially noted that it would be an easy case if the town were a
> more traditional, publicly administered, municipality. Then, there would be
> a clear violation of the right to free speech for the government to bar the
> sidewalk distribution of such material. The question became, therefore,
> whether or not constitutional freedom of speech protections could be denied
> simply because a single company held title to the town.

It's also not a good idea to bring up this case without bringing up Cyber
Promotions v. America Online, in which America Online put up spam filters and
Cyber Promotions tried to use the above precedent to say America Online was
blocking their right to free speech. You can probably guess by the fact that
we still use spam filters how that case came out.

As a final note, even that original case has had limitations based off of the
type of property-

> In Lloyd Corp. v. Tanner, the Supreme Court distinguished a private shopping
> mall from the company town in Marsh and held that the mall had not been
> sufficiently dedicated to public use for First Amendment free speech rights
> to apply within it.

~~~
nilkn
This is a nuanced topic, but I think the subsequent cases you mentioned all
make sense and are consistent with the proposition that certain social media
sites actually should obey the "company town" precedent.

Malls aren't and have never been the primary space used for public social
discourse and commentary. Today, such discourse increasingly happens online,
and it's increasingly true that sites like Facebook and Twitter actually _are_
the primary space for public discourse.

It's really never before happened in history where a private company has
become the mediator and facilitator of discourse involving millions of people
simultaneously. Previous forms of media had extremely restricted participation
roles in comparison. Newspapers, TV, and radio can all only publish a
relatively tiny amount of content, so the idea that millions of people could
simultaneously _publish_ content to those platforms was never viable; they
were only capable of pushing out content from a few folks to many, but they
were never many-to-many. This almost infinitely scalable many-to-many
communication property of modern social media makes it feel much more like a
public space than newspapers, TV, and radio ever did.

~~~
joshuamorton
>the proposition that certain social media sites actually should obey the
"company town" precedent.

This only follows if you believe you're under an obligation to use social
media.

>It's really never before happened in history where a private company has
become the mediator and facilitator of discourse involving millions of people
simultaneously.

This depends on what you mean. On the one hand, up until ~30 years ago,
discourse involving millions of people was impossible. So in one way you're
correct. On the other hand, media companies of all kinds have historically had
a huge level of control over what people see. When the only form of
communication was the town crier, he controlled what you heard. When we added
printed paper, the people with the press had outsized control over the media.
And the barriers to entry were much higher then: I can spin up my own website
in an hour. Your average person couldn't afford a printing press.

>This almost infinitely scalable many-to-many communication property of modern
social media makes it feel much more like a public space than newspapers, TV,
and radio ever did.

What public space offers infinitely scalable many-to-many communication?
Public space is distance limited, these platforms aren't. Just because these
platforms chose not to heavily regulate the content doesn't preclude their
ability to ever do so in the future, nor does it make a comparison to a
theoretical public space that never existed any more valid.

~~~
mediocrejoker
> This only follows if you believe you're under an obligation to use social
> media.

Were the town residents in Marsh v. Alabama required to inhabit the company
town? I think the fact that they did is what made it a public space,
regardless of whether they were required to.

~~~
tedivm
I don't think that's quite right. The issue there is that the company town
wanted to be an actual town - they were providing services, inviting people to
move in (including people who didn't directly work for them), and generally
saying "anyone who wants to can be here" - they were very explicitly trying to
create "public spaces".

Facebook is not the same thing- people aren't moving there and people don't
live there. I know we want to pretend that digital and physical are similar,
but this particular case that everyone is citing was very specifically about
the company towns, and the precedent afterwards has been pretty clear.

------
throwawaysea
Large privately-owned platforms carry so much discourse across today's
society, that censorship and deplatforming in those spaces has the same impact
as governmental censorship, for most intents and purposes. Even if these
corporations do not constitute what we might traditionally call a "monopoly",
they control a large-enough share of traffic to have significant impact when
they take artificial actions. That sizable impact is exactly why they are
being targeted (not just on this topic but others) by activists or other
agents pushing for deplatforming/censorship favorable to their causes.

The big risk is this: when only a few entities funnel so much societal
discourse or control our communication infrastructure or process payments,
those entities making arbitrary decisions about who they serve has similar
impacts/risks to the government imposing similar restrictions through the law.
These companies should not act as a thought police and should not impose their
own personal governance above what is minimally required by the law. Nor
should they rely on the judgment of an angry mob to make decisions.

~~~
robenkleene
"The same impact as governmental censorship, for most intents and purposes."
False and dangerous. Government censorship means imprisonment and other forms
of punishment, sometimes including death.

This doesn't mean your point is wrong (I don't think it is), but it's more
effective to make your argument on its own merits rather than making a false
equivalency.

------
iliketosleep
In practice, it's going to be an arbitrary mess. They draw a series of vaguely
defined lines across highly nuanced issues, make their fancy press release,
and then send it off to be implemented by underpaid and undertrained workers
(in 3rd-world sweatshops, last time I checked).

~~~
Macross8299
>3rd world sweatshops

Jeez man, Arizona isn't that bad, I wouldn't call it third world [1]

Jokes aside, you are correct that Facebook does have lots of third-party
contractors that do moderation located in underdeveloped countries and
Facebook seems to exploit a lot of their contractor labor, regardless of where
that labor is located.

[1]: [https://www.theverge.com/2019/2/25/18229714/cognizant-
facebo...](https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-
content-moderator-interviews-trauma-working-conditions-arizona)

------
randyrand
Why not ban all racist nationalism entirely? What is unique about white
nationalism versus Chinese nationalism in China, or black nationalism in South
Africa? There are many racist movements all around the world.

Only banning white nationalism is giving fuel to the white nationalist belief
that whites are being unfairly targeted. That's not something we should
encourage IMO.

~~~
philwelch
The notion is that the history of European colonialism and imperialism
"justifies" black and Chinese nationalism as a countermeasure to white
supremacy.

There's a "punching up" vs. "punching down" model that pretty much explains
most "hypocrisy" on this issue. For instance, even subtle hints of anti-
Semitism expressed by white Westerners (e.g. being anti-George Soros in
particular) are condemned far more vehemently than anti-Semitism expressed by
black people or Muslims, because of the perception that Jews, being perceived
as predominantly white and Westernized themselves, are more privileged than
blacks and Muslims but less privileged than white Gentiles.

I don't agree with this mentality myself because the tides can turn a lot
faster than your ideological model of privilege and oppression can change to
reflect it, but it's similar to the ideas that are commonly expressed on the
left and serves as a decent predictive model of their attitudes.

~~~
oh_sigh
Just because white nationalists are white doesn't mean that they've received
any kind of largesse because of it. If you look at the demographics of white
nationalists, poor whites from marginalized, backwater communities are heavily
overrepresented.

~~~
philwelch
Oh, for sure.

Let's talk about the word "cracker". In the US, it's a derogatory term for
white people that was coined by black people. The etymology is that a
"cracker" was a white man who was hired by the plantation owner to crack the
whip at the black slaves. Even in white society, these "crackers" were pretty
much what we'd call "white trash" - they had effectively zero social status,
received no largesse, and lived in poverty themselves. Literally the only
thing they had going for them was that they were white, which meant they were
the ones cracking the whips and not having the whips cracked at them. Which
only made them all the more eager to crack the whips and do the dirty work of
oppressing black people. Being white was literally the only thing these people
had to be "proud" of. And so, the most evil people weren't necessarily the
plantation owners (who were evil in a detached, hypocritical way), but rather
the crackers (who were evil in a directly hateful, sadistic sense). A
plantation owner could come around to the idea that slavery was wrong, as
Washington and Jefferson did, and merely live in a state of complicated
hypocrisy. A cracker is a pure mass of utter racial animus.

The same dynamic remained after the abolition of slavery, and it's the same
dynamic that you describe.

------
weberc2
The obvious questions are and always have been "what constitutes 'harmful
speech'?" and "who gets to decide?". Sure, we can agree that "white supremacy"
is harmful, but in the last 5 years we've seen the definition of white
supremacy slide from "people who espouse views that whites are better than
nonwhites" to "people who espouse views that are not in line with progressive
orthodoxy".

~~~
josefresco
> we've seen the definition of white supremacy slide from "people who espouse
> views that whites are better than nonwhites" to "people who espouse views
> that are not in line with progressive orthodoxy".

No we haven't. We've seen white supremacy being legitimized at the highest
levels, and as a result white supremacists claiming to be "victims" openly
because they feel emboldened by the election of white supremacist politicians.

~~~
zobzu
you've watched too much tv

------
_iyig
So what about black nationalism? The Dallas mass-shooter, Micah Xavier
Johnson, was reportedly radicalized in part by various black power Facebook
groups:

[https://www.latimes.com/nation/la-na-dallas-police-
shooting-...](https://www.latimes.com/nation/la-na-dallas-police-shooting-
live-suspect-joined-facebook-groups-that-1468001887-htmlstory.html)

~~~
Corya0687
Well... the FBI has a specific initiative targeting them (BIE). And while they
are a problem, they aren't nearly as big of a threat.

------
lgleason
Many are rightfully making the argument that they are a common carrier because
of their monopoly in the space. Right now there really isn't a viable
competitor to them. The same is true with Twitter etc. because they are all
substantially different products.

IMHO, I would support Facebook's right to do this if: 1. They gave up the
safe harbor protections and were subject to the same liabilities as news
organizations, or 2. They were broken up.

Otherwise, this is akin to kicking someone out of the town square without the
ability to move to another one....and that is not free speech...

~~~
anigbrowl
I'd argue that there are lots of alternatives. But the smaller and less
exclusive ones which tolerate extreme content often have reputations like Mos
Eisley spaceport, so while people can gather and organize there relatively
freely it seriously limits their fundraising and recruitment outreach.

------
gowld
It's OK to say you believe that you are pro free-speech only in cases of
government/military/police censorship. It's not OK to redefine free speech to
mean "government censorship", especially not to call your position "very
nearly absolutist".

You believe that private companies have the moral right to censor and filter
user-generated content. That's your opinion/morality. But it's not "free
speech absolutism" in the common English language.

~~~
hn_throwaway_99
> It's not OK to redefine free speech to mean "government censorship"

I beg to disagree, and in fact I believe you are the one that is redefining
free speech. "Free speech" as I have known it has _always_ referred just to
government restrictions, because the government is the only entity that has
the very unique punitive powers that they do. Indeed, a strong corollary to
free speech, freedom of the press, by definition means that a (private)
journalism organization is free to publish (or not publish) whatever they
want.

I do agree that the relatively new, immense power of large technology
companies with respect to how we communicate may require a sense of updating
what we mean by "free speech", but this would be _an update_ \- it is not
inherent to the original meaning of free speech.

~~~
throwawaysea
Free speech means freedom from censorship. And from
[https://en.wikipedia.org/wiki/Censorship](https://en.wikipedia.org/wiki/Censorship)

> Censorship is the suppression of speech, public communication, or other
> information, on the basis that such material is considered objectionable,
> harmful, sensitive, or "inconvenient". Censorship can be conducted by a
> government, private institutions, and corporations.

From Merriam-Webster ([https://www.merriam-
webster.com/dictionary/censorship](https://www.merriam-
webster.com/dictionary/censorship)), where there is no mention of 'censorship'
being a government-specific concept:

> the institution, system, or practice of censoring

From Oxford
([https://en.oxforddictionaries.com/definition/censorship](https://en.oxforddictionaries.com/definition/censorship)),
where again there is no mention of 'censorship' being a government-specific
concept:

> The suppression or prohibition of any parts of books, films, news, etc. that
> are considered obscene, politically unacceptable, or a threat to security.

~~~
stronglikedan
> Free speech means freedom from censorship.

As I understand it, it means freedom only from prosecution, but not
persecution or censorship. I agree with GP in that this is their platform and
their rules.

------
JimBrimble35
I fully support scrubbing hateful content from any place where it appears.

I think people should take note though, that this is the exact point in time
when the machinery for censoring and filtering the internet is being
implemented. Although hateful speech is a great use case for this technology, I
can't help but wonder what it will get pointed at next. This isn't the kind of
development that can sit back idly after performing its primary function.

~~~
philwelch
The arguments for censorship and "deplatforming" follow the fashions of the
times. In one era, the notion of screenwriters inserting subtle communist
propaganda into Hollywood films was terrifying enough to motivate an industry
blacklist. In an earlier era, advocating resistance to the draft was
considered dangerously treasonous. Somehow these threats always seem less
severe in hindsight than they seemed at the time, though.

~~~
JimBrimble35
I think that's an interesting argument, but I also think you need to consider
the type of tooling and the reach of content in today's world. The apparatus
that we have for tracking and identifying individuals and their media
consumption habits gives unprecedented control over perception at a very
granular level.

It also sounds like the events that you mention were dealt with quickly and
severely, maybe they would have been more severe threats given the space and
time to form more completely.

My fear is that we'll get to a point where people won't know about the
suffering or injustice that is occurring in the world without physically going
to the places where it's happening. Or that people will have to go to extreme
lengths or break the law simply to escape their filter bubble.

/tinfoil hat

~~~
philwelch
> It also sounds like the events that you mention were dealt with quickly and
> severely, maybe they would have been more severe threats given the space and
> time to form more completely.

In the case of draft resistance in the world wars, perhaps. We did win those
wars within 2 and 4 years of the US entering them, respectively.

In the case of the Hollywood blacklist, the Communist screenwriters became a
cause celebre and, once the anti-Communists went overboard, were rehabilitated
within Hollywood. This is despite the _fact_ that they were members of an
organization under the control of the USSR under Stalin. To some extent there
was even an informal counter-purge of Hollywood conservatives who supported
the blacklists and investigations.

> My fear is that we'll get to a point where people won't know about the
> suffering or injustice that is occurring in the world without physically
> going to the places where it's happening.

Ah, back to the last hundred thousand years of human history!

------
roenxi
They've left a lot of room for the details to be stupid.

1) In particular, what about black nationalists, Chinese nationalists, etc,
etc?

2) One of their main justifications is "conversations with members of civil
society and academics". It seems unlikely that these conversations are with
politically neutral groups, particularly since academics focused on race
relations tend to self-select for slightly off-beat views of the world.

Also, to poke at a touchy subject, white nationalism includes some much milder
ideas than white supremacy does like 'white people should maintain their
majority in majority-white countries'. A reasonable and rational person could
hold that view after reviewing what happens to minorities in any country. Look
at the Jews in Germany or the white farmers in Zimbabwe. Whites don't have a
magical pass to be free from racial discrimination and ethnic cleansing.

Whites should be allowed to organise around race like everyone else, as long
as they keep it civil and nonviolent.

[0]
[https://en.wikipedia.org/wiki/White_nationalism](https://en.wikipedia.org/wiki/White_nationalism)

~~~
beat
And when Richard Spencer asks if we should maybe just send every black person
to Africa and create an all-white America, he's being civil and nonviolent.

Which is the problem.

~~~
antt
When I say that we should send every non Indian person back to where they came
from and create an all Indian North America I'm also being civil and
nonviolent. Yet I don't see facebook banning that type of speech.

~~~
XorNot
Because you haven't done that, can't point to any groups which have done that,
and haven't bothered to file any complaints about these nonexistent groups
with Facebook moderation.

~~~
antt
I absolutely have during the Dakota pipeline protests.

------
randyrand
There was a time when the FB newsfeed was chronological, and just showed what
your friends posted as they posted it.

It seems like less blame for the world's problems could be put on FB if it was
just treated as a dumb bulletin board. Why not go back to Facebook being a
"dumb pipe?"

~~~
themacguffinman
How does that change Facebook's dilemma? Facebook faces PR pressure because
they host the content at all, serving it chronologically or non-
chronologically makes no difference. Who is complaining that the problem is
really that Facebook serves white nationalist content in the wrong order?

------
SovietDissident
Yes, FB is a private company and all that. But they have clearly been
censoring speech and cannot be protected by common carrier rules, as they are
a publisher.

This change of designation, of course, would be a significant financial blow,
once they started losing lawsuits. But considering their gov't influence and
status as a money spigot for leftist politicians, you can bet they have no
fear of this actually happening.

It's only the little people who have to worry about running afoul of the law
---not HRC and not FB. Welcome to the late Roman Empire.

------
piokoch
Hmm, I don't even know what "white nationalist" means. Nationalism is often
confused with chauvinism. The difference is rather important. Nationalism is a
natural and mostly positive attitude: a person lives at some place (country,
city, village), surrounded by people with whom that person spends most of the time,
with whom that person talks, trades. So it is natural to be more eager to help
those people around than someone living on the other side of the globe.

On the other hand, chauvinism is a negative attitude; it is misunderstood
nationalism. A good example is the following.

Let's say I have kids and my neighbour too. Obviously I want the best for my
kids, so I take care of them, send them to a good school, so they have the
best possible education and perspectives - this is "nationalism". It does not
mean that I wish anything bad for my neighbour and his kids.

Now, chauvinism means that I am going to hurt my neighbour's kids, since they
are competition for mine, and this is obviously bad.

So, coming back to the FB declaration, "white nationalist" does not make much
sense. The colour of the skin is hardly a predictor of common interests. I
assume FB means chauvinism, otherwise they will have to ban all international
sports competition news (fans are naturally nationalistic since they support
national team), they will have to also ban all national holidays, including
4th July, as they are nationalistic by very nature.

~~~
dragonwriter
> So, coming back to the FB declaration, "white nationalist" does not make
> much sense

That's because you are trying to reason out what it might mean as if it were a
simple combination of two words whose meaning should be transparent from the
constituents, rather than a name for adherents to the ideology called “white
nationalism”. That term isn't “white” modifying “nationalism”; it is a
construct with (very loosely; it was originally a bit of a PR term for those
selling it) the same relationship to the idea of a (putative) “white nation”
with specific traits that “nationalism” has to “nation”.

But, you know, you don't have to guess about things like that, there are
search engines and decent general references at hand any time you have access
to the internet, so even if you didn't know the meaning, you could easily look
it up and find out. [0]

[0]
[https://en.m.wikipedia.org/wiki/White_nationalism](https://en.m.wikipedia.org/wiki/White_nationalism)

------
aestetix
Does this mean Facebook can now be considered a publishing platform, and thus
liable for content put on the site?

~~~
duality
It should.

------
unimpressive
"Let’s go all the way back to 1996 and talk about Section 230. I think
historians are completely in agreement that this is the law that made the
internet what it is today.

\----

We thought it was going to be helpful. We never realized it was going to be
the linchpin to generating investment in social media. We envisioned that the
law would be both a sword and a shield. A shield so that you could have this
opportunity, for particularly small and enterprising operations to secure
capital, and then a sword [by allowing them to moderate without facing
liability over the practice], which said you’ve got to police your platforms.
And what was clear during the 2016 election and the succeeding events
surrounding Facebook, is that technology companies used one part of what we
envisioned, the shield, but really sat on their hands with respect to the
sword, and wouldn’t police their platforms."

\- Ron Wyden on Section 230

[https://www.theverge.com/2018/7/24/17606974/oregon-
senator-r...](https://www.theverge.com/2018/7/24/17606974/oregon-senator-ron-
wyden-interview-internet-section-230-net-neutrality)

I see a lot of people in this thread expressing the view that by moderating
their content Facebook is becoming responsible for it. To the extent that's
true under the law, it's not necessarily a great idea. Certainly the intent
from a lawmaking perspective when some of the underlying framework was written
is that this wouldn't be the case. You could moderate and remove bad actors
from your platform without fearing legal liability for it.

------
nyxxie
This ethnocentric approach to censorship is terrifying. Somewhere in Facebook
is a small group of people who have the hubris to say they can decide on
what’s right and what’s wrong, and are enforcing these views on some 1/9th of
the world’s population. By outright banning white nationalism, they’ve
essentially shown us that they’re willing to play ideological kingmaker if
anything enters the current news cycle that is viewed as evil.

How are people ok with this? Whatever your opinions on the subject matter
that was banned today are, are you honestly ok with the idea that people are
now deciding what billions are allowed to see in broad strokes? History is
filled with examples of controversial viewpoints being viewed as societal
cancer only to be later accepted in some form as sane. Are we really going to
now decide that we’ve reached final evolution and can now start judging
wrongthink?

I view this as part of an alarming trend of people simply getting tired of
seeing people they disagree with continue to exist and wishing they and their
speech were just _gone_. It terrifies me that this is starting to get
political will and we’re shifting to censorship as a solution.

~~~
scarlac
> With outright banning white nationalism, they’re essentially shown us that
> they’re willing to play ideological kingmaker if anything enters the current
> news cycle that is viewed as evil.

They have been doing this since they launched. You could not post semi-nudity
even for educational purposes. Remember the breast-cancer debacle? You
couldn't post anything that incites violence. The list is fairly long and is
often referred to as their "Community Guidelines".

Twitter has rules. Reddit has rules. Most online communities do in fact have
rules of some sort and in the end it's up to humans to patrol it.

------
wozer
This seems very America-centric. I wonder what it means for the rest of the
world.

I guess a Finnish person living in Finland can now no longer use Facebook to
express a wish that the demographics of his home country stay unchanged?

~~~
duality
"I guess a Finnish person living in Finland can now no longer use Facebook to
express a wish that the demographics of his home country stay unchanged?"

Your guess is probably correct.

------
panarky
Curious that a post about eliminating white nationalist content actually
includes an image with white nationalist content.

[https://imgur.com/a/fSm1pHa](https://imgur.com/a/fSm1pHa)

It's perhaps an unintentional example of the difference between advocacy
(banned) and education or discussion (not banned).

Very interested in how Facebook will distinguish between them when both use
identical words.

~~~
bcruddy
Did you read the sentence right below that conveniently cropped screenshot?

| Searches for terms associated with white supremacy will surface a link to
Life After Hate’s Page, where people can find support in the form of
education, interventions, academic research and outreach.

------
untog
I'd encourage people to please read the post before they comment. It seems
that this isn't actually Facebook targeting a specific ideology, it is
removing a previous rule that _excluded_ that ideology:

> We didn’t originally apply the same rationale to expressions of white
> nationalism and separatism because we were thinking about broader concepts
> of nationalism and separatism – things like American pride and Basque
> separatism, which are an important part of people’s identity.

I've been plenty critical of Facebook in the past, but this post is
refreshingly honest. I hope it's followed up by some discernible results.

~~~
identity-haver
I don't think this is correct. They seem to be saying that they _allow_
posting about nationalism and separatism, and white nationalism content was
allowed under this umbrella, but now they specifically disallow white
nationalism and separatism.

Honestly though, posting about any serious separatist movement has a good
chance of violating their "terrorism" community standard: "Any non-
governmental organization that engages in premeditated acts of violence
against persons or property ... in order to achieve a political, religious, or
ideological aim"

You can't tell me with a straight face that if Facebook existed 50 years ago,
Sinn Fein would be allowed to post on Facebook, because they were not the IRA.

------
overthemoon
"Free speech" comes up so fast in this conversation it stifles any other
aspects, like what counts as "public". We have to figure out exactly what it
means that Facebook is a massive platform with billions of users centrally
controlled by a corporation and how that affects us, and free speech issues
are only one aspect.

I find it so strange how people on the left and right swap argument styles on
the issue of free speech, though. As pointed out by Freddie DeBoer, people on
the left want to strictly define free speech as government intervention and
criticism, contradicting their usual tendency to interpret law more broadly.
Then the right, contradicting their own approach to law interpretation, want a
more expansive interpretation of free speech to include things like corporate
censorship on platforms that aren't necessarily "public" as we'd traditionally
understand it, but have public-y qualities nonetheless.

Anyway, good riddance to white supremacy, but this is a transparent bid by
Facebook for PR repair. Their public image was trashed by the 2016 election on
several fronts, and this hardly rectifies it. It seems like cynical bullshit
to me.

EDIT: It also makes me think of Freddie DeBoer's essay "Planet of Cops":
[https://medium.com/@jesse.singal/planet-of-
cops-50889004904d](https://medium.com/@jesse.singal/planet-of-
cops-50889004904d)

------
alexmingoia
The great thing about the Internet is that anyone can publish a website and
any other Internet user can access it.

The web is pleasant over here in the blogosphere. People write what they want,
and readers read what they want. There’s no clamor about ads, algorithms, and
censorship. Try it sometime!

------
narrator
As an anti-racist conservative, the most fun I have on the internet these days
is arguing racists out of their beliefs on the armpits of the internet. I
relish my downvotes there. I would be bummed about these beliefs being banned
everywhere because I wouldn't get to exercise my Internet debate skills to
make a positive impact on the world without worrying about trampling on
somebody's right to an echo chamber.

~~~
bdhe
What does it mean to be an anti-racist conservative, and what do you think
about the conservatism movement in that you have to use that extra qualifier?

~~~
whatshisface
The vast majority of both conservatives and liberals are passive about the
vast majority of issues. It would make sense that if you are actively seeking
out specific conflicts, you can add an anti- prefix. If the parent
specifically seeks out and engages with racists then they are much more anti-
racist than the average person irrespective of political alignment. The
conservative side definitely does not include "arguing with people on voat" as
a platform.

~~~
bdhe
> The vast majority of both conservatives and liberals are passive about the
> vast majority of issues.

While this might be true, it doesn't necessarily mean that the views held by
conservatives and liberals as a whole are comparable.

I cannot for the life of me imagine someone being called a pro-racist liberal,
it just doesn't compute. But someone who looks the other way and claims to be
conservative? Sure, plenty of those folks to go around.

> The conservative side definitely does not include "arguing with people on
> voat" as a platform.

While true, I think the author recognizes that sadly conservatism doesn't
often include anti-racism in its platform, which is what my comment was trying
to highlight.

------
toomim
> Going forward, while people will still be able to demonstrate pride in their
> ethnic heritage, we will not tolerate praise or support for white
> nationalism and separatism.

So you can praise and support your ethnic nationality, as long as you're not
white.

~~~
nepeckman
White is not an ethnic nationality. A person from Ireland has no ethnic or
national ties to a person from Italy. The fact that the citizens of those two
countries were not considered "white" 100 years ago demonstrates the
invalidity of "white" as an ethnic or national group.

~~~
toomim
The way I see race actually agrees with you, but the issue is that "white
nationalists" DO see themselves as a race and ethnicity:

[https://en.m.wikipedia.org/wiki/White_nationalism](https://en.m.wikipedia.org/wiki/White_nationalism)

It's in the first sentence of their description:

> White nationalism is a type of nationalism or pan-nationalism which espouses
> the belief that white people are a race and seeks to develop and maintain a
> white national identity.

------
gramstrong
I don't think that private censorship is necessarily a problem, but I do think
that corporations having the authority to exercise what is a pretty effective
form of muting is a problem. This is of course due to the size and prevalence
of Facebook as a platform. I don't think the solution (or best solution, at
least) is to break up the tech giants as Elizabeth Warren would suggest. I
would instead prefer to see some form of socialized media platforms that take
the market question out of the equation (does white supremacy content hurt
Facebook more than removing it would?). This is of course, because, the market
is not always right (even though in this case I would make a strong argument
that it is).

I'm not sure what the best social media model for such a platform would be --
preferably something mostly non-anonymous like Facebook, but self-moderated
(within the bounds of the law) like Reddit used to be.

------
wolco
In the case of facebook. If they removed one side from debating that side will
leave. Creating a smaller more similiar group of people. That would go against
the goal of connecting the world and change the goal to connecting the left or
right. Plenty of those sites. Facebook's ideal strategy is not to get
involved.

~~~
bduerst
Except deplatforming is shown to work against alt-right hate speech:
[https://motherboard.vice.com/en_us/article/bjbp9d/do-
social-...](https://motherboard.vice.com/en_us/article/bjbp9d/do-social-media-
bans-work)

Anti-semitic groups have traditionally relied on soapboxes to spread their
message, for decades before the internet even existed. That's why most
academics won't engage them: these groups are not interested in genuine
discussion but in broadcasting their prejudice.

~~~
lazyjones
VICE is trash content, not journalism. And the current debate obviously stems
from the tragic results of increased alt-right hate speech, despite all the
"deplatforming".

~~~
JamesLeonis
Ad Hominem.

If you want the data, check out these studies from the quoted researchers:

1: [http://comp.social.gatech.edu/papers/cscw18-chand-
hate.pdf](http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf) 2:
[https://datasociety.net/output/oxygen-of-
amplification/](https://datasociety.net/output/oxygen-of-amplification/)

~~~
someguydave
1\. Is a study on how effectively Reddit banned some users 2\. Is some woman’s
opinion on how people should write about Internet-related news

Neither of those sources demonstrate that banning “alt-right” content “works”
unless you mean that banning results in bans.

~~~
JamesLeonis
From page 2 of the first study:

> r/CoonTown was a racist subreddit dedicated to violent hate speech against
> African Americans. It contained “a buffet of crude jokes and racial slurs,
> complaints about the liberal media, links to news stories that highlight
> black-on-white crime or Confederate pride, and discussions of black people
> appropriating white culture” [28]. Their banner featured a cartoon of a
> black man hanging, with a Klansman in the background [20]. It had over
> 20,000 subscribers at the time of banning.3 The following is a
> representative, highly-upvoted comment from the subreddit:

> “It would be so much easier if this [n-word] was taken outside and shot.
> Then rasslle up his eight or nine [kids] and shoot them so we can terminate
> that line of genes.”

And from the abstract:

> We find that the ban worked for Reddit. More accounts than expected
> discontinued using the site; those that stayed drastically decreased their
> hate speech usage—by at least 80%.

Would you like to try again?

~~~
someguydave
“The ban worked” means something very specific in the context of one website,
that does not generalize to reasonable public policy.

And yes, it borders on the tautology I pointed out before - banning content
results in banned content. This fails to address the question: is banning
content a good idea?

~~~
JamesLeonis
Why not read the study?

> 1.2 Research Questions & Findings > We analyze the effects of the ban at two
> levels: the user level and the community level.

> RQ1: What effect did Reddit’s ban have on the contributors to banned
> subreddits?

> RQ1a: How were their activity levels affected?

> RQ1b: How did their hate speech usage change, if at all?

> RQ2: What effect did the ban have on subreddits that saw an influx of banned
> subreddit users?

> RQ2a: To which subreddits did the contributors to banned subreddits migrate
> after the ban?

> RQ2b: How did hate speech usage by migrants change in these subreddits, if
> at all?

> RQ2c: How did hate speech usage by preexisting users change in these
> subreddits, if at all?

>And yes, it borders on the tautology

It's the study of the before-and-after effects of banning two subreddits,
particularly on users of those subreddits as they migrated to others. Where is
the tautology?

> that does not generalize to reasonable public policy.

The second paper, from the person you dismissed ad hominem as "some woman’s
opinion on how people should write about Internet-related news," directly
talks about how you can generalize these ideas into combatting attacks on the
Internet to prevent their amplification.

EDIT: That "some woman": Whitney Phillips; PhD 2012, English with an emphasis
on folklore, University of Oregon

Would you like to try again?

------
jimrhods23
I'm glad. Now we can stop using it as a scapegoat for all of the world's
problems.

~~~
Wohlf
People will never stop blaming human issues on businesses and devices. It's
much easier to blame Facebook and demand they change than it is to get people
to change.

------
interlocutor
Attitudes towards free speech have changed a lot in recent times. I believe
this is related to the rise of social media. The advent of social media has
made it too easy to spread hate online. This has caused an upheaval in
attitudes towards free speech. For example, consider that UC Berkeley which
gave birth to the Free Speech Movement is now making news for banning
controversial/harmful speech, such as that by Ann Coulter. The people (as
opposed to governments) have decided that some censorship is in order. This is
a natural evolution of societal norms. This particular evolution was caused
by social media, and it is befitting to see that a social media company is now
in the news for censoring harmful speech. This type of censorship, as opposed
to absolute free speech, will be the new normal.

------
XIVMagnus
I'm seeing comments mainly focusing on "free speech" and that no company or
monopoly should control it. I think we are straying from the actual GOOD
that comes from banning such content. Why are we allowing people who influence
others to do harmful things to minorities? Why should we give people with
harmful ideas a platform to begin with? If they feel like they need to be
heard, sure let them develop their own platform and spread it among
themselves. Rather than influence young people who don't know any better.
That's just my opinion on the matter.

------
garbonicc
So now that Facebook is explicitly editorializing content, are they exempt from
safe harbor provisions and legally responsible for illegal content published
on the platform?

------
dfee
There was a good Intelligence Squared debate about this recently
“Constitutional Free Speech Principles Can Save Social Media Companies from
Themselves”:
[https://www.intelligencesquaredus.org/debates/constitutional...](https://www.intelligencesquaredus.org/debates/constitutional-
free-speech-principles-can-save-social-media-companies-themselves)

------
Dowwie
What would happen if Facebook introduced friction during sharing, making the
decision to share more salient? For instance, click "share" and get prompted
with questions about whether the source is trustworthy and whether you
personally would accept responsibility if the content were misleading or
misinforming, etc.

------
henvic
As an anarcho-capitalist, I feel very happy that Facebook is banning
nationalist content. They should have done it ages ago.

However, banning all separatists?

Well. Not all separatists are created equally. Sometimes it is a question of
getting out of a union for the sake of reducing the state, and this is in no
way vile. Quite the opposite.

------
Teknoman117
> Going forward, while people will still be able to demonstrate pride in their
> ethnic heritage, we will not tolerate praise or support for white
> nationalism and separatism.

Can anyone elaborate on this? Speaking as one of them, I'd hazard to say that
the majority of 'white people' in the United States are so far distant from
their 'ethnic heritage' that we'd just define it as 'American'.

Mexican-Americans celebrate their Mexican heritage, Japanese-Americans
celebrate their Japanese heritage, etc. What does your average 'white person'
celebrate? We don't know a heck of a lot about my mom's side other than that
her father is French Canadian and her mother is Polish American. Dad's side
has been here since the 17th century (according to my grandmother anyways).

~~~
undersuit
>What does your average 'white person' celebrate?

Holidays they imported from their European past. Saint Patrick's Day isn't
originally a drinking holiday, it's a Christian religious holiday. My friend
with Serbian ancestry celebrates 'Serbian Christmas' or Julian Orthodox
Christmas.

Lots and lots of Christianity.

------
kacamak
Who decides what white nationalist content is? This is just another move
towards wrongthink.

------
zerogvt
Regardless of anyone's political affiliation and/or fear or trust in
moderation, FB is a company operating a private service. Private as opposed to
general elections where you and I have a constitutional right to participate.
FB's service is FB's home, not our collective one. So, I don't get why their new rule
(long overdue IMHO especially after the NZ shootings) is a cause of concern,
criticism or outrage.

I should also remind you that similar (equally sane IMHO) bylaws apply in the
current forum - at least from what I've seen for as long as I've been here.

~~~
Pimpus
Have you tried reading more of the discourse before proclaiming to the world
that you don't understand it? Or did you skip that step?

------
jh0486
Facebook, Instagram, Twitter, Youtube, etc.. are user-generated content
publishers. They have the right to refuse whatever content they feel like.

This is the same as a newspaper refusing to publish an op-ed they don't agree
with.

------
staticautomatic
I'm in favor of values-based governance. At some point you have to say "these
are our values" and if you don't share them then don't use our platform. They
do not necessarily teach a specific solution to a specific problem, but they
can guide you toward an internally consistent position on one. As a practical
matter, you must be willing to rank-order your values. Many emerging rights
issues, particularly those involving speech and religion, simply cannot be
dealt with if you treat all rights as equal and inviolable. This is government
101.

------
ConceptJunkie
I think it's fine if Facebook wants to ban white nationalist content.
However, I don't trust their judgement in deciding what is or isn't "white
nationalist" content, because as we've seen, the MSM will label anything it
doesn't like as racist, and Facebook definitely falls in with this crowd.
This could just be an excuse to block things they don't want because they go
against their political agenda, with "but muh white nationalism" as the
cover.

~~~
hannasanarion
Blame the white nationalists for deliberately trying to turn everything into a
dogwhistle, with stuff like "Operation O-KKK", where they flooded social media
with fake outrage at nonexistent people saying that (HN doesn't support emoji;
the OK symbol should go here) is racist, thereby turning it into a racist
symbol through the medium of trolling. Nazis around the world now use it to
identify themselves while calling it """ironic""" to anyone who asks.

Turning benign stuff racist is how modern racists thrive and spread their
ideas, and pretending they don't exist doesn't help.

------
40dslf
Separatism too? As a Spaniard I couldn't be happier about that.

~~~
ceejayoz
You'll be disappointed here. They are specifically targeting _white_
nationalism/separatism. They specifically exclude other types in the article.

> We didn’t originally apply the same rationale to expressions of white
> nationalism and separatism because we were thinking about broader concepts
> of nationalism and separatism – things like American pride and Basque
> separatism, which are an important part of people’s identity.

~~~
40dslf
So separatism is only bad if it targets non-whites?

So when Basques and Catalans say they want to leave because the rest of the
country, especially the south, is full of half-moorish lazy fucks, that is
cool.

Guess the same about the two halves of Italy.

I don't care as I don't use Facebook, but it's a double standard.

~~~
ceejayoz
> So separatism is only bad if it targets non-whites?

I can't speak for Facebook, and I'm not super with-it on the conflicts in
Spain, but it seems like they're making the distinction between "we want our
own country" and "we want our own country _and we want to violently purge
everyone else from it_ ".

~~~
neetdeth
But they don't say that, and if they did, you certainly wouldn't need any new
policies to ban them. There are the 8chan lunatics, but outside that there are
a wide range of views on what the white nationalist end-goal is and how to
achieve it.

I'm reminded of the Sargon v Richard Spencer debate here. Sargon's line of
attack was essentially this: You advocate for an ethnostate, people will not
react well to the non-violent policies you publicly advocate to bring this
about, thus there will be violence, therefore your rhetoric is violent. Even
if you accept this line of reasoning, you have to admit that it can be
applied to any separatist movement anywhere in the world.

~~~
ceejayoz
> But they don't say that

Nor should they. I helped run a large, active web forum for years. We found
general rules and moderator discretion were far more valuable than a list of
800 different "you can't do foo" specific rules, because the bad actors tie up
all your time lawyering about the specifics.

~~~
neetdeth
As long as you're admitting that there's no general principle being evenly
applied here, and it is entirely discretionary on Facebook's part, we have an
understanding.

e: But it's not an arbitrary exercise of power when I do it!

~~~
ceejayoz
One can have a general set of principles behind moderation decisions without
making them available to the public.

------
mythrwy
Well what can they do really?

If they leave content up after big incidents they are enabling. If they take
it all down, they are promoting certain points of view over others. Both will
have folks screaming.

Probably they were smart enough to do the math and determined which action
likely results in less blowback. That's all.

Then again, Facebook like many public areas always was a cesspool in my
opinion. Not a fan of drinking from cesspools. So they can do whatever brings
in the most money and I'm sure they will.

------
tmcw
Not everyone on the orange website is a ‘free speech absolutist’, or foolishly
conflates freedom of speech with platform access.

Good on them for finally cracking down on bad ideas.

------
rurban
They'll have big problems in many Asian countries then, where patriotic and
racist nationalistic extremism is much more widespread. Think of India, China,
Japan, or Korea. Hindu nationalism is not much different from "white"
nationalism and racism; it is mostly even worse. Will they block 50% of their
user base there? Where do they draw the line?

------
hhs
This is good. I am, though, concerned about the negative externalities that
could arise. Will this lead to new and stranger commercial activities?

~~~
mrguyorama
If you want a negative example, check out Voat. It started as a "less
moderated/censored reddit" and quickly filled with actual nazis and other
vitriol. A business cannot survive on "we accept hate speech", in the same way
4chan has never really been a business proposition; nobody wants to look like
they are _supporting_ hate speech, so advertisers won't go near it with a
ten-foot pole.

IMO, this is a good thing, and implies there is at least _some_ decency in the
world.

~~~
0815test
"The trouble with fighting for human freedom is that one spends most of one’s
time defending scoundrels. For it is against scoundrels that oppressive laws
are first aimed, and oppression must be stopped at the beginning if it is to
be stopped at all."

~~~
fixermark
With respect, putting a bad idea in quotation marks doesn't make it a good
idea. Mencken was extremely intelligent, but applied over-broadly, the concept
embodied in this quote implies we should defend even murderers against the
oppression of banning murdering.

I much prefer "Yes, I'd give the Devil benefit of law, for my own safety's
sake."

------
messo
The direction that Facebook and Twitter is moving in is likely to provide a
decent boost to smaller but federated social media like Mastodon, PixelFed,
Pleroma etc. The corporate-owned and "one-size-fits-all" media seems to come
apart at the seams as it struggles to balance an impossible array of different
concerns across countless communities and cultures.

------
dayvid
I wish larger tech companies would follow Jack Dorsey's path and have more
open communication about their policies. His debate with Tim Pool on the Joe
Rogan podcast[0] especially provided insight into why they're making these
type of decisions.

[0]
[https://www.youtube.com/watch?v=DZCBRHOg3PQ](https://www.youtube.com/watch?v=DZCBRHOg3PQ)

------
hodder
Facebook is a private company. They are free to purge whatever they want off
the platform for whatever they determine is in their best interest. And that
is how it should be.

White nationalists are not a protected group, therefore it is not
discrimination. If FB wanted to purge all pro Republican or all Democrat
articles from the platform, that would also be in their right (though not best
interest).

------
mesozoic
Does this mean that they implicitly support any hate groups that they aren't
specifically banning? Antifa, IRA, Isis and the like?

------
tjpnz
I don't think anyone with leftist or conservative views (at least as we
define them today) would have it in them to murder 50 innocents in a mosque. I
don't see the slippery slope applying here either; the views held by white
supremacists haven't exactly evolved over the years.

------
peterhadlaw
I just don't get why people can't be adults and just decide what conversations
they do or do not want to participate in. Instead of the Masters of the
Universe building tools to more easily manage and filter desired
conversations, they enforce the "views of the anointed" at every turn.

------
api
I'm still such a fence sitter on this. I am fairly close to a free speech
absolutist, but I agree with others that FB is a private platform and isn't
obligated to provide bandwidth to anyone.

Another concern of mine though is that just banning this stuff atrophies our
ability to effectively debate it.

------
pmarreck
White racists used free-speech-limiting laws against Martin Luther King, Jr.

You start limiting free speech, that's going to cut both ways, eventually.

I've had comments on Facebook get auto-censored _simply by discussing racism_
while trying to combat it.

Talk about shooting yourself in the foot!

------
PavlovsCat
Let me guess, they'll do it with a total lack of transparency and
accountability, e.g.
[https://www.youtube.com/watch?v=QzlPhxf4Rd0](https://www.youtube.com/watch?v=QzlPhxf4Rd0)
?

------
C14L
I don't believe Facebook. This is just a publicity move.

If they really wanted to combat the hatred they cause, they would change
their algorithm and stop putting the stories that cause the most emotional
reactions (and, in consequence, clicks) on top.

------
pc2g4d
If the views can't be expressed, then they can't be refuted either. Banning
this sort of content from Facebook will only foster a sense of persecution in
white nationalists, which will likely strengthen their movement overall while
isolating them further from contrary opinions.

John Stuart Mill's "On Liberty" is essential reading for those who wish to
silence speech they find offensive, especially corporations which may or may
not have the right to regulate content on their all-but-universal, essential-
to-modern-life platforms.

In the meantime we should all ask ourselves how we can move forward to a world
where no single Facebook-like entity has so much say over the ability of
Americans to freely assemble, speak, and publish in the modern world.

~~~
beat
I don't buy the "essential to modern life" part. I've quit FB cold turkey,
_because I concluded that it was making my life worse_. And I seem so far to
be happier for it, and to have lost very little.

------
sunshinelackof
Getting caught up in the freedom of speech question is folly. In context it
reads as a not-so subtle dismissal of racism. I’m not even questioning the
right to free speech. But when the first response to platforms removing white
nationalist content is about “just wanting to have a conversation,” people of
colour tune out. Freedom of speech is a valid concern, but I think in this
instance it belongs further back in the sprint queue.

In regard to comments concerning why it only addresses white-nationalism,
understand that these statements are made in the context of North America’s
on-going flirtations with white nationalism going back 300 years. It’s not an
easy pill to swallow and speaks little about individuals regardless of colour.

------
qwerty456127
Why just white? I've read that Asian nationalism (incl. in the US) happens to
be nasty too. E.g. an Asian-American girl dated a white boy and had to face
lots of ridiculous hate speech on Facebook.

------
xienze
The bigger question is, will they also ban content that would be construed as
hate speech if you simply replaced the words “white people” with “[!white]
people”?

~~~
jitl
This is not the bigger question.

~~~
xienze
Sure it is, because it sounds like Facebook isn’t committed to getting rid of
all hate speech, just the kind that isn’t socially acceptable. Same with
Twitter.

------
baud147258
I'd hope they would also ban antifa/black bloc content too. Those assholes
have been trashing the city where I live every weekend for months.

------
john_moscow
Does it mean that other nationalist content is allowed?

------
EGreg
What about other nationalist or bigoted content? Like the new Black Panther
Party (Black Nationalists) or Louis Farrakhan’s comments about Jews?

------
sabujp
Good, they try to hide behind the 1st amendment but there's only one endgame
that they're after, and we know how that turned out.

------
dexx
I support free speech (in the truest sense).

If Facebook wasn't a platform so many use, then I might support their
censorship as creating a space for a unique kind of dialogue, just like Hacker
News has certain rules that help shape this unique community.

But since they're as big as they are, and since they're not the only ones
censoring this kind of content, they're not just creating a unique space;
they're making it extremely difficult, if not impossible in some cases, for
this perspective to be voiced at all.

And that is damaging to free speech.

------
rufus_2
Censoring them confirms their narrative and only makes young white men who've
been exposed to their rhetoric pay closer attention.

------
AndyMcConachie
Not sure how I feel about this, but I do think it would be interesting to see
a white supremacist challenge this in court.

------
tonymet
Does anyone know what this means in practice?

------
b1r6
I despise these weird nationalists, but this is the final straw; I've deleted
FB over this. :)

------
m23khan
As a person of color, I would be very grateful if Facebook could also ban
material offensive towards whites as well as hate literature towards whites --
and this should include translation of non-English messages/materials in the
long run to weed such things out.

------
aszantu
They could just use their algorithm to make unwanted content insignificant...
This will not solve the problem with nationalists; it will just drive them to
other platforms, where they will become more radicalized because they only
hear their own voices there.
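A toy sketch of that down-ranking idea (all names and numbers here are made
up for illustration; this is not Facebook's actual ranking code):

```python
# Instead of deleting flagged posts, multiply their feed score by a
# heavy penalty so they rarely surface but are never removed.

def rank_feed(posts, penalty=0.01):
    """Return posts sorted by engagement score, with flagged posts
    down-weighted rather than removed."""
    def score(post):
        base = post["engagement"]
        return base * penalty if post["flagged"] else base
    return sorted(posts, key=score, reverse=True)

feed = [
    {"id": 1, "engagement": 90.0, "flagged": True},
    {"id": 2, "engagement": 40.0, "flagged": False},
    {"id": 3, "engagement": 10.0, "flagged": False},
]

ranked = rank_feed(feed)
print([p["id"] for p in ranked])  # flagged post sinks to the bottom: [2, 3, 1]
```

The flagged post stays on the platform but effectively disappears from feeds,
which is exactly the trade-off being debated here.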

~~~
omegaworks
Your assumption being that Facebook presently serves as an effective platform
for deradicalizing extremists.

Do you have evidence backing that up? Because from what I've seen[1], Facebook
has been serving the opposite purpose: showing people things that reinforce
their beliefs and trigger outrage reactions for clicks.

1\. [https://www.nytimes.com/2018/10/15/technology/myanmar-
facebo...](https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-
genocide.html)

------
pqs
The major problem I see here is why making a specific policy for a specific
group of radicals? This problem could be tackled with the generic rules
already in place, such as "it is forbidden on Facebook to promote violence".

There are many violent chauvinisms in the world that are not white, and they
should be treated equally.

~~~
cannedslime
But it's okay to be a South African black nationalist. It's OK to pose with
decapitated heads if you are of Arabic descent, but it's not okay to be white,
apparently.

~~~
pqs
That's exactly the point. In Africa, India, South-East Asia, and other places
there is a lot of violent racism and nationalism going on.

------
malloreon
Zero credit for deciding white hate is bad 15 years after your company's
founding.

~~~
AgentME
Better late than never.

A lot of commenters seem to be giving them negative credit for this outcome,
and a number of people who claim to want racist content gone are also giving
them negative or zero credit. As someone that wants racist content gone, this
seems like a perverse state of affairs. No wonder they allowed it for so long
given this.

~~~
malloreon
Then facebook should name and fire every single person who fought this long to
keep racism prosperous on facebook.

facebook is not a monolith, it is a collection of people making evil decisions
every single day. When it takes them 15 years to overturn just one of their
evil decisions, there still should be some penalty for it.

~~~
AgentME
I was mainly talking about outsiders who Facebook presumably tries to please,
like commenters in this thread. It could be that any number of Facebook
employees personally wanted to get rid of racist content, but Facebook thought
that would be an extremely unpopular decision. Assuming the sentiment across
this entire HN thread isn't unique to HN, I can see why they might have
thought this would be an unpopular decision and maybe a bit of why they took
so long to make it. I'm very glad they decided to do it.

------
tomohawk
So, who gets to decide what a "white nationalist" is? These guys?

[https://freebeacon.com/issues/soros-bankrolls-unverified-
hat...](https://freebeacon.com/issues/soros-bankrolls-unverified-hate-crime-
database-used-by-major-media-outlets/)

------
vixen99
Some of the people you won't be hearing from on FB are here:
[https://en.wikipedia.org/wiki/List_of_X_nationalist_organiza...](https://en.wikipedia.org/wiki/List_of_X_nationalist_organizations)
where X = white.

------
return0
Good. Facebook is not a free speech platform; it's a medium like the old
media. The more mature it gets, the more it will look like TV. They have no
reason to pretend to appeal to the geeks or their besties in the valley -
they're big enough on their own. Good. On to other ventures.

------
ordu
_> We didn’t originally apply the same rationale to [...] because we were
thinking about [...] things like American pride and Basque separatism [...]_

What do they mean by "American pride"? Googling it brings up a lot of
marketing; I don't know how to filter all of that out to find the underlying
idea.

~~~
Dylan16807
Search "patriotism".

~~~
ordu
So it is not a meme, like a song or some group of active people from the past
or anything like that?

~~~
Dylan16807
Correct, they were referring to a very general concept as far as I am aware.

------
smarri
I reduced my chance of seeing toxic content by deleting Facebook.

------
Causality1
"Facebook to pretend they're banning white nationalist content by adding a
couple of keywords to a blacklist."

They will put zero real effort into this because they put zero effort into
everything.

------
tu7001
Anybody expected freedom of speech in Facebook?

------
daveheq
What if they end up banning dissent material that merely looks like white
nationalist material because of popular mischaracterizing narratives by fear-
mongering clickbait newsfeeds?

------
europsucks
But what exactly is "white nationalist content"? If I say something positive
about Trump, will it already get me banned? What if I disagree with something
a PoC says?

Of course FB as a private company is entitled to censor whatever they want.

In a general sense, I don't really understand the need, as usually in social
networks like Facebook people only see the stuff they want to see. There are
lots of tools to facilitate your own filter bubble, and people use them. So
the worry that "ideology x is spreading via Facebook" is somewhat unfounded in
my opinion.

------
Octoth0rpe
About damn time.

------
fullshark
How does the ban work? The user is banned? The content is shadowbanned? What
exactly are they doing? I can't figure it out from the message.

------
lazyjones
If they ban it like they claim to ban ISIS content or respect our privacy
(Zuck's "privacy first" vision), nobody needs to worry.

------
ptah
interesting 180 degree turn for them. will they also stop banning people for
calling out nazis?

------
chubot
I don't have a strong opinion on this particular issue, but I don't think your
comment gets to the core of it.

If Facebook decided to ban all Chinese, Islamic, or African-American content,
then the conversation would be completely different. People wouldn't say, "so
be it -- go set up your own servers".

Obviously they have to make a value judgement, and they have done so here and
said as much (which I appreciate). In other words, they made a considered and
conscious decision that's beyond my expertise. But it's still a value
judgement about particular content. The decision really can't be made on an
abstract "free speech" basis.

~~~
joshuamorton
>If Facebook decided to ban all Chinese, Islamic, or African-American content,
then the conversation would be completely different. People wouldn't say, "so
be it -- go set up your own servers".

This is because we have laws about discrimination based on race. White
_supremacy_ is not a race. You're free to discriminate against someone due to
the views they hold.

~~~
justinmchase
But you are not allowed to suppress the speech of people based on the views
they hold. The first amendment does indeed protect even racists.

Or at least corporations and the government are not allowed to.

~~~
joshuamorton
Just the government. A corporation is not at all beholden to the first
amendment.

------
smsm42
Legally, of course, FB is completely right. Morally - try to replace "white
nationalism" with, say, "communism" (one can argue communists killed a lot of
people, so why not?), "Christianity", "Islam", "Black separatism",
"Wiccans"... I could continue forever, but I hope you get the point. If you're
consistently fine with any $GROUP put there, no matter how much you love it or
are disgusted by it, then your position is consistent and you are one of the
roughly 0.001% of people who hold a consistent free speech position. For the
rest, the answer would be "but this is completely different - $INGROUP cannot
be banned, they are nice people who are sometimes misunderstood, but $OUTGROUP
are definitely all violent bigots who should be banned everywhere for
everybody's safety".

Of course, FB does not owe anybody a consistent free speech position, or
support for free speech at all, or an inclusive platform. They could convert
their site to a fan club of Jar Jar Binks tomorrow and make every user pledge
loyalty to Supreme Lord Jar Jar or take a hike. Completely within their
rights. Presenting themselves as an open platform while executing clear
political censorship is less morally clear (though of course still legally
completely OK). It doesn't matter that the political censorship is now
directed at a group that is an outgroup for most people here - it always
starts that way. It never ends there.

~~~
beat
Morally, FB is completely right.

They only become wrong by running down your slippery slope. They're not
banning Christian speech, or Republican speech, or whatever.

The existence of a slippery slope does not demonstrate that everything that
could reach the slippery slope is de facto immoral.

------
KorematsuFred
"But over the past three months our conversations with members of civil
society and academics who are experts in race relations around the world"

Sorry, I do not agree with this sort of shit. I left Facebook as a platform
long ago. I do not want some PhD in Gender Studies and Race Relations to
dictate what I can and cannot read.

------
Lidador
How about blue, red, green, black, yellow or purple nationalist content?

------
jeffdavis
The US President has identified himself as a "nationalist"[1], and appears to
be racially white. So how would this policy apply to Trump supporters and
their speech?

This is a general problem with trying to ban speech. It's often very hard to
figure out what someone intends to mean versus how others interpret it versus
what it "really means". Especially when symbols, slogans, etc., come into
play, not to mention different audiences. That ensures that this policy will
be applied inconsistently and have unintended consequences, and create a
flurry of argument over what was banned, by whom, and why.

It's just a mess and does not help the discourse.

[1]
[https://www.usatoday.com/story/news/politics/2018/10/24/trum...](https://www.usatoday.com/story/news/politics/2018/10/24/trump-
says-hes-nationalist-what-means-why-its-controversial/1748521002/)

~~~
dragonwriter
> The US President has identified himself as a "nationalist"[1], and appears
> to be racially white

A white nationalist is not the same as a nationalist who is white; a white
nationalist is one who adheres to the ideology of white nationalism.

~~~
jeffdavis
I understand that, but I was trying to point out the difficulty in drawing
that distinction when it comes to applying a censorship policy.

------
d_burfoot
I think this is bad news for Facebook, the tech industry, and America as a
whole. This kind of activity is going to prompt an enormous backlash from the
right. And the right-wingers aren't going to make fine distinctions between
the good tech companies and the bad ones.

Censorship may be a necessary evil, but if so, it must be done by duly elected
or appointed government officials. FB and other tech platforms should build
APIs that allow government officials to make the relevant decisions about when
a piece of content is unacceptable. That is a principled and nonpartisan
strategy that will allow the tech companies to focus on what they're good at -
technology innovation - and avoid what they're bad at - politics and public
relations.

------
ghobs91
Far right: private businesses such as bakeries should be allowed to refuse
service to whoever they want!

Also the far right: why is Facebook censoring my ability to spread hate
speech!

I'm oversimplifying it, but the hypocrisy and lack of self-awareness is
astounding.

~~~
MockObject
Seems no more hypocritical and self-awareness-lacking than arguing that
Facebook has the right to turn away white nationalists, but that a bakery must
serve all the cakes.

Either way, it's conceptual gerrymandering to carefully carve out a wiggly
boundary that -- strictly by coincidence! -- protects their own side and
restrains the other side.

~~~
ghobs91
You're equating someone asking for a cake for their same-sex marriage to hate
speech? Are you kidding?

~~~
MockObject
No, I'm equating one business refusing a customer to another business refusing
a customer. That is how analogies work.

------
dbg31415
Cool, morality determined by ad revenue opportunities.

What could go wrong?

Clearly nobody is going to shed any tears over Nazis... but is it wrong to
support Palestinian claims to Jerusalem? Where's the line?

I don't like this move since I feel it will just push people into places where
we can't determine real identification, should that be needed after a crime.

I don't like this move because I don't, fundamentally, think we should all
have to agree with Zuck to have a voice online.

I don't like this move because it's reactionary, and there's no consistent
philosophy behind why these groups would be banned but not others.

Just feels skeezy to censor people, no matter how skeezy those people are.

------
featherrust
Gross. About time to replace facebook.

------
umvi
> Last fall, we started using [machine learning and artificial intelligence]
> to extend our efforts to a range of hate groups globally, including white
> supremacists

So... is this going to be a train wreck like YouTube's content ID where
certain keywords will trigger the robots to censor my post with no human to
appeal to?

"Sorry, you said: '80 years ago today Hitler committed suicide and axis power
was extinguished.' This post contains keywords associated with dangerous
groups and individuals and has been removed."
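For illustration, a naive keyword filter of the kind I'm worried about (the
blocklist and posts are invented for the example; this is not Facebook's
actual system):

```python
# A context-free keyword filter flags historical or anti-racist posts
# just as readily as actual hate speech.

BLOCKED_KEYWORDS = {"hitler", "white nationalism"}

def is_flagged(post):
    """Flag a post if it contains any blocked keyword, with no sense
    of context or intent."""
    text = post.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

history_post = ("80 years ago today Hitler committed suicide "
                "and Axis power was extinguished.")
print(is_flagged(history_post))  # True: a neutral history post gets censored
```

Without a human in the appeal loop, that false positive just stays removed.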

------
golergka
I would love for all people around the world to become level-headed and
rational enough that we can have complete and total free speech. Personally, I
enjoy 4chan and don't experience any negative emotions conversing with people
who openly advocate genocide that would target me and my family.

However, people en masse are different, and tech platforms have to acknowledge
that. So, I believe it is a good thing to ban and police hate speech, at least
when private companies do it (and not government).

~~~
fixermark
Without delving too deeply, because I don't want to engender any negative
emotions: do you take the people who openly advocate genocide against you
seriously?

If you do and it doesn't cause you to experience any negative emotions, I am
surprised.

If you don't, I am also surprised. In light of what has happened in Pittsburgh
and Christchurch.

~~~
golergka
My mother and grandparents have had to make a run for a bomb shelter a few
times this week. So, yeah, I don't think that people who have "from the river
to the sea" in their bio are joking.

------
shiado
.

~~~
dragonwriter
[For context, the post above, before all of its content was edited out, called
for “enforcing” CDA Section 230 against tech firms by making them liable as
publishers for user content]

You aren't calling for 230 to be enforced, you are calling for it to be
_abolished_.

Which is already happening one step at a time (SESTA/FOSTA), each of which
radically accelerates self-censorship. Actually, I suspect a lot of the self-
censorship that happens that _isn't_ connected to legal changes but is
connected to popular mood is to _avoid_ political pressure for new limits on
the 230 safe harbor which would force even more drastic self censorship.

------
Inu
Might sound good in theory, but in effect it means that some low ranking
Facebook employee will have to decide whether Jordan Peterson is allowed to
publish on Facebook or whether he is to be categorized as a white supremacist,
a decision the outcome of which will vary according to ideological
disposition, the ability to read and comprehend texts, and the power to
navigate the realm of ambiguity and conflicting interpretations.

------
seventytwo
GOOD.

Now do anti-vaxxers and flat earthers.

------
oyvey
I would never have become sympathetic to the ideas of white nationalism if it
wasn't for the exposure to them on mainstream media platforms. This makes me
mad as hell. How am I supposed to call this a democracy? Might as well live in
communist China.

------
normalperson
will they ban Black Nationalism as well?

------
otabdeveloper2
Thought experiment for ya:

> Facebook to ban homosexual hookup content

How about that, eh? Funny how a simple change of political platform will
U-turn your support of so-called 'free speech'!

~~~
darkwizard42
There is a long way from 'free speech' to 'free speech that actively
encourages discrimination, harm to others, or infringing on others' rights'.

As a private platform, Facebook is free to curb the topic you have suggested,
in the same way you are free to avoid said private platform.

------
cowwithbeef
T

------
atonse
Good to see the Valley is starting to get off its "we are just a platform, all
information should be free and neutral no matter how bad, so we're not
responsible" extremism of the past few years and understand that some forms of
speech are harmful and should not ever be encouraged and accelerated on any
platform or technology, ever.

To the people that say "Where does it end?" – that's pretty easy. Let's start
with curbing speech that encourages harming others.

They've been dragged kicking and screaming into this realization but we'll
take it.

~~~
Kalium
> To the people that say "Where does it end?" – that's pretty easy. Let's
> start with curbing speech that encourages harming others.

This may come as a surprise to you, but this response will not be comforting
to those with genuine concerns about civil liberties. Avoiding the question of
where to draw the final line with the question of where to draw the _first_
one just implies that you intend to draw more lines in the future.

~~~
mcphage
> Avoiding the question of where to draw the final line with the question of
> where to draw the first one just implies that you intend to draw more lines
> in the future.

No, it just implies that we don't need to have the perfect final answer now.
Maybe we'll draw more lines, maybe we'll find out we don't need to, maybe
we'll decide that this line was drawn too far and pull it back. We don't need
to pretend that we have all the answers yet—we don't even need to pretend we
have all the questions yet.

~~~
Kalium
You are very wise and thoroughly correct!

Perhaps people might consider offering such elegant points as yours, instead
of making the claim that the current line is correct and that's all that
matters.

~~~
ssully
I would argue that the people who are concerned about "what future lines will
be drawn" when the first line being drawn is banning white supremacists are
the ones who should be making more elegant points.

------
tathougies
Why are white people specifically targeted? As someone of an Indian
background, I've seen plenty of Hindu nationalist content on Facebook, and --
despite it actually affecting my Christian family who are still in India --
there is no ban on that content. Such nationalism has led to more violence and
death than any of the white nationalism stories.

~~~
odorousrex
Cynical Answer:

It's affecting FB's bottom line.

White nationalism is in the news and on the rise in the US/Europe and
therefore spooking advertisers who only really care what the pundits in
US/Europe are talking about.

This soothes them over = money for FB.

My hope is that the algorithms and models developed for slowing the spread of
white nationalism will also work (down the road) for things like Hindu
nationalism, Chinese nationalism, etc.

~~~
baddox
Is that really cynical? The presence of hate groups on your platform being bad
for your bottom line is a _good_ thing, right?

~~~
odorousrex
> The presence of hate groups on your platform being bad for your bottom line
> is a good thing, right?

Yes it undoubtedly is, but they only seem to care about it _now_.

I would love to see a more proactive approach, but unfortunately proactive
approaches don't often generate profit.

~~~
baddox
Sure, doing it earlier would be good, but at any point in time the only
choices are to do something now or do it further in the future. So I'm glad
they chose now.

------
cynoclast
I can't believe people are applauding this.

The whole "you can go make your own platform, facebook/google/twitter/youtube
doesn't owe you anything" excuse is bullshit. Look at what happened when
people DID try to make their own platform with Gab. They were DDOSed and
people uploaded child porn and then reported it to try to get the whole site
shut down to prevent people who were silenced elsewhere from talking there
too. They're behaving more like the Nazis than the white nationalists!

Cheering on authoritarian censorship of things you don't want to hear doesn't
make you virtuous; it makes you a coward.

The solution to bad speech isn't censorship, it's better speech:

First they came for the socialists, and I did not speak out— Because I was not
a socialist.

Then they came for the trade unionists, and I did not speak out— Because I was
not a trade unionist.

Then they came for the Jews, and I did not speak out— Because I was not a Jew.

Then they came for me—and there was no one left to speak for me.

-Martin Niemöller

------
egao1980
This is clear discrimination. As if people with different shades of skin being
nationalists are nicer than white nationalists. Is it 1984 already, or do we
still have some time?

~~~
dragonwriter
> This is clear discrimination

Every decision is discrimination.

> As if people with different shades of skin being nationalists are nicer than
> white nationalists.

A “white nationalist” isn't “a nationalist with white skin” but “an adherent
to the ideology of white nationalism”.

You can be a white-skinned nationalist and not be a white nationalist.

> Is it 1984 already, or do we still have some time?

1984 didn't feature discrimination against either white nationalists or white-
skinned nationalists, so it seems unrelated to your complaint.

~~~
egao1980
White nationalists = nationalists who think that white people are better than
others, but we have guys from ICGJC and other afrosupremacist groups, etc. Why
not ban them all?

1984 is mentioned as a reference to the dystopian world of propaganda and
Facebook / Google are quickly turning into corporate Ministry of Truth.

~~~
dragonwriter
> White nationalists = nationalists who think that white people are better
> than others

No, white nationalism is not “nationalism” + “white preference”. It's closer
to nationalism around a concept of a white nation.

> but we have guys from ICGJC and other afrosupremacist groups, etc. Why not
> ban them all?

Because we don't see scores of bodies piled up motivated by the message of
those groups.

------
duado
And the slippery slope begins. I am not a white nationalist (I’m not even
white) but I think that recent immigration trends in Europe are illustrative
and that we should avoid a similar situation occurring in the United States.
Am I now a Nazi?

------
Simon_says
My router bans Facebook content.

------
089723645897236
Nothing wrong with deplatforming.

------
djanogo
As a brown person this pisses me off; liberals keep pushing for these social
justice rules on the platforms to silence content they find offensive.

If your argument is that "they don't have a right to access the platform", you
are acting ignorant when you know there are only a handful of platforms on the
internet.

The next step would be blocking DNS registrations of websites whose content
you don't agree with. Oh wait, there is an entire country which does this, and
you disagree with that country's rules too.

~~~
mike10010100
> you are acting ignorant when you know there are only a handful of platforms
> on the internet.

Your continued participation in said platforms ensures that they will remain
the "only" platforms on the internet.

There is absolutely nothing guaranteeing you a right to have a private entity
host your information, especially when you're paying absolutely nothing for
them to do so.

> The next step would be blocking DNS registrations of websites whose content
> you don't agree with

Slippery slope fallacy incoming!

~~~
jakeogh
You do realize that calling the slippery slope a fallacy does not mean it's
something you should ignore... right? Incrementalism is, by far, the most
common technique used to implement totalitarianism.

~~~
mike10010100
Blocking _access_ to information via DNS blocks is an entirely different
concept than private entities not wishing to host extremist content on their
private platforms.

~~~
jakeogh
Sounds like a slippery slope.

~~~
mike10010100
It really isn't.

------
rc_kas
It wasn't banned already? What the hell?

~~~
weberc2
From TFA:

> Our policies have long prohibited hateful treatment of people based on
> characteristics such as race, ethnicity or religion – and that has always
> included white supremacy. We didn’t originally apply the same rationale to
> expressions of white nationalism and separatism because we were thinking
> about broader concepts of nationalism and separatism – things like American
> pride and Basque separatism, which are an important part of people’s
> identity.

------
DigiMortal
Censorship is evil, Facebook sucks, what's new?

------
GreaterFool
What exactly is "white nationalism"?

------
ghani
No! That is a terrible thing.

------
whatiseu
Does this include policing the debate about EU expansion to Turkey or Morocco?

------
mamon
Ok, what about black nationalist content? Asian nationalists, Jewish
nationalists?

------
piokoch
"Today we’re announcing a ban on praise, support and representation of white
nationalism and separatism on Facebook and Instagram"

First of all, why white nationalism? How about black or yellow nationalism,
are they OK?

Second thing: they will ban "separatism". Really? Are you FB guys really going
to ban Kurds who want their own land? Are you going to ban Palestinians who
want their own land? If FB had existed 30 years ago, would it have banned the
anti-communist opposition in Central and Eastern Europe that was fighting to
separate from the Soviet Union? What about the Scots, if they want to separate
from the UK? Will they be banned? Same with the Catalans?

------
cowwithbeef
I am surprised their PR team didn't advise them to use the neutral term
"ethno-nationalist."

------
oh_sigh
Why not just any race-based nationalism? It seems weird that facebook would
ban white nationalism but black nationalism can go on.

------
ProAm
Why not ban all racism, all hate, everything? Doing just one is PR only.

~~~
untog
Read the article.

~~~
ProAm
I did, and this is so selectively enforced it's comical. That's my point.

~~~
untog
I guess you must have brushed over the part where it says:

> Our policies have long prohibited hateful treatment of people based on
> characteristics such as race, ethnicity or religion – and that has always
> included white supremacy. We didn’t originally apply the same rationale to
> expressions of white nationalism and separatism because we were thinking
> about broader concepts of nationalism and separatism – things like American
> pride and Basque separatism, which are an important part of people’s
> identity.

Very specifically, it points out that they are not just "doing one", they are
bringing one in line with a broader policy that is applied to many different
types of hate. So I'm really not sure what point you're trying to make.

~~~
ProAm
Why not just ban all nationalism or nationalistic posts? Depending on one's
relative position, it's hateful towards others. But regardless:

> Our policies have long prohibited hateful treatment of people based on
> characteristics such as race, ethnicity or religion

I'm just saying this has always been selectively enforced at FB; the reason
they are speaking to white nationalism now is because of recent events and
because they are in the spotlight. I feel this is purely a convenient PR
campaign.

Not that it's an easy topic to control or address; all the platforms have
issues with this, but FB is being opportunistic for press.

~~~
Dylan16807
Most "nationalism" is based on physical location and community and you're not
going to get a lot of support for banning that.

------
patrickg_zill
Will Facebook ban Chinese nationalist, Black/Mexican/etc. nationalist accounts
also? Or maybe Jewish nationalist content?

If not, then why not?

------
cannedslime
I just realized that Facebook has deleted most Generation Identity pages. I'm
not affiliated with GI, but afaik they have ONLY done peaceful protests and
they aren't even white nationalists... What the hell, this is beyond
dystopian.

------
kyleperik
Quite clearly this is just a move to gain back support.

But this has nothing to do with free speech. If I tell someone something, they
have no obligation to broadcast what I said. Why should Facebook act
differently?

I don't like that they're targeting such a specific group. I think we should
be on the lookout for real racism, and this is certainly an example of that.

~~~
oyvey
The purpose of free speech is that you can influence the reality you live in;
without that it's useless, might as well not have it. As such, free speech
must be allowed on the most popular social platforms, just like free speech
had to be allowed in the town square.

~~~
kyleperik
Facebook has no obligation to share what you share. No one has that
obligation. They may be a "platform" but that doesn't mean it's a free playing
field.

~~~
oyvey
Do you also support the right to discriminate based on race?

------
qwsxyh
Very glad to see everyone here is defending white nationalists. Exactly the
kind of content I expected to see on this website.

First they came for the Nazis. And I did not speak up because I was not a
Nazi.

Then they came for the pedophiles. And I did not speak up because I was not a
pedophile.

Then they didn't come for me, because I'm not morally reprehensible, and life
was good.

------
781
So is the Scottish independence movement now forbidden on Facebook?

[https://en.wikipedia.org/wiki/Separatism_in_the_United_Kingd...](https://en.wikipedia.org/wiki/Separatism_in_the_United_Kingdom)

------
caprese
Nice move

I was annoyed that Instagram has now become just reshares of Twitter and
Facebook posts that spread the same message without being indexed as easily as
the text versions are.

Hope they don't go after things I like and target my accounts next! I know the
limitations of their process and the lack of appeals or justification.

Also easy enough to find my friends on group chats elsewhere

I would have very different views if these were state run systems, but
fortunately that's not relevant here.

------
cannedslime
When you ban groups with relatively harmless names such as "It's OK to be
white", I think you just end up giving people an eye-opener... It's NOT OK to
be white? I guess that makes it easier for Caucasians to decide their
political affiliation... I mean, just search for terms like "kill white
people", "kill whitey", etc. There are plenty of people glorifying white South
Africans getting slaughtered in their bedrooms on Facebook. But I guess that's
OK, because fuck dey white man am i rite?

------
ardy42
> Our efforts to combat hate don’t stop here. As part of today’s announcement,
> we’ll also start connecting people who search for terms associated with
> white supremacy to resources focused on helping people leave behind hate
> groups. People searching for these terms will be directed to Life After
> Hate, an organization founded by former violent extremists that provides
> crisis intervention, education, support groups and outreach.

I wonder if they can re-purpose this technology to re-direct people who post
content critical of Facebook and social media to pro-Facebook propaganda.

------
dna_polymerase
I am very much pro free speech, in the sense that nothing should be
disallowed. I think platforms should be built in a way that an
ultra-(right/left/X) bubble really cannot arise. Facebook should build
mechanisms like the ones found here on HN. If I wrote something really
offensive, I'd get downvoted and eventually my post would vanish. That
feedback from the community is extremely important, because I get to learn
what goes and what doesn't. Just deleting unwanted opinions makes things
worse. People build up aggression against "the system" and feel censored by
the wrong mechanisms. If, however, I see that the majority dislikes my
opinions, I know where the problem is.

------
root_axis
What is it about nazis and white supremacists that causes everyone to boot
them from their platform? From cloudflare to microsoft to twitter and now
facebook, why is everyone so hostile to them? They just can't catch a break,
if this keeps up they'll only be able to host content on platforms that are
supportive of the message or they'll be forced to build their own platforms
and since so many people are hostile to them already nobody will use their
platform except people with similar beliefs (e.g. voat). Should these
companies really be allowed to just ban them simply because they regard their
ideologies as repugnant?

edit: I'll note here that the above should be read as tongue-in-cheek sarcasm,
but I do think the question is worthwhile when considered in earnest by those
who support nazi speech on private platforms.

~~~
deegles
Yes.

------
kadendogthing
I'm going to caution against people using the phrase "free speech" as a
thought-terminating cliché when discussing this.

Banning speech can and typically has been harmful, especially when governments
get involved with doing it. No one is ignorant about this fact. But we're
collectively discovering that not doing anything about what can be published
and said is just about as harmful, if not more so in concentrated
circumstances. There will be irrational people who act irrationally on
fundamentally erroneous information. It's a practical matter, not an ideological
one. White supremacist ideas aren't up for debate. There isn't a point in
debating them. And the people adhering to these ideals aren't looking for a
debate and most definitely won't consider other ideas. You can't debate
rationally with someone who hasn't arrived at their position in a rational
manner.

There's also something else at play here. We're seeing new systems that
replaced old ones re-implement "lost" functionality they thought wasn't
needed. Turns out it was, in fact, needed. The old media empire of yesteryear
was always subject to regulations around the content they spread.

If bitcoin is an exercise in why we need financial regulations, facebook and
other social media networks are an exercise in why we need content and
publishing regulations.

~~~
someguydave
Are black nationalist ideas up for debate? Who gave you the power to decide
what is up for debate?

~~~
kadendogthing
Removing yourself from any kind of rational framework removes the quality that
makes an idea or viewpoint worth considering, especially when those ideas and
viewpoints have been tried, on top of just generally being a Bad Idea (TM).

So I haven't done anything. There's no need to start an epistemologically
bankrupt conversation here, or to engage in philosophical waxing about
intellectual curiosity surrounding viewpoints that serve no purpose.

