
I don't understand why no one pushes for a justice process for taking down hate speech and bad speech.

Instead we are basically begging Facebook to be the one who makes this kind of judgement. Obviously, Facebook doesn't want to do that. Nobody wants to do that.

I'm not talking about obviously blatant cases. I'm talking about a high profile one like Trump tweeting something bad.

Even a senator and former lawyer like Elizabeth Warren doesn't ask for a due justice process. She is also screaming at Facebook to just take down Trump's posts.

I imagine, if what Trump posts is extremely bad and illegal, it won't be hard to prosecute him or get a court order to ban or take down his posts.

(Apologies for inaccurate language usage. I'm not familiar with these legal terms, but I hope you get the main idea.)



It depends on what you mean by a justice process, but there is one already. In the U.S. the right to free speech is pretty securely enshrined in law (knocks wood). There are definitely limits to the first amendment, but speaking very broadly, private institutions can do more to limit speech on their platforms, whereas the justice system's role is to protect speech. Taking it out of the hands of Facebook (et al.) probably would not get you what you're after.


It's a tough problem.

Imagine going out to a bar and some nazi walks in and starts screaming about this or that group that they hate. At the top of his lungs.

Most places would kick his ass right out.

Those that didn't, the customers would probably leave.

Social media is a bit more complex than all that, but in some cases, it really is like that bar where the guy stays, and invites his buddies, and no one normal wants to go there.


The problem is that the speech that these people want censored isn't illegal. We have a strong constitutional prohibition against censorship.

Many activists would prefer that we didn't have this prohibition, but they don't have the political clout to get it repealed --- rightly so, because despite everything that's happened, explicit censorship is very unpopular.

Because the activists can't get the state to censor the public, activists have used increasingly underhanded tactics to get tech companies to censor the public. They've been very successful so far, but there's a growing resistance to their antics.


One of the most successful PR strategies Facebook has used throughout this is to position it as a free speech issue and the boycott as calling on them to censor speech. It's been so successful that I've seen it repeated a number of times on HN.

To the extent that they can make the argument free speech vs. not free speech, of course they win hearts and minds, because as you say, censorship isn't very popular.

The problem is that by making this all about censorship, they can ignore any responsibility for the harm they create in other ways. For example, creating incentives for publishers to create divisive content for the sake of enraging people, or recommending people join white supremacy groups. As far as I can tell, it was these sorts of measures that the boycott organizers called for.

The cynicism of Facebook's PR “free speech” stance is especially annoying given their arbitrary and non-transparent block of Dreamwidth a few weeks ago.[2]

1. https://www.buzzfeednews.com/article/ryanmac/facebook-employ...

2. https://andrewducker.dreamwidth.org/3861716.html


> create divisive content

I'll quote a paragraph of http://www.paulgraham.com/say.html

> We have such labels today, of course, quite a lot of them, from the all-purpose "inappropriate" to the dreaded "divisive." In any period, it should be easy to figure out what such labels are, simply by looking at what people call ideas they disagree with besides untrue. When a politician says his opponent is mistaken, that's a straightforward criticism, but when he attacks a statement as "divisive" or "racially insensitive" instead of arguing that it's false, we should start paying attention.

The most interesting part is that the content that was classified as "divisive" when that essay was written is not the same as what is classified as "divisive" now.


Whatever you want to label it, surely I'm not the only one who has observed that the best way to get an article shared on social media is to amp up how controversial it is. Then people who agree share it to agree with it, and people who are enraged by it share it because they are enraged by it.

Whether you want to call this “divisive content” (which definition fits it pretty neatly, in spite of PG's good essay) or “scissor statements” or something else is up to you, but it's a real phenomenon.


I agree, but I prefer to call them "flamewar topics".

It is not only used on social media sites, it is also used by journalists in newspapers and on TV. (Sometimes it is more evident in the sports section. Every time the national soccer team loses a match, there is a tempest in a teapot about each one of the decisions of the team manager and the players.)


"If you have always believed that everyone should play by the same rules and be judged by the same standards, that would have gotten you labeled a radical 50 years ago, a liberal 25 years ago and a racist today." -- Thomas Sowell (who, ironically, is the topic of a similar fight about free speech on a different HN thread right now).


The implication is that if it's not false, it must be true.

But that does not follow. Mostly, when people have trouble calling something false, it's because it is obviously ambiguous and the meaning is disputed. The point of words like "inappropriate" and "divisive" is to shift to something less easily disputed.


But we all agree that hate speech should be punished by law, and it currently is punishable.

It's that Facebook shouldn't be the one who makes that judgement.

And that's my point. We should get a court to make that judgement.


No, we don't all agree on that. That's my entire point. The position you're espousing is not shared by a big enough fraction of the US public to get the country's free speech protections overturned, so you're just going to have to deal with people saying things you dislike.


Do you not know what is happening in Hong Kong right now? You don't let the government decide what is "hate speech" and what is not.


On the flip side, you don't let Facebook decide that either.

As a society, there doesn't seem to be a solution.


If FB wants to decide that what I say is not allowed on their platform and consequently ban me, that's fine by me. It's their platform, so their rules. Very different from a government banning speech, and hauling people away to be tortured like in China's case.


You make it sound like, since it's a private business, they can impose any rule they want.

I don't think I agree with that.


Freedom of speech is the solution.


> But we all agree that hate speech should be punished by law, and it currently is punishable.

We don't, and it isn't, at least not in the US.


It is sometimes punishable by federal law: https://www.pbs.org/newshour/nation/how-federal-law-draws-a-...


Among the organizers' 10 headline demands (https://www.stophateforprofit.org/productrecommendations) are:

4. Find and remove public and private groups focused on white supremacy, militia, antisemitism, violent conspiracies, Holocaust denialism, vaccine misinformation, and climate denialism.

6. Stop recommending or otherwise amplifying groups or content from groups associated with hate, misinformation or conspiracies to users.

7. Create an internal mechanism to automatically flag hateful content in private groups for human review. Private groups are not small gatherings of friends - but can be hundreds of thousands of people large, which many hateful groups are.

So while they do want other things as well, the censorship seems like a pretty central part of the platform.


Fair point, I'd agree that 4 and 7 verge on censorship (I'm less convinced about 6 -- I draw a line between functions that Facebook performs as a content transmitter and as a discovery platform. I would argue that disrupting content transmission constitutes censorship, but curating the discovery platform does not.)

Still, it gets to my point that 7 or (arguably) 8 of the 10 demands do not directly call for censorship.


For number 4, though, who should make the judgement of whether speech falls into that category?

Should it be Facebook or our justice system?

I'd rather not have Facebook do that.


It can't be our justice system without a constitutional amendment - which leaves just Facebook.


Then, pushing for an amendment would be the right approach and far better than having Facebook be a judge.

Using Facebook as a judge becomes more like a popularity contest: whoever can scream the loudest and is willing to harass Facebook employees gets results.


I can understand some random activists doing this. But Warren is a senator and former lawyer. Even she didn't want to go the legal route. It's pretty disappointing. Instead, she pulled a stunt with the fake news Facebook ads stuff.

> The problem is that the speech that these people want censored isn't illegal.

Please excuse my limited legal knowledge. But I thought hate speech and fake news were illegal.


> But Warren is a senator and former lawyer. Even she didn't want to go the legal route. It's pretty disappointing.

She doesn't want to go the legal route because she knows she'd lose. Free speech protections in the United States are ironclad. In other jurisdictions, e.g., Germany, they're not so strong and activists in those countries have succeeded in making the state compel tech companies to censor. This approach will not work in the United States.

> Please excuse my limited legal knowledge. But I thought hate speech and fake news were illegal.

One of the most infuriating habits of the authoritarian activist types is their way of pretending that whatever they don't like is de jure illegal. That they've convinced people that there's some law against "hate speech" is sad.

No, hate speech and fake news are not illegal, nor should they be: any prohibited category of ideas invariably expands to encompass whatever it is that the people defining the category dislike. The strict American prohibition of censorship is the product of centuries of experience in England with exactly this sort of creeping totalitarianism. Humanity has not changed since then. Power still corrupts.

https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...


Wait a min, hate speech isn't illegal?


Not in the USA. It's been firmly ruled protected by SCOTUS, with the same caveats tied to it as other speech. In other words, a direct call to harm is not protected; saying you hate a group or a subset of its characteristics is absolutely protected.


Not in the USA, no


> the speech that these people want censored isn't illegal.

Nor, 99% of the time, particularly hateful.


> bad speech

Lol. Yeah, we should totally ban “bad speech”.


Come on, man. I'm not a native English speaker, and I'm not well-educated in law or social studies.

By bad speech, I mean the kind of speech that a lot of people (e.g. Warren) were yelling at Facebook to ban accounts or take down posts over.

I would appreciate it if you tried to get the main point. Thank you.


> I don't understand why no one pushes for a justice process for taking down hate speech and bad speech.

The term 'justice' implies government. The administering of justice is ostensibly a function of government, not private entities like Facebook. The unspoken understanding that perhaps you are missing is that it is taken as given that any 'justice process' created to prevent and/or punish unpopular speech will ultimately be abused and extended to effectively all meaningful speech that the Powers That Be (kings, presidents, whomever) don't approve of. That understanding comes from the experience of history and was the basis for the 1st Amendment of the US Constitution.

It is certainly true that in many places on Earth 'wrong' speech is criminal and is prosecuted. In the US the attempt to outlaw speech or prosecute people for their speech is what is considered 'wrong' and 'bad.' Those points in history when these principles were violated are understood to be mistakes and aberrations.

So you don't get very far in the US with ideas of making speech into a matter for law enforcement or courts. With a few narrow exceptions we don't entertain these ideas and we don't reward anyone that tries it, or even suggests it, as you can see from the moderation.

> By bad speech, I mean the kind of speech that a lot of people (e.g. Warren) were yelling at Facebook to ban accounts or take down posts over.

Senator Warren's views do not have the force of law and no one in the US is required to honor them. Further, Senator Warren's opinions are not universal. Giving Senator Warren the ability to control what expressions Facebook does and does not permit would require something on the order of revolution, probably violent.


Because it’s almost all politics and tribalism. Democrats complain that there aren’t enough content takedowns and Republicans complain that there are too many. No matter what internet companies do, they’ll face tons of criticism.

And both sides think that they support free speech. But to them, free speech means speech they agree with, and everything else is hate speech.


How does your characterization - that Democrats want more takedowns and Republicans fewer - support or even comport with your contention that both sides claim an interest in free speech?



