
My question to everyone supporting restrictions on speech and supporting privacy violations, even for good reasons, is this: are you prepared to have these tools fall into the wrong hands? The same methods used to quickly shut down a dangerous website or to track down a terrorist by their online posts can be used to silence protests and to hunt down political opposition.



What tools are these actually - the delete button?

When it comes down to it, I like moderated spaces on the internet. I don't want to wade through spam, not to mention violent and abusive content. I don't like seeing shock images.

So yes, I support having moderation. The question to me is not yes/no moderation; it's where you draw the line: what are your policies and how are they enforced?

Fundamentally I think the solution is competitors. This was a much easier position to take when the web didn't just have a few social media giants with network effects, but still: if YouTube becomes too draconian, start a new website. The tools to stream video (albeit not at scale) are just an apt-get away.


>if YouTube becomes too draconian, start a new website.

You are clearly ignorant of the kind of shit actual YouTube alternatives have to deal with. Their websites and videos get de-ranked or de-listed from search (Rumble [1]), their apps get removed from the app store (LBRY [2]), and their infrastructure comes under constant attack (BitChute [3]). Meanwhile, the mainstream media smears them at every opportunity.

This is nothing like the environment that Google and YouTube were started in.

https://rumble.com/vcxrxh-rumbles-anti-trust-lawsuit-against...

https://lbry.tv/@lbry:3f/googlebanslbry:7

https://www.investmentwatchblog.com/censorship-bitchute-depl...


There's (somehow) a widely held belief that there are no barriers to entry, on any market, ever. Everything from anticompetitive behaviour by incumbents to the need for enormous upfront capital investment simply does not exist. This has two implications:

- Nothing should be done, whether regulation or consumer action, because everything can and will be fixed by the entry of a new company.

- We are living in the best of worlds. Otherwise, a new company would have sprung to fill the gap and then we would be living in the best of worlds again.

I have yet to convince anyone holding this belief that it may not be the case.


I had never heard of these before. I wish there were a plugin that warned me when I visited a Google-owned platform and suggested alternatives.


Good luck hosting such an extension on Chrome.


And by smear do you mean accurately describing far right conspiracy junk or something else?


You're telling me that incumbent platforms are behaving in an anticompetitive fashion? If only someone would do something.


Sounds like anti-monopoly laws should be enforced, then; that's a separate problem.


IMHO the central question is: do you tolerate zones moderated (or not moderated at all) by their owners exactly how they want it, provided that you remain free to avoid any zone and also to create your own zone(s)?

No one likes spam, abusive content, fake news... However many won't qualify a given content the same way.

IMO, for example, history shows that anyone who thinks he is _right_ and has to educate others, by coercing them to pay lip service to his ideas and by censoring contradiction, walks a path leading to violent and abusive acts.


> IMHO the central question is: do you tolerate zones moderated (or not moderated at all) by their owners exactly how they want it, provided that you remain free to avoid any zone and also to create your own zone(s)?

Is that not what we do currently? (Subject to certain legal restrictions, but generally in the USA the restrictions other than copyright are pretty light. And if they aren't, well, Tor.) You have pretty wide discretion if you self-host, especially on bare metal. Not perfect, mind you, but pretty wide.


DNS registrars have also removed sites with views they don't like.


That's about where I would draw the line. Domain registration needs to be regulated or run by the gov't so that it can be protected by the first amendment. Pretty much everything else can be solved by enforcing antitrust laws so as to prevent one entity controlling too much of the space.


The other challenge is DDoS protection. Any controversial site is going to attract large DDoS attacks; if the big DDoS mitigation providers drop a site, that site is either effectively dead or pushed into the arms of a sketchy "semi state sponsored" mitigation provider like DDoS-Guard.


Yes, indeed, by and large (in 'Occidental' nations). However, over the last ~25 years there have been, IMO, more and more pushes toward censorship, from memory laws ( https://en.wikipedia.org/wiki/Memory_laws ) to deplatforming ( https://en.wikipedia.org/wiki/Deplatforming ). Although none of these is really new, it seems to me that they are gaining speed.


Yep, that's part of why reddit is so great.

> Fundamentally I think the solution is competitors. This was a much easier position to take when the web didn't just have a few social media giants with network effects, but still: if YouTube becomes too draconian, start a new website. The tools to stream video (albeit not at scale) are just an apt-get away.

There's no serious YouTube competition because YouTube is not (just) a video host. Video hosting isn't that big a deal (though hard at scale).

YouTube is a network of videos, creators, and viewers, powered by recommendations, subscriptions, advertisements (paying creators), and massive network effects.

You could build yet another video host and even have niche success with it. But you have no chance of providing the equivalent effect of a video landing on the YouTube frontpage.


>You could build yet another video host and even have niche success with it. But you have no chance of providing the equivalent effect of a video landing on the YouTube frontpage.

Most people on YouTube never even have their videos land on the frontpage, and yet many of them are successful. That network effect and scale isn't necessary for the ability to broadcast and disseminate speech online, so it doesn't really support the premise that free speech on the web (as far as video is concerned) requires YouTube as a host, or that competition couldn't be as effective as, say, repealing Section 230 or having the government take over YouTube and force it to act as a public utility.

At the end of the day, all YouTube is, is popular. They're not like J.P. Morgan, holding monopoly power by physically controlling infrastructure and making competition impossible. The reason there isn't "effective" competition isn't that Google controls the web; it's that no one considers anything less than Google-scale to be "competition," so they write off the tons of other media streaming sites and social media platforms as irrelevant because they can't gain dominant market share. But this conflates the free speech argument with arguments about effective business strategy.


The current climate is one of not allowing people to "start a new website". Sure, one can quite technically run their own platform, but the minute it becomes significant there seems to be a risk of shutdown via external pressure. Deplatforming is a very dangerous tool, the likes of which I think should only be reserved for the most heinous crimes. I have zero qualms about deplatforming a child molester, for example. But there was just a story on here about how someone lost their job simply for having accounts on Gab and Parler. Not for any content they posted, but for the mere existence of the accounts.

Even as recently as the 80s and 90s, liberals were much more likely to be anti-censorship. Now people who claim that label are all for shutting down "bad" things, without understanding that "good" and "bad" are largely human constructs, and the things that are considered "good" and "bad" can and will change over time. What about in the far future when bigotry inevitably becomes commonplace and systemic in a much worse way than humanity has already seen? Do we really want to just hand these tools of suppression and control to people like that on a silver platter? The entire point of freedom of speech is supposed to transcend that sort of control.

To go on a slight tangent, it's quite similar to how many traditional Christians in America would wish for their religion to have special privileges in law, when they don't realize that it would open the door for any other religion to do the same, should their population grow to the numbers they'd need to influence policy. The very same people who would wish to make Christianity privileged would bemoan Islam one day becoming the majority religion and instituting Islamic law via those same legal mechanisms. Nobody I've talked to face to face in my area has any kind of answer to that one.


Competition makes it harder for malicious actors to manipulate everyone's content, but let's not pretend that companies don't know how to create the illusion of competition [1, 2].

We already have the problem of people creating too many filter-bubbles in their online lives. A fragmented social media environment will not usher in an age where everyone gets exposed to a more diverse set of ideas, we'll just get FoxNewsBook and MSNBCBook, or ChristianTube and MuslimTube, or whatever other social groups happen to be the most defining.

Moderated communities are basically the HOA neighborhoods of the internet. There is nothing wrong with liking the high property values, but competition alone does not prevent their abuses [3].

[1] https://www.buzzfeednews.com/article/expresident/all-the-por...

[2] https://www.foodbeast.com/news/pepsi-boycott-kendall/

[3] https://www.businessinsider.com/personal-finance/homeowners-...


> Competition makes it harder for malicious actors to manipulate everyone's content, but let's not pretend that companies don't know how to create the illusion of competition [1, 2].

In real life, you get to choose the people you befriend or associate with, and we all have a number of circles whose interests don't necessarily overlap. That's pretty much what I have on the internet when using certain social media. It's great, since you don't have spammers, and you don't have to deal with racists or people who just turn up the temperature of a room. Maybe that's the way it should be.

It's also much more difficult for a vitriolic meme to spread, since it has to infiltrate the various disparate communities.

> Moderated communities are basically the HOA neighborhoods of the internet. There is nothing wrong with liking the high property values, but competition alone does not prevent their abuses [3].

OK. Pretty much any places that aren't garbage have moderation of some sort. If nothing else to keep out spam.

And a community isn't an HOA, which you are basically stuck with if you live in one.


Sure, but the goal isn't to prevent people from crafting their own filter bubbles, it's to prevent someone else from controlling bubbles not their own.


Whether or not you even see specific content on Facebook, Google or Hacker News depends on sorting algorithms, so that would be one of the tools used to censor content. It's not just "the delete button", it's so much more, like how DMCA processes are handled or AIs created to find naked people (which affects artists).
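To make the point concrete, here is a minimal sketch of how a ranking function can bury content without deleting it. The field names, scoring weights, and the 0.01 demotion factor are all invented for illustration; they are not any real platform's algorithm.

```python
# Toy feed ranker. All names and weights here are hypothetical,
# invented only to illustrate ranking-as-moderation.

def rank_feed(posts):
    """Sort posts by an engagement score, silently demoting flagged ones."""
    def score(post):
        base = post["likes"] + 2 * post["comments"]
        # Nothing is deleted; a flagged post's score is just multiplied
        # toward the bottom of the feed, where few users ever scroll.
        return base * (0.01 if post["flagged"] else 1.0)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "likes": 900, "comments": 40, "flagged": True},
    {"id": 2, "likes": 50,  "comments": 5,  "flagged": False},
]
print([p["id"] for p in rank_feed(posts)])  # → [2, 1]: the flagged post sinks despite far more engagement
```

The post is still technically "up", so the platform never had to press the delete button at all.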


>Fundamentally I think the solution is competitors. This was a much easier position to take when the web didn't just have a few social media giants with network effects, but still: if YouTube becomes too draconian, start a new website. The tools to stream video (albeit not at scale) are just an apt-get away.

I agree. I wouldn't have a problem with YouTube doing anything if YouTube wasn't the video website. We need decentralization.

But at the same time more and more often I also see these same concerns raised with decentralized networks (going beyond standard practices like blocking and preventing federation with instances), and that gets me worried.


There's always talk about decentralizing content platforms but these decentralized non-blockchain social network and messaging protocols have existed for years:

- Scuttlebutt

- GNUnet

- Secureshare

- Freenet

- ZeroNet

- Retroshare

- Diaspora

- Mastodon

- Matrix

- Cabal

Decentralized platforms put more responsibility on the user (key management, account recovery, content hosting, etc.), which makes mass adoption by mainstream users unlikely unless the decentralized platform is as easy to use as Twitter.


For anyone scrolling past this list, I had a lot of fun with Retroshare. It's like the Napster we never had, but with gardening books instead of music.


They exist, but I'm worried that they can't break out because of network effects and lock in by the big players.

What would be a good way to make these platforms mainstream? Would you support some sort of a mandatory requirement to federate for the big players? Of course, they would still be able to refuse to talk to instances that violate their own policies.


>Of course, they would still be able to refuse to talk to instances that violate their own policies.

But isn't that ability exactly what many people consider a violation of free speech on mainstream social media platforms, only applied to instances rather than accounts?


If mainstream platforms were federated, you could move to a third instance that would communicate both with the big instance and the banned instance. Hell, you could even host your own instance and still be able to talk to people on the big social network. That's the missing piece that would make those "you don't have to use that website" arguments valid since it would greatly lower the cost of moving away from one social media website to another.
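The reachability argument above can be sketched as a toy model. The instance names and block lists are invented for illustration, and real federation protocols like ActivityPub are considerably more involved:

```python
# Toy federation model: two instances can exchange posts iff neither
# blocks the other. All instance names here are hypothetical.

blocks = {
    "bigsocial.example": {"banned.example"},  # the big instance defederates the banned one
    "banned.example": set(),
    "bridge.example": set(),                  # a third instance that blocks no one
}

def can_talk(a, b):
    """Instances federate only if neither has blocked the other."""
    return b not in blocks[a] and a not in blocks[b]

# The big instance and the banned instance cannot reach each other...
assert not can_talk("bigsocial.example", "banned.example")
# ...but an account on the bridge instance can still reach both.
assert can_talk("bridge.example", "bigsocial.example")
assert can_talk("bridge.example", "banned.example")
```

Moving an account to the bridge instance costs far less than abandoning the network entirely, which is what would make the "you don't have to use that website" rebuttal actually hold.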


The market provides, though. There are tons of sites for alternative and niche content, and new sites come up all the time.

Complaining that you don’t have a ready-made audience is pretty weak. You need to hustle for this stuff. People used to xerox and give away zines to build an audience. As the Pirate Bay founder said recently [0], it’s embarrassing how easily people are giving up.

It’s almost as if they don’t actually want to put the stuff out there, but rather to be held up as martyrs. Pirate Bay and SciHub are still up, and have multiple billion-dollar corporations actively gunning for them socially, legally, and extralegally. It can be done.

[0] https://www.vice.com/en/article/3an7pn/pirate-bay-founder-th...


This. I believe the problem has more to do with most users relying on platforms that are fundamentally advertising companies: content that doesn’t sit well next to an ad gets removed because it becomes a direct threat to revenue. Video sites exist which show very graphic violence, among other things not allowed on YouTube; they’ve been around for years. The issue, I guess, is that they tend to run ads for sketchy businesses, as those are the only ones willing to put their name next to that content. The problem IMO isn’t so much that free speech is in jeopardy as that the financial incentives of the incumbents are not aligned to support some of the things people want to say.

It can be done, people are just being lazy instead of creative.


Well, that’s the thing with being ostracized. Reputable people don’t want anything to do with you, and so you’re left with only the disreputable ones.

It’s like there’s a whole generation of engineers that don’t realize that you can actually buy a computer instead of just rent someone else’s.


> I wouldn't have a problem with YouTube doing anything if YouTube wasn't the video website

Then you're in luck, YouTube is not the only video website. It's not even the biggest. Some video formats it can't even compete in because competitors are so far and away superior.

I think you really mean that YouTube is a popular site. But it is certainly not the biggest site going by size; that would be FB. Not the most engaging either, with Snapchat and TikTok out there. It's certainly better than the other "also-rans" like Dailymotion, though.

(Actually, now I think about it, I'm not even sure I would say it is necessarily "better" than DM. Maybe a better phrase would be "seems to be more popular"?)


Decentralization of hosting responsibilities would certainly help bring down costs, which are usually what makes it hard to build a competing hosting site. Then again, any popular website _is_ going to need to go beyond standard protocol practices and try to censor certain illegal content.

We could just change our culture to be satisfied with 720p content and use traditional websites that don't use P2P tech. Like, we could all just start using bilibili.com or some other Chinese competitor instead of YouTube.


Does bilibili.com have less censorship than YouTube?


> I dont want to wade through spam, not to mention violent and abusive content.

I mean, ok? Can't you simply not watch that content? What if there were a warning on the content, so that you could choose to not watch it?

Or maybe you could specify on your account, that you don't want to see that content, so it never shows up for you. Problem solved, and everyone wins.


There would be no issue if this were plausible in practice, but it's not.


>What tools are these actually - the delete button?

Reach


I don't have a problem with moderation but I do have a problem when infrastructure providers like AWS decide how their clients should moderate their content.

You can't have competition when some of the competitors can't use the infrastructure. It is not without reason that utility providers can't discriminate and I think it is time to decide what is considered a utility in the internet age.


AWS is by no means the only infrastructure provider.


You can make the same argument in the opposite direction: for those who support unbridled electronic speech amplification, are you prepared for it to fall in the hands of people who are bent on changing minds en masse to fill society with falsehoods, uncertainty, and violence?

On a more detailed note, I draw a sharp line between normal speech and selectively amplified speech. Twitter and Facebook are in a bind because their business is to curate speech amplification in order to maximize exposure to ads. So if you ask “why did person X see post Y?” it’s because Twitter/FB decided to show Y to X. So they have automatically become at least partially responsible for the effect of that post.

So to me this Twitter/FB censorship fiasco is not about free speech censorship whatsoever. It’s about these companies being in the business of deciding who sees what, being externally pressured to do a better job.

It wouldn’t make any sense whatsoever to ask Twitter/FB to “stop censoring” while admitting that they should be allowed to curate feeds to maximize ad revenue.


> for those who support unbridled electronic speech amplification, are you prepared for it to fall in the hands of people who are bent on changing minds en masse to fill society with falsehoods, uncertainty, and violence?

You mean a society where the most charismatic/popular members get the attention of the masses, mostly by telling them what they want to hear? Yes, I think I'm prepared to live in such a society.

I think, in your attempt at making free speech look undesirable with your mention of "amplification", you forgot that this amplification is for everyone, including you and those who agree with you.


We should clarify that "free speech" is unrelated to what Twitter and Facebook are doing. Free speech is a constitutional concept, protecting people from draconian government laws that suppress their speech or discriminate against them based on their speech.

What Facebook and Twitter are doing is providing a product that amplifies and places speech in front of particular eyeballs and ears. There are many, many issues with what Facebook and Twitter are doing, but they're all unrelated to the legally-protected constitutional free speech.

Some people have argued that "posting on Twitter/FB may as well be considered protected speech, since society depends on it so much" which is an interesting concept for sure, but has no basis in law yet.

Finally, to your last point, the amplification is not for everyone, and that's at the core of why people hate these companies right now. Users are subjectively amplified or suppressed, or kicked off. And this unequal treatment of users is at the very core of their business model: they make money by selectively amplifying content.


Free speech is not a "constitutional concept," whatever you mean by that. Free speech is a philosophical concept that has been embedded in the US Constitution as the 1st Amendment.

The US Founders thought that people must have the ability to speak their minds, to discuss and challenge each other's ideas. They thought this ability was so important that they enshrined it in law.


And what that means, practically speaking, is that unlike in other places in the world, we can speak out against the government and not worry about going to prison for it. That was the intention, and we still have that. Whether content critical of the government converts well enough for a company to allow it to stay on their platform is a different question, and hate speech

  public speech that expresses hate or encourages violence toward a person or group based on something such as race, religion, sex, or sexual orientation

is yet another separate question.


“Free speech” is about limiting government retaliation based on speech. So Uncle Sam can’t legally put you in prison because you spoke against Uncle Sam.

Free speech doesn’t mean you’re protected to speak whatever you want in any setting. Speech is subject to private property and common sense. If you are profane on my property I’ll kick you out and ban you. That is NOT a violation of your free speech.


> You can make the same argument in the opposite direction: for those who support unbridled electronic speech amplification, are you prepared for it to fall in the hands of people who are bent on changing minds en masse to fill society with falsehoods, uncertainty, and violence?

This seems to equate the ability to publish with a society that's filled with falsehoods, uncertainty, and violence. Is it possible that society won't have any more or less falsehoods, more or less uncertainty, or more or less violence than a society in which you don't appoint some "knows better than the reader" censor to decide what people are allowed to see and read?

It seems insanely presumptuous to me that some censor at Facebook or Google won't turn into a violent maniac when they see the things you seem to propose that they censor, but the general, unwashed masses will. Give adults some credit.

We've had the church around spreading falsehoods, uncertainty, and violence for centuries, and the world is still here (and still improving). I don't think letting people publish whatever legal content they like is going to suddenly send our world to hell in a handbasket. The danger from uncensored publishing is overblown.


I'm a big believer in constitutional free speech, but I certainly don't believe that laws or conventions should protect unbridled amplified speech; giving everyone a megaphone connected to everyone else's ear. Plenty of examples in history show how propaganda was intrinsic to installing and sustaining terrible dictatorships and cults. Sure amplified speech is fine in many instances, but naturally people will want to curb it when it goes too far. People and societies should definitely be free to curb it when it goes too far. They should not be forced to endure it.


You're just going to ignore the body count from the spread of falsehoods? The lynchings? The wars? The genocides? None of that matters?

Not to mention the world is getting better only in a very narrow sense. Environmentally, we're entering a very dark period. The falsehoods surrounding climate change will cost us a great deal.


> Environmentally, we're entering a very dark period. The falsehoods surrounding climate change will cost us a great deal

Those falsehoods come mostly from old people watching Fox News. Young people who grew up with mostly unfiltered communication on the internet are typically in favor of doing something about climate change. So this only strengthens the point that the free flow of information is more desirable than filtered communication.

People who want to manipulate the masses hate this, though, so they make every argument they can that the free flow of information is harmful and we need to stop it. Basically, anyone with power has nothing to gain from you and me being able to communicate freely, so they want to rein that in as much as possible. I'm not sure why you want to make the rich and powerful even richer and more powerful, but that is what you are arguing for.


I think the point is not that "all speech should be suppressed", but that, generally speaking, the suppression of speech should not be disallowed. Users and employees should be allowed to pressure companies to suppress hate speech; a bar owner should be allowed to kick into the street a racist and profane customer; a school should be able to fire a teacher who is getting a lot of complaints for teaching racist ideas or gaslighting students.

It would be a horrible outcome if speech maximalism succeeded and prevented people from acting to protect their customers, property, society, etc., against destructive speech.

Of course, you can say "yeah but if you allow suppression of speech then it'll be used for evil" -- as in, Twitter can kick off whoever they want. But going as far as legally protecting any and all speech, in any situation and on any public or private property, is far more destructive in my opinion.


> in any situation and on any public or private property, is far more destructive in my opinion.

I think this is perhaps a straw man. Nobody is asking to protect unrestricted speech in all private venues.

Would it be so bad to make it illegal for large services that have private DMs to snoop/censor those, such as we do presently for large services that have private telephone calls?

The dichotomy isn't public/private. The phone company is private and they aren't allowed to tap your phone (unless they're doing it for the government), and they're not allowed to shut off your service because you called members of the "wrong" political party too often.


IMO that last point is because we’re still trying to govern the Internet based on telecom laws from decades ago. Tech moves way faster than laws, especially tech that is helping politicians get elected. The financial and political incentives aren’t aligned to make significant changes any time soon.


I completely agree with you, and we should work to limit these companies to our liking, with legislation.

Your stance is super reasonable. But look around, many people in this conversation are effectively arguing in favor of totally, globally unrestricted speech, which I find extreme and untenable.


Do you think we are observing a symptom (unbridled speech amplification) of the disease (monetization of algorithmically targeted content), so to speak?

Do you think that if we somehow broke this monetization model, that companies like Twitter, Facebook and Google would cease to be the behemoths they are, and the amplification mechanism might go away?


I definitely think the current outrage around "Big Tech censoring free speech" is a downstream effect of these companies curating feeds to maximize ad revenue.

If these platforms presented the content without interpreting and curating it for profit, they could at least take the stronger stance of being a platform instead of a speech curation/distribution/amplification system.

For example if Signal was externally pressured to take down speech based on its content, they would be like "We don't have that ability." But in the case of Twitter and FB, that ability and practice is at the very core of their business model.


Yes. Imagine if the telecoms could play ads before, after, and during your phone conversations. Why do we accept it from big tech social media?


Strong network effects + capitalism. They found a good way to make lots of money, which allowed them to pay lots of skilled people to spend their time getting more people into the system and looking at ads.

The problem, IMO, is that most users don’t understand how social media works from a business perspective. Advertising companies put a lot of effort into not being known as advertising companies.

Another fun thought experiment: consider who runs political ads, especially local advertisers (highway billboards etc) and what speech they’re willing to accept. Consider Comcast / cable networks and what speech they’re willing to accept. Who are their customers, how are they making money, and how does that affect what speech they are willing to accept on their platforms.


“Advertising companies put a lot of effort into not being known as advertising companies.”

That says a great deal doesn’t it? They are too ashamed and know it is an immoral business.


>You can make the same argument in the opposite direction: for those who support unbridled electronic speech amplification, are you prepared for it to fall in the hands of people who are bent on changing minds en masse to fill society with falsehoods, uncertainty, and violence?

Yes, obviously. "But people may say false things on the internet" is not nearly as strong an argument as "you will be prevented from criticizing your government on the internet".


“You can make the same argument in the opposite direction: for those who support unbridled electronic speech amplification, are you prepared for it to fall in the hands of people who are bent on changing minds en masse to fill society with falsehoods, uncertainty, and violence?“

This is truly ironic given that exact abuse has been perpetrated by these tech companies in order to build their empires in the form of advertising and manipulating users on their platforms.

Also, your dismissal of humanism is appalling.


This is a tough issue, and I don't see how it would be more humanist to support something like "all speech, in any situation, must be allowed and protected". Clearly some of that speech will be destructive, and harm people. I don't think you want to advocate that those people who are being harmed should be unable to protect themselves from speech they see as harmful and destructive.

Neither side of this coin is a straight answer; you can easily poke holes in either by imagining the terrible consequences of the alternative. The answer IMO is people, groups, societies, should decide what's constructive and destructive and converge on a balance that works for them.


No. It’s quite clear. People should have the right to pursue dangerous, stupid, crazy, ugly, offensive, illogical activities as well as safe, brilliant, sane, beautiful, flattering, and logical activities.

Your approach results in a star chamber of elites that make value judgments on what people can SAY, which directly impacts what they can DO. Society is limited to their judgment and belief system.

There are countries today that follow this model, and it is not a pleasant existence to be a regular citizen there, especially for the creative “weird” people who produce most of the world’s major innovations. Read history and see for yourself how this approach leads to poor outcomes for the majority. In fact, it is at the core of the founding of the United States itself.

One approach leads to a narrow, controlled, and directed society that is controlled by the elite; the other brings breadth and freedom to pursue crazy thoughts that can lead to major innovations.


If you’re loud and profane in a bar and disturbing the other patrons, the bouncer will muzzle you. You seem to be arguing that the bouncer should not be allowed to do that.

You seem to be mixing up “free speech” in the constitution and the normal to and fro negotiations and tensions between people. In private life people need to be allowed to act against speech that they find harmful.


The US has strong speech protections; somehow, when people are wrong in their beliefs about what that means, they move the goalposts to this mythical “ethical” free speech maxim.

I don't believe in what those people imagine free speech means. I believe in the “first amendment”, which includes freedom of association, including for the non-government people that create venues and platforms. “We have the right to refuse service to anyone”, as seen in any shop anywhere.

If you would like Congress to create a new regulation on interstate commerce preventing internet platforms from doing some particular thing, then articulate that specific position.

I am absolutely not bothered by the current reality; it is the expected reality, and I would like the same privileges on my own venues and platforms.


Except you don't have the right to refuse service to anyone for any reason. It is against the law to refuse service on the basis of protected classes (i.e. race, sex, sexual orientation). These classes are thought to be immutable.

More and more evidence shows that political beliefs are inheritable[1]. Should that be protected? What about height? We know that shorter men earn less than taller men. Should short people be protected from short jokes? Should the government ensure short and tall people have the same salary? What about attractiveness? Again, uglier people are less successful and have less sex. Should they be protected? Should the government put its finger on the scale to ensure ugly people have the same amount of sex?

Either I should be able to discriminate on any basis, or I can't. The problem at the moment is we have it both ways, in ways that asymmetrically attack one particular group of people.

Nobody really thinks through any of these inconsistencies, though, because it doesn't impact most people. As the values in Silicon Valley have significantly diverged from large parts of the US, more and more people are being impacted by these policies.

[1] https://www.pewresearch.org/fact-tank/2013/12/09/study-on-tw...


Insightful. I was curious if you had a position? Such as: would you like additional protected classes, or any particular outcome? It's not necessary; I was just making sure I didn't miss something you were advocating for.


I'm afraid I don't.

Ideally, I think the concept of protected classes shouldn't exist. I'm not sure there is a difference between teasing/insulting someone about their height and about their race. However, I understand the historical reason why these protected classes were created and why they were needed at the time.

I'm not sure how we move forward on this.


People are upset because at the core of this debate lies a moral conflict between deontologists and utilitarians, where currently utilitarians are in control and wielding power with complete disregard for the fact that the majority are of different ethical persuasions.

So it’s more a question of how to maintain fair platforms, than it is of free speech. The free speech debate will likely be the dominant topic of the coming decade yet no one actually wants unmoderated public spaces. We just can’t agree to the terms. If this debate really was just about the rights of a few fringe groups of extremists, rather than perceived bias in moderation, it wouldn’t be popular.

You can solve this conflict simply by setting clear rules that everyone must abide by. Rules you’d assume any rational person would agree to. Let’s say: on public platforms you’re not allowed to target any individual or group with attacks. But then you’d also have to ban people who say that conservatives are evil. Which we both know is never going to happen, as just the suggestion would lead to political deadlock. The utilitarians believe in catharsis through the expression of justified hate and anger; no, we need rules that censor not any division, but unjustified division. We need rules that protect not personal dignity, but the dignity of some over others. In Reddit’s terms of service they ban abuse of minority groups, specifically because they don’t want to ban abuse of majority groups. The ministries of truth of our time are staffed by blue-haired ideological activists with a social engineering agenda who love science, but only as long as it’s produced by a faculty that’s struggling with a falsifiability crisis.

If you’re running the communications platforms at the heart of our modern societies, you must either govern them from the center, or accept that your subtle guidance, however well intentioned, is driving us towards ever increasing conflict. You have to govern from the center even though your personal ideology tells you that everything is relative, that truths are only meaningful in context, and that ethics is a function of cause and effect. Because to people who would rather view ethics as a categorical imperative, _you_ are the one who is driving division and pushing us towards disaster. And regardless of who’s right or wrong, it’s arrogant and dangerous to allow yourself to become dogmatic.


I have to agree with this sentiment strongly. As much as I may lament today's social media landscape, it is only a drop in the bucket when it comes to my ability to participate in our cultural zeitgeist. And as long as we are going to pursue capitalism with gusto, I am in favor of capitalists controlling their own business - modulo protected class issues.


> I am in favor of capitalists controlling their own business - modulo protected class issues

What does this mean?

Can you give an example of a protected class and how the business would be impacted?


I think "modulo" is a bit misused here, but race, religion, sex, age, and more are legally protected classes in the US, and a business's right to refuse service cannot be used to discriminate against these protected classes. The parent wants to make it clear that they do support a capitalist's freedom to control their business insofar as it does not conflict with class protections.


That is why I prefer these tools be used by private organizations to enforce their own private rules rather than being managed by government mandate. That ensures that enough social rules exist to maintain society but not rules which are so ingrained in law as to prevent progress or allow trivial total abuse.


When the private companies start getting billion dollar contracts with the government, or are lobbying politicians for laws massively benefitting the companies in the long run, it's safe to say the decisions behind the "private rules" will be impacted accordingly.


No system is perfect; I support the one I feel has the best chance of ensuring a stable but evolving society. Not restricting things has resulted in the governing body of the US getting attacked by an angry armed mob inside a federal government building, so that's not a viable option for a stable society. The government is both subject to changing political whims and has a history of overusing its control in various situations, so that's not a viable option for an evolving society. Thus we get to the shitty but potentially best compromise option. Just like democracy itself.


How is it best for a society to evolve by suppressing diversity? In this situation, I'm talking about diversity of ideas relating to politics.

Have you ever considered why Silicon Valley exists in America? It's because as America grew, it welcomed in people from all over the world with extremely different backgrounds with extremely different ideas about how to do things. Over time, that situation resulted in extreme innovation where the best ideas prevailed.


>How is it best for a society to evolve by suppressing diversity?

Because radical change generally leads to very bad things, you want a slow evolution, which means you suppress ideas so they spread more slowly and society thus has more time to process them properly.

>Over time, that situation resulted in extreme innovation where the best ideas prevailed.

And the existence of giant corporate beasts, massive wealth concentration, massive income inequality in the bay area, spread of socially destabilizing ideas, foreign influence over elections, privacy failures, etc., etc. And many would argue that many of the ideas, especially recently, aren't revolutionary but rather simply better ways to cheaply exploit people and hide externalities (that society, but not the company, pays for).


> you want a slow evolution which means you suppress ideas so they spread more slowly

So is this your viewpoint on all forms of diversity, such as racial and socioeconomic inclusion? Or are you selectively changing your definition as you go?

> And the existence of giant corporate beasts, massive wealth concentration...

This has nothing to do with what we're discussing here.

> foreign influence over elections

So you recognize foreign influence over elections, but think it's ok to silence people whose underlying animosity is rooted around accusations of foreign influence in the most recent election?

> and hide externalities

Interesting choice of words considering you're in support of literally hiding externalities of the situation as a whole


>So is this your viewpoint on all forms of diversity, such as racial and socioeconomic inclusion?

Yes. Although in the US we've been working on socioeconomic and racial inclusion for the last few centuries so I feel that's moving slowly enough for society to adjust. Communism shows what happens when you move on the socioeconomic side too quickly and it's not pretty.

I do find it funny how some people think racial and socioeconomic inclusion is a new thing that is just now happening. Do you people never pay attention to history, ever?

>This has nothing to do with what we're discussing here.

You talked about the glory of change in terms of SV. I mentioned the dark flip side of change in terms of SV. You can try to cherry pick parts of the story but I'll keep pointing out that you're doing so.


Would you support breaking the near-monopolies that certain organizations have?


Yes although that's a tangential topic I feel related more to power concentration in a society. None of the organizations have close to a monopoly on communication or content. They simply all agree right now which is generally how implicit social rules tend to function. Eventually they'll start to disagree again in one way or another.


They all have an American left-wing bias. Personally I'd prefer if they had either a European left-wing bias or a European right-wing bias, but right now we only have one option, and that is the American political bias, which is a really shitty situation. I can't wait for Europe to either ban them in some way or rein in their rights to moderate.


What is to stop any European companies from starting their own social networks? They are not monopolies, and are far less monopolistic than other de facto monopolies that are allowed to exist without being regulated as such, such as ISPs in the US.


Nobody is being hunted, that's pretty hyperbolic. If YouTube were silencing protests I wouldn't like that, but I don't rely on YouTube to understand what's going on in the world and I would encourage anyone concerned about YouTube's content policy to consider visiting other websites on the internet in addition to YouTube.


> If YouTube were silencing protests I wouldn't like that, but I don't rely on YouTube to understand what's going on in the world

A lot of people do. You can't really find out what's going on in the world from the mainstream media in any real resolution or detail, and "other websites on the internet" don't generally have thousands and thousands of people livestreaming actual live video of what's happening.

Perhaps this could be rectified by our society working to make bandwidth an order of magnitude or two cheaper, so that there can be lots more live video hosts. As it stands, you're pretty much stuck with a few giant megacorps like Google or Twitch.


This. Some media outlets have decent live reporting, but going through the available apps on a PlayStation, which is what we use to watch video on a big screen via a projector (we also have a Shield and a Mac mini connected to it, but those get used less), we ended up watching both large media cos and smaller channels live-streaming the events of Jan 6 on YouTube. If one looks at the available apps and platforms on smart TVs, there aren’t actually that many options. This could change, but there are high barriers to entry e.g. having enough of a platform that it makes sense to build an app for Samsung TVs, another for LG TVs, another for Sony TVs, then get those all certified and keep them available.


The overwhelming majority of YouTube content is junk; people shouldn't rely on it to stay informed. At best, they should cherry-pick a few quality sources to supplement others. Yes, there are some people who rely exclusively on YouTube, just like there are some that rely exclusively on Fox News to stay informed; in both cases it is the responsibility of the media consumer to make better choices.


You can, because the mainstream western media, despite its faults, tends to be pretty good.

Meanwhile, there are tons of crazy extremists streaming. A lot of it doesn't provide an accurate view and might only be useful as a potentially biased first-hand account to be dug through by journalists and historians after the fact.


There are also tons of non-crazy, non-extremists streaming.

Regardless of whether you listen to the streamers, it's an amazing tool for seeing what is actually happening in a specific place. You can ignore the commentary or ideology (not that that is really an issue, most streams are just normies) and see what is happening.

Western mainstream media is a lot of things, but comprehensive and nuanced it is not.


>Nobody is being hunted, that's pretty hyperbolic.

That's pretty hyperbolic.


Your television and radio broadcasts have been "censored" and curated since the dawn of time. Newspapers have curated their content and various newspapers are known for having left or right bias.

This isn't just the stations own content. If you paid to put content on television, radio, or in the paper, the publisher has always had the right to publish or not publish content (with some limits, but nothing this article speaks about would be protected by law). You couldn't have a cable community access channel with whatever content you wanted.

YouTube/ Twitter/ Facebook have been and remain far more permissive in publishing outside content than any other media in history.

You are implying that there is some kind of "new" restrictions on speech, but this is literally hundreds of years old.


I think social media is not directly comparable to television and radio broadcasts. The public square analogy makes much more sense here. But even then, in a lot of places there are impartiality requirements on media outlets, as flawed as they may be.

Let me use your analogy, though.

Let's say there are many newspapers in town. Is it fair for me to refuse to publish certain things? Sure.

What if I own the only newspaper in town? It gets more complicated. But hey, perhaps the town has a thriving samizdat culture, so maybe it's fine.

But then what if I start buying out every printing press in town? And on top of that, what if I'm pressured by the mayor to not publish something? This is a big issue now, right?

How can that situation be addressed? People often suggest regulating that sole paper, but to me the answer is taking steps to ensure that we never end up with one newspaper in town. And we definitely don't let that paper control all of the printing presses.


> The public square analogy makes much more sense here. But even then, in a lot of places there are impartiality requirements on media outlets, as flawed as they may be.

The "Public Square" assumes the space is owned by... well the public. Either the town or city. That's not the case here. Even if it were, the reach of a person in the public square is tiny... a few hundred people if they are absolutely lucky.

Besides... your rights in actual public squares haven't changed a bit. Save the fact that COVID has shut down many public spaces.

> But then what if I start buying out every printing press in town?

You mean the way Clear Channel has actually bought up a huge percentage of radio stations?

The way big right wing corporations have bought up the majority of local television stations in the country?

Is it "fair" that the majority of television broadcast stations are owned by a few right-wing media companies?

For what it's worth, all three "competing" local broadcast television stations are owned by the same company now.

I'd love to know why people think it's super important that Facebook and Twitter cease their supposedly "biased" curating while so many broadcast companies which reach 10s of millions of homes get a free pass.


So let's go after TV and radio stations too, then? I'm not opposed to that.

By the way, I don't think forcing broadcasters or social media to not police their content is the way to go - I'd much rather make sure they don't become monopolistic. The situation you describe with TV must not happen with social media too.


I think the point was that we didn't seem to mind that consolidation, nor the censorship/biased reporting in those media sources, yet want to ask Twitter/Facebook to hold themselves to a new standard.

Is it more because it's conservative views being censored now?


I think that consolidation was a lot less visible, plus people's attitudes are changing. Finally, this ship has not sailed yet, while TV is pretty much doomed.

American conservatives, being as... loud as they are, did make the issue more mainstream, but people have been bringing up these problems for as long as the internet has been around.


> I'd much rather make sure they don't become monopolistic. The situation you describe with TV must not happen with social media too.

I'd love to see more variety in social media. Unfortunately, network effects favor a single big player. Also, every time a new entrant threatens, it gets sucked up by an existing player. Sadly, since we've allowed Facebook to swallow everything that isn't bolted down, the only real competition seems to be Chinese owned[0].

(And I sympathize with foreign governments struggling with Facebook & Instagram being US owned).

[0] I have no qualms about Chinese people or even Chinese companies in general. But the way the Chinese government operates, I am skeptical of platforms based on speech.


Funny thing is, there might be a way out if we changed the incentives. Because let's be clear: FB, Twitter, YouTube, ... (and to a slightly lesser degree, Google) - they are all media distributors and opinion amplifiers. Much like traditional publishers. Only instead of creating or sourcing their own content, they syndicate.

We also have to acknowledge that content moderation at scale is impossible.[ß] The lines are hazy and fluid, context matters, and incentives are outright perverse. The volume of incoming material is overwhelming. On top of that, people with money, fame or lulz on the line will game any rules based system.

But what if these online syndicates had to publish corrections for their worst offenses? News outlets have had to do that for decades, after all. We could as a society take this as an opportunity to move the goal posts for the better. Instead of allowing the correction to be buried in a dark corner nobody ever sees, specify that the correction itself has to be displayed as prominently as the original news itself.

You're doing print? Then that correction will eat up prime eyeball real estate from the publication. (And make it clear you messed up.) You're an online syndicate? The correction needs to go through the same amplification mechanism and get as prominently on top of the users' feeds as the original item did. It eats up prime screen real estate and makes it equally clear that you messed up.

This would have three benefits:

1) Content publishers become accountable for their actions. The US media in particular is obsessed with just throwing whatever they get their hands on out there and have the readers sort it out for themselves. To hell with accuracy.

2) The requirement to surface corrections as widely as the original item(s) provides a second-order incentive. Your publication forfeits revenue and opportunity from whatever other news are currently floating around, costing both money and reputation. Mess up more often, get punished more often.

3) The readers would benefit from increased media literacy education. Get bombarded with garbage and outright lies? Receive more fact-checks after the fact, published and syndicated through the same funnels that fed you the original trash.

Freedom of speech is not freedom from accountability.

ß: Mike Masnick has been saying this for years. For a longer take, see https://www.techdirt.com/articles/20191111/23032743367/masni...
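To make the proposal above a bit more concrete, here's a toy sketch of the "corrections inherit amplification" rule. All names here (`FeedItem`, `Amplifier`, `publish`) are invented for illustration; real feed ranking is obviously far more complex than this.

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class FeedItem:
    item_id: str
    text: str
    corrects: Optional[str] = None  # id of the item this corrects, if any
    reach: int = 0                  # how many feeds it ends up pushed into


class Amplifier:
    """Hypothetical syndication layer that enforces the correction rule."""

    def __init__(self) -> None:
        self.items: Dict[str, FeedItem] = {}

    def publish(self, item: FeedItem, organic_reach: int = 0) -> int:
        # A correction is forced to match the accumulated reach of the item
        # it corrects, rather than whatever (usually tiny) organic reach it
        # would earn on its own. Ordinary items just get their organic reach.
        if item.corrects is not None:
            original = self.items[item.corrects]
            item.reach = max(organic_reach, original.reach)
        else:
            item.reach = organic_reach
        self.items[item.item_id] = item
        return item.reach


amp = Amplifier()
amp.publish(FeedItem("story-1", "viral claim"), organic_reach=1_000_000)
reach = amp.publish(
    FeedItem("fix-1", "correction", corrects="story-1"), organic_reach=40
)
# The correction inherits the original's million-feed reach, not its organic 40.
```

The point of the sketch is the `max()` in `publish`: the platform, not the publisher, guarantees that the correction rides the same funnel as the original.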


How would this even work with something like twitter?

Let's say Biden tweets a big fat lie to his millions of followers. That lie in turn gets retweeted by millions of people. Next, Biden posts a retraction. Do all of his followers also have to post a retraction? Does Twitter have to implement some kind of retweet retraction mechanic?

What if the original user is obscure and someone famous quote-tweets a big fat lie instead of inventing their own? Does it make a difference if the quoter was agreeing with the original?

How do you deal with users who refuse to post a retraction or even leave twitter entirely? Is there an auto-retraction? Who decides if and when something is retracted?

> The requirement to surface corrections as widely as the original item

Imagine for a moment you are running an obscure little website, "The 2nd amendment times", and you post a little story on your blog about how Obama is getting appointed to replace Biden in 2024 and he's coming to take everyone's guns. All 17 of your readers read your story and laugh it off except one passes it on to Rush Limbaugh who Tweets it.

Now 17 million people blow up your blog. You are featured on Fox News and dozens of top right-wing pundits repost your site.

Who is "accountable" here? How would this get retracted? If you post a new article on the front page of your blog saying you dreamt the whole thing after smoking too much weed, does that make it all good even though only your original 17 readers are likely to see this correction?


>supporting restrictions on speech and privacy violations

That's a confusing phrase. It took me a while to realize you meant "supporting privacy violations" rather than "supporting restrictions on privacy violations". You seem to be lumping speech restrictions and privacy violations into the same bucket, but I see them as very different, often even opposed to each other. The US elevates free speech higher than privacy. Europe elevates privacy higher than free speech. In many European countries the press isn't allowed to report the names or faces of suspects. In many European countries you can't take pictures of people walking on the street.

Doxing is a clear case where free speech is directly opposed to privacy. Same with right to be forgotten.


> You seem to be lumping speech restrictions and privacy violations into the same bucket, but I see them as very different, often even opposed to each other.

And yet today there was an article in the front page claiming that Signal is a problem because its end to end encryption can be used by anyone, even "bad actors", to communicate. That's an attack on both privacy and free speech, and I wouldn't be shocked if GP had that article (and others like it) in mind.

Mind you, most of the comments here rightfully called that article out for the absurd fear-mongering it was, and even the increasingly common censorship apologists were unusually silent. I was pleasantly surprised, even if it did still get a lot of upvotes and reached the front page.


Fixed the wording to make it a bit more clear, thanks. Connecting doxing as a privacy violation to free speech is a very interesting point that I didn't think about - I was thinking about data collection and tracking when talking about privacy violations.

For what you bring up with doxing, I don't think I have a very satisfying answer. I definitely won't advocate for keeping it on social media websites. But I can't advocate for all-powerful tools that would allow even for actually dangerous content like that to be tracked down and removed from everywhere, because there is no way to ensure that tools like that will not be abused.

Remove it from Twitter, block it on your own federation nodes, try to bring the perpetrators to justice, but, as painful as it is, you shouldn't have a magic button that would be able to completely eradicate that from the internet.

It would also be good to reduce data collection and to teach people to be more conscious of what they share to prevent private information from leaking to begin with.


Aren’t there more ways how and more places where you can speak out today than ever before? Aren’t people able to say more radical or extreme things at scale than ever before?

I’m reminded of folks claiming that their voices are silenced, yet saying so on Twitter, at physical events, on the news, etc.

The irony seems mind boggling to me.


> Aren’t there more ways how and more places where you can speak out today than ever before?

No. There was for a little while, but the number of avenues one has for publishing unpopular speech is at a major low versus 10 or 15 years ago.

Now, the internet has reached the status quo of broadcast media prior to it: a small handful of gigantic powerful organizations have total veto over what can be published.


I'm pretty confident that, no matter what idea you want to express, you can find a forum on the Internet which allows you to express it. Even the most offensive, socially toxic ideas typically have a handful of forums that welcome them. Am I misunderstanding your argument here?


You can't find one that allows you to publish at scale. Screaming alone or in a tiny venue is very different than being able to publish to millions.

Everyone's words and ideas should be able to live or die on their own merit, and not be bottlenecked by shitty hosting and blanket domain bans in posts/DMs.

There are tons of places where you can post stuff, but Facebook will happily censor DMs between consenting adults that contain your domain name when they try to share links to your content with others.

Become too unpopular (or, more accurately, increase in popularity too much with the "wrong" people) and your host will cancel your account. There aren't that many datacenter operators.


I'm not prepared to have them 'fall into the wrong hands'.

But these platforms are owned by these companies.

What rules do we put in place that they have to host some of this content? Blatant misinformation, vague encouragement of violence... hate?

Is the price of free speech that someone has to host it? Who doesn't want to?


> What rules do we put in place that they have to host some of this content? Blatant misinformation, vague encouragement of violence... hate?

Why do they have to be rules? Why can't we have a culture of tolerance? Robots aren't running these companies. They're people, just like you and me. They can choose to be tolerant.

Tolerance is about tolerating the most horrid things. Not because it's nice or those things have some value. They're horrid, they have no value. It's because the alternative is worse. Because censorship is worse. Because when a person is censored, when they can't speak, they resort to violence and through censorship, you encourage violence. And violence can't be censored. It can only be suppressed with further violence.


"culture of tolerance"

What does that mean, exactly? By that do you mean you require someone to host intolerance?

Is it a "culture of tolerance" if someone spews hate at me on a social media platform because of my race, because we're tolerating that?

I'm still trying to figure out what folks think this alternate choice looks like... do we require people to host whatever gets posted?


I agree with OP, opinions shouldn’t be censored, no matter how much we disagree with them.

Anything that’s not harassment, a direct threat or direct call to violence should be allowed.

To your example, if someone attacks you personally then that would count as harassment and be moderated out.


>opinions shouldn’t be censored

Let's say I have a platform that lets people comment. So, outside of direct harassment, I shouldn't be allowed to delete things?


Well it's not really a question of what you should do, it was more a comment on what the ideal solution would be.

Is your platform as large as Twitter, is it one of the main communication platforms in society, does it allow users to wield social and political influence? Is it a proverbial public square?

If so, then in my view a wise and emotionally mature person would elect not to censor opinions, realizing that there are more important things at stake than hurt feelings.

Of course I might be wrong when claiming that censoring opinions and mandating a specific morality will lead to more conflict (see another comment I made in this thread for more on that). Maybe it won't. I guess we'll find out in the next decade or so.


> tolerance is about tolerating the most horrid things.

No it isn’t. No society survives if it is tolerant without limit. You are arguing that anything and everything should be allowed. No laws, no justice.

I suppose it is a valid position to have but I don’t think a practical one over the long term if you want to be alive.


>Because when a person is censored, when they can't speak, they resort to violence

But that doesn't actually happen in the vast majority of cases. Most people try other outlets. If there are no other outlets available, then they tend to give up. Deplatforming has been shown to work.

And it's worth remembering that going violent means putting their life at high risk. Their family at risk. Their possessions and personal freedom at risk. And it means hurting other people. There's fortunately not many people willing to take that step and pay that price.

Just think about what would cause more violence: banning all nazis from Twitter? Or allowing nazis free rein to talk about how horrible minorities are, and what they'd like to do to them?


So we should tolerate violent and hateful speech because if we don't, the violent and hateful will only turn more violent and hateful. Got it. Makes sense.


> Why can't we have a culture of tolerance?

The paradox of tolerance states that if a society is tolerant without limit, its ability to be tolerant is eventually seized or destroyed by the intolerant. Karl Popper described it as the seemingly paradoxical idea that in order to maintain a tolerant society, the society must be intolerant of intolerance.[1]

If you haven't read Popper's "The Open Society and Its Enemies" then I'd suggest you do.

[1] https://en.wikipedia.org/wiki/Paradox_of_tolerance


Have you read Popper? Here's a relevant quote from him about tolerance:

> If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed, and tolerance with them.—In this formulation, I do not imply, for instance, that we should always suppress the utterance of intolerant philosophies; as long as we can counter them by rational argument and keep them in check by public opinion, suppression would certainly be most unwise. But we should claim the right to suppress them if necessary even by force; for it may easily turn out that they are not prepared to meet us on the level of rational argument, but begin by denouncing all argument; they may forbid their followers to listen to rational argument, because it is deceptive, and teach them to answer arguments by the use of their fists or pistols.

I think it's safe to say that the extent of censorship perpetrated by big tech is not limited to the voices that teach others to "answer arguments by the use of their fists or pistols". They of course ban some of that too, but they ban quite a lot more.


> ... answer arguments by the use of their fists or pistols.

The real paradox of the paradox of tolerance is how many people use Popper quotes to rationalize their own intolerance for people with differing ideas, without any awareness that Popper was speaking only of intolerance for people who resort to violence.


Yes. It's as if people want to just use name of famous philosopher to justify their urge to censor people and ideas they don't like.


I think this part: "as long as we can counter them by rational argument and keep them in check by public opinion" is clearly relevant when the very facts public opinion is based on are attacked.


I’m guessing you haven’t read it? Because your summary is not accurate.


That summary is a direct copy and paste from the Wikipedia link I posted and referenced in that quote.

And yeah, I have read Popper as well as Wikipedia. Saying "it's not accurate" isn't accurate.


Ok, but you selectively quoted a part that makes it sound like Popper is fine not tolerating any intolerance. He wasn’t. He basically said you tolerate intolerance as much as you can without risking the collapse of social order.

Big difference.


> you selectively quoted a part that makes it sound like Popper is fine not tolerating any intolerance

I copied and pasted the entirety of the top section of Wikipedia with no editing. That section says "if a society is tolerant without limit" (my emphasis) which goes to your point that there is a threshold here.


I would agree with you in the context of a government or public body. But in the context of private companies, these "tools" were always available. The wrong hands were always able to use them. And did, many times, in fact.

Like, the problems you're describing are real but they aren't new, and aren't particularly a bigger threat now than they were before.


Market concentration might make them a bigger problem today than before, though.


They're already in the wrong hands. The only right hands are our own.


I think there's no such thing as free speech, and there never has been. Any forum worth reading has to be heavily moderated in some way. Things like scamming people and counterfeiting money are rightfully illegal. Any legitimate state is going to need to quell protests and hunt down political opposition, because at any reasonable population size you are guaranteed to have people who disagree enough with those in power to get violent and criminal about it. We saw this play out recently enough earlier this month on Inauguration Day.

Politics is hard, and absolutist ideas like completely unrestricted speech, a completely unrestricted economy, or complete privacy are at best lunacy, at worst downright impossible. I don't think I would want to live in a society where anyone could say anything whenever and wherever, where corporations were completely unregulated, or where law enforcement could never hope to gather evidence because of privacy restrictions. So I think tools and methods for restricting speech and privacy must exist, and they will fall into the wrong hands, but it's ultimately up to the public to collectively decide and enforce what is acceptable and what's not. It's an endless struggle that's been happening not since the birth of the Internet, but since people started living in groups.


>are you prepared to have these tools fall into the wrong hands?

Yes, because society has a plurality of institutions that constrain power. Restrictions (even speech restrictions themselves) are subject to the court of public opinion (we're talking about them right now), there's the judiciary, political opposition and so on.

This argument never had any pull for me for the same reason I'm against abolishing the police or abolishing surveillance or anything else. Power, even in the hands of people I don't like, can't be exercised arbitrarily.


I find this argument confusing. There is plenty of historical precedent for tyranny overcoming public opinion, the judiciary, and politics. I don't think the current topic is an example of this, but as a general statement on power I find this contrary to my experience.

You're basically saying, I don't worry about abuse of power because there will always be some check on it, no matter how ineffectual. That just flies in the face of every time someone managed to use their power to overcome whatever checks were in place.

And this is like a constant theme in history. All the checks on Caesar's power didn't prevent the Roman Republic from falling.


Aren’t those institutions sort of circumventable by lying? Can’t I just say “Barrin92 is a bastard, so I’ve banned them from speaking. Look no closer lest I ban you, too.” Feels like we might be better off in the long run by agreeing on ground rules ahead of time and then sticking to them, no?


>Aren’t those institutions sort of circumventable by lying

No


What country do you live in?


Born and lived most of my life in Germany, now UK. I'd be willing to make that claim for much worse governed countries.

My birth place influences my position a lot. After the end of the war we enshrined the principle of 'militant democracy' in our basic law. Giving the state and its institutions the explicit power to deal with threats to the constitutional order, not repeating the mistakes of the weakened government of the Weimar period that gave rise to unchecked extremist forces.


Isn't that a counter argument to your point though? Like, you're saying here that the weak Weimar Republic wasn't strong enough to check the extremist forces, and hence that sufficient checks don't always exist.


My point is basically that I'm much more afraid of insufficient checks on radicalism and inability of society to deal with extremism than I am just with this abstract idea of tyranny.

To bring it back to the topic of the US and Facebook. To me the censorship is not a sign of authoritarianism or anything, but lack of strong institutions. In the face of rising extremism and disorder Facebook effectively has to step in and do something to their best ability because in general the US has very few tools to deal with this at a democratic level.

When I see pictures of someone dressed as a shaman on the dais of the senate and lawmakers cowering behind benches it's very weird to me to draw the conclusion that the pressing issue is too much authority and not enough freedom of expression.


“Face of rising extremism” is just your recency bias showing and demonstrates the power of the media to shape our perceptions. There are many periods in the past where the US was experiencing far more extremism and violence than today.


The US had a literal civil war at one point; the bar is low. Things don't have to be the worst ever to still be concerning.


I see your point, but you can just go back to the 20th century. Anarchists bombing Wall Street in the 20's, racial and antiwar riots (and bombings) during the 60's and 70's. Right wing militias blowing up federal buildings in the 90's.

Obviously you don't want to dismiss a growing problem, but things seem much more manageable today than during these past incidents.


It is important to acknowledge that this is a conflict between two ethical ideologies: Deontology/Kantian ethics vs Utilitarian ethics. The Deontological argument is that free speech is good, and any effort to curtail it would lead to a domino effect which will be disastrous. The Utilitarian argument is that this curtailing of free speech is acceptable as the alternative is causing more harm than good. Both ideologies have their limitations and criticisms, and there doesn't seem to be any clear winner.


The philosophy held by those in power will be the one that wins, since they have the actual means to impose it, regardless of arguments.

In this case it is Utilitarian ethics.


What's your opinion of Project Censored?

https://www.projectcensored.org/top-25-censored-stories-of-a...

Just to pick any example:

What's the culture war label for the fate of journalist Abby Martin?

https://www.projectcensored.org/24-silenced-in-savannah-jour...


> can be used to silence protests and to hunt down political opposition.

Isn't this what is basically happening right now?

I don't have a horse in the game, but I do remember that quote:

"I do not agree with what you say, but I will defend to the death your right to say it"

When I see lots of "this group denied" and "this online account disabled" stuff, I think they should weather the storm.

I don't know, is this just a control-grab-during-a-crisis move?


Yes. This isn't the government hunting down political opposition, this is a private company saying "hey, neonazis go somewhere else".

If a different private company wants to host the neonazis, they can go for it. It turns out it's /difficult/ for them because most people find them repulsive.

I don't hear anyone making this argument for child abuse images for what it's worth.


That’s because child abuse images are objective while “neonazis” are not.

Regarding the “different private company”. Some of your people are already trying to deplatform Telegram now, because those “neonazis” went there after they were banned everywhere else.

I have been using this platform to connect with my family for years, and now all of a sudden my family and I are supporters of the “neonazi” movement somehow, because of how broad the definition of “neonazi” is for American left-wing activists.

Maybe you are okay with the collateral damage “for the greater good” but I am not willing to make this sacrifice to satisfy the American left-wingers and their witch hunt. I sympathize with the idea, but no, thank you. Please, think of another solution for your problems.

Maybe, just maybe, you need to somehow use the legal system of your country to stop these criminals? Or, in case they don’t do anything illegal (I’m not implying that they don’t, I’ve no idea) - maybe you just leave them be until they do something illegal? And then, like, catch them and stuff? And leave me my Telegram to call my mum every evening?

Or - wow, even crazier idea - you do not ban them at all, but instead you follow their threads and actually prevent them from doing anything illegal when they start discussing it? My guess is it would be much easier to do when you have this kind of transparency.

I understand it’s not as easy as mass bans of everything that moves in a slightly different direction than yours, but this would probably be the right way. Thanks in advance.


Yes. The solution to trolling has always been moderation.

If the moderation is too strict then people leave and set up elsewhere. If it is just the trolls leaving then no one cares and it is a win/win.


Flatly, yes, that sounds like a good tradeoff.

It seems unlikely to go wrong the other way, since people are mostly sane about this stuff and agree on what should be shut out (calls for violence, conspiratorial nutjob stuff, widespread misinformation). Obviously the calculation would change if Putin-style misinformation were being defended by these people, but it's not.


I disagree about people being mostly sane about this stuff, for what it's worth. We may have very different filter bubbles....


It will be easier to see who were the useful idiots when the usefulness runs out.


Yes, and I fundamentally reject the premise that inaccurate expression should be censored or moderated without a proximate cause to a specific harm or credible imminent apprehension of harm.

I do not think censorship is justifiable for potential or non-quantifiable, collective psychic harm.


>I do not think censorship is justifiable for potential or non-quantifiable, collective psychic harm.

Can you provide an example?


If someone posts an inaccurate or untruthful comment that a reasonable person would not rely upon for their safety, I think the correct response is to mute, unfollow or retort (and if it's self-evident, hopefully it's easy to explain why it's inaccurate or link to counter-evidence).

However, the imminent apprehension of harm (from a direct, credible or specific threat) does comport with the concept of assault and is grounds for moderation. Likewise, demonstrating proximate cause from someone's actions to a specific harm/damages would also be grounds for censorship.


Moderation has already been occurring for a while now.

This includes moderation of left wing users and things that probably should not be banned (especially minority voices). It's just that the current crop of far right and conspiracy folks complain a lot more/louder.

That said, I do worry that it will be an extra excuse for authoritarian countries, but it probably won't change their behavior.


Don't conflate omission with prohibition.

Twitter can omit realDonaldTrump, no concerns whatsoever.

But any one prohibiting realDonaldTrump from trolling, by whatever medium, would be alarming.


[flagged]


Please don't post in the flamewar style to HN. Regardless of how right you are or feel, it only takes discussions into hell, and we're trying to avoid that here.

It's possible to make your substantive points while staying within the site guidelines, so please do that instead.

https://news.ycombinator.com/newsguidelines.html


Since I primarily see the canned response of "they can build their own youtube if they don't like it", I'll rephrase to help get your actual point across:

Imagine if Hitler were currently alive and rising to power in Germany, and imagine if Silicon Valley were located in Germany. Would you be ok with Silicon Valley de-platforming and doing everything in their power to silence the Nazi-opposition political party? And knowing what's about to come, do you think the stance of "they can build their own communication systems" would be beneficial to humanity as a whole?

I'm aware this is kind of extreme but there are multiple responses that missed the point.


Would I be ok with suppressing speech if I were an omniscient being who knew the future?

Probably yes.


You do know the Weimar Republic had hate speech laws and enforced them against the Nazis:

Contrary to what most people think, Weimar Germany did have hate-speech laws, and they were applied quite frequently. The assertion that Nazi propaganda played a significant role in mobilising anti-Jewish sentiment is, of course, irrefutable. But to claim that the Holocaust could have been prevented if only anti-Semitic speech and Nazi propaganda had been banned has little basis in reality. Leading Nazis such as Joseph Goebbels, Theodor Fritsch and Julius Streicher were all prosecuted for anti-Semitic speech.[1]

[1] https://www.newyorker.com/news/news-desk/copenhagen-speech-v...


I...beg your pardon?

What everyone's been talking about the past few weeks has absolutely nothing to do with shutting down dangerous websites, and any terrorist who leaves a trail of online posts that can be used to track them doesn't need anything special to be done to make that possible.

People have been talking about the major social media platforms—who are, arguably, already pretty high up there on the list of "the wrong hands"—deplatforming users actively supporting attacks—physical attacks, in some cases—on our government, our democracy, and our Constitution.

I mean, there is stuff to be genuinely concerned about in all of this as regards the future, and there's a real discussion to be had here, but frankly, I don't think this is it.


Yes



