The dog and pony show from Google and Cloudflare of only stepping in after multiple mass shootings and tons of press coverage tells you all you really need to know about these companies' ethics. They react to sufficiently bad PR, not out of any set of principles, be it freedom of speech (keeping these sites online) or reducing harm (deplatforming).
I'm gonna need some citations on people being "forbidden" or "censored" when posting on these sites. Plenty of people get rebutted and change opinions in threads all the time.
The_Donald, Reddit's Trump-supporter forum, bans any speech that is critical of Trump.
As a conservative, if I go to /r/politics I'll get my head chewed off with talking points and downvoted to oblivion if I try to debate anything; no one wants to listen. /r/askThe_Donald is a great place to have debates though. I wonder if there's an equivalent sub to ask liberals questions.
A fan subreddit might be ok; a subreddit that actively promotes violence and spreads large amounts of lies/false information is not. It would be ok if The_Donald were just a forum where people hyped up Donald Trump, but a lot of people post large amounts of false information that all the viewers soak up. I think that's really toxic.
"head chewed off with talking points" doesn't sound particularly bad (it even sounds like a form of debate!), certainly better than being banned for life in The_Donald.
Yelling != debating. You can't have a serious debate if the other side isn't listening. Some subs ban conservatives, others just let their subscribers attack them.
Go pretend to be a conservative on Reddit and see how you're treated; it might give you perspective.
Also, conservatives != donald. Real conservatives do not support donald trump, as he is not a conservative. Perhaps you are mistaken.
I'd challenge whether that statement is truthful, but we are talking about 8chan and 4chan here, and they don't have the same level of reporting that would get you removed from a thread.
People occasionally ask about this in Ask The Donald, which is basically T_D’s “meta”, and the response from T_D regulars and mods is always the same: T_D exists solely for circle jerking.
Just because it doesn't have a 100% success rate doesn't mean it shouldn't be considered; otherwise we could use the same logic to dismiss basically any rule, law, or regulation ever made.
Echo chambers are never a good thing, and echo chambers built around hateful extremist ideologies are a lot worse. By forcing these people out of them, some good can be achieved. Of course some of them will manage to regroup elsewhere, but statistically a significant proportion will return to a healthier lifestyle.
But Reddit is not the entire internet, and that ban may have directly fed into 8chan's rise. People do not disappear because a forum went offline, and tools to communicate are only getting better, faster, and more resilient.
It seems highly dubious that all that attention completely disappeared instead of following other channels, which may be even more extreme. I can't find any study of this.
These individuals don't disappear, and it really doesn't take long to route to a new site. Fixing the underlying problem is a better idea than covering it with a band-aid.
So to rephrase the original statement, deplatforming rarely solves anything.
Surely you'd have an issue if the government showed up at a business's office with guns and forced the owner (at gunpoint!) to allow a Nazi hate site to continue to exist on their platform, eh? Because that is what you are arguing for.
Not the point, but even still what people are arguing against is mobs of internet users pressuring companies into political decisions.
Literally (as in figuratively), the people asking for the site to be banned are the ones holding the owner at gunpoint, demanding the termination of a business relationship.
The PR benefits you're claiming only exist because it placates the very same people that make such an outrage and call for corporate action in the first place.
If we were to go the other direction, we can see how having a greater platform would be worse. E.g. if there were a blatantly neo-Nazi cable TV channel in everyone's home, we would expect many more people to end up watching neo-Nazi content and some of them to become radicalized. Propaganda requires a platform to be effective. It is thus not exactly surprising that you can make propaganda less effective by eliminating the reach of the platform.
If racists like those on 8chan can be pushed into the deepest corners of the dark web, where you have to use Tor to get to them or whatever, that's a win. Lots of people aren't going to bother, and thus will never run across them, and never have the opportunity to be radicalized by their propaganda/content.
Technology is not standing still. Tor isn't necessary. We're seeing the rise of distributed, federated, encrypted, and anonymous networks that take little more than an app install or website link. They are only getting more hardened against these mitigation techniques and the approach of "just shut it down" will soon become an infeasible solution.
My concern with big companies deplatforming political extremists that weren't in the public eye is that it almost validates their "they don't want us saying XYZ because it's true" points. Not that their political shit has any basis in reality, but when impressionable people see that they actually are being squeezed out of the internet, it leads many of them to conclude that their other points are valid.
Right now, loads of extremists are taking to Discord and other private chats to discuss their points and recruit people. Inside those tight-knit private groups, there's no possibility of a random passerby stopping in and offering a dissenting view. They see their discussions as the absolute reality of the world. Pushing them deeper into those groups feels far more dangerous to me.
Censorship is the suppression of speech, public communication, or other information, on the basis that such material is considered objectionable, harmful, sensitive, or "inconvenient". Censorship can be conducted by a government, private institutions, and corporations.
In this case, as far as I can tell, the server provider (not CF) was repulsed by the site's content. I support that choice and the company's ability to make it. Still, it is censorship.
There is bad censorship and good censorship.
All of that said, CF can choose not to do business with anyone, especially if that entity is causing legal grief for them and/or abusing the AUP (see section 2.7). Rather than censorship, I would call it PITA avoidance. I would not want to be the CDN for any of the chan sites.
 - https://www.cloudflare.com/terms/
More specifically, what is a crime prevention service?