They threw out their integrity as a publishing platform when they updated their ToS a month ago: https://news.ycombinator.com/item?id=16431403
Indeed, there is not much point to their existence once they throw this away, other than being a virtue-signalling magazine with unpaid authors that doesn't even follow its own vague ToS.
This week has been horrible for speech and freedom on the web. But it is surely a much-needed reminder that suppressing speech does not work. The remedy is more speech, not less: http://prospect.org/article/remedy-more-speech
I'm of the opinion that one cannot call oneself a liberal in a liberal democracy without holding this uncompromising stance. It is the cornerstone of a functioning, modern liberal democracy, and the recent trend of self-censorship by big technology companies is worrying.
Censored environments like Reddit don't have less hate because they discourage it, but because they are incapable of hosting communication between parts of humanity that exist, and will continue to exist for the time being. They are to a real social environment what a sandbox or playpen is: somewhere for children to learn and play, but not much more. The bigger questions, disagreements, and problems we face as a world are going to involve hatred and conflict before they are resolved.
Also, the answer isn't to suppress speech so that it hides under the surface; it doesn't go away, and the people exposed to it never hear the counterpoints. In the communities you mention, like Reddit, hate speech gets downvoted and counter-argued. Bring dumb, hateful, ignorant views into the light so they can be addressed: if not to convince the OP (which is often futile), then so that anyone on the fence, or at least partially sympathetic to some of their points, can understand why they're wrong. The problem is when boards or subreddits become insular and never face opposing views, on either side. Retreating into bubbles or groupthink does not lead to moderate opinions and compromises, just as proving someone wrong can sometimes cause them to double down on their opinion.
All of these sites are extremely heavily moderated, which is understandable, but saying they are "anything goes" is not true by any stretch of the imagination. Honestly, before last year, Discord was the place to be, no matter what your topic was. One of my favorite places to hang out was a server called "Meth and Funamines," which doesn't sound pleasant on the surface, as it's a bunch of druggies. The main rule was "no sourcing," but nobody went there for sourcing, and rarely even to discuss drugs. When people did discuss drugs, it was similar to /r/drugnerds. It was just a community of mostly chill people, posting art and poetry and supporting one another, whether through mental illness or otherwise. No matter who you were, you could come to that server and find a friend in a time of need.
It was swept up in the ban of alt-right and other "hate" Discords early this year. Other than the drug discussion, I never saw a single bad thing come from that server.
These days, I don't know where anyone can go to be free. Self-hosted options, obviously, but that is not the answer. Self-hosted services don't have the accessibility or openness that sites like Reddit and services like Discord provide.
I wish Discord would provide self-hosted servers and just handle keeping them on a server list. It just sucks.
Which doesn't make the concept any more valid than if the opposite were true.
When speech can be automated, this turns into a question of who has the largest promotional bot army. True speech can be drowned out with an infinite array of conflicting lies.
This whole idea that false information needs to be suppressed only became popular after Trump's election win. People needed an excuse to explain it, because they couldn't imagine that so many normal, reasonable people could possibly vote for him. The only explanation must be that they were not very intelligent and got suckered by external influence. That's a pretty arrogant viewpoint, and its conclusion - censorship - is a pretty naive form of direct action that ignores their legitimate concerns as well as the obvious, horrible side effects that come with censorship. Since when has "these people aren't smart enough to take care of themselves, let's tell them what to do because we know what's best" ever worked on a large population that is different from the "bosses"?
Looking at reddit as an example, I don't like where this is headed. First they came for pedos and creepers, and I said nothing because ewww. Then they came for assholes and trolls, and I said nothing because good riddance. Now they've come for gun coupons, and I don't want to flee to Voat because it's dominated by assholes and trolls.
It’s exactly this sentiment that prompts services like Reddit to censor. They need users; it’s no use being a bastion of free speech if you are hemorrhaging ad dollars because assholes and trolls ruin your platform.
It might be death by paper cuts, or the redesign, or the rumored shift to being social media, but I honestly feel like Reddit is past its Eternal September moment and on the cusp of going the way of MySpace and Digg within the next few years.
Your reference to the last presidential election is nonsensical if you're attempting to claim that the Democratic candidate lost because of a few hundred thousand dollars in ad buys and a few thousand Twitter bots. Clinton lost because her campaign was clumsy at image management (which you really need given the skeletons surrounding both her and her husband), ignored the flyover states and the electoral college, and even outright insulted voters. You can't rely on California alone to get elected, sorry.
The real victim, when bots run rampant, is platforms. Not democracies.
Ads/stories were put in front of micro-targeted social media recipients with one of two goals:
1. Increase fear/anger on the right so those people would turn out to vote.
2. Increase apathy/disgust on the left and among independents to discourage them from voting, or to push them toward a third party.
If you micro-target even the smallest group of people thousands of times, the emotional impact will lead to some percentage of success.
That small percentage may absolutely have turned the election.
I see people make that argument all the time... it's a good-sounding hypothesis... but does it actually work? Is there data showing that this was done and that it was impactful?
That said, given the number of voters who stayed home in 2016 (over 10% in the key states of WI, MI, OH, and PA) and the number of people who voted "against Clinton," it seems it was highly effective.
Cambridge Analytica and similar firms have a data set of how emotions can impact certain kinds of people. They have openly admitted they do this for conservative causes. Their parent company is financially seeded by some of the wealthiest conservatives, and now we know they are also supported by conservative British politicians.
This is a massive conspiracy to undermine governments that are supposed to be for and by the people. This is what happens when dark money is allowed unfettered access to elections.
Democracy. Real democracy... dies.
I disagree -- I think what we saw in 2016 was democracy. We saw the effect of giving stupid people the same amount of political power as everyone else.
It's indisputable that some people are more easily gathered and led than others. We can borrow the term "network effect" to describe how these people come to wield excessive power in a democracy. But these voters don't serve their own interests. They are the players in a competition among billionaires, religious leaders, and state-level actors to see who can raise the biggest army of intellectual zombies and herd them to the polls.
At some point there will have to be a conversation about how sustainable this practice is.
Hillary spent quite a lot more than Trump and had her own army on social media called Correct the Record. I believe it's now Share Blue?
The sad thing is that modern politics is based on who can make the other guy the most hated, as this seems to drive the most votes.
Speaking of "true speech", I didn't say any of those things. I happened to be thinking of Michael Gove's "had enough of experts" https://www.ft.com/content/3be49734-29cb-11e6-83e4-abc22d5d1... and the much earlier GWB reference to not being in the "reality based community".
I agree. But few others do. The conventional wisdom seems to be: keep that idea (e.g., racism) in the shadows; as long as we don't see it, it doesn't exist.
Fringe ideas breed in the shadows. They thrive there. They, like vampires, can't live in the light.
Unfortunately, we live in a head-in-the-sand world. And most wouldn't notice a loss of freedom and/or speech anyway.
I know. Sad.
These things are always actually about the alt-right getting kicked off a service. They say the service is now dead and doesn't care about "free speech" (an argument that doesn't hold on a private platform), then they make their own competitor, and it fills up entirely with hate speech (Voat, Gab, Hatreon). The reason only horrible people go to those competing sites is that everyone else knows the positions are nonsense.
Then guess what: everything that isn't x will be on the first platform.
Now it’s moved on to sex workers, harm reduction communities, marketplaces, and more.
If the next step is liability for “false narratives” and “disinformation”, open and free discussion is dead, under the boot of the arbiters of truth.
Next step: Ministry of Truth?!