A policy that was FAR more relaxed than any of their competitors', which was by design and part of their marketing pitch: come here to say those things you're not allowed to say elsewhere.
Parler's position is: unless the language is strictly illegal according to the letter of the law that's designed to limit government censorship of speech, it's allowed.
By that definition, if I call for someone's death but don't have the means, the opportunity, and a specific time in mind, it doesn't count and the post stays up.
Clearly Amazon, Google, and Apple have policies that are more strict than US law. And that makes sense: US law is shaped by the constitution, which is meant to restrict the government's ability to limit speech. And we should absolutely want the rules regarding government censorship to be as narrow as possible.
But private services are free to operate by different rules.
For example, if I walk into a McDonald's and start swearing at all the customers, I'll get kicked out even if I'm not breaking the letter of the law.
So, did they have a moderation policy? Yes, technically. But did that policy allow extremist and violent language to persist on their site at a level above and beyond what's seen on any competing platform outside of, say, 8chan? Absolutely.
> As a side note, I've heard that the people who actually stormed the Capitol Building (not just holding signs outside of it, which is perfectly fine) used Facebook to coordinate and not Parler.
And Facebook would pull that content down if they found it.
Parler won't.
That's what got them pulled from AWS, and the Google and Apple app stores.
"But private services are free to operate by different rules."
But that's just it, isn't it? Parler tried to make a new service that plays by a new set of rules. And they were crushed, because it turns out that you actually can't have your own rules unless you are already at the scale of Apple, AWS, etc.
I didn't say it was illegal or an infringement of Constitutional rights. But it is pretty worrying.
Before this, the power was somewhat theoretical and used in tiny marginal cases. Now, it's proven that they can effectively exercise the power in a major way, and that's news.
> Now, it's proven that they can effectively exercise the power in a major way, and that's news.
Honestly, it's really not. We've seen groups like ISIS kicked off social media, for example, and no one batted an eye. Heck, Milo Yiannopoulos was deplatformed way back in 2016.
The thing that's news is that a significant percentage of a major US political party is now associated with a form of right wing extremism and wrapped up in a major conspiracy theory movement whose adherents are willing to commit violence in an attempt to subvert an election.
> But that's just it, isn't it? Parler tried to make a new service that plays by a new set of rules. And they were crushed, because it turns out that you actually can't have your own rules unless you are already at the scale of Apple, AWS, etc.
That's not at all true. If I recall correctly, the same thing happened to 8chan/8kun. Yet somehow they live on. If Parler has a market, they'll find a way.
That said, it sucks but, well, that's capitalism for ya.
What else would you suggest? Regulating these various companies such that the government gets to decide who can use their services?
Because if so, a) that would require new laws, b) it'd probably fall afoul of the first amendment, and c) it doesn't seem to align well with free market conservative ideology, and so should be opposed by the very users of Parler that are being affected by this.