[flagged] AWS repeatedly warned Parler about posts urging murder (customeriomail.com)
40 points by imraj96 14 days ago | 92 comments



Maybe it's just me, but I was really surprised when it was "revealed" that Parler was using AWS.

Creating a public forum marketed towards "free speech" seems to me to fall into roughly the same category as a torrent tracker or other file sharing service. Not exactly illegal (maybe), but definitely not something that the cloud providers want to be associated with.

In Sweden we have Flashback.org, an old-school phpBB/vBulletin-looking forum with a very strong free speech stance (attracting all kinds of people, including those you can imagine being enticed by those rules). It's of course not hosted on any cloud platform, or using services from any big hip tech company.

Honestly it seems like a strange mistake for them to make: if you're dealing in one of these gray areas, you should probably find some hosting provider / ISP actually interested in hosting you, or just do it yourself.


I had the same thoughts. Just from an operational standpoint, if your selling point is "we'll host the content no one else will," you should probably be self-hosting, or at the very least renting a rack from someone who is contractually obligated not to pull the plug on you.


It's worth pointing out the offensive content cited in the document:

* “Fry’em up. The whole fkn crew. #pelosi #aoc #thesquad #soros #gates #chuckschumer #hrc #obama #adamschiff #blm #antifa we are coming for you and you will know it.”

* “#JackDorsey ... you will die a bloody death alongside Mark Suckerturd [Zuckerberg].... It has been decided and plans are being put in place. Remember the photographs inside your home while you slept? Yes, that close. You will die a sudden death!”

* “We are going to fight in a civil War on Jan.20th, Form MILITIAS now and acquire targets.”

* “On January 20th we need to start systematicly [sic] assassinating [sic] #liberal leaders, liberal activists, #blm leaders and supporters, members of the #nba #nfl #mlb #nhl #mainstreammedia anchors and correspondents and #antifa. I already have a news worthy event planned.”

* “Shoot the police that protect these shitbag senators right in the head then make the senator grovel a bit before capping they ass.”

* “After the firing squads are done with the politicians the teachers are next.”

* “Death to @zuckerberg @realjeffbezos @jackdorsey @pichai.”

* “White people need to ignite their racial identity and rain down suffering and death like a hurricane upon zionists.”

* “Put a target on these motherless trash [Antifa] they aren’t human taking one out would be like stepping on a roach no different.”

* “We need to act like our forefathers did Kill [Black and Jewish people] all Leave no victims or survivors.”

* “We are coming with our list we know where you live we know who you are and we are coming for you and it starts on the 6th civil war... Lol if you will think it’s a joke... Enjoy your last few days you have.”

* “This bitch [Stacey Abrams] will be good target practice for our beginners.”

* “This cu* [United States Secretary of Transportation Elaine Chao] should be... hung for betraying their country.”

* “Hang this mofo [Georgia Secretary of State Brad Raffensperger] today.”

* “HANG THAt N*** ASAP”

It's indefensible.


For sure, that's horrible. But the hot question is: is Amazon the entity who should decide to shut it down?

I would personally prefer that decision be taken by a governmental agency, or something else overseen by elected officials. Now, I'm not a US citizen, and I have zero trust in the current US administration, but in a "normal" context I would want at least some governmental oversight of the shutdown process instead of delegating full power to a private corporation.

Edit: In a "Trump" context, I really don't know... Really sad to see this country dying slowly.


Here's another hot question: if it were your computer, would you like a government agency telling you if you should leave it up or take it down?


Not sure how that's relevant to the discussion.


The content is hosted on a computer owned by a corporation, not the government. Many people interpret this as their decision to make, rather than the government's. If you don't believe so, you'll have to describe the ways in which their rights to their computers differ from yours.


See my response here, I believe that's also relevant to your question: https://news.ycombinator.com/item?id=25764333


In short, Amazon is not a public utility. Maybe it could be in the future, but it isn't right now. AWS is a private business which rents out servers under a very clearly laid out TOS. In addition, AWS went out of their way for weeks (if not months) to work with Parler on the moderation issues that violated the contract.

Parler has shown no intent to work within the boundaries of AWS TOS. Instead of putting in the work of finding and migrating to other hosting solutions (of which there are many) they used stall tactics. When stalling no longer worked, they brought a junk lawsuit against AWS.


I think Amazon the entity is exactly the right person to decide who their customers are and what their contractual obligations are to those customers.

If I ran a SaaS I definitely wouldn't want to submit a form to some government entity and wait 3 months for some bureaucrat to decide whether or not I can stop serving a customer that I am contractually allowed to stop serving.


I get your point, and understand your position. For my SaaS I also wouldn't want to deal with something like Parler. That being said, I don't personally believe the analogy to your (or my own) SaaS platform is that relevant. The difference here is that Amazon could be considered a major infrastructure provider. It has around 30% of the cloud market share, which puts it in a very powerful category if it has the power to shut down platforms. I personally do not feel comfortable giving that much power to a corporate entity without some level of oversight by citizens (aka the government).


> It has around 30% of the cloud market share, which puts it in a very powerful category if it has the power to shut down platforms.

It only had that power because Parler gave it to them. Parler didn't have to sign a contract they couldn't fulfill. And Parler will be able to find another way to host.


Parler's own CEO admits they purposely built their infrastructure with the expectation that AWS might pull the plug for TOS violations, and that they had options. The truth of that statement seems in question, as it's now Jan 14 and Gab has pretty much already struggled to fill the vacuum left behind by Twitter and Parler.

Relevant part of the document:

>Parler’s allegations of harm contradict its own public statements. Parler’s CEO has assured users that Parler “prepared for events like [the termination] by never relying on Amazon's [sic] proprietary infrastructure,” that the site will be fully operational “with less then[sic] 12 hours of downtime” after termination, and that Parler has “many [companies] competing for [its] [hosting] business.”


What is indefensible, letting users post this garbage, letting it linger too long, or not removing it?


Wait, what did the NBA and NHL do?


They support racial equity; a lot of NBA and NFL players (don't know about the NHL) supported their communities during Covid + BLM, and paid bail/lawyer money in some states so people could get their right to vote back. NBA stars were particularly active during the first lockdown.


Thanks, I guess I missed that (aside from Kaepernick).


I can find thousands of the same thing on Twitter from people on the right and the left. When does Twitter get cancelled?


No you can't.


If you truly believe that, you might want to take off your blinders.

Since you made the claim, back it up. Post a death threat to a public figure on your Twitter account. Let's see how long it lasts.

How'd this go for you? Well, I take it?

You seem to think that I owe you something when I don't. You have a computer, go work it out yourself. It's not my job to help you grow up and think more clearly.

So not great then?

Resolved URL because something in my adblocker keeps me from being forwarded, UTM tags removed:

https://www.courtlistener.com/recap/gov.uscourts.wawd.294664...

Mods can add a [.pdf] in title, perhaps. :)


Yes please change this URL!


I am not on Twitter, but wasn't #HangMikePence trending a few days ago? Isn't it a bit hypocritical when one of the biggest social networks is allowed to do the same thing?


Amazon’s main argument seems to be that they’ve been asking Parler to remove overt calls to violence since mid/late November, but Parler has refused, while Twitter does moderate this sort of thing, especially when asked to do so. I believe Twitter has been deleting #HangMikePence Tweets that are actually calling for violence, but hasn’t been moderating Tweets that aren’t overt threats, like “OMG #HangMikePence was trending on Twitter” or “Capitol rioters were chanting #HangMikePence” type Tweets.

If that’s all true, then to me it seems reasonable. I’d liken it to DMCA takedowns: if you respond to them in a timely manner, and your site’s primary purpose isn’t sharing copyrighted content, you’re fine. You’re in trouble, though, if you refuse to take down the copyrighted content despite receiving DMCA takedowns.


DMCA takedowns are defined by the legal system. The system you are describing here is completely defined by these companies. They are within their rights to do so, but we ought to ask whether we should introduce a legal structure like the DMCA to standardize it from a legal standpoint. This would cut both ways: if they de-platformed people complying with the standard, they would be in the wrong. As it is right now, AWS can just arbitrarily decide the standard. The only incentive they have to make it a reasonable one is public outcry, which is not the kind of backstop you want when it comes to possibly harming fundamental freedoms.


Yeah, at the moment companies are pretty free to refuse service like this, based on their own policies, but I think there’s a decent argument for a legal framework. The biggest tech companies are indeed a core place that communication happens, I believe formalizing what types of content should and shouldn’t be moderated could be a good idea.

This would be very contentious, though. Many, many Americans would be strongly against the government telling private companies how to moderate speech on their platforms. It may even be unconstitutional - would a formal, legal framework represent the government restricting free speech?


It really depends on the details. We already have very strong legal standards around what speech is legal under the 1st amendment. If the laws surrounding corporate governance of speech effectively extend that framework into the private sector, that would probably be a fruitful debate - those who most support corporate agency are also most likely to support speech freedom maximalism. (At least, that was probably the case until recently, since now I think a lot of people are unfortunately changing their minds around how damaging large monopolistic tech companies are for society since they're seeing them flex their unprecedented power in a direction they agree with.)


I can immediately find BLM and antifa calls for violence like this on Twitter and FB.

I can also report them, and they might be removed fairly quickly. How? Remember all those articles everybody was horrified by, about the practically-slave-labor camps of content reviewers developing psychological problems? And how we said we needed to stop doing that? Yeah, Parler doesn't have those camps.

And by the way, do you know how long it took for FB and Twitter to get to semi-decent moderation?

FTA: “During one of the calls, Parler’s CEO reported that Parler had a backlog of 26,000 reports of content that violated its community standards and remained on its service.”

Parler might have refused on some of them, but the above is obviously a big part of the issue. Plus, if some people are willing to justify Twitter leaving up the "hang Mike Pence" stuff, then surely there will be some disagreement about what exactly should be removed on Parler based on context, yes?


It was a hashtag about a video of the DC rioters chanting it. Context matters, and in this context it wasn't hypocritical at all

Source: not on twitter but am familiar with the video


I don't understand your point... Couldn't people watch this video and think that they were right? Hence Twitter was doing exactly the same thing as Parler. Just because you're under the impression that Twitter is a Democrat tool doesn't mean people who don't think like you aren't using the platform to commit crimes, promote violence, and learn how to be violent and racist.


What I mean is that originally the hashtag was being used to spread the video. You're right, people took it out of context and started making actual threats. But that's when Twitter removed it all and banned the tag. This seems different from Parler, which AWS asserts still hadn't addressed content they had flagged as violations before.


It seems like there were some death threats on Twitter as well. Most of them were removed so I can't find any direct source.

> When the rioters could not find and fulfil their ill-formed dreams in the real life, they took to Twitter to express their wish to hang Mike Pence, and soon the phrase started trending on Twitter. [0]

0. https://www.wionews.com/world/twitter-blocks-phrase-hang-mik...


This was addressed in the document. The basic answer is that AWS doesn't host Twitter (so it's not hypocritical), and Twitter blocked this phrase.


This was finely worded: AWS does not host Twitter's feeds, but they are under contract to host Twitter timelines...

https://duckduckgo.com/?q=Aws+to+deliver+Twitter+contract+ti...


Ya, but Twitter deleted that trend and those tweets, and the users who violated the guidelines were warned, suspended, etc.

Parler refused to do anything about it and built their whole business on being a safe space for people wanting to incite and plan violence. That’s the difference.


Being hypocritical requires an individual to be in that state, but here you're attributing it to a vague group, which isn't really a thing.


I've come down to a "both things are true" position on this.

Parler is absolutely guilty of violating TOS because as near as I can tell the entire purpose of Parler was to facilitate this kind of discussion.

That being said, there is a double standard at work as well. The fascist movement we're discussing here probably would not exist were it not for the YouTube recommendation algorithm, which in an effort to keep people on the site has a powerful "rabbit hole" effect. Twitter has also been full of propaganda for years, and only recently have they put forth much effort to do anything about it... because they were making money off the attention it brought to the platform. Nothing happens to these companies because they are bigger, have more money, and own their own infrastructure.


This is why we can't have nice things.

Not sure why people insist on posting comments like those identified. If you were publicly identified as making those comments, you would definitely be at risk of losing friends/jobs etc.

But that said, Parler isn't alone in their issue with inappropriate comments.


But they would seem to be alone in their refusal to moderate them.


Lots of weird stuff in there. Amazon says Parler is suspended, but not terminated, but then says it will preserve data and help with migration. Which is it? Does Parler have a path to removing the suspension?

Additionally, Amazon says Parler does moderate and remove content, but not with a sense of urgency. How does Amazon define this sense of urgency? Parler was suspended because it wasn't quick enough to remove posts, all while trying to manage 12 million users, 2.3 million DAU, and explosive growth. Is a backlog of 26,000 reports really a large number?


It seemed to my untrained eye that the only reasonable point in Parler's lawsuit was that they were given 30-ish hours' notice and the contract required 30 days.

AWS have addressed that point.


Is it really a suspension and not a termination?


If it's a 30-day suspension, and they give notice to terminate at the same time, it's both?


All AWS/GCP/Azure are telling you guys is: build your own damn infrastructure, learn how to install software, and configure your own routes.

If this is not clear, I do not know what is.


So “Big Tech” is in the spotlight for suppressing speech explicitly.

Meanwhile, traditional media implicitly aligns minds to defend the Democrat/Republican duopoly, who in turn protect American aristocratic power, lest those poor billionaires suffer a humiliating figurative death.

I do not really see “I was here first so I won for life” as fostering open debate.


Why is this topic flagged, when so (so) many on the same subject that didn't present evidence of Parler's crimes hit the front page?


I don't know why some topics are flagged while others aren't, but Parler didn't commit any crimes.


HN typically doesn't like it when threads devolve, and users will then flag the thread as such.


Editorialized title. The link is to the response Amazon filed yesterday to Parler's lawsuit.

That said, yeah. I don't know why people are screaming so hard about censorship when the simple truth is that this was an attempt to retroactively apply moderation to a community that had verifiably gotten out of control, at a moment where it was clear that many of these threats were real and not just rhetoric.


There are two things that can be true at the same time:

- Amazon was within its legal and ethical rights to do this, and ought to win any court cases, etc.

- The aggregate situation of de-platforming of individuals and entire platforms from the Internet by a few unilateral decisions by corporations reveals a power structure we ought to not want (it has always been there, but now it's undeniably shown itself willing and able to exercise its power when needed) since it will inevitably be abused given incentives and the lack of checks and balances on it.


> - The aggregate situation of de-platforming of individuals and entire platforms from the Internet by a few unilateral decisions by corporations reveals a power structure we ought to not want (it has always been there, but now it's undeniably shown itself willing and able to exercise its power when needed) since it will inevitably be abused given incentives and the lack of checks and balances on it.

When it's more than five separate organizations that all independently decide to deplatform, is it still a unilateral decision? I also doubt these decisions were made unilaterally within the companies themselves.

In this specific circumstance do we need to regulate these businesses to keep them from removing content related to organizing a violent insurrection?

No one even removed this content until these groups literally stormed the Capitol chanting about hanging members of Congress. I don't see the problem.


I'll admit the use of 'unilateral' here is probably poor; I don't know the proper word. These companies exert monopoly power in the markets they are in. Decisions made by companies which are monopolies are "unilateral-ist" because once a small number of them make a decision together (perhaps colluding), it leaves no recourse for the person or group they are deciding against.

Mentioned elsewhere, the process we should all want is one where, if a company wants to nuke an entire speech platform like this due to illegal speech, they can get a judge to affirm the illegality of the content and the negligence of those hosting it. With that in hand, their actions would be immune from criticism. Without it, we find ourselves here, where we basically have to trust them to make the right decision and not abuse their power. It's not what we should want, for the same reason we should have wanted anti-trust laws in place to reduce the power of monopolists, despite the fact that monopolists were operating entirely legally and often ethically.


Banning content moderation unless you have a judge's order seems like it could be problematic. If this law existed, could dang still ban people for being a-holes and not following site guidelines, or would he need a court order?


I'm referring specifically to the situation where an entire service is going to be shut down, and silence many, many people who presumably were not doing anything illegal.

At the risk of a stupid analogy: AWS has near-Death-Star-like capability at this point, so blowing up Alderaan in the name of killing a few people on it that the local government seems unwilling to deal with should come with a few checks and balances.


So just for hosting?

If someone rents a bunch of VMs and uses them to DDoS Amazon, do they need a judge's order then?


AWS has somewhere around 30% market share in a market with well more than a dozen players. I have a hard time seeing it as a monopoly.


I'm referring to monopoly power, which doesn't require a singular entity. A good thought exercise to gauge this in this specific situation is to ask whether it's only incidental that they were on AWS: if they were on Azure or GCP, I think we can assume the same result would have occurred.


Good thing there's still more than AWS, GCP, and Azure in the cloud hosting market. There are dozens if not hundreds of companies willing to rent you computing hardware on the internet, and dozens if not hundreds of colocation facilities willing to rent you space, power, and network connectivity.


For now.


Is there a feasible plan to obtain better power structures on the same time scale at which these people are literally plotting to overthrow governments, murder political opponents, etc.?


Twitter and FB got a lot of credit for the Arab Spring.


Seems like a difference between overthrowing a fascist religious minority in power and preventing a fascist religious minority from rising to power?

The Electoral College picks the President, but two elections on and no popular-vote win for Trump.

If we’re talking about suppression of speech, seems like Trump was keen on doing that as well, on the backs of a political minority.


There isn't a "deadline" for fixing these problems; they just need to be fixed. Unfortunately, like most things, fear will primarily guide our actions, not sober analysis of their long-term unintended consequences.


On your second point, there will always be power structures. You can't not have them, when considering realistic arrangements made of humans.

So now what?

Is the complaint really that the gatekeepers are different than they were 30 or 100 years ago?

Or that there is no realistic alternative that could have acted, citing some authority you find more agreeable, in an effective timeframe?


I would have much preferred if this action occurred due to a legal order, not a few CEOs making unilateral decisions. If they were bound to go through a form of due process before nuking people off of the Internet based upon speech which is assumed to be illegal, my guess is it would be much less likely to be abused. It certainly seems like such a system would have worked just fine in this situation and could be made extremely efficient.

I've argued for years that having a few companies mediate all consequential human communication was going to be a big problem, so this happening to wake people up to the dangers isn't a surprise to me, just another stage of the process of people realizing it.


The check and balance is working just fine in this case. Parler kept allowing people to post death threats, they didn't remove it because that's the entire reason their platform exists, and AWS enforced their TOS and stopped hosting them. They're free to go elsewhere.


I'm not totally sure what to think, but aren't many violent acts planned on Facebook and Twitter? Wasn't the insurrection at the Capitol? Yet the big boys aren't paying the price; it's the competitor that suddenly got popular and wasn't able to be influenced by politicos.


> it's the competitor that suddenly got popular.

You're omitting a key detail. This competitor was getting popular specifically because of their promise not to remove such violent content, while Facebook and Twitter were steadily ramping up efforts to remove it from their platforms.


The difference: FB is alerted to those posts, they (eventually) take them down / close the communities. Parler was straight up telling AWS to piss off.


Yeah, this is valid. I'm just terrified of the tech companies having this kind of power to censor people and shut down competitors at will. They were kind of justified this time, but given corona, all our communications are flowing through these guys. They're not neutral, nor accountable, and they are very powerful.

They also just demonstrated the ability to collaborate to destroy a company they didn't like without some kind of legal order forcing them to.


Twitter definitely made moves to crack down on this stuff (after all, that's why Parler thinks it was poised to gain a lot of traffic). Parler complains about the "Hang Mike Pence" hashtag, but omits the fact that Twitter in fact did kill the hashtag.

Parler's sin was basically to attempt no moderation whatsoever, and even after it was clear that several individuals intended to make good on their threats (see the invasion of the Capitol), did not seem to be willing to make any serious attempts to moderate content. See ¶6 of https://www.courtlistener.com/recap/gov.uscourts.wawd.294664... for some discussion.

(Note that the executive's name in that declaration is redacted because (s)he is legitimately concerned about violence should their name be made public.)


> but aren't many violent acts planned on Facebook and Twitter?

Shouldn't anyone who plans a violent act on Facebook or Twitter (or Reddit, or HN, or wherever) be banned?

Note that both of those sites are ALSO in the process of desperately cleaning up their violent sub-communities. There's a somewhat humorous running gag right now of conservative thought leaders suddenly complaining that their follower counts are going down.

And as far as Parler specifically: it didn't "suddenly get popular" in a vacuum. Parler absolutely was functioning as a kind of ban-evasion mechanism for Twitter. People who got tossed from Twitter (generally, yes, for violent rhetoric) would find a home there and bring their followers. There was very little organic growth; it was effectively all cannibalized from Twitter.


If by “editorialized” you mean “edited,” yes. But it’s not opinionated; it’s stated as fact in the filing, with specific details backing it up.


It's editorialized. The non-editorialized title of this post would be "Parler, LLC v. Amazon Web Services, Inc."


So, I'm not arguing one way or another, I just want to try out the following for the sake of discussion, as I haven't seen it elsewhere: if we consider Amazon to be an infrastructure provider, and that they therefore have a duty of neutrality, then instead of shutting down Parler directly the approach would be to send a complaint to a relevant federal authority (not sure which one that would be, maybe the FBI? please correct me) and let them do their investigation. If they find something considered bad enough to take the platform down, they would ask Amazon to take it offline. An argument for this is that it gives elected or at least government-related people some oversight of the process instead of giving full power to private companies (which will backfire against the good people if normalized).

Please tell me why that would be bad, I'm interested to know.


I expect common carriers to have a duty of neutrality, and to defer to at least some government agency with oversight before deciding to pull the plug.

There is one rail line to carry grain in and out of a community. That rail line is a common carrier. It shouldn't be able to arbitrarily decide whose cargo to carry or not.

There is realistically only one ISP at my home. That should be a common carrier. They shouldn't be able to arbitrarily choose which IP addresses I can connect to or not, or what ports I can choose to talk on. I realistically have no choice on who to choose for that, so there's no market for alternative providers for me.

AWS is not a common carrier. If you get dropped from AWS there's still Google Cloud, Azure, Oracle Cloud, IBM/SoftLayer, Hetzner, DigitalOcean, Linode, HostGator, DreamHost, and so many other smaller cloud/VPS providers out there. And that's assuming you're for some reason entirely unable to roll your own hardware and move your app into a colo, of which there are literally hundreds of providers in the US alone. There is lots of competition in the cloud hosting industry, and if you widen it to the hosting industry in general it's extremely wide.


> instead of shutting down Parler directly the approach would be to send a complaint to a relevant federal authority (not sure which one that would be, maybe the FBI? please correct me) and let them do their investigation.

It has been suggested by many people that in fact the causality goes the other way. The reason for the near-simultaneous action against Parler from Twilio, Google, Apple and Amazon seems likely to have been a strong suggestion from Federal law enforcement about an imminent threat.

There's no evidence for that per se, but certainly the threat seems imminent.


I agree with your last sentence. If that's what actually happened, the situation is completely different and the general debate is really missing an important point.


According to this, both Twitter and Facebook use AWS: https://www.contino.io/insights/whos-using-aws

There have been a lot of livestreams and general calls of hate and violence on those social media sites, from all over the political spectrum. Where are their bans? That's why there's an uproar.

New York Times, A Genocide Incited on Facebook, With Posts From Myanmar’s Military

https://www.nytimes.com/2018/10/15/technology/myanmar-facebo...

Pretty sure that can be considered violence. I think I feel that way due to the genocide part. Where's Facebook's ban? Oh wait, that's right, big tech is its own good ol' boys club.


Facebook and Twitter have evident moderation (if maybe insufficient) working on removing or flagging such calls to violence. The argument of this case is not about whether there were user messages calling for execution of public figures, it's about whether Parler did anything about it.


The argument is that Facebook and Twitter are doing a better job of moderating, and Parler isn't, or is outright refusing to moderate, its content.

This may well be true. But the point is that now we're arguing a very subjective standard, and the determiners of that standard aren't a judge or jury, but a few CEOs. The standard of "when is it justified to refuse to host an entire platform based upon how it moderates its worst content" is an incredibly complex question with no objective answer, and how these companies answer it has a profound effect on freedom of speech, since deplatforming a site like Parler doesn't just censor the speech of those who ought to be censored (by any standard), but everyone else on the platform.

It's fairly obvious where this will lead, but I doubt we will correct it before it causes immense harm (not necessarily directly, but due to the blowback that will come from censorship without due process and tit-for-tat escalation.)


Twitter and Facebook both use their own data centers, with Twitter only recently (mid-December 2020) announcing that it will migrate some services to AWS. They also try to manage their content moderation as best they can, with policies and practices in place to remove content that violates their standards. Parler relies solely on AWS and has essentially zero content moderation. There are stark differences.


Leaving aside the discussion regarding the monopoly position of Amazon (or other technology companies), I wonder if Parler's situation would have been the same if it had gone to any other company offering the same services and used a platform it didn't own. If they had developed their own stack, wouldn't their flexibility have been greater?


AWS should have no responsibility for what is on the platform, and since it is not taking on all of that responsibility, it should not pretend to take on any of it.

That is entirely the legal process's role. Anyone hoping that some A will decide what B should do for C just doesn't understand the dynamics of human society.


I struggle to understand how this is not a way to remove a competitor and shift the spotlight: those companies get to look like they're doing a good deed, with Parler as the black sheep.

Twitter let ISIS terrorists post beheadings and murders on its platform. A simple search on any racial keyword will show plenty of hatred everywhere on its platform. Facebook, not even talking about what they do: the way they use data, advertise, and leave up content that is racist/dangerous.

And now that Parler jumps in, without the money of Twitter and Facebook and without the processes/software that took FB years to build to be able to delete violent content, we censor them from the outside?

How about we censor Google for doing business with China and promoting censorship? I'd say that's a good enough argument for the US government to nationalize Google and break up their monopoly.


> not a way to remove a competitor

Competitor to what? Amazon isn't particularly relevant in the social media space. (Goodreads and IMDb are the most social sites they run, afaik, and even that's a bit far-fetched.)


A competitor as a technology company and as a platform. Amazon just wanted to participate in something "good" so that they look like the good guys... How about them destroying small businesses and coming out of covid as one of the only marketplaces on earth that made billions out of people's misery and despair? I'd rank that way higher than a bunch of conspiracy theorists on a shady and bad Twitter clone.

I wonder if they'd do the same when the CCP is hosting stuff on their servers, such as lists of people, or software that directly helps put people in jail.

There is a tendency within tech to think there are bad guys and good guys. I see zero difference between Palantir/Parler and Amazon/FB/Google. Or yes, there is: the scale of the latter is way scarier and way more dangerous for us. The sad anniversary of Aaron Swartz's death was 2 days ago, and yet we never learn; we don't understand that this is just showing how dangerous those platforms can be for the people.


Twitter signed a large deal with AWS to host some of Twitter's services...

https://duckduckgo.com/?q=Aws+to+deliver+Twitter+contract+ti...



