Does YouTube's TOS really not contain statements giving them the ability to stop hosting any video, at any time, for any reason?
Edit: I guess not: the only content removal clause I see is this one, which is definitely not "we can remove your video for any reason, as long as we feel like it." I'm a little surprised they don't seem to have included anything like that.
> If we reasonably believe that any Content is in breach of this Agreement or may cause harm to YouTube, our users, or third parties, we may remove or take down that Content in our discretion. (from https://www.youtube.com/static?template=terms)
The highest court in Germany (for civil and criminal proceedings) wrote this in one of their rulings:
> Depending on the circumstances, especially if private companies - as in this case - move into a dominant position and take over the provision of the framework conditions of public communication themselves, the fundamental rights obligation of private parties can in fact be close to or even equal to a fundamental rights obligation of the state.
So as long as YouTube, Twitter etc. are open to "the public", they could write "we reserve the right to remove any content for any or no reason at any time" in their TOS and it wouldn't be valid in Germany.
I feel like it is largely a question of what expectations you work to produce among users and potential users.
If people are actively aware of the organization saying "we can remove videos on a whim if we want to, and we very well might do so", and especially if they are generally known to occasionally do that in practice, and this isn't a surprise to any users with a reasonable level of awareness, then I think that's probably fine.
But if the impression they give to a typical reasonable person is that they are "a platform for almost everyone that behaves according to fixed rules, where you can be reasonably assured that if you follow the stated rules your content will stay available, as much as we are able to accomplish this", but then they go around removing videos on a whim / because they have a grudge against someone / because someone paid them to / whatever, then that's not good, because they aren't being honest about the service they provide.
Whether the govt has any place making an organization that is misbehaving in that particular way stop, I'm not sure one way or the other.
This idea that private companies earn special privileges in this regard because they were started by individuals needs to die.
So what's your take on Lawrence v. Texas?
Two people in their house,
a large, transparent non-profit advocacy organization,
a dark money lobbying org, and
a hundred-billion-dollar for-profit incumbent that acts as a pillar of human interaction for news, education and entertainment across the globe
are very different situations. It isn't as simple as scaling up numbers.
By phrasing what happened in the passive voice, you are downplaying the fact that Google actively strove to make it hard for new competitors to become established.
Of course there still are competitors to YouTube, depending on how we define the market for online video sharing, and Google didn't invent Metcalfe's Law, but nor should they be surprised that it was hard for alternative sites to compete. That is, after all, why they bought YouTube in the first place and discontinued Google Videos.
An attempt at practical antitrust regulation might mean, for example, that if a Walmart replaces every other business in your town through the sheer merit of its business model, that Walmart doesn't get to refuse to do business with you on the grounds that the manager hasn't liked you since high school. It might even mean that you reserve Constitutional rights like assembly/protest within the area that is functionally the public square, which is the private sidewalk in front of the Walmart.
I know which approach I would prefer.
I've recently had to talk to my internet provider, and that certainly increased my citizen's fervor in regulating companies in dominant positions, because every company will immediately turn around to milk/punish consumers, and try to influence politics for their benefit.
But to me, it's not a punishment. It's a natural consequence of YouTube's dominant position: they have to start acting like it.
So ideally, if a video is to be taken down, it ought to be under a rationale along the lines of "you agreed to not doing [very specific thing] and this video does it, therefore we took it down", not "technically by clicking on a link in our site, you agreed to fine print that says you'll sell us your soul, so suck it"
Yes, and that's a good thing because it encourages competition. No market should have a player with such a dominant position.
Companies exist to benefit the state and the people. If a company is so successful it distorts rights for everyone and puts fundamental rights at risk, it's not 'punishing' the company even though it might feel like it. It's making sure the conditions that created success for youtube remain in place for future generations - free speech to some degree, competition, etc.
I think corporate personhood and corporate rights are a cancer on society and the corporatism/corporate state/fascism-lite that the west is already deep into is destroying the fundamental freedoms in the country. Disney will always have more money than you to argue for their corporate rights against your personal rights.
This ruling is wisdom.
No, they bought that status by regularly sending dump-trucks full of gold to record labels.
No, it is just attaching some minimal responsibility to the enormous benefits YT extracts from its dominant market position.
I actually think Germany has it backwards here. If the German government wants to provide a free-speech "safe zone", they should provide it themselves.
You're encouraging government coercion of action.
Of course now you might say, if such a phone company existed, those people who were denied service could just choose another one. But if this company would be the one dominating the market, this would mean they couldn't reach all the people they would want to anymore. This is the analogy with Facebook.
Now someone walks in and starts offering Corona. It's not everyone's favorite, but still decently popular, and the owner has no problem with that.
Until a few hours later, when a new line is added to the list of rules on the door: "no Corona". Then the owner tells the person handing out free Corona that they're breaking the rules and kicks them out.
Is that fair? Not in this case, says the court. Everyone agrees to the rules when they walk through the door, but the owner can't just willy-nilly change them after the fact.
(If you want to really understand the decision, you'll have to read the original instead of relying on analogies.)
Then, on YouTube (or woke mailing lists) there are mostly two sides, only one of which gets censored: The one that opposes the dominant clique.
YouTube is full of filth that is kept up because it is non-political and makes money for Google. If Google were really woke, it would take down all videos that are demeaning to women (according to their ideology). They don't, because these videos are a cash cow.
And you're advocating for foreign private companies to have more rights than the democratically elected governments of the countries they operate in.
Last time I checked, laws still come from countries, not from Google. The world isn't a playground for America's private companies.
The biggest difference generally between a private enterprise (foreign or domestic) and a democratically elected government is that participation in the government and its rules is:
- enforced by physical violence
If I disagree with Google's rules I can avoid Google. If I disagree with the government's rules (for example the tax rates they impose), there will be people with weapons to force compliance.
That's exactly what Germany's highest court denied if they "move into a dominant position and take over the provision of the framework conditions of public communication themselves" because it is not easy to avoid Google/YouTube.
I think we are way past the illusion of democracy and governments protecting their people at this point.
They might have very different views than Americans and Google about what should or shouldn't exist.
As if America and its private companies have a monopoly on what's "right" and what it means to "protect the people" of the world. This is ridiculous.
Companies can only provide products and services to willing customers. To restrict them is to restrict the choices of your own people to choose products they prefer. So what you're really arguing for is the majority in a country restricting the freedom of a minority in that same country.
How did you extrapolate that? No.
I really don't get your second point. You can't sell me a car without a seat belt, you can't sell me weed, you can't host a jihad video platform.
You're already restricted in many ways. The world doesn't have to kneel in front of YouTube &co and accept all their bullshit.
America isn't a role model, neither in politics nor culture; the world doesn't need to conform to what America thinks is right.
Also, YouTube doesn't exist to serve the people, certainly not; it exists to make money, and it uses tax avoidance to pay as little as possible in most EU countries. If they wanted to serve the people, they could start there.
As for the second point, yes and those examples are also cases of restricting their own citizens. Hence the discussion is still about rights of the citizenry versus their own government restricting them, not "rights" of companies as you put it.
The world is free to not use YouTube and always has been. And by the way I don't agree with many of the major websites' censorship decisions in the last couple years either. They depict only what a loud subset of America thinks is right.
Companies generally make money by serving people. I suppose there are business models where this isn't so (e.g., businesses that exist to sue people, exploiting the government's ability to take money by force instead), but in YouTube's case, they entice people the consensual way.
But that is what they are doing here, no? They just use Youtube as the way to provide it. Governments don't have to run businesses themselves; private actors are usually much better at running the day-to-day stuff.
Similarly, if you own a huge amount of land and build a city there and let lots of people move in, don't be surprised when the government seizes your square and roads and makes them public, or at least forces you to let people use them as if they were public roads.
Some people currently seem to have the attitude that a corporation can be protected by the state but not responsible for its actions that negatively impact the common good. This was not the original intention.
> But whether public or private, corporations were originally only granted special legal privileges by government, conditional on them serving some PUBLIC good. With special rights came special restrictions, and their operations were periodically reviewed for compliance with their stated purpose. However, over time the system of incorporation has been altered by corporations themselves, such that the benefits of state-grants have been kept, while the responsibilities discarded.
Whether or not Youtube should have a clause in their terms allowing them to remove arbitrary content is a different question, and yet another is whether they should be allowed to have such a clause.
No one is holding a gun to YouTube’s head and telling them they have to serve video content to Germans. They want to be in Germany serving video content, so it’s only fair and just that they should listen to the Germans and do what they say or leave.
Germany could probably tell Google not to host any videos from the opposition political party under penalty of fine or expulsion. Hopefully we agree that that would be bad?
We can agree that bad things are bad, but that doesn't really give us much insight.
It is entirely possible for countries to do things which are against their constitution.
If they want to diminish the impact legally and profitably, they can just drown it in more intrusive ads.
And anyhow, the GDR dissolved over 30 years ago; I should think its economic policies died with it.
2. Because it's lazy and lacking credibility: it's a claim to being privy to 'the correct path for society', without showing the ability to create something of value.
The state could still tap market forces to bring about public squares that abide with its guidelines, by providing subsidies to private parties developing such platforms, in exchange for the parties entering into an irrevocable covenant that requires them to follow the state's rules.
Also, the company did not have to go to jail; it got a fine.
I think I will take a stroll through the forest; maybe I'll meet a bear that plays guitar and sells refreshing lemonade. I am prepared for anything.
All legal injunctions are predicated on the threat of imprisonment, even fines:
A company is generally its own legal entity, intentionally separated from its shareholders to protect them from any fallout should the company ever go bankrupt or get sued into oblivion. You can get into serious trouble if you founded a company and failed to correctly distinguish between your own and the company's property, because as far as the government is concerned you explicitly told it to make this distinction.
And there is very little privilege that incorporation grants corporations that is not entirely contractual. Limited liability for shareholders for tort is the only one I can think of, and that, in my opinion, should be repealed.
Limited liability for debt can be entirely contractual in its basis, and established by shareholders/companies operating outside of the corporate structure too, simply by stipulating that condition in any loan agreement the business enters into with another party.
> Art. 14(2) GG: Eigentum verpflichtet. Sein Gebrauch soll zugleich dem Wohle der Allgemeinheit dienen.
Which roughly translates to:
> Art. 14(2) GG: Property entails obligations. Its use shall at the same time serve the public good.
If you believe otherwise, then I challenge you to convince a court of that. Courts, under common law, will invalidate any contract where the parties to it did not provide informed and genuine consent to the terms contained in it.
Resorting to legislative intervention suggests a lack of confidence that the allegations premising it would survive the judicial, free-market track.
>>as someone living in a country with multiple nation wide and regional state owned television channels I think they are terrible idea because they are extremely inefficient compared with private ones and also degenerate into propaganda machines for the political party in power, and the longer the same party is in power the worse.
a) you think the polity is incapable of managing large organizations competently enough to compete with shareholder-run private enterprises, yet you think this same polity can craft effective cookie-cutter rules that will affect millions of interactions a day. It's not a coherent model of the world.
b) the state is not limited to organizations under its direct management to support its policies. As I explained in a response to a sibling comment of yours, the state could still tap market forces to bring about public squares that abide with its guidelines, by providing subsidies to private parties developing such platforms, in exchange for the parties entering into an irrevocable covenant that requires them to follow the state's rules.
Yet another option would be the state funding decentralized open-source protocols that are capable of entirely replacing centralized parties in many markets. The state, being publicly funded, is the only entity capable of cost-effectively funding non-profit initiatives like this that produce public goods.
My larger point is that the state should not be monopolizing industries. It should provide alternatives to the private options. If it fails, the public still has the private options. If it succeeds, it did so by providing an option superior to those provided by the private sector. This imposes accountability on the state, by leaving it with competitors to act as a yardstick, while ensuring its interventions only impact the market to the extent that they improve it.
In Spain at least there are lots of instances of courts overturning contracts, the most famous ones maybe related to mortgage conditions. I'm not sure if this is good or not in the long term, but it's the case.
The Japanese infrastructure slush funds are a classic example of that.
And I would also submit that state intervention in healthcare and education has been a dramatic failure, with costs in these two sectors skyrocketing over the period in which the state expanded its role in them.
>>In Spain at least there are lots of instances of courts overturning contracts, the most famous ones maybe related to mortgage conditions. I'm not sure if this is good or not in the long term, but it's the case.
If a court overturns a contract, that is fine. One of the roles of the state, in a free society, is to rule on contracts via its courts, and invalidate those that do not meet the bar for consent.
But legislative bodies do not engage in the same impartial and deliberative process as courts so they are not the proper venue through which to restrict private interactions.
>>Completely free capitalism brings the kind of aberrations that we can already see destroying the livelihood of millions of people in the USA
The idea that the US has "completely free capitalism" is the Big Lie promoted by the state's unionized activist bureaucracy. The US, along with the rest of advanced nations, has rapidly moved away from being a free society with economic liberty, toward one that has a significant amount of centralized control over the private actions of its citizens:
There is nothing wrong with state-owned corporations in principle. They need not violate any one's rights. How they are funded, and what laws are created to assist them, can of course be problematic, as can the general inefficiency of the state sector.
But if the funding sources for the state subsidies are not based on violating private property or contracting rights, there are no laws enacted to give the state-sponsored enterprise a monopoly over a market, and the subsidies address an externality to produce significant positive economic returns, then it would be fine.
This approach to government intervention is less risky, as failure of government administration will only waste the resources expended on the state-sponsored enterprise, while leaving private citizens with the private sector alternatives. Regimenting an entire sector with top-down rules risks destroying that entire sector if the government chooses the wrong set of rules.
> The German court held that YouTube failed to make its enforcement authority clear in its contract with the account operator who posted the video.
It does contain language quite similar to "regardless of the contract, we hold that ...":
Dahingestellt bleiben kann dabei, ob die Nutzungsbedingungen bzw. die „Richtlinie zur medizinischen Fehlinformation über COVID-19“ einer AGB-rechtlichen Kontrolle standhalten, insbesondere, ob sie dem Transparenzgebot genügen bzw. den Nutzer nicht unangemessen benachteiligen (§307 BGB). Denn die Inhalte des streitgegenständlichen Videos verstoßen bereits nicht gegen die Ende Januar 2021 gültige „Richtlinie zu medizinischen Fehlinformation über COVID-19“ (aa). Bezüglich der Neufassung der vorgenannten Richtlinie hat die Beklagte dagegen nicht glaubhaft gemacht, dass diese wirksam in den Vertrag einbezogen ist (bb).
Rough translation: It can be left open whether the terms of use, or the "policy on COVID-19 medical misinformation," would withstand review under the law on standard business terms, in particular whether they satisfy the transparency requirement and do not unreasonably disadvantage the user (§307 BGB). For the contents of the video in dispute do not even violate the version of the "policy on COVID-19 medical misinformation" in force at the end of January 2021 (aa). As for the revised version of that policy, by contrast, the defendant has not credibly shown that it was effectively incorporated into the contract (bb).
Which ruling? Have a link (in German is fine)?
Companies have obligations too, and I think that in Europe (compared to the USA) more people would agree that it's better to limit companies if it benefits people.
Certainly, in this case, it's better for people to have their protests heard than for Google to flex its censorious whims.
For consumers, it is even better not to read the terms, because then they can say that they have been taken advantage of (it is harder to do that if you change a passage manually in a written contract).
Google might also be intentionally lazy. The Google Play licensing terms, for example, generally fail in front of a court when they are challenged, as they actively violate every law on anti-competitive behavior you could think up. Whenever that happens, Google just carves out a new licensing region for that court's jurisdiction and keeps the terms unchanged for the rest of the world.
This is a discussion about Germany making it clear to Google what they're prohibited from removing.
Powerful enough governments can just choose to ignore or re-interpret things any way they like and act on them, though.
The clause in the sentence before that part reads "Beneath the rule of men entirely great,..." It was referring to authoritarian government.
Seems like a common trend in many countries around the world. Tech companies tend to be american, so they're a popular punching bag/scapegoat when it comes to enforcement actions.
It sounds like YouTube's stance is that the video may cause harm to their users, which is one of their criteria for removal.
Literally anything can be argued to be "harmful."
If there weren't people blindly accepting it as the truth, and everyone verified everything, this wouldn't be bad; in fact it would be good, because it could draw attention to something that had perhaps been ignored.
Unfortunately, a large part of the population has no idea how to do fact checking, and what's worse, many get their news from Facebook.
(For what it's worth, the proposed law now refers to "safety" rather than "harms").
Or let me try: what counts as "hateful"?
Yeah. That about clears it up, I guess. Clear as mud.
I don't even understand why Youtube should be forced by a country to host someone's content. I can understand the opposite (being forced by the government to take down content in their jurisdiction, that has been done before), but forcing a site to keep a video feels really strange to me.
It doesn't appear to be settled even in U.S. law whether requiring a company to host something would be governmental overreach. The traditional "common carrier" doctrine is pretty broad. It's not an exact analogy for the kind of regulation some people would like governments to undertake regarding YouTube, Twitter, Facebook, etc., but it's not totally unrelated either.
Eugene Volokh from UCLA's law school recently posted a 79-page draft article on the legalities of regulating social media companies as common carriers under U.S. federal law: https://www2.law.ucla.edu/volokh/carrier.pdf. One large section, II.A., "The General Constitutionality of Compelled Hosting" (pp. 35-58), argues that compelled hosting isn't in general prohibited by the U.S. federal constitution, so in his view the U.S. Congress could pass a common-carrier-style law mandating such companies host content on a neutral basis, if they wanted to (at least if the law avoids various issues he identifies).
Clearly the line is not 'all speech that is allowed in the US'?
Startups less regulation, global conglomerates more regulation.
However, that document does not actually order a fine. It warns about imposing one ("up to 250'000€") if Youtube doesn't remedy the problem. Considering it's now exactly 3 months later (decision dates April 13th), that lines up.
From what I understand (not a lawyer, yadda yadda) this is a purely civil matter and doesn't really go as far as some statement on free speech. The issue discussed mostly seems to revolve around which terms of service are valid and how they can be changed.
Centralized : Dailymotion, Bitchute, Rumble, DTube, Vimeo, Vidlii
Decentralized : Odysee(LBRY), Peertube
Other ways to help are here : https://lbry.tech/contribute
Never heard of them, but they seem to have chosen a poor name that might be censored elsewhere due to the first five letters.
I'm getting the impression from the Google Translate'd version of this article, that they violated a contract with the company?
What confuses me is that the article states the video was removed in January, which would only make sense if it was January 2021, because January 2020 there were no restrictions in Switzerland. Maybe the above date should be May 2021.
I think the relevant piece from the Welt article is this:
> Es [the court] kam unter anderem zu dem Schluss, die geänderten Richtlinien seien nicht wirksam in den Vertrag mit dem Accountbetreiber einbezogen worden. Hierzu sei ein Änderungsvertrag erforderlich. Der bloße Hinweis, dass es künftig Änderungen geben könne, genüge nicht.
Roughly translated, the changed ToS of Youtube were not contractually effective, because the user was not asked to accept them. Stating that changes can happen any time is not enough.
That "future changes may apply without notice" is invalid in Germany, that's evident. Companies writing bullshit like that in AGBs (ToS) or contracts don't realize Germany is not the US, they need a new legal team.
If the news is correct, you are right that the fine is about ignoring the court, not about removing the video in the first place. With a fine that high it has to be about that; there is no way removing a regular video could cause this. Basically the court held YouTube in contempt, more or less.
Might also be politically charged though, East Germany is highly penetrated by Nazis and with a high percentage of corona deniers. Depends on the judge of course whether that was a factor here.
But I was asking why the court was forcing them to re-instate a video in the first place.
None. They violated their civil contract with the user AND failed to fix it within 3 months. cf. toplevel post: https://news.ycombinator.com/item?id=27838603
> “With the historically high fine, the Higher Regional Court makes it very clear that court decisions must be observed without restriction, regardless of whether YouTube assumes a violation of its guidelines or not,”
The point was that, as a platform, you cross a line which is very difficult to ever revert when you optionally censor your platform in response to political pressure, because both sides are then forced, from a game theory perspective, to aggressively pursue censoring your platform in their favor lest the other does it first. The platform ends up being collateral damage in some political war it never cared about in the first place, and it can never win because no matter who it bends to, it automatically enrages the other side.
This seems like where youtube and twitter and facebook are at now. They caved to censoring things that were reasonable, but now anyone can make them censor anything as long as they have some political power somewhere. It's not going to end well for them. Their only winning move was to stay neutral and do nothing but comply with legal requests.
No? This narrative is a bit ahistorical - and it's fueled by this idea that hyperliberal boogeyman started censoring everything.
If you look at what happened in 2017, Google started "censoring" things because advertisers threatened to boycott. The platforms couldn't stay neutral because advertisers became more and more concerned with staying out of any potential scandal.
> One of the videos that had been restricted was a trailer for one of his short films; another was an It Gets Better video aimed at LGBTQ youth. Sam had been shadow-banned, meaning that users couldn’t search for it on YouTube. None of the videos were sexually explicit or profane.
> ... five YouTube channels alleged that the platform had unfairly targeted LGBTQ content creators with practices similar to those described by Bardo: demonetizing videos, placing them in restricted mode without warning, and hiding them from search results.
Doesn't demonetization on YT just mean the ads still run but the money doesn't flow to the creator anymore? Considering YouTube does it that way with copyright claims and also automatically added ads to previously ad-free videos just because they could, it would surprise me if they'd remove video ads by themselves.
I even saw cases (mostly on Facebook, but also on YouTube) where a channel/fan page purposefully switched videos to private while claiming they were being censored.
Sometimes channels will do this because they have "strikes" shown on their creator pages, so they hide their videos to avoid having so many strikes they get shut down. Some science and engineering channels I watch have run into this problem.
It sounds like it's doing its job.
If a site wants to hide all posts/videos that promote some unpopular political belief, or use offensive words, then implementing that censorship as the default user experience is perfectly acceptable, as long as users can choose to opt out of that censorship.
There might be multiple reasons why a given post/video could be censored, and perhaps there is a small burden on sites to tag every single reason rather than mark it for censorship at the first excuse, but I think that a lot of the tagging work could be made the responsibility of the user who uploaded it.
Such a system would hopefully make moot the slightly disingenuous argument that "If sites can't ban political opinions I don't like then they also won't be able to ban spam". Obviously sites would be allowed to put neutral resource limits on users, to prevent DoS attacks.
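The opt-out tagging idea above can be sketched in a few lines. This is a minimal illustration under assumed names (none of this comes from any real site's API): each post carries the set of reasons it was flagged, the site defines a default blocklist, and each user may opt out of any subset of it.

```python
# Hypothetical sketch of tag-based moderation with per-user opt-out.
# A post is hidden only if at least one of its flag tags is still blocked
# after subtracting the categories the user has opted out of.

DEFAULT_BLOCKED = {"offensive-language", "unpopular-politics", "spam"}

def visible_posts(posts, opted_out=frozenset()):
    """Return the posts visible to a user with the given opt-out set."""
    effective_block = DEFAULT_BLOCKED - set(opted_out)
    return [p for p in posts if not (p["flags"] & effective_block)]

posts = [
    {"id": 1, "flags": set()},                   # clean post, always shown
    {"id": 2, "flags": {"offensive-language"}},  # hidden by default
    {"id": 3, "flags": {"spam"}},                # hidden unless spam is opted out
]

# Default experience: only the clean post is shown.
assert [p["id"] for p in visible_posts(posts)] == [1]

# A user opting out of language filtering also sees post 2; spam stays hidden.
assert [p["id"] for p in visible_posts(posts, {"offensive-language"})] == [1, 2]
```

Note that "neutral resource limits" (rate limiting, size caps) would sit outside this scheme entirely, which is what lets a site fight DoS-style spam without making any content judgment at all.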
There’s been a lot of really bad information on 230 from people who ought to know better.
The fulfillment of this fantasy of forcing platforms to abandon their efforts will just lead to all of social media degenerating into cesspits as they fill up with porn and swastikas and all normal people leave.
I agree that highly-public social media anything like what we see now wouldn't work anymore.
I don't even necessarily think that we should kill 230, but I don't think you should be able to curate and promote content, and claim strong rights to posted content, and still enjoy its protections. Yes, this means "algorithm-curation" social media with broad public visibility of content and that claims significant ownership of posted content, would be in trouble. I think services like that should struggle to operate that way. Take ownership or don't, none of this pretending to be one thing while doing another stuff. That doesn't mean we have to crack down on web hosts or ISPs or email providers or anything like that, since they're not doing most of that stuff.
Yeah, it's called Facebook. Regardless, these weren't some no-names who were boycotting; it was pretty much the whales like P&G and Coca-Cola who were complaining. (Just those 2 spend $8BN/year.) At the very least, having any of them pull out would have cratered at least one exec's bonus.
Failure to adhere to terms of service or to express intent to adhere to them going forward.
Parler couldn't or wouldn't stop people from breaking their providers' terms of service and didn't show a good-faith effort towards doing so. Other right-wing outlets do do those things, and have not gotten drilled despite their odious beliefs.
I don't understand why your comment is flagged and dead! You are making a perfectly reasonable point. HN admins, what is unacceptable about what is being stated - I genuinely do not see it.
In this case, with my reply, I preferred to point out the obvious mischaracterization and leave addressing the mendacious falsehoods to others, but it looks like others decided to do something about it.
I would be fine agreeing with you if you had proven that there was no election fraud, but obviously having a different opinion is not proof.
They can pit the providers against each other. And that's basically how a free market works.
1. Google definitely censors things on their own as well. Their own search engine turns up a long list of examples, so I won't rehash them all here, but one illustrative example is their censorship of the dissenter plug-in (https://reclaimthenet.org/google-chrome-web-store-bans-disse...), which also seems to be an example of censorship collusion within the tech industry.
2. Google has a long history of internal activism that is highly progressive, and regularly applies pressure on the company, and creates a culture of fear for employees who are either conservative, centrist, or even moderately left-leaning. The James Damore fiasco is a great example of the internal political culture rearing its head and how it impacts who's comfortable speaking up and steering the company's culture (https://www.inc.com/suzanne-lucas/google-fires-employee-for-...).
3. Why do you think advertisers became "more and more concerned"? It's because of left-leaning activist pressure from groups like Sleeping Giants who have made it their mission to organize activists and create a false sense of societal pressure on advertisers (https://en.wikipedia.org/wiki/Sleeping_Giants). It's the same as Google censoring things, because typically activist employees will draw attention internally to these activist campaigns, and try to alter the company's otherwise neutral stances. There's also a pipeline from internal activist employees to certain members of the press (like Geekwire) to try to use external pressure to move company stances.
I'm willing to accept the premise that Google could have neoliberal pressure on the company (I don't know if I would consider the pressure you allude to progressive or left-leaning). That said, James Damore's memo, if you've read it, is not a good example of it, and I believe he was rightly exiled for it. The memo is poorly sourced and poorly argued. It reads like it was written by someone who doesn't understand what the Dunning–Kruger effect is.
From Lee Jussim, a professor of social psychology at Rutgers University who was a Fellow and Consulting Scholar at the Center for Advanced Study in the Behavioral Sciences at Stanford University (https://quillette.com/2017/08/07/google-memo-four-scientists...):
> The author of the Google essay on issues related to diversity gets nearly all of the science and its implications exactly right.
They tried to get YT to do it, when YT ignored them they went after the money... It is right from the liberal playbook
Yep the liberal playbook that was kicked off by an investigation by the progressive news outlet... The Times... which is owned by Hyperliberal Billionaire... Rupert Murdoch.
Do you even bother to do a small amount of research into your biases? It blows my mind that people think the world is controlled by a couple of megalomaniacs on Liberal Twitter.
For example, Conservatives demanded radio stations stop playing the Beatles, and, only slightly more recently, the Dixie Chicks. They called up advertisers as well.
You could probably find people complaining to artists' patrons in Medieval texts, if you looked.
If you want recent examples, look at WAP, its Super Bowl and Grammy's performances, Lil Nas X, or the NFL and Colin Kaepernick. There's also the witch hunt and boycott on teachers, companies and anyone else they believe are part of a nationwide critical race theory conspiracy.
"Share and publicize, through existing channels, information already available on critical race theory (CRT)"
"Provide an already-created, in-depth, study that critiques empire, white supremacy, anti-Blackness, anti-Indigeneity, racism, patriarchy, cisheteropatriarchy, capitalism, ableism, anthropocentrism, and other forms of power and oppression at the intersections of our society, and that we oppose attempts to ban critical race theory and/or The 1619 Project."
"Commit President Becky Pringle to make public statements across all lines of media that support racial honesty in education including but not limited to critical race theory."
This isn't limited to universities, either - it's being taught in K-12 schools:
"Responding to prompts such as “In the last year, I have learned _____ about race and racism,” and “One way I will work for racial equity in my work,” teachers say:
“American society makes it hard to have high hopes.” Racism infests the nation’s “entire fabric.” Everyone must “lean into the discomfort.” “Older millennials are disappointingly racist.” “Aspects of the anti racist movement have been co-opted by neoliberal corporations, and reactionarily [sic] opposed by many even mainstream conservative thinkers.” Racism is “layered into everything we do at school.” We must “share the harsh reality of the BIPOC and LBGTQI communities with our students.” “Discuss issues of equity as arising in most every book I teach.”
Oh, and the statement by the NEA was removed from its website shortly after (https://ra.nea.org/business-item/2021-nbi-039/ now redirects to the homepage), which is only further evidence for the fact that many of those pushing this ideology are simultaneously attempting to gaslight and actively lie to their opponents in an attempt to convince the public that it doesn't exist.
I'm not sure why you used the phrase "witch hunt" when the ideology clearly exists and is actively being pushed in education around the United States. Perhaps you meant to use the phrase "accountability culture"?
This confusion was the deliberate work of a few conservative thinkers, who wanted an obscure "elitist" academic theory to use as a catch-all term for all anti-racism work.
According to that same mob, when they couldn't find the critical race theory at the school board meeting, everything they didn't like, along with diversity training, suddenly became critical race theory:
> While critical race theory was not on the agenda, parents and community members accused the school district of requiring teachers to take a diversity training that discusses the concept and then teaching it to students. They also criticized the school board for proposing a policy that would allow gender-expansive or transgender students to use their chosen name and gender pronouns and use the restroom that corresponds with their asserted gender identity.
That sounds and looks like a witch hunt for teachers to me.
Also, this is where the conspiracy comes in. Everything conservatives don't like is critical race theory now, and that's by design:
> Christopher Rufo, a prominent opponent of critical race theory, in March acknowledged intentionally using the term to describe a range of race-related topics and conjure a negative association.
> “We have successfully frozen their brand — ‘critical race theory’ — into the public conversation and are steadily driving up negative perceptions,” wrote Rufo, a senior fellow at the Manhattan Institute, a conservative think tank. “We will eventually turn it toxic, as we put all of the various cultural insanities under that brand category. The goal is to have the public read something crazy in the newspaper and immediately think ‘critical race theory.’”
Saying that liberals or conservatives have this to a greater or lesser extent is looking at it through the wrong lens.
We libertarians did not mind our flank, and authoritarian liberals today are about 10000000x more of an issue than even the most extreme Bible-thumping conservative from the '90s ever was.
Are these taxpayer-funded libraries, and could it be the fund reduction is simply a result of the fact that many, including myself, believe that libraries should not be funded by forcible taxation of the population?
This position is often spun by left-leaning sources as "raaaaccciiism"
And "what-aboutism" was exactly the point, I was specifically pointing out that boycotting based on morals isn't in any way only the domain of "hyperliberals."
What actually happened isn't that one side of the political spectrum "censored" the other side in some sort of targeted attack... it's that advertisers and private companies optimized for generating as much profit as possible by (obviously) pandering to the majority of potential customers.
Ironically.. such is the nature of capitalism.
Can you define "reasonable" here in an objective way that we could all agree on?
There's no such thing as neutrality. Leaving up a popular anti-vax video is just as political a decision as taking it down.
>There's no such thing as neutrality. Leaving up a popular anti-vax video is just as political a decision as taking it down.
Nonsense. We don't apply this standard to anything else in life.
I'm not sure what "elsewhere in life" means, but bookstores choose what books they will sell while also not necessarily endorsing every book they carry.
> I refuse to take down videos in general

and

> I refuse to take down this video, because I do not think it needs to be taken down
are two VERY different things. The first one is a political choice about taking down videos in general. The second one is a political choice about a specific topic. The second one is more or less similar to taking down the video because you think it needs to be taken down (same topic, different political decision)... the first one is absolutely nothing like that.
Edit: In case it wasn't clear (because I replied in the wrong place), this is the comment I was discussing
> There's no such thing as neutrality. Leaving up a popular anti-vax video is just as political a decision as taking it down.
While it may be "just as political", it's a political decision about a totally different thing.
Anyway, a social platform that only removes illegal content (not even spam?) sounds absolutely dreadful and I would not use it and no one would pay to advertise on it.
If my friends bring me dreadful content that I don't like, I'm not sure we'll stay friends. If my friends are spamming, I'll ask them to stop and if they continue, I'm not going to stay friends (or at least I'll unfollow them where they're spamming).
If people post garbage to my posts, I'll delete their garbage and restrict access to my posts.
There's no need for the social platform to do moderation, until it starts putting unrelated people's content in front of me; which is something I don't really want from a social platform.
Presumably, you don’t notice you’re on a moderated platform, but HN is very much not an unmoderated free-for-all.
It would be interesting to have a social platform where you can only see mutual connections. I imagine it'd have a hard time competing with email and group texts and all the other ways people who already know each other can stay in touch.
If I'm searching for a video, I try to do it in a web search, and likely it'll lead to YouTube, and hopefully it'll actually be useful content (or Rick Astley, I actually like that song).
Anyway, if I were YouTube, I would turn off comments everywhere, and review videos before including them in recommendations (which would leave a lot of videos out of that section), and probably get a lot fewer views.
Why is it political? They didn't commission the content or request that it be created and hosted on their platform in any way. They're offering the same reasonable self-hosting process that they offer to anyone who shows up with an email address.
It only seems to become political when you decide to take action and either protect or remove the material. You're now no longer a disinterested third party, you're making editorial decisions and it's hard to believe they've taken this step without considering the impact of those decisions.
Other publishing media do not have this standard. For example, the radio waves are another medium where the FCC (which regulates them) could say that only certain things are allowed, or they could say you can broadcast whatever you want. (In fact I think they regulate content, re: obscenity, but I don't want to look it up right now.)
So again, the idea of being only a neutral 3rd party who hosts videos for all-comers is a choice, which has political implications.
They can only do so because there is a limited number of them and users cannot share the space, so it must be licensed to be practically useful.
Also, the FCC cannot dictate to a station what it can and cannot air, the FCC can enforce _community standards_ of the community which is being served by that radio station. They're not in a position to go searching for violations and then act upon them, they merely respond to complaints from the communities themselves.
> So again, the idea of being only a neutral 3rd party who hosts videos for all-comers is a choice, which has political implications.
Yes, but the service clearly exists to make money.. not to make a political statement; which I agree may be incidental, but that shouldn't be the basis for interpreting their actions.
And profiting from it. The scope changes slightly when you realize your business could get sued repeatedly because you promoted misinformation (which is how it would be spun) and someone died because they followed that misinformation.
This is risk, and few of these businesses want to tackle that risk apparently.
It's a testament to how well marketers over the decades have sold the idea that companies care about anything other than their shareholders that people mistake profit-driven motivation for political stances.
> popular anti-vax
since Anti-Vax has now been redefined to include anyone who opposes government-mandated vaccinations, I am a Vaxxed Anti-Vaxxer, as I oppose all government mandates. People should be free to choose on their own if they want a vaccine or any other medical treatment.
So should a video of me expressing this position be removed under an "anti-vaxx" policy?
But I think you'd find two problems: determining what could be illegal is really hard (what's a "true threat" and what's a tasteless joke?), and also you'd end up with a community that looks a lot like 4chan or parler. Not a place I would choose to hang out.
While that may be true.. do you not believe that the exceptional openness of either of those two platforms has an impact on places outside of them? Do you think there's no intangible benefits to you by these places merely existing?
I have yet to hear a positive argument for their existence, and "well it collects the dirtbags!" is actually not one. I spent years tracking reactionary and fascist movements on the internet and how they interrelate and spread information; these sites more or less exist to do exactly that. The targeted harassment campaigns that target random people they've decided not to like--that's just "for the lulz".
Oh, there's definitely an impact.
> Do you think there's no intangible benefits to you by these places merely existing?
Yeah, the campaigns to harass a game emulator developer to suicide because "40% is a good start" are a fantastic benefit to society, obviously.
So the "just create your own platform" trope has been tried and has failed.
If you haven’t seen the issue with echo chambers by now… you’re in one.
I have a locals.com account, reddit accounts (a diminishing amount of my usage as they continue to "mainstream", aka censor, the site), and HN; that is pretty much it for me.
So if I were to continuously upload hours-long videos of digital noise, thousands of them, millions, YouTube ought to be obligated to host these ersatz videos, and not remove them unless they were deemed illegal? In that case, someone could in theory run a successful encrypted cloud backup business off YouTube's servers. Or just use YouTube as a massive versioning backup for their own personal data, confident in the knowledge that the files can never be deleted.
A lot of the replies to your comment fixate on this, since they take issue with “everything is political”. But I think it’s very true, and agree with you.
Why should YouTube host anything and everything that random anonymous users decide to upload? Why is this /holy/ act of uploading deemed undoable and unrevocable?
It’s a very weird way of thinking that it’s political only if the video is removed, but not if it is kept online.
Because keeping it up is the default, while you have to go out of your way to remove it.
If I own a store, is it as much of a political act to allow Trump supporters to shop there as it is to ban them?
That's a misconception, and basically what the whole trolley problem is all about. Doing nothing is not the same as doing something. Each have their own ethical consequences and it's not as easy as you think to spurt out the "right" answer. Sure, you might have a preference that's difficult for others to argue against or for, but others also have the right to reach a different conclusion.
With that being said, may I ask you what you know about vaccination? Are you using information from articles you've read online from "authorized" sources, or are you an expert on the matter? Again my point here is not to say that you're wrong or right, you're absolutely free to reach whatever conclusion you desire, but it's really difficult even for actual experts in the medical field to know what's going on currently, who's motivated by altruism, greed, selfishness, or cronyism. Taking up these stands and pretending it's "science" is an insult to science, because "science" is ever evolving and there's no such thing as consensus in the scientific process.
It is like getting a warrant to search your house. It can be expedited if someone is in grave danger.
I'd argue that much of their censorship was not reasonable at all. None of it was illegal. Mostly just differences of opinion. My "information" is your "misinformation".
People who think any platform has the right to censor information scare the shit out of me. Save for things that are illegal, every platform should be neutral. Moderation should algorithmically be tied to the law.
You wouldn't have to. And if they were the majority, then it would be democracy in action.
It's not a case of deciding to:
>optionally censor your platform in response to political pressure
Because so many people assume that if some content is removed and it is <content I think should be permissible / like> well then that must have been because of political pressure or views at that company or something like that.
So by default:
> The platform ends up being collateral damage in some political war
Even if the content removed wasn't removed due to political pressure the accusations fly and the game begins.
Nobody is forced by a platform's actions to play the game; the sad truth is people will believe the game is on all the time, to explain anything they don't like / don't understand.
Rudy Giuliani thought that Twitter was 'allowing' someone to post content to his account without his permission ... but really he had posted a link to a domain that didn't exist, then someone registered and had some fun with him.
Rudy of course just filled in the blanks of his ignorance with concerns of bias by Twitter... and there's A LOT of people who do that (with all sorts of political views).
We had an article here on HN where upvote and downvote counts changed on YouTube... it was immediately interpreted by some as some sort of political bias.
Have a platform? You're in the game...
Which is why they would have to be very clear that their policy is to only respond to legal orders for takedowns and nothing else.
The scale of spam of all kinds would render the service useless. Even inaction would be interpreted as part of the game ... and you're in the game again.
I think there is also more gray area here. They could have deprioritized things in the algorithm to encourage the direction of the community without explicitly deplatforming anyone or blocking/removing content. I think it's possible that they could have achieved a lot of what they wanted with this method while avoiding poisoning the well.
My understanding is that users of YouTube pressure YouTube to censor some things, YouTube wants to appeal to its user base, because customers are important for profit.
Now Germany says that maybe in their position censoring things for profit is not legal or something like that and fines them.
To me, the fact that they don't care about it is the problem. Those companies are having issues because they have no moral or ethical compass. It's become untenable for them to remain neutral because there is no neutral actor in this scenario. There's never going to be a service that's not held to a moral standard for their content regardless of what the law says.
You can argue that you don't like that they are accountable to their shareholders but it is true.
For a significant fraction of users, the most engaging content will be psychologically manipulative - conspiracy theories, racism, political intrigue, and the like. I see the combination of an engagement machine and such content ethically problematic, but I don't think they're going to turn off the extremely-profitable engagement machine unless forced to, so they restrict content instead.
The trouble is the engagement machine is sophisticated, while the content restrictions are crude. This probably isn't sustainable in its current form, but there aren't easy answers to the problem. A sophisticated "bad" content suppression algorithm sounds pretty dystopian to me.
Staying neutral, or alternatively, trying to appear neutral will most likely in itself enrage.
(BTW, this is what I felt about many of Ravikant's statements: he sounds so wise, so believable, and I want to agree with him, because it fits my own philosophy so much, but as soon as I get distracted from the magic of his voice for a moment, I start doubting if all of it has any ground whatsoever or is it just make believe.)
Edit: However, I think I have to clarify that your interpretation isn't exactly how I remember his point. As I remember, the point was that openly censoring the content for any reason other than the court order is basically the moment where they lose the "plausible deniability" of being responsible for their content, i.e. they cannot be seen as "just a middleman" anymore (this is how they want to be seen). But it is close enough for the matter we are discussing right now.
They could have enforced their policies consistently across all accounts instead of their strong anti-right/anti-conservative bias. It's their selective enforcement that created issues for them.
Facebook literally set up vote drop boxes to maximize the number of votes in precincts that heavily favour Democrats in the last election. They also banned a sitting president. So why even bother pretending they are neutral, when the evidence is pretty clear that they set out to sway elections? You can watch the video of an upset Sergey Brin after Trump won in 2016.
I don't have a problem with them having a side, and acting honestly and openly, or even making one-sided campaign contributions.
I have a problem with a reinforcing circle, where they censor to help elect a government which then rewards their censorship with contracts. At that point they become state actors, censoring on behalf of the state, and I think this is the real issue. And I think we are pretty much there.
On some level, we have to acknowledge that whoever is running the shitshow at youtube and twitter are actually responsible as well, in that they believe they know enough to literally dictate what everyone in the world should see and read.
For example, banning Trump from Twitter is a great example of this. Don't get me wrong, I'm no fan of his politics (or Biden's, or any of them; same shit, different smell, I say), but who the hell do Twitter people think they are that they feel confident enough to ban him? I'm not talking in the sense of a private company here; sure, it's their business and they can do whatever the heck they want. But does anyone really think that by banning him they stopped the spread of whatever ideas of his they think should not be encouraged? At best, this shows tremendous life inexperience, which I expect from kids who read a couple of books and think that they understand everything, the typical example of a tech company worker in a company like Twitter.
But the new and very different point that you're making, which is "removing bad actors improves the quality of information on social networks", is also very disputable and shaky.
- The first link, "Reddit says that the rules Reddit made helped Reddit," is not exactly impartial, so forgive me for mistrusting it.
- Felt bad opening a Vox link (since it's so obviously biased, but whatever, I opened it for a laugh). They say "misinformation slowed, the research indicates online discussion around the topics that motivated the Capitol riot has also diminished", but if you don't see it (or if it doesn't happen online), that doesn't mean it diminished overall. The whole article is full of bias, honestly. Sad that you feel it's worthy enough to source.
I don't think it's been "debunked" as you say. Provide evidence. A few examples is not evidence to make such a blanket statement.
> The best ideas arent the ones that win out, it's the loudest ones and the ones that appeal to our most base emotions that win.
Again, very much disagree with this. If you look short term that may be true, but long term historically speaking at least that has not been the truth.
Maybe on the scale of millenia ideas based in truth are more likely to dominate, but I don't see how you could make that claim about history when our current paradigm of empirical knowledge is only a few centuries old, and already it seems as though cracks are starting to form.
Why do you say it's on the rise? Have you considered that there's lot more people, or that that the internet and various "communities" on the web just gave extra amplification to all sort of ideas? This is exactly why nothing should be banned. If you want wrong ideas to be corrected, you let them be discussed in the open. If you start pulling down videos that talk about flat earth you're gonna end up with grouping all people who think like that in a community where they only get exposed to ideas that affirm their erroneous belief.
> and already it seems as though cracks are starting to form.
again, provide examples or evidence for how and where you see this.
That's exactly what not banning them is doing! Not that I think flat earth content should be banned, as it seems to be mostly harmless. Before the internet, people who believed in fringe conspiracy theories didn't have a good way to coordinate and group together at scale.
Hell, even in the mid 2000s, after the internet had been around for a while, fringe communities tended to self-segregate in their own forums. They group together and formed echo chambers, yes, but they were also insulated from broader society, and therefore had little ability to acquire new converts. It was recommendation algorithms that popularized fringe ideas, by pushing them to bigger audiences that otherwise never would have been exposed to them.
I think that's really the core of the problem, recommendation algorithms. The algorithms don't know how reliable or accurate the content they push is, they just push whatever the machine learning model predicts will keep the user engaged. I would much rather group the people at the fringes into online ghettos than have them roaming the broader web and spewing their nonsense to anyone the algorithm recognizes to be vulnerable to conspiratorial thinking.
As to examples of cracks in the paradigm is respect for empirical knowledge, I don't think you have to look far. The rise of political extremism has resulted in more people on the far right and far left, ideologies that are hostile to the notions of nuance and cool headed reasoning, and thrive on emotionally driven messaging. More than 15% of Americans believe in QAnon, and nearly a third believe in the election conspiracy.
Meanwhile on the left, you have rhetoric that is increasingly hostile to data-driven approaches to problems, instead preferring "lived experience", ie anecdotes. You have cases like David Shor's firing for daring to tweet a study from a respected scholar that appeared to challenge the zeitgeist at the time. Many of these people are highly educated, or even academics themselves.
And that's to say nothing of regular old snake oil that has nothing to do with politics. Essential oils, healing crystals, you name it. Misinformation is on the rise, and most of the population is not equipped to deal with it. Maybe this is a temporary growing pain, maybe not. I don't think there's any way to know for sure right now. But we do know that for most of human history we have lived in the darkness, so it wouldn't be terribly surprising if we end up returning to it.
Are you sure? That hasn't helped flat earthers or anti-vaxxers (of the "vaccines cause autism" camp), why do you think it's true?
My personal opinion is that even if the thing that gets banned or removed finds an audience (and it will), it's still a good idea for any platform - from a tiny webforum to a billions-strong social network - to have speech rules that are fair and reasonably enforced. It's not fun to be on a platform where a handful of celebrities and politicians (incl. Trump) are regularly flouting rules you have to follow.
Furthermore, I'd much rather be in a community in which the loudest and most obnoxious users are shown the door, purely for my own sanity. Free-for-alls stop being fun when they outgrow their original userbase.
Walach H, Weikl R, Prentice J, et al. Experimental Assessment of Carbon Dioxide Content in Inhaled Air With or Without Face Masks in Healthy Children: A Randomized Clinical Trial. JAMA Pediatr. Published online June 30, 2021.
I feel like the core problem in many cases was building a system where someone gets to be arbiter of content in the first place: Apple claims they take software down due to "legal requests" by/in various countries (though this is a lie as they also do it anti-competitively or for market perception reasons, but even taking them at their word here) and yet they caused the ability to do that as somehow Android devices sold in the same jurisdictions support installing arbitrary software without issue.
Platforms like Twitter and YouTube make their problems worse by recommending content--which is entirely their editorial decision and should be seen as such: any benefits or costs, moral or legal, should fall squarely on their shoulders--and then conflating that with having the ability to publish, but they would be in a much more morally (and often legally) defensible position if they simply didn't actively recommend content they disliked but still let it be found by people who actively followed the publisher.
Regardless, as usual, I will now link to my heavily-cited talk (every single slide is inherently a citation from a reputable news source) that I gave in 2017 (maybe 2018? the video was unlisted and I only just recently made it public as someone noted I had never done that, and now the upload date is weirdly set to last week ;P) at Mozilla Privacy Lab--"That's How You Get a Dystopia"--wherein I push hard at the idea that centralized systems and the arbiters they empower are the core problem and never work out, even if you like some of their decisions for some of the time.
The proper approach IMO is to honour legal requests, but provide transparency and display some information about blocked content, like who made the request to block it and so on. Of course content must only be blocked for requests originating from that specific country.
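A standards-based way to implement that kind of transparency already exists: HTTP status 451 ("Unavailable For Legal Reasons", RFC 7725), which lets a server identify the entity that demanded the block. Here's a minimal sketch in Python; the field names, URL, and example authority are illustrative assumptions, not any platform's actual API:

```python
# Sketch of a transparent legal-takedown response: serve HTTP 451 (RFC 7725)
# and say who demanded the block and where the block applies.
# All field names below are hypothetical, chosen for illustration.

def blocked_content_response(blocking_authority: str, request_id: str) -> dict:
    """Build an HTTP 451 response describing a legal takedown."""
    return {
        "status": 451,  # Unavailable For Legal Reasons
        "headers": {
            # RFC 7725 suggests a Link header naming the blocking entity.
            "Link": f'<https://example.com/legal/{request_id}>; rel="blocked-by"',
            "Content-Type": "application/json",
        },
        "body": {
            "error": "unavailable_for_legal_reasons",
            "blocked_by": blocking_authority,
            "request_id": request_id,
            # Geo-scoped: the block only applies in the requesting country.
            "applies_in": ["DE"],
        },
    }

resp = blocked_content_response("District Court of Hamburg", "takedown-2021-0042")
print(resp["status"])  # prints 451
```

Users elsewhere would get the content as usual; users in the affected country would see exactly who ordered the removal instead of a silent 404.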
I maintain the moral equivalent of this for platforms like Twitter and YouTube is to stop conflating their centralized recommendation systems (which should be considered "their speech") with the functionality to publish at all (which should be considered "someone else's speech"): it is going to be way less controversial to stop recommending something that someone else likes if it is still accessible to people who know about it (as they externally discovered the content or author and were directly linked to it/them), and it is also going to be way less controversial to allow people to publish something that someone else doesn't like to their own audience if it isn't being actively recommended to third parties. These recommendation systems have started to conflate "being able to say what you want" with "being able to be granted a large audience" so well that the feature set for moderation fails to separate them, and that's bad for everyone: there is a reason why we talk about "the right to free speech" instead of "the right to be heard".
If that is happening, then the architecture of the internet needs to take more evolutionary steps.
If the state can censor the internet, there's nothing 'inter' about it. It's just a network, subject to the whims of its flailing predecessor.
Yes, that has been happening for many years. China is the best-known case, but AFAIK even well-regarded first-world democracies like the UK block some websites.
> then the architecture of the internet needs to take more evolutionary steps.
The slow adoption of IPv6 is a good example of how the architecture of the Internet is pretty much set in stone at this point. We can put more layers on top of it, like the Tor network, but the underlying protocols are still IPv4/IPv6, which expose enough metadata to allow efficient blocking of protocols or resources.
I have some hope that new TLS standards with hostname encryption (ECH), combined with CDNs, will make blocking impossible. But even that is easily circumvented by a government MITM: Kazakhstan has already deployed the necessary hardware and run successful tests at scale. Browsers blocked its root certificate, but would they block an (imaginary) Chinese root certificate and lose 1.5B users?
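To make concrete why ECH matters here: in TLS without ECH, the hostname a client wants travels in the clear inside the ClientHello's SNI extension, so any middlebox on the path can read it and decide to drop the connection. Below is a minimal, illustrative sketch (happy-path only, no fragmentation handling, function name is my own) of extracting that plaintext hostname from a raw TLS handshake record -- exactly the field ECH encrypts:

```python
import struct
from typing import Optional

def extract_sni(record: bytes) -> Optional[str]:
    """Return the plaintext SNI hostname from a TLS ClientHello record,
    or None if absent. Illustrative only: assumes a well-formed,
    unfragmented record, as a real middlebox would need more care."""
    if len(record) < 5 or record[0] != 0x16:   # not a TLS handshake record
        return None
    hs = record[5:]                            # strip 5-byte record header
    if not hs or hs[0] != 0x01:                # not a ClientHello
        return None
    pos = 4 + 2 + 32                           # skip msg header, version, random
    pos += 1 + hs[pos]                         # skip session_id
    (cs_len,) = struct.unpack_from("!H", hs, pos)
    pos += 2 + cs_len                          # skip cipher_suites
    pos += 1 + hs[pos]                         # skip compression_methods
    (ext_total,) = struct.unpack_from("!H", hs, pos)
    pos += 2
    end = pos + ext_total
    while pos + 4 <= end:                      # walk the extensions
        etype, elen = struct.unpack_from("!HH", hs, pos)
        pos += 4
        if etype == 0x0000:                    # server_name extension
            # skip list length (2 bytes) and name_type (1), read name length
            (name_len,) = struct.unpack_from("!H", hs, pos + 3)
            return hs[pos + 5 : pos + 5 + name_len].decode("ascii")
        pos += elen
    return None
```

With ECH, the real hostname moves into an encrypted inner ClientHello, so a parser like this only ever sees the CDN's shared outer name -- which is also why a government MITM with a trusted root certificate defeats it: the box terminates TLS itself instead of sniffing it.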
centralization is merely a necessary but not sufficient ingredient for dystopianism. you also need a ratcheting consolidation of power, especially money (provided by advertisers in this case) and attention/influence (provided by viewers). centralization is simply one of those (key) ratchets that can be leveraged to further consolidate power for the benefit of directors and executives.
relatedly, the american constitution is an experiment in crafting a centralized system with checks and balances stable enough to withstand assaults of power consolidation. the jury is still out, but it's looking somewhat bleak at the moment, given a runaway executive branch fueled by an unhinged fed/central bank. we seem to be stomping on the gas pedal even as the brick wall looms ahead.
in any case, i've long been an advocate of right-sizing organizations of all sorts, especially governments and companies. we've concretely learned over the past many decades that the negatives of large (and small, but those tend to self-regulate away) entities eventually far outweigh the benefits, and are better substituted by a more diverse and specialized collection of medium-sized ones.
in this specific case, imagine if we had thousands of mini-youtubes, each with their own curatorial quirks. no single mini-youtube could unduly influence the whole zeitgeist. most (all?) will be flawed in their curatorial duties, but none could move public opinion in any meaningful way. viewers could also jump from one to another and be exposed to many different editorial perspectives, even without necessarily being cognizant of that. however, if every mini-youtube were relegated to being a single person each (i.e., no centralization), then we'd lose the benefits of aggregation and curation.
the problem, of course, is that we societally accept size and growth as good things. i'm arguing that they're only good to a certain point (and i read your argument as saying all centralization is bad), and that we need to change incentives as a function of size/centralization, so that we get right-sized organizations rather than unaccountable behemoths. our current incentives make right-sizing an unstable equilibrium point on that curve, which is why it doesn't happen.
A senator can ask you to do something, but the fact that they are a senator does not legally obligate you to comply.
But if the DoJ sends you a demand (I'm not a lawyer, I'm just making up details here)--let's say it's signed by a judge or something--what are you gonna do, say no?
There are also cases like child pornography where you might preemptively remove content without any demand, simply because you want to, but the removal would still be justified as clearly complying with a law, so you haven't poisoned the well by censoring anything.
And that's really the important factor in the end.
As everyone else has said, if you don't do that stuff, you are just going to get your domain and assets seized anyway.