Germany fines YouTube for removing video of anti-lockdown protest (mediaite.com)
495 points by sbuttgereit 65 days ago | 451 comments



> The German court held that YouTube failed to make its enforcement authority clear in its contract with the account operator who posted the video.

Does YouTube's TOS really not contain statements giving them the ability to stop hosting any video, at any time, for any reason?

Edit: I guess not: the only content-removal clause I see is this one, which is definitely not "we can remove your video for any reason, whenever we feel like it." I'm a little surprised they don't seem to have included anything like that.

> If we reasonably believe that any Content is in breach of this Agreement or may cause harm to YouTube, our users, or third parties, we may remove or take down that Content in our discretion. (from https://www.youtube.com/static?template=terms)


> Does YouTube's TOS really not contain statements giving them the ability to stop hosting any video, at any time, for any reason?

The highest court in Germany (for civil and criminal proceedings) wrote this in one of their rulings:

> Depending on the circumstances, especially if private companies - as in this case - move into a dominant position and take over the provision of the framework conditions of public communication themselves, the fundamental rights obligation of private parties can in fact be close to or even equal to a fundamental rights obligation of the state.

So as long as YouTube, Twitter etc. are open to "the public", they could write "we reserve the right to remove any content for any or no reason at any time" in their TOS and it wouldn't be valid in Germany.


This is as it should be IMO. When you position yourself to be a de-facto public square, you shouldn't get to selectively operate like a private club when it suits you.


Don't judge too quickly. The ruling also made clear that it is judges who determine whether content violates the law, which was not the case here. But it also means that platforms would have to comply with the judiciary restricting content and face fines if they don't. That wouldn't be too positive for the net and its development under certain circumstances.

Double-edged sword...


I don’t see how it would be a double-edged sword. It’s a meme, but we do in fact live in a society, with laws made mostly democratically by people (ideally) — which is the closest thing at this scale to a fair system. Why should a private company get to decide what it wants over the local laws? If it were to host child porn, should we be okay with that even though we collectively have voted to ban any form of pornography of minors?


I think that, at least if you replace "shouldn't get to" with "shouldn't", then for some values of "position yourself to be", this would be right.

I feel like it is largely a question of what expectations you work to produce among users and potential users.

If people are actively aware of the organization saying "we can remove videos on a whim if we want to, and we very well might do so", and especially if they are generally known to occasionally do that in practice, and this isn't a surprise to any users with a reasonable level of awareness, then I think that's probably fine.

But if the impression they give to a typical reasonable person is that they are "a platform for almost everyone that behaves according to fixed rules, where you can be reasonably assured that if you follow the stated rules your content will stay available, as much as we are able to accomplish this", but then they go around removing videos on a whim / because they have a grudge against someone / because someone paid them to / whatever, then that's not good, because they aren't being honest about the service they provide.

Whether the government has any place making an organization that is misbehaving in that particular way stop, I'm not sure one way or the other.


Isn't this punishing Youtube for their own success at achieving a dominant market position? They didn't position themselves as the de-facto public square so much as became one due to lack of competition.


Yes. Not so much a punishment as a responsibility that comes with the position. The state isn’t special: if you have deep and fundamental influence over the lives of others, you are burdened with certain responsibilities. Who cares if YouTube is a private company, and it’s not like forcing them to respect rights is hurting their pocketbook: if you have that much power over people, you are held accountable.

This idea that private companies earn special privileges in this regard because they were started by individuals needs to die.


Absolutely. It isn't collectivism to regulate industries and companies that have enormous power to do good or harm. This idea that private property is unconditionally off limits is radically individualist and libertarian, and just as bad as collectivism. We aren't atomized individuals. We all live in a society with a proper order and a common good. When a company begins to do harm to the common good, it must be regulated (or at least certain activities should be criminalized, which is probably a better default in general than regulation). The notion that "I can do what I want" as long as it takes place in the private sphere is absurd. The private is distinct from the public, but it also affects the public and vice versa. Thus the presumption is in favor of not regulating private activity, but it is not an absolute and universally binding one.


> The notion that "I can do what I want" as long as it takes place in the private sphere is absurd.

So what's your take on Lawrence v. Texas?


Not the OP, but I think "companies" and "individuals" are very different, and discussions around corporate responsibility and society are different from regulating social behavior between two individuals.


To add, I think that rather than a person -vs- company dichotomy, it’s better to frame the discussion in terms of the total resources that can be brought to bear, and whether that can have an “undue” influence (relative to the power of all the other agents in a sector). Eg: Bill Gates (an individual) is much closer in this sense to YouTube than a typical small company employing a dozen people.


So what about regulating it between three individuals? Or four? Or four hundred? At what point does collective social behavior become (y)our concern?


Since when is a mathematical definition necessary for this? Finding the "exact line of separation" between a couple of individuals and a powerful megacorp sounds like a fun mental puzzle, but it is not something that comes up in practice. What is the thought experiment in which finding the exact delineation is of importance? On the other hand, it is pretty easy to distinguish the limiting cases. This is like worrying about how large an asteroid has to be in order to be called a planet.


Profit motive, the ability for newcomers in your market-space to challenge your position in your market, and the form of your organization.

Two people in their house,

a large, transparent non-profit advocacy organization,

a dark money lobbying org, and

a hundred-billion-dollar for-profit incumbent that acts as a pillar of human interaction for news, education and entertainment across the globe

are very different situations. It isn't as simple as scaling up numbers.


At the point where a collective asks for the special privilege of forming a limited liability corporation?


[flagged]


Well that escalated quickly...


> became one due to lack of competition.

By phrasing what happened in the passive voice, you are downplaying the fact that Google actively strove to make it hard for new competitors to become established.

Of course there still are competitors to YouTube, depending on how we define the market for online video sharing, and Google didn't invent Metcalfe's Law, but nor should they be surprised that it was hard for alternative sites to compete. That is, after all, why they bought YouTube in the first place and discontinued Google Videos.


Yes. This is antitrust targeted at practical impacts to everyday life rather than at narrowly defined, hard to prove classes of "anticompetitive behavior", which is the route that the US has taken.

An attempt at practical antitrust regulation might mean, for example, that if a Walmart replaces every other business in your town through the sheer merit of its business model, that Walmart doesn't get to refuse to do business with you on the grounds that the manager hasn't liked you since high school. It might even mean that you reserve Constitutional rights like assembly/protest within the area that is functionally the public square, which is the private sidewalk in front of the Walmart.

I know which approach I would prefer.


They made an app that has the power to sway public opinion and they reap billions of dollars of profit from it. They can afford the burden that they took. If they want to participate in the economy and society, it seems only logical and right that they are required to not have a detrimental impact on society. We don't allow companies to do whatever they can do when chasing profits. We don't allow companies to pollute however much they want so they can make more money, we don't allow car companies to make death trap cars, etc.


"Punishing" isn't really the right word, although it certainly feels that way from their position. As democratic societies, we've long felt it necessary to restrict and regulate monopolies and oligopolies.

I've recently had to talk to my internet provider, and that certainly increased my citizen's fervor in regulating companies in dominant positions, because every company will immediately turn around to milk/punish consumers, and try to influence politics for their benefit.


Sure, but it's also the job of the government to oversee the power corporations have and how they wield it. No single corporation should be allowed to actively shape and censor information just because they are successful / a monopoly.


YouTube's very “success” is “rewarding”.

But to me, it's not a punishment. It's a natural consequence of YouTube's dominant position: they have to start acting like it.


The way I interpret it is that wording an agreement in terms of "we can delete whatever for whatever reason" doesn't specify in advance exactly what is fair game and what isn't. This is similar to Rossmann's complaints that NYC shouldn't get to fine a business for an alleged violation if they themselves cannot explain what the rules are in the first place.

So ideally if a video is to be taken down, it ought to be under a rationale along the lines of "you agreed to not doing [very specific thing] and this video does it, therefore we took it down", not "technically by clicking on a link in our site, you agreed to fine print that says you'll sell us your soul, so suck it"


> Isn't this punishing Youtube for their own success at achieving a dominant market position?

Yes, and that's a good thing because it encourages competition. No market should have a player with such a dominant position.


I'm not sure it's punishment as long as they keep making their pound of gold. In fact it probably makes it easier for them, since they can just say "sorry, it was X country who made us do it", allowing them to deflect any responsibility.


It's not punishing YouTube; it's making sure private market forces don't distort speech rights across the entire country, as they are doing in the US.

Companies exist to benefit the state and the people. If a company is so successful it distorts rights for everyone and puts fundamental rights at risk, it's not 'punishing' the company even though it might feel like it. It's making sure the conditions that created success for youtube remain in place for future generations - free speech to some degree, competition, etc.

I think corporate personhood and corporate rights are a cancer on society and the corporatism/corporate state/fascism-lite that the west is already deep into is destroying the fundamental freedoms in the country. Disney will always have more money than you to argue for their corporate rights against your personal rights.

This ruling is wisdom.


> They didn't position themselves as the de-facto public square

No, they bought that status by regularly sending dump-trucks full of gold to record labels.


>Isn't this punishing Youtube for their own success at achieving a dominant market position?

No, it is just attaching some minimal responsibility to the enormous benefits that YT exploits their dominant market position for.


Hmm, when did they do this?

I actually think Germany has it backwards here. If the German government wants to provide a free-speech "safe zone", they should provide it themselves.

You're encouraging government coercion of action.


Think about it this way: you build something that looks like a pub, provides the services of a pub, and you say you are open for business. Would it really be that surprising that the state enforces laws that apply to pubs on your business, or would you tell the health inspector to fuck off and open his own pub if he has problems with it?


It’s the wrong analogy. Phone service is the correct one. It’s something basically everyone uses. You should be able to call or group call anyone you want. The phone company can’t claim they own the phone lines so they decide which content can be discussed or who is allowed to make a phone call. It’s that simple.


Whenever someone proposes a simple solution to a social or political problem that nobody can agree on, and says "It's that simple", they're marking themselves as not understanding what they're talking about while arrogantly pretending they do. It's automatically not simple or it wouldn't even be in the news. You certainly aren't the authority on it.


So you would be ok with a phone company that doesn't allow certain groups of people to make phone calls? For instance, people whose political views it doesn't agree with? Or even worse, people with certain ethnic backgrounds?

Of course now you might say, if such a phone company existed, those people who were denied service could just choose another one. But if this company were the one dominating the market, it would mean they couldn't reach all the people they would want to anymore. This is the analogy with Facebook.


Seems more like the government ordering you to serve a disorderly patron you previously ejected.


If you want a real analogy, it's more like a pub where most drinks are free because they're served by other customers and the owner makes money by auctioning ad space on the glasses. Anyone can go there to hand out free drinks, provided they follow the rules, like no alcohol in the kids' corner and no coca wine. If someone's drink is very popular, they get a cut of the pub's advertising revenue.

Now someone walks in and starts offering Corona. It's not everyone's favorite, but still decently popular, and the owner has no problem with that.

Until a few hours later, when a new line is added to the list of rules on the door: "no Corona". Then the owner tells the person handing out free Corona that they're breaking the rules and kicks them out.

Is that fair? Not in this case, says the court. Everyone agrees to the rules when they walk through the door, but the owner can't just willy-nilly change them after the fact.

(If you want to really understand the decision, you'll have to read the original instead of relying on analogies.)


I don't see any similarities here: A really disorderly patron is hard to avoid in a restaurant but easy to avoid on YouTube.

Then, on YouTube (or woke mailing lists) there are mostly two sides, only one of which gets censored: The one that opposes the dominant clique.

YouTube is full of filth that is kept up because it is non-political and makes money for Google. If Google were really woke, it would take down all videos that are demeaning to women (according to their ideology). They don't, because these videos are a cash cow.


Yeah, but it’s still my property and you can’t make me serve a beer I don’t like.


If your pub, for example, refuses to serve people of color, then yes, the government will come in and shut it down.


But you will be required to serve non-alcoholic drinks.


that are cheaper than the cheapest alcoholic one


> You're encouraging government coercion of action.

And you're advocating for foreign private companies to have more rights than the democratically elected governments of the countries they operate in.

Last time I checked, laws still come from countries, not from Google. The world isn't the playground of America's private companies.


> And you're advocating for foreign private companies to have more rights than the democratically elected governments of the countries they operate in.

The biggest difference generally between a private enterprise (foreign or domestic) and a democratically elected government is that participation in the government and its rules is:

- compulsory

- enforced by physical violence

If I disagree with Google's rules I can avoid Google. If I disagree with the government's rules (for example the tax rates they impose), there will be people with weapons to force compliance.


> If I disagree with googles rules I can avoid google.

That's exactly what Germany's highest court denied if they "move into a dominant position and take over the provision of the framework conditions of public communication themselves" because it is not easy to avoid Google/YouTube.


Sadly Google isn't your mom and pop convenience store at the corner of the street. Arguing it is just a regular business is dishonest and you know it


If you don't agree with the laws, emigrate! It's probably easier than completely avoiding Google.


The Swedish government asked Google to remove YouTube news channels that were critical of the Swedish state-owned media, and Google did.

I think we are way past the illusion of democracy and governments protecting their people at this point.


I'm talking about national sovereignty of democratically elected governments.

They might have very different views than Americans and Google about what should or shouldn't exist.

As if America and its private companies have a monopoly on what's "right" and what it means to "protect the people" of the world; this is ridiculous.


Do you have a source for this please?


So using "foreign" as a pejorative is cool as long as the foreigners are Americans?

Companies can only provide products and services to willing customers. To restrict them is to restrict the choices of your own people to choose products they prefer. So what you're really arguing for is the majority in a country restricting the freedom of a minority in that same country.


> So using "foreign" as a pejorative is cool as long as the foreigners are Americans?

How did you extrapolate that? No.

I really don't get your second point. You can't sell me a car without a seat belt, you can't sell me weed, you can't host a jihad video platform.

You're already restricted in many ways. The world doesn't have to kneel in front of YouTube &co and accept all their bullshit.

America isn't a role model, neither in politics nor culture; the world doesn't need to conform to what America thinks is right.

Also, YouTube doesn't exist to serve the people, certainly not; it exists to make money, and it uses tax evasion to pay as little as possible in most EU countries. If they wanted to serve the people, they could start there.


You're the one who brought national origin into the thread. The arguments on both sides were made just fine without it.

As for the second point, yes and those examples are also cases of restricting their own citizens. Hence the discussion is still about rights of the citizenry versus their own government restricting them, not "rights" of companies as you put it.

The world is free to not use YouTube and always has been. And by the way I don't agree with many of the major websites' censorship decisions in the last couple years either. They depict only what a loud subset of America thinks is right.

Companies generally make money by serving people. I suppose there are business models where this isn't so (e.g. firms that exist to sue people, exploiting the government's ability to take money by force instead), but in YouTube's case, they entice people the consensual way.


The vast majority of companies shaping the social discourse are American. You can't conveniently ignore that fact; it's inherently part of the discussion. I'm not drowning in Gambian or Afghan cultural bs.


> If the German government wants to provide a free-speech "safe zone", they should provide it themselves.

But that is what they are doing here, no? They just use YouTube as the way to provide it. Governments don't have to run businesses themselves; private actors are usually much better at running the day-to-day stuff.

Similarly if you own a huge amount of land and build a city there and let lots of people move in, don't get surprised when the government seizes your square and roads and make them public, or at least force you to let people move there as if they were public roads.


But the government isn't paying YouTube to host videos. Eminent domain and related takings require compensation.


Historically corporations were given a license to operate by the state, and limited liability conditional on their serving a public good.

Some people currently seem to have the attitude that a corporation can be protected by the state but not responsible for its actions that negatively impact the common good. This was not the original intention.

> But wether public or private, corporations were originally only granted special legal privileges by government, conditional on them serving some PUBLIC good. With special rights came special restrictions, and their operations were periodically reviewed for compliance with their stated purpose. However, over time the system of incorporation has been altered by corporations themselves, such that the benefits of state-grants have been kept, while the responsibilities discarded. [0]

[0] https://ptolemy3.medium.com/but-corporations-are-private-com...


This isn't eminent domain, the government didn't decide to build a video service that they need to evict Youtube for. Apparently Youtube decided to remove a video without proper support in their terms of service, so they're being judged to be in violation of their contract with the user.

Whether or not Youtube should have a clause in their terms allowing them to remove arbitrary content is a different question, and yet another is whether they should be allowed to have such a clause.


YouTube is free to pack up shop and stop servicing Germany if they do not like its terms and conditions.


Perhaps in your jurisdiction, but in this hypothetical example, I guess not.


I am surprised you are being downvoted. Engineers are the worst kind of people to make rules, as they see things as simply black and white when things are much more nuanced. Best thing to do is "mind your own business". Unfortunately they feel it's their business lol


They are not a public square. They do have the right to censor you. Especially, if you spread misinformation and hatred.


Isn't the whole point of this ruling that they don't have that right, at least in Germany?


Seems like Germany should simply nationalize the platform if that's what they want.


Do you think Germany "nationalizing" YouTube is less extreme than this ruling?


Forcing Google to host content against their will is pretty close. I guess in this case it's better than nationalizing though, Google has to pay for the hosting, and you get to tell Google they're not allowed to delete videos.


I require guests in my house to take their shoes off. They don’t have to do it, they can just leave. But if they want to stay in my house they need to take their shoes off.

No one is holding a gun to YouTube’s head and telling them they have to serve video content to Germans. They want to be in Germany serving video content, so it’s only fair and just that they should listen to the Germans and do what they say or leave.


A country isn’t really like your house though, is it?

Germany could probably tell Google not to host any videos from the opposition political party under penalty of fine or expulsion. Hopefully we agree that that would be bad?


Germany could also force all food establishments to add a little bit of poison to the dishes. Would that be an argument against government regulating food preparation standards?

We can agree that bad things are bad, but that doesn’t really give us much insight.


But Germany can't do that because it would be against Germany's constitution.


It's a hypothetical. A Constitution can be amended or ignored, can't it? My point is that a country is not your house. And citizens are not visitors. It's the wrong analogy and leads to bad conclusions.


s/can't/probably wouldn't/

It is entirely possible for countries to do things which are against their constitution.


You know that Germany does actually outlaw promotion of certain political parties, right? Hopefully you agree German law is bad or you're disagreeing with yourself.


Reminder that Germany doesn't have the same definition of "freedom of speech" that the US does (not making the case that one is better than the other, but simply that they are different.)


They did not delete it because it was unprofitable to host.

If they want to diminish the impact legally and profitably, they can just drown it in more intrusive ads.


Law overrides contracts.


Sad you're getting downvoted. I was thinking about this the other day: if nations really want to limit applications on the internet, they need to create a free-to-use, high-quality national alternative, and personally I don't think this is an awful idea. A Canadian "Twitter" provided by a gov entity? I would for sure be interested in that. The ability to use a home-grown Twitter that I needed some government ID to log into, and that had been thought out sensibly; at least then I'd know what I'm getting myself into. I'm not saying it should replace the regular internet, just that I wouldn't be opposed to more controlled, nationalized alternatives. I'd like to talk to fellow Canadians, in the "comment section", in a manner that I know with over 80% confidence they're actually in Canada.


You can make your own Mastodon instance where you could possibly/probably require verification to approve members based on some Canadian IDs, but it sounds kinda overkill really.


Good luck finding anyone else bothering to use it. Twitter isn't just some computer software, it's also network effects and the couldn't-have-been-predicted product of a competitive market. How would the government service be known in advance to become a Twitter and not a MySpace?


It's not theirs to take.

And anyhow, the GDR dissolved over 30 years ago, I should think their economic policies died with them.


Is it theirs to force them to host content Google wishes to delete?


I don't think it's unreasonable for companies with publicly available communication platforms be required to abide by laws in the countries they make those platforms available in. If youtube decided that they would host videos in germany on an invite only basis, it would be different.


No matter what the law?


It's theirs to host the videos themselves if they deem them worthy of being hosted.


The German government? Yeah I think they should host videos themselves if they want them to be online and google doesn’t.


If the people want a free-speech-respecting public square, they should petition their state to create one that is compelling enough to become the choice of the majority, not let private citizens do the hard work of creating an effective square, and then use force to commandeer it.


Why? We could let the free market come up with competing solutions and regulate where necessary.


1. Because it's a violation of other people's rights to control their own private property. It's predicated on using the threat of imprisonment against people acting peacefully, which infringes upon human rights.

2. Because it's lazy and lacking credibility: it's a claim to being privy to 'the correct path for society', without showing the ability to create something of value.

The state could still tap market forces to bring about public squares that abide with its guidelines, by providing subsidies to private parties developing such platforms, in exchange for the parties entering into an irrevocable covenant that requires them to follow the state's rules.


1. A company is not a person.

Also, the company did not have to go to jail, it got a fine.


"Punks" arguing for corporate rights to ban content in the name of human rights.

I think I will take a stroll through the forest; maybe I meet a bear that plays guitar and sells refreshing lemonade. I am prepared for anything.


Punks don't rely on the state to force private citizens to let people use their platforms.


A company belongs to its shareholders. Commandeering it violates the shareholders' rights to their property.

All legal injunctions are predicated on the threat of imprisonment, even fines:

https://www.theatlantic.com/politics/archive/2016/06/enforci...


> A company belongs to its shareholders.

A company is generally its own legal entity, intentionally separated from its shareholders to protect them from any fallout should the company ever go bankrupt or get sued into oblivion. You can get into serious trouble if you founded a company and failed to correctly distinguish between your own and the company's property, because as far as the government is concerned you explicitly told it to make this distinction.


It's its own legal entity, that is owned by other people.

And there is very little privilege that incorporation grants corporations that is not entirely contractual. Limited liability for shareholders for tort is the only one I can think of, and that, in my opinion, should be repealed.

Limited liability for debt can be entirely contractual in its basis, and established by shareholders/companies operating outside of the corporate structure too, simply by stipulating that condition in any loan agreement the business enters into with another party.


In Germany, near the top of our most fundamental law, the Grundgesetz, it says:

> Art. 14(2) GG: Eigentum verpflichtet. Sein Gebrauch soll zugleich dem Wohle der Allgemeinheit dienen. [0]

Which roughly translates to:

> Art. 14(2) GG: Property entails obligations. Its use shall at the same time serve the public good.

[0]: https://www.gesetze-im-internet.de/gg/art_14.html


Any state control over private citizens can be rationalized in this way. Private property is a private creation, and the use of public resources by the individuals who create it should not be used as an excuse to deny its creators their rights over it, any more than the use of public resources by individuals should be used to invalidate their right to free speech.


The clause only applies when the platform is big enough. It's a concentration of power in the hands of a few, certainly not the shareholders, as you say in your other comment. That such a concentration of power will be abused is almost a law of nature, and so laws are put into place, the same way there are laws to prevent monopolies, also to the detriment of shareholders. You may think these practicalities don't align with your ideology, which is fine, but as someone living in a country with multiple nationwide and regional state-owned television channels, I think they are a terrible idea: they are extremely inefficient compared with private ones and also degenerate into propaganda machines for the political party in power, and the longer the same party is in power, the worse.


It's not an abuse of power if a) it's contractual and b) it's private property. One doesn't automatically become a victim, incapable of providing genuine consent to the terms offered, by virtue of the party offering the terms having a service that one badly wants or needs.

If you believe otherwise, then I challenge you to convince a court of that. Courts, under common law, will invalidate any contract where the parties to it did not provide informed and genuine consent to the terms contained in it.

Resorting to legislative intervention suggests a lack of confidence that the allegations premising it would hold up on the judicial free-market track.

>>as someone living in a country with multiple nation wide and regional state owned television channels I think they are terrible idea because they are extremely inefficient compared with private ones and also degenerate into propaganda machines for the political party in power, and the longer the same party is in power the worse.

Two points:

a) you think the polity is incapable of managing large organizations competently enough to compete with shareholder-run private enterprises, yet you think this same polity can craft effective cookie-cutter rules that will affect millions of interactions a day. That's not a coherent model of the world.

b) the state is not limited to organizations under its direct management to support its policies. As I explained in a response to a sibling comment of yours, the state could still tap market forces to bring about public squares that abide by its guidelines, by providing subsidies to private parties developing such platforms, in exchange for the parties entering into an irrevocable covenant that requires them to follow the state's rules.

Yet another option would be the state funding decentralized open-source protocols that are capable of entirely replacing centralized parties in many markets. The state, being publicly funded, is the only entity capable of cost-effectively funding non-profit initiatives like this that produce public goods.

My larger point is that the state should not be monopolizing industries. It should provide alternatives to the private options. If it fails, the public still has the private options. If it succeeds, it did so by providing an option superior to those provided by the private sector. This imposes accountability on the state, by leaving it with competitors to act as a yardstick, while ensuring its interventions only impact the market to the extent that they improve it.


I do think the state is capable of managing very large organizations, like public education, health, infrastructure... And the political party in charge may win votes by doing good work. But in the case of the media, I think the interests are not aligned. I think putting laws in place to regulate private companies is the lesser of two evils.

In Spain at least there are lots of instances of courts overturning contracts, the most famous ones maybe related to mortgage conditions. I'm not sure if this is good or not in the long term, but it's the case.


I would submit that any large organization, whether it's public education, health or infrastructure, can be heavily abused for political gain like providing patronage opportunities.

The Japanese infrastructure slush funds are a classic example of that.

And I would also submit that state intervention in healthcare and education has been a dramatic failure, with costs in these two sectors skyrocketing over the period in which the state expanded its role in them.

>>In Spain at least there are lots of instances of courts overturning contracts, the most famous ones maybe related to mortgage conditions. I'm not sure if this is good or not in the long term, but it's the case.

If a court overturns a contract, that is fine. One of the roles of the state, in a free society, is to rule on contracts via its courts, and invalidate those that do not meet the bar for consent.

But legislative bodies do not engage in the same impartial and deliberative process as courts, so they are not the proper venue through which to restrict private interactions.


Putting limitations on what a private entity can do on national soil is completely within the power of local governments. If you want to make money here, you follow our rules. Completely free capitalism brings the kind of aberrations that we can already see destroying the livelihoods of millions of people in the USA.


That is abuse of the monopolistic control the state exerts over the land within its jurisdiction. It would be equally abusive if the state used it to prohibit homosexual intercourse, marijuana usage or atheism-promoting speech.

>>Completely free capitalism brings the kind of aberrations that we can already see destroying the livelihood of millions of people in the USA

The idea that the US has "completely free capitalism" is the Big Lie promoted by the state's unionized activist bureaucracy. The US, along with the rest of advanced nations, has rapidly moved away from being a free society with economic liberty, toward one that has a significant amount of centralized control over the private actions of its citizens:

https://ourworldindata.org/grapher/social-spending-oecd-long...


Is it really a mainstream position of modern libertarians that state-owned corporations competing against private interests are better than passing a law against some externality? This reads like a Chinese Communist Party press release, not like a non-aggression-principle, individualist libertarian.


I can't speak for the mainstream libertarian position, only my own beliefs on what is consistent with human rights and what ruleset would likely be conducive to economic evolution towards greater functionality.

There is nothing wrong with state-owned corporations in principle. They need not violate anyone's rights. How they are funded, and what laws are created to assist them, can of course be problematic, as can the general inefficiency of the state sector.

But if the funding sources for the state subsidies are not based on violating private property or contracting rights, there are no laws enacted to give the state-sponsored enterprise a monopoly over a market, and the subsidies address an externality to produce significant positive economic returns, then it would be fine.

This approach to government intervention is less risky, as failure of government administration will only waste the resources expended on the state-sponsored enterprise, while leaving private citizens with the private sector alternatives. Regimenting an entire sector with top-down rules risks destroying that entire sector if the government chooses the wrong set of rules.


I love this, because to all those who say that only the government may not censor while private companies can do whatever they want, it makes clear that censorship by private companies that dominate our communication has the same effect as government censorship, and should therefore be under the same restrictions.


Is this not in direct conflict with the quote from the article / OP? The summary should then have been “regardless of the contract, we hold that …”, which is different from the aforementioned quote:

> The German court held that YouTube failed to make its enforcement authority clear in its contract with the account operator who posted the video.


It's a multi-layered argument and a bit of it probably got lost in the process of summarizing it for news reporting and the subsequent translation. You can find the original decision from April 2021 where YouTube was ordered to put the video back up here https://www.justiz.sachsen.de/esamosplus/pages/index.aspx by searching for Aktenzeichen "4 W 118/21" and then clicking on „Dokument öffnen“.

It does contain language quite similar to "regardless of the contract, we hold that ...":

Dahingestellt bleiben kann dabei, ob die Nutzungsbedingungen bzw. die „Richtlinie zur medizinischen Fehlinformation über COVID-19“ einer AGB-rechtlichen Kontrolle standhalten, insbesondere, ob sie dem Transparenzgebot genügen bzw. den Nutzer nicht unangemessen benachteiligen (§307 BGB). Denn die Inhalte des streitgegenständlichen Videos verstoßen bereits nicht gegen die Ende Januar 2021 gültige „Richtlinie zu medizinischen Fehlinformation über COVID-19“ (aa). Bezüglich der Neufassung der vorgenannten Richtlinie hat die Beklagte dagegen nicht glaubhaft gemacht, dass diese wirksam in den Vertrag einbezogen ist (bb).

"The question whether the terms of use resp. the "guidelines regarding medical disinformation about COVID-19" would withstand a check under ToS-law, especially whether they are sufficiently transparent resp. inappropriately disadvantage the user (§307 BGB) can be left aside. Because the content of the video under dispute already does not run counter to the "guidelines regarding medical disinformation about COVID-19" that were in effect at the end of January 2021 (aa). As for the revised version of the aforementioned guidelines, the accused has not convincingly argued that it has been effectively incorporated into the contract (bb)."


The OP is about the ruling of a lower court though. I guess they didn't even reach the fundamental question.


>wrote this in one of their rulings:

Which ruling? Have a link (in German is fine)?



That's very encouraging. We should make this a law on the EU-level.


Isn't it kind of implied that if you own a platform, you retain the discretion to remove stuff at any time, unless otherwise prohibited? The YouTube terms of service can be updated at any time, and will never include language promising to never take down content.


I think it's just an American thing to treat private companies like the Wild West.

Companies have obligations too, and I think that in Europe (compared to the USA) more people would agree that it's better to limit companies if it benefits people.

Certainly, in this case, it's better for people to have their protests heard than for Google to flex its censorious whims.


Terms and conditions are invalid in Germany (and probably much of Europe) if they run counter to established civil or criminal law. They are regularly thrown out in court.

For consumers, it is even better not to read the terms, because then they can say that they have been taken advantage of (it is harder to do that if you change a passage manually in a written contract).


That argument would imply that the terms of service are worthless as the platform owner can do whatever they want. That’s not how contracts work, at least not in Germany.


Imagine being responsible for writing the individual ToS for every nation on Earth. That sounds hard.


They probably didn't bother at all. Not long ago, Apple got fined because they implied that they would only provide a one-year warranty unless you paid for more; in the EU, two years is the absolute minimum, and telling your customers otherwise is not an option. As far as I could tell, they just wrote the US version and translated that for every non-English-speaking country.

Google might also be intentionally lazy; the Google Play licensing terms, for example, generally fail in front of a court when they are challenged, as they actively violate every law on anti-competitive behavior you could think up. Whenever that happens, Google just carves out a new licensing region for that court's jurisdiction and keeps the terms unchanged for the rest of the world.


Apple's Finnish warranty conditions still say that they offer a limited one (1) year warranty on their products. However, the warranty terms start with a bolded section about consumer protection laws and how this limited one (1) year warranty is an additional protection offered voluntarily by the manufacturer, and that it does not replace the rights given to consumers by these laws or in fact apply in claims made under consumer protection laws.


Imagine doing business and getting revenue from every nation on Earth. That sounds hard.


There are systems, such as international shipping and financial technology that simplify many of those difficulties for you, at least to the point of making revenue from a great many countries. Complying with the local laws (including paying taxes) is indeed where it gets hard.


Many ToS often contain disclaimers all over the place to the effect of "this doesn't apply in cities/states/provinces/countries where the law says otherwise" and catch-alls to that effect. Sometimes that's enough, sometimes not; if the company still tries to enforce rules in a region where they are invalid then the ToS is simply even more meaningless.


> Isn't it kind of implied that if you own a platform, you retain the discretion to remove stuff at any time, unless otherwise prohibited?

This is a discussion about Germany making it clear to Google what they're prohibited from removing.


I also wonder how grey the area of hate speech law is in some countries; YouTube might possibly have been found guilty of that if they had left the video up. Certainly the comments section would have been a light to the moths.


That is also extremely hypocritical. Youtube is expected to have clearer TOS than the goddamned government?


They probably do.

Powerful enough governments can just choose to ignore or re-interpret things any way they like and act on them, though.


The original meaning of "...the pen is mightier than the sword."

The clause in the sentence before that part reads "Beneath the rule of men entirely great,..." It was referring to authoritarian government.


>Powerful enough governments can just choose to ignore or re-interpret things any way they like and act on them, though.

Seems like a common trend in many countries around the world. Tech companies tend to be american, so they're a popular punching bag/scapegoat when it comes to enforcement actions.


> YouTube unsuccessfully argued the video violated its policies on Covid-19 “misinformation.”

It sounds like YouTube's stance is that the video may cause harm to their users; one of their criteria for removal.


> may cause harm

Literally anything can be argued to be "harmful."


As an example, some people consider GOTO statements to be harmful.


Or more precisely: some people don't. /s


Actually that's a good analogy.

If people weren't blindly accepting it as the truth, and instead verified everything, this wouldn't be bad; in fact, it would be good, because it could draw attention to something that was perhaps being ignored.

Unfortunately, a large part of the population has no idea how to do fact-checking, and what's worse, many get their news from Facebook.


I imagine YouTube removing C tutorials because they "encourage harmful behaviors"


Well at least the government isn't mandating that all "harmful" content be taken off the web...

https://www.theguardian.com/technology/2020/dec/15/online-ha...

(For what it's worth, the proposed law now refers to "safety" rather than "harms").


>Germany

>UK


Maybe that's why their challenge failed.


> the video may cause harm

Or let me try: is it “hateful”?

Yeah. That about clears it up, I guess. Clear as mud.


A knife can do even more harm to the user, but knives are still sold. "Can do harm" is a slippery slope.


That appears to be largely what Germany thinks as well.


A recipe for chilli sauce may cause harm. It's not a good argument.


Yeah that's probably why it lost.


A good move. Anything as big as YouTube should follow national laws. Otherwise, private terms of service become more a part of society than national laws are. Then hundreds of years of fighting come to nothing, and we are back in a modern kingdom-based society.


What does "national law" mean for a global platform though? Does Youtube only have to restore the video in Germany and still block it everywhere else?

I don't even understand why Youtube should be forced by a country to host someone's content. I can understand the opposite (being forced by the government to take down content in their jurisdiction, that has been done before), but forcing a site to keep a video feels really strange to me.


> I don't even understand why Youtube should be forced by a country to host someone's content.

It doesn't appear to be settled even in U.S. law whether requiring a company to host something would be governmental overreach. The traditional "common carrier" doctrine is pretty broad. It's not an exact analogy for the kind of regulation some people would like governments to undertake regarding YouTube, Twitter, Facebook, etc., but it's not totally unrelated either.

Eugene Volokh from UCLA's law school recently posted a 79-page draft article on the legalities of regulating social media companies as common carriers under U.S. federal law: https://www2.law.ucla.edu/volokh/carrier.pdf. One large section, II.A., "The General Constitutionality of Compelled Hosting" (pp. 35-58), argues that compelled hosting isn't in general prohibited by the U.S. federal constitution, so in his view the U.S. Congress could pass a common-carrier-style law mandating such companies host content on a neutral basis, if they wanted to (at least if the law avoids various issues he identifies).


YouTube is effectively operating a bridge, or rather the majority of bridges, in Germany. It pays for maintenance but gets to charge a toll and keep the profits. However, in exchange for this privilege, it does not get to decide who is allowed to cross.


It seems that some people disagree with the analogy. Could you explain why?


What if it is an authoritarian government that enacts national laws for the purpose of censorship?


Companies themselves are authoritarian organizations that enact international policies for the purpose of extracting wealth from a local population. Countries that have skin in the game for their own geography are the least of your worries, they are optimizing locally. Meddlers without skin in the game are more problematic.


When did YouTube, or any big tech company, not comply with local laws in authoritarian countries? Never. Their position is not one of activism; it's "I'm OK with that."

https://www.engadget.com/facebook-turkey-emails-200407588.ht...?


They already do, and Google complies with those laws. See China. It is quite telling that Google seems to put more effort into complying with the CCP than German law, however.


Is that really what we want? Most speech is allowed under US law (which is a good thing imo), but does YouTube really need to allow hate speech, brigading, or astroturfing?

Clearly the line is not 'all speech that is allowed in the US'?


The question is not which kind of speech we want to allow, but who gets to decide that. We don't want the government to decide, because its decision is absolute and leaves no room for escape; hence the laws protecting free speech. But the German court effectively argues that on a platform as dominant as YouTube, censorship has the same impact as government censorship and should therefore be treated the same.


Why should YouTube be allowed to dictate what hate speech means? In the country I am in, the party that must not be named is notorious for making hate speeches, and the videos are widely available on YouTube. YouTube is a private entity that has positioned itself as a huge repository of public opinions and videos; it must not decide arbitrarily, based on a stupid algorithm, what is correct. They must be regulated.


It would be interesting to apply a varying degree of regulation based on an organization's size/revenue.

Startups less regulation, global conglomerates more regulation.


I've tried to find the actual court decision, best match seems to be 4 W 118/21 (enter on https://www.justiz.sachsen.de/esamosplus/pages/index.aspx under "Aktenzeichen" & click "suchen", can't see any obvious way to link the PDF...)

However, that document does not actually order a fine. It warns about imposing one ("up to 250'000€") if YouTube doesn't remedy the problem. Considering it's now exactly 3 months later (the decision is dated April 13th), that lines up.

From what I understand (not a lawyer, yadda yadda) this is a purely civil matter and doesn't really go as far as some statement on free speech. The issue discussed mostly seems to revolve around which terms of service are valid and how they can be changed.


Youtube Alternatives :

Centralized : Dailymotion, Bitchute, Rumble, DTube, Vimeo, Vidlii

Decentralized : Odysee(LBRY), Peertube


How does one contribute to these (de)centralized alternatives? Apart from sharing links?


You can upload videos and help seed videos too.

Other ways to help are here : https://lbry.tech/contribute


> Bitchute

Never heard of them, but they seem to have chosen a poor name that might be censored elsewhere due to the first five letters.


I don't know if it still works like this, but the initial idea with that one was that they would use the BitTorrent protocol to help out with their hosting bandwidth: any video you're currently watching would be transferred peer-to-peer to other users.


All the alternatives are considered far-far-right as far as "trusted sources" are concerned. What is anyone supposed to do if even LBRY is considered a tool of radical right-wing extremists? If you can't get your family and friends on it, then it's nearly worthless.


If they're that prejudiced against tools based on political reputation, then they weren't that interested in using the tool in the first place.


What's the law they ran afoul of here?

I'm getting the impression from the Google-Translated version of this[1] article that they violated a contract with the company?

[1]: https://www.welt.de/politik/deutschland/article232421961/OLG...


If I understand correctly, the fine was imposed because YouTube failed to comply with the court's order to make the video accessible again, which took them several weeks (April 20 to May 14, 2020).

What confuses me is that the article states the video was removed in January, which would only make sense if it was January 2021, because in January 2020 there were no restrictions in Switzerland. Maybe the above date should be May 2021.

I think the relevant piece from the Welt article is this:

> Es [the court] kam unter anderem zu dem Schluss, die geänderten Richtlinien seien nicht wirksam in den Vertrag mit dem Accountbetreiber einbezogen worden. Hierzu sei ein Änderungsvertrag erforderlich. Der bloße Hinweis, dass es künftig Änderungen geben könne, genüge nicht.

Roughly translated: the changed ToS of YouTube were not contractually effective, because the user was not asked to accept them. Stating that changes can happen at any time is not enough.


It should be about something other than the ToS; the quote is talking about a contract. The news coverage is not sufficient to understand what supposedly happened here. Is it common for a Youtuber to have a contract with YouTube? Maybe the court judged the partner program agreements as one?

That "future changes may apply without notice" is invalid in Germany, that's evident. Companies writing bullshit like that into AGBs (ToS) or contracts don't realize Germany is not the US; they need a new legal team.

If the news is correct, you are right that the fine is about ignoring the court, not about removing the video in the first place. With a fine that high it has to be about that; there is no way removing a regular video could cause this. Basically, the court saw YouTube as being in contempt of court, kinda.

Might also be politically charged though, East Germany is highly penetrated by Nazis and with a high percentage of corona deniers. Depends on the judge of course whether that was a factor here.


Yeah, it's pretty clear the fine is about not obeying the court.

But I was asking why the court was forcing them to re-instate a video in the first place.


Right, that part makes no sense at all to me. Even if the ToS did not include a fitting clause for the misinformation removal, I see no binding right for the user to have his video hosted. That's why I added the tangent about the political environment in Dresden.


The ToS provide the binding right for the user to have the video hosted. It's in the name "terms of service": you follow the terms and in return you're provided with a service. In this case, the plaintiff complied with the terms but was denied service. The court didn't approve of that. I doubt it has anything to do with their political alignment.


You're probably right. I assumed that there was a clause in the AGB permitting YouTube, in general, not to host a video if they don't want to. Or that there was something creating a Hausrecht in there, giving them the right to choose their customers. Doesn't seem to be the case.


> What's the law they ran afoul of here?

None. They violated their civil contract with the user AND failed to fix it within 3 months. cf. toplevel post: https://news.ycombinator.com/item?id=27838603


It looks like the issue was the amount of time that Google took to comply. Per the article:

> “With the historically high fine, the Higher Regional Court makes it very clear that court decisions must be observed without restriction, regardless of whether YouTube assumes a violation of its guidelines or not,”


Naval Ravikant seems to be a divisive figure here but I think he made a profound point when he discussed this topic on Joe Rogan's podcast.

The point was that, as a platform, you cross a line that is very difficult to ever step back over when you optionally censor your platform in response to political pressure, because both sides are then forced, from a game-theory perspective, to aggressively pursue censoring your platform in their favor lest the other side do it first. The platform ends up being collateral damage in some political war it never cared about in the first place, and it can never win, because no matter whom it bends to, it automatically enrages the other side.

This seems like where YouTube and Twitter and Facebook are at now. They caved to censorship demands that seemed reasonable, but now anyone can make them censor anything as long as they have some political power somewhere. It's not going to end well for them. Their only winning move was to stay neutral and do nothing but comply with legal requests.


>Their only winning move was to stay neutral and do nothing but comply with legal requests.

No? This narrative is a bit ahistorical, and it's fueled by the idea that a hyperliberal boogeyman started censoring everything.

If you look at what happened in 2017, Google started "censoring" things because advertisers threatened to boycott. The platforms couldn't stay neutral because advertisers became more and more concerned with staying out of any potential scandal.


Google's response to pressure from advertisers wasn't to censor anything. They just demonetized videos that the advertisers didn't want to be associated with. Hosting someone's video for free without even interrupting it with ads seems like the opposite of censorship to me. There may be something interesting to say about how the desires of advertisers shape our discourse, but it's not censorship (in this case, at least).


Google doesn't just demonetize videos; they make videos impossible to find by unlisting them from search results[1] and shadowbanning creators:

> One of the videos that had been restricted was a trailer for one of his short films; another was an It Gets Better video aimed at LGBTQ youth. Sam had been shadow-banned, meaning that users couldn’t search for it on YouTube. None of the videos were sexually explicit or profane.

> ... five YouTube channels alleged that the platform had unfairly targeted LGBTQ content creators with practices similar to those described by Bardo: demonetizing videos, placing them in restricted mode without warning, and hiding them from search results.

[1] https://www.rollingstone.com/culture/culture-features/lgbtq-...


> Hosting someone's video for free without even interrupting it with ads seems like the opposite of censorship to me.

Doesn't demonetization on YT just mean the ads still run but the money doesn't flow to the creator anymore? Considering YouTube does it that way with copyright claims and also automatically added ads to previously ad-free videos just because they could, it would surprise me if they'd remove video ads by themselves.


No that is from copyright claims and the holder's desire.


No, because the whole issue is advertisers refuse to have their ads shown with certain types of videos. The ads go away. Johnson and Johnson doesn't want their ads shown next to Nazi propaganda, so demonetization pulls advertising entirely.


Doesn't matter; they still framed this as censorship (claiming YouTube cut funds to kill them).

I even saw cases (mostly on Facebook, but it was also done on YouTube) where the channel/fan page purposefully switched videos to private while claiming they were being censored.


> purposefully switching videos to private

Sometimes channels will do this because they have "strikes" shown on their creator pages, so they hide their videos to avoid having so many strikes they get shut down. Some science and engineering channels I watch have run into this problem.


So now they are demonetizing videos that want to be monetized and vice versa. Time for YouTube's demise is long overdue. I hope Peertube, decentralization and finger to censorship is the future.


They both censored and demonetized. You're going to have a much more difficult time trying to find Elsagate videos on YouTube these days.


They (Twitter, Facebook, Youtube, even Reddit) act so very much like publishers that it seems to me the mistake was granting "platform" protections to sites that: claim broad rights over posted content; use complex logic to decide what to promote (that they promote anything is alarming, for a "platform"!) and what a visitor sees while also hosting and distributing that content they're highlighting or promoting (which makes them distinct from some rando's best-of lists on their personal website, linking to content hosted elsewhere); place ads alongside content but sometimes choose not to; and, at times, engage in revenue sharing. And that's before we even get into the censorship.


The whole point of section 230 was to make imperfect moderation legal. It wasn't enacted because some random personal sites got sued. It got enacted because some very large companies got sued and congress thought the results of the court cases were illogical.

It sounds like it's doing its job.


This "platform vs publisher" thing is a myth based on a misunderstanding of how Section 230 works. The law doesn't actually distinguish between the two.


It’s an international issue that not only U.S. law applies to. And even if it did, it’s still worth discussing.


Perhaps the most relevant distinction is between "moderation that the users have a choice about" versus "moderation that the users don't have a choice about".

If a site wants to hide all posts/videos that promote some unpopular political belief, or use offensive words, then implementing that censorship as the default user experience is perfectly acceptable, as long as users can choose to opt out of that censorship.

There might be multiple reasons why a given post/video could be censored, and perhaps there is a small burden on sites to tag every single reason rather than mark it for censorship at the first excuse, but I think that a lot of the tagging work could be made the responsibility of the user who uploaded it.

Such a system would hopefully make moot the slightly disingenuous argument that "If sites can't ban political opinions I don't like then they also won't be able to ban spam". Obviously sites would be allowed to put neutral resource limits on users, to prevent DoS attacks.


It distinguishes between something and a publisher, in that it says whatever-you-want-to-call-that-something can't be treated as the publisher (it uses that word) of information it's distributing.


It distinguishes between the person who uploaded the video and YouTube hosting the video. How YouTube exerts editorial control to promote some videos or delete others is not relevant. This is a good rule. HN couldn’t possibly exist without it.

There’s been a lot of really bad information on 230 from people who ought to know better.


I got mine from the text of the law. It does what you say, and also what I say. It definitely does distinguish between a publisher and a service provider (host, platform, whatever). It does so explicitly.


The law is written by people, so it frankly doesn't matter what it says currently because it can (and probably will) change.


I mean, sure, but parent comment was implying platforms have special legal protections currently, which isn’t really the case.


Unless you know of a way to do moderation 100%, liability for user-generated content is infeasible.

The fulfillment of this fantasy of forcing platforms to abandon their efforts will just lead to all of social media degenerating into cesspits as they fill up with porn and swastikas and all normal people leave.


> Unless you know of a way to do moderation 100%, liability for user-generated content is infeasible.

I agree that highly-public social media anything like what we see now wouldn't work anymore.

I don't even necessarily think that we should kill 230, but I don't think you should be able to curate and promote content, and claim strong rights to posted content, and still enjoy its protections. Yes, this means "algorithm-curation" social media with broad public visibility of content and that claims significant ownership of posted content, would be in trouble. I think services like that should struggle to operate that way. Take ownership or don't, none of this pretending to be one thing while doing another stuff. That doesn't mean we have to crack down on web hosts or ISPs or email providers or anything like that, since they're not doing most of that stuff.


Those threats were never serious though, because YT is too big, but also because advertisers know they can advertise on certain subsets of content that is not offensive.


What about telling advertisers that if they want to boycott, go ahead, find another Youtube to advertise on. They make 15 billion a year in advertising and 3 billion on subscriptions. I'm willing to bet with that much advertising it would be very unlikely any advertising boycott could make a significant difference.


>What about telling advertisers that if they want to boycott, go ahead, find another Youtube to advertise on.

Yeah, it's called Facebook. Regardless, these weren't some no-names who were boycotting; it was pretty much whales like P&G and Coca-Cola who were complaining. (Just those two spend $8BN/year.) At the very least, having any of them pull out would have cratered at least one exec's bonus.


[flagged]


> What else do you call the complete removal of parler?

Failure to adhere to terms of service, or to express intent to adhere to them going forward.

Parler couldn't or wouldn't stop people from breaking their providers' terms of service and didn't show a good-faith effort towards doing so. Other right-wing outlets do do those things, and have not gotten drilled despite their odious beliefs.


at nomemerror (I cannot reply directly)

I don't understand why your comment is flagged and dead! You are making a perfectly reasonable point. HN admins, what is unacceptable about what is being stated - I genuinely do not see it.


If you start going on about "election fraud", as that poster did, I'll usually flag the post.

In this case, with my reply, I preferred to point out the obvious mischaracterization and leave addressing the mendacious falsehoods to others, but it looks like others decided to do something about it.


Is it not possible to talk about election fraud? I believe that this really has happened historically.

I would be fine to agree with you if you have proven that there is not election fraud, but obviously having a different opinion is not proof.


In the beginning, but now both sides see them as the enemy of the people. One side thinks they censor too much and the other thinks not enough. So YouTube is now caught in a political game of trying to appease whichever party is in power lest they get regulated. Hence why Trump stayed on Twitter until he was out of office.


The advertisers are bluffing when they threaten these boycotts. They're not going to stop advertising on google, facebook, twitter, etc., which these days probably represents a majority of their advertising budget and even more of their ROI. They'll make threats because it costs them nothing, but all the platforms need to do is say "no" and that would be the end of it. Even better: immediately ban the accounts of companies that make these threats. They won't receive any more of them.


They don't need to stop all advertising. They can simply stop advertising on Youtube and focus advertising on Facebook, or stop on Twitter and focus on Tik Tok.

They can pit the providers against each other. And that's basically how a free market works.


Yeah I don't think Facebook is going to ban Coca Cola


A few points:

1. Google definitely censors things on their own as well. Their own search engine turns up a long list of examples, so I won't rehash them all here, but one illustrative example is their censorship of the dissenter plug-in (https://reclaimthenet.org/google-chrome-web-store-bans-disse...), which also seems to be an example of censorship collusion within the tech industry.

2. Google has a long history of internal activism that is highly progressive, and regularly applies pressure on the company, and creates a culture of fear for employees who are either conservative, centrist, or even moderately left-leaning. The James Damore fiasco is a great example of the internal political culture rearing its head and how it impacts who's comfortable speaking up and steering the company's culture (https://www.inc.com/suzanne-lucas/google-fires-employee-for-...).

3. Why do you think advertisers became "more and more concerned"? It's because of left-leaning activist pressure from groups like Sleeping Giants who have made it their mission to organize activists and create a false sense of societal pressure on advertisers (https://en.wikipedia.org/wiki/Sleeping_Giants). It's the same as Google censoring things, because typically activist employees will draw attention internally to these activist campaigns, and try to alter the company's otherwise neutral stances. There's also a pipeline from internal activist employees to certain members of the press (like Geekwire) to try to use external pressure to move company stances.


>The James Damore fiasco is a great example of the internal political culture rearing its head and how it impacts who's comfortable speaking up and steering the company's culture

I'm willing to accept the premise that Google could have neoliberal pressure on the company (I don't know if I would consider the pressure you allude to to be progressive or left-leaning). That said, James Damore's memo, if you've read it, is not a good example of it, and I believe he was rightly exiled for it. The memo is poorly sourced and poorly argued. It reads like someone who doesn't understand what Dunning–Kruger is.


It was fully sourced with peer reviewed research, per my recollection. I also recall that the version of the memo circulated frequently on social media and in progressive news sources like Mother Jones was not the original, and in particular, omitted all the sourcing. Are you sure you didn’t see an edited version that was circulated specifically to malign Damore?

From Lee Jussim, a professor of social psychology at Rutgers University who was a Fellow and Consulting Scholar at the Center for Advanced Study in the Behavioral Sciences at Stanford University (https://quillette.com/2017/08/07/google-memo-four-scientists...):

> The author of the Google essay on issues related to diversity gets nearly all of the science and its implications exactly right.


And why did advertisers "threaten to boycott"? Due to pressure from "hyperliberals" that are perpetually offended by everything.

They tried to get YT to do it, and when YT ignored them they went after the money... It is right from the liberal playbook


The first Adpocalypse preceded that. Rather, it was part of an internal drive to focus on more "family-friendly" content. Iirc, it was actually the presence of Islamic State propaganda on the site that was part of the motivation.


>It is right from the liberal playbook

Yep the liberal playbook that was kicked off by an investigation by the progressive news outlet... The Times[1]... which is owned by Hyperliberal Billionaire... Rupert Murdoch.

Do you even bother to do a small amount of research into your biases? It blows my mind that people think the world is controlled by a couple of megalomaniacs on Liberal Twitter.

[1] https://www.thetimes.co.uk/article/youtube-hate-preachers-sh...


Hardly. The opposition was from flag-wearing jingoists upset at Islamic terrorist videos on YouTube.


Consumers objecting to content is as old as television, and even older, and is certainly not owned by "hyperliberals."

For example, Conservatives demanded radio stations stop playing the Beatles, and, only slightly more recently, the Dixie Chicks. They called up advertisers as well.

You could probably find people complaining to artists' patrons in Medieval texts, if you looked.


> only slightly more recently, the Dixie Chicks. They called up advertisers as well.

If you want recent examples, look at WAP, its Super Bowl and Grammy's performances, Lil Nas X, or the NFL and Colin Kaepernick. There's also the witch hunt and boycott on teachers, companies and anyone else they believe are part of a nationwide critical race theory conspiracy.


Anyone claiming that critical race theory is a "conspiracy" is either massively misinformed, or being actively deceptive in order to push that very ideology. The National Education Association, a union with about 2.3 million members, has not only affirmed the existence of it, but is actively pushing for its inclusion in schools[1][2]:

"Share and publicize, through existing channels, information already available on critical race theory (CRT)"

"Provide an already-created, in-depth, study that critiques empire, white supremacy, anti-Blackness, anti-Indigeneity, racism, patriarchy, cisheteropatriarchy, capitalism, ableism, anthropocentrism, and other forms of power and oppression at the intersections of our society, and that we oppose attempts to ban critical race theory and/or The 1619 Project."

"Commit President Becky Pringle to make public statements across all lines of media that support racial honesty in education including but not limited to critical race theory."

This isn't limited to universities, either - it's being taught in K-12 schools[3]:

"Responding to prompts such as “In the last year, I have learned _____ about race and racism,” and “One way I will work for racial equity in my work,” teachers say:

“American society makes it hard to have high hopes.” Racism infests the nation’s “entire fabric.” Everyone must “lean into the discomfort.” “Older millennials are disappointingly racist.” “Aspects of the anti racist movement have been co-opted by neoliberal corporations, and reactionarily [sic] opposed by many even mainstream conservative thinkers.” Racism is “layered into everything we do at school.” We must “share the harsh reality of the BIPOC and LBGTQI communities with our students.” “Discuss issues of equity as arising in most every book I teach.”

Oh, and the statement by the NEA was removed from its website shortly after[4] (https://ra.nea.org/business-item/2021-nbi-039/ now redirects to the homepage), which is only further evidence for the fact that many of those pushing this ideology are simultaneously attempting to gaslight and actively lie to their opponents in an attempt to convince the public that it doesn't exist.

I'm not sure why you used the phrase "witch hunt" when the ideology clearly exists and is actively being pushed in education around the United States. Perhaps you meant to use the phrase "accountability culture"?

[1] https://web.archive.org/web/20210702133611/https://ra.nea.or...

[2] https://nypost.com/2021/07/04/teachers-union-vows-to-fight-b...

[3] https://www.washingtonpost.com/opinions/2021/06/23/teacher-p...

[4] https://www.foxnews.com/politics/national-education-associat...


The vast majority of those quotes have nothing to do with critical race theory, and are simply anti-racism. The two are not remotely synonymous, the former being a minor academic framework for studying history.

This confusion was the deliberate work of a few conservative thinkers, who wanted an obscure "elitist" academic theory to use as a catch-all term for all anti-racism work.

https://www.nytimes.com/2021/07/13/opinion/critical-race-the...

https://www.newyorker.com/news/annals-of-inquiry/how-a-conse...


Well, this sure looks like a witch hunt[1] in which people were frothing with rage, becoming violent and getting arrested[2] at a school board meeting because they didn't get their witch/CRT when they went looking for one to burn.

According to that same mob, when they couldn't find the critical race theory at the school board meeting, everything they didn't like, along with diversity training, suddenly became critical race theory[3]:

> While critical race theory was not on the agenda, parents and community members accused the school district of requiring teachers to take a diversity training that discusses the concept and then teaching it to students. They also criticized the school board for proposing a policy that would allow gender-expansive or transgender students to use their chosen name and gender pronouns and use the restroom that corresponds with their asserted gender identity.

That sounds and looks like a witch hunt for teachers to me.

Also, this is where the conspiracy comes in. Everything conservatives don't like is critical race theory now, and that's by design[2]:

> Christopher Rufo, a prominent opponent of critical race theory, in March acknowledged intentionally using the term to describe a range of race-related topics and conjure a negative association.

> “We have successfully frozen their brand — ‘critical race theory’ — into the public conversation and are steadily driving up negative perceptions,” wrote Rufo, a senior fellow at the Manhattan Institute, a conservative think tank. “We will eventually turn it toxic, as we put all of the various cultural insanities under that brand category. The goal is to have the public read something crazy in the newspaper and immediately think ‘critical race theory.’”

[1] https://www.reuters.com/world/us/partisan-war-over-teaching-...

[2] https://www.washingtonpost.com/education/2021/05/29/critical...

[2] https://www.reuters.com/news/picture/pandemonium-at-virginia...

[3] https://www.cnn.com/2021/06/24/us/loudoun-county-school-boar...


It's too late to edit it now, but that last link should be a [4] and so should the reference number by the last quote.


If you look at people on an axis other than liberal/conservative, you find that the people that object are the same people regardless of their politics. They're the people that can't stand the idea of other people seeing reading/hearing/seeing things, that they themselves wouldn't want to. It's the people that are hurt and scared when they see behavior that is actually harmless to them, but maybe spotlights a difference they don't want to be aware of.

Saying that liberals or conservatives have this to a greater or worse extent is looking at it through the wrong lens.


Amen! Turns out people are complicated and brains/thoughts/ideas don't exist on a single purple spectrum from red to blue.


Nice what-aboutism... Yes, Conservatives used to be just as bad; I was out there in the 90's complaining about authoritarian conservatives.

We libertarians did not mind our flank, and authoritarian liberals today are about 10000000x more of an issue than even the most extreme bible-thumping conservative from the 90's ever was



Do you have a non-biased source that provides actual facts, not hyperbolic rhetoric?

Are these taxpayer-funded libraries, and could it be that the funding reduction is simply a result of the fact that many, including myself, believe that libraries should not be funded by forcible taxation of the population?

This position is often framed by left-leaning sources as "raaaaccciiism"


I was thinking historical examples would be an interesting context in part because I didn't think anyone needed reminding of all the times in recent days and weeks that conservatives have tried to boycott or cancel things, but others have provided some examples of that if you're interested.

And "what-aboutism" was exactly the point, I was specifically pointing out that boycotting based on morals isn't in any way only the domain of "hyperliberals."


The advertisers threatened to boycott because otherwise they would have risked being scrutinized and potentially making less profit.

What actually happened isn't that one side of the political spectrum "censored" the other side in some sort of targeted attack; it's that advertisers and private companies optimized for generating as much profit as possible by (obviously) pandering to the majority of potential customers.

Ironically.. such is the nature of capitalism.


> They caved to censoring things that were reasonable, but now anyone can make them censor anything as long as they have some poltical power somewhere.

Can you define "reasonable" here in an objective way that we could all agree on?

There's no such thing as neutrality. Leaving up a popular anti-vax video is just as political a decision as taking it down.


I'm subjectively inserting that comment. I think that a lot of it was reasonable. I'm not arguing that it was objectively reasonable. I'm writing my thoughts on the internet. I'm not obligated by the burden of proof to justify my subjective opinions.

>There's no such thing as neutrality. Leaving up a popular anti-vax video is just as political a decision as taking it down.

Nonsense. We don't apply this standard to anything else in life.


That's my point. There is no objectively reasonable way to do this. What you think is reasonable to take down and what I think is reasonable to take down are not going to always agree.

I'm not sure what "elsewhere in life" means, but bookstores choose what books they will sell while also not necessarily endorsing every book they carry.


> I refuse to take down anything that doesn't come with the force of law to take it down

and

> I refuse to take down this video, because I do not think it needs to be taken down

are two VERY different things. The first one is a political choice about taking down videos in general. The second one is a political choice about a specific topic. The second one is more or less similar to taking down the video because you think it needs to be taken down (same topic, different political decision)... the first one is absolutely nothing like that.

Edit: In case it wasn't clear (because I replied in the wrong place), this is the comment I was discussing

> There's no such thing as neutrality. Leaving up a popular anti-vax video is just as political a decision as taking it down.

While it may be "just as political", it's a political choice about a totally different thing.


I get what you're saying, but I just don't agree. They're only different in the abstract. At some point an incredibly noxious (but legal) video gets posted, complaints pour in, and you are either continuing to host it or you aren't. That's a decision and having a policy of "we never remove legal videos" doesn't absolve you.

Anyway, a social platform that only removes illegal content (not even spam?) sounds absolutely dreadful and I would not use it and no one would pay to advertise on it.


On a social platform where I'm only seeing content posted by my friends (or maybe content my friends interact with), anything dreadful I see is brought to me by my friends.

If my friends bring me dreadful content that I don't like, I'm not sure we'll stay friends. If my friends are spamming, I'll ask them to stop and if they continue, I'm not going to stay friends (or at least I'll unfollow them where they're spamming).

If people post garbage to my posts, I'll delete their garbage and restrict access to my posts.

There's no need for the social platform to do moderation until it starts putting unrelated people's content in front of me, which is something I don't really want from a social platform.


Have you read the HN guidelines? Most rules have nothing to do with legality, and @dang moderates HN on a best-effort basis to ensure quality discussion.

Presumably, you don’t notice you’re on a moderated platform, but HN is very much not an unmoderated free-for-all.


HN isn't really a social platform, IMHO. I'm not connecting with my friends here, I'm talking with all sorts of random people. I absolutely notice it's moderated, and I wouldn't be here if it wasn't.


If this is your “line in the sand”, why are we arguing about YouTube? I cannot believe you connect with your friends there…


I mean sure, but I don't see how that concept maps to YouTube.

It would be interesting to have a social platform where you can only see mutual connections. I imagine it'd have a hard time competing with email and group texts and all the other ways people who already know each other can stay in touch.


Well, in my head (which I'll admit doesn't really match reality), YouTube is a video hoster only, it's not social at all. If something I'm reading or someone I'm talking with links to a video, it's likely there. I might watch that video, but I certainly don't read the comments and I try not to look at the recommendations because both of those areas are a cesspool.

If I'm searching for a video, I try to do it in a web search, and likely it'll lead to YouTube, and hopefully it'll actually be useful content (or Rick Astley, I actually like that song).

Anyway, if I were YouTube, I would turn off comments everywhere, and review videos before including them in recommendations (which would leave a lot of videos out of that section) and probably have a lot lower views.


Unless it is the only book store in town, banning certain books would be problematic.


> Leaving up a popular anti-vax video is just as political a decision as taking it down.

Why is it political? They didn't commission the content or request that it be created and hosted on their platform in any way. They're offering the same reasonable self-hosting process that they offer to anyone who shows up with an email address.

It only seems to become political when you decide to take action and either protect or remove the material. You're now no longer a disinterested third party, you're making editorial decisions and it's hard to believe they've taken this step without considering the impact of those decisions.


The idea that the service is a neutral 3rd party who hosts any video is a political idea, in other words it is a choice made by platforms that both influences and is influenced by politics. Note, because people get confused on here, this doesn't mean that it's a bad idea.

Other publishing media do not have this standard. For example, the radio waves are another medium where the FCC (which regulates them) could say that only certain things are allowed, or they could say you can broadcast whatever you want. (In fact I think they regulate content, re: obscenity, but I don't want to look it up right now.)

So again, the idea of being only a neutral 3rd party who hosts videos for all-comers is a choice, which has political implications.


> For example, the radio waves are another medium where the FCC (which regulates them) could say that only certain things are allowed, or they could say you can broadcast whatever you want.

They can only do so because there is a limited number of them and users cannot share the space, so it must be licensed to be practically useful.

Also, the FCC cannot dictate to a station what it can and cannot air, the FCC can enforce _community standards_ of the community which is being served by that radio station. They're not in a position to go searching for violations and then act upon them, they merely respond to complaints from the communities themselves.

> So again, the idea of being only a neutral 3rd party who hosts videos for all-comers is a choice, which has political implications.

Yes, but the service clearly exists to make money.. not to make a political statement; which I agree may be incidental, but that shouldn't be the basis for interpreting their actions.


> Why is it political? They didn't commission the content or request that it be created and hosted on their platform in any way. They're offering the same reasonable self-hosting process that they offer to anyone who shows up with an email address.

And profiting from it. The scope changes slightly when you realize your business could get sued repeatedly because you promoted misinformation (which is how it would be spun) and someone died because they followed that misinformation.

This is risk, and few of these businesses want to tackle that risk apparently.

It's a testament to how well marketers over the decades have sold the idea that companies care about anything other than their shareholders that people mistake profit-driven motivation for political stances.


Platforms that claim to be open content systems (i.e it is not topic limited like say a Mac Forum) should only remove content that is illegal in nature, i.e Sexual Exploitation of minors, True Threats, etc.

> popular anti-vax

since Anti-Vax has now been redefined to include anyone that opposes government-mandated vaccinations, I am a Vaxxed Anti-Vaxxer, as I oppose all government mandates. People should be free to choose on their own if they want a vaccine or any other medical treatment.

So should a video of me expressing this position be removed under an "anti-vaxx" policy?


You are of course free to create a platform that only removes illegal content, just as Twitter or YouTube is free to remove content they don't like.

But I think you'd find two problems: determining what could be illegal is really hard (what's a "true threat" and what's a tasteless joke?), and also you'd end up with a community that looks a lot like 4chan or parler. Not a place I would choose to hang out.


> Not a place I would choose to hang out.

While that may be true.. do you not believe that the exceptional openness of either of those two platforms has an impact on places outside of them? Do you think there's no intangible benefits to you by these places merely existing?


The primary export of places like that has proven to be brigading, harassment, and bigotry in the form of memes, so...no, not really.

I have yet to hear a positive argument for their existence, and "well it collects the dirtbags!" is actually not one. I spent years tracking reactionary and fascist movements on the internet and how they interrelate and spread information; these sites more or less exist to do exactly that. The targeted harassment campaigns that target random people they've decided not to like--that's just "for the lulz".


[flagged]


The norming of bigotry should be recognized and acted against when it occurs. When that norming is done through (shitty) humor, it should be pointed out as such. Did you have a point to make?


> do you not believe that the exceptional openness of either of those two platforms has an impact on places outside of them?

Oh, there's definitely an impact.

> Do you think there's no intangible benefits to you by these places merely existing?

Yeah, the campaigns to harass a game emulator developer to suicide because "40% is a good start" are a fantastic benefit to society, obviously.


One wholesome thing 4chan did: Mr.Lashua's Birthday https://www.youtube.com/watch?v=4lgw7YUYgIc


What places? Like a hypothetical anti-censorship platform, or are you thinking of something that already exists like 4chan (if that even counts)?


Many well regarded distributed platforms are like that too (the web and blogosphere, email, mastodon, xmpp&matrix). The common feature is federated islands that can somewhat form filtered views of the global system.


I think we have proven time and time again that you can not actually create your own platform; if you try, then they come after your hosting providers, your network provider, your advertisers, your payment processors, and if that fails they find your family to go after their employers, etc etc etc

So the "just create your own platform" trope has been tried and has failed.


You would never want to use a big platform that had no moderation, the quality would be shit. For a small, niche platform, it's fine, but as platforms grow they get filled with 13 year-olds posting garbage, which someone has to clean up to avoid driving away other users.


See, for example, the Eternal September


I don’t agree that decent conversation can only exist with heavy moderation.

If you haven’t seen the issue with echo chambers by now… you’re in one.


Become a mod on a reasonably sized forum and see for yourself.


I do not want to, nor do I use any of the big platforms now... I have no twitter account, no facebook account, no insta account, no tiktok account, etc...

I have locals.com account, reddit accounts (which is diminishing amount of my usage as they continue to "mainstream" aka censor the site) , and HN, that is pretty much it for me


The quality decline from mainstreaming isn't due to censorship, it's due to the increased presence of normies and children. Meanwhile the heavy-handed censorship of subreddits like AskHistorians has resulted in the preservation of high quality standards.


> Platforms that claim to be open content systems...should only remove content that is illegal in nature

So if I were to continuously upload hours-long videos of digital noise, thousands of them, millions, YouTube ought to be obligated to host these ersatz videos, and not remove them unless they were deemed illegal? In that case, someone could in theory run a successful encrypted cloud backup business off YouTube's servers. Or just use YouTube as a massive versioning backup for their own personal data, confident in the knowledge that the files can never be deleted.


> There's no such thing as neutrality. Leaving up a popular anti-vax video is just as political a decision as taking it down.

A lot of the replies to your comment fixate on this, since they take issue with “everything is political”. But I think it’s very true, and agree with you.

Why should YouTube host anything and everything that random anonymous users decide to upload? Why is this /holy/ act of uploading deemed irreversible and irrevocable?

It’s a very weird way of thinking that it’s political only if the video is removed, but not if it is kept online.


> It’s a very weird way of thinking that it’s political only if the video is removed, but not if it is kept online.

Because keeping it up is the default, while you have to go out of your way to remove it.

If I own a store, is it as much of a political act to allow Trump supporters to shop there as it is to ban them?


If you own a store, do you have to sell something, just because it is manufactured by someone, or because you want to sell it?


Science isn't politics and the truth about scientific issues isn't political, but misinformation about issues in which there exists a scientific consensus is very definitely political.


> Leaving up a popular anti-vax video is just as political a decision as taking it down

That's a misconception, and basically what the whole trolley problem is about. Doing nothing is not the same as doing something. Each has its own ethical consequences, and it's not as easy as you think to spit out the "right" answer. Sure, you might have a preference that's difficult for others to argue against or for, but others also have the right to reach a different conclusion.

With that being said, may I ask what you know about vaccination? Are you using information from articles you've read online from "authorized" sources, or are you an expert on the matter? Again, my point here is not to say that you're wrong or right; you're absolutely free to reach whatever conclusion you desire. But it's really difficult even for actual experts in the medical field to know what's going on currently - who's motivated by altruism, greed, selfishness, or cronyism. Taking up these stances and pretending it's "science" is an insult to science, because science is ever evolving and there's no such thing as consensus in the scientific process.


True, once you start deleting videos, then leaving any up is a political decision. That's why, as a platform, you should not be deleting any videos, other than those that violate the law.


It's not always clear whether a particular video violates the law. If a video is in the legal gray area of, let's say, advocating violence against a particular person then should it be left online until a court specifically orders removal?


Yes, that seems reasonable.

It is like getting a warrant to search your house. It can be expedited if someone is in grave danger.


The law is also subjective; there is no fixed law. There is ever-updating case law and shifting precedent, and it differs at the county, state, and country level.


Not deleting videos is also a political decision.


"They caved to censoring things that were reasonable"

I'd argue that much of their censorship was not reasonable at all. None of it was illegal. Mostly just differences of opinion. My "information" is your "misinformation".


I think OP means they were censoring things that weren't objectionable (offensive, etc.) and were removing "reasonable" things that were opinions. Maybe I'm wrong, but that is where Twitter/Facebook/etc. have lost me; by censoring things that aren't harmful they've opened Pandora's box.


They've opened Pandora's box, because now every Joe and Karen realized they can weaponize censorship by being offended by everything.


Right. Nothing these platforms censor or began censoring early on was anywhere near reasonable. It was all political grandstanding to show those in power that big tech are the good guys. They’ll do the political bidding of the elites, and politicians will only threaten to hold companies responsible to gain points with their constituents. All the while, the politicians and big tech are in bed with each other, ensuring the information they don’t want shared isn’t shared.

People who think any platform has the right to censor information scare the shit out of me. Save for things that are illegal, every platform should be neutral. Moderation should algorithmically be tied to the law.


So users who troll and harass other users, spam, post hateful content, etc. should never be banned? That just leads to a precipitous decline in quality and drives other users away. Why would I want to spend my leisure time on a platform filled with 13-year-olds shouting racial slurs?


> Why would I want to spend my leisure time on a platform filled with 13-year-olds shouting racial slurs?

You wouldn't have to. And if they were the majority, then it would be democracy in action.


Let the people decide. HN deals with this well enough.


Comments can and do get removed for violating the rules here. As they should.


Mostly no, they should not be banned. You'll almost never see it. It's really not a big deal. It would need to rise to unlawfulness before considering censorship (slander, doxing, imminent threats of harm, etc.).


There's no avoiding this even if you don't "censor your platform in response to political pressure", because everyone assumes that is the case and the game is on...

It's not a case of deciding to:

>optionally censor your platform in response to political pressure

Because so many people assume that if some content is removed and it is <content I think should be permissible / like>, then it must have been because of political pressure, or views at that company, or something like that.

So by default:

> The platform ends up being collateral damage in some political war

Even if the content removed wasn't removed due to political pressure the accusations fly and the game begins.

Nobody is forced by a platform's actions to play the game; the sad truth is people will believe the game is on all the time, to explain anything they don't like / don't understand.

Rudy Giuliani thought that Twitter was 'allowing' someone to post content to his account without his permission ... but really he had posted a link to a domain that didn't exist, then someone registered and had some fun with him.

Rudy of course just filled in the blanks of his ignorance with concerns of bias by Twitter... and there's A LOT of people who do that (with all sorts of political views).

We had an article here on HN where up- and downvote counts changed on YouTube... it was immediately interpreted by some as some sort of political bias.

Have a platform? You're in the game...


Right this is the entire point about how crossing the line poisons the well.

Which is why they would have to be very clear that their policy is to only respond to legal orders for takedowns and nothing else.


I think the issue then is that the platform becomes ... unusable.

The scale of spam of all kinds would render the service useless. Even inaction would be interpreted as part of the game ... and you're in the game again.


You could ultimately be right that they took the better of the two options and that this path would have killed them. I personally disagree.

I think there is also more gray area here. They could have deprioritized things in the algorithm to encourage the direction of the community without explicitly deplatforming anyone or blocking/removing content. I think it's possible that they could have achieved a lot of what they wanted with this method while avoiding poisoning the well.


Isn't "political pressure" here really means "user pressure" ?

My understanding is that users of YouTube pressure YouTube to censor some things, YouTube wants to appeal to its user base, because customers are important for profit.

Now Germany says that maybe in their position censoring things for profit is not legal or something like that and fines them.


Valid point. Just pressure in general is probably more accurate. Once you bow to user pressure, you still open yourself up to political pressure. The political pressure probably comes masqueraded as user pressure anyway.


> The platform ends up being collateral damage in some political war they never cared about in the first place and they can never win, because no matter who they bend to, it automatically enrages the other side.

To me, the fact that they don't care about it is the problem. Those companies are having issues because they have no moral or ethical compass. It's become untenable for them to remain neutral because there is no neutral actor in this scenario. There's never going to be a service that's not held to a moral standard for their content regardless of what the law says.


You seem to assume that if they cared, they'd be on your side. What makes you think that's the case? They are first and foremost accountable to shareholders who want profit. It's unlikely that they view the world the same way you do. You probably don't want them throwing their weight around.

You can argue that you don't like that they are accountable to their shareholders but it is true.


No, I don't assume that. That's a very wrong assumption on your part.


But if you know in advance that their actions would be against your goals, why do you want them to be more active, not less?


I think it was inevitable that these platforms would start having content restrictions once they shifted to using machine-learning algorithms tuned to maximize each user's engagement.

For a significant fraction of users, the most engaging content will be psychologically manipulative - conspiracy theories, racism, political intrigue, and the like. I see the combination of an engagement machine and such content as ethically problematic, but I don't think they're going to turn off the extremely profitable engagement machine unless forced to, so they restrict content instead.

The trouble is the engagement machine is sophisticated, while the content restrictions are crude. This probably isn't sustainable in its current form, but there aren't easy answers to the problem. A sophisticated "bad" content suppression algorithm sounds pretty dystopian to me.


*Naval Ravikant (I thought there was something off with your version.)


Ha wow, I read that like 5 times and didn't catch that. Good catch. Fixed


How does “staying neutral” prevent people from pressuring you? Is there any doubt that YouTube et al have and always had the ability to remove anything they wanted from their platforms?


I don't agree with this kind of reasoning at all.

Staying neutral, or alternatively, trying to appear neutral will most likely in itself enrage.


Yeah, I remember that podcast, and I remember thinking that it sounds like it should be, but will it be so? Since then a little has happened to calm my doubts. Like, it wasn't Twitter that got bullied by Trump back when he was USA president, but very much otherwise. (You can say TikTok almost got bullied, but in the end it wasn't, and even then, it almost got bullied into something that some other tech-giants would love.) And while €100,000 might sound significant for some random citizen, is it for Google, really? Heck, is it even a nuisance?

(BTW, this is what I felt about many of Ravikant's statements: he sounds so wise, so believable, and I want to agree with him, because it fits my own philosophy so much, but as soon as I get distracted from the magic of his voice for a moment, I start doubting if all of it has any ground whatsoever or is it just make believe.)

Edit: However, I think I have to clarify that your interpretation isn't exactly how I remember his point. As I remember, the point was that openly censoring the content for any reason other than the court order is basically the moment where they lose the "plausible deniability" of being responsible for their content, i.e. they cannot be seen as "just a middleman" anymore (this is how they want to be seen). But it is close enough for the matter we are discussing right now.


> Their only winning move was to stay neutral and do nothing but comply with legal requests

They could have enforced their policies consistently across all accounts instead of their strong anti-right/anti-conservative bias. It's their selective enforcement that created issues for them.


It's pretty clear these platforms are not neutral.

Facebook literally set up vote drop boxes to maximize the number of votes in precincts that heavily favour Democrats in the last election. They also banned a sitting president. So why even bother pretending they are neutral, when the evidence is pretty clear that they set out to sway elections? You can watch the video of an upset Sergey Brin after Trump won in 2016.

https://www.cnn.com/videos/cnnmoney/2018/09/13/google-video-...

I don't have problem with them having a side. And acting honestly and openly or even making one sided campaign contributions.

I have a problem with a reinforcing circle, where they censor to help elect a government which then rewards their censorship with contracts. At that point they become state actors, censoring on behalf of the state, and I think this is the real issue. And I think we are pretty much there.


> They caved to censoring things that were reasonable, but now anyone can make them censor anything as long as they have some political power somewhere.

On some level, we have to acknowledge that whoever is running the shitshow at youtube and twitter are actually responsible as well, in that they believe they know enough to literally dictate what everyone in the world should see and read.

For example, banning Trump from Twitter is a great example of this. Don't get me wrong, I'm no fan of his politics (or Biden's, or any of them, same shit different smell I say), but who the hell do Twitter people think they are that they feel confident enough to ban him? Not talking in the sense of a private company here, sure it's their business and they can do whatever the heck they want, but does anyone really think that by banning him they stopped whatever ideas of his they think should not be encouraged? At best, this shows tremendous life inexperience, which I expect from kids who read a couple of books and think that they understand everything, the typical example of a tech company worker in a company like Twitter.


There is a ton of actual evidence that removing bad actors improves the quality of information on social networks:

https://www.axios.com/reddit-hate-speech-policies-reduction-...

https://www.vox.com/2021/1/16/22234971/trump-twitter-faceboo...


That's not what I said. To reiterate, what I'm saying is: shady ideas fester in the dark. Banning won't help society. Exposure and discussion do.

But the new and very different point that you're making, which is "removing bad actors improves the quality of information on social networks", is also very disputable and shaky.

- The first link amounts to "Reddit says that the rules Reddit made helped Reddit," which is not exactly impartial, so forgive me for mistrusting it.

- I felt bad opening a Vox link (since it's so very obviously biased, but whatever, I opened it for a laugh). They say: "misinformation slowed, the research indicates online discussion around the topics that motivated the Capitol riot has also diminished". If you don't see it (or if it doesn't happen online), that doesn't mean it diminished overall. The whole article is full of bias, honestly. Sad that you feel it's worthy enough to source.


The "marketplace of ideas" is a fiction that has long since been debunked. The best ideas arent the ones that win out, it's the loudest ones and the ones that appeal to our most base emotions that win. Only a small percentage of people actually possess intellectual curiosity. Most just mindlessly follow social trends.


> The "marketplace of ideas" is a fiction that has long since been debunked

I don't think it's been "debunked" as you say. Provide evidence. A few examples is not evidence to make such a blanket statement.

> The best ideas aren't the ones that win out, it's the loudest ones and the ones that appeal to our most base emotions that win.

Again, I very much disagree with this. If you look short term that may be true, but long term, historically speaking at least, that has not been the case.


Oh really? Then why is the number of people who believe the earth is flat on the rise? Why do people still believe in ancient mythologies that contradict scientific knowledge that has been available for centuries?

Maybe on the scale of millennia, ideas based in truth are more likely to dominate, but I don't see how you could make that claim about history when our current paradigm of empirical knowledge is only a few centuries old, and already it seems as though cracks are starting to form.


> Then why is the number of people who believe the earth is flat on the rise?

Why do you say it's on the rise? Have you considered that there are a lot more people, or that the internet and various "communities" on the web just gave extra amplification to all sorts of ideas? This is exactly why nothing should be banned. If you want wrong ideas to be corrected, you let them be discussed in the open. If you start pulling down videos that talk about flat earth, you're gonna end up grouping all the people who think like that into a community where they only get exposed to ideas that affirm their erroneous belief.

> and already it seems as though cracks are starting to form.

again, provide examples or evidence for how and where you see this.


>you're gonna end up with grouping all people who think like that in a community

That's exactly what not banning them is doing! Not that I think flat earth content should be banned, as it seems to be mostly harmless. Before the internet, people who believed in fringe conspiracy theories didn't have a good way to coordinate and group together at scale.

Hell, even in the mid-2000s, after the internet had been around for a while, fringe communities tended to self-segregate in their own forums. They grouped together and formed echo chambers, yes, but they were also insulated from broader society, and therefore had little ability to acquire new converts. It was recommendation algorithms that popularized fringe ideas, by pushing them to bigger audiences that otherwise never would have been exposed to them.

I think that's really the core of the problem, recommendation algorithms. The algorithms don't know how reliable or accurate the content they push is, they just push whatever the machine learning model predicts will keep the user engaged. I would much rather group the people at the fringes into online ghettos than have them roaming the broader web and spewing their nonsense to anyone the algorithm recognizes to be vulnerable to conspiratorial thinking.

As for examples of cracks in the paradigm of respect for empirical knowledge, I don't think you have to look far. The rise of political extremism has resulted in more people on the far right and far left, ideologies that are hostile to the notions of nuance and cool-headed reasoning, and thrive on emotionally driven messaging. More than 15% of Americans believe in QAnon, and nearly a third believe in the election conspiracy.

Meanwhile on the left, you have rhetoric that is increasingly hostile to data-driven approaches to problems, instead preferring "lived experience", ie anecdotes. You have cases like David Shor's firing for daring to tweet a study from a respected scholar that appeared to challenge the zeitgeist at the time. Many of these people are highly educated, or even academics themselves.

And that's to say nothing of regular old snake oil that has nothing to do with politics. Essential oils, healing crystals, you name it. Misinformation is on the rise, and most of the population is not equipped to deal with it. Maybe this is a temporary growing pain, maybe not. I don't think there's any way to know for sure right now. But we do know that for most of human history we have lived in the darkness, so it wouldn't be terribly surprising if we end up returning to it.


practicing the 'marketplace of ideas' led to the single most important idea/practice/accomplishment in history - the constitution of knowledge. just read jonathan rauch's new book. sure there are threats to this way of learning and understanding, and it isn't perfect. but it's the best we have, and it has not been debunked


To clarify, I believe that the marketplace of ideas is still a useful principle, and I absolutely am worried about censorship in certain domains. However, like any market, the marketplace of ideas is vulnerable to market failures. The idea that everywhere should be a marketplace of ideas is deeply unwise in my view. There should be areas where people can discuss any ideas freely, but I think it is foolish to deny that information can be dangerous in some contexts. Look no further than the needless deaths caused by anti-vaxx conspiracy theories. To have a healthy marketplace of ideas, you need some degree of structure.


> To reiterate, what I'm saying is: Shady ideas fester in the dark. Banning won't help society. Exposure and discussion does

Are you sure? That hasn't helped with flat earthers or anti-vaxxers (of the vaccines-cause-autism camp), so why do you think it's true?


The thing about Twitter is that they had rules that Trump was breaking regularly, and they invented an entirely new "world leaders policy"[0] and "public interest policy"[1] to justify not banning him. They even exempted him from DMCA repeat-infringer policies[2] when music artists started taking down his tweets with their music in them. Had Twitter not intervened on January 9th, Trump would have gotten banned anyway when he left office, purely because of him no longer being an elected politician.

My personal opinion is that even if the thing that gets banned or removed finds an audience (and it will), it's still a good idea for any platform - from a tiny webforum to a billions-strong social network - to have speech rules that are fair and reasonably enforced. It's not fun to be on a platform where a handful of celebrities and politicians (incl. Trump) are regularly flouting rules you have to follow.

Furthermore, I'd much rather be in a community in which the loudest and most obnoxious users are shown the door, purely for my own sanity. Free-for-alls stop being fun when they outgrow their original userbase.

[0] https://blog.twitter.com/en_us/topics/company/2019/worldlead...

[1] https://help.twitter.com/en/rules-and-policies/public-intere...

[2] https://torrentfreak.com/could-trumps-twitter-account-be-dmc...


Facebook is also censoring as misinformation this article, published by the American Medical Association, about the effects of face masks on children. Now the findings might be right or they might be wrong; we won't know until the experiment is reproduced. But it's bizarre and inappropriate for Facebook to "fact check" legitimate scientific research when the actual facts aren't even known yet.

Walach H, Weikl R, Prentice J, et al. Experimental Assessment of Carbon Dioxide Content in Inhaled Air With or Without Face Masks in Healthy Children: A Randomized Clinical Trial. JAMA Pediatr. Published online June 30, 2021.

https://jamanetwork.com/journals/jamapediatrics/fullarticle/...


That might be because those authors are regular purveyors of misinformation:

https://www.proquest.com/openview/2bb682ad7b07de8051e51873c4...


I was with you until the very end, when I feel like you threw away the argument by seemingly being open to "legal requests"; the "political wars" you get dragged into tend to be from politicians that are certainly making "legal requests": hell, if they aren't legal, they change the law!

I feel like the core problem in many cases was building a system where someone gets to be the arbiter of content in the first place: Apple claims they take software down due to "legal requests" by/in various countries (though this is a lie, as they also do it anti-competitively or for market-perception reasons, but even taking them at their word here), and yet they themselves created the ability to do that, since somehow Android devices sold in the same jurisdictions support installing arbitrary software without issue.

Platforms like Twitter and YouTube make their problems worse by recommending content--which is entirely their editorial decision and should be seen as such: any benefits or costs, moral or legal, should fall squarely on their shoulders--and then conflating that with having the ability to publish, but they would be in a much more morally (and often legally) defensible position if they simply didn't actively recommend content they disliked but still let it be found by people who actively followed the publisher.

Regardless, as usual, I will now link to my heavily-cited talk (every single slide is inherently a citation from a reputable news source) that I gave in 2017 (maybe 2018? the video was unlisted and I only just recently made it public as someone noted I had never done that, and now the upload date is weirdly set to last week ;P) at Mozilla Privacy Lab--"That's How You Get a Dystopia"--wherein I push hard at the idea that centralized systems and the arbiters they empower are the core problem and never work out, even if you like some of their decisions for some of the time.

https://youtu.be/vsazo-Gs7ms


If a website does not honour legal requests, it'll get blocked, so its entire content will effectively be removed for all citizens.

The proper approach IMO is to honour legal requests, but provide transparency and display some information about blocked content, like who made the request to block it and so on. Of course content must only be blocked for requests originating from that specific country.


But that doesn't solve the problem, as the "political wars" the person I was responding to described all work in the world of increasingly tight laws; you have to avoid being a target in the first place by correctly navigating the space of solutions for how content is distributed at all. Take the Apple iPhone example: if Apple is forced to remove an app from their app store by China, it doesn't matter if the user can install it on the side... and somehow China has not managed to force companies to not provide that functionality. Apple going out of their way to centralize both app distribution and even API access (VPN functionality, as a specific example, is hidden behind an entitlement that they don't give to casual developers) is work they chose to do that directly enables the ability to make "legal requests" quickly and easily.

I maintain the moral equivalent of this for platforms like Twitter and YouTube is to stop conflating their centralized recommendation systems (which should be considered "their speech") with the functionality to publish at all (which should be considered "someone else's speech"): it is going to be way less controversial to stop recommending something that someone else likes if it is still accessible to people who know about it (as they externally discovered the content or author and were directly linked to it/them), and it is also going to be way less controversial to allow people to publish something that someone else doesn't like to their own audience if it isn't being actively recommended to third parties. These recommendation systems have started to conflate "being able to say what you want" with "being able to be granted a large audience" so well that the feature set for moderation fails to separate them, and that's bad for everyone: there is a reason why we talk about "the right to free speech" instead of "the right to be heard".


> If a website does not honour legal request, it'll get blocked

If that is happening, then the architecture of the internet needs to take more evolutionary steps.

If the state can censor the internet, there's nothing 'inter' about it. It's just a network, subject to the whims of its flailing predecessor.


> If that is happening

Yes, that has been happening for many years. China is the best-known state, but AFAIK even well-recognized first-world democratic countries like the UK block some websites.

> then the architecture of the internet needs to take more evolutionary steps.

The adoption of IPv6 is a good example of how the architecture of the Internet is pretty much set in stone at this point. We can put more layers on top of it, like the Tor network, but the underlying protocols are still IPv4/IPv6, with enough meta-information to allow efficient blocking of protocols or resources.

I have some hope that new TLS standards with hostname encryption (ECH), along with CDN networks, will make blocking impossible. But even that is easily circumvented with a government MITM. Kazakhstan has already deployed all the necessary hardware and done some successful tests at scale. Browsers blocked its root certificate, but will they block an (imaginary) China root certificate, losing 1.5B users?


> "...centralized systems and the arbiters they empower are the core problem and never work out, even if you like some of their decisions for some of the time."

centralization is merely a necessary but not sufficient ingredient for dystopianism. you also need a ratcheting consolidation of power, especially money (provided by advertisers in this case) and attention/influence (provided by viewers). centralization is simply one of those (key) ratchets that can be leveraged to further consolidate power for the benefit of directors and executives.

relatedly, the american constitution is an experiment in crafting a centralized system with checks and balances stable enough to withstand assaults of power consolidation. the jury is still out, but it's looking somewhat bleak at the moment, given a runaway executive branch fueled by an unhinged fed/central bank. we seem to be stomping on the gas pedal even as the brick wall looms ahead.

in any case, i've long been an advocate of right-sizing organizations of all sorts, especially governments and companies. we've concretely learned over the past many decades that the negatives of large (and small, but those tend to self-regulate away) entities eventually far outweigh the benefits, and are better substituted by a more diverse and specialized collection of medium-sized ones.


It is not merely a "key" one, though, as it is (as you admit) a "necessary" one: we see centralized systems abused for reasons both lofty and petty by organizations both large and small; but, if you don't have (or at least make obvious... like, "come on", right?! these companies seem to have a giant "come at us" sign they wave around) some centralized chokepoint, then the problem disappears. At best, I would argue that these other influences you list are themselves centralizations that attempt to push platforms to be more centralized (whether for political, monetary, or whatever else gain). (Also, ancillary point: I would argue the explicit goal of the design of the US government is that it isn't centralized... those checks and balances come from how there isn't a centralized chokepoint of power; we are at our worst when we allow people to--when times seem good--undermine this decentralization by consolidating more power in one of the branches over the others.)


i think we're disagreeing on whether any level of centralization is acceptable, or even good (with centralization defined principally as the consolidation of people and resources here). i'd contend that yes, some degree of centralization is beneficial, but too much centralization gets overwhelmed by negative repercussions.

in this specific case, imagine if we had thousands of mini-youtubes, each with their own curational quirks. no single mini-youtube could unduly influence the whole zeitgeist. most (all?) will be flawed in their curatorial duties, but none could move public opinion in any meaningful way. viewers could also jump from one to another and be exposed to many different editorial perspectives, even without necessarily being cognizant of that. however, if every mini-youtube were relegated to being a single person each (i.e., no centralization), then we'd lose the benefits of aggregation and curation.

the problem of course, is that we societally accept size and growth as good things, and i'm arguing that they're only good to a certain point (and see your argument as saying all centralization is bad), and that we need to change incentives as a function of size/centralization, so that we get right-sized organizations, rather than unaccountable behemoths. our current incentives make right-sizing an unstable equilibrium point on that curve, which is why it doesn't happen.


I think you are mistakenly conflating government official request with legal request. Maybe I would have been better off to use "legal demand".

A senator can ask you to do something, but the fact that they are a senator does not legally obligate you to comply.

But if the DoJ sends you a demand (I'm not a lawyer, I'm just making up details here) let's say it's signed by a judge or something, what are you gonna do, say no?

There are also cases, like child pornography, where you might preemptively remove stuff without any demand because you want to, but it would still be under the justification of clearly complying with a law, so you haven't poisoned the well by censoring anything.

And that's really the important factor in the end.

As everyone else has said, if you don't do that stuff, you are just going to get your domain and assets seized anyway.

