A lot of content creators have voiced concerns over demonetization and/or false positives regarding copyright violations but I've rarely heard anyone talk about collectively moving to a different platform without centralized control.
Governments already ban certain ways of promoting smoking and alcohol use in an effort to reduce harm. Why should anti-vaccination material be treated differently when people not being vaccinated can be directly linked to the deaths of others?
https://publications.parliament.uk/pa/cm200304/cmbills/110/0... (Section 3)
It's not really a fair comparison, though. The main issue with anti-vaccination content is dangerous misinformation, not simply promoting something harmful. The other issue is that the people who fall for this content aren't usually the ones who get hurt; their innocent children are. It would only be a fair comparison if there were a large movement of very influential, powerful, and famous people posting "vegetables cause autism, and Burger King cures it" videos to hundreds of millions of viewers, and a boom in child obesity tied directly to parents viewing them.
How you encourage healthy eating and moderation is also a very complex topic in comparison. Vaccination is a much more binary choice (you vaccinate or you don't), and is highly effective.
Most content creators are frustrated by the lack of transparency and the often arbitrary nature of demonetisation, but they see it as essentially necessary to safeguard their livelihoods. YouTube didn't start demonetising videos apropos of nothing, they did it as a specific response to an advertiser boycott in 2017. To a great extent, demonetisation saved YouTube from economic failure.
Google could let the market solve this problem, but ultimately brands don't want to manage their own blacklists, they want to put their ads on platforms where they don't have to.
> they want to put their ads on platforms where they don't have to.
Are they going to change platform? To what?
The most likely immediate switch is not to a new platform but to a whitelist of the top 100 or 1k channels. Because of the Pareto principle, these channels get the majority of views anyway, and it's easier to manage a 100-1k whitelist than to blacklist every long-tail channel with content you don't like.
You could argue that this is no different to YouTube in the short run because they still get the ad dollars, but it's in their best interest in the long-run to make the long tail inventory viable.
First, censorship can be carried out by non-state actors. Second, they treat these videos differently to intentionally limit their reach and appeal.
"Start your own website" is a brush-off when faced with corporate cenosrhip. People have tried it. Daily Stormer had several domains pulled, Gab had payment processors and Google Play ban them.
There is a very real problem with corporations harassing people for views their management deems worthy of targeting, and people are dismissing it because they like the current targets. For now, anyway.
YouTube doesn't accept porn, for example, which is why people who want to share entirely legal videos of consenting adults having sex created their own platforms to do it. You never hear complaints from them about being censored, because they were not.
When did it become acceptable to go into someone's house, start shouting about some crazy religion, and then act surprised, crying censorship and oppression, when they finally ask you to leave?
"Freedom of expression" is just that, the freedom to voice your opinion. It's not the "Right of attention".
Furthermore, like any freedom there are limitations when your freedom starts endangering the freedom of others. We don't allow hate speech and death threats, so where does anti-vax fit in?
“I don’t believe that we can afford to take a neutral stance anymore. I don’t believe that we should optimize for neutrality.” -@jack
Glad to hear it. And, because you just changed your view on neutrality, I just changed mine on anti-trust, eminent domain and tech platforms.
It's true. These platforms don't owe users a freedom to voice their opinions.
But since they're making editorial decisions, we, as a society, don't owe them protection from libel laws, for example. And we should definitely look into the monopolistic practices of Visa and Mastercard, because those are much more serious than some microblogging site.
I don't see this as a "problem," per se. There are no guarantees in life that you will have unfettered access to services like this, particularly if you use them in a manner the providers deem against their terms.
We either need government regulations for these companies or they should be split up. This problem goes further than freedom of expression.
"You can't smoke in my house" "call all the papers I'm being censored!"
If you want to have a real discussion about it, you have to concede that there are degrees to everything. That is why some forms of what you define as censorship are illegal and some are not.
What YouTube is doing here is entirely supported by the constitution. There is not even a gray area, a maybe, or a second interpretation. Nothing.
YouTube is entirely within their rights, as is everyone else, to decide what they want to host, share, and promote on their platform, spending their own resources mind you, as hosting and transferring videos costs a lot of money. Especially, they have the right to decide with whom to enter into a financial contract, which is what they are doing here. I would argue that forcing them to pay or promote someone they are not legally required to is itself censorship.
Otherwise, if you can't or won't recognize that there are degrees to what you call "censorship" and that not all of it is the same, then there is no point in having the discussion.
I think the Anti-Vaccination movement is misleading and eventually harmful to public health.
That said, I think censorship, if it has to happen at all, should take place in a public office, not in a corporate boardroom.
Here is what Google should have done. If you google anything related to depression, Google suggests suicide prevention hotlines. If Facebook notices a popular article in your timeline, it follows it with a Snopes link to check whether it is real. In the same way, Google should place a banner/ad/link that educates people on the real facts concerning anti-vaccination claims.
At the end of the day, Google owns its platform and can do whatever it wants, but is this the right thing to do? Or better yet, is there a better way?
Here is the Oxford dictionary definition:
> The suppression or prohibition of any parts of books, films, news, etc. that are considered obscene, politically unacceptable, or a threat to security
Note there is nothing there about "rights".
There really are people who earn real money by scaring mothers. And Google is a big part of the problem with their "you will surely want more of the same" algorithms: a well-meaning mother googles one of these topics just once, sees one of the "anti" videos, and from then on gets "recommendations" from Google for more and more "anti" content, every time she looks for anything else.
The Google "recommendation algorithms" are the real and huge problem, not just their "monetization".
The same happens with other topics where many lies are involved, and actual harm ensues.
On the other hand, they also like to demonetize anything containing so much as a mild curse word or a discussion of anything too serious.