YouTube removes interview with professor of medicine on Covid stats and policy (unherd.com)
390 points by ppod 11 days ago | 550 comments





Looks alive and well to me...

https://youtu.be/uk2YZfnsOPg


> Pleased that Youtube has reinstated my interview. I appreciate all of the support. Nobody knows what's going to happen with this pandemic - debate is always a good thing! I think preparing for the worst, but hoping for the best is a sensible strategy.

https://twitter.com/ProfKarolSikora/status/12635183523471482...


So just guerrilla advertising? So weird that the video is embedded in the very article claiming it was banned.

Don't want to duplicate my comment but trying to prevent wrong conclusions. He tweeted a few hours ago that YouTube has reinstated his video:

https://news.ycombinator.com/item?id=23266336


I would have been really surprised if they actually removed it.

The points made in the video are mostly sensible. They acknowledge the many unknowns.

It more or less advocates for the Swedish model of dealing with this vs. most other countries.


Google deleted/censored a C-SPAN video of a Senator on the floor of the Senate[0].

[0] https://www.thegatewaypundit.com/2020/01/c-span-video-of-ran...


That article is about a Tweet being taken down, not about Google. Also that article says

> It is not clear whether C-SPAN or Twitter took down the video.


My bad, here's a better link. It's hard to find a "mainstream" source because most of the old media is ignoring a lot of newsworthy items these days.

https://www.breitbart.com/tech/2020/02/13/rand-paul-blasts-y...

It's widely known that Google/YouTube took down any videos where the name of the supposed whistleblower was mentioned, at least according to many regular content creators who had videos deleted or shadow-banned.


> Google/Youtube took down any videos where the name of the supposed whistleblower were being mentioned

Presumably because they didn't want to be exposed to the legal ramifications of broadcasting a crime?


Broadcasting a crime is illegal?

https://edition.cnn.com/2019/11/08/politics/legal-question-o...

> According to the Intelligence Identities Protection Act of 1982, the disclosure of a whistleblower's name could be a criminal offense if it is intentional, unauthorized and "the United States is taking affirmative measures to conceal the covert agent's foreign intelligence relationship to the United States."

But as many pages about the Whistleblower Act will tell you, this kind of thing is uncharted territory and presumably the exact kind of uncharted territory that gives expensive lawyers for large media companies extreme willies.


[flagged]


Killing the messenger doesn't change the message.

Think about why the only sources you can find are those which Wikipedia says

“The site has published a number of falsehoods, conspiracy theories, and intentionally misleading stories.”


Remember that Wikipedia is also heavily biased.

Yes, however Breitbart used to have a section called "Black Crime"

Link? Also, a major political party used to be pro-slavery, so acknowledging and removing mistakes should be considered a good thing.

Their automated system has a non-zero false-positive rate. It will from time to time accidentally take down legitimate videos, which they restore fairly quickly once you request a manual review, such as in this case.

Yet people still jump to conclusions and assume malicious intent, posting headlines which imply YouTube is intentionally removing these videos. Clickbait.


> which they restore fairly quickly once you request a manual review

Seems like an overly optimistic view, given the stories you hear about small publishers not standing a chance.


Yet people ignore all the cases where it was indeed recovered very quickly, as is the case here.

I've heard enough of these machine learning buzzword bingos every week when things like this happen. The thing is, I don't care. I just want things like this to never happen again.

This is akin to saying "I've heard enough of these NHTSA reports every week when car accidents happen. The thing is, I don't care. I just want things like this to never happen again."

A 0% error rate is a great aspirational target. But it's also important to acknowledge that a complex system may never be perfect. Insisting otherwise is ignoring reality.


Except that these takedowns aren't "accidents." It's more of a catch-all filter that's designed to proactively censor content that even slightly touches on certain matters.

Near zero (not actual 0%) is definitely achievable and within grasp. We just end up playing funding politics because we don't like the price-tag and liberty-curtailment associated with the value of human life and these things we're trying to solve.

Car accidents? Drunk driving? Murders? Dark-web illegal activities? Domestic violence? Those and many others are all solvable problems that we can get very close to zero instances. But right now we're quibbling and putting measures in place that don't get the numbers anywhere close to zero, they just "incrementally" reduce instances and we all applaud and find those meaningful whilst 99% solutions are ignored.


Then we should shut down video on the internet entirely, because 500 hours of video per minute is not something anyone can manually moderate. We would all love to live in a utopia where everything works perfectly and there's a solution to every problem, but that's not realistic.
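The scale argument can be made concrete with a quick back-of-the-envelope calculation. Only the 500 hours/minute figure comes from the comment above; the average video length, false-positive rate, and share of legitimate uploads are illustrative assumptions:

```python
# Back-of-the-envelope: why even a tiny false-positive rate matters at
# YouTube's scale. Only the 500 hours/minute figure is from the thread;
# the other constants are illustrative assumptions.

HOURS_PER_MINUTE = 500        # upload rate cited in the thread
AVG_VIDEO_HOURS = 10 / 60     # assume an average video is ~10 minutes
FALSE_POSITIVE_RATE = 0.001   # assume the classifier wrongly flags 0.1%
SHARE_LEGITIMATE = 0.99       # assume 99% of uploads are legitimate

hours_per_day = HOURS_PER_MINUTE * 60 * 24
videos_per_day = hours_per_day / AVG_VIDEO_HOURS
wrongful_flags_per_day = videos_per_day * SHARE_LEGITIMATE * FALSE_POSITIVE_RATE

print(f"{videos_per_day:,.0f} videos uploaded per day")     # 4,320,000
print(f"{wrongful_flags_per_day:,.0f} wrongful flags/day")  # ~4,277
```

Under these made-up numbers, a 99.9%-accurate filter would still wrongly flag thousands of legitimate videos every single day, which is why manual appeal queues are unavoidable at this scale.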

Constantly cracking down on videos that slightly mention a certain topic goes way too far. Do we really want GFW-style censoring to maintain an artificial utopia where harmful content™️ won't exist?

No, that isn't what's happening. You're probably not aware of this if you only read particular media outlets, but YouTube keeps doing this and keeps defending it. It's absolutely intentional, and the headlines aren't misleading people; you are.

Read the article: "They rejected our appeal to have the video reinstated." - appeals are at least looked at by humans and the humans are affirming they made the right call. Most likely the only reason this one got fixed is someone internally caused a stink and got it looked at by someone higher up, as usual for Google. But most of the time that isn't happening.

Here's another example of them erasing a video by an actual epidemiologist:

https://nypost.com/2020/05/16/youtube-censors-epidemiologist...

Ivy Choi, a YouTube spokesperson, told The Post in a statement: “We quickly remove flagged content that violates our Community Guidelines, including content that explicitly disputes the efficacy of global or local health authority recommended guidance on social distancing that may lead others to act against that guidance. We are committed to continue providing timely and helpful information at this critical time.”

Note: you aren't allowed to disagree with or even dispute the efficacy of government policy. Period.

There's no difference at this point in policy between a Chinese video site and YouTube, just that the latter are censoring criticism of the state because they've been internally taken over by authoritarians loyal to generic "authority", not because they're forced to. But in the end how does Chinese censorship work? Well, it's not like Xi individually orders people to kill individual stories. It works by ensuring that information services are run by people loyal to the state, who then take decisions autonomously to defend the state.


> They rejected our appeal to have the video reinstated

Yet the video is up and working just fine...



I don’t agree with the interview contents, nor do I think a professorship alone confers sufficient credibility in the face of a novel pandemic whose data is still emerging. That said, I don’t want YouTube to be in charge of deciding what is credible or not. They don’t have in-house experts on these domains, and they don’t have a magic epistemology machine that spits out the “facts”. They are good at writing web services; they shouldn’t be in charge of things like sound epidemiology.

I always imagine past figures that had adversarial relationships with authority at their times like Socrates, Galileo or Jesus and realize how Youtube would definitely take their videos down, shut their channels down and Susan Wojcicki would say things like “on matters of geocentricity vs heliocentricity we will follow the expert opinion of Catholic Church”. Then I think how might we be hurting ourselves today in ways we don’t even know by letting these tech institutions be the ultimate arbiter of our meaning making machinery.


And on the other end of the spectrum we have people being indoctrinated into violent conspiracy groups like QAnon, or being prompted to set fire to cell towers, or bringing measles back from the dead, because of YouTube videos. There's a line to be drawn here - and it's a tricky line, to be sure - but "YouTube should be totally neutral and not pull anything" is not a viable answer.

I'm not so sure that true neutrality isn't the best answer after all. I think these bans distract from the real problem, which is that it's YouTube's system for promoting content that is causing harm, not the content on YouTube itself. YouTube has made itself too important a platform to be left in charge of deciding what content people should watch.

Frankly I would feel much more at ease if we could all decide that Youtube doesn't get to decide what content should be deleted, provided they also can't keep recommending videos. Let me explain.

By recommending videos YouTube has a huge amount of influence on what content is promoted and what content isn't. However, YouTube's bottom line isn't to recommend good content (despite what they claim) but to maximize ad revenue, which means they need people to watch as many videos as possible. Superficially this would seem to drive YouTube to recommend good content, but it's beginning to look like it is biased towards addictive content, which has questionable quality at best and veers towards harmful conspiracy theories at worst. It's a classic example of value drift: YouTube has turned into a paperclip maximizer that is beginning to harm humans to produce more paperclips.


The podcast "Your Undivided Attention" gets into the tech and public policy of this. Basically, their suggestion is to modify Section 230 of the Communications Decency Act, so that storage and retrieval is still a safe harbor for providers, but recommendation algorithms should be treated as editorial decisions and be open to the same liabilities that other communications media are exposed to. So a chronological feed of subscriptions to InfoWars, Stormfront, Plandemic, etc would be protected by law, but YouTube, FB, etc could be held liable for harm caused by content that they recommend. People retain their freedom to speak and publish, but they have no right to promotion or amplification.

> recommendation algorithms should be treated as editorial decisions

That's actually a great comparison. I wonder if it will ever gain traction.


That sounds interesting, I'll be sure to check it out, thanks.

But this favors one recommendation algorithm (time based) over others. What makes ordering by time non-editorial?

Laws don't have to be philosophically pure. A law can define or describe what counts as an editorial algorithm and what does not, and it could exclude simple sorting while including the complicated thing Youtube does to keep you watching.

What might be harder is to distinguish sophisticated search algorithms from recommendation algorithms. There's some overlap, but I'm inclined to think that the two are mostly different kinds of things, especially if the search results aren't personalized to the user.


Indeed, but (at least in the US), you can't really do that without running afoul of the first amendment, presenting content is just a form of speech. Government restrictions on speech are the thing that everyone's afraid of, right?

Currently, platform immunity means the platform can't be sued or criminally charged because a user hosts something for which the user can be. The proposed change would expose the operator of an editorial algorithm to liability for the algorithm recommending things that are already illegal.

I don't think the concern with rolling back platform immunity is government restrictions on free speech, exactly. In the US, the first amendment provides the same protections it always has. Instead, concerns include that the risk of liability will hamper innovation, that only companies with a lot of resources will be able to take the risk, and that companies will overreact with their own proactive censorship to stay well clear of areas with potential liability.

We've seen the latter with FOSTA/SESTA weakening platform immunity, which is arguably what its creators intended, but I suspect not what everyone in congress who voted for it expected.


Right, but the "platform immunity" thing applies equally to all platforms. You're instead proposing to extend immunity to some platforms and not others based on how they choose to recommend things. If you (like I do) believe that recommendation is a form of speech, this creates government favored classes of speech.

To word it differently: "Platforms that recommend/aggregate speech in government sanctioned ways are protected from liability, while those that do not are not". That's, while not open and shut perhaps, certainly spooky.


The law already does something like that. If I run the hypothetical blog host conspiracy-blogs.com, and I manually select ten posts I like to put on the front page every day, I can be prosecuted if I select a post for that saying something like:

Tomorrow at 10 AM, join me on the steps of the capitol. Bring your gun and a sign that says "we will shoot you unless you vote no on bill 123.".

The action described is a crime, soliciting others to participate in it is a crime, and my deliberate choice to show it to a larger audience would likely expose me to accomplice liability that virtually any criminal lawyer would advise against. If violence actually occurred, there would be an even lower bar for the victims to clear to sue me. Planning crimes is unambiguously not protected speech.

The proposal described would subject any sufficiently editorial algorithm to similar liability.


> recommendation algorithm (time based)

Can't any choice (not being purely audio, why the title is only in X languages, etc.) be considered editorial, for any TV news show or book? It isn't constructive to construe FIFO as editorial when it is the common way to experience and share events. Since content creators ostensibly retain control of what is published, they could rearrange their content in their feed by re-publishing or publishing out of order anyway.


> Can't any choice (not being purely audio, why the title is only in X languages, etc.) be considered editorial, for any TV news show or book?

Yes exactly.

> It isn't constructive to construe FIFO as editorial when it is the common way to experience and share events.

In what context?

> Since content creators ostensibly retain control of what is published, they could ideally rearrange their content in their feed by re-pub or out of order publishing anyway.

This sounds horrendous. You want to encourage content creators to spam you with content so that they're always at the top of your "feed"?


> In what context?

Any episodic context, like 99 percent of the media we consume. TV show episodes happen in a specific order, movies are constructed of scenes in a specific order, and community growth around a channel is built on common experiences already shared. Even for history channels, if a story spans multiple videos, each relying on information from the previous one, the content creator would have to be daft to release them in reverse, or worse, random order, when everybody is waiting to find out what happens next.

From my perspective, it seems like the only videos that are viable outside of this context are long-form videos (1hr+) or viral trash.


> Any episodic context, like 99 percent of the media we consume.

But you're suggesting a single "timeline" for all content. So you get one stream of every show on every channel in order. That's not the usual way we watch episodic content. Hell, since the invention of the VCR, we mostly just watch the specific stuff we want, not everything in our bundle.

You seem to be conflating the content made by a single entity with the aggregator. For an aggregator, why does time make more sense than any other view (like, say, clustered by creator, or, as is common today, ordered by presumed interest)?


> This sounds horrendous.

Any "editorial algorithm", by the standards you have put forth, can be abused.


I don't disagree. But what you (or the prior posters) are saying is that the law should prevent companies from trying to optimize for the best user experience. If only showing things in time-order is "safe" for the company, but filtering out spam becomes "unsafe", then I as a consumer am forced to endure a crappy experience for solely legal reasons. Neato.

I really love the sound of this idea.

This is a great point and is also true of Facebook and a number of other platforms. Roger McNamee spoke at length about this issue with Sam Harris on his podcast back when it was called Making Sense. His point was essentially that the AI that controls the recommendations is trained to maximize clickthrough rates, and so it learns in its own way to feed us articles that make us angry, because we click and share more that way. Hence the tendency to lead people down rabbit holes of extremism.

Two things could be done for this content:

- Don't list these videos in watch-next / suggestions. This avoids people getting lost down these strange rabbit holes

- Insert a prompt which notifies the viewer that the content has been flagged as misinformation. Ideally include links to resources to find out more. This prompts people to check out the widely accepted truth to compare what they'll see in the content

Just because something needs to be done about misinformation doesn't mean censorship is necessary


Censorship isn’t the answer to the things you fear. YouTube banning these things won’t stop them from spreading. YouTube not banning these things won’t cause people to commit violence.

Try replacing “being indoctrinated” with “reading” or “being prompted to” with “watching crazies talk about how you should”, and try your sentence again without the biased framing.

You misattribute the agency of the viewer. Video games don’t make kids shoot up schools, and stupid videos don’t make people light cell towers on fire.

Even the entirety of Pizzagate led to exactly one guy out of hundreds of millions picking up a gun, and even then no one was hurt.

Meanwhile, YouTube is taking down videos of ACTUAL war crimes in the Syrian civil war, which is destruction of evidence and is hindering prosecutorial efforts.

Censorship is not the solution.


No, but a hard drive was shot - right through the heart!!

You give censorship a bad name? <confused>

I have no idea what you're trying to say.

It's a play on the "You Give Love a Bad Name" song by Bon Jovi.

"Shot through the heart, and you're to blame, you give love <sic:censorship> a bad name"


That's a fine point, but YouTube isn't restricted to the option of taking down videos. They could stop recommending these videos, and they could pull them from search results. Actually, YouTube might be a better place if new videos were unlisted by default and teams of curators were responsible for approving new channels. But then maybe someone would just build their own YouTube index and search engine.

That works right up until the content-creator community or HN or Reddit riots about "the algorithm".

Opposed to them rioting about "the vehement censorship"?

Frankly those people are just dumb, and there's nothing we can do to truly protect society from them. So, you might as well allow such content to exist.

Let's not forget why censorship of ideas exists. We are ruled by those we are not allowed to criticize. Even the guys setting fire to cell towers: can we believe that a video is going to convince anyone reasonable to actually do it? Do you yourself feel tempted?

Let me be contrary: who shut down millions of businesses and put millions out of work? It wasn't QAnon. It was the state -- and it looks like YouTube is aligned with the pro-shutdown folks who don't want their wisdom questioned.

Now I'm a big believer in keeping things shut. I think prudence is super-important. But QAnon's actions are mainly talk. The state's, however, are different.


    Let me be contrary: who shut down millions of businesses and put millions out of work?
Well, I'd blame the pandemic itself.

Sweden provides an imperfect but interesting what-if scenario for countries wondering what would have happened if they'd imposed little to no lockdown restrictions.

https://www.cnbc.com/2020/04/30/coronavirus-sweden-economy-t...

Their economy is still taking a very severe hit despite retail businesses remaining open there, because people are shying away from making purchases and visiting shops.

You can (correctly) say that many governments took the decision out of peoples' hands. Perhaps the people of country XYZ would have made a different choice. But Sweden's example suggests that economies would have been screwed pretty hard no matter what.


Without lockdowns, it just means that everyone with the means to stay home does so, and the people not able to stay home disproportionately face the disease. Government lockdowns reduce the incentive for desperate people to continue exposing themselves to the disease.

> Well, I'd blame the pandemic itself.

Exactly.

In most countries, people are willingly staying mostly at home. More than that, they continue to refrain from shopping and eating out after the lockdown ended.


This is not the case for Sweden. For details, take a look at https://news.ycombinator.com/item?id=23266698

I don't think this is the case for Germany, Switzerland, or Denmark either.

What countries do you have in mind where lock downs ended and most people are staying at home?


Lockdowns are causing the majority of economic damage.

Whether lockdowns are justified anyway is a question for another discussion. But economies are hit by lockdowns, not the pandemic itself. Let's look at what we can find about Sweden.

Sure, people in Sweden are staying at home more, but the decrease in activity is hardly on the level of locked down countries.

"When people became aware of coronavirus around March 12, we lost almost overnight 30 per cent. It’s OK. For a couple of months, it will work. But after that it will be very, very tough,"[1]

[A pedestrian street in Stockholm on Apr. 1](https://www.telegraph.co.uk/content/dam/news/2020/04/01/TELE...)

[A market in Malmo during the pandemic (no exact date)](http://archive.is/2siwt/5d1b19d61fd21d052c2cc190f13f722c0bf8...)

[Pubs, eating out](https://www.bbc.com/news/av/newsbeat-52618788/coronavirus-ho...)

It absolutely doesn't look like Sweden's economy took a downturn because people are afraid to go outside.

Instead, their economy seems to be hit by the suspension of supply chains that depend on locked-down countries[1]:

"One big reason is that Sweden is a small, open economy with a large manufacturing industry. Truckmaker Volvo Group and carmaker Volvo Cars were both forced to stop production for several weeks, not because of conditions in Sweden but due to lack of parts and difficulties in their supply chains elsewhere in Europe."

So, to reiterate, it looks very much like the majority of damage to economies are caused by lockdowns.

[1] https://www.ft.com/content/93105160-dcb4-4721-9e58-a7b262cd4...


I don't agree with your conclusion, but I appreciate those links. Thank you.

On top of that, Sweden is recording very high death rates, and the latest estimates point to around 7% immunity.

The 7% was a month ago, ie that was when the tests took place.

If you believe the disease is rampant in Sweden, it’ll be much higher now. If it’s not, then the hard lockdowns elsewhere were superfluous.


Even if they had 30% immunity, that still leaves the majority of the population exposed. I'm sure 30% would be very much preferable (if it's even true), but probably not good enough to allow things like concerts or maybe even sit down restaurants to reopen.

There are estimates that herd immunity is achievable at 43%:

"The disease-induced herd immunity level is 43% ... when immunity is induced by disease spreading, the proportion infected in groups with high contact rates is greater than in groups with low contact rates"

https://arxiv.org/abs/2005.03085
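For context, the classical herd-immunity threshold under homogeneous mixing is 1 − 1/R0, which gives 60% at the R0 = 2.5 the linked paper assumes; the paper's 43% figure comes from modeling heterogeneous contact rates, which this quick sketch does not reproduce:

```python
# Classical herd-immunity threshold assumes homogeneous mixing: 1 - 1/R0.
# The linked paper argues that infection-acquired immunity concentrates in
# high-contact groups first, lowering the effective threshold to ~43% for
# the same R0 -- that heterogeneous result is not derived here.

R0 = 2.5  # basic reproduction number assumed in the paper

classical_threshold = 1 - 1 / R0
print(f"homogeneous-mixing threshold: {classical_threshold:.0%}")  # prints 60%
```

So the paper's claim is essentially that heterogeneity shaves the classical 60% down to 43%, not that herd immunity arrives for free.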


Unlike most other European countries, Sweden is not increasing its sovereign debt load by 15+% of GDP to keep the citizenry afloat. That’ll feel a lot more important than Covid in 5 years.

I think this is likely a biased view. If you ask the families of the 300+k people who died from Covid-19 they probably won't consider the economic impact to be the most significant.

Thankfully, that's both not how democracy works - we don't ask a biased sample - and exactly how it works - we ask everyone what they think of the government's performance and they vote for or against.

When we all look back in 5 years, then 10, then 15, there will be different "obvious" conclusions drawn. Right now, anyone who sides with any action as correct should have a confidence level that is extremely small.

The more varied the responses, the more we learn as a species, and I am glad that not everywhere assumes they are NYC or Lombardy, and acts in varying ways so that in 5, 10, and 15 years the coming studies have different data points of comparison.


I agree with you, that was basically my point. At this point in time we don't know enough about how this will play out to make any claim about what was the more important aspect to consider: the effects on the economy or the effects on population-level health due to Covid-19.

I think it’s fair to say that both numbers represent a measure of different ways that the pandemic has caused pain and suffering, whether it’s in terms of lives or livelihoods impacted.

We obviously need to deal with both problems concurrently. There are a lot of actions that can be taken which can help one which doesn’t come at the expense of the other, so it’s not purely a either-or decision.

It would be wrong to turn a blind eye to either effect. The death toll is an immediate concern, but so is a functioning food supply chain, and indeed a healthy economy is critically important to everyone.

Not having lived through an economic collapse or a hyperinflation event, for example, it may be hard to appreciate how difficult (and deadly) that situation can be.


People are worried about economies not for abstract reasons. Bad economies literally kill, and have the potential to kill many more people than Covid-19.

From what I gather, the scale of the current economic damage is being seriously compared to the Great Depression in mainstream media[1][2]. Well, the Great Depression brought the Nazis to power in Germany[3]; WW2 followed and took the lives of 70-85 million people[4], to say nothing of the post-war devastation.

Or look at Venezuela for an example of a bad economy in modern times, where "families buy rotten meat to eat"[5].

Also, if you ask families of people who will die because their cancer wasn't diagnosed in time, they will probably consider lockdowns too excessive[6].

[1] https://www.cnbc.com/2020/05/19/unemployment-today-vs-the-gr...

[2] https://news.sky.com/story/coronavirus-economic-cost-and-hum...

[3] https://en.wikipedia.org/wiki/Great_Depression#Germany

[4] https://en.wikipedia.org/wiki/World_War_II

[5] https://www.youtube.com/watch?v=8MGbyLUCw5k

[6] https://www.bbc.com/news/uk-northern-ireland-52747659


I don't think anyone said worry about the economy was an abstract idea. It obviously has a very real effect on people and their health, it has been well studied. I was simply pointing out that what people will consider more significant in five years time (the effect of the pandemic itself or the subsequent harm to the economy due to the response) depends on who you ask. Furthermore, since we don't know the full extent of the pandemic or the ensuing economic impact it doesn't make sense to claim one way or the other right now.

Also, the Great Depression was one of many factors that led to the rise of the Nazis in Germany. The tensions that led to WW2 were brewing from before WW1 even started, so don't claim that an economic depression caused all those deaths. There are far too many factors to make that claim.


"I was simply pointing out that what people will consider more significant in five years time"

Yes, I think I understand what you're saying.

My point is that the harm to the economy directly translates to lives lost too. A bad economy leads to a surge in violent crime, opioid addiction, gang violence, lost access to health care...

"... the Great Depression was one of many factors that led to the rise of the Nazis in Germany. The tensions that led to WW2 were brewing for before WW1 even started so don't try to claim that an economic depression caused all those deaths. There are far too many factors to make that claim."

It wasn't the only cause, sure. But it was a huge one, afaik. History isn't a hard science, of course, and I'm not a historian (not even an amateur historian) but this quote from Wikipedia doesn't strike me as particularly controversial:

"The unemployment rate reached nearly 30% in 1932, bolstering support for the Nazi (NSDAP) and Communist (KPD) parties, causing the collapse of the politically centrist Social Democratic Party... Large-scale military spending played a major role in the recovery."[1]

[1] https://en.wikipedia.org/wiki/Great_Depression#Germany


> QAnon's actions are mainly talk

QAnon is an FBI-registered domestic terror organization: https://www.rollingstone.com/culture/culture-features/qanon-...

They've been connected to multiple attempted acts of violence. They aren't just talk.


From the article:

The FBI memo cites at least two violent incidents or attempts purportedly linked to QAnon: An Arizona man harassed and doxed locals he suspected of participating in the child sex trafficking ring at the heart of the conspiracy theory; and a Nevada man at the Hoover Dam whose truck was found to contain rifles and other ammunition, who was later discovered to have sent letters to President Trump containing references to the movement.

Doxing and keeping guns in your car are not violent acts and the connection is tenuous at best. By that measure you could easily classify The Young Turks as a terrorist organization, Elliot Rodger was their subscriber on Youtube. And the violence was real.

Why would people on HN, of all places, take the FBI at their word? How many hackers have been defamed, and even jailed, by the FBI like that?


FBI has been connected to multiple successful acts of violence.

Terror organization who operates freely on reddit?

Name a better location to monitor a terror organizations activity (for investigators). Makes the feds' jobs easy.

A Qanon guy brought a gun into a pizza place and one tried to mail bombs to politicians.

Don't forget the attempted Venezuelan coup.

Actually it was mostly the populace who shut down business activity of their own accord, at least in the U.S. See https://www.nytimes.com/2020/05/07/upshot/pandemic-economy-g...

> QAnon's actions are mainly talk. The state's, however, are different.

Isn't QAnon directly promoted by some people closely related to the president of the state?

Don Jr. wears it:

https://www.etsy.com/dk-en/listing/716574473/the-official-q-...

The President himself reused their hashtag:

https://www.businessinsider.de/international/trump-retweeted...


You've just given a bunch of evidence that those things are happening despite our benevolent censors' best efforts.

I think giving people the idea that they've stumbled onto dangerous truths just makes them more resolute. Or do you think the issue is they just need to be suppressed even harder?


> There's a line to be drawn here - and it's a tricky line, to be sure - but "YouTube should be totally neutral and not pull anything" is not a viable answer.

It sure can be given other means of combatting misinformation. But sure, let's just go with the sledgehammer every time.


What are the other means?

Active information campaigns. Deprioritizing in search/related results rather than banning. Partitioning verified from unverified information and always showing a minimum amount of factual content alongside unverified content.

That's just off the top of my head. There are dozens of possible strategies, but companies will always do the cheap and easy thing that will stop the bad PR quickest, even if it hurts some people. We should demand better.


It is. There are books on all kinds of things; are we going to burn books we don't like next?

The line is already there, we don't need corporations to draw it for us. If something isn't illegal, it should not be censored.

If people commit crimes they'll get arrested.

Including those that commit crimes because they watched the extremely fringe and dangerous Qanon conspiracy theory you mentioned--which from what I've heard makes you torture and kill small animals right after the first video, to continue onto progressively worse things as you keep watching (stay away from it, kids!).


What? "because of YouTube videos," isn't that like saying "violence happens because of video games," etc.?

Of course, yes. Doesn't make any sense.

If it's so dangerous that it should be banned, let legislatures pass laws banning it, and let YouTube enforce those laws.

...... Joe

....... Hillary

..... Chuck

Like him or not, the President is a master at branding. If you really think he is going to have his record-setting rallies literally filled with people in Q shirts and not have control of that situation, it's you who is confused.

Notice how the hit pieces never link to the actual posts.

People who want can find them @ q map dot pub, or a level up at crowd sourced news sites like we are the news dot ws, or even a level up from that at citizen free press dot com.

It's common knowledge that the President of the United States posts on 8kun. We are not waiting for the mockingbirds.

https://news.ycombinator.com/item?id=23112787


Google is a private company. They should be allowed to do whatever they want. They are not a government.

Let the votes decide.

What is wrong with relying on the existing legal foundation? Arrest them for arson, if and when they decide to ravage a cell tower.

Should we halt the sale of knives because they can be misused to murder?


Nationally, the clearance rate for arson is typically under 23% [0], meaning the vast majority of arsonists get away with it. In general, clearance rates are pretty bad across most crimes. The homicide clearance rate is typically below 62%, clearance rates for non-fatal shootings are typically below 55%, clearance rates for reported rape are typically below 40%, and clearance rates for motor vehicle theft are typically below 20%.

And the actual clearance-by-arrest rates are lower than those values.

And conviction rates are even lower. (*edit: relative to the total number of crimes. Conviction rates are pretty high for the cases that State's Attorneys or prosecutor's offices accept and agree to prosecute)

The standard of proof for State's Attorneys/prosecutors to accept a case from Law Enforcement Officers (LEOs) is pretty high, and it's expensive to build and prosecute cases. Clearance rates would be higher if law enforcement agencies had high definition surveillance camera networks covering all public spaces, but I think people would also find that solution unappealing.

I'm a homicide researcher in Chicago and I've come to the conclusion that the legal system will never be able to solve or even make a significant dent in the homicide problem, and taking a "law enforcement only" approach is the same as just surrendering to the problem forever. If you want to solve crime problems, you have to explore preventative solutions, namely give people better paths and eliminate paths that empirically lead to only bad destinations.

[0] https://ucr.fbi.gov/crime-in-the-u.s/2017/crime-in-the-u.s.-...


The last bit sounds like the plot of Minority Report.

I get that, being a homicide researcher, you're working on the assumption that a non-zero homicide rate is a problem that can be solved with a sufficiently clever policy. What if it's just a natural fact, like the laws of gravity? What possible solution could there be to homicide as a category of crimes beyond literally being able to see into the future or people's brains, a la Minority Report or Black Mirror? Isn't there a risk the mentality of a "war on homicide" (you talk of surrender) is actually more dangerous than the problem itself?

Attacks on cell towers are a very new problem. The 5G conspiracy theories are stupid, but you know, I got into a conversation with some coworkers about them the other day. We all laughed at how dumb these theories are until one guy said his mother believes it and he couldn't find a way to convince her otherwise.

The source wasn't YouTube. It was just her social circle. Communicating with WhatsApp and FB groups of course because of lockdown but people talk in real life too and always will.

The difficulty became apparent when I did a few quick searches for "5g coronavirus theories". The top search results were all attempted debunkings - pretty good. Unfortunately the debunkings themselves were not. I don't think I've ever seen such crap attempts at debunking a conspiracy theory. The top hits were mostly newspapers, and a common theme was "the world is now full of idiots who believe conspiracy theories, like 5G causing coronavirus or that there are cures for COVID" where the latter phrase linked to a doctor selling hydroxychloroquine. Which just yesterday the UK government started buying in bulk in case it turns out to be a cure. The article also lumped people protesting against lockdown (normal, expected) with 5G conspiracy theorists (not normal, not expected).

So this is the first reason why the theories aren't going away - the sort of people who attempt to debunk them keep lumping them together with any skepticism of government policy at all. But most people understand that being skeptical of a global mass house arrest justified via buggy computer model isn't at all irrational, so when they're told 5G theories are just like that, it just makes the problem worse.

For these conspiracy theories to be successfully debunked will require the debunkers to stop arguing from authority. People don't trust the authorities. That's why they believe in a massive conspiracy to begin with.


> What possible solution could there be to homicide as a category of crimes beyond literally being able to see into the future or people's brains, a la Minority Report or Black Mirror? Isn't there a risk the mentality of a "war on homicide" (you talk of surrender) is actually more dangerous than the problem itself?

First, I'm going to plant the goalposts. A homicide rate of 0 would be nice, but it's not realistic for any large population. Even in the very well educated, upper-middle class Detroit suburb I grew up in, there was a homicide every 10 years or so (the most recent one was actually done by the dad of one of my friends/a soccer coach of mine; he had a great reputation/was a philanthropist/was president of the local Rotary club, but apparently he was also a pillar of the local S&M community and his wife found out so he had her killed). The goal is to reduce the rate of homicides. And 5 minutes of thought will reveal that that's absolutely possible.

Like many US cities, Chicago is sharply segregated [0]. That link also includes maps of counts of narcotics arrests, homicides, and shootings, and if you cross-reference the narcotics, homicide, shooting, and black residents as a percent of census tract population maps, you'll see they're highly correlated. The vast majority of the violence is in the black neighborhoods. It's an uncomfortable fact, but stay with me. As a liberal, seeing this was initially like a punch to the gut, but it makes sense when you ask the question "why would anyone choose to live in these neighborhoods?" Very few choose to live there; rather, most people can't leave. In 2016 (and for the US), the median net wealth of black families was about $17k, while the median net wealth of white families was about $171k, 10x that of black families [1]. White people have the funds to climb to better areas with better schools and better economic opportunities, and they can afford to buy real estate in good or improving areas, which provides those white people with a durable store of value (one that has typically appreciated at a compounding rate over the past century). But this increases rents, which pushes black people to areas with lower rents, worse schools, brain-damaging pollution (e.g. lead paint, lead-contaminated water [2][3], air pollution from industry), and much higher rates of violence.

Now to get to the question: what's driving the violence? And it's largely economic. A massive portion of the shootings and homicides are committed by gang members (really neighborhood cliques now, with little or no hierarchy and on the order of 10 members on average) who are in cliques that sell drugs, and from an analysis of cleared homicides and shootings, the shooter and victim are very frequently in geographically adjacent cliques that both sell drugs. In the 1920's, alcohol became a black market good and there was a lot of violence between alcohol traffickers that evaporated when alcohol returned to being a regular market good.

So, do I have to be an omniscient precog from Minority Report to know that [legalizing drugs, eliminating lead exposure, and improving the economic opportunities for black youths by improving the quality of education in all neighborhoods to the level white people enjoy] would drastically reduce the homicide rate? No. Those are obvious steps.

> The source wasn't YouTube. It was just her social circle.

That's kind of like saying "guns don't kill people, blood loss kills people". How do the ideas initially get into the minds of people in her social circle? Per Facebook, a lot of misinformation is financially motivated [4] or geopolitical (from a certain malicious surveillance state that wants time to catch up on 5G development) [5] and spread widely through public groups. If the misinformation was stopped from spreading publicly, it would reach the last leg of the trip to the private group/WhatsApp group phase far less frequently and lead to far less destruction.

> The article also lumped people protesting against lockdown (normal, expected)

No, it's not normal, and the protests have been very small and sparsely attended, but they've been massively amplified by the media [6]. As of May 7th, 68% of Americans (10,957 people surveyed) were more worried about restrictions being lifted too early, vs 31% who were more worried about them being lifted too late [7]. And saying you're more worried about the economic harms than the virus is way more normal than actually going into public and gathering in a group with other people who are all intentionally rejecting safety recommendations.

> For these conspiracy theories to be successfully debunked will require the debunkers to stop arguing from authority. People don't trust the authorities.

Yeah, it's not possible to change a conspiracy theorist's mind with facts, because any inconvenient fact can just be interpreted as another part of the conspiracy. To those susceptible to conspiracy theories, there is no known cure to the infection. The only way to avoid outbreaks is to stop transmission.

[0] https://imgur.com/a/K9IJKHr

[1] https://www.brookings.edu/blog/up-front/2020/02/27/examining...

[2] https://www.cambridge.org/core/services/aop-cambridge-core/c...

[3] https://en.wikipedia.org/wiki/Lead%E2%80%93crime_hypothesis

[4] https://www.facebook.com/facebookmedia/blog/working-to-stop-...

[5] https://www.nytimes.com/2019/05/12/science/5g-phone-safety-h...

[6] https://www.vox.com/2020/5/10/21252583/coronavirus-lockdown-...

[7] https://www.pewresearch.org/fact-tank/2020/05/07/americans-r...


Then just do what all legal codes already do: increase the severity of punishment to compensate.

> If you want to solve crime problems, you have to explore preventative solutions, namely give people better paths and eliminate paths that empirically lead to only bad destinations.

Education is the preventative solution. Censorship is just security through obscurity.


> Education is the preventative solution.

It's a possible preventative solution, but when designing anything, you should never forget that you are not the user. You have to design for the user.

How does the ignorant person who is seeking education know which source to believe? Educated or uneducated, when people don't have the time or prior knowledge to evaluate claims on their own, they look to the sources they already trust, especially when they're scared. In the US, tens of millions of people deeply trust Trump, and many people are afraid of dying from COVID-19. Trump has been regularly praising hydroxychloroquine as a miracle cure ever since an attorney/blockchain enthusiast named Gregory Rigano appeared on Tucker Carlson's show [0] misrepresenting himself as a Stanford Med School Advisor (see too_much_detail below) and misrepresenting a bad study to assert hydroxychloroquine had a 100% cure rate. Trump has been telling people to take this drug constantly, claiming he's taking it personally. However, in a study by the VA of hydroxychloroquine's effect on 368 COVID-19 patients, the hydroxychloroquine-only group was associated with increased overall mortality (rates of death in the HC, HC+AZ, and no-HC groups were 27.8%, 22.1%, and 11.4%, respectively) [3].

Many people will die because Trump and the GOP are miseducating people. Why shouldn't YouTube improve the quality of the content on their platform by removing lethal misinformation? YouTube and Facebook were able to greatly limit the spread of the Christchurch mass murder video (by rejecting videos with matching hashes, looping in human reviewers, and other means) such that I never saw the video, so it's not a technological issue.

too_much_detail: {where he falsely presented himself as a "Stanford Univ Med School Adviser" [1] and claimed a "well-controlled [false], peer-reviewed [false] study carried out by the most eminent infectious disease specialist in the world Didier Raoult MD Ph.D. ... enrolled 40 patients, again a well controlled study [still false, no randomization, no blind, treatment group was from one hospital in Marseille (mean age 51.2), while the control group was from hospitals in southern France (mean age 37.3) [2]] peer-reviewed [still false] study, showed a 100% cure rate [false, 100% of the people who were swabbed every day, which excluded 1 patient that died and 3 that went into the ICU [2]]. The study was released this morning on my Twitter account."}

[0] https://www.youtube.com/watch?v=C4zTt8oLD44

[1] https://www.nytimes.com/2020/05/12/magazine/didier-raoult-hy...

[2] https://www.theguardian.com/world/2020/apr/06/hydroxychloroq...

[3] https://www.medrxiv.org/content/10.1101/2020.04.16.20065920v...


Hey, it worked for the UK!

[flagged]


This doesn't directly answer your question of "violent" (and it seems on the whole the QAnon protests have been non-violent), but the FBI assessed QAnon Extremists as a domestic terrorism threat[0]. (The qualification here is "QAnon Extremists" which appears to, probably correctly, separate the extremist element from the majority).

[0]: https://en.wikipedia.org/wiki/QAnon#FBI_domestic_terrorism_a...


> I don’t want youtube to be in charge of deciding what is credible or not

This point is often repeated during discussions on these topics but I believe this is an inaccurate characterization. YouTube is not "in charge of deciding what is credible" in the fashion that kind of phrasing suggests; they are moderating the content on their platform with respect to their business process and company values, those decisions are not a reflection of credibility. YouTube is a corporate product and we should not encourage the narrative that this product is the zenith of knowledge even though there are knowledgeable people who put content on YouTube.


> they are moderating the content on their platform with respect to their business process and company values, those decisions are not a reflection of credibility. YouTube is a corporate product

This is such a tired argument. YouTube is ubiquitous - it is the primary (read: effectively only) platform for video hosting, which today comprises a large proportion of all information transfer between humans. Companies like Facebook and YouTube are central to social discourse, at least in the west. When a medium of communication is that ubiquitous, the fact that it is owned by a private company is not an excuse to throw up one's hands and say "Well, I guess they own it - they can censor what they want".

The fact that it is so widely used and owns such a significant share of the "ways to spread information" market outweighs the fact that the company is private.


> it is the primary (read: effectively only) platform

This is very obviously false based on countless examples, but you're baking in the assertion that exposure to the YouTube audience is a necessary criterion to be considered a video platform, yet you haven't justified that belief. YouTube's ubiquity is an accolade afforded to it by the public on a completely voluntary basis that can be effortlessly revoked at any time. The reason it is not revoked is that it costs users nothing to use YouTube, and for the overwhelming majority of its users YouTube is just a small and generally unimportant slice of their day, akin to watching Netflix or playing video games.

> The fact that it is so widely used and owns such a significant share of the "ways to spread information" market outweighs the fact that the company is private.

Why? I don't see the connection between the quantity of eyeballs freely gazing and the need for the owner to abdicate their editorial prerogatives. What is your justification for this? If the eyeballs have a problem with how those prerogatives are expressed they can look away to another website and they actually do that quite frequently.


There are a million options for video hosting.

You are performing some sleight of hand here - claiming they are the only way to share your view - when what you are really referring to is that they have the most viewers, and that they have tools for promoting content in front of that audience.

"I deserve access to an audience on their system" is a very weak claim compared to the (untrue) "they are the only place I can share my views."

Personally, I say kill the era of algorithmic search and recommendation. Then you'd have to convince people with compelling content to link out to you, not just be clickbait that plays well in the naive algorithmic feedback loops. Good luck!


More and more social interaction becomes intermediated. And especially during this time of quarantine. It's not as easy as convincing people. Because people are now all on ban-happy platforms. And these platforms synchronize their bans.

Let's say your video has some modicum of success on no-name-host.com. Suddenly facebook, youtube, reddit, discord all ban links to that host to reduce spread. They do this intentionally. They brag about it to politicians. They get rewarded for acting as a block... by the state.

It's a nice little proxy arrangement.


Access to an audience is part of speech IMO.

If most of the world's population was on one continent-sized piece of private property, and because it was private property the owners were allowed to censor whoever they wanted, "you can go talk freely on the almost empty island over there, maybe someone from the continent will come listen!" doesn't sound like free speech to me.

I agree that getting rid of algorithmic search/recommendation/promotion would be a big step forward, but platforms for speech of a certain massive size like Youtube/twitter/facebook/etc should also be heavily regulated and their ability to censor limited. They are, due to their overwhelming size, de-facto custodians of speech.


Youtube

Vimeo

Name the next five that are discoverable, host your videos for free, and can stream that to millions of people in a single day.


> ... discoverable, host your videos for free, and can stream that to millions of people in a single day

No one has an inalienable right to any of these features, let alone to all of them in a neat package. The organizations that invest ingenuity, capital, and labor to create platforms with those features do have a right to set terms on how their platforms are used.

I don't get how the solution to what is seen as corporate authoritarianism (on private property) is asking for more authoritarianism (by the government) to force persons to host content they'd rather not.


TBF, these are more like entitlements than rights.

We’ve already covered discoverability: you have no right to an “audience”.

As far as being able to reach millions of people per day for free, why do you think that’s a right, exactly? The fact that the websites you listed are footing the bill for you doesn’t mean that serving content (i.e., bandwidth) in general is “free”.


I don't think there is a right, but don't act like there are a million equivalents to Youtube.

As a contrarian view, do our rights require it to be _cheap_? AWS + a domain name (`www.the-moon-really-is-made-of-cheese.com`) seems like something that is available to everyone.

- It's not cheap

- It's not easy

However, it's a platform of speech that is available to everyone, and if you want to scale to millions, one can add the same caching layers etc that FAANG can. We've seen this kind of thing when people were banned from twitter or whatever, and ended up migrating to other twitter-competitors. Newspapers are not required to post our opinions, and I don't think youtube is either.


> AWS + a domain name [for video hosting]

Have fun with that egress bill when your video goes viral.


"that's a good problem to have" doesn't really work when it bankrupts you/your company.

> This is such a tired argument. YouTube is ubiquitous - it is the primary (read: effectively only) platform for video hosting

Far from it. You can host your video anywhere you like, but you get the most eyeballs and revenue from YouTube, and that's based on user preference and not on some nefarious activity on their part. A lot of what makes them successful as a hosting platform is the same content curation that people like to complain about.


This whole argument has been repeated many, many times on HN. Can we stop letting this stuff devolve into generic honest discourse vs corporate nihilism debates? At least mention something about COVID so I know which thread I'm in.

> You can host your video anywhere you like, but you get the most eyeballs and revenue from YouTube, and that's based on user preference and not on some nefarious activity on their part.

Maybe not "nefarious" exactly. But YouTube does have a virtual monopoly. And that's arguably illegal in the US. Or at least, it exposes them to additional government oversight.


In what sense does YT have a monopoly?

In the sense that it is the default go-to platform for many people when searching for (video) content.

Being the most popular platform doesn’t make something a monopoly.

Instead of arguing about the semantics of "monopoly," it could be fun to ask "does YouTube have power that we as a self-governing people should regulate?"

I think it does. Others disagree. Sometimes it's an interesting discussion.


If there's no substantial competition, it's a monopoly. Intentionality isn't that important. It's about market power.

> A lot of what makes them successful as a hosting platform is the same content curation that people like to complain about.

I'm not sure about this. Besides removing obvious hate speech, which sort of content curation do you mean? I thought its success was due to everyone being able to upload and share video clips. I don't think curation was an important feature.


YouTube is perceived as a family-friendly, PG to PG-13 brand. It would be a very different, and much more niche, environment if it were a free-for-all of pornography, obscene language and graphic violence.

I agree, but notice that except for pornography, which I forgot to include, I covered all of the others (sort of) with "hate speech". Banning violence/insults/pornography/harassment is an obviously useful filter for a PG-13 site, but I don't think it qualifies as curation. To me curation implies something related to the selection of quality.

>I don't think curation was an important feature.

Maybe it wasn't in the beginning, but Youtube is a Google company now, and their entire business model is based on (mostly) algorithmic curation. The business model of every content creator depends upon catering to the preferences of that algorithm, and their success at doing so determines their value to advertisers. Entire genres of content have come into being and have been destroyed overnight because of the algorithm.

Youtube's success mostly comes from driving discovery of new content through recommended channels and videos.


That was my point, it wasn't in the beginning. YouTube's initial success wasn't caused by curation, which is the implication I was replying to.

I'm not sure the discovery/recommendation algorithm is the same as curation. It seems to me these are related but distinct fields.


What YouTube decides to show in the list of videos when you go to www.youtube.com is content curation. It's a very important feature.

> The fact that it is so widely used and owns such a significant share of the "ways to spread information" market outweighs the fact that the company is private.

That argument seems like a slippery slope toward reverse privatization. Who determines what companies are allowed to censor? Who censors the censors? Companies censor content because they are primarily afraid of liability (legal, political, cultural). There are other video sites out there. Just because YouTube is popular doesn't prevent people from hosting content on other platforms.


It’s not unprecedented to have different rules apply to private companies past a given reach.

We could see it the same way we see banks. Banks are private entities but tightly regulated and their activity is overseen by their operating country.

Applying a given set of restrictions and requirements to services used by more than 50% of the population, for instance, could be a thing.

In the US I know people would cry bloody murder, but other countries could do with sensible rules of that kind.


One thing that really bothered me was how Australian ISPs handled censorship of the Christchurch shooting videos. They started proactively blocking not just websites that hosted the video, but also those that merely linked to it - even forums where it was posted by users and the moderators have a general hands-off policy.

It was strictly private censorship at the time - Australia didn't pass legislation to censor it yet. But it was coordinated across the entire industry - basically all ISPs used the same list, so the customers had no choice.

It was also completely non-transparent - the blacklist was secret, and they wouldn't even confirm or deny whether any given website doesn't open because it's on that blacklist.

Yet, since the government was not (officially) involved in any of that, there was no review and no appeal...


And just because your house is already hooked up to your electric utility doesn't mean you can't build your own generator and source your own fuel! Clearly the utility should have near complete immunity and let an automated algorithm cut off your power indefinitely for vague, unspecified reasons.

We could have antitrust enforcement that prevents companies like Alphabet and Facebook from becoming so dominant in our society, but if we are going to have such dominant corporations, we cannot allow unbridled censorship of speech on their platforms without undermining a pillar of societal strength, prosperity, and happiness: the marketplace of ideas.

According to what though? Walmart is ubiquitous; do they have to let people protest inside stores? No. Do they have to sell antivax books? No. You could maybe argue YouTube is closer to a pseudo-public place like a mall, but even there you couldn't force a mall to host a protest on behalf of someone else or to sell something. What exactly is your end game? Create a new type of forum (like a physical forum) where private companies are compelled to act according to someone else's standards?

By the way, you might know that in California, shopping centers may have to allow people to engage in some expressive activity on their private property, even when they don't endorse the speakers' message:

https://en.wikipedia.org/wiki/Pruneyard_Shopping_Center_v._R...

(I don't mean to suggest that this provides some kind of obvious solution to Internet content moderation disputes.)


Thanks! That’s what I was trying to refer to. But that seems different in a few ways. First, there are actually physical open spaces in malls where the public is invited—you don’t have to shop. Second, the protest in a mall is ephemeral. Youtube is like a store in the mall, not the open forum (I don’t think Macy’s would be required to have a protest in its store); the internet is like that open space. Third, extending this to YouTube would require YouTube to proactively host a video on its servers, pay for the costs to play it, and host it forever (until the protestor takes it down).

Those do all sound like significant differences between a physical mall and a video hosting service.

Walmart is not ubiquitous in terms of property the same way Youtube is in terms of streaming video. If you want to protest you can easily avoid Walmart. Likewise if you want to publish a book you can easily avoid Walmart.

If you want to communicate to a large audience with streaming video in the US (and many other nations), it's Youtube or nothing unless you have an already established audience (which you probably established on Youtube).


I don’t follow. When you protest at Walmart you aren’t guaranteed an audience either (people can ignore you and walk away). And when you make a video you don’t get granted an audience either. Why should YouTube have to help you out?

And people can walk away on Youtube, too. That's the equivalent.

Having an audience and Youtube not taking it away, is not the same thing as Youtube intentionally helping you build your audience.


YouTube/Google/Alphabet have repeatedly said they are removing things about COVID19 from non-government sources. They cannot simply be written off as a private agents who can do what they want with their platform, considering their dominance and monopoly on content.

This is the textbook definition of Orwell's Ministry of Truth. Google is either becoming the Ministry of Truth or, by its policy, allowing governments to be the Ministers of Truth, with no room for any dissent.

This should be greatly troubling.


YouTube is not the ministry of truth, it is a popular entertainment website that people visit voluntarily. People very incorrectly treating YouTube like it's some kind of authoritative source of knowledge is the problem, not YouTube.

Youtube is treating Youtube like an authoritative source of knowledge.

Marketing is not reality; this should be pretty obvious on HN.

They're not doing this for marketing reasons. They're doing it because they believe by controlling YouTube they control what people think. Even if they're wrong it's dystopian and suggests a megalomanic attitude.

Remember this is the same company that controls Chrome. It's all fun and games right up until people start discovering their Chromebooks and Androids are blocking web sites on the grounds of "misinformation"


Use dailymotion, or vimeo, or host your own site.

I’ll reiterate a point I’ve made before: the ultimate question is whether YouTube (and other social media platforms) should be permitted both the power to control content on their platform and the freedom from liability of a common carrier. They exist in a limbo where they have their cake and eat it too—which I think is ultimately untenable.

The problem we face is that social media is about echoing information, not validity, which is how misinformation spreads so rapidly, because it sounds like what people want to hear.

There is no precedent in human history for such a thing: a private entity which controls so much public understanding and opinion.

Making YouTube liable for content would go a long way in helping keep YT content honest, and solve the moderator problem. YT will either moderate or die.

The reason YT could get so popular is that they neither have to moderate nor fact-check, but still get the benefits of being informational. This is true for ALL social media, unfortunately. How would one go about moderating the thousands of hours of content produced every hour on TikTok? Etc.


For what it's worth, I do not think people in the past were more informed than people today. If anything, more people are more informed today than at any time in history.

What we're seeing is the breakdown of consolidated narratives. It used to be that people's beliefs were all wrong in the same way. You'll note that this looks basically identical to everyone being well informed.

Now that we've moved into a regime of information abundance it looks like the world's gone crazy, when really it's just decohered.

That said, youtube as a vector is spreading the most virulent memetic viruses we've ever seen. Hard to say how it shakes out though, memes aren't all bad. Everything good we've ever done started off as a meme too.


> If anything, more people are more informed today than at any time in history.

Quantity of information and access to it are only one side of the equation. The other side is relevance and the meaning that information affords us. If it doesn’t help us orient ourselves better in the world, if we can’t make sense of it, that information is useless, or even harmful. Decohered is the right word, but I don’t think it is only a product of the drastic increase in the virality of memes. It is also our sense-making apparatus being broken, or hugely behind the information influx the platform creates. This is like informational obesity: an influx of calories that can’t be translated into nutrition. Right now youtube’s policy layer and recommendation algorithms are the only machinery deciding relevance and making sense of this exponential transfer of information. A black-box machine built for maximizing ad revenue will naturally fall short of making us wiser, more relevant or more coherent. In fact, ad-revenue maximization aligns more naturally with us being more consumptive, polarized and addicted.


The problem is that memes themselves undergo natural selection, which favors the "bad" ones overall.

https://www.youtube.com/watch?v=rE3j_RHkqJc


I don't think this is true in the long run, any more than genes which favor their propagation are bad in the long run. In the short term, viruses, cancers, shitty memes, doomsday cults and other low-sophistication replicators are favored.

But long term you get things like multicellularity, complex nervous systems, the scientific method, civilization and art. The road may be rocky but I see no reason for it to be any different here.


It's "bad" for us as individuals with our cultural background. It's not necessarily "bad" for the memes themselves, especially since the ultimate score in this game is to have an entire civilization proactively spreading the meme to others. Christianity, Islam, and other proselytizing religions are a good example.

I suspect that the old, successful memes are good for their hosts by many definitions of good.

> the ultimate question is whether YouTube (and other social media platforms) should be permitted both the power to control on their platform as well as the freedom from liability of a common carrier

I disagree that this is the ultimate question or that this is in any way a problem that merits a change to the laws (as current law explicitly permits this). Websites aren't common carriers, and I don't see a compelling reason to force websites to host content that they disagree with for any reason. I don't see any connection between popularity and the loss of editorial control of your own website.

> They exist in a limbo where they have their cake and eat it too- which I think is ultimately untenable.

Moderation is part of the product. I would argue that the idea that we can effectively constrict companies into a specific type of moderation behavior is what's untenable because it strikes at the heart of the creative freedom to build a website and online community with respect to your own ideals.


To be clear, I think either position is fine. If you want control over what people say that's okay - but maybe then you should be accountable for what they say like a publisher.

If you don't want to be accountable for what people say, that's fine - but maybe then like a traditional common carrier you shouldn't get to control what people say.

The asymmetry of "I control everything you can say but am accountable for none of it" is wrong, in my eyes.


> they are moderating the content on their platform with respect to their business process and company values, those decisions are not a reflection of credibility.

There is a difference between morality and legality. It might be legal for AT&T to cut off your Internet because they don't like you, but it would certainly be immoral.


Having your video removed from someone's website isn't the same thing as having your internet cut off.

How so? Both are private properties.

Do you really not see the critical differences between having your internet cut off and having a website owner remove something you posted to their website? I am honestly having difficulty believing that.

The essential, underlying problem here is that YouTube is a corporate product, just as you say, and yet still has a major influence on the population's aggregate belief, due to its scale of adoption. This is an emergent phenomenon, not a narrative.

> YouTube is a corporate product, just as you say, and yet still has a major influence on the population's aggregate belief

What is your evidence for this? There are so many other sources of influence in this world that I find this idea completely at odds with reality.

https://en.wikipedia.org/wiki/List_of_most-subscribed_YouTub...

Considering the content of YouTube's top channels I think the evidence isn't in favor of that suggestion that YouTube has a noteworthy impact on "the population's aggregate belief"


Well, they roll out a platform where people can upload content. That is one of the main services they offer. YT should be subject to the local regulations for uploaders and viewers, applying no discretionary or arbitrary limits beyond those imposed by regulation or by technical constraints. Imagine a content platform deciding that videos created by <insert group of people> do not align with its values.

A lot of people who blame Youtube for oppressing their right to disinform other people, would immediately oppress other people if they had the chance.

There's only one grocery store in town and I need to buy food, but their prices are too high!! Oh well, guess I will just have to drive to this other town to buy food where the prices aren't so high.

..walks outside to car ..sees local grocery store owner has flattened his tire


This is such a deeply flawed and disingenuous analogy that I find it difficult to imagine you're commenting in good faith.

> I don’t want youtube to be in charge of deciding what is credible or not.

YouTube has to make these decisions because there's no one else willing or able to do the job. If the government established a Department of Social Media to do the same work, HN would be losing its mind over government censorship. And if YouTube does nothing at all, it ends up hosting terrorist recruiting videos and instructional videos on how much bleach to drink to kill coronavirus.

You might be uncomfortable with the role of big tech companies in moderating content but, until you can provide an alternative, your discomfort does not override the imperative to save human lives.


> They don’t have in-house experts on these domains, they don’t have a magic epistemology machine that spits out the “facts”.

What are the other options? Is there some good_or_not(url) API the YouTube service could call? They are forced to moderate content.



I have been following unherd for a bit, and while they are on the "let's do as the Swedes" side of things, these are well-reasoned long-form interviews with established experts and opinion makers.

I don't understand the mechanism by which something like this gets censored. I think it is obvious that this sort of "overcensoring" will come back on youtube in a negative way, and possibly even hurt the case for lockdowns and careful reopening.


“on matters of geocentricity vs heliocentricity we will follow the expert opinion of the Catholic Church”

Does YouTube remove flat earth idiots?


Maybe the people working at YouTube still have a moral and ethical responsibility to minimize the spread of disinformation.

Unfortunately it isn't always cut and dried. As the GP pointed out, once upon a time the heliocentric model was "disinformation".

In more modern times, suggesting that the Gulf of Tonkin incident was misrepresented would be "disinformation". Suggesting that we didn't need to invade Iraq to prevent another 9/11 would be "disinformation". Sometimes yesterday's "harmful disinformation" is simply today's "information".

I think the ethical question you presented is multifaceted. On another axis, do the people who control the most prolific video platform also have a moral responsibility to ensure a free society has the tools to ask difficult questions?



Youtube isn't in charge of what is credible or not. Youtube is in charge of what shows up on youtube or not. There are many categories of things that can't be on youtube, not just things youtube thinks are incorrect.

Should youtube overstep its bounds, that makes it easier for competitors to enter the video streaming market. It seems like libertarian leaning people forget half of their philosophy on this issue.


This gets a bit into monopolies. YouTube is Google and Google prioritizes its own properties. Google is the most widely used search engine and works hard to keep it that way.

Is there a way to easily have popular interviews online outside of YouTube?


Their monopoly is meaningless in the context of all the knowledge freely available within the entirety of the internet, not to mention all the knowledge freely available outside the internet in the form of books, which still happen to be the definitive medium for learning about anything with any serious rigor.

>It seems like libertarian leaning people forget half of their philosophy on this issue.

Because they are "drunk" on theory. To grind out a new video hosting platform is a lot of work.


According to your philosophy, we should already have a competitor to Boeing cranking out new airplanes.

It's called Airbus.

From the ten minutes I watched, in my view he didn’t state anything any more off base than the WHO has over the course of the pandemic. There is uncertainty, and we have yet to understand it thoroughly.

Given that, I think this is gross overreach by YT in establishing narrative. What happens if their management becomes a bunch of anti-vaxxers, do they get to set the tone?


The fact there is a global pandemic going on that requires the cooperation of virtually everyone is important context here.

Meanwhile, YouTube does little moderation of zany philosophical, conspiratorial, or otherwise disruptive, unconventional content. The ideas of Galileo, Socrates, Jesus, etc. fall into this category. That is to say, those figures did not promote public health opinions that undermined a global cooperative effort in a time of crisis.

I don't agree with censorship, but your analogy is off base.


> That is to say those figures did not promote public health opinions that undermined a global cooperative effort in a time of crisis.

I think the figures running the Inquisition and the counter-reformation at the time of Galileo might have disagreed. Remember the 30 Years War was also ongoing from 1618-1648 during the whole time, a Catholic-Protestant religious conflict that engulfed all of Europe and resulted in 8 million deaths.

Bucking the authority of the Church/inquisition at that time was seen as a serious issue against social order in a way that’s hard for us to understand from a modern perspective.

BTW, I strongly disagree with this video, but I'm really not sure if YouTube removing it is the right way to do this or will be effective in suppressing virus truthers.


True, but Galileo's work was not a direct commentary on said crisis (30 Years War). On the other hand, this "professor" is presenting an analysis of an ongoing crisis and suggesting concrete actions on how it should be handled.

> Bucking the authority of the Church/inquisition at that time was seen as a serious issue against social order in a way that’s hard for us to understand from a modern perspective.

Exactly my point. You can criticize the powers that be and the social order today with impunity on YouTube.


>The ideas of Galileo, Socrates, Jesus etc. fall into this category. That is to say those figures did not promote public health opinions that undermined a global cooperative effort in a time of crisis.

Then some Pharisees and scribes came to Jesus from Jerusalem and asked, "Why do Your disciples break the tradition of the elders? They do not wash their hands before they eat."

(...)

Jesus called the crowd to Him and said, "Listen and understand. A man is not defiled by what enters his mouth, but by what comes out of it. (...) For out of the heart come evil thoughts, murder, adultery, sexual immorality, theft, false testimony, and slander. These are what defile a man, but eating with unwashed hands does not defile him."

(Matthew 15:1-2, 10-11, 19-20)


While that may be construed as relating to public health, that is not evidence of a global crisis nor coordinated response to said crisis.

These censorship patterns are identical in intent to those employed by China. The practical consequences are the same as well.

In China information censorship is due to the government trying to promote peace and harmony. Aka, the government thinks it knows what’s best for people and forces it on people.

Facebook et al also encounter these same problems, where they see people spreading “misinformation.” It’s a hard problem to solve, but they essentially resort to the same types of censorship as China that everyone so easily criticizes.

The irony is Google is now doing it too after claiming to have left the market due to forced censorship.

Some people may cite China’s censorship/banning of Falun Gong. What people don’t bother to look into is that Falun Gong is an anti-gay, really out-there cult. Their justification for censoring it is not unlike the justification for censoring “fake news” like this “doctor.” While it may be right (though their methodology may be questionable), it completely short-circuits the ability of critical thinkers to actually analyze all content. It’s done supposedly for the greater good.


As long as you can host your video on your own website, that's not the same. If you own a website, you have every right to do any censorship you want. Now if your government tries to fine or arrest you because you hosted some video on your website, that's censorship. You can avoid youtube, but you can't avoid your government.

That said, I agree that huge websites like youtube, facebook, instagram are something more than just another web resource and probably some regulations should be applied to them. But it's very sensitive subject.


> As long as you can host your video on your own website, that's not the same.

Very few people even within this peer group have the capability of hosting videos on their own website unless the video is intended for a very small audience (say, less than 1000 viewers per day).

If you want to reach any significant number of people, you absolutely need to go through one of the major video hosters or pay for a CDN. And that's before you consider the network effects of YouTube.


Then they don't need to use video, they can use written words.

There's no reason why they should be able to use the infrastructure of private companies to scale their message as much as they want.

Or they can just pay for a CDN as you say. This just highlights the differences between the West and China even more. In the west, harmful messages need to stand on their own two feet. If they don't have support, then they can be kicked off private platforms. No need for the government to intervene.


CloudFlare, the biggest CDN, despite their words and claims, routinely censors sites while defending its hosting of terrorist sites as free speech.

https://www.fastcompany.com/90312063/how-cloudflare-straddle...

> the company serves at least seven groups on the U.S. State Department’s list of foreign terrorist organizations, including al-Shabab, the Popular Front for the Liberation of Palestine (PFLP), al-Quds Brigades, the Kurdistan Workers’ Party, al-Aqsa Martyrs Brigade, and Hamas.

> CEP has sent letters to Cloudflare since February 13, 2017, warning about clients on the service, including Hamas, the Taliban, the PFLP, and the Nordic Resistance Movement. The latest letter, from February 15, 2019, warns of what CEP identified as three pro-ISIS propaganda websites.

So CF bans even remotely right-leaning content but claims terrorist organizations' websites are free speech:

https://blog.cloudflare.com/cloudflare-and-free-speech/


CloudFlare is under no obligation to follow a policy you find logical. If you dislike their business practices or find them problematic you might contact your local elected officials and request regulation. Otherwise you’ll have to wait and see if the magic of the free market changes anything.

All I am pointing out is the hypocrisy and false statements CF and others make. CF's actions are quite different from their words. They say they need to allow dangerous terrorists because of free speech, but then ban other speech which they don't like.

> you might contact your local elected officials and request regulation

Voicing my opinion on a public forum is one way to do that.


Why is everyone downvoting comments without even explaining?

And then the CDNs no-platform them, and then someone who agrees with the Wojcickis of the world DoS-attacks you, and then you're gone.

This argument ends with "just build an entirely parallel internet". It incentivises radicalism inside tech and communication companies. It incentivises the "play to win" mentality we see in universities in which the hard left relentlessly attacks anyone they disagree with using any tactic they can get away with, which are often illegal tactics when the rules are enforced by people who agree with them.

This stuff ends with a totally lawless society, a la China. It's the logical end-game of allowing the hard left to dominate: the rules are just words on paper, what matters is your ideological loyalty to authority. We can see it happening before our eyes and it's terrifying.


Video hosting scalability is not the problem it once was if you use a technology like Webtorrent, which allows peer-to-peer distribution of your video files. See PeerTube for a relatively easy self-hosted video platform.

Even putting Webtorrent aside, a simple server with a gigabit connection lets you handle a hundred simultaneous viewers at a reasonable bitrate, all day long. If you're constantly getting more than that, you can surely afford a CDN or a proper infrastructure.
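A quick sanity check on that arithmetic (a sketch; the 10 Mbit/s per-viewer figure is an assumed bitrate, roughly solid 1080p—actual streams vary):

```python
# Back-of-the-envelope capacity for a self-hosted video server.
# Assumption: ~10 Mbit/s per stream (decent 1080p); real bitrates vary.
link_mbps = 1000     # gigabit uplink
stream_mbps = 10     # assumed per-viewer bitrate

viewers = link_mbps // stream_mbps
print(viewers)  # 100 simultaneous viewers, matching the estimate above
```

At a 720p-ish 3 Mbit/s the same link would carry roughly 333 concurrent viewers, so the estimate is fairly conservative.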

I do agree that the network effects of YouTube are very hard to compete with, however.


I'm not a lawyer, but relentlessly downloading and then assisting in the distribution of content I've not personally validated sounds like a legal minefield filled mostly with mines and very little green space.

I have to pay a lot of money to put some words up in Times Square, versus posting a paper sign on my local telephone pole. Is this different?

Torrenting is a pretty good CDN. Then you just need to host a magnet link

That's actually a really interesting thought. I wonder if you can just serve a tiny JS torrent client and a magnet link and automatically download the content. You would need to have a way for visitors to seed the content too, otherwise you still have to serve everything anyway.
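For what it's worth, the magnet link itself is just a small URI embedding the torrent's info hash, so hosting one is trivial. A minimal sketch of building one (the info hash and tracker URL here are purely illustrative):

```python
from urllib.parse import quote

def magnet_uri(info_hash: str, name: str, trackers: list) -> str:
    """Build a BitTorrent magnet link from an info hash (xt), a
    display name (dn), and optional tracker URLs (tr)."""
    uri = f"magnet:?xt=urn:btih:{info_hash}&dn={quote(name)}"
    for tr in trackers:
        uri += f"&tr={quote(tr, safe='')}"  # trackers are fully percent-encoded
    return uri

link = magnet_uri(
    "c9e15763f722f23e98a29decdfae341b98d53056",  # hypothetical info hash
    "my-video.mp4",
    ["wss://tracker.openwebtorrent.com"],        # WebSocket tracker for browser peers
)
print(link)
```

Browser-side seeding is the hard part, as the parent notes: visitors only contribute upload capacity while the tab stays open, so you still need at least one always-on seed.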


> If you own a website, you have every right to do any censorship you want.

The case of the baker refusing to decorate a cake for a gay couple comes to mind. He's in court for the third time. The same people who talk about "private companies can do whatever they like" attack the baker.

Soon this will evolve into letting private companies censor minority opinions. Don't forget, just a few decades ago, LGBT rights opinions weren't mainstream and would therefore have been banned by private companies.

EDIT: Please explain the downvote.


> If you own a website, you have every right to do any censorship you want.

There is a difference between can and should. They can, but should not.


Why not? Should revnode-videos.net not be able to decide which videos it hosts, pays for in storage and bandwidth?

> not be able to decide which videos it hosts, pays for in storage and bandwidth?

I didn't say they shouldn't be able to. I said they shouldn't, even if they are able to.

> Why not?

Why should they?


Did everyone forget what happened to these unsavory characters? https://en.wikipedia.org/wiki/The_Daily_Stormer#Site_hosting...

Someone I know rather well is their sysadmin, and their site is still up: https://dailystormer.su/

Admittedly, they’ve had a hell of a time, but the hard data is that it continues to run. I don’t condone domain names or hosting accounts getting shut down based on content (ANY content) but it appears that it is still at least a little bit possible to host the most extreme of incorrect/offensive content, it’s just not convenient.

I imagine they might even still be on Cloudflare if they hadn’t promoted the (incorrect) theory that Cloudflare not canceling them was actually a dogwhistle endorsement of their content. Cloudflare cited this incorrect claim as the reason for their termination.

That said, large hosting platforms used by majorities of the population (eg YouTube, Facebook, Instagram) should absolutely have to carry data uncritically, just like the phone company, or at least as long as they are actively trying to kill the federated open web with things like censored native app stores, censored browser extension indices, censored and surveilled messaging platforms, and AMP. Things would be quite a bit different if our hardware didn’t forcibly opt us in to dictatorial censorship of native apps.

It also grinds my gears that they call it “community standards”, which are weasel words to pretend that their userbase likes being forced to watch only what their dictator approves of. They’re unilateral, and invoking the term “community” is a dishonest, unprofessional move.


The Daily Stormer is on a Soviet Union domain. That's really ironic.

> If you own a website, you have every right to do any censorship you want.

I absolutely agree with you here, but we also have the right to criticize and/or stop using platforms that do this. Your parent post said it was wrong, not that YouTube didn't have the right to do it.


The difference between practically the same and purely the same is minimal.

In China you can easily get around the firewall if you try hard enough.

In American you can easily distribute your video on your own website if you try hard enough.

Do you see how these two things are fundamentally the same?


These huge websites are using public rights-of-way and wireless spectrum to transmit their data. It'd be like the phone company censoring your speech. The 1st Amendment ought to apply to them unless they are using a fully private means of distributing their information.

Everyone uses public rights-of-way to communicate. There's nothing stopping someone from setting up their own site using the same public infrastructure.

> The practical consequences are the same as well.

Is YouTube sending people to "re-education camps" now?

If a private platform kicks you out, you can host your stuff elsewhere. If a state censors you, it can use lethal force to silence you. Pretty big difference.


In the good old times of Soviet power, there were other oppression mechanisms besides lethal force. In fact lethal force was only rarely used, especially after Stalin. A couple of examples:

* Send dissenters to perform menial jobs in some remote small city with little impact on the national scale. Technically, the Gulag itself was an application of this strategy, with a particularly harsh destination and a particularly dangerous occupation.

* House arrest. Keep dissenters in their homes, and carefully control whom they come into contact with.

The end effect is the same: drastically limit the platform available to dissenters, without outright murdering them.


And those pale in comparison to the worst crimes of YouTube:

- demonetizing videos that advertisers don't want to be associated with

- age-restricting highly questionable content

- deleting videos which I, like, worked really hard on, and which I didn't even say anything directly violent

- ARTIFICIALLY DEFLATING FluoridePoison777's subscriber counts

Seriously: your argument is offensive to victims of totalitarian oppression.


?

There is only the simple observation that there is a spectrum: free speech -> deplatforming [ban from youtube & co] -> deinfrastructuring [ban from paypal / aws & co] -> depersoning [physical exile to periphery] -> lethal force.

I seem to remember a time of nuanced conversations. But perhaps online discourse was always an adversarial contest between two strident extremes. Enjoy.


No, you are just playing word games. "Free speech" does not and has never meant "the right to get any letter to the editor published in the New York Times" and therefore does not mean "the right to get any video I want on YouTube." So it is not a "simple observation" that deinfrastructuring -> depersoning is a spectrum. There is no such spectrum. There is an enormous gap between "society thinks you're icky and big corporations won't host your content" and "the state sends you to Siberia," since one is clearly a human rights violation and one is clearly not. Likewise there is an enormous gap between "Simon and Schuster rejected my novel!" and "the FBI busted the printing shop I hired to print my novel." YouTube removing this guy's video is very clearly in the former category. I am aware that it is out-of-reach for many individuals to create their own website which can host videos. But a right to free speech does not mean a right to any content distribution platform you want.

It also whitewashes the issue at hand: this professor is a notorious crank and has been a notorious crank since long before COVID-19. This is not a free-thinker taking a bold stand against a sclerotic public health bureaucracy. This is a con artist spreading dangerous misinformation during the worst public health emergency in 100 years. YouTube removing his content is not just them suppressing views they politically disagree with; it's also individual employees at YouTube viewing this interview as an imminent threat to their families' health and safety. This is a very reasonable view for private individuals and private corporations running private infrastructure to take.

Look: if he starts his own website self-hosting COVID quackery, and the government shuts it down, then that really is Soviet-level and I will take your argument more seriously (though given the vicious threat COVID-19 poses, I am open to the idea that this is an extreme but necessary use of the government's emergency powers). But it is simply ridiculous to say that YouTube can't moderate their own content, especially when 99% of the world's doctor's and public health officials would describe the video as dangerous misinformation.

But your self-pitying shtick about how you're a lone voice of moderation in a sea of adversarial extremes is nice. It's not quite as nice as bringing back 90,000 Americans from the dead but it probably made you feel nice about yourself.


I'm sympathetic to Chinese-style censorship, at least in theory, before it degenerates into the state trying to suppress anything it considers embarrassing to itself. "The government thinks it knows what’s best for people" because it does know better than most people. Facebook, as the lowest common denominator for American public discourse, is a cesspool of misinformation, pseudoscience, hate, deceptive advertising, and people losing their skin in MLM scams. In India, fake stories on WhatsApp have whipped up violent mobs into a frenzy of xenophobia, resulting in tragic murders of innocent migrants. We should be more aware of the enormous costs associated with a laissez-faire attitude toward speech, even if we decide it's worth the cost. "A person is smart. People are dumb, panicky, dangerous animals and you know it."

All that being said, the practical consequences of YouTube moderation are not even remotely the same as those of the Chinese censorship that everyone criticizes. The worst thing that can happen to you on YouTube is that you can't comment or upload videos anymore. YouTube is not sending people to prison or calling up your employer to tell them you're a dangerous agitator.


> YouTube is not sending people to prison or calling up your employer to tell them you're a dangerous agitator.

Correct. This function is outsourced to Twitter mobs.


Why on earth does this have to be compared to China? And Falun Gong?!

This is long form journalism with establishment figures such as university professors.

YouTube censors clearly overstepped their "mandate" here. And Unherd has other ways of publicising this particular interview and drawing attention to YouTube's censorship.


Yep. It seems to have started with the 2016 election, which left the popular opinion that a huge part of the population is too stupid to think for themselves and must be shielded from bad ideas in case they believe them and harm the rest of us with them. Now that such an idea is readily accepted, censorship is easily seen as the morally right thing to do in lots of cases. The feeling is that we're protecting ourselves from real harm by people who really don't have the ability to correctly evaluate the information they receive. Just like China doing it for social harmony. It's not just terrorism and child porn that people tolerate censorship for anymore; it's 5G and flat earth conspiracy theories, anti-religious ideas, nationalism, and all sorts of "misinformation".

That a huge part of the population chooses unwisely and footguns both themselves and the rest is quite obvious. But the real, long-term solution is education, not censorship. We also have to admit that we are in an ultra-connected world like never before, where fake, easy news spreads more easily than ever. So critical thinking is a necessary skill to be taught.

> That a huge part of the population chooses unwisely and footguns both themselves and the rest is quite obvious.

Twitter and media hyperbole aside, how exactly is the current admin that much different from the previous? Bush #2 had fake WMDs and leveraged them to go to war. A war that cost hundreds of thousands of lives and a ridiculous amount of money. The last admin bailed out Wall Street (with few concessions), re-signed and expanded the Patriot Act, and floated the Paris Climate Accord as a victory (though Naomi Klein says otherwise). Over both of these admins and prior, income inequality has increased.

We've been getting shot in the foot, the head, the arse, etc. I'm not a DJT voter or fan, but the idea that he's the problem is naive. If Washington, D.C. had been doing its job, there would have been no opportunity for DJT to rise. Trump is a symptom. Let's not be foolish and blame the symptom.


Those Twitter outbursts you dismiss are public statements from the highest executive office. Provoking hate crimes is not something to be dismissed.

I didn't dismiss them. I asked how those are so much worse than the transgressions of the previous two admins. I'd like to remind everyone that an estimated 100,000+ non-white-skinned _civilians_ were killed in Iraq. Alleged hate crimes vs. real war crimes.

Bush caused the deaths of hundreds of thousands of people with the Iraq war, and you're upset about some tweets?! Why do people have such a distorted view of right and wrong?

Agreed. As for why, it's obvious. It's the media. It's going to be interesting to see what happens when Trump signs off on extending the Patriot Act. A law that was renewed and expanded by the previous admin.

Maybe they shouldn't have been forced to choose between Hillary/Biden or Trump if you wanted them to choose "wisely". The responsibility for losing to Trump falls solely on the Democratic Party, and the talk of footguns and education comes off as extremely snobbish. Stop blaming others for your failures.

Consider that it is far easier to spread misinformation/disinformation than it is to educate, and that over time there are going to be even more powerful forms of disinformation in the form of deep fakes. People have a hard time distinguishing real from fake now; it's only going to get worse from here.

On one hand you can say that nothing should be censored and the onus is on the viewer to critically evaluate information. But the reality is that misinformation/disinformation is causing real harm to people, populations, and civilization itself.

For example, propaganda on Facebook in Myanmar has fueled mass killings against the Rohingya. Anti-vax misinformation is lowering vaccination rates and causing a resurgence of diseases that were almost eradicated. Should companies like Google and Facebook just turn a blind eye to all of that and watch brutality and backwardness thrive and tear down progress? At what point should the line be drawn?

Make no mistake, there are malign actors out there who are weaponizing misinformation as we speak. If we choose not to do anything about it, they will have an out-sized role in shaping the future to their ends.

It certainly is a very difficult question to figure out where the line should be drawn, and even then it's a moving target. But it's a question worth wrangling over, and ultimately if mankind is going to meet the challenges of now and the future, the threat of disinformation has to be met one way or another.


>These types of censorship patterns are identical to that employed by China in intent.

At the end of the day, YouTube's intent is to make as much money as possible. If YouTube didn't censor a lot of misinformation videos, people would get mad and demand that advertisers stop advertising on YouTube.


> These types of censorship patterns are identical to that employed by China in intent. The practical consequences are the same as well.

True, if you ignore the fact that China is a government and YouTube is an internet video portal. And that there are sites like HN that can point it out. And that YouTube isn't taking steps to cover up what they've done.

But yeah, let's compare the policing of bad information in a pandemic that gets people killed, where YouTube doesn't benefit from the "censorship", to what China's government is doing. At least you get to feel good about your purity when you're posting comments on HN, secure in the knowledge that you don't have a teenager or a parent that might die as a result.


This comment breaks the site guidelines by being snarky, crossing into personal attack, and taking the thread further into flamewar. That's not cool here, regardless of how right you are (or feel) or how wrong someone else is (or you feel they are), so please don't.

If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and sticking to the intended spirit of the site, we'd be grateful.


> At least you get to feel good about your purity when you're posting comments on HN, secure in the knowledge that you don't have a teenager or a parent that might die as a result.

What's this ad hominem crap doing here?

I don't know anything about the poster you're replying to, but I think I'm safe in assuming that they don't want your teenager or parent to die either. It's acceptable--essential, even--to be concerned about more than one thing at a time, even during a pandemic.


[flagged]


> as people are defenseless and suspend all skepticism in front of suffering children: nobody has the heart to question the authenticity or source of the reporting.

But that's not actually true. It's nearly impossible to express concern for children in any context on the internet without being mocked and dismissed for having invoked a "won't someone think of the children" ad hominem.

It's strange how deeply that particular meme has dug into the psyche of internet culture, and has become almost an unwritten rule.


In your example, the people saying "won't someone think of the children" are the culprits of pedophrasty.

Once you conceive the concept, you see it all over the place. See here for a more in depth explanation: https://medium.com/incerto/pedophrasty-bigoteering-and-other...


> True, if you ignore the fact that China is a government and YouTube is an internet video portal.

That's the way the west perceives the situation - although over there they might see this as 'powerful unelected corporation which is legally mandated to maximise profits arbitrarily censors with no oversight, in a country where these unelected corporations control parts of the government'.


Firstly, I agree with rcoveson; there are a lot of complaints to be made about your second paragraph. Including that "a teenager ... that might die" is emotional misinformation - they almost surely won't.

The difference between governments doing it and YouTube doing it is that people do have the practical choice of leaving YouTube if the situation is deemed intolerable. Up until that point there isn't a practical difference between YouTube and China if they employ the same censorship methods.

> And that YouTube isn't taking steps to cover up what they've done.

The Chinese government isn't exactly subtle; they are promoting peace and harmony through censorship. A lot of people died in the last Chinese revolution and its aftermath, so they do have a reasonable argument that unstable government will kill people. It is just that we know from experience in the west that transparent democracies are more stable, because problems tend to get dealt with when they affect large groups of people.


> where YouTube doesn't benefit from the "censorship"

Of course they do. You think they're spending all that money out of the kindness of their hearts?

Way to dodge actually engaging with the principles at play here. You can certainly assert your consequentialist/utilitarian arguments, but not everyone agrees with that ethical framework.


What ethical framework yields non-insane results that isn't consequentialist/utilitarian?

Consequentialism also yields insane results. Look up "the repugnant conclusion". If ethics were settled, it would no longer be a philosophy but a science.

Thanks for that topic, it's very interesting. It will take some time to digest.

Stop this "bad information gets people killed" nonsense. You can justify any kind of censorship with that. That's what China does too. If they allow protests, they'll get violent and people will get killed. If the government is overthrown, millions of people will probably be killed or die from the ensuing chaos.

Youtube, Facebook, etc. in some ways have more power than the government because they can influence public opinion. They also set the standard for what's morally acceptable and people carry that standard with them elsewhere in their lives. Eventually, they'll probably vote for the government to enforce it because it's so popular.

I once met an American who believed hate speech was a crime. It turns out that a lot of young people believe that. No doubt because it is a "crime" on all their social media platforms. Some of them already want to make it a real crime when they discover it isn't yet.


[flagged]


Given Google's size and power it's only natural the Chinese and plenty of others would have insiders. How far up the influence ladder they've climbed is another matter.

Just the same, the US is doing it as well, abroad as well as at home.


[flagged]


How is it racist?

The dragonfly debacle provides at least some credibility to such an idea: https://en.wikipedia.org/wiki/Dragonfly_(search_engine)

It's impossible to produce a search engine compatible with the Chinese market without having some fairly high-placed contacts in the Chinese government, because it would take some very high-level approval to allow such a product.


"Chinese" in this context pretty obviously means "acting on behalf of the Chinese government", not "having Chinese heritage".

> These types of censorship patterns are identical to that employed by China in intent. The practical consequences are the same as well.

They're not identical; the state doesn't determine YouTube's actions.


Legislators put pressure on them, e.g. by threatening to repeal the CDA or bring other regulation if they do not self-regulate and "do something" about content they deem problematic for one reason or another.

That's funny, I always thought it was businesses lobbying the government to change policies.

Those things are not mutually exclusive. Politicians ask for things that make them look good; companies ask for things that benefit their bottom line. As long as conspiracy theorists are not YouTube's cash cow and tax cuts can be sold as improving the economy for everyone, both sides benefit.
