
YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant - petethomas
https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant
======
mzs
>…Fully 81% of all parents with children age 11 or younger say they ever let
their child watch videos on YouTube. And 34% of parents say their child
watches content on YouTube regularly.…

…

>…Some 81% of YouTube users say they at least occasionally watch the videos
suggested by the platform’s recommendation algorithm, including 15% who say
they do this regularly…

[https://www.pewinternet.org/2018/11/07/many-turn-to-youtube-for-childrens-content-news-how-to-lessons/](https://www.pewinternet.org/2018/11/07/many-turn-to-youtube-for-childrens-content-news-how-to-lessons/)

~~~
a_bonobo
I have YouTube Kids on my mobile phone, and its recommendation engine only
recommends child-friendly videos: Blippi, one of the millions of videos where
people play with toy train sets, the Wiggles, etc. It hasn't yet recommended a
neo-Nazi video (though it pays no attention to language, often recommending
Spanish or Russian kids' videos). It's very useful for when I need my toddler
to be quiet for 10 minutes or so.

------
4werfaw34r
Some of what's in the article seems to contradict what YouTube's product chief
said in a recent interview.

[https://news.ycombinator.com/item?id=19523415](https://news.ycombinator.com/item?id=19523415)

The chief contradiction I see: the product chief claims YouTube has been
taking this problem seriously, while this report implies YouTube ignored the
problem for some time, going so far as to avoid searching for the videos in
order to claim plausible deniability. According to the Bloomberg article,
YouTube was aware of the "bad virality" problem for some time, but it was
never addressed.

------
alwaysanagenda
A summary:

Let's protect people from the ideas and content we think are bad, to provide a
better platform for our advertisers; then blame our focus on revenue as the
reason we didn't focus more on censoring the content our advertisers and
corporate leadership find offensive, so that we can continue to generate ad
revenue, which we call "responsible growth."

This article is littered with fascist ideation, like this line:

> “YouTube should never have allowed dangerous conspiracy theories to become
> such a dominant part of the platform’s culture.”

So this "dangerous" information needs to be removed, since it's apparently
harming a public assumed to be unable to think for itself. The same public
that, apparently, likes conspiracy theories.

>“The primary goal of our recommendation systems today is to create a trusted
and positive experience for our users,”

But what if you had a positive experience and were enjoying the way the
algorithm surfaced new and unusual conspiracy videos? Sorry, you're out of
luck. But hey, Google is making sure you get "trusted" information now. What a
relief!

>The company’s lackluster response to explicit videos aimed at kids has drawn
criticism from the tech industry itself.

Lackluster parental responsibility has resulted in explicit videos being
aimed at kids by an algorithm that is incapable of discretion.

An algorithm that, apparently, works so well that humans need to modify it so
that it doesn't popularize popular content we find offensive.

It doesn't matter if the information is true or false. YouTube's
holier-than-thou, you-can-trust-us-to-filter-content-so-your-toddler-can-
mainline-garbage-without-your-input stance on "objectionable" content is
outrageously disgusting.

