
YouTube reportedly discouraged employees from reporting toxic content - vezycash
https://www.theverge.com/2019/4/2/18292530/youtube-toxic-conspiracy-video-employees-internal-report
======
manfredo
Seems like an alarmist headline but not really very alarming when you dig into
the details. First of all, this is from "more than 20" staffers. Out of a
company of 2,800 employees and who knows how many content moderators on
contract.

> One proposal offered a way to keep content that was “close to the line” of
> violating policies on the platform but remove it from the recommended tab.
> YouTube rejected that suggestion in 2016, a former engineer said, and
> instead continued to recommend videos regardless of how controversial they
> were.

This does not seem like a bad thing. Why _should_ videos that do not violate
site policies be removed from recommended? This sort of doesn't-break-the-
rules-but-we-just-don't-like-it content discrimination is ripe for abuse, and
would probably cause an even bigger uproar over people's content getting
hidden for being "close to the line" but not across it.

> Employees outside of the moderation team were also reportedly discouraged
> from searching YouTube for toxic videos, for lawyers said the company would
> have a bigger liability if there was proof that staffers knew and
> acknowledged those videos existed.

This one is a bit strange. Why would employees outside the moderation team
seeing a toxic video result in liability? Employees outside the moderation
team presumably aren't equipped to tell which content is and isn't violating
the rules, so I guess this rule might be to curb people trying to go out and
find supposedly prohibited content on their own.

~~~
boomlinde
_> Why should videos that do not violate site policies be removed from
recommended?_

The recommendation system seems out of control and I keep getting
recommendations that I have not only never shown an interest in, but have
actively expressed disinterest in by removing them. If they can't fix that,
I'll welcome potentially biased human curators cleaning up the vile crap that
occasionally pops up there.

~~~
manfredo
This is recommendations working as intended. If all people got were
recommendations for what they already watch, then there'd be no chance for
new creators to get recommended.

Also, I wrote a quick program that uses Selenium to traverse YouTube
recommendations. I didn't find any weird rabbit holes or anything. I can
release the source code if you want, though I'm pretty sure similar tools
already exist.
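(For anyone curious, a crawler along those lines can be sketched in a few
lines of Python with Selenium. This is not the commenter's actual code, and
the CSS selector `a#thumbnail` is a guess at YouTube's markup that will
likely need updating; it's just the general shape: start at a seed video,
follow the top recommendation each hop, and log where you end up.)

```python
# Sketch of a recommendation-walker: start at a seed video, repeatedly
# follow the first recommended video, and record the page titles visited.
# The selector "a#thumbnail" is an assumption about YouTube's markup.
from urllib.parse import urlparse, parse_qs


def watch_links(hrefs):
    """Keep only unique /watch?v=... links, preserving order."""
    seen, out = set(), []
    for href in hrefs:
        if not href:
            continue
        parsed = urlparse(href)
        if parsed.path == "/watch":
            vid = parse_qs(parsed.query).get("v", [None])[0]
            if vid and vid not in seen:
                seen.add(vid)
                out.append(href)
    return out


def crawl(seed_url, hops=10):
    """Follow the top recommendation `hops` times; return titles seen."""
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    trail, url = [], seed_url
    try:
        for _ in range(hops):
            driver.get(url)
            trail.append(driver.title)
            hrefs = [a.get_attribute("href")
                     for a in driver.find_elements(By.CSS_SELECTOR,
                                                   "a#thumbnail")]
            links = watch_links(hrefs)
            if not links:
                break
            url = links[0]  # follow the top recommendation
    finally:
        driver.quit()
    return trail
```

Running `crawl("https://www.youtube.com/watch?v=...")` walks ten hops down
the recommendation chain; comparing trails from different seed videos is a
crude way to look for the "rabbit holes" people describe.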

~~~
boomlinde
_> This is recommendations working as intended. If all people got were
recommendations for what they already watch, then there'd be no chance for
new creators to get recommended._

The problem wasn't that I got recommendations for videos that I hadn't
watched, it's that I got recommendations for videos on topics that I had shown
no interest at all in, and even worse, expressly denied interest in. Maybe
after I've told the system that I'm not interested in watching a feminist
getting rekt by Jordan Peterson for the tenth time the recommendation system
can figure out that I don't care for the topic at all instead of just
continuing to recommend Jordan Peterson videos from the seemingly thousands of
other channels that post them.

~~~
repolfx
There's something incredibly weird about YouTube's algorithm, Jordan Peterson
and Hacker News readers. That's a very common complaint. I've watched a few
videos with him in and now my recommendations are also super-saturated with
his vids.

I actually think he's OK, albeit not anywhere near as strong a speaker as his
followers like to think. And to be honest I did enjoy watching Cathy Newman
getting rekt by him when that video first came out. So I guess I contributed
in some microscopic way to your current pain. But even for me, YouTube acts
like I watch the dude every single day - out of the 8 videos it's currently
recommending to me, 2 of them involve him. This is way out of proportion.

I suspect something about content involving him pushes lots of entirely
legitimate buttons inside the recommendations engine. The organic growth in
his popularity has been astronomical, and video comments are flooded with men
saying watching his vids helped them turn their life around.

So if I had to guess I'd say watching one Peterson video is highly, highly
correlated with watching tons of them. I suspect people who only watch one vid
for a few minutes and then forget about him are a much smaller group
proportionally than most YouTubers. I also suspect that this behaviour highly
correlates with the same demographics that read HN i.e. educated western men
who probably don't watch lots of celebrity music videos. It also wouldn't
surprise me if JP videos are way more virally shared than normal, as he
inspires a somewhat evangelical fervour.

That said, it's weird the recommendation engine can't pick up on being
explicitly told "not interested". It makes me think the engine doesn't know
the videos are JP related explicitly and it's picking up on other
characteristics, so clicking "not interested" on these videos just looks kind
of random and useless ("please don't show me this kind of popular video that
strongly appeals to people like me" ... well ok mr youtuber but what else am i
meant to do?)

------
atoav
It is becoming increasingly clear that the tech giants are unfit to deal with
their responsibility within society. And this is purely because of the
incentives they have, given the decisions they face. Should they do the right
thing and stop downright wrong and inflammatory misinformation from
spreading, or should they recommend even more of it to keep users watching in
an endless cycle?

Note this is not about political bias, as some are mentioning. Inflammatory
videos based on downright lies are not a serious political position. Or if
they are, you should also accept Islamist extremist content, and at least
offer some ideas for how to keep society civilized with content like that
circulating freely.

------
Mirioron
I think this is a good thing. Considering the political bias that seems to be
common in these types of companies, it's better for society if they censor
fewer things.

~~~
mugwort13
Agreed. Companies like Google (Alphabet), FB, Twitter, etc. are trying to be
our moral compass, when they themselves are completely biased. Just stop.
Silicon Valley needs to stop dictating to the world.

