
'Fiction is outperforming reality': how YouTube's algorithm distorts truth - sus_007
https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth
======
nullc
I put on some science video and walked away recently... came back later and
the autoplay had navigated itself deep into k00ky flat-earther stuff.

People raising concerns about superhuman AI grey-gooing the planet have their
priorities wrong. Incompetent narrow AI will happily navigate mankind into a
wall when the pattern of bricks by chance happens to look like a tunnel.

Don't be evil... we have automation for that.

~~~
analogic
pro tip: never fall asleep to youtube without disabling autoplay. conspiracy
nut vids make for some interesting dreams.

~~~
smacktoward
Not as interesting as in-depth documentaries on the Donner Party.

(source: personal experience. yikes!)

------
graeme
Interesting results. Though I wonder to what extent some of them reflect the
outside world. (Not reality. Just what people outside youtube focus on.)

The Clintons have been the subject of conspiracy theories since at least the
90s, so it's not surprising that this would extend onto Youtube.

(I haven't made an extensive study of this, but my impression is that the
American right wing is prone to conspiracies about people or events, whereas
the left wing is prone to conspiracies about institutions.)

The article also mentions Trump speeches were more recommended. There's an
innocent explanation: people find Trump more engaging. Of the two
candidates, Clinton was far less charismatic - she even admitted she wasn't a
natural like Bill Clinton was. So it's not exactly surprising that Trump was
more recommended and more watched.

This isn't to take away from the broader thesis. The recommendation algorithm
is certainly worthy of study. And the fact that the algorithm quickly moves
from normal to polarizing is disconcerting.

But, it does seem as though the recommendation algorithm to some extent
mirrors what we see in talk radio and other media segments in America.

~~~
pulisse
_But, it does seem as though the recommendation algorithm to some extent
mirrors what we see in talk radio and other media segments in America._

I'm not sure what you think the significance of that fact is. YouTube's
recommendation system is specifically trying to maximize time spent by users
watching videos, and what's turned out to be the case is that
sensationalistic, politically polarizing, conspiratorial, etc. videos are
_excellent_ for promoting engagement.

It sounds as if you're saying that because such extremist content exists and
engages people in other media, there's no real significance to the fact that
YouTube's recommendation system creates a feedback loop for this content. But
YouTube most definitely has agency there: First, promoting videos to users
isn't just holding up a mirror to reality, but is an overt act. Second, the
design of the recommendation system--what aspects of user behavior to cue off
of and what metrics to optimize--reflects decisions on YouTube's part; they
don't fall out of the sky.

~~~
XorNot
It's to some extent self-destructive for YouTube - I've noticed that when
looking for entertainment I wind up rapidly bored by the recommendations,
because they're only ever a mirror of exactly what I've seen before - they're
not an attempt to challenge or expand my preferences.

Which of course is what you get from blind correlation-driven user feedback -
"did you like this?" - if I'm not sure because I'm now exploring outside my
comfort zone, then that goes back into the algorithm as "no".
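That collapse onto what you've already watched can be sketched as a toy
simulation. To be clear, this is a hypothetical model, not YouTube's actual
system, and the topic names are made up: a recommender with no exploration
just replays the user's first click forever.

```python
import random

def recommend(history, catalog, explore_rate=0.0):
    """Pick the most-watched topic from the user's history.

    With explore_rate=0 this is pure exploitation: the recommender only
    mirrors past behavior (toy model, not YouTube's real algorithm).
    """
    if not history or random.random() < explore_rate:
        return random.choice(catalog)
    # Recommend whatever topic the user has clicked most so far.
    return max(catalog, key=lambda topic: history.count(topic))

def simulate(steps=100, explore_rate=0.0, seed=42):
    random.seed(seed)
    catalog = ["science", "history", "gaming", "conspiracy", "music"]
    history = []
    for _ in range(steps):
        video = recommend(history, catalog, explore_rate)
        history.append(video)  # assume every recommendation gets watched
    return history

run = simulate(explore_rate=0.0)
print(len(set(run)))  # 1: with zero exploration, the loop locks onto the first click
```

Adding even a small `explore_rate` breaks the lock-in, which is exactly the
"challenge or expand my preferences" behavior missing from a recommender
trained only on past-click correlations.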

------
poisonous_dart
I don't know, guys. The world before complex algorithms generating clicks on
the Internet was pretty biased and stupid. I don't think autoplay is going to
be the thing that destroys our collective intelligence for us.

If we want our society to be less prone to fictitious nonsense, we should
promote education that builds critical thinking and a rudimentary
understanding of statistics - but that's often not interesting enough for
many journalists to write about, nor is it an easy problem to tackle.

~~~
skywhopper
I would say that we should start by critiquing the implicit claim by Google,
Facebook, Waymo, and others that new means of computational statistical
analysis can serve as a reasonable replacement for human intuition and
judgment. Turns out some of the world’s most valuable companies are betting on
lots of poor assumptions and easily-gamed algorithms, and we’re just starting
to see the fallout of our failure to call BS on the ever-louder hype machine.

I certainly agree that deep learning has delivered some impressive results,
but the acceptance by VCs and the media that this new wave of AI will bring a
revolution in Internet time to established real-world infrastructure and
processes is a major failure of basic critical thinking on a broad scale.

------
firasd
Youtube personalization is eerily well-tuned from a tech perspective, and has
undoubtedly been good for their business. But one of the least expected side
effects of new technology has to be the fact that Youtube’s ‘related videos’
algorithm has given rise to a generation of Flat Earth believers.

Despite that, I gotta say, I'm uncomfortable about the series of media 'hits'
on Youtube over the past year. Partly that's self-interest as a tech
professional (I want to be able to create a social network without having to
highly moderate everything!) but also, every time it happens, a bunch of
Youtubers who are innocent of what the media is alleging (objectionable
content, etc.) lose their ad revenue. It's true that the 2016 US election has
prompted the Anglosphere to subject social networks to more scrutiny, but when
clamping down on user-generated content (or how it's displayed and monetized)
we gotta look at the collateral damage, too.

~~~
pulisse
_I want to be able to create a social network without having to highly
moderate everything!_

No one (in the context of this debate) is asking YouTube to ban videos.

------
sqdbps
It's not a given that social media had anything to do with the result of the
election; the few scientifically researched reports out there suggest that
good ol’ mainstream media was by far the prime channel of influence:
[https://cyber.harvard.edu/publications/2017/08/mediacloud](https://cyber.harvard.edu/publications/2017/08/mediacloud)

That aside, so what if “fiction was outperforming reality”? It’s a marketplace
of ideas and people choose to spend their time watching what they may; stop
infantilizing them.

I realise that the media has been under “attack” lately (if you really
consider name-calling a serious attack) but they would benefit from some
introspection: not only are they scapegoating social media and calling for it
to be regulated - freedom of speech be damned - they also keep advancing the
invented notion of a “backlash against tech firms”, projecting their own
aversions onto the rest of the world.

See: no real evidence of any such backlash:

[https://www.wired.com/story/davos-big-tech/](https://www.wired.com/story/davos-big-tech/)

[https://www.edelman.com/trust-barometer/](https://www.edelman.com/trust-barometer/)

------
mythrwy
This whole "fake news" and "big tech is horrible" trope is tiresome.

That being said, there is a bunch of nonsense on YouTube, and it is annoying
when trying to find legitimate information. Watch an informative video on
history or archeology or something, and two seconds later they are pushing a
video about aliens creating the pyramids. The worst part is that, because it's
a video, you might have to watch for a few minutes to find out it's utter
rubbish, unlike text, which you can scan quickly.

I don't really expect (nor want) YouTube to curate though; the moral hazard in
that seems even more undesirable. Some kind of expert rating layer would be
cool, though, for objective topics (let's leave politics out - I'm not
interested in that kind of rating layer and don't watch those kinds of
videos). I can't follow every legit poster on every topic I might become
passingly interested in, and it's troubling to see nonsense presented right
alongside informed commentary.

------
aesto
Don't blame algorithms for human nature. That thing is ultimately just
minimizing a loss function and if recommending garbage gets the best results,
that's what it will do.
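The "just minimizing a loss function" point can be made concrete with a
minimal epsilon-greedy bandit. The objective below is pure engagement -
nothing in it rewards accuracy or truth. The item names and engagement
probabilities are invented for illustration; this is a sketch of the general
technique, not YouTube's actual system.

```python
import random

def train_recommender(engagement, steps=5000, eps=0.1, seed=0):
    """Toy epsilon-greedy bandit. The implicit 'loss' is just negative
    engagement; quality never enters the objective (illustrative sketch,
    not YouTube's real recommender)."""
    random.seed(seed)
    items = list(engagement)
    estimates = {item: 0.0 for item in items}  # estimated engagement rate
    counts = {item: 0 for item in items}
    for _ in range(steps):
        if random.random() < eps:
            item = random.choice(items)           # explore
        else:
            item = max(items, key=estimates.get)  # exploit best estimate
        # Simulated user: clicks with probability engagement[item].
        reward = 1.0 if random.random() < engagement[item] else 0.0
        counts[item] += 1
        # Incremental running-average update of the estimate.
        estimates[item] += (reward - estimates[item]) / counts[item]
    return max(items, key=estimates.get)

# Hypothetical engagement rates: the sensational clip simply engages more.
winner = train_recommender({"sober_documentary": 0.30,
                            "conspiracy_clip": 0.55})
print(winner)
```

If "conspiracy_clip" has the higher simulated click rate, the bandit ends up
recommending it - not because anyone told it to favor garbage, but because
garbage scored better on the only metric being optimized.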

------
thoaway1
The Guardian calling YouTube out for distorting truth? This is the pot calling
the kettle black.

~~~
w_t_payne
The Guardian certainly has a strong editorial position, but I've never found
its reporting to be anything other than honest, well researched, and written
with integrity.

I think that they believe what they write with a high level of conviction,
which is more than you can say for some of the more egregious videos floating
around on YouTube.

~~~
ghostcluster
The Guardian plays with heavy distortions, biased selective reporting, and
dishonest ideological clickbait tactics all the time.

~~~
w_t_payne
I respectfully disagree...

------
dawhizkid
The fact that we’re all so easily influenced in ways we don’t even realize
makes me believe AI deities are only a matter of time. I’m legit terrified of
a future where some programmer creates a cult AI figure that will direct
people to either harm themselves or others.

~~~
ryanwaggoner
People have been harming themselves and others in the name of religion for
millennia. You could argue that it’s easier to destroy or modify an actual AI
than an imaginary god.

------
0xBA5ED
If the algorithms are designed to maximize engagement, then the opposite is
true from a certain point of view. The result is an _undistorted_ reflection
of our nature.

~~~
bornonline1
Can't follow you

~~~
0xBA5ED
I'm saying it reveals truth about ourselves because it gives us more of what
we want. The growth of certain communities is an exposure of something that
was already there.

------
Bitcoin_McPonzi
> An ex-YouTube insider reveals how its recommendation algorithm promotes
> divisive clips and conspiracy videos. Did they harm Hillary Clinton’s bid
> for the presidency?

So the reason people didn't vote for Hillary was that they were fooled by
videos they saw on the Internet? Is this what people who are upset their
candidate didn't win believe, even one year after the election?

