
YouTube comments are heavily manipulated, and I fear for our democracy - rawrmaan
http://rawrmaan.com/youtube-comments-are-heavily-manipulated-and-i-fear-for-our-democracy/
======
btown
The problem is not bots; it's that money can flow towards troll farms
consisting of real individuals, posting of their own free will, but massively
amplified. And it's almost always cheaper to sow discord than to scale a "fact
checking farm." I'm not sure if democracy has actually proven itself to be
stable; perhaps it's an unstable system whose oscillations were artificially
dampened until 2016, because this strategy and the requisite technology had
not yet matured.

~~~
dwaltrip
The problem is information DDOS. We, as individuals, are being overwhelmed
with enormous amounts of information of vastly varying quality and intention,
and we don't have the tools to process it.

This leaves us very vulnerable to manipulation by anyone who has even a slight
edge in processing and disseminating large quantities of information.

I heard a very interesting comment on a podcast recently. This problem has
traditionally been solved by _intermediate organizations_ like universities,
news orgs, political groups, unions, trade guilds, religious groups, etc.
People selectively choose which of these orgs they trust, and then receive
heavily curated information from those groups.

In the digital age, the influence and reach of these intermediate groups have
been decimated / overshadowed in many ways. The proposed solution on the
podcast was that we need to foster new ways of empowering individuals to
connect with next-gen intermediate organizations in order to mitigate the
Information DDOS. For example, Facebook could get out of the content filtering
business entirely, and instead provide a completely agnostic platform that
intermediate orgs could live on top of as a curation layer. People then choose
which of those groups they receive content from. I think the idea needs more
work, but I found it very interesting and relevant.

Of course, the intermediate orgs have not always done a perfect job, and have
their own forms of manipulation and corruption. But throwing them out entirely
without a proper replacement seems to be a huge mistake -- especially as the
amount of information expands very rapidly, as is happening now.

~~~
lstodd
This is crazy. Why are you so bent on outsourcing perception and analysis to
some third party, like 'intermediate organizations'?

Think for yourself, dammit. It isn't that hard.

~~~
save_ferris
Because a single person can't be an expert at everything, and at some point,
individuals must delegate trust to a third party.

I'm not an expert on climate change, but it's an issue that I may care about.
So what do I do? Drop everything and spend the next 10 years becoming an
expert?

------
doomrobo
I'm conflicted here. YouTube is an awful place to get political commentary, so
I'd say that anyone whose opinion is primarily decided on YouTube has made a
mistake to begin with. On the other hand, a LOT of people seemingly do get
their political commentary from YouTube, and we might want to mitigate the
damage that that causes. On the third hand, YouTube is not a public forum and
doesn't have to do anything for us.

~~~
what_ever
What's your go to place to get political commentary then?

~~~
Diederich
You're not asking me, but I'll briefly weigh in.

While not a complete list, I look at
[https://old.reddit.com/r/PoliticalDiscussion/](https://old.reddit.com/r/PoliticalDiscussion/)
and
[https://old.reddit.com/r/NeutralPolitics/](https://old.reddit.com/r/NeutralPolitics/)
The latter, especially, is heavily curated. The former is, I think, fair.

------
benbowdene
I've noticed what I believe to be fake YouTube comments around politics and
financial topics especially. For example, sometimes I'll watch a video related
to the US stock market or currencies, and in the comments there will be a
heavy bias against US corporations or the USD. I remember one video about
stocks dropping that had maybe ~60 comments in total, ~50 of which were
heavily biased toward China and talked about how China is great and the USD
will fall. I understand that people out there have that opinion, but seeing
50 people all saying basically the same thing seemed like a well-orchestrated
plot.

I've noticed many instances of this. Usually I like looking at comments to
find different views or opinions, but sometimes it's a bit too unanimous. I
have a suspicion that many of the top comments are fake, made to look like a
bunch of random people commenting on the video when it's really a group of
people trying to change how people think. That sounds silly, but people follow
the masses: if the majority of YouTube comments are saying that x thing is
bad/good, then people just see that and start believing it as fact, since
everyone else seems to believe it. I mean, the entire comment section is
saying that x person is bad, plus there are 100 upvotes on the first comment
going into why that is, sooo it must be true :P

~~~
TranceMan
Good Bot :)

Sorry, seriously though, have you noticed this with HN comments? I sometimes
get that feeling when browsing here; I really notice it in the comments that
immediately flood in when Apple is having a conference...

~~~
benbowdene
To be clear, I wouldn't say it's a fully programmed bot. It's probably a real
human writing the "fake" comments. I wouldn't be shocked if a couple of
governments had tens of thousands of YouTube accounts that they created many
years ago, let age, and made act like real accounts for a couple of years. Of
course, over time they'd start using them to write comments related to their
agenda. Say, every 30th comment or so could push the agenda, and as long as
each comment is different it'd be hard to ban such an account.

This type of operation sounds silly until you realize you could push your
agenda to tens of millions of unique people reading the comments section on
YouTube alone. If this idea sounds crazy, just remember the recent US
elections were found to have a similar scenario occur with fake websites being
pushed on various social media platforms and eventually people believed and
trusted the articles written on the sites and started reposting, liking, etc.
the fake news.

"have you noticed this with HN comments?" As for HN comments, I don't spend a
lot of time reading through comments here to know. I'd assume it happens on
every social media platform. If this is occurring on YouTube, Facebook, and
Twitter, then every platform is a target. It's all about the eyeballs, so I'd
be shocked if there weren't already thousands of fake accounts on HN, either
aging or already in use. If I were running this type of operation, I'd have
thousands of accounts on each platform: Pinterest, Twitch, Quora, TikTok,
Snapchat, you name it, I'd have it. If it has millions of users per month and
is a social media platform where discussions and opinions occur, or might
occur in the future (via new features on places like Snapchat), you better
believe I'd have accounts on it.

------
randomguy23
And Hacker News comments are any better? The core issue becomes: who do you
trust? If you read a comment and blithely accept what it says without
thinking about it, you'll fall for any scam that comes along. You need to
examine the ideas within the comment, compare them with other information
(hopefully from different sources), and then come to a conclusion as to their
veracity. It's part of being in a society where freedom of speech is a right
(a US-centric viewpoint, since that's where I'm from).

~~~
rawrmaan
I never said that anywhere was any better. I'd like to think Hacker News is a
place where fewer people would be fooled, because this is a place specifically
for critical thought and discussion.

YouTube is not. It's a website that everyone uses, and where the most
impressionable or least politically aware members of the population are the
most numerous, easiest targets.

~~~
fixermark
And most importantly: YouTube lacks a dang. The volume of comments that are
processed vastly exceeds what human moderation could possibly do, and
community up-or-down-flag moderation of comments has massive known
vulnerabilities to sock-puppeteering and signal-jamming. It's an extremely
ripe target for this type of manipulation.

~~~
charlesism
If YouTube cared about anything aside from "user engagement", they could quite
easily whip their comments into shape. They're not really trying.

    
    
> The volume of comments that are processed vastly exceeds what human moderation could possibly do
    

For every 500 comments, there is one video. For every 50 videos, there is one
Channel with a Google user behind it. Force creators to handle the complaints
about comments on their videos... an order of magnitude less work.
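
To put rough numbers on those ratios (a back-of-the-envelope sketch; the
complaint rate is an assumption for illustration, not a measurement):

```python
# The parent comment's illustrative ratios.
comments_per_video = 500
videos_per_channel = 50

# If creators moderate their own channels, each one is responsible for
# roughly this many comments over the channel's lifetime.
comments_per_channel = comments_per_video * videos_per_channel  # 25,000

# Assumed: only a small fraction of comments ever draw a complaint
# that actually needs human review.
flag_rate = 0.01
flags_per_channel = comments_per_channel * flag_rate  # 250 complaints

print(comments_per_channel, flags_per_channel)
```

The point being that the total moderation load doesn't shrink, but it gets
sharded across every channel owner instead of one central queue.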

I've gone into this before and there really seems to be total buy-in here to
the idea that these companies _couldn't possibly_ manage to weed the toxicity
off their platforms. I don't believe it for an instant. It's true only if you
refuse to entertain any decrease of "user engagement". If you are willing to
lose some eyeballs, the problem is manageable.

And that sacrifice, in the case of YouTube, isn't even a sacrifice! YouTube
would wind up with _more_ users if the comments weren't so toxic. Susan et al
just don't have the courage to change anything.

~~~
fixermark
> Force creators to handle the complaints about comments on their videos... an
> order of magnitude less work.

In practice, this will result in YouTubers disabling comments on their videos
because most creators don't want to play moderator for toxic assholes.

... I'm okay with this. ;)

------
dang
People make claims about shills, bots, astroturfers, and so on, all the time,
also on HN, and experience has taught me that nearly all of this is
projection: seeing what you want to see, or fear to see, or some other pre-
existing perception. Overwhelmingly, people are seeing patterns that they
themselves are reading into the material. I really do mean overwhelmingly, at
least in our little corner of the internet. I don't know whether it's
different for YouTube comments but don't see why it wouldn't be.

Without some corroborating evidence, these claims don't hold any weight. It's
not that they're necessarily wrong, it's that there's no way to tell.
Imagination produces just the same perceptions, and there is a ton of
imagination going around.

Occasionally we do find evidence, and then we can do something about it.
Absent that, though, the default explanation has to be that you can simply see
anything in anything, especially when emotions are involved. Note the "I fear"
in the title of this submission.

What's frustrating is that users don't have access to most of the data, making
it harder to look for evidence themselves. They have to rely on someone else
to do it and then trust what they say. In a low-trust social climate, that's
working less and less, which only fuels more imagination.

------
ihuman
Do people even read YouTube comments? Serious question. They've been pretty
low-quality for years, so I stopped reading them a long time ago.

~~~
rawrmaan
It's a pretty easy question to answer: Yes, they do. Look at the engagement in
terms of likes and replies on almost any video. Clearly people are reading and
participating.

------
micah94
Wow. I thought this was going to be a discussion of how Google/YouTube was
manipulating the comments. But it was merely about some silly bots fooling a
bunch of kids. YouTube is never going to do anything about that. They prefer
the "Community Guideline Strike" hammer, or taking down your channel
completely. They can disable comments completely (if they want to) or change
them here and there. I've seen complaints about this, but it's hard to prove.
Still, we've all probably seen videos with COMMENTS HAVE BEEN DISABLED (and
_not_ by the owner). They WANT you to be manipulated by the comments on
channels they allow. This guy is completely missing the boat here...

~~~
rawrmaan
Author here--it sounds like you're talking about YouTube banning comments on
certain videos? If you're talking about YouTube changing the content or social
proof on individual comments, that sounds like a conspiracy theory to me.

~~~
yesco
I dislike how people use "conspiracy theory" so dismissively these days. They
were likely referring to what was shown in this video
[https://www.youtube.com/watch?v=ptiWBrd9YbQ](https://www.youtube.com/watch?v=ptiWBrd9YbQ).

~~~
rawrmaan
Hmm, I can't watch the video. "We are experiencing problems with our servers.
Please try again later."

EDIT: Was able to watch it. This looks like a bug more than anything.

------
mlb_hn
It's not just comments - the ability to drown out other views applies to
actual news articles too. We saw back in 2009 with the Time Person of the Year
campaign how easy it was for coordinated groups to drown out the public
([https://techcrunch.com/2009/04/21/4chan-takes-over-the-time-...](https://techcrunch.com/2009/04/21/4chan-takes-over-the-time-100/)).
As legitimate news organizations increasingly use analytics as part of their
decision processes, that unsolved vulnerability presents an opportunity for
click farms to drive traffic to articles and indirectly manipulate the
production and placement of actual articles as well =/

------
esotericn
To me, this feels like a precursor to the actual issue that's going to happen.

Eventually, we won't be able to distinguish between real humans and bots
online. No CAPTCHA will work, no Turing test, none of that stuff.

Message authentication (e.g. keysigning, or a centralised provider you trust)
works for individual identities that you're already aware of, but the general
case of trusting comments online?

Even without all of that, you've never had any idea whether the "person"
replying to your comment is paid, or has a bias, or whatever else.

If, say, 30% of Hacker News commenters were "fake" in that future scenario,
would we know? How would we know? Once we develop a method, can't that be
gamed?
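
The message-authentication idea above can be sketched with a shared-key MAC
using Python's standard `hmac` module. As noted, this only helps with
identities you already trust enough to have exchanged a key with; it says
nothing about whether the key holder is a bot, paid, or biased:

```python
import hashlib
import hmac

def sign(key: bytes, message: bytes) -> str:
    """Return a hex tag binding the message to the shared key."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    """Constant-time check that the tag matches the message."""
    return hmac.compare_digest(sign(key, message), tag)

key = b"shared-secret"           # assumed: exchanged out of band
msg = b"this comment really came from me"
tag = sign(key, msg)

assert verify(key, msg, tag)                  # authentic
assert not verify(key, b"tampered text", tag)  # altered message fails
```

Public-key signatures generalize this to keys you don't have to share, but
the bootstrapping problem is the same: you still have to trust the key's
owner in the first place.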

~~~
ryandrake
As for HN, do you think it’s less than 30% today? Either way, how do you know?

What do you count as “fake”? Actual bots, sure, but what about a real person
running an automation that spams overnight? What about many real people mass-
typing “opinions” on behalf of someone who pays them? What about a single
businessman typing a falsehood that helps his business? Which of these count
as fake?

~~~
esotericn
a) I don't know!

b) All of the ones you've mentioned, at least. I deliberately didn't define it
because I think it's difficult.

------
notacoward
"What is Google going to do about it?"

Absolutely nothing, unless diverting attention counts. A few product
announcements, something something net neutrality, a little more "juice" for
every possible news item with an anti-Facebook or anti-Twitter vibe, and
voila! Nobody's talking about YouTube any more. They're in the attention
business, after all. They know that letting attention dissipate is their best
strategy in this case.

In a better world, they'd be using some of those vast resources and ML
expertise to detect the kinds of vote/comment clustering that signal both
commercial and political manipulation. Others have done it already. The fact
that the self-styled "best and the brightest" are way _behind_ on this says
that they don't have the interest.
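
Detecting the kind of comment clustering described here could start from
something as simple as near-duplicate grouping. A minimal sketch (word-shingle
Jaccard similarity; the normalization, shingle size, and threshold are all
arbitrary choices, and real systems would also use posting times, account
ages, and vote graphs):

```python
import re

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so trivial edits don't hide duplicates."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()

def shingles(text: str, k: int = 3) -> set:
    """Set of word k-grams: a crude fingerprint of a comment."""
    words = normalize(text).split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a: set, b: set) -> float:
    """Set overlap in [0, 1]."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def flag_clusters(comments, threshold=0.6):
    """Greedily group comments whose fingerprints overlap above the threshold;
    return only groups with more than one member (potential coordination)."""
    clusters = []  # each: {"sig": set of shingles, "members": [indices]}
    for idx, text in enumerate(comments):
        sig = shingles(text)
        for cluster in clusters:
            if jaccard(sig, cluster["sig"]) >= threshold:
                cluster["members"].append(idx)
                cluster["sig"] |= sig
                break
        else:
            clusters.append({"sig": sig, "members": [idx]})
    return [c["members"] for c in clusters if len(c["members"]) > 1]

suspect = flag_clusters([
    "the USD will fall and China is great",
    "The USD will fall, and China is great!",
    "nice video about cats",
])
print(suspect)  # [[0, 1]]
```

None of this is hard at prototype scale; the argument in the parent comment
is that the hard part at YouTube scale is will, not capability.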

~~~
lgeorget
"Others have done it already."

Others with the same number of accounts and daily visits? With the same flow
rate of publications and comments?

~~~
notacoward
Yes. Facebook has been doing this for spam for ages, for Russian and Chinese
and Iranian political influence rings more recently. It's sure not easy at
that scale, but it's not beyond Google's capabilities.

Disclaimer: I work at Facebook, though I'm only involved in those efforts to
the extent that I work on the data-storage systems that support the data
scientists.

------
pnw_hazor
I watch YouTube on my TV using Roku, so comments are not even visible or
accessible. I consider that a feature.

------
notahacker
The problem with online discourse in general is less the existence of the
troll farms and more an abundance of people who voluntarily produce and
endorse stuff in a manner essentially indistinguishable from the troll farms.

If you want people to endorse the veracity of your scam, you'll have to pay.
Want them to endorse the veracity of a claim you've just made up about
$politicalentity, and you won't...

------
eaenki
I thought about it deeply. I think in a utopian future we should give a
monopoly on politics to a few public news outlets which have to be as
transparent as Ethereum is. Plus a few strictly enforced rules, e.g. an
article is either an opinion, boldly labeled as such at the start and the end,
or a fact sheet.

That will have the effect of said news outlets being extremely scrutinized.

At the same time, any platform such as Twitter, IG & FB should be made illegal
worldwide. The only content based platforms that are fine would be heavily
scrutinized educational platforms where no politics are allowed.

I know it'll likely never happen. But what about 1000 years from now? Maybe
so.

There's no other way around it. Information is power, and those systems are
ripping apart humanity.

The printing press and propaganda were/are also obviously bad. BUT. If they
were made hyper transparent, hyper centralized, and obliged to follow a few
rules such as clear labeling, it'd be way better imho.

There should be just one such news outlet, regulated by some new UN entity.

------
redleggedfrog
If you're reading YouTube comments, you're already a lost cause.

------
robbrown451
"YouTube the most-used social media site in the world"

By a certain measure that may be true, but if you were to actually measure how
often people read and post comments, and what percentage of users do, I think
you might not reach that conclusion.

There is a social media aspect to YouTube, but I don't think that is its main
function for the vast majority of users (who are probably better described as
"viewers").

(that said I think the article is good, and YouTube comment quality should be
an embarrassment to Google)

------
bahmboo
If it is truly this corrupted (YouTube comments were already horrible), then
YouTube should turn off adding and viewing comments before the election.

------
pixelpp
YouTube's comments engine is garbage.

------
fareesh
If an attractive Russian seduces a politician for the purpose of espionage and
influencing policy, their spouse is not going to excuse them because "Oh you
cheated on me because of Russian meddling!"

Like the marriage in the aforementioned hypothetical, left of center politics
is failing worldwide for other reasons, not because of some nefarious social
media plot. People are growing fed up with government as an entity, and as a
result, they want less government in their lives. Over the past few decades,
the practiced manner of a politician became something that was mimicked and
parodied worldwide, and popular culture widely regarded the politician as a
crook for far too long. Even some of the most charismatic leaders who were
beloved worldwide - most recently President Barack Obama - have been beset by
scandals like Snowden's NSA revelations, and the weak and dishonest manner in
which the President responded when the activities of the intelligence agencies
were exposed. Even during his first term, he made the (perhaps superior)
decision to bail out the Wall Street banks, and was subsequently betrayed by
the various corporations who gave large payouts to top executives. To
working-class folks, these things are unforgivable, especially when they come
at the hands of someone you admired, even loved.

The trust in government as an idea is so low that all someone has to do is
stand up and say "these crooks and rascals have lied to you all your life.
They have practiced and poll-tested speeches but I am honest, raw and I speak
from the heart. The fake and dishonest media there will tell you that I am a
thug and I have said this and that. I am a thug, but I am your thug, give me a
chance to show these crooks how it's done". The public will fall in love,
because after what so many parts of the world have witnessed from their
governments for so long, this kind of talk is like sweet seduction.

Posts like this - "I read YouTube comments written by bots and the world is
going to end because my party lost the last election and may lose the next
one" - are disappointing. People can vote for political candidates based on
any whimsical or nonsensical reason, be it race, religion, "he has a kind
face" - whatever they want. Voting is an individual right. Any attempt to
influence a voter - whether it's a bot farm or a person writing an article to
stop a bot farm - means only one thing: "I want this person to vote the way I
vote". Democracy needs a free market of ideas. No number of botnets is going
to convince me to vote for someone who espouses nazism, or worse, communism.
If someone else is swayed, it is their right to be; that's okay. Great ideas
will win in a free market, bots or no bots. All ideas, no matter how
objectionable, should be heard.

~~~
mturmon
"Great ideas will win in a free market, bots or no bots."

This is a slogan, not an empirical fact. There are plenty of instances of
market failure in cases where information about goods is unreliable or
manipulated. It's a whole sub-discipline in economics!

In the case of politics, actors have a strong incentive to cause the
information market (if you want to put it that way) to fail.

------
dreamdu5t
It’s a little odd to me that people are so focused on supposed Internet
comment manipulation when we’ve had one of the largest media empires pushing
hate 24/7 through mainstream channels like Fox News for at least a decade.

~~~
thrower123
I suspect that people that believe wholeheartedly that Fox News is some evil
bastion of hate-mongering demagoguery don't actually watch very much of it.
It's considerably less hysterical than the national nightly news I see on the
old big three.

~~~
slededit
MSNBC took the formula and brought it to a new level.

------
stefan_
Americans think so little of their fellow citizens, it's astonishing.

~~~
GavinMcG
Can you elaborate on what you're trying to say here, or how it relates to this
article?

~~~
stefan_
If you believe a significant chunk of the voting population is swayed by
YouTube comments in their political beliefs, you could just as well be arguing
for forcibly sterilizing the feebleminded.

~~~
GavinMcG
There are other potential problems than swaying voters, though: swaying
_non_-voters (to keep them uninvolved) and swaying potential future voters
(such as teenagers).

To the extent that people engage in _any_ format, propaganda in that format is
concerning. And like it or not, people do engage in YouTube comments.

