
The Information Apocalypse (2018) - kochihabaya
https://diracdeltas.github.io/blog/information-apocalypse/
======
brownbat
I generally agree with the article that people get too caught up in imagining
sci-fi deepfake incidents swaying entire populations, when the economy of
beliefs is far more complex than that.

People have been able to fake photos for a hundred years, and yet fake photos
aren't commonly sparking national emergencies. It's not because photos aren't
sufficiently lifelike, it's because the ways we validate that information are
very straightforward.

Photographer and subjects mysteriously unavailable for follow-up questions?
Huh.

Meanwhile, people thrive on validating misinformation.

TFA is right that this is a psychological problem, not a technical one.

In the 1951 study "They Saw a Game," students supporting each side of a
football match counted a different number of infractions and reported referee
bias in opposite directions.

Key takeaway: It is impossible to have a neutral arbiter that is also
perceived as neutral.

That's why even the most innocuous filtering and sorting algorithm gets
accused by interest groups of pushing a bias.

If you have the AP cryptographically sign "real" news, the only thing that
will do is sentence the AP to allegations of bias.

And this isn't that new. Journalism in the US used to not even bother trying
to be objective; papers would have "Republican" or "Democrat" in their names.
Political Reddit threads, rabble-rousing echo chambers that they are, are also
a reversion to the mean.

This may be intractable, but I think our best hope is greater education about
cognitive biases, to give people the language to discuss biases with each
other that's more clinical and less charged, and to help people recognize the
most common traps.

Or maybe just accept that this is an old problem and hang on for dear life.

~~~
wickerman
Yeah, my whole impression while reading this article was "why try to solve a
non-technological problem using technology?"

If the overabundance of information is just reinforcing existing
psychological issues in humans, then humans either have to get over those
issues or reduce their exposure to information.

~~~
w1nst0nsm1th
We cure cancer, a biological issue, with chemical means, i.e. with the
products of technological advances.

That said, that doesn't mean we have the technological means to solve the
problems posed by fake news and the radicalization of opinion.

Regulation is probably the key, even if it means policing the web.

The problem is that fake news can come from anywhere, including from countries
outside the intended national audience, where national regulators have no
influence.

An international watchdog has to be formed.

------
hirundo
> extrapolates from the fake news crisis that started in 2016

That's fake news. A fake news crisis was popularized in 2016, around the
election of a particularly controversial politician. But I've seen no evidence
that it started or even got much worse then, rather than just becoming more
visible. And it's worse when we're not aware of it, so this "crisis" may have
actually been progress in that it deservedly decreased trust in news.

~~~
arkh
I remember some fake news about Iraq and WMD around 2003.

~~~
paganel
Up until two years ago I was still able to find an NBC or CBS article from May
2003 saying how more than 80% of the US public was approving of said military
intervention and how that was making the whole thing ok (or at least that was
the implicit message).

I used to link to that article whenever this subject of fake news (or
propaganda as it was called at the beginning of the 20th century) came into
discussion on reddit or here, on HN. A couple of times I also linked to the
Athenians' pre-Sicilian Expedition [1] propaganda campaign led by a certain
Alcibiades, a war which had disastrous consequences for Athens itself. So it
seems that we have been in an "information apocalypse" for at least ~2400
years now.

[1]
[https://en.wikipedia.org/wiki/Sicilian_Expedition](https://en.wikipedia.org/wiki/Sicilian_Expedition)

------
buboard
Technology comes before culture and politics. Without the printing press, the
nation state wouldn't be possible, and without elite-controlled mass media,
the centralized bureaucratic machine that monotonically increased its
concentration of power in the late 20th century would not have been possible.

The internet disrupted that, and now we have on the one hand states with a
legacy of Too Much Control over people's lives, and on the other hand citizens
who are disengaged and disinterested, again because of technology. Your
neighborhood is dirty and full of bigots? That's OK, because you live through
a shiny screen anyway and you get to chat with uber-progressive friends every
day. Your city looks ugly? That's OK, because paradise beach weekends are only
a flight away. And that's progress, because it allows us to be increasingly
atomized and tolerant of our different neighbors.

But we've forgotten all that legacy power concentrated in the hands of
Leaders, and while people are distracted, bad actors are going to take
advantage of it. Fake media and underhanded tactics are going to be rampant
until we begin to remove or decentralize those powers.

------
fergie
Fake news didn't start in 2016.

I had an entire semester on media manipulation for my sociology degree in
1995. Propaganda has played a major part in democracy since the invention of
the printing press. It's good that there is increased awareness of the need
for critical thinking, but to say that we are currently living through a "fake
news crisis" is wrong.

~~~
wickerman
The unsaid thing about fake news is that it's manipulation by non-giants. It's
not the big media corps/newspapers lying to the people, it's some small
website playing on cognitive biases.

~~~
Jeff_Brown
Those grassroots lie-factories could be more dangerous than state-directed
ones. The state has a single message to push. A market, by contrast, will try
every damn thing it can think of, and some of it will work.

I don't remember where I read it, but I believe the 2016 Russian
disinformation campaign took advantage of a number of conspiracy theories that
arose "in the wild".

~~~
raxxorrax
I believe the influence of the Russian disinformation campaign was so
overstated that it could very well be declared fake news itself.

Which messages? Which groups were targeted? How many people did it reach?

Was it targeted to increase distrust? On which platforms?

What was historically regarded as propaganda could very well be dissected
afterwards. I don't really see that here, nor any argument that this influence
cannot be traced.

I would be very surprised if Russia didn't try to boost their favorite
candidate as a geopolitical rival, but implying that there was relevant
Russian interference is, in my opinion, itself an example of distorting the
truth. And not a trivial one.

Furthermore, you are also implying that everyone holding a somewhat dissenting
opinion on unrelated issues has been manipulated. Some might take issue with
that.

------
pkrein
I saw a very interesting talk by Mike Tamir, of Berkeley and Uber's autonomous
group, about an extension his research group built to detect sensationalism
and emotional appeals. The idea was that detecting writing designed to anger,
incite and sensationalize (versus state facts) basically separated everything
we’d call fake news from journalism:
[https://www.fakerfact.org/about](https://www.fakerfact.org/about)
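The core idea (separating emotionally charged writing from factual reporting) can be sketched as a toy lexicon-based scorer. This is purely illustrative: FakerFact's actual system is a trained machine-learning classifier, not a word list, and the lexicon below is invented for the example.

```python
# Toy sensationalism scorer: fraction of words drawn from an emotionally
# loaded lexicon. Illustrative only; not FakerFact's actual method.
SENSATIONAL = {"shocking", "outrage", "destroyed", "slams", "unbelievable",
               "disaster", "exposed", "scandal", "fury"}

def sensationalism_score(text: str) -> float:
    """Return the fraction of words that appear in the loaded lexicon."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?\"'") in SENSATIONAL)
    return hits / len(words)

print(sensationalism_score("Shocking outrage as senator slams new bill!"))
print(sensationalism_score("The senate voted 52 to 48 on the bill today."))
```

A real classifier learns these cues (and subtler ones, like punctuation and sentence rhythm) from labeled examples instead of a hand-written list, which is exactly why knzhou's objection below the fold applies: surface style is detectable, selective omission is not.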

~~~
knzhou
This would only detect the crudest type of bias; it does absolutely nothing
against selective reporting or editorial slant. People read sensational news
precisely _because_ they think “respectable” outlets are selectively and
disingenuously deciding to ignore it.

------
mikeiz404
I think this is an important point. I am reposting it here since it is at the
bottom of the page and we don’t always get there.

> And so, maybe the majority of people wouldn’t even want SafeBrowsing-style
> blacklists of fake news sites or verification badges on legitimate
> journalist-vetted news articles, because they’re not reading the news to
> learn the truth - they’re reading the news to validate and spread their
> existing worldviews.

> Effectively this means that any technological solution to the information
> apocalypse depends on a social/behavioral solution: people need to welcome
> cognitive dissonance into their online spaces instead of shunning it. But it
> sounds almost ridiculous to suggest that shares/likes/retweets should be
> based on factual accuracy, not emotions. That’s not how social media works.

------
sdrinf
The idea of a reliability indicator has been floating around for some time,
with few to no viable results. A much broader-spectrum immune-system-building
exercise is Crash Course's Navigating Digital Information (
[https://www.youtube.com/watch?v=M5YKW6fhlss&list=PL8dPuuaLjX...](https://www.youtube.com/watch?v=M5YKW6fhlss&list=PL8dPuuaLjXtN07XYqqWSKpPrtNDiCHTzU&index=11)
), which is excellent; the episode linked is strongly recommended for everyone.

------
xouse
I have experience in an unusual domain that's shaped the way I think about
truth and misinformation. I play Super Smash Bros. Melee for the Nintendo
GameCube competitively, and I view it as an odd sort of sanity test for the
limits of truth.

The deck is impossibly stacked in favor of truth in melee. To start off with
we actually have Truth with a capital T. Press Y and dpad down and you bring
up white text telling you exactly what state your character is in. Imagine
being a psychiatrist and instead of the messy uncertain process of diagnosing
patients you can press a button that freezes time and white text from a debug
menu god forgot to remove from the public release of life appears. Time
switches to frame-advance mode and "DepressionModerate 27" floats over your
patient's head.

Not only do we have the truth, but we have a community that values performance
and has a vested interest in the truth.

And not only that, but there isn't even political or partisan resistance to
the truth.

Ok, so this is a world where pure objective truth exists, science tools are
free easy and available to all, the populace cares about truth, politics
doesn't exist, there's no real incentive to spreading misinformation, and
there are no large scale or individual actors purposely spreading
misinformation. And yet even in this best of all possible worlds microcosm
truth always seems to be barely hanging on by the skin of its teeth.

There's always someone around the corner who says that powershielding an
attack incurs no shield stun.

I don't think people appreciate just how fragile the truth is even before
adding in bad guys. I think we fall too quickly into viewing truth and
misinformation through the most exciting narratives. Like getting in a frenzy
over shark attacks and neglecting the thousands of people killed by the boring
old flu.

------
walterkrankheit
I'm not so sure about the incoming "reality apathy". People seem more and more
ready to mobilize the more pervasive social media becomes (completely
saturated already, yes). That applies to both sides, though. And to all
issues, from the important to the petty.

~~~
Jeff_Brown
It's not so much that they don't care about their beliefs enough to act on
them; it's that they don't care about verifying those beliefs.

------
Nasrudith
The solutions proposed don't understand what can and cannot be verified. You
can verify if a statement is actually from the source subject to the messy
details of distribution. You can't verify the actual truth cryptographically.
And it encounters the same messy problems as computer security: no matter how
good your technology, you can't both give power to the user and protect
against absolute idiots - see phishing.
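The origin-vs-truth distinction can be shown in a few lines. This sketch uses an HMAC from the standard library as a stand-in for a public-key signature (same principle, simpler to demonstrate); the key name and the claim are made up for the example.

```python
import hashlib
import hmac

# Hypothetical signing key held by some news authority.
KEY = b"newsroom-signing-key"

def sign(statement: bytes) -> str:
    """Produce an authentication tag proving the key holder sent this."""
    return hmac.new(KEY, statement, hashlib.sha256).hexdigest()

def verify(statement: bytes, tag: str) -> bool:
    """Check the tag in constant time; proves origin, nothing more."""
    return hmac.compare_digest(sign(statement), tag)

claim = b"The moon is made of cheese."
tag = sign(claim)

# Verification succeeds: the claim genuinely came from the key holder...
print(verify(claim, tag))
# ...but the signature says nothing about whether the claim is accurate.
```

The check passes for any statement the key holder chooses to sign, true or false, which is exactly the point: cryptography can authenticate a source, never a fact.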

An authority signing it only tells you that it was approved by one and not
that it is actually valid. Transparency enabled by the internet and
sousveillance (for instance every camera phone capture of police brutality)
makes it clear that the idea you can trust and rely upon authorities is a flat
out lie.

------
icris
I think the alarm is a bit overblown. It talks about "misinformation campaigns
indistinguishable from reality", but if so, that means they add no more
prejudice to our view of reality than it already possesses, so no harm done.

The danger is when disinformation distorts reality, and in that case it is
very distinguishable from reality, at least, I think, to an educated mind. For
the others, I don't think misinformation changes their worldview beyond what
they already have; if anything it is just confirmation bias playing on
susceptible minds.

The greatest danger is for people with open minds but too little literacy to
filter information from crafted noise; they can become prey to well-crafted
campaigns designed to mislead. But for better or worse, people are more
educated now than they once were, and we can foresee that they will be even
more so once the problem becomes too manifest. Will that be enough?

But even in the worst scenario, I anticipate nothing more than an inconvenient
setback, and that is how society evolves anyway: a perpetual dialectical
motion of advances and setbacks. To be really useful, prejudiced worldviews
would have to offer better answers to the world's troubles, and they do not,
because paradoxically, if they did, they would not be prejudices in the first
place. So they can only engage people in some conditioned response temporarily
before reality throws them out. But maybe I'm being naive, who knows?

------
einpoklum
Bonsai kitten made me sad :-(

Also - the author uses GMail. That's an information apocalypse right there -
the apocalypse of large commercial corporations and the government spying on
your personal correspondence.

------
headmelted
Any possible solution I can think of to this problem of trustworthiness relies
on some notion of a trusted arbiter.

As an example: we could replace e-mail with an identity-verified
communications platform. But then you're sacrificing your privacy AND placing
your trust in whatever government or company manages that system.

Likewise video. A government could realistically just release statements
through their own websites (e.g. Gov.uk or whitehouse.gov) and make everyone
take their news from there.

In either case you have to place your trust in someone to tell you what’s
true, and if you do that, the platform becomes a propaganda machine about 15
minutes after launching and we’re basically a communist state.

Basically the solution to these problems is even worse than the already
terrible status quo.

This is real, and it’s here now, and Joe Public has no idea why everyone
doesn’t love Trump/Bernie/Boris/Jeremy as much as he does.

This doesn’t end at all well.

I’m going home to hug my kids.

~~~
Jeff_Brown
Before cryptocurrencies, we believed we needed a central authority to create
money, too.

We currently have to trust arbiters of fact because we have no systematic way
of evaluating the work of journalists, nor of aggregating the distributed
verification efforts that millions of people perform daily. If that work were
easily navigable, our collective intelligence might improve.

In a previous comment[1] I suggested a software solution. I've already written
part of it.

[1]
[https://news.ycombinator.com/item?id=21664084](https://news.ycombinator.com/item?id=21664084)

~~~
headmelted
I had a read of this, but I'm not sure how reliable consensus is for something
that is ultimately unprovable.

With cryptocurrency I have mathematically provable truths to rely on. With
consensus I’m relying on collective opinion.

If something sounds believable to most people it would be accepted as true by
any such mechanism. If the last four years have taught us anything, it’s that
this is a terrifying prospect.

~~~
Jeff_Brown
I'm not proposing a consensus mechanism. Rather, a mechanism that allows
anybody to encode anything, and anybody else to run searches over it,
systematically. It would make it easier to inspect the data behind anybody's
opinions, and thereby, I hope, make us collectively smarter.

Yes, systems to encode and query data already exist, but what they can encode
is limited, and the way to encode or search for it is difficult. It's hard to
think of information that Hode cannot encode, and the way of getting data into
or out of it is extremely close to ordinary natural language.

------
papito
Chris Hayes's podcast episode "The Antisocial Media" is pretty interesting, in
this regard.

~~~
dredmorbius
Direct link with downloadable mp3 (otherwise hard to find):

[https://podbay.fm/podcast/1382983397/e/1569308400](https://podbay.fm/podcast/1382983397/e/1569308400)

Audio:
[https://www.podtrac.com/pts/redirect.mp3/traffic.megaphone.f...](https://www.podtrac.com/pts/redirect.mp3/traffic.megaphone.fm/NBCN9663175612.mp3)

------
classified
> ... any technological solution to the information apocalypse

Here we go again. Propagating the unbroken (and unvalidated) belief that non-
technological problems must have a technological "solution".

~~~
buboard
all solutions are technology

------
agumonkey
Information Highway to Hell

------
jstewartmobile
HN: "_You can't talk politics here!_"

Then I see this on the front page--which is political as hell. That any of
this is a "problem" that needs to be "solved" is a limousine liberal fiction--
pulled out of thin air to preserve their sanity in the face of a Trump
presidency.

Michael Moore called it before the election--Trump was going to win because
the Democratic party under Clinton and Obama abandoned the working class. Not
enough money to address Flint, oxy, and joblessness--but somehow, there's
always room on the credit card to bail out Goldman Sachs, or destabilize
another middle-eastern country.

