
Why Technology Favors Tyranny - winterismute
https://www.theatlantic.com/magazine/archive/2018/10/yuval-noah-harari-technology-tyranny/568330/?single_page=true
======
oblio
> Currently, humans risk becoming similar to domesticated animals. We have
> bred docile cows that produce enormous amounts of milk but are otherwise far
> inferior to their wild ancestors. They are less agile, less curious, and
> less resourceful.

I have a problem with this premise. Anyone who knows history knows that
humans were as tame in the past, if not tamer. And uneducated, to boot.

Most of humanity has never been agile, curious and resourceful. Historically a
handful of people were, and they tried to steer everyone around them.

I'd say we are less cow-like than we've ever been. A ton more people have
access to education than they ever did. The only thing the new technologies
(like the internet) are exposing is the depth of the veneer of education the
vast majority of people had, and that depth was very shallow.

I'd say that in some regards this is actually better. Transparency tends to
help and now a lot more people can see through the covers. It will just take
time to adjust.

Nota bene: I still believe that most people, most of the time, are passive.
Being active is a lot of work and truly active people will be always the
outliers. It's just that we can increase the percentage of active people by
making it easier for them to be active.

~~~
village-idiot
You do realize that modern humans are weaker and have a smaller brain than
their hunter gatherer ancestors, right? And not by a trivial amount, but by
like 10% or so.

~~~
Balero
Homo sapiens were weaker and had smaller brains and less advanced culture
and technology than Neanderthals. But we're here and they are not (except in
about 4% of European DNA).

What matters is how adaptable we are. And we certainly are the most adaptable
we have ever been. It might be thought that people use all of the technology
as a crutch and would not be able to survive if they were transported to a
world without it. But, after a couple of days, modern people would get used to
what they have, make do, and be creative in solving problems. It would be hard
and painful, but that's the nature of adapting.

~~~
village-idiot
Yes, although some of the theories about Neanderthal extinction involve
climate change, not intelligence.

The point is that our ancestors up until 10,000 years ago would make us look
like complete and utter wimps, which contradicts your “most people were
inactive” premise.

And based on what happens to a modern human even with survival gear who gets
lost in the wilderness, typically death within a week or two sans rescue, I
genuinely doubt that modern humans would adapt quickly enough to a hunter
gatherer world.

~~~
Balero
"Yes, although some of the theories about Neanderthal extinction involve
climate change, not intelligence."

My point was that they were less adaptable. For example, the climate changed
and they were unable to adapt. I stated that they had bigger brains and more
advanced technology. Neanderthals were more intelligent than Sapiens.

I had no premise that “most people were inactive”. I honestly have no idea
where you think I said this.

A single human alone would probably die, no matter when they were from. We have
always been a social species. A group of humans would be fine anywhere except
in real extremes (desert, tundra), unless they had experience in those
locales.

~~~
village-idiot
> Anyone that knows history knows that humans were as tame, if not tamer, in
> the past. Plus uneducated. Most of humanity has never been agile, curious
> and resourceful.

Anthropologists know that ancient hunter gatherers were stronger, fitter, and
typically had lower rates of morbidity than us. Which kind of undermines your
“agile” and “tame” bit.

I’d also say going from nothing to stone tools counts as quite resourceful, if
you ask me. Given that our ancestors had the same basic wiring and larger
brains, I’d actually wager on them being more curious and clever than modern
humans.

~~~
Balero
Whoever you're quoting wasn't me; please look at the author name. It seems
you are arguing with another person who holds another opinion.

~~~
village-idiot
Oops.

------
pjc50
Recently I saw a comment that the real "hostile AI" threat was not something
like Skynet, but more like Youtube algorithmic video recommendations and their
fondness for conspiracy and far-right material.

The perceived threats of "irrelevance" and "uselessness" get easily diverted
against other humans, especially through "othering". As jobs are taken by
automation or offshoring it's easy to blame "immigrants". Keep up the social
division for long enough and this can escalate to actual violence.

Amartya Sen's Nobel-prize-winning work on famine was to demonstrate that it
was rarely due to an absolute shortage of food, but a deficiency in
"entitlement" to it: both purely economic (not having the cash) and political
(not being able to secure famine relief). That's the potential worst case of
being "irrelevant and useless": starving to death because you can't get food
from the "system" somehow.

~~~
dnomad
It is interesting.

The thing that makes Americans so very weird is that they are intensely
paranoid. They're the most powerful and wealthiest civilization the world has
ever known, and yet virtually all of them are profoundly _afraid_. They're not
just afraid of foreigners, either -- they're extremely afraid of each other.
It's gotten to a point where "Stand Your Ground" laws encourage armed
civilians to shoot each other the moment they feel threatened. It's difficult
for outsiders to grasp. (I've even spoken to South Africans about this and
even they react with disbelief.)

The paranoid style [1] at work here is well documented but I don't think
people appreciate the logical conclusion. As the world grows more and more
connected Americans will grow more and more paranoid. You'll quickly reach a
point where America has no allies: everybody is a threat, everybody is a
competitor, everybody is an enemy conspiring against them. Nobody can be
trusted. America becomes supremely hostile -- to new ideas, to trade, to
dissent, to _change_. The result, as Nietzsche explained, is a kind of supreme
decadence. People withdraw from the world, they retreat into paranoid fantasy,
they embrace all manner of nonsense and conspiracy, they despise knowledge;
isolation and delusion feed on each other and become all-consuming.

In the end it's a self-fulfilling prophecy. The paranoiac develops paranoid
technologies. Technologies designed to exclude, filter, hide, deceive,
surveil, punish and divide (or "decentralize"). Technologies that are
all-consuming, like his own delusions: they must provide rapid and never-ending
alerts about new threats and plots, the more the better. Technologies that
induce paralysis by leading people deeper and deeper into simulations of
reality. Technologies that are primarily walls or weapons, ideally both, to
protect against the greatest threat of all -- other people.

[1] [https://harpers.org/archive/1964/11/the-paranoid-style-in-american-politics/](https://harpers.org/archive/1964/11/the-paranoid-style-in-american-politics/)

~~~
crazynick4
Have you ever met an American? We're not as wealthy as you might think: income
is offset by cost of living, and we're not exactly bathing in gold-plated rose
petals. Quality of life for the average citizen is not that different from any
other civilized country.

Xenophobia (which is not shared by the entire population) is not specific to
the states either. Do you watch European news? What's happening in China with
the Uighurs? In fact, I would argue that most countries are more xenophobic,
if anything; they just don't have to deal with as many foreigners who want to
move there, and people here are actually more used to immigrants than in
places where everyone is 'the same'. In Latin America, xenophobia is gaining
ground as well with the Venezuelan crisis, and that is an instance where the
immigrants are people of the same race/culture.

> as Nietzsche explained..

Are you sure he wasn't talking about himself?

~~~
dnomad
> Xenophobia (which is not shared by the entire population) is not specific to
> the states either.

I think the investment hypothesis is a bit more subtle than that. It goes
beyond mere xenophobia into a kind of devout isolationism. What we're
interested in are technologies and trends that promote loneliness. Facebook
can be pressed into service for classical xenophobia, but you need Twitter --
pure rapid-fire alerts, shorn of all social context -- to really draw in the
paranoiac. Then there's Youtube, with its never-ending flood of videos about
new threats, new dangers, and new outrages, all packaged neatly into 30-minute
chunks. Reddit is another prime example of "anonymous networks" that provide unity
through division. When evaluating technologies that might appeal to the
paranoiac we're looking for whatever helps people throw up walls, makes them
more mobile, lets them hide, lets them engage in only very limited contact on
their own terms. We really like robots of all sorts -- not just physical bots
that deliver food but highly customized algorithms that can deliver fake news
and sex bots that can deliver fake love... anything that reduces or eliminates
daily contact with the dreaded Other (human beings).

------
msiyer
If we look closely, we will soon realize that everything favors tyranny.
Religious institutions, political institutions, educational institutions...

The problem is that humanity has a tendency to pick, or gravitate toward,
morally weak specimens for leadership. We also have a tendency to "believe"
authority, which often leads us astray. Would a wolf pack pick a weak leader?
Would a wolf pack let a weak leader remain in position once its weaknesses
were exposed?

~~~
Angostura
> political institutions, educational institutions...

So democracy and schooling favors tyranny? I'd like to see your working.

~~~
zzzcpan
They do. In fact, schooling is a form of tyranny, being mandated by the state
and all.

~~~
Angostura
The state mandates education, not that you have to go to school.

------
zekevermillion
I'm with Kasparov: the best combination of human and machine will always beat
the best machine or the best human individually. I submit that Google's AI
progress does not disprove this assertion, because (a) even if Google's AI is
self-trained in chess or Go, a human team is required to select a game and set
up the training, as well as to update the algorithm or meta-algorithm in
competition against the teams from IBM, Xinhua, etc.; and (b) chess, although
a very large and, for most purposes, nearly infinite game, ultimately is not
open-ended or infinite in the way that natural life, or the human
decision-space, is. Go is perhaps a better contender as a life simulation,
but I think it still falls well short of the complexity of even a simple
natural system.

One could counter that a meta-training algorithm could define rulesets for an
Alpha-like AI to self-train upon, and at sufficient generality this would take
humans out of the equation. Well, if humans are entirely removed, then the
problem is no longer interesting. If humans are involved at any point in the
process, we could still view AI through the lens of a human-centered tool.

What if AGI does in fact evolve its own goals as primary and decides that
humans and/or human self-determination are an impediment? Would humans
eventually fall down the food chain, or outside of it, say as raccoons living
on the city streets? Well, I don't view the world that way. But, even if non-
human life forms gain supremacy of intellect and capabilities, I would note
that even raccoons have a place -- and despite the fervent desires of many a
suburban homeowner, the little pests are amazingly resilient to attempts at
extermination or control.

------
macawfish
The article is much better than the headline suggests

------
api
Hammers don't favor one kind of architecture over another. Hammers do what the
people holding them make them do.

The way we use our technology is a reflection of our ideas. Since the 1990s
I've watched the _ideas_ that are popular within the culture shift in an
increasingly authoritarian direction on both the "right" and "left" of the
conventional political spectrum. We are seeing the ascent of strongman rule
and other types of authoritarianism around the world for democratic reasons:
it's what people want, or what people think we need.

I think the largest single factor is push-back against globalization.
Personally I think globalization in some form and to some degree is both
desirable (to prevent war and increase wealth) and inevitable (due to travel
and communication), but I also think it's been pushed perhaps a little too
quickly and in ways that are profoundly insensitive to the needs of the middle
classes in the developed world. That's created a massive anti-globalization
backlash where people are elevating strongmen and demagogues to re-assert
national borders and national independence.

There are other factors too. I think a similar kind of reaction is occurring
against global social liberalization.

~~~
marcosdumay
Hammers favor the kinds of architecture one can create with nails.

Every technology has some inherent biases. Electricity favored
decentralization about as much as steam power favored concentration.

------
jancsika
> Lots of mysterious terms are bandied about excitedly in TED Talks, at
> government think tanks, and at high-tech conferences—globalization,
> blockchain, genetic engineering, AI, machine learning—and common people,
> both men and women, may well suspect that none of these terms is about them.

globalization - by definition affects billions of people

genetic engineering - as of 2016, GM crops, according to Wikipedia, made up
12% of global cropland

blockchain - a digital time-stamping service that scales to the unfathomable
rate of 7 transactions per second

This tells me the author is unable to separate wheat from chaff, as if he had
written:

> Lots of mysterious terms are bandied about excitedly in TED Talks, at
> government think tanks, and at high-tech conferences—globalization, _HDMI to
> VGA adapters_ , genetic engineering, AI, machine learning—and common people,
> both men and women, may well suspect that none of these terms is about them.

~~~
CuriouslyC
Except that you don't have meetups where people get all excited about the
HDMI-enabled future. I completely agree with you about the current state of
blockchain, but it has captured people's imaginations.

~~~
AstralStorm
So it would make for some fun dated art in 30 or so years?

(since it hasn't captured much anything else)

------
indigochill
Perhaps programming needs to become the new "literacy" standard. Writing human
languages isn't enough any more. By learning to read and write computer code,
people can become active users of technology rather than merely consumers
bound to the creations of others.

~~~
apocalypstyx
Over the decades, every time I have heard some variant of 'everyone needs to
learn to code', a subtle unsettling feeling has arisen in the back of my mind
around the notion of linguistic imperialism: the promulgator instinctively
believes or realizes, whether it actually exists or not, that the framing of
any given, non-generalized technology inherently carries an ideological
supposition which, as with all suppositions, its bearers want to spread to
totality.

~~~
indigochill
How about a comparison to modern education? Back before public education gave
everyone the knowledge necessary to write for themselves, would it have been
imperialist to advocate for that goal? If so, does it matter?

------
evrydayhustling
Articles like this often risk conflating many distinct concerns about tech in
a way that makes the whole set feel alarmist rather than actionable. This
author does a good job of cataloguing each risk separately. His closing
prescription deserves a place in the article's tl;dr:

> ...if you dislike the idea of living in a digital dictatorship or some
> similarly degraded form of society—then the most important contribution you
> can make is to find ways to prevent too much data from being concentrated in
> too few hands, and also find ways to keep distributed data processing more
> efficient than centralized data processing. These will not be easy tasks.
> But achieving them may be the best safeguard of democracy.

------
jeandejean
Yet another post about frightening scenarios, imagined by technologically
illiterate people, that have little chance of happening...

~~~
crazynick4
That are already happening

------
matt4077
Among dystopias, this one is remarkably well-argued...

It is rather frustrating to see how illiberality and strife have come back. In
the nineties, the "End of History" made for a convincing hypothesis: Market
economies in functioning democracies had proven that they were capable of
providing decent quality of life for everyone. Competition had moved on from
natural resources to human ingenuity, rendering the supposed motivation for
all the wars of history irrelevant. Contrary to orthodox left-wing opinion,
poorer countries were not exploited; instead, they had a standing invite to
this future of plenty, and Asia took them up on it.

But China's economic rise didn't effect any democratic change. It seemed that
democracy did not, after all, win the cold war. Communism merely lost it.
Dictatorships can be as productive as open societies, as long as you don't
saddle them with too much incompetence or self-defeating ideology.

So we are now in this new battle of the systems, and too often, we're losing:
China and Russia are obvious. But Turkey, Hungary, the Philippines, and Poland
are more recent converts to the prosperity gospel of strongman leadership.
With Trump, Brexit, and several recent election results in Europe, the hits
keep coming closer.

Maybe humans never actually liked democracy. They just enjoyed its spoils. Now
that there's an option to both eat _and_ hate, they're overjoyed.

~~~
village-idiot
While it’s obvious that their psyops worked in 2016, it’s hard to argue that
Russia’s system is superior to ours right now. Russia is deeply corrupt,
incompetent, and basically a third-tier economy or worse. Italy produces more
GDP than Russia.

China is a far better case for your thesis. They’re doing much better
economically, and the government is both totalitarian and largely accepted by
the populace. This situation is also very new, dating only from their
acceptance of open markets, and it will be interesting to see how stable it
really is.

~~~
pjc50
China's situation is like someone trying to dilute acid with water: put in a
few drops of freedom, wait for the fizzing to subside, put in a few more, all
while hoping to avoid an explosion.

China is accepted by the _Han_ populace. The extremely non-Han parts (Tibet,
Xinjiang) are having it imposed on them at gunpoint.

------
doombolt
If you as a society ignore it and let it loose, why won't it?

See GDPR as a solid example of trend reversal.

~~~
BjoernKW
The jury is still out on whether GDPR will actually benefit consumers and
citizens in general, or whether it’ll instead end up benefiting exactly those
privacy-disrespecting companies it was supposedly targeting.

The intent behind GDPR certainly is good. I’m worried about the implementation,
though. In that respect I don’t really consider it a solid effort.

~~~
dasil003
I’ve heard this argument a lot from startup people, but it rings of envy at
being late to the gold rush. The only reason to be more worried about
Google/Facebook is the power they have; small startups are not inherently more
ethical, and I’ve met enough small SV founders to know that as a group they
don’t hold any kind of moral high ground.

Framing the conversation as tech company winners and losers is to totally miss
the point. The real question is whether GDPR will actually help stem data
abuses, and frankly I don’t see how it can’t be at least a marginal net
positive.

~~~
BjoernKW
I'm not worried at all about consumer-data-driven startups who are "late to
the game".

I'm concerned about the impact GDPR has on small businesses. GDPR applies to
every business regardless of size. This is how it should be in a state under
the rule of law, after all.

For small businesses, however, the effort required to implement GDPR is
considerably larger in relative terms than it is for larger companies. Even if
you don't process any data beyond what's necessary for legitimate interests
such as accounting, the effort required to properly document your processes
and third-party data processing can be quite considerable. Larger companies
already have lawyers to deal with this; as a small business owner, you usually
don't.

This in itself isn't all that bad. It's mostly a one-time effort and as an
added benefit it allows you to rethink and streamline your processes.

However, some aspects of GDPR have been left rather open and ambiguous in
terms of how to actually implement them (sometimes intentionally so to make
the regulation somewhat future-proof). Please see this earlier comment on some
issues that small business owners are faced with:

[https://news.ycombinator.com/item?id=17099878](https://news.ycombinator.com/item?id=17099878)

Some of that ambiguity and uncertainty opens up new opportunities for a
certain type of cease-and-desist letter that incurs a fine at first notice
(e.g. for supposedly not meeting legal requirements such as having a properly
worded privacy statement on your website). This practice is rampant in some
areas and industries. Unfortunately, so far neither the EU nor the member
states this applies to have undertaken significant steps to fight it.

What really irks me is that EU officials dismissed these concerns on several
occasions. It's almost as if for them SMBs simply don't exist and the world is
all about Google and Facebook.

------
jkingsbery
This article mentions a lot of things that "might" or "may" happen, without
really any convincing arguments for why these predictions will come to
fruition. It's true that dictators make use of centralized knowledge, but it's
also true that many of the assumptions of market capitalism assume perfect
knowledge and no transaction costs, two assumptions which have never been true
in practice but today are much more applicable (knowledge is faster to attain,
and transaction costs are smaller today and more directly measurable). So, it
seems just as likely to me that markets could be on the verge of being more
useful mechanisms than ever.

------
CM30
As per the last time this was posted, here are my thoughts on this article and
its premise:

Firstly, I feel everyone gets something wrong about AI and jobs, which makes
things a little less dire on that front than you might otherwise believe.
Namely, people are not purely 'rational' economic actors and don't make
decisions purely on 'quality' or 'price'.

For instance, art and media isn't all about what's the 'best' work out there,
but the one with name recognition, sentimental value, an existing fandom, etc.
It doesn't matter if an AI can create 'art', because whether that art sells
depends on more than technical competence. Will Mario or Star Wars or Lord of
the Rings be threatened by AI and technology? Probably not, the name sells
regardless of whether a competitor product/brand may be objectively better.

So I believe artistic and creative fields may be what remains after AI takes
most other jobs, since your creativity can create a market that competitors
legally can't enter. If all else fails, a personal brand can do much the same.
Sell yourself, not the 'product'.

There's also the fact that many 'worse' businesses still do well through word
of mouth, location, advertising, etc. Not everything will be a one-horse race
a la Uber or Airbnb, and whether AI/tech/whatever can outperform humans won't
really matter that much regardless. In that sense, things aren't quite as dire
as some people would have you believe.

Secondly, while technology and AI may help those in power consolidate it
further, they also democratise power, by making the means of getting it
available to more people than ever before.

For example, for as much as emotional manipulation and fake news are pointed
out as issues online, the internet has also made it easier to verify anything,
to get around government censorship, and to make up your own mind about
current events and the situation at hand. If the old-school media got things
wrong (or were told to shut up by those in power), what could you do? How
could you disprove their claims?

With great difficulty, that's how. But now we've got a world where anyone can
call out anyone, where finding alternative viewpoints on major topics is
trivial and where doing research on advanced topics is easier than ever.
Narratives have been destroyed, official statements disproved by average Joes
taking photos and recording videos, and pseudoscience has been debunked. Is
that really worse than a world where publishing information is tightly
controlled and regulated?

Technology can be used for the purposes of tyranny, but it can also be used to
fight against it just as well. And that'll only become more true as more
aspects of everyday life involve computers and networking.

So while technology may 'favour' tyranny, in some ways it also favours
democracy and a more equal society too.

------
mar77i
"The growing fear of irrelevance"?

Well, duh. First of all, get that out of the way: you _are_ irrelevant; the
universe at a greater scale doesn't give a damn about you. I'm more concerned
with all those snowflakes out there losing touch with reality, insisting that
they are the relevant ones, when in fact any attention you give them
whatsoever is a waste of time, stolen from you, and just not worth it.

I've about had it with this civilization, and phrasings like that... oh look,
technology enables a new kind of totalitarianism. Yeah. Well. If that's what
you make of it, fine. Of course it doesn't need to be that way, but now that
you said it out loud, it'll become more true among your kin.

I'll just stay in my little bubble that thinks technology can help increase
the awesomeness in everyone's life, at least this 300-foot pole away from your
orwellian dystopias - far enough for government work.

~~~
swift532
I wouldn't personally want to help create tyranny powered by advanced
technology, but I don't think that's to say I shouldn't be worried about it.
I'm worried about what others will make of it, so I don't think we should be
too idealistic about it.

~~~
mar77i
If you have concerns, please add your concerns to the license and contracts
around your product.

Also, for someone who says "I don't think that's to say I shouldn't be worried
about it", you sure seem to take a liberal - maybe even a bit contradictory -
stance in not wanting to be too idealistic about it.

~~~
swift532
English is not my first language, sorry, I think I didn't express myself well.
I meant that we shouldn't be too naive when considering what other people
might make of technology, just because we're idealistic.

~~~
mar77i
But we're still idealistic because, at the end of the day, we're not only
responsible for what we do with technology ourselves; when we create things,
we're always enabling others to use them, be it by giving them away freely or,
perhaps more critically, by selling them.

There was a case once where a software project amended the GNU GPL with a
no-military-use clause... On that note, you can never tell exactly who your
customers are, and you shouldn't fit technology with a built-in kill switch or
have it call home, which is the worst of all options, as seen with several
unpopular videogame companies.

