
Manipulating Human Psychology To Turn Users Into Addicts - coloneltcb
https://medium.com/@dankaplan/the-complete-moral-bankruptcy-of-manipulating-human-psychology-to-turn-users-into-addicts-d09b98281ef
======
diego_moita
When you think about technology critics, none seems more relevant today than
Aldous Huxley.

The traditional Hollywood criticism of technology goes along the lines of
Frankenstein, Metropolis, 2001: A Space Odyssey, Terminator and The Matrix:
technology will control us. I find it dumb, frankly.

Then there is the political criticism along the lines of George Orwell and
some Luddites: technology is a tool for control and domination, either by
governments or by evil capitalists. It is not totally wrong (e.g., China,
Turkey), but it isn't the whole truth either.

But the Huxley criticism is, for me, the most interesting: technology will
entertain us into stupidity, apathy and alienation. This seems to be very much
the case here.

~~~
zitterbewegung
We are in a society that has the distractions of Huxley's world with the
surveillance of Orwell's. The only thing different is that no one expected our
computers to get so small, or that we would stare at screens that watch us
every hour of every day. The anime Psycho-Pass is much more relevant.

~~~
chiefalchemist
I think it's easy to get hung up on the means (that each authored). But as we
know, the prediction business is a tough one.

For example, Orwell (in 1984) focused on the filtering, manipulation,
constriction and reconstruction of information in order to continuously redraw
history / truth. But today the truth is often just as fuzzy because we're
overwhelmed with too much information and too many distractions. Different
means, same ends (read: confused and manipulated proles).

But it's the ends that matter. Both Huxley and Orwell warned us. The irony is
that, at this point, too few are paying attention.

~~~
WalterBright
The truth has always been fuzzy/slippery. There never was a golden age of
unbiased nothing-but-the-correct-truth journalism. The only change now is it's
a little more obvious.

~~~
wybiral
> The truth has always been fuzzy/slippery. There never was a golden age of
> unbiased nothing-but-the-correct-truth journalism. The only change now is
> it's a little more obvious.

And IMO that realization is a sort of "coming of age" that isn't in any way
unique to modern generations.

When you become an adult and realize that the people running the show are just
humans like you...

And that anyone selling you a simple "one right perspective" is only selling
you part of the truth.

~~~
Paraesthetic
Or you realise that anyone selling you anything merely wants your money.

These days media organisations manufacture outrage for the sole purpose of
getting you mad, only for you to read the article, realise the title is
nonsense, and discover you've just given them ad money.

~~~
norlys
I think you're wrong on a tiny but important point. It's not "all media
organisations offer you senseless outrage and clickbait"; it's "some media
organisations". There are plenty of meaningful, in-depth, well-written
articles that provide new viewpoints on topics worth talking about. You just
have to find those outlets. And I find it important to honor them instead of
claiming "all media provides bullshit", which fuels a certain toxic "there is
no truth anywhere in the media" atmosphere.

------
ncantelmo
My knee-jerk reaction to the book Hooked (which I've somewhat shamefully
avoided reading out of a sort of premeditated disgust) was based on the
general arguments laid out in this blog post.

I believe the tech world has a huge problem on its hands here, and that
problem doesn't get anywhere near the attention it deserves. Large swaths of
the population are turning into zombies and tech is the facilitator. Try
looking at the drivers turning through a busy intersection some time. It's
obscene how many are staring down at their phones rather than at whatever
might be in the road around the corner.

That said, I don't believe the problem is simply that certain products remind
or encourage people to use them at times. Rather, it's that there is a class
of truly addictive products that don't do much (if anything) to help people
manage the resulting dependency. And that approach is celebrated far more than
it's scrutinized.

Coincidentally, I wrote the following on Twitter yesterday:

"Any potentially addictive software/service should offer a way for users to
lock themselves out of their own account for a predetermined set of hours."
[0]

It was meant in my usual tongue-in-cheek style, but I do think that a solution
for breaking tech product dependency is sorely needed and hope this discussion
gains some steam.

[0]
[https://twitter.com/ncantelmo/status/930876372620374017](https://twitter.com/ncantelmo/status/930876372620374017)

~~~
sjellis
"Any potentially addictive software/service should offer a way for users to
lock themselves out of their own account for a predetermined set of hours."

It's really noticeable that Hacker News provides exactly this feature, and
absolutely no other site that I've seen does.
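
For reference, HN's lockout option is the "noprocrast" setting in the user
profile, with "maxvisit" and "minaway" timers. The core logic is simple enough
to sketch; the parameter names below mirror HN's profile settings, but the
implementation is purely illustrative, not HN's actual code:

```python
import time

class Noprocrast:
    """Sketch of an HN-style self-lockout.

    Allow up to `maxvisit` minutes of browsing, then refuse requests
    until `minaway` minutes have passed. Names mirror HN's profile
    settings; the logic here is an illustration.
    """

    def __init__(self, maxvisit=20, minaway=180, clock=time.time):
        self.maxvisit = maxvisit * 60   # seconds of allowed browsing
        self.minaway = minaway * 60     # seconds of enforced lockout
        self.clock = clock              # injectable for testing
        self.visit_start = None

    def allowed(self):
        now = self.clock()
        if self.visit_start is None:
            self.visit_start = now      # first request starts the visit
            return True
        elapsed = now - self.visit_start
        if elapsed <= self.maxvisit:
            return True                 # still inside the visit window
        if elapsed >= self.maxvisit + self.minaway:
            self.visit_start = now      # lockout served; new visit begins
            return True
        return False                    # locked out
```

The key design point is that the lockout is enforced server-side, so "just
this once" willpower failures on the client can't defeat it.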

~~~
louprado
Discussion regarding profile options:
[https://news.ycombinator.com/item?id=4855004](https://news.ycombinator.com/item?id=4855004)

For years I resorted to hacking /etc/hosts to block HN. Mentally, I just
assumed that no website would have such settings, so I never bothered to learn
what noprocrast meant.
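
For anyone curious, the /etc/hosts trick amounts to pointing the site's
hostname at the loopback address so the browser can no longer reach it. A
minimal sketch (shown against an arbitrary file path; on a real system the
path would be /etc/hosts and editing it requires root):

```python
from pathlib import Path

def block_site(hosts_path, hostname):
    """Append loopback entries so `hostname` resolves to the local machine."""
    with open(hosts_path, "a") as f:
        f.write(f"\n127.0.0.1 {hostname}\n::1 {hostname}\n")

def unblock_site(hosts_path, hostname):
    """Drop every line that mentions `hostname`."""
    p = Path(hosts_path)
    kept = [l for l in p.read_text().splitlines() if hostname not in l]
    p.write_text("\n".join(kept) + "\n")
```

Unlike noprocrast, this is all-or-nothing and trivially reversible by the
person being "protected", which is exactly why it works poorly as a
procrastination defense.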

------
zaptheimpaler
IMO tech has simply become the dominant form of media. People who would
earlier spend 3 hours/day watching TV might now spend 4 hours/day on
Netflix/FB/Twitter etc.

The fundamental problem is not tech, it is the incentives of those who pay for
the media we consume. Advertisers pay for media and it is in their interest to
make the medium as addictive as possible. And when there is enough money to be
made as an employee or shareholder of ad companies like FB/Google, people look
the other way as long as they benefit.

The root problem is simple greed, and it's a very old problem. Things are not
going to change anytime soon as long as people are able to rationalize doing a
small amount of evil for enough money/prestige. (Not to say I wouldn't do the
same.)

That's why these discussions seem pointless to me. Ultimately the people
paying for/investing in a company decide how it operates, and they do not
give a f __* about the next article telling us how bad the product is. Even
ordinary shareholders will happily buy FB stock if it's projected to go up,
never mind why that is. When you operate within an economic system that is
built solely to optimize revenue... it optimizes revenue, at all social costs.

~~~
intended
I think there is great truth to what you said, but reducing tech to just media
hides a set of critical and novel dangers.

Tech provides a new set of tools - finely grained, invasive, and
mass-producible - which target areas of human existence previously protected
only by how arcane and nebulous they were.

As human beings, we have many tendencies that are unguarded against buffer
overflows or malicious code injection.

With these tools, and corporate/shareholder incentives - firms and teams will
run amok.

It's no longer just one-way media; it's now very close to mind-invading media.

Sadly, the only solution seems to be to ditch the smartphone altogether.

~~~
zaptheimpaler
You are saying tech is a more powerful tool than traditional media. I agree.
But the purpose the tool serves in society is a question of how people wield
it, not how powerful it is.

------
dj-wonk
I am glad to see more people caring about the long-term effects of their use
of social software. More of us are mindful of how we engage with technology.

Now that I've said the positive, let me talk about the negatives. I bet things
will get worse before they get better. At present, the commercial pressures on
social media platforms to manipulate "users" dwarf human pushback against
these "dark art" techniques.

I don't see any easy solutions. As with many collective challenges, we need
increased awareness to effect change. This is arguably a political problem; we
need social organizing as a counterbalancing force.

~~~
TheOtherHobbes
We don't need "increased awareness", we need a social decision system (aka "an
economy") that rewards positive social outcomes and disincentivises negative
social outcomes.

At the moment we have the opposite. No one cares how you make your first
billion or two, or how people suffer as a side effect. They only care that you
have a billion or two. Toxic side-effects remain invisible and uncounted.

There's a certain irony in attacking EA for building a game economy that's
closer to the effort/reward profile of the real economy - for both players and
owners - than any other game economy to date.

~~~
dj-wonk
First, a question for you: What values would you like your "social decision
system" to embody?

Second, I advocate for increased awareness because awareness is the first step
towards rational thought. Many psychological manipulation techniques work best
on the irrational parts of our brains, so often a useful first step is to
engage the rational mind. That said, I'm not claiming awareness guarantees
systemic change.

------
a-guest
And then today, in a YC blog post about "Startup Ideas", there is a suggestion
for a "Social Network for Children".

[https://blog.ycombinator.com/13-startup-
ideas/](https://blog.ycombinator.com/13-startup-ideas/)

Please, no.

------
bryanrasmussen
I was going to clap for this story, the first time ever (despite my misgivings
about its grammar and spelling - I'm sure I've made errors in this post, but
none as egregious as the ones I've found in the essay). But it turns out that
to clap on Medium I need to create an account. I decided to forgo the
opportunity out of respect for the author's evident wishes.

~~~
hohenheim
Ironically, clapping is the very device Medium uses to keep writers writing,
the same stimulus that releases dopamine on other platforms.

And yet, the author decided to write his article on the very site that uses
techniques Nir talks about to promote engagement.

~~~
Devolver
I agree. There is a contradiction here. And I am nearly as uncomfortable with
centralized publishing platforms as I am with mass scale behavior design for
profit.

And yet.

I originally published on my own site and it didn’t get much traction.

But when I edited it substantially and published as a direct response to Nir’s
post, it reached 30,000 people, 6000 of whom finished the whole thing...in
less than 24 hours.

Since my goal is spreading the message and fueling the conversation, I gotta
go where the distribution is.

~~~
bryanrasmussen
hmm, sounds like that distribution thing is highly addictive /I kid

~~~
Devolver
I’d be lying if I said that reaching an audience and seeing my post generate
(mostly) positive, thoughtful discussion didn’t feel good.

But that feeling (for me, at least) is closer to gratitude and hopefulness
than it is to ego gratification.

The claps and notifications of retweets are, however, quite addictive, in the
truest sense of the word.

I find myself checking twitter more than is reasonable to see where it’s
going.

And while there is some upside there (connecting and engaging in conversation
with like-minded peeps), the frequency of my checking is highly
disproportionate to that value.

Ugh.

------
denkmoon
The thin end of the wedge here is advertising. It is deliberately manipulating
people using psychological insights, often targeting weaknesses to create a
recurrent cash flow.

Yet the entertainment and content-creating side of the internet relies very
explicitly on ad revenue.

Are creators that rely on revenue generated by manipulating human psychology
morally bankrupt?

~~~
SomeStupidPoint
They invented a whole language to paper over the fact that their entire field
is weaponized psychology employed against their own customers. Do a
noun-for-noun substitution to unpack a marketing meeting, and you'll want to
puke.

Frankly, it's shameful for any company that says they care about their
customers to employ marketing psychology tactics. Marketing and advertising
are (in their modern, virulent form) simply an attempt to hack viewers to
shift the locus of control from the customer to the business. (Other
objectionable things about tech have this kind of behavior, too.)

Frankly, it's just disgusting, and I can't blame customers for viewing
companies really negatively based on their advertising practices.

(If all that wasn't bad enough, online advertising often comes with literal
malware on top.)

------
yodsanklai
For me, the main problem is that I need my computer for what I consider
"legit" activities such as working or communicating with my friends. Starting
from there, it's easy to get trapped into other "toxic" activities such as
facebook or news. I'd like to be able to just put the computer aside for a few
days but it's almost impossible.

------
westmeal
I've never agreed with an article quite as much as this one.

~~~
xkcdef
Also, I would like to point out three things to folks who are asking "What
about personal responsibility? What about agency?" here and on the article
comments.

Facebook's shadow profiles take any 'agency' and 'personal responsibility'
out of the equation. While I don't yet know of Facebook using shadow profiles
to make someone's life worse off, their penchant for saying "Screw it" to any
ethical considerations makes me feel it's only a matter of time before even
that crutch goes away.

Secondly, there is the complexity. You cannot take personal responsibility for
something that you don't fully comprehend. At the same time, we are at a point
where AI produces data insights that even the AI builders cannot reason about.
For example, how does someone take responsibility for clicking the Like button
on a post and then finding out that most of the people who liked the post were
(say) closet anarchists which was then correlated with them being the best
targets for some kind of subversive advertising from (say) the NRA?

Thirdly, there is the asymmetry involved in "moderate use". It is possible for
Facebook to create "automated notifications" implemented at a very tiny cost
to them in terms of their resources/profits while the effort needed on your
part to make sure you don't turn into a Pavlovian "notifications" dog comes at
an extremely high time cost to you.

------
m3kw9
When max profit means max attention from users, public companies like FB will
naturally craft products this way.

------
uoaei
The thing is, it doesn't seem like much of this is even intentional. A/B tests
(probably inspired by some sort of thought experiment re: 'getting more
engagement') tell a story that says you get more users if you do this thing,
and that thing just happens to be addictive.

~~~
randomsearch
I used to think that, until Sean Parker recently stated it was a conscious,
deliberate decision to exploit people.

------
bryanrasmussen
There's a thing in essays where the author signifies cutting through the bull
with some earthy and down-home talk, but I think this sentence "It’s too soon
to tell, but it ain’t look so pretty from here." is a step too far.

~~~
bryanrasmussen
actually maybe the article just needs an editor or at least a proofreader.

~~~
Devolver
I chose to use this grammatically broken slang very consciously.

------
CodeWriter23
Nir Eyal didn’t invent anything. He simply adapted mass media techniques to
the App era.

------
Blazespinnaker
Gamification can provide value. Stack Overflow is all about addiction and
value. The blog title was manipulative in its hyperbole, btw.

~~~
ben_w
I wouldn’t call _stackoverflow_ addictive - if anything I use it too little! -
but I do have that problem with some of the linked network sites on
stackexchange. “Why yes, who would win in a fight between Aquaman and The
Flash _is_ more interesting than why the third argument of the ExampleFooAPI
constructor doesn’t behave itself!”

(I’m glad adblock lets me hide that block. It really helped me avoid those
distractions.)

------
PeOe
I've read a book recently called "Hooked" by Nir Eyal.
[https://www.amazon.de/Hooked-How-Build-Habit-Forming-
Product...](https://www.amazon.de/Hooked-How-Build-Habit-Forming-
Products/dp/0241184835/ref=sr_1_1?ie=UTF8&qid=1510910504&sr=8-1&keywords=nir+eyal+hooked)
It's exactly about this topic. He also gives quite a few examples of the
achievements of Apple and so on. Really interesting.

~~~
placebo
The entire post is addressed to him, and not in a positive way. I tend to
agree...

------
wppick
Stories like this might appear to hurt Facebook and similar companies, but
they actually help them by creating a barrier to entry: the "next Facebook"
won't be able to use the same strategies.

~~~
Danihan
I highly doubt a blog post will have that effect.

~~~
Devolver
Me too. But you know what they say about the pen and the sword?

Maybe we need to update the saying: “the keyboard and a content database can
be mightier than the predator drone”

?

------
acchow
This is a fairly similar argument to saying drug dealers are bad because they
profit off your addiction. I don't know if I agree or disagree with that, but
at what point do we actually acknowledge agency?

~~~
scythe
Drug dealers don't have intricate control over the effects of the drugs
they're selling. I imagine if synthetic drugs were legal in general rather
than specific compounds you might see similar problems pop up (e.g.
3-fluoroethamphetamine, a compound screened in rats to be maximally addictive)
but it's reasonable to advocate for a drug market that avoids this.

~~~
emmab
> Drug dealers don't have intricate control over the effects of the drugs
> they're selling.

yet

------
ttoinou
The original article seems fine. I don't see how it was immoral. It seemed
more like "amoral":

[https://medium.com/the-mission/the-morality-of-
manipulation-...](https://medium.com/the-mission/the-morality-of-
manipulation-c3115fb2bb3d)

    
    
      In the meantime, users will have to judge the yet
      unknown consequences for themselves, while creators
      will have to live with the moral repercussions of how
      they spend their professional lives.
    

[https://en.wikipedia.org/wiki/Amorality](https://en.wikipedia.org/wiki/Amorality)

~~~
scythe
People are amoral, acts are immoral. An act by an amoral person can certainly
be immoral.

------
spaceribs
If you can name a successful product which does not manipulate users to
continue using that product, then it's not a product:

* I've been manipulated to bring an umbrella because I don't like to get rained on.

* I've been manipulated into liking broadband because it provides the internet faster than dialup.

* I've been manipulated into owning a car because it's a fast convenient mode of transportation.

These are the basics of products and services that provide value; this isn't
magical or new, and this book is probably just being more honest than most
industries are.

The point the book was making is that if you don't want your product to be
flash-in-the-pan garbage, follow some simple guidelines to ensure that your
service actually provides some sort of value to end users; otherwise your end
users will eventually leave and find better things to do with their time and
resources (unless you're selling heroin, I guess).

I don't consider Facebook sending me an email when I get a personal message to
be exploitative, but I find no value at all when they send me something every
time one of my friends farts. I specifically avoid interacting with those "hey,
someone did something" emails because they are annoying and make me want to
stop being on Facebook altogether. That's the result of abusing the hook
model: users go away because it's annoying and unproductive.

~~~
TheOtherHobbes
You haven't been manipulated into buying an umbrella at all, because deciding
you'd rather stay dry than get wet is a straightforward choice; a product that
offers you that choice is a straightforward product which provides a tangible
service.

To be manipulated, you'd have to be exposed to a constant stream of ads
telling you how wonderful umbrellas are and what an unattractive loser you are
for not owning one; how desirable people with high status use exclusive
umbrellas while most people get by under a wet newspaper; and how you should
buy a new umbrella every year because they're constantly improving, and this
year's umbrella already has more than a billion users worldwide and offers 23%
more anti-rain features thanks to the exclusive Umbrella Corporation[tm]
lifestyle ecosystem. (And now here's a video of attractive people smiling, and
of a world-famous umbrella designer talking earnestly about engineering,
design, and craftsmanship.)

You'd also see paid testimonials and likes on social media sites. They'd
appear to be spontaneous and sincere praise, but in fact they'd be factory
farmed and produced to order by unhappy people working in open plan offices
paid so little they can barely afford the rent.

 _And_ you'd have an umbrella that only worked reliably some of the time, so
you'd have to keep considering the purchase of an improved model - in the vain
hope you could finally end your constant nagging frustration with umbrella
technology.

Worst of all, you'd believe you were making a straightforward umbrella-buying
decision entirely on your own initiative, using perfectly objective and
rational criteria.

~~~
spaceribs
Has Facebook done any of that? Did you come to Facebook because of an ad, or
was it because you wanted to keep in contact with your friends and family? The
fact that they are trying to use those social ties to make you look at more
ads is what's amoral, not the fact that they sent you an email that doesn't
apply to you as a "hook".

~~~
ben_w
The explicit description at one of the Facebook developer conferences I went
to was to do things which _caused users to stay for longer_. I think the word
they used was some pleasant-to-neutral term like “compelling content”, but the
goal was to switch from “more users” to “longer stays” - and to use psychology
to achieve that.

Then there is the games sector. I was only in that sector from 2007 to 2010,
but when I left many of them were asking how to make their content
“addictive”.

I’d say it’s immoral that Facebook et al have wasted subjective lifetimes of
their users’ consciousness by addicting them to whichever drug the brain
synthesises to reward the minor and unfulfilling social interactions that it
provides (and the reverse, the pain from the fear of missing out).

