
How Social Media is Built for Addiction - zipwitch
https://www.vice.com/en_us/article/the-secret-ways-social-media-is-built-for-addiction
======
localhost3000
I built a chrome extension to help me with this addiction. It replaces the
distracting parts of social networking sites (like Facebook's News Feed) with
a todo app to keep me on task whenever I look to these sites to procrastinate.

The app unlocks the site for a configurable amount of time after I've
completed my tasks.

Also supports Hacker News, Reddit, YouTube, Twitter, and Product Hunt.

It has helped me a lot with (a) recovering lost productivity and (b) feeling
less addicted to these sites.

If curious: Todobook
[https://chrome.google.com/webstore/detail/todobook/ihbejplhk...](https://chrome.google.com/webstore/detail/todobook/ihbejplhkeifejcpijadinaicidddbde?hl=en)
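
The gating mechanism described above can be sketched as a small pure function: the site stays blocked until the todo list is empty, then unlocks for a configurable window. This is a hypothetical illustration, not Todobook's actual source; the names `TodoState` and `isSiteUnlocked` are made up for the sketch.

```typescript
interface TodoState {
  openTasks: number;          // tasks still unfinished
  unlockedAt: number | null;  // epoch ms when the list was last cleared
}

// The site is viewable only for `unlockMinutes` after the todo list is cleared.
function isSiteUnlocked(state: TodoState, now: number, unlockMinutes: number): boolean {
  if (state.openTasks > 0 || state.unlockedAt === null) return false;
  return now - state.unlockedAt < unlockMinutes * 60_000;
}
```

A content script could run a check like this on page load and decide whether to swap the News Feed element out for the todo UI.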

~~~
bencollier49
Now if someone can figure out a way for Facebook to rewrite their site in pure
WebAssembly, they won't have to worry about pesky programmers like you
breaking people's habits.

~~~
hello_newman
I guess I'm new to the whole WebAssembly thing. Could you please elaborate
more on that statement? If FB were to do that, how does WebAssembly stop you
from doing what OP suggested?

~~~
bencollier49
Theoretically, it doesn't. Practically, it would be a complete nightmare, as
you'd have to decompile the source code into a high level language, then come
up with some kind of (proper) hack, and run a code insertion tool in your
browser to reinsert the changes. And they'd probably break any time Facebook
altered their code at all, and moreover, FB could use a polymorphic compiler
to make such things impossible. And finally, decompiling the program would
probably be illegal if it was to circumvent the intended use of the site:

[https://en.wikipedia.org/wiki/Decompiler#Legality](https://en.wikipedia.org/wiki/Decompiler#Legality)

------
lyra_comms
I used to work for a social network. We used to have 3-hour meetings every
week to discuss how to get the users addicted and keep them that way.

That's why I now help to develop Lyra, an open, ethical conversation platform
which respects language and attention. www.hellolyra.com

We're also a nonprofit with no marketing budget, so excuse the plug ;)

~~~
neogodless
If I understand correctly, this is thoroughly different from "social media."
In fact, it looks like "connections" are all but irrelevant - only
"conversations" matter. Of course, conversations are richest when you have an
understanding of the other participants' backgrounds and depth of knowledge, so
you can explore, test, debate and grow your own knowledge, or adjust your
beliefs.

My impression is that Lyra targets a niche of individuals that "enjoy
conversing." I'm not sure if there's a nexus, a central target topic of
conversation (like a sub-reddit would have) or a connective tissue such as
real-life familiarity (like a friends and family social network would have) to
hold your users together.

Can you elaborate on who your target audience is, how they will be attracted
to the site, and what will maintain their interest (while avoiding the trap of
making their engagement addictive)?

~~~
lyra_comms
Connections are relevant in the form of groups, which are private to you: you
use them to set conversation audiences and control whose conversations appear
in your news feed.

Our target audience is those who want to converse online without the
restraints and commitments of social media services which aim to addict rather
than to support and convey language.

We don't aim to enable the discovery of interesting links - Reddit has this
completely down. But it's no good for conversing in a focussed way with people
you know.

Letters and email link people internationally in ways that are impossible on
Facebook or Twitter. When you receive an email, you read it in a focussed
space without distractions; and you're aware of the fact that it was
personally sent to you as opposed to broadcast to a large channel. A narrowcast
will always be a more personal experience than coming across a message
delivered by a mysterious algorithm.

Lyra does have a public news feed in which conversations are viewable to all
users. If you add people to your groups and set the feed to view only these,
it's the exact equivalent of Facebook.

What we don't do is offer particular spaces (hashtags or subfora), because
these instantly offer a target for harassment and abuse, and require
moderation. Lyra's design makes it trivial for people and topics that are
vulnerable to abuse to control their reading and avoid it.

------
Alex3917
This article also uses manipulative tricks to maximize your time on page.
Notice that each paragraph is relatively short and approximately the same
length. This is done purposefully to reduce the cognitive overhead of reading
and makes it easier to keep going from one to the next, always with the hope
that maybe _the next_ paragraph will deliver some key insight.

~~~
nerdponx
> Notice that each paragraph is relatively short and approximately the same
> length

This technique is also called "good writing" in some fields. You see it in
newspapers a lot.

~~~
zebrafish
Correct. News style prose.
[https://en.wikipedia.org/wiki/News_style](https://en.wikipedia.org/wiki/News_style)

In disagreement with the parent's snarkiness, I would argue there's a fine
line between efficiently structuring written information to convey important
news in a time sensitive manner and using bright red numbers to keep users
swiping your app all day. Especially when it comes to messing with people's
self-worth and validation!

"You may have won $100! Watch these 5 advertisements to find out!" is gross
but palatable to me.

"People might like you! Refresh the page and view another ad to find out!" is
appalling.

~~~
ouid
In what sense is "You may have won $100! Watch these 5 advertisements to find
out!" palatable?

~~~
everybodyknows
In the sense that it exploits only the reader's poverty or greed, and even the
most credulous will soon tire of watching ads, close the page, and be wiser
for the experience.

Whereas "People might like you! Refresh the page and view another ad to find
out!" cruelly exploits a desperate emotional vulnerability. And there is no
end of it in sight.

------
smitherfield
_"You know when you open Instagram or Twitter and it takes a few moments to
load updates? That's no accident. Again, the expectation is part of what makes
intermittent variable rewards so addictive. This is because, without that
three-second delay, Instagram wouldn't feel variable. There's no sense of will
I win? because you'd know instantly. So the delay isn't the app loading. It's
the cogs spinning on the slot machine."_

Can anyone confirm if this is true or not? Do social media apps insert
`sleep()` calls on startup for gamification reasons? I've only seen that for
things like save dialogs, but I also don't work in social media.

~~~
meagher
This seems especially devious if it's true. (I seriously doubt it is.)

~~~
quadrangle
Oh the cute naivete of your innocent optimistic doubts…

This sort of thing is well-known, so doubting this particular case is not a
well-founded doubt.

~~~
blauditore
How is this "well-known"? Do you have a source?

~~~
quadrangle
I don't mean this exact case, I mean stuff like intentional slowing down of
results. People don't trust results that come too quickly. We like to feel
that the computer worked hard for it. E.g. Airline prices or TurboTax results.

[https://www.theatlantic.com/technology/archive/2017/02/why-s...](https://www.theatlantic.com/technology/archive/2017/02/why-some-apps-use-fake-progress-bars/517233/)
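
The fake-wait pattern that article describes is easy to sketch: pad a fast operation so it never resolves before some minimum delay. This is a generic illustration, not code from any real product; `withMinimumDelay` is an invented name.

```typescript
// Resolve only once both the real work and a minimum pause have finished,
// so the result never appears "too fast" to feel earned.
function withMinimumDelay<T>(work: Promise<T>, minMs: number): Promise<T> {
  const pause = new Promise<void>((resolve) => setTimeout(resolve, minMs));
  return Promise.all([work, pause]).then(([result]) => result);
}
```

The same effect is often achieved less deliberately by script bloat, which is part of why the two are hard to tell apart from the outside.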

------
cflewis
I wrote my thesis on this stuff:
[http://www.soe.ucsc.edu/~ejw/dissertations/Christopher-Lewis...](http://www.soe.ucsc.edu/~ejw/dissertations/Christopher-Lewis-dissertation.pdf)

which I turned into a book: [http://a.co/3uR0koj](http://a.co/3uR0koj)

I find these sort of articles annoying. I've not read the book referenced in
the article, but I've seen some of Tristan Harris and I think he's a smart guy
but stuff like the 60 Minutes interview plays into him looking like a hack who
is far more interested in self-promotion than in the actual topic.

It just doesn't compute that "knowing what drives users" == "addiction
manipulation". Yeah, there _are_ slot machine-style techniques that you can
use in those areas, but they don't work well or for long and users grow to
resent them.

But the stuff that keeps getting brought up, like "people on Facebook are
desperate for validation and connection" makes it sound like users are forced
to be there. They aren't. They're going there _because_ Facebook _can_ provide
those things, and those things are meaningful. It's meaningful to me that as I
grow older and my time is more and more committed to family and work, that I
can connect with my best friends from my home town. That means something to
me.

I'm at work so I can't write a lot more, but I am super happy to answer
questions. I just hate these articles that assume that people are dumb, that
social media is inherently vacuous, and that UX designers have some evil
necronomicon of addiction recipes. It's cheap journalism.

Disclaimer: Everything here is my own opinion and not that of my employer in
any way shape or form.

~~~
throwanem
Of the eight dark patterns you identify as such in your thesis, Facebook
implements at least two, possibly three - I could swear I've seen reports of
what you describe as "hellbroadcast", but I don't have a cite to hand. Your
"impersonation" I've seen very recently reported here on HN at least, and what
even _is_ Facebook if not the ultimate example thus far of what you
(correctly) term a "social pyramid scheme"?

In general, while I'm willing to assume good faith on your part, I do find
myself moved to inquire further into how you compose, on the one hand, having
produced detailed documentation and an inventory of examples on how to develop
user experiences which are designed specifically to the purpose of maintaining
user engagement in ways largely orthogonal to the inherently engaging quality
(or lack thereof) of the underlying content - and, on the other hand, your
argument here that platforms like Facebook, which are so broadly known to
implement such patterns that to advance the claim merits no controversy
whatsoever, do not achieve any benefit, or their users suffer any detriment,
from such implementations.

One further point: I don't see the article here under discussion assuming that
"people are dumb", or that "social media is inherently vacuous". Instead, what
I see is a very reasonable question being raised around, on the one hand, the
ethics of social media platforms like Facebook using these techniques to drive
engagement, and, on the other hand, how people might protect themselves from
being so manipulated - a subject which, while I haven't yet read it in detail,
seems to be covered in your thesis as well, under the name of "manipulation
literacy". I look forward to reading your exegesis of the topic!

~~~
cflewis
_Of the eight dark patterns you identify as such in your thesis, Facebook
implements at least two, possibly three - I could swear I've seen reports of
what you describe as "hellbroadcast", but I don't have a cite to hand. Your
"impersonation" I've seen very recently reported here on HN at least, and what
even is Facebook if not the ultimate example thus far of what you (correctly)
term a "social pyramid scheme"?_

Fair point. I didn't mean to imply Facebook as a paragon of goodness, but
rather that social media as a whole does not inherently imply dark patterns. I
think there are certainly things I'd like to see removed from products and I
think those products would actually benefit from that in the _long-run_. The
problem is measurement: it's much easier to employ a dark pattern, show some
short-term metric that indicates "engagement" and get promoted or leave before
it all drops off again. Playing the long game is much harder, career-wise.

_I do find myself moved to inquire further into how you compose, on the one
hand, having produced detailed documentation and an inventory of examples on
how to develop user experiences which are designed specifically to the purpose
of maintaining user engagement in ways largely orthogonal to the inherently
engaging quality (or lack thereof) of the underlying content - and, on the
other hand, your argument here that platforms like Facebook, which are so
broadly known to implement such patterns that to advance the claim merits no
controversy whatsoever, do not achieve any benefit, or their users suffer any
detriment, from such implementations._

This is a good question.

I'll address it backwards: as to "merits no controversy", again I just meant
that social media doesn't inherently imply dark patterns. I often feel like
media presents the argument as such: "X does A, Y does B, all social media is
bad by default." It's a sexy headline, but I don't think it really holds a lot
of water with the majority of the 1B+ daily users of FB.

I do think a whollleee lot of research is needed into how people's social
relationships are changing and whether their real happiness is going up or
down, but I think of social media as part of the solution, not the cause. I
would have grown a lot more distant from my social connections due to
work/family as I grew older whether or not MySpace existed. I am not sure if
at the time of writing the thesis that I actively avoided discussing users as
"happy" or "unhappy" but I am glad I did. Motivation and happiness are not
necessarily the same thing (Frodo is super motivated to get to Mount Doom, but
he's not happy about it) but I think UX designers really do _intend_ their
users to be happier.

As to how I square the circle on writing a book with all these patterns, but
then turn around and say companies don't have reference material like this...
First off, it didn't contribute to anything: no-one read the book :) It
probably sold 50 copies or so. When I wrote the thesis, gamification was
beginning to cool off from its peak bullshit, but there was still plenty of
bullshit to go around. I chose the topic to sort of make the statement "if you
are going to do this stuff, at least try and do it so it's actually trying to
meet a user's needs rather than just chucking dark patterns at them." I hoped
that if the
book became popular, we'd see less of the bad stuff and more thoughtful
application of the good stuff so that apps would be more useful/meaningful to
users.

When I was doing my research, I went barking up a great number of trees trying
to find evidence of some sort of secret documentation of motivational patterns
(it was helpful studying in Silicon Valley). I never found it. Doesn't mean it
doesn't exist, but I never found it and I really don't think it does exist. I
think designers go with their gut and what worked elsewhere, and they don't
have secret teams of behavioral economists figuring out a UX pattern to bump
engagement by 3%.

Does this help? I enjoy the conversation and am happy to clarify anything.

~~~
throwanem
This does indeed help! I greatly appreciate your thoughtful response, and can
only apologize for my inexcusable tardiness in following up further.

I'm not sure it needs secret teams of behavioral economists to produce dark
patterns which drive motivation even to the detriment of the users they
artificially engage. You can hypothesize and A/B test your way into such
patterns as well, I think, and never quite realize what you're actually doing
- for those of us who develop software on a professional basis, how many of us
can maintain the perspective of an end user into the software we ourselves
work on every day? Certainly I can't, and the next one of us I meet who can
will be the first. Granted Facebook's engineers operate on a higher level than
the norm, but in that regard? If anything, they're far more isolated from
their users than the norm, too.

I'm sorry to hear your book sold poorly! I have to admit I've contributed to
that trend, since I have no application for the information it contains. But I
assume the book must differ markedly from your thesis, which appears much more
concerned with factual documentation of the patterns it describes, and much
more muted in its expression of opprobrium toward the dark patterns you cite
there. I'm not sure I agree that the use of even those patterns you recommend
is as innocuous as you suggest, but that's something on which we can
reasonably disagree, I think.

Finally, I'd note that my concern with Facebook isn't so much its effect on
its users, and on those who aren't its users but are surrounded by those who
are, _today_ - to be clear, I'm not at all sanguine about today's effects
either, but what concerns me far more is that Facebook has almost without
notice amassed an extent of power utterly unprecedented in human history, and
it has done so in a fashion which permits that power to be exercised in total
opacity - quite aside from the limited nature of options for curbing that
exercise should it prove noxious, it's very hard even to know when and how
that power is being used.

Until recently, Facebook has been satisfied simply to exist, and to grow.
After the events of 2016, though, we've already seen initial gestures on the
part of Zuckerberg et al toward a more active exercise of that power - and
whether or not you agree with the direction in which it's been suggested that
power be exercised, I should think the mere existence of that power, in the
hands of people who can do what they damn well please with it and never need
fear being gainsaid, should be worrisome in its own right. Perhaps it's being
used today in a way that accords with your preferences. Will it be ever thus?

I do, though, take your point about the conflation of social media with
Facebook, and I agree that it's understandable but not helpful that people
equate the two. Facebook's model is, I think, an accident of history - when it
came along, there were no good ways for people to both connect socially on the
Internet and maintain ownership and control of their identities and
information, and Facebook solved the first problem well enough that for a long
time nobody much thought hard about the second. But there's no clear reason
why both can't be simultaneously solved, and I know of at least one credible
project currently underway which intends to do precisely that.

~~~
cflewis
I think you hold very reasonable viewpoints :)

Re: dark patterns and being muted: a) If it was too impassioned, it wouldn't
have read very scientific.

b) From a purely self-interested standpoint, I honestly did not want to taint
my ability to have a career. Expressing hard-line views is for tenured
professors, not PhD grads who are thinking they might go into industry :/

~~~
throwanem
Thanks! And I suppose I can't really argue with that.

I mean, I _could_, but not with any conviction; on the one hand, I opted
against college and found my early adulthood less circumscribed in some ways
thanks to an absence of terrifyingly sizable debt - but on the other hand, I'm
not about to try to pretend it's anything other than the clear partiality of
almighty God, or if you wish a frankly implausible degree of good fortune,
that's made it possible for me to build, on the strength of a high school
diploma, even something vaguely resembling a software engineering career. Were
it not for so many things going my way that sometimes it keeps me up at night
just with the thought of how things would be if even one of them had broken
differently, I'd be doing well to make $15 an hour in a job that prompted
daily fantasies of suicide. So any criticism I might presume to offer, of the
fashion in which you've achieved all that you have, would I think have to be
pretty laughable on its face.

------
imgabe
This is a side effect of their dependence on advertising. It's spillover from
the traditional media business model.

The consumer doesn't receive any real, tangible benefit from the product. If
they did, they would be willing to pay money for it. The only way social
media, and much traditional media, can profit is by collecting attention and
selling it to advertisers, so that's what they are optimized to do.

~~~
guntars
It's an interesting thought that, if users really got value out of
Facebook, they would be willing to pay for it. I think people do, but if
Facebook asked everyone to pay even just a penny, they would instantly lose
hundreds of millions of users. There just isn't a way to charge users directly
for occasionally useful products.

~~~
imgabe
Almost all of the services people actually use Facebook for were things that
they paid for in the past. It's certainly harder now that these things are
"free", but I think there would be a market for a Facebook clone that said
"Hey, just give us $5 a month and we won't sell your data or bombard you with
ads".

Let's see, keeping up with your friend? That used to be the telephone, for
which people paid _by the minute_ - an hour-long call with your friend or
family in another state might run $5, every time you wanted to talk to them.
Or you could mail a letter. Slower, but still costs money for paper, envelopes
and a stamp.

Wanted to share photos? First you had to buy a camera. Then you had to buy
film. Then you had to pay to get the photos developed (extra for duplicates).
These are clearly things that are (or at least were) worth money to people.

But a stream of random snippets from people you haven't spoken to in 20 years,
asinine clickbait, and political rhetoric? Yeah, nobody wants to pay for that.

------
r3bl
The name Tristan Harris didn't ring a bell at first, but as soon as I heard
the slot machine analogy, I immediately recognized his work.

I recommend his TEDx talk[0], his essay on the topic[1] and finally, his Time
Well Spent[2] project to get a firm grasp of his point of view.

(0)
[https://www.youtube.com/watch?v=jT5rRh9AZf4](https://www.youtube.com/watch?v=jT5rRh9AZf4)

(1) [https://journal.thriveglobal.com/how-technology-hijacks-peop...](https://journal.thriveglobal.com/how-technology-hijacks-peoples-minds-from-a-magician-and-google-s-design-ethicist-56d62ef5edf3)

(2) [http://www.timewellspent.io/](http://www.timewellspent.io/)

------
tdaltonc
I built two things: one makes apps more addictive, the other makes them less
addictive.

The one that makes apps more addictive is
[https://useDopamine.com](https://useDopamine.com)

The one that makes things less addictive is
[http://youjustneedspace.com](http://youjustneedspace.com)

------
0xcde4c3db
The headline claims it's "no accident", but couldn't these properties evolve
out of a lot of A/B testing without the product managers or engineers
realizing the broader behavioral implications?
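
As a toy illustration of how that could happen: an A/B pipeline simply keeps whichever variant maximizes an engagement metric, with no model of why it wins. The variant names and numbers below are invented.

```typescript
interface Variant {
  name: string;
  sessionsPerUser: number; // the engagement metric being optimized
}

// A/B testing reduces to argmax over the metric; any behavioral side
// effects of the winning design are invisible to this selection process.
function pickWinner(variants: Variant[]): Variant {
  return variants.reduce((best, v) =>
    v.sessionsPerUser > best.sessionsPerUser ? v : best);
}
```

Run that loop for a few years and the product converges on whatever holds attention, whether or not anyone set out to build a slot machine.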

~~~
trebor
Sure. But that also doesn't mean it's an accident. They're trying to increase
engagement, and what do they have at hand other than addictive techniques?

~~~
yourapostasy
> ...and what do they have at hand other than addictive techniques?

Reaching for addictive behavioral models is like reaching for management
policies when the hard work and long-term investment into leadership is more
sustainable.

To directly respond to your question, they can for example move to a
facilitation model; engagement is encouraged through longer discourse, or
through editorial staff whose job is to summarize viewpoints/trends/patterns
that quickly bring newcomers up to speed, but they do not moderate.

Another possibility is an aspiration model that shows the small steps and
decisions it takes to succeed, rather than orient towards the highlights reel
view favored today. Say someone you admire and follow wakes up at 3 am to go
work out: the social media connection time-shifts to your timezone, corrects
for your distance to your gym, and encourages you via your smartwatch to do
the same, in realtime. They eat exactly 100g of oatmeal for breakfast; the
social media connection corrects for your body measurements (lean body mass,
gender, height, _etc._ ), and says you can eat 90g to follow your aspiration
person.

Yet another possibility is community-wide goal setting and pursuit. Engagement
through groups creating their own layouts, platform-based applications, _etc._
to both decide and pursue their goals.

There are more models to consider. I posit they are not pursued by FB because
they decided they cannot monetize those models under their current ad-driven
monetization model, which in turn is governed by the capitalist (though not
necessarily free market) model they operate under. Relax some or all of those
constraints, and the other kinds of engagement models might emerge?

------
spaceribs
I'm surprised they didn't bring up that this is thoroughly described in this
book: [http://www.hookmodel.com/](http://www.hookmodel.com/)

The ethics chapter is awesome by the way...

~~~
hammock
This concept is much bigger than a pop science book.

For example, the article quotes the author of a different pop science book on
the same subject; quotes a different source on ethics; and references
intermittent variable rewards which were pioneered by BF Skinner almost 90
years ago.

~~~
spaceribs
Yeah, I was just wondering out loud because it's a source "of the times" from
people building these sorts of addictive products knowingly. I just read the
book and was surprised it wasn't used as proof that people are knowingly doing
it.

~~~
hammock
Scanning the article one more time, it seems pretty limited in focus to "like"
buttons and social validation. You could probably write a whole book on the
other types of products and devices :)

------
chad_strategic
I don't have a Facebook account. I had a Twitter account that I used as a
news feed for many years, meaning I really didn't tweet, I just read other
people's tweets. Long story, but my account got suspended. Then it occurred to
me that maybe I don't need it any more. It's been 20 days, and I really
haven't looked back.

It kind of feels like when I quit smoking...

------
danso
This is a good topic -- I mean, in the sense that I believe the appeal of
social media is firmly rooted in our desire for social connection and
validation. But that doesn't mean that the design of buttons and interaction-
loops can't meaningfully add to the addictive nature. That said, I thought the
OP's examples were a little flimsy without citing evidence (that the "few
moments to load updates" is deliberately designed around psychology of
intermittent variable rewards. I mean, sure, but the time to refresh for
updates is not non-existent, even with high speed internet).

Examining the simplification of choices to binary is a nice example, IMO, of
how to make social media feel less "work":

> _Following the introduction of Facebook's like button in 2009, YouTube
> moved to a binary like/dislike format in 2010. Instagram launched that same
> year and came ready-made with a Like function shaped as a heart. Twitter
> adopted this same heart-shaped system in 2015, while, in the years since,
> Silicon Valley has come up with a multitude of new ways to gamify our need
> for social validation._

I've never really liked FB's multiple emotions, though that is partly because
of the interface design in which it just takes longer to register one of the
other non-Like emotions. Netflix's move from stars to thumbs-up/thumbs-down
felt controversial, but I'm pretty sure I've "thumbed" more things in the past
month than I had made star ratings in the past year, simply because it was
easier. And the data that Netflix provides me in terms of potential movie
appeal -- a likelihood described as a percentage, versus a five-star scale --
feels more straightforward, and doesn't seem to significantly differ in
accuracy from what I remember of what Netflix used to recommend for me. Roger
Ebert's print reviews used a 4-star scale, but his Thumbs-Up/Down with Gene
Siskel was still very popular and widely influential.

On the topic of social media engagement design, I would love to hear more
about the recent Facebook feature that prepends my name before every
notification. e.g. _" Dan, you have 32 new notifications and 3 pokes today."_
versus _" You have 32 new notifications and 3 pokes today."_ It's such an easy
an obvious feature to implement, I wonder how long it was in the making? e.g.
A/B testing, psychology studies, etc. Can't say it's really worked on me
except to make notifications feel more of a nag but I'm likely an outlier in
that I generally rarely visit FB.

~~~
graphitezepp
On the topic of "few moments to load updates": while I also could not find
anyone claiming to have solid evidence of this practice, FB has a
well-accepted history of being intentionally anti-user with its Android app.
I wouldn't put it past them, though I suspect it is obfuscated in the form of
JavaScript bloat. Anticipation plays a big role in addiction.

------
simmons
I'm skeptical about some of the examples provided. (Do we really know that
Instagram's 3-second delay is gratuitous?) But I don't doubt that social media
is pushing people's mental reward buttons.

This seems to be happening not just on social media, but throughout the
consumer technology space. The most obvious example is how video games
skinner-box the customer, but I've also become cynical of how my industry
(television software) is driven by consumer addiction. When I started working
on television applications years ago, I saw it as an opportunity to help
people experience art. Now I see people spending every spare moment glued to
video, and I just feel like a drug dealer. :/

------
meagher
1. Delete all social apps from your phone (Facebook, Instagram, Snapchat,
Twitter, etc.)

2. Turn off push notifications or turn on Do Not Disturb (you can whitelist
favorites or all contacts, depending on your OS)

3. Be intentional when you pick up your phone

------
makesthingspos
I don't like how Reddit only shows the user top posts for the past 24 hours
or past week. It means one can't easily skip looking at it today and then see
tomorrow what one missed.

Web news is generally like this, making it hard to see yesterday's stories
today. That's one reason my preferred news source is
[https://en.wikipedia.org/wiki/Portal:Current_events](https://en.wikipedia.org/wiki/Portal:Current_events)

~~~
dang
You can look at previous HN front pages. Here's yesterday:
[https://news.ycombinator.com/front?day=2017-05-24](https://news.ycombinator.com/front?day=2017-05-24).

(We count 'day' using UTC so there's overlap with today depending on where you
are.)

~~~
everybodyknows
The sets of articles on adjacent days' front pages intersect. A merging query
over recent days would be helpful. Viz:

    query-front-pages-recent --numdays 7 |
        sort --key=HN_priority | uniq
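
The merge step of that hypothetical query could look like the sketch below, assuming each day's front page has already been fetched into a list of items (the `Item` shape is invented; the real data would come from scraping the `front?day=` pages or the HN API).

```typescript
interface Item {
  id: number;
  title: string;
}

// Concatenate several days of front-page listings, dropping duplicates
// and keeping the first (highest-ranked) occurrence of each item.
function mergeFrontPages(days: Item[][]): Item[] {
  const seen = new Set<number>();
  const merged: Item[] = [];
  for (const day of days) {
    for (const item of day) {
      if (!seen.has(item.id)) {
        seen.add(item.id);
        merged.push(item);
      }
    }
  }
  return merged;
}
```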

------
alfon
The funny thing is that when I think about this, it always comes down to
government regulation: when big corporations (whose interest is always to
maximize profits at any cost) are allowed to get inside your pocket, your
kitchen, and whatnot, and to spend billions on learning how to influence our
brains' reward systems with few limitations, I am not surprised they achieve
so much success. After all, we are limited by our biology, whether we like it
or not.

------
typetypetype
When I opened facebook this morning, I had almost 10 notifications. I would
say 1, maybe 2 of them were related to an actual interaction.

------
misterbowfinger
The same could be said of TV/sports/games/any form of leisure.

The only difference with social media is that we have actual data on how
we're spending our time, versus in the past when we didn't. To suggest that
this is new is silly.

Perhaps, _perhaps_ there's a point to be made about the scale of social media
and how many people it impacts. But it's hard for me to believe, without any
data, that it's tangibly harming us more than any other form of leisure.

------
failrate
One thing that helped me manage my addiction to Facebook was the realization
that the number of notifications I get is nearly constant no matter how long
the interval between using it is. That is, if I am frequently active, I am
reacting to the same number of notifications as if I am occasionally active.
The more you react to social media, the more it reacts back. It is a feedback
loop.

------
alexchantavy
Getting likes is addicting, sure, but scrolling through my newsfeed and seeing
autoplay videos with obnoxious text banners in impact font ('me af rn crying-
laughing-face-emoji fire 100 100') has made it so that I'm very annoyed with
social media.

~~~
wutbrodo
> Getting likes is addicting, sure, but scrolling through my newsfeed and
> seeing autoplay videos with obnoxious text banners in impact font ('me af rn
> crying-laughing-face-emoji fire 100 100') has made it so that I'm very
> annoyed with social media.

I have never understood this complaint. Why don't you just unfollow those
people? How many of them could you possibly know? This seems like a problem
that can't exist for more than a week or two if it actually bothered you,
since unfollowing someone when you see low quality posts like that will clean
up your feed pretty rapidly.

(Note that unfollow is different from unfriend on Facebook)

------
Clubber
I wish writers would stop diluting the word addiction. Go smoke a carton of
cigarettes a week for a month and you'll see what addiction feels like.

Habitual is the word they are looking for.

------
magic5227
Unlike potato chips? What product is not designed to be as addictive as
possible? One could argue they are being more manipulative but this is not
unique or special to social media. Anyone remember subliminal messages in
advertising?

I don't think characterizing it as "addictive" is totally fair. If people don't get
utility from these hooks, they'll leave. Growth hacking doesn't ensure long-
term retention, product utility does.

