
Dopamine: An AI platform for designing human behavior - jiangrybirds
https://techcrunch.com/2017/09/08/meet-the-tech-company-that-wants-to-make-you-even-more-addicted-to-your-phone/
======
013a
The same company sells an API to help make your applications as addictive as
possible [1] and a service to help users control their addiction to
applications [2].

This organization is disgusting and is evidence enough that our industry has
no sense of ethical responsibility. When massive regulation lands on Silicon
Valley and we whine about the impact it has on innovation, remember companies
like Dopamine Labs who truly deserved it.

[1] [https://usedopamine.com/](https://usedopamine.com/)

[2] [http://youjustneedspace.com/](http://youjustneedspace.com/)

~~~
TimTheTinker
To be fair, all the case studies they list [1] utilized their service to
improve users' personal habits -- diet, exercise, etc.

Behavior shaping isn't necessarily morally wrong to use, though companies like
Facebook and Google are almost completely incentivized to use them against
users.

[1]
[http://www.usedopamine.com/assets/pdf/Dopamine%20Labs%20Case...](http://www.usedopamine.com/assets/pdf/Dopamine%20Labs%20Case%20Studies.pdf)

~~~
_0ffh
"Behavior shaping isn't necessarily morally wrong to use," as long as we are
talking about parents shaping their children, or the justice system
(re-)shaping criminal offenders. Beyond that border, in my mind this becomes
very unethical indeed!

~~~
IanCal
Why not people wanting to shape their own behaviour? Nudges to exercise more
from _an exercise app I've installed_ are hardly diabolical.

~~~
mmayberry
You are correct. There is nothing wrong with giving humans the tools they
want/need to be better versions of themselves.

~~~
013a
This is _such_ a fine gray line that it's impossible to conclusively categorize
something as evil vs. good.

For example: If I decide I want to exercise more, maybe I would appreciate an
app which helps me become addicted to exercise. That might be good for me.

However, what if the app pushes me to exercise too much, and I begin to
experience health problems associated with destroying muscles? Or, what if the
app is formed on bad exercise science and it suggests routines that are bad
for me? Now suddenly the addictions created by the app are working against me;
the app isn't being irresponsible by helping me exercise, but it _is_ being
irresponsible by modifying my behavior and decision making processes to favor
using it.

Similarly, is Instagram "good" for you? Probably not. But, per your comment:
"there's nothing wrong with giving humans the tools they want". People might
want to be addicted to Instagram; that doesn't mean it is good for them or
that Instagram should deliver. People want to be addicted to nicotine and
alcohol, so we put regulations around them.

~~~
tdaltonc
The fact that there is a grey zone does not mean that there aren't also black
and white zones. It's more interesting to think about the grey zones, but it's
easier to get things done in the black and white zones.

~~~
013a
That's true, but I don't think a general purpose API for improving the
addictiveness of any application exists in one part of the spectrum. Anyone
can use this.

If we look at something like the activity circles on the Apple Watch, that's
safe enough in my mind to be pretty well in the white area.

------
BayesStreet
>Dopamine’s founders argue that they reserve the right to deny service to
specific companies whose work seems to be off the level...

Just "trust" us. There are obviously many teams working on doing this, but
never so audacious. These things are definitely not a value add to society,
yet very profitable, like e.g. heroin if that doesn't sound too hyperbolic.

~~~
api
A casino is usually more profitable than a hospital too.

It's probably the central problem of capitalism. Monetary systems draw no
distinction between wealth creation and wealth extraction, and the second law
of thermodynamics guarantees that the latter will always be much easier. When
you design chips, cure diseases, or build rockets you are fighting entropy.
When you addict, misinform, and con people entropy is on your side.

~~~
BayesStreet
This article made me pretty sad and made me think of this too. I think what an
ethical investor needs to do is not deploy capital to any industries that
don't add real economic value or wealth or whatever you want to call it. Also,
ethical founders should refuse capital from orgs that fund these projects.
Basically starve these projects out, but that's sort of idealistic.

~~~
markatkinson
Unfortunately greed often turns people from ethical to not so ethical pretty
quickly. Would be nice if the chaps above could create an AI to help with
that!

~~~
intended
We could go to unethical even faster!

An unethical AI has more degrees of freedom. It will always beat an ethical
AI.

------
the_common_man
> While Brown condemned the behavior that’s been attributed to Sacca — and
> expressed disappointment that Mazzeo found the same troubles at his would-be
> new home, Binary Capital — he said that it didn’t dissuade him from taking
> Lowercase’s money.

Yup, people always take a high moral stand... until there is money involved.

~~~
Y_Y
pecunia non olet

------
dontreact
Even if the aims are benevolent, the naming and marketing copy just sound evil
to me. They are trying to make it sound like they can turn your app into an
addictive drug.

~~~
RAB1138
Contrarian, not evil, is more of the motto. You don't get anyone to listen if
you just talk about an AI SaaS platform for long-term habit formation. It
comes off as wonkish. Trust me, we tried ;)

~~~
dontreact
Well first of all thank you for your reply and actually getting involved in
the thread.

I have to say, I'm not convinced.

Yes, of course, if you say that your product is similar to what a drug dealer
would offer, then people are going to get interested because of the potential
for returns on their money. You're just confirming my suspicion that branding
things this way was in order to help get investment. I don't see how you are
being contrarian. What is the popular opinion you are rejecting? That
addiction is scary/harmful/a terrible thing to inflict on others?

You're in a tricky position now as I personally don't see how you can reverse
course on presenting the company this way (after thinking about it for a
couple of minutes :) )... but I hope you can figure it out!

~~~
tdaltonc
I think the things we're contrarian on are "using software to design
people's behavior is bad" and "Dopamine (the molecule) is first and foremost
about addiction."

Dopamine is about learning, and learning only matters because it changes us.
The same brain mechanisms and technologies that make Facebook captivating can
be used to make fitness, a good diet, and spending time with people who love
you engaging too. When we're done building this tech stack, it will be
possible for people to learn themselves into whoever they want to be.

Disclaimer - I'm the CEO of Dopamine

~~~
Grangar
So what would ensure you keep using that tech for good? What if no clients
turn up and you still need to produce value for your investors?

In other words, how would you go about solving the 'just trust us' problem
others in this thread have described better?

------
reilly3000
There was a great Dilbert comic that said, roughly: ‘If marketing were only
slightly more effective, it would be illegal.’ I am aware of 1960s-era laws
against subliminal advertising. Are there any enforceable laws that limit how
people are influenced with media and technology?

More questions: If there is a technical definition of illegal influence, what
would the test of legality be? Who would enforce it? Who would be in
violation of those laws today? Are there any archaic laws about
advertising/mind control that could be applied to today’s tech/media/security
firms?

Finally, in a hypothetical scenario where a state actor was able to influence
the mentally fragile into turning into mass shooters, how would nations and
individuals start to protect themselves against such attacks?

------
jamesrom
Everyone is overhyping this. It's a silly company that drives engagement by
inserting short GIFs after user actions.

It has an edgy-sounding name and just enough marketing buzz to scare you.

This company won't be around in 5 years. Good UX > whatever they are selling.

------
headcanon
Is this an accurate framing of your statement, then? If we can think of the
addictive nature of apps as "brain hacking", you aim to be white hats? Using
the same techniques to protect through understanding, instead of attacking for
personal gain?

You say you only work with certain clients based on their ethics. Can you name
some examples of an application for this that has the potential to be
profitable? I, and it seems many others here, have a hard time understanding
how this science can be used both ethically and profitably.

------
RcouF1uZ4gsC
I am just waiting for a lawyer for someone with Internet Addiction
([https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3480687/](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3480687/))
to sue this company over one of their supported apps.

This is the kind of thing that I fear will prompt a backlash from the people
and government and destroy the entire online ads regime that underpins the
current Internet.

~~~
nrjames
You may not have noticed that their other product is the exact opposite. Maybe
that gives them some legal protection?
[http://youjustneedspace.com/](http://youjustneedspace.com/)

~~~
zaptheimpaler
That's despicable. They sell both the disease and a cure, for double the money.

~~~
RAB1138
Bruh. Did you even look at the price on Space?

~~~
Grangar
Bruh. Just adding another app to 'counteract' the effects isn't going to
suddenly clear you of ethical issues. It's the equivalent of McDonald's selling
salads...

~~~
Grangar
Oh and to add to this, the app seems to literally just be a waiting screen.
Everything about it screams cop-out.

------
Fricken
Before I even got through the first paragraph I was reminded of B.F. Skinner
and his book 'Beyond Freedom and Dignity', which I read in high school. So
it's pretty funny that they named their AI after him. Dopamine Labs may be
evil, but at least they're self-aware.

~~~
RAB1138
We love that one! Required reading.

His work on why you, me, everyone on this thread, and every species in the
subphylum Vertebrata learn the way we do shaped the course of thinking, from
contemporary machine learning techniques like Q-Learning on down to us!

Not evil; we have done our reading, though ;)
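
The Skinner-to-Q-Learning lineage mentioned here is concrete: operant conditioning (an action followed by reward becomes more likely) is essentially the update rule at the heart of tabular Q-learning. A minimal sketch, with the toy two-state environment entirely invented for illustration:

```python
import random

random.seed(0)

def q_learning(episodes=500, alpha=0.1, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning on an invented two-state, two-action toy world."""
    # Q[state][action] starts at 0; rewards shape it over time.
    q = {s: {a: 0.0 for a in (0, 1)} for s in (0, 1)}
    for _ in range(episodes):
        state = 0
        for _ in range(10):
            # Explore occasionally, otherwise exploit the learned values.
            if random.random() < epsilon:
                action = random.choice((0, 1))
            else:
                action = max(q[state], key=q[state].get)
            # Toy dynamics: action 1 in state 0 pays off, everything else doesn't.
            reward = 1.0 if (state == 0 and action == 1) else 0.0
            next_state = action
            # The reinforcement step: nudge Q toward reward + discounted future.
            best_next = max(q[next_state].values())
            q[state][action] += alpha * (reward + gamma * best_next - q[state][action])
            state = next_state
    return q

q = q_learning()
# After training, the rewarded action dominates: q[0][1] > q[0][0].
```

The "Skinnerian" part is the inner update line: value accrues to whichever action reward follows, and behavior shifts toward it.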

~~~
leggomylibro
No offense, but I do not believe you.

I can't, on something like this. Talk is cheap.

~~~
RAB1138
What on the evil part? Or that we make the whole team read Skinner? It's a
thin book.

~~~
leggomylibro
...really?

Welp, there's another one on the 'never touch with a 10-foot pole' pile.
Yikes, your promotional account sounds like a grinning glassy-eyed sociopath.
Serious goosebumps.

~~~
intended
A good sociopath would have been charming.

I think they’re very aware of the problems and they would rather be on the
good side of the issue.

I think he just doesn’t know how to convey this or talk about it in a forum
setting well.

------
stillsut
Did you do something nice for a coworker today?

When's the last time you said more than 'hello' to a neighbor?

The only substitute for Social "Media" is a Social Culture. The reason most
mature adults are checking their phones all the time is that nobody interacts
with any significance anymore in the public sphere. When we're in public we
are: A. trying to get somewhere else, and frustrated at being impeded by other
people, B. trying to buy something to consume, or C. trying to sell something.

The best way to fight the screens and the apps is to provide a better
alternative, IRL.

------
freeflight
This feels like such a Pandora's box.

On one hand, it's important we get a better understanding of what makes us
tick, on the other hand, I fear what such understanding might enable us to do
to each other.

~~~
tdaltonc
Disclosure - CEO of Dopamine

That's the exact same way I feel. There is definitely an end game here where
everyone has more autonomy and dignity, and that's the one we're fighting for.

~~~
freeflight
How does that endgame look? And how do you intend to get there by further
commercializing, and thus monopolizing, such knowledge?

It's not like you are doing anything new here; the only new thing is your
approach of being dead honest about what you are doing. Kudos to that.

But besides that, I don't see how any of what you are doing results in an
endgame of "more autonomy and dignity for everybody"; it rather seems to be
reinforcing the current trend where such knowledge and practices are sold to
anybody who can afford them, leaving those who can't out in the rain.

------
waterflame
I guess it's a win-win situation both ways:

- it can make people more addicted to apps, to the point where we all fall off
cliffs or get run over by an app addict while crossing the road using our
phones.

- or it can push people to the limit where they get so addicted to their apps
that they actually call for interventions, completely abandon their phones, or
seek help; introducing youjustneedspace.com </sarcasm>

~~~
RAB1138
Thanks! We think so too! Also, you have a close sarcasm tag with no open
sarcasm tag. Dude that's bad form.

~~~
waterflame
'Also, you have a close sarcasm tag with no open sarcasm tag' \-- sarcasm does
not start by priming people for what's to come. Especially those who would take
your words seriously and start to relate to your comment.

------
Zarath
Greaaaaat... Seriously, does humanity really need this?

~~~
RAB1138
Yes. See attached.

[https://youtu.be/Qsi08xvhaQ4?t=4m21s](https://youtu.be/Qsi08xvhaQ4?t=4m21s)

~~~
intended
Ok so I actually watched most of that on crappy internet, and there are many
places with big issues in the underlying assumptions.

I think around the 14 minute mark or something you made a statement that
people need to be aware of the apps they choose.

Prior to that you had argued that (mostly for the first world) behavioral
choices are the root of the major causes of morbidity.

Your redeeming speech comes at around the 20 minute mark.

Issues first:

1) You assume people have a choice, and that inherently discounts the
oversized impact of bad actors and antagonists.

This point alone, fundamentally changes the constants underpinning the model
you tacitly must be running to support your other predictions.

2) People are terrible at making a lot of choices, on average. People choosing
good apps over bad is similar to hoping people make good lifestyle choices.

Unlikely, and far too path-dependent. Unless you are highly informed, educated
about Skinner/conditioning, tech-aware, and cynical, you won't succeed.

3) A lot of those behavioral problems have arisen from large firms using
old-school behavioral influencing to create that situation.

There is no basis for hope that more behavioral tools will solve the problem.

From 20 minutes onwards, though, you're on a much better wicket, at least up
to the point I’ve watched.

Which is another thing. Very few people are going to watch a 30 minute video
of you speaking, especially when we’re on a text based forum.

------
Stenzel
So some positive feedback gets an app more attention; I get that, and apps
that infantilize their users are getting popular. But does it really have to
be presented at precise intervals, calculated by a clever algorithm made by
neuroscientists? I doubt the impact will differ when triggered at slightly
sanitized random intervals instead.
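
For context, the schedule being debated is Skinner's variable-ratio/variable-interval reinforcement: rewards that arrive unpredictably produce the most persistent responding. A minimal sketch of such a scheduler; all parameters are invented, and this illustrates the general idea, not Dopamine's actual algorithm:

```python
import random

random.seed(42)

def variable_ratio_schedule(mean_ratio=5):
    """Yield True (deliver reward) after a random number of user actions.

    This is Skinner's variable-ratio schedule: the number of responses
    between payoffs is drawn fresh each cycle, so the next reward is
    unpredictable, which is exactly what makes the pattern compelling.
    """
    while True:
        threshold = random.randint(1, 2 * mean_ratio - 1)
        for i in range(threshold):
            yield i == threshold - 1

# Simulate 1000 user actions and count how many were rewarded.
schedule = variable_ratio_schedule(mean_ratio=5)
rewards = sum(next(schedule) for _ in range(1000))
# With a mean ratio of 5, roughly 1 in 5 actions earns a reward.
```

Note that this sketch rather supports the commenter's point: a plain random draw already yields a variable schedule, so it is unclear how much a "clever" timing algorithm adds on top.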

------
kolbe
Even though their product borders on pure evil, I have a certain admiration
for how they aren't even trying to hide it.

------
anigbrowl
The originator of this project discussed it in brief on HN a few months back,
and I assume is reading this thread.

The reactions here could tell you that this is a really Bad Idea. Working on
or investing in this is a terrible choice and will forfeit the baseline level
of consideration you could normally expect from other people.

------
Turing_Machine
"if a technology like Dopamine can actually encourage retention on apps aimed
at self-improvement, that’s an inarguably good thing."

No, it absolutely is not. Some of the most horrifying abuses in the history of
the human race have occurred under the guise of "improving" other people for
their own good.

~~~
RAB1138
The men and women who write your software control the boundary conditions for
your experiences there. They control what and how you can think, feel, and
believe in that experience. When Twitter doubles the number of characters you
can tweet, they're making a decision about the quality and size of thoughts
people can express, and in turn what depth of dialogue we have there.

Analogously, when someone designs a diet app, they're creating the boundary
conditions of a new behavior pattern for users. Their use of Dopamine aligns
well with their end users' goals: in the same way that you hold friends close
because you know they rub off on you, so too should we think about our
software. Having an app on your phone you chose to let close to your mind and
behavior - to be a new venue of your thought and future self - is a decision
about who you want to become. And if that app can use good design, a good
interface, and persuasive tech to help you achieve your health or diet goals:
you win.

That's an alignment of what tech wants and what humans want.

~~~
Grangar
You can't trust people to make that decision themselves. If you keep an app
close because of its Skinner-esque variable-ratio rewards, I'd argue that's
not a conscious choice.

------
tw1010
Gee, the founders can't have been happy with that title (different than on
HN). What can founders do to avoid being represented in a negative light like
this by the press, HN? Try to be the first to frame the conversation? Or is
all press good press and it doesn't matter?

~~~
sitkack
They just got a ton of press, reaching the exact markets they want to hit.
Their fax machine is probably ringing off the hook right now.

~~~
RAB1138
WE HAVE A FAX!!??!?!

~~~
Y_Y
Have you got any products to trick people into liking your product? Your
comments in this thread seem to be having the opposite effect.

------
golergka
So many accusations of 'evil' here, without any significant evidence. As I
see it, this middleware is a tool that you can choose to use for evil or good;
the tool doesn't get a moral value because of its use.

There is, of course, an argument that any manipulation of human behaviour is
inherently evil, but it's a very weak one: if you manipulate a person to eat
well, exercise, donate to charity, call their parents, and vote for [insert a
political cause that you believe to be inherently moral], is it still an evil
thing to do?

~~~
Grangar
Manipulating people into donating to certain charities (churches?) could
certainly be an evil thing in my book. And ESPECIALLY VOTING FOR A POLITICAL
CAUSE. That is not okay.

Fact is that morals are relative and software companies shouldn't be the
arbiters of that.

------
legacynl
If everyone in the US has on average 5 apps on their phone which use your
service, and it makes people on average engaged for one extra minute each,
what would the cost be of all the lost productivity? Does the extra engagement
lead to more revenue? How do the two compare?

Also, if this is designed to release dopamine or any other neurotransmitters,
is there a risk that this will ever be classified as a drug? Or at least be
treated legally like a drug?
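
The arithmetic in the first question is easy to run. A back-of-the-envelope sketch, where every input (user count, extra minutes, dollar value of an hour) is an assumption chosen for illustration, not a sourced figure:

```python
# Back-of-the-envelope estimate of the lost-productivity question.
# Every input here is an illustrative assumption, not a sourced statistic.

smartphone_users = 250e6        # assumed US smartphone users
extra_minutes_per_day = 5 * 1   # 5 apps x 1 extra minute each, per the comment
hourly_value = 25.0             # assumed $/hour value of an attention-hour

minutes_per_year = smartphone_users * extra_minutes_per_day * 365
hours_per_year = minutes_per_year / 60
lost_value_per_year = hours_per_year * hourly_value

print(f"{hours_per_year:,.0f} person-hours/year")
print(f"${lost_value_per_year / 1e9:,.1f}B/year at the assumed rate")
```

Under these assumptions the total lands around $190B/year, though it is an upper bound: not every recaptured minute would otherwise have been productive.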

~~~
freeflight
> Also, if this is designed to release dopamine or any other
> neurotransmitters, is there a risk that this will ever be classified as a
> drug? Or at least be treated legally like a drug?

This whole topic is gonna be huge soon; it's in the same vein as the increased
use of gambling-style "loot boxes" in video games designed to be Skinner boxes.

------
gpestana
Alright, so that guy has the guts to say “It’s our job to make sure that
everyone gets treated as a human, not as an object.”

------
tempodox
Ruthless business models like this seem inevitable in a society that values
making money far above anything else.

------
yters
It's what TV advertising is all about. Why aren't we complaining about that?

~~~
dredmorbius
Examine your premises.

We are. And have been for a while now.

[https://en.m.wikipedia.org/wiki/Four_arguments_for_the_elimi...](https://en.m.wikipedia.org/wiki/Four_arguments_for_the_elimination_of_television)

~~~
yters
Flipping out about a company that claims to modify behavior seems a little odd
when the complainers most likely consume hours upon hours of behavior-modifying
TV ads every single week. If behavior modification is the real concern, the
majority of the angst should be directed at TV ads.

------
dredmorbius
The irony of the VC firm backing this venture imploding due to the dopamine-
seeking, short-term thinking, failure to contemplate consequences behaviour of
_its_ principals is ironically ironic.

------
sitkack
Hire these guys for TV! I wish them success; all the big players are doing the
same thing with more people. Keep on innovating, disrupting. Here is to your
next cache hit.

~~~
RAB1138
I tried to emoji fistbump you here, but HN won't let me.

~~~
sitkack
[https://www.google.com/search?q=emoji+fistbump](https://www.google.com/search?q=emoji+fistbump)

Thanks man!

1138 is one of my favorite documentaries about the future. There are a lot of
naysayers around here, but only because of the high-fidelity mirror you
present them with. It is only their own self-loathing.

------
willytobler
This company should be called Heroin. And they should all go to jail for 30
years for dealing addictive substances.

------
tnzn
And then a certain Nobel Prize winner believes "nudges" will be used for our
own good. Ha-ha.

------
ProAm
Such a terrible name for a company.

------
neelkadia
I guess now I need to take a break from Hacker News!

~~~
intended
After this discussion I realized that there is no safety anymore, and I plan
to dump my smartphone for a Nokia.

------
alexasmyths
This is what cigarette companies have been doing for years, so why not tech?

I doubt this will improve anything at all; it will just make things more
annoying and addictive, as opposed to actually increasing productivity.

------
RAB1138
Hi!

Founder (Brown) here. Appreciate the traffic!

Also, would love to clear up mob misconceptions about:

-Why did we pick lightning-rod branding?

-Why was this the most humanitarian use of a neuroscience MS/PhD?

-What quality of people must we be to do this?

-What is design's ethical imperative, given that 100 years ago most people died of infectious diseases, while this year the majority of people over 50 will die of strongly behavior-based diseases? [1]

-How come the Founders of the Push Notification teams aren't part of the cultural dialogue?

-What does Tristan Harris think?!

-What is our pricing model and can your startup get a discount? (YES!)

[1] soundcloud.com/digital-mindfulness/89-the-science-of-addictive-technology-with-ramsay-brown/s-YQzyZ

~~~
ryderm
So, if you would love to clear up the misconceptions, why not actually answer
all of the questions you listed? The only one you answered is the one about
discounts for your product.

~~~
RAB1138
But I'll give you the TL;DR:

Lightning rod: because now we're on HN.

Humanitarian use: see below about 100 years ago vs. today.

Quality of people: that's why Space is free.

What's design's ethical imperative: also, see below about 100 years ago vs.
today.

Founders of push notification companies: Ask the CEOs of
Leanplum/Marketo/Intercom/Vizurly/Kahuna why they haven't released an antidote
for push notifications? Please. Try to get them to talk about it.

Tristan: great friend, mutual supporters

Pricing model: $0.05/MAU for qualifying startups.

xo

