Hacker News

The same company sells an API to help make your applications as addictive as possible [1] and a service to help users control their addiction to applications [2].

This organization is disgusting and is evidence enough that our industry has no sense of ethical responsibility. When massive regulation lands on Silicon Valley and we whine about the impact it has on innovation, remember companies like Dopamine Labs who truly deserved it.

[1] https://usedopamine.com/

[2] http://youjustneedspace.com/

100% agreed on the ethics note. Calling it 'dopamine' is also pretty remorseless. Interestingly, the creators of the like button and pull to refresh, and others, are pushing back against this 'attention economy': https://www.theguardian.com/technology/2017/oct/05/smartphon...

Paul (author on that piece) is wonderful!

Not so much remorseless, we were just shooting for contrarian and attention grabbing...And here we are....knee deep in front page of HN hate mail...

Calling negative comments hate mail speaks volumes about you.

You are building a Zynga-like company and writing vague comments instead of direct answers here. Why do you expect positive replies?

You are making the world a worse place.

To be fair, all the case studies they list [1] utilized their service to improve users' personal habits -- diet, exercise, etc.

Behavior shaping isn't necessarily morally wrong to use, though companies like Facebook and Google are almost completely incentivized to use it against users.

[1] http://www.usedopamine.com/assets/pdf/Dopamine%20Labs%20Case...

Well, yeah. They'd be idiots to present it any other way.

"Behavior shaping isn't necessarily morally wrong to use," as long as we are talking about parents shaping their children, or the justice system (re-)shaping criminal offenders. Beyond that border, in my mind this becomes very unethical indeed!

Why not people wanting to shape their own behaviour? Nudges to exercise more from an exercise app I've installed is hardly diabolical.

you are correct. there is nothing wrong with giving humans the tools they want/need to be the better versions of themselves.

This is such a fine gray line that it's impossible to conclusively categorize something as evil vs. good.

For example: If I decide I want to exercise more, maybe I would appreciate an app which helps me become addicted to exercise. That might be good for me.

However, what if the app pushes me to exercise too much, and I begin to experience health problems associated with overtraining? Or what if the app is based on bad exercise science and suggests routines that are bad for me? Now suddenly the addictions created by the app are working against me; the app isn't being irresponsible by helping me exercise, but it is being irresponsible by modifying my behavior and decision-making processes to favor using it.

Similarly, is Instagram "good" for you? Probably not. But, per your comment: "there's nothing wrong with giving humans the tools they want". People might want to be addicted to Instagram; that doesn't mean it's good for them or that Instagram should deliver. People want to be addicted to nicotine and alcohol, and we put regulations around those.

The fact that there is a gray zone does not mean that there aren't also black and white zones. It's more interesting to think about the gray zones, but it's easier to get things done in the black and white zones.

That's true, but I don't think a general-purpose API for improving the addictiveness of any application sits in just one part of the spectrum. Anyone can use this.

If we look at something like the activity circles on the Apple Watch, that's safe enough in my mind to be pretty well in the white area.

Intentional behavior shaping is a light form of mind control, and it's very much a moral gray area. You can totally use mind control for positive things; that doesn't make it alright.

Their Dopamine service seems much more heinous and detrimental, especially in a society where many social media applications' algorithms tailor users' experiences to reinforce and promote each individual's personal beliefs for the sake of ad revenue. I imagine the truly unethical could take advantage of this service and develop applications that leverage what our social media platforms are already doing, and make quite a lot of revenue as a result.

CEO of Dopamine here.

We make cars with both gas pedals and brake pedals because sometimes you want to go faster and sometimes you want to go slower. There are some behaviors that you want to see yourself doing more frequently, and others that you want to see yourself doing less frequently. So we make products to serve both of those use cases.

It can be unnerving to see yourself as programmable; it affronts our sense of free will. But once we get to the point that these technologies are possible, the question isn't whether to use them, but how. We're trying to lead that conversation. And I am genuinely interested to hear your thoughts on how to use this technology to encourage human thriving.
