Dopamine: An AI platform for designing human behavior (techcrunch.com)
123 points by jiangrybirds 5 months ago | 134 comments



The same company sells an API to help make your applications as addictive as possible [1] and a service to help users control their addiction to applications [2].

This organization is disgusting and is evidence enough that our industry has no sense of ethical responsibility. When massive regulation lands on Silicon Valley and we whine about the impact it has on innovation, remember companies like Dopamine Labs who truly deserved it.

[1] https://usedopamine.com/

[2] http://youjustneedspace.com/


100% agreed on the ethics note. Calling it 'Dopamine' is also pretty remorseless. Interestingly, the creators of the Like button and pull-to-refresh, among others, are pushing back against this 'attention economy': https://www.theguardian.com/technology/2017/oct/05/smartphon...


Paul (author on that piece) is wonderful!

Not so much remorseless; we were just shooting for contrarian and attention-grabbing... And here we are, knee-deep in front-page HN hate mail...


Calling negative comments hate mail speaks volumes about you.


You are building a Zynga-like company and writing vague comments instead of direct answers here. Why do you expect positive replies?


You are making the world a worse place.


To be fair, all the case studies they list [1] utilized their service to improve users' personal habits -- diet, exercise, etc.

Behavior shaping isn't necessarily morally wrong to use, though companies like Facebook and Google are almost completely incentivized to use it against users.

[1] http://www.usedopamine.com/assets/pdf/Dopamine%20Labs%20Case...


Well, yeah. They'd be idiots to present it any other way.


"Behavior shaping isn't necessarily morally wrong to use," as long as we are talking about parents shaping their children, or the justice system (re-)shaping criminal offenders. Beyond that border, in my mind this becomes very unethical indeed!


Why not people wanting to shape their own behaviour? Nudges to exercise more from an exercise app I've installed are hardly diabolical.


You are correct. There is nothing wrong with giving humans the tools they want/need to be better versions of themselves.


This is such a fine gray line that it's impossible to conclusively categorize something as evil versus good.

For example: If I decide I want to exercise more, maybe I would appreciate an app which helps me become addicted to exercise. That might be good for me.

However, what if the app pushes me to exercise too much, and I begin to experience health problems associated with muscle damage? Or what if the app is built on bad exercise science and suggests routines that are bad for me? Now suddenly the addictions created by the app are working against me; the app isn't being irresponsible by helping me exercise, but it is being irresponsible by modifying my behavior and decision-making processes to favor using it.

Similarly, is Instagram "good" for you? Probably not. But, per your comment, "there's nothing wrong with giving humans the tools they want". People might want to be addicted to Instagram; that doesn't mean it is good for them or that Instagram should deliver. People want to be addicted to nicotine and alcohol, so we put regulations around them.


The fact that there is a grey zone does not mean that there aren't also black and white zones. It is more interesting to think about the grey zones, but it's easier to get things done in the black and white zones.


That's true, but I don't think a general-purpose API for improving the addictiveness of any application sits in just one part of the spectrum. Anyone can use this.

If we look at something like the activity rings on the Apple Watch, that's safe enough in my mind to be pretty well in the white area.


Intentional behavior shaping is a light form of mind control, and it's very much a moral gray area. You can totally use mind control for positive things; that doesn't make it alright.


Their Dopamine service seems much more heinous and detrimental, especially in a society where many social media applications' algorithms tailor users' experiences to accommodate and promote each individual's personal beliefs for the sake of ad revenue. I imagine the truly unethical could take advantage of this service and develop applications that leverage what our social media platforms are doing currently, and make quite a lot of revenue as a result.


CEO of Dopamine here.

We make cars with both gas pedals and brake pedals because sometimes you want to go faster and sometimes you want to go slower. There are some behaviors that you want to see yourself doing more frequently, and others that you want to see yourself doing less frequently. So we make products to serve both of those use cases.

It can be unnerving to see yourself as programmable; it affronts our sense of free will. But once we get to the point that these technologies are possible, the question isn't whether to use them, but how. We're trying to lead that conversation. And I am genuinely interested to hear your thoughts on how to use this technology to encourage human thriving.


>Dopamine’s founders argue that they reserve the right to deny service to specific companies whose work seems to be off the level...

Just "trust" us. There are obviously many teams working on doing this, but never so audacious. These things are definitely not a value add to society, yet very profitable, like e.g. heroin if that doesn't sound too hyperbolic.


A casino is usually more profitable than a hospital too.

It's probably the central problem of capitalism. Monetary systems draw no distinction between wealth creation and wealth extraction, and the second law of thermodynamics guarantees that the latter will always be much easier. When you design chips, cure diseases, or build rockets you are fighting entropy. When you addict, misinform, and con people entropy is on your side.


This article made me pretty sad and made me think of this too. I think what an ethical investor needs to do is not deploy capital to any industries that don't add real economic value or wealth or whatever you want to call it. Also, ethical founders should refuse capital from orgs that fund these projects. Basically starve these projects out, but that's sort of idealistic.


Unfortunately greed often turns people from ethical to not so ethical pretty quickly. Would be nice if the chaps above could create an AI to help with that!


We could get to unethical even faster!

An unethical AI has more degrees of freedom. It will always beat an ethical AI.


It is interesting how this article had the same effect on a lot of people. I also wasn't too comfortable with their angle or what they are doing.


> don't add real economic value or wealth or whatever you want to call it.

Literally an impossible thing to measure, because it's subjective. If it makes money it adds wealth, so by your standard anything goes.


"When you design chips, cure diseases, or build rockets you are fighting entropy."

I've never quite heard 'entropy' used as an argument, and though you might be a little bit correct in some sense of the word ... I don't think it holds true either physically or as an analogy :)

But bon-bons for trying.

That's my favourite 'most HN comment of the day'.


Healthcare is a much larger percent of GDP than casinos for what it's worth. But your comment does remind me of http://slatestarcodex.com/2014/07/30/meditations-on-moloch/


And if you add the FIRE industries to that?

Or the addiction-feeding elements of healthcare -- opiates and other-than-therapeutic interventions.

https://www.selectusa.gov/financial-services-industry-united...


Happy to answer any questions you have! It's everyone's job to be transparent about the new table stakes of design, because that transparency builds exactly this kind of dialogue! It would be more audacious if we didn't publicize what we're doing and these techniques remained the dark patterns that keep people hooked on social apps poorly aligned with what people actually want out of life!


This is how I understand what you're saying:

We are building something evil and reprehensible, but we are open about what we are building, and therefore we are better than the other guys who are doing it surreptitiously.

Fine, but in that case your product is basically performance art. Bravo, you did a good job raising awareness: now go actually work on fixing the problem instead of pretending like you aren't part of it.


Making games more addictive is not really all that "evil" or "reprehensible". It's not much different from making them more fun.

What does need work is their messaging. Suggesting they can control users does not sit well. Instead, they could suggest that they make games so much more fun that users boost their in-game time.

They need a lesson in marketing and messaging; the idea itself is fine.


What's your take on gambling addiction?

Frankly, I feel bad that we are piling heat onto this company. I think a lot of the people in this thread are frustrated and upset, and don't know where to apply their energy to start solving the problem that they are upset about. I am one of them; I had a very visceral reaction to the OP.

But I think this is a good wake up call for just how angry and frustrated we all are. We should be thanking this company for putting us over the edge. How many of the posters here railing against addictive media checked Instagram or Facebook in the same hour that they read this post? Time to start putting our money where our mouth is.


> Making games more addictive is not really all that "evil" or "reprehensible".

Considering the amount of behavioral science being used to accomplish this, it's actually quite "evil".

As much as we humans fancy our intellect and "free will", we are still just very predictable biological automatons. Put us in a very well-designed Skinner box, and we'll be pushing that button until our bodies rot away without us even noticing.

There is a very fine line between making something "engaging" for the sake of building something exciting, and making something "engaging" for the sake of keeping engagement going.


> It's not much different from making them more fun.

I've played quite a few games that were addictive but not fun. The difference is hard to spot when you're in the middle of it, but stark from the outside. Typically they start off being plain fun to draw you in, and then gradually segue into addictive after a few hours to keep you there.


Addictive games are often fun; I didn't say necessarily fun.

But that wasn't my point. My point is that it's not terribly evil to make a game addictive. Addictive qualities are only bad to the extent that the addictive thing hurts you (e.g., smoking, drugs). These games are all pretty harmless.


No, addiction is never harmless. If you have never been addicted to anything, it is difficult to empathize or understand.

That doesn't mean it is wrong for a game to become addicting because it is so damn fun that it just happens to become addicting. What is beyond despicable to me is to create a game that deliberately manipulates the human mind to become addicting without being of any real value.

If nothing else, it's a matter of opportunity cost. You are stealing attention away from hobbies, friends, family, and more likely, more wholesome forms of entertainment, possibly even more intelligent video games. Through in-app purchases you are funneling people's income from their bank accounts to yours and your employees'. This is fine when people are making a choice to waste their resources, but when you take people's choice away by addicting them, then you are effectively enslaving them.


Imho certain personalities are more prone to this than others. I've seen it called "a propensity to compulsion", which is a very fitting way to put it.


Is there any behavior that you wish you did more often or more regularly? Going to the gym? Driving more carefully? Taking a moment to center yourself?

We're actively working on all of those. I don't think it's possible to pretend that these techniques don't exist. So instead we're flamboyantly advertising that they can be used for good.

What behaviors do you wish you did more of or did more regularly?


> We are building something evil and reprehensible, but we are open about what we are building, and therefore we are better than the other guys who are doing it surreptitiously.

This basically covers all of American business and politics. I think the anger in this thread is misplaced.


OK but who do you think is worse, Voldemort or Umbridge?


People are downvoting you, but I think you actually make a good point here.

I'm just not sure which one is which. Is Dopamine Voldemort or Umbridge?


In this context, Voldemort. They're clearly embracing the role and even playing it for publicity, like a classic villain going "muahaha!" about their evil plan.

Contrast the new 'social credit' system in China - it's (IMO) pure evil and a disturbing example of government mind control, but presents itself as orderly virtue.


I think these guys need to amp the “evil” up a lot more.

It’s the perfect media juncture to do it: force the conversation with obvious names, have fun with the media, and in so doing carry out the public service of shining the brightest light on the darkest spots.

And it’s not like anything is illegal. People need to be horrified before they react and then figure out exactly what is acceptable and what is not.

But the whole IT industry itself seems to be in desperate need of a code of ethics and behavior. These guys can always just dodge by pointing out that they are hardly the worst fish in the sea. Just the most self-aware and well-labeled.


That's great, but what is really needed is litigation/legislation.


Why? On what basis?


Because this is not a code problem.

It is inherently a people problem.

The issue is simply that any fix (by code or by competition) will only move the conflict point from its current location to a new place on the board.

Without being able to approach the bad actors and their motivations/incentives directly, you will not be able to effect change.

And there is always going to be an incentive to manipulate minds, easily and at scale.


Because the industry is deliberately creating addictive products that happen to be detrimental to the health of the people who use them. Facebook is the biggest offender but any time you hear the term "engagement", that's probably what is going on.

This isn't just going to go away, it has to be fought.


How good can Dopamine's moral compass be if they don't immediately refuse service to anyone who wants to use Dopamine's services?


> While Brown condemned the behavior that’s been attributed to Sacca — and expressed disappointment that Mazzeo found the same troubles at his would-be new home, Binary Capital — he said that it didn’t dissuade him from taking Lowercase’s money.

Yup, people always take a high moral stand... until there is money involved.


Pecunia non olet ("money doesn't stink").


Even if the aims are benevolent, the naming and marketing copy just sound evil to me. They are trying to make it sound like they can turn your app into an addictive drug.


> they can turn your app into an addictive drug

I think that's exactly their elevator pitch.


Contrarian, not evil, is more of the motto. You don't get anyone to listen if you just talk about an AI SaaS platform for long-term habit formation. It comes off as wonkish. Trust me, we tried ;)


Well first of all thank you for your reply and actually getting involved in the thread.

I have to say, I'm not convinced.

Yes, of course, if you say that your product is similar to what a drug dealer would offer, then people are going to get interested because of the potential for returns on their money. You're just confirming my suspicion that branding things this way was in order to help get investment. I don't see how you are being contrarian. What is the popular opinion you are rejecting? That addiction is scary/harmful/a terrible thing to inflict on others?

You're in a tricky position now as I personally don't see how you can reverse course on presenting the company this way (after thinking about it for a couple of minutes :) )... but I hope you can figure it out!


I think that the thing we're contrarian on is "using software to design people's behavior is bad," or "Dopamine (the molecule) is first and foremost about Addiction."

Dopamine is about learning, and learning only matters because it changes us. The same brain mechanisms and technologies that make Facebook captivating can be used to make fitness, a good diet, and spending time with people who love you engaging too. When we're done building this tech stack, it will be possible for people to learn themselves into whoever they want to be.

Disclaimer - I'm the CEO of Dopamine


So what would ensure you keep using that tech for good? What if no clients turn up and you still need to produce value for your investors?

In other words, how would you go about solving the 'just trust us' problem others in this thread have described better?


I don't know if 'Contrarian' is exactly the word - but yes - the company name 'Dopamine' is definitely a smarty-pants branding strategy to get hype.

Startups are dead -> they're all rock bands now.

VCs are record labels.


There was a great Dilbert comic that said something like 'If marketing were only slightly more effective it would be illegal.' I am aware of 1960s-era laws against subliminal advertising. Are there any enforceable laws that limit how people are influenced with media and technology?

More questions: If there is a technical definition of illegal influence, how would the test of legality work? Who would enforce it? Who would be in violation of those laws today? Are there any archaic laws about advertising/mind control that could be applied to today’s tech/media/security firms?

Finally, in a hypothetical scenario where a state actor was able to influence the mentally fragile into turning into mass shooters, how would nations and individuals start to protect themselves against such attacks?


Everyone is overhyping this. It's a silly company that drives engagement by inserting short GIFs after user actions.

It has an edgy sounding name and just enough marketing buzz to scare you.

This company won't be around in 5 years. Good UX > whatever they are selling.


Is this an accurate framing of your statement then? If we can think of the addictive nature of apps as "brain hacking", you aim to be white hats? Using the same techniques to protect through understanding, instead of attack for personal gain?

You say you only work with certain clients based on their ethics. Can you name some examples of an application for this that has the potential to be profitable? I, and it seems many others here, have a hard time understanding how this science can be used both ethically and profitably.


I am just waiting for a lawyer for someone with Internet Addiction (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3480687/) to sue this company over one of their supported apps.

This is the kind of thing that I fear will prompt a backlash from the people and government and destroy the entire online ads regime that underpins the current Internet.


Why do you fear that? It would be wonderful.


Agreed, a good backlash is way overdue at this point.


You may not have noticed that their other product is the exact opposite. Maybe that gives them some legal protection? http://youjustneedspace.com/


Potentially, but this is all new ground anyway. Is there even a law that would cover it in this context?

It also doesn't strike me as benevolent to offer a service to fix the problems your other service caused.


'Nice free will you have there. Be a shame if someone manipulated it.'


That's despicable. They sell both the disease and the cure, for double the money.


Nah, it's not a cure. They know that. It's like telling a slot machine addict that they are provided with the option of not playing.


Bruh. Did you even look at the price on Space?


To quote Andrew Lewis, "If you are not paying for it, you're not the customer; you're the product."

You'll know when a user of a Dopamine-backed app installs Space, and that's useful to you. Measuring people trying to give up an addiction feeds back into understanding the causes of addiction, and in your case presumably gives you quantifiable information you can use to make Dopamine more effective. Saying you made it and give it away out of the goodness of your heart seems a little underhanded.


Ok, it's free, you got me!

Your company is still about making shit addictive. Its customers will inevitably be those with the money & inclination to make their app addictive: social media & mobile games. It's morally pretty close to adding sugar to junk food or selling cigarettes.

You were going to clear up "mob misconceptions" about ethics, but it turns out you had nothing to say, just an "invitation for discourse"... while most of the tech world has been in on this particular discourse for years, with constant media coverage, and we know it's a problem.

But I'm pretty sure you know all of that. Don't expect public approval for making glorified, AI-powered slot machines; they're still just slot machines.


Bruh. Just adding another app to 'counteract' the effects isn't going to suddenly clear you of ethical issues. It's the equivalent of McDonald's selling salads...


Oh and to add to this, the app seems to literally just be a waiting screen. Everything about it screams cop-out.


Do people with eating disorders sue McDonald's? Or even the companies that do their ads?


Before I even got through the first paragraph I was reminded of B.F. Skinner and his book 'Beyond Freedom and Dignity', which I read in high school. So it's pretty funny that they named their AI after him. Dopamine Labs may be evil, but at least they're self-aware.


We love that one! Required reading.

His work on how you, me, everyone on this thread, and every species in the subphylum Vertebrata learn shaped the course of thinking, from contemporary machine learning techniques like Q-Learning to us!

Not evil, have done our reading though ;)
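
For readers wondering what Skinner has to do with Q-Learning: the reward-driven update at the heart of Q-Learning is essentially a formalization of operant conditioning, with reward reinforcing the action that produced it. A minimal tabular sketch, purely illustrative (not anyone's production system):

    from collections import defaultdict

    # Tabular Q-Learning: learn the value Q(s, a) of taking action a in
    # state s by nudging the estimate toward the observed reward --
    # "reinforcement" in the literal, Skinnerian sense.
    Q = defaultdict(float)
    alpha, gamma = 0.1, 0.9  # learning rate, discount factor

    def update(state, action, reward, next_state, actions):
        best_next = max(Q[(next_state, a)] for a in actions)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])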


And you guys named your system Skinner (which is very on the nose, and also very funny now).

I think I truly get what you are doing, but I definitely think you are going to have to very seriously explain to people here WHAT the problem is and how you are going to help.

So some straight-up advice:

1) People here need to understand, with examples from your research, just how bad the effect of unnoticed conditioning can be.

2) Highlight your mission statement, and very strongly defend the part where your tools do good. Preferably with illustrations.

If you want to take on anti-hero branding, remember that it only works if people are intimately familiar with the inner workings of the anti-hero's mind, and thus his motivations.

As such you may want to put this very prominently out there -

>We watched and read too much scifi growing up. And for every 1 dystopia movie, there are 3 better utopia novels. So now we can't get 2 determinately optimistic ideas out of our head: (1) the future is going to be awesome (and awesome for everyone: not just white bearded educated heteronormative californian male technocrats: that would suck). (2) The relationship between humans and our machines must be one of mutual thriving and improvement. The little magical slabs of glass that live in our pockets and proliferate across the globe are tools for human thriving. So we built Dopamine to help that future become manifest.

>Technology is becoming more (addictive / persuasive / coercive) every day and there isn't much we can do to reverse it. And it sucks that the technologies that are becoming persuasive the fastest aren't necessarily ones that are great for human thriving. You deserve a better world than one in which the most persuasive technologies demand from you your eyeball hours and brand loyalty in exchange for cat videos. That's dystopic.

People will also want to know that it's not marketing-speak. So you will need to show that you take pains to actually move us toward the world we need to get to.


No offense, but I do not believe you.

I can't, on something like this. Talk is cheap.


What, the evil part? Or that we make the whole team read Skinner? It's a thin book.


...really?

Welp, there's another one on the 'never touch with a 10-foot pole' pile. Yikes, your promotional account sounds like a grinning glassy-eyed sociopath. Serious goosebumps.


A good sociopath would have been charming.

I think they’re very aware of the problems and they would rather be on the good side of the issue.

I think he just doesn’t know how to convey this or talk about it well in a forum setting.


Did you do something nice for a coworker today?

When's the last time you said more than 'hello' to a neighbor?

The only substitute for Social "Media" is a Social Culture. The reason most mature adults are checking their phones all the time is that nobody interacts to any significant degree anymore in the public sphere. When we're in public we are: (a) trying to get somewhere else, frustrated at being impeded by other people; (b) trying to buy something to consume; or (c) trying to sell something.

The best way to fight the screens and the apps is to provide a better alternative, IRL.


This feels like such a Pandora's box.

On one hand, it's important we get a better understanding of what makes us tick, on the other hand, I fear what such understanding might enable us to do to each other.


Disclosure - CEO of Dopamine

That's the exact same way I feel. There is definitely an endgame here where everyone has more autonomy and dignity, and that's the one we're fighting for.


How does that endgame look? And how do you intend to get there by further commercializing, thus monopolizing, such knowledge?

It's not like you are doing anything new here; the only new thing is your approach of being dead honest about what you are doing. Kudos for that.

But besides that I don't see how any of what you are doing there results in an endgame of "more autonomy and dignity for everybody", it rather seems to be reinforcing the current trend where such knowledge and practices are sold to anybody who can afford it, leaving those who can't out in the rain.


I guess it's a win-win situation both ways: either it makes people more addicted to apps, to the point where we all fall off cliffs or get run over by an app addict while crossing the road using our phones, or it pushes people to the limit where they get so addicted to their apps that they actually call for interventions, completely abandon their phones, or seek help; introducing youjustneedspace.com </sarcasm>


Thanks! We think so too! Also, you have a closing sarcasm tag with no opening sarcasm tag. Dude, that's bad form.


'Also, you have a closing sarcasm tag with no opening sarcasm tag' -- sarcasm does not start by priming people for what's to come. Especially those who would take your words seriously and start to relate to your comment.


Greaaaaat... Seriously, does humanity really need this?



OK, so I actually watched most of that on crappy internet, and there are many places with big issues in the underlying assumptions.

I think around the 14 minute mark or something you made a statement that people need to be aware of the apps they choose.

Prior to that you had argued that (mostly for the first world) behavioral choices are the root of the major causes of morbidity.

Your redeeming speech comes at around the 20 minute mark.

Issues first:

1) You assume people have a choice - and that inherently discounts the outsize impact of bad actors and antagonists.

This point alone fundamentally changes the constants underpinning the model you tacitly must be running to support your other predictions.

2) People are terrible at making a lot of choices, on average. People choosing good apps over bad is similar to hoping people make good lifestyle choices.

Unlikely, and far too path-dependent. Unless you are highly informed, educated about Skinner/conditioning, and tech-aware and cynical, you won’t succeed.

3) A lot of those behavioral problems have arisen from large firms using old-school behavioral influence to create that situation.

There is no basis for hope that more behavioral tools will solve the problem.

From 20 minutes onwards though, you're on a much better wicket - at least up to the point I’ve watched.

Which is another thing: very few people are going to watch a 30-minute video of you speaking, especially when we’re on a text-based forum.


Even though their product borders on pure evil, I have a certain admiration for how they aren't even trying to hide it.


So some positive feedback gets an app more attention - I get that; apps that infantilize their users are getting popular. But does it really have to be presented at precise intervals, calculated by a clever algorithm made by neuroscientists? I doubt the impact would differ if it were triggered at slightly sanitized random intervals instead.
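
For context, the 'clever algorithm' versus 'random intervals' distinction is narrower than it sounds: the classic variable-ratio schedule from the operant-conditioning literature is itself just bounded randomness. A hypothetical sketch of both schedulers (in no way Dopamine's actual algorithm):

    import random

    def fixed_interval_reward(action_count, every=5):
        # Predictable schedule: reward exactly every 5th action.
        return action_count % every == 0

    def variable_ratio_reward(mean_ratio=5):
        # Unpredictable schedule: reward once per `mean_ratio` actions on
        # average. In Skinner's experiments it is the unpredictability,
        # not the precision, that sustains the compulsion loop.
        return random.random() < 1.0 / mean_ratio

So the parent's intuition matches the literature insofar as randomness does most of the work; presumably the pitch of an AI layer is tuning the ratio per user.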


The originator of this project discussed it briefly on HN a few months back, and I assume is reading this thread.

The reactions here could tell you that this is a really Bad Idea. Working or investing in this is a terrible choice and will forfeit the baseline level of consideration you could normally expect from other people.


"if a technology like Dopamine can actually encourage retention on apps aimed at self-improvement, that’s an inarguably good thing."

No, it absolutely is not. Some of the most horrifying abuses in the history of the human race have occurred under the guise of "improving" other people for their own good.


The men and women who write your software control the boundary conditions for your experiences there. They control what and how you can think, feel, and believe in that experience. When Twitter doubles the number of characters you can tweet, they're making a decision about the quality and size of thoughts people can express - and in turn what depth of dialogue we have there.

Analogously, when someone designs a diet app, they're creating the boundary conditions of a new behavior pattern for users. Their use of Dopamine aligns well with their end users' goals: in the same way that you hold friends close because you know they rub off on you, so too should we think about our software. Having an app on your phone that you chose to let close to your mind and behavior - to be a new venue of your thought and future self - is a decision about who you want to become. And if that app can use good design, a good interface, and persuasive tech to help you achieve your health or diet goals: you win.

That's an alignment of what tech wants and what humans want.


You can't trust people to make that decision themselves. If you keep an app close because of the Skinner-esque variable reward ratio, I'd argue that's not a conscious choice.


Letting a person choose 140 or 280 characters (which originally was done for technical reasons -- 140 octets was the maximum length of a classic SMS message) is an entirely different thing from cold-bloodedly psychologically manipulating them.

Entirely.
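
(The arithmetic behind that aside, for the curious: a classic SMS carries 140 octets of user data, which GSM's 7-bit alphabet packs into 160 characters; the commonly told story is that Twitter reserved 20 of those for the sender's handle.)

    sms_payload_octets = 140                  # classic SMS user-data limit
    gsm7_chars = sms_payload_octets * 8 // 7  # 7-bit packing -> 160 characters
    tweet_limit = gsm7_chars - 20             # minus the sender's handle -> 140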


First, those apps are not your customers and you aren't fooling anyone.

Even if they were today, your VCs and future investors will not allow you to turn down profits and only sell to health or self-improvement apps.

If your product is successful, you have a fiduciary duty. If you fail in that duty, you will either be removed, or potentially even sued. And then it will be too late, because you already built it, and someone else will have taken the wheel. And they will maximize it, inevitably weaponizing it against the population.

There are better things you can be working on. Just walk away. Humanity does not need this.


Gee, the founders can't have been happy with that title (different from the one on HN). What can founders do to avoid being represented in a negative light like this by the press, HN? Try to be the first to frame the conversation? Or is all press good press and it doesn't matter?


They just got a ton of press, reaching the exact markets they want to hit. Their fax machine is probably ringing off the hook right now.


WE HAVE A FAX!!??!?!


Have you got any products to trick people into liking your product? Your comments in this thread seem to be having the opposite effect.


So many accusations about 'evil' here, without any significant evidence. As I see it, this middleware is a tool that you can choose to use for evil or good; the tool doesn't get a moral value because of its use.

There is, of course, an argument that any manipulation of human behaviour is inherently evil, but it's a very weak one - if you manipulate a person to eat well, exercise, donate to charity, call their parents and vote for a [insert a political cause that you believe to be inherently moral], is it still an evil thing to do?


Manipulating people into donating to certain charities (churches?) could certainly be an evil thing in my book. And ESPECIALLY VOTING FOR A POLITICAL CAUSE. That is not okay.

Fact is that morals are relative and software companies shouldn't be the arbiters of that.


If everyone in the US has on average 5 apps on their phone that use your service, and it makes people on average engaged for one extra minute... what would the cost of all the lost productivity be? Does the extra engagement lead to more revenue? How do the two compare?
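
A back-of-envelope version of that comparison, using the parent's numbers plus loudly hypothetical ones:

    # Assumptions beyond the parent comment: the user count and value of time.
    US_SMARTPHONE_USERS = 250e6   # hypothetical
    APPS_PER_USER = 5             # from the parent comment
    EXTRA_MIN_PER_APP = 1         # from the parent comment
    VALUE_PER_HOUR = 25.0         # hypothetical $/hour of attention

    extra_hours_per_day = US_SMARTPHONE_USERS * APPS_PER_USER * EXTRA_MIN_PER_APP / 60
    annual_cost = extra_hours_per_day * VALUE_PER_HOUR * 365
    print(f"${annual_cost / 1e9:.0f}B/year")  # ~$190B/year under these assumptions

Whether the extra engagement generates anywhere near that much revenue is exactly the open question.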

Also, if this is designed to release dopamine or any other neurotransmitters, is there a risk that this will ever be classified as a drug? Or at least be treated legally like a drug?


> Also, if this is designed to release dopamine or any other neurotransmitters, is there a risk that this will ever be classified as a drug? Or at least be treated legally like a drug?

This whole topic is gonna be huge soon; it's in the same vein as the increased use of gambling-style "loot boxes" in video games, designed to be Skinner boxes.


>What would the cost be of all the lost productivity?

What if it was a productivity app that you were spending more time in or interacting with more efficiently? Or what if it was a fitness app that got you to do extra stretches throughout the day? Is that just economic dead weight?

Disclaimer - I'm the CEO of Dopamine.


Alright, so that guy has the guts to say “It’s our job to make sure that everyone gets treated as a human, not as an object.”


Ruthless business models like this seem inevitable in a society that values making money far above anything else.


It's what TV advertising is all about. Why aren't we complaining about that?


Examine your premises.

We are. And have been for a while now.

https://en.m.wikipedia.org/wiki/Four_arguments_for_the_elimi...


Flipping out about a company that claims to modify behavior seems a little odd when the same people most likely consume hours upon hours of behavior-modifying TV ads every single week. If behavior modification is the real concern, the majority of the angst should be directed at TV ads.


The irony of the VC firm backing this venture imploding due to the dopamine-seeking, short-term-thinking, consequence-blind behaviour of its principals is ironically ironic.


Hire these guys for TV! I wish them success; all the big players are doing the same thing with more people. Keep on innovating, disrupting. Here's to your next cache hit.


I tried to emoji fistbump you here, but HN won't let me.


https://www.google.com/search?q=emoji+fistbump

Thanks man!

1138 is one of my favorite documentaries about the future. There are a lot of naysayers around here, but only because of the high-fidelity mirror you present them with. It is only their own self-loathing.


This company should be called Heroin. And they should all go to jail for 30 years for dealing addictive substances.


And then a certain Nobel laureate believes "nudges" will be used for our own good. Ha-ha.


Such a terrible name for a company.


I guess now I need to take a break from Hacker News!


After this discussion I realized that there is no more safety, and I plan to dump my smartphone for a Nokia.


This is what cigarette companies have been doing for years, so why not tech?

I'm skeptical that this will improve anything at all; more likely it will make things more annoying and addictive, as opposed to actually increasing productivity.


Hi!

Founder (Brown) here. Appreciate the traffic!

Also, would love to clear up mob misconceptions about:

-Why did we pick lightning-rod branding?

-Why was this the most humanitarian use of a neuroscience MS/PhD?

-What quality of people must we be to do this?

-What is design's ethical imperative, given that 100 years ago most people died of infectious diseases, while this year the majority of people over 50 will die of strongly behavior-based diseases? [1]

-How come the Founders of the Push Notification teams aren't part of the cultural dialogue?

-What does Tristan Harris think?!

-What is our pricing model and can your startup get a discount? (YES!)

[1] soundcloud.com/digital-mindfulness/89-the-science-of-addictive-technology-with-ramsay-brown/s-YQzyZ


Assuming this is being sold to companies other than strictly self-improvement apps, how can you consider this to be an ethical thing to have created?


Hi!

Because we don't sell to apps that would hurt people. See manifesto on usedopamine.com/team/index.html.

~100 years ago, the most frequent causes of death in the US were pathogens for which we barely had a name. Pneumonia, flu, cholera, fevers. And it was only after we developed a rigorous technology of the body (modern medicine) that we lifted millions of people out of suffering simultaneously.

Today, if you are under 50, you're most likely going to die of opiates. Over 50? Type-2 diabetes, stroke, cardiovascular disease, obesity and its complications, and stress-related illness.

Every single one of these has strong behavioral components.

Building a smartphone-first, AI-powered rigorous technology of the human mind gives us all an above-the-table, democratized chance at designing scalable technologies that stop this. It spreads better across national, sex, gender, and SES lines than most other behavior-change oriented solutions. And as we enter an age of an excess of cheap energy, food, and data, we NEED a rigorous way to help us better align modern aspirations with an ancient brainstem.


How can we trust you not to sell to apps that hurt people? The road to hell is paved with good intentions.

Note that you're using the word 'addiction'. Many would argue that is hurting in and of itself - regardless of what one would be addicted to.

And that last paragraph is absolutely haunting. We need companies controlling our minds because our brainstem isn't evolved enough??

I hope you get sued into the ground. It's time to start holding people accountable for their effect on people's brains.


So, if you would love to clear up the misconceptions, why not actually answer all of the questions you listed? The only one you answered is about discounts for your product.


But I'll give you the TL;DR:

Lightning rod: because now we're on HN.

Humanitarian use: see below about 100 years ago vs. today.

Quality of people: that's why Space is free.

What's design's ethical imperative: also, see below about 100 years ago vs. today.

Founders of push notification companies: ask the CEOs of Leanplum/Marketo/Intercom/Vizurly/Kahuna why they haven't released an antidote for push notifications. Please, try to get them to talk about it.

Tristan: great friend, mutual supporters

Pricing model: $0.05/MAU for qualifying startups.

xo


(right, it was more an invitation for discourse!)


What’s your current “pain point”?

(Other than ever attempting to re-brand before the heat death of the universe.)


Are you hiring?



I second that question.


team@usedopamine.com ;)



