
A therapy chatbot for depression - Prygan
http://www.businessinsider.fr/us/stanford-therapy-chatbot-depression-anxiety-woebot-2017-6/
======
tpowell
This is an excellent idea, but their loose stance on privacy doesn't seem to
line up with the intended field. Even a cursory look over their privacy policy
raises questions (third party marketing, data in the event of a
sale/acquisition). As for Messenger, they merely provide a link to Facebook's
privacy policy.

I wish it outlined, in plain language, how private (or not) the transcripts of
my chat data are and the retention policies associated with said data.

[http://woebot.io/privacy](http://woebot.io/privacy)

~~~
pamelafox
Hello! I'm Pamela, CTO@Woebot. I agree that the privacy policy is confusing,
and I've started to make a list of standard questions that people ask Woebot
about privacy. I'd like to make a TLDR at the top that covers the FAQs. Could
you let me know what particular questions you want answered, so that it can
cover that? Thank you!

~~~
DanBC
I'm giving you sensitive personal information.

Are you complying with current UK / EU laws about how you handle that data?
Will you be compliant with GDPR when it comes in?

[https://en.wikipedia.org/wiki/General_Data_Protection_Regulation](https://en.wikipedia.org/wiki/General_Data_Protection_Regulation)

------
659087
Feeding this data to Facebook, a company already known to perform unethical
psychological experiments on users without consent, is beyond stupid. If the
people behind this are aware of Facebook's history and practices, I'd consider
it malicious to choose to expose depressed people to their platform.

~~~
SomeStupidPoint
It's gross negligence at the very least.

Facebook routinely experiments on people in a way that's dangerous to people
with mental health issues (and likely even healthy adults).

Opening up their treatment to Facebook is incredibly negligent.

~~~
doug1001
yep--one would have hoped that this shop would have thought about the obvious
consequences--i.e., what is the likely effect on someone in depression who
believes that they are seeking treatment and that the treatment has failed
them?

is this responsible, guys?

~~~
659087
Ethics and responsibility seem to have taken a permanent back seat to profit
in the "tech" industry.

~~~
doug1001
sure seems that way (who has time to be ethical and responsible when you are
changing the world!)

------
pdeuchler
There seems to be a new push for technological solutions to social illnesses,
and it consistently boggles my mind how so many smart "experts" really think
people engaging more with computers/phones/AI, instead of, you know, engaging
more with actual people, will solve any kind of social ill.

~~~
WalterSear
A decent CBT therapist can charge $150+ per hour, and personal change can take
years of constant, ongoing effort.

For most people right now, it's going to be an app or no treatment at all,
particularly for people who are only mildly impaired by their distress.

Besides, it's not an either/or proposition. Personally, I don't think that
chatbots are all that useful (I've been building them as a large part of my
job for a year), but I can see interactive CBT/DBT
reminders/journals/checklists as being a very powerful tool.

~~~
pdeuchler
Something isn't adding up here... if this is supposed to be supplemental to
talking with an actual human, but instead the human part is nixed and it's
used as the entirety of therapy (i.e. outside of its prescribed and tested
usage), shouldn't we be worried about the negative externalities of replacing
human contact with machines to cure social ills?

That's not even touching on the point you've just made: we're essentially now
decreeing that poor people get to talk to machines and rich people get to talk
to humans. Or the fact that insurance companies will jump at the chance to
force the cheap AI down everyone's throats before they pay for human therapy,
etc etc...

~~~
aianus
The status quo is poor people talk to nobody and rich people talk to humans.
This is an improvement, no?

~~~
SomeStupidPoint
No.

The status quo is poor people talk to social contacts (eg, friends, family,
church community, etc) while rich people talk to professionals (eg,
therapists).

This is a change in that poor people are being moved to a professionally
manufactured tool, which isn't necessarily an improvement -- it's just
replacing an established, ad hoc system with an unknown technical one.

There's every possibility that it would make the situation worse, and it's
hubris on the part of the medical community to assume that a tool built by
them is better than informal therapists.

~~~
webmaven
_> The status quo is poor people talk to social contacts (eg, friends, family,
church community, etc) while rich people talk to professionals (eg,
therapists)._

For some people (particularly those who have large extended families), that is
indeed the status quo, but for others not so much.

If your home environment is not supportive (you can imagine responses ranging
from _"everybody gets depressed sometimes, just work harder and things will
get better"_ to _"grow a pair"_), or even abusive, and you aren't in a
position to get professional help, a tool like this (though given various
other concerns, perhaps not this specific one) could very well be the only
alternative to no treatment at all.

What comes to mind, though, isn't an automated therapist per se, but something
closer to the "Young Lady's Illustrated Primer" (from the novel The Diamond
Age).

------
zimmund
Just in case someone wants to know: if you got to this bot without looking at
their page, it's a paid service (it doesn't tell you that anywhere), and after
using it for ~two weeks it stops working and asks you to pay $39 per month.

------
save_ferris
Couldn't they have picked a better platform than Facebook to try this out on,
given all of the research into the effect of social media on one's mental
state?

~~~
pamelafox
(Pamela from Woebot). Great points here about the pros and cons of Facebook
integration.

Regardless of those, we definitely plan to develop non-FB options in the near
future. There's a mailing list here if you'd like to find out when those are
offered: [https://www.woebot.io/#besides-facebook](https://www.woebot.io/#besides-facebook)

On a personal note, I often recommend the "Kill News Feed" Chrome extension
for Facebook users. (Woebot doesn't recommend it, as he doesn't know about it
yet; he lives in a happy world free of news feeds. :)

------
zitterbewegung
I think this is a great idea, since we need new approaches to depression.
Tracking your mood would be helpful to people with mental disorders and could
also inform their medical professionals. Also, having research behind the
chatbot's therapeutic effects gives it a large amount of credibility that it
actually has a positive impact.

On the other hand, although the creators of the bot have good intentions, I
worry about the fact that you are using the internet to relay personal
medical information. Also, I don't know if your Woebot information is stored
anywhere, so if Woebot gets hacked, a bunch of PII relating to any medical
data you sent could be exposed. I wish this were a standalone app that would
prompt me to upload my data if and when I see fit.

~~~
pamelafox
Pamela from Woebot here! We plan to develop non-FB options in the future, and
they could involve entirely client-side stored data with no internet relay.
There's a mailing list here if you'd like to find out when those are offered:
[https://www.woebot.io/#besides-facebook](https://www.woebot.io/#besides-facebook)

------
jonjacky
Kenneth Colby, another Stanford professor, had something like this in the
1970s and '80s. According to the Wikipedia article [1] it was sold as a
product. I remember reading a skeptical account of it in a popular book back
then [2].

[1]
[https://en.wikipedia.org/wiki/Kenneth_Colby](https://en.wikipedia.org/wiki/Kenneth_Colby)

[2] [http://rdrosen.com/psychobabble-fast-talk-and-quick-cure-in-the-era-of-feelings/](http://rdrosen.com/psychobabble-fast-talk-and-quick-cure-in-the-era-of-feelings/)

~~~
jdietrich
Going right back to the dawn of AI, ELIZA was designed to crudely emulate a
Rogerian psychotherapist.

[https://en.wikipedia.org/wiki/ELIZA](https://en.wikipedia.org/wiki/ELIZA)
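
For flavor, ELIZA's core trick can be sketched in a few lines (an illustrative toy, not Weizenbaum's actual script): match a keyword pattern, reflect first/second-person words, and turn the user's statement back into a question.

```python
import re

# Swap first/second person so "my job" is echoed back as "your job".
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# (pattern, response template) pairs, tried in order; the last is a fallback.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.split())

def respond(utterance: str) -> str:
    text = utterance.lower().strip(".!? ")
    for pattern, template in RULES:
        m = re.match(pattern, text)
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return "Please go on."  # non-committal prompt keeps the "conversation" moving
```

So `respond("I feel sad about my job")` comes back as "Why do you feel sad about your job?" -- which is why it felt eerily like a therapist, while understanding nothing.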

------
swagv1
Far from the first idea I've heard for this, e.g.:
[http://www.newyorker.com/tech/elements/the-chatbot-will-see-you-now](http://www.newyorker.com/tech/elements/the-chatbot-will-see-you-now)

------
timoth3y
There is a clinical psychologist in Japan who is doing the same thing, but
without the data-mining. Unfortunately, the medical and academic communities
here have been pretty aggressive about trying to shut her down.

[https://www.disruptingjapan.com/can-this-startup-solve-japans-hidden-mental-health-problem-hikari-labs/](https://www.disruptingjapan.com/can-this-startup-solve-japans-hidden-mental-health-problem-hikari-labs/)

It's a great idea. I hope we see more of it.

------
zkms
Oh god, it's ELIZA!
([https://en.wikipedia.org/wiki/ELIZA](https://en.wikipedia.org/wiki/ELIZA))

------
TeMPOraL
You can easily fool people into thinking they're talking with a person even
though they're talking with a machine. But I wonder, can you do the same if
they know they're talking with a machine? Or even, could you make yourself
feel a program is human?

Simpler question: with a sufficiently-smart story generator, could you enjoy
it if you knew it was procedurally generated? I feel that I couldn't. But I'd
love to be wrong about that.

~~~
drdeca
> Simpler question: with a sufficiently-smart story generator, could you enjoy
> it if you knew it was procedurally generated? I feel that I couldn't.

I feel like this might commit you to having some amount of conflict with
"death of the author"?

If one cares that the writing was written by a person, and not just
transcribed by a person, but composed by a person, it seems to me like one
probably cares that the writing was done with an intent.

I don't see a reason that one would care about whether the writing was made
with an intent, in a way that plays a role in whether one enjoys it, but not
care at all about what that intent was.

So, I would expect that anyone who couldn't, or at least most people who
couldn't, enjoy a work of fiction if they knew it was procedurally generated,
probably would care about the intent of the author in writing what they are
reading.

I also generally care about that, but I think I sometimes enjoy procedurally
generated works, though this might be largely due to my knowing about the
intent behind the writing of the code that produced the procedurally generated
content, which includes substantial bits of literal text to include in the
output, so..

~~~
blacksmith_tb
What if it was ML trained on all the novels in Project Gutenberg? I would
think the general style of writing could be convincing, but I wonder about
things like overall plot, character development, and pacing. At least for a
human author, those require a certain degree of planning; it would be
interesting to see if AI could just 'let them happen' or if it'd be the
literary equivalent of a shaggy dog joke.

------
11thEarlOfMar
Too bad, I don't, and won't, Facebook.

------
w00bl3ywook
You will need to install Facebook Messenger to try it.

~~~
pamelafox
Yes, that is currently true. We plan to develop non-FB options in the future,
mobile apps with minimal server data interaction/storage. There's a mailing
list here if you'd like to find out when those are offered:
[https://www.woebot.io/#besides-facebook](https://www.woebot.io/#besides-facebook)

------
sjg007
If you can learn about CBT from a book, then you can use an app or AI bot as
well. It should be tested and validated. However, there are some ethical
concerns about not having a licensed professional in the loop who can do an
evaluation and monitor progress / intervene.

------
andrewtbham
I have thought for several years about building a real counselor using
machine learning...

[https://github.com/andrewt3000/DL4NLP/blob/master/carl.md](https://github.com/andrewt3000/DL4NLP/blob/master/carl.md)

------
bitJericho
Why not develop, like... a chatroom?

Reminds me of the story about robots in old folks' homes. Why not just have
people keep people company?

~~~
ceejayoz
> Why not just have people to keep people company.

This has actually been tried in the Netherlands, apparently with great
success. University students can live rent-free in a retirement community on
the condition that they spend time with the residents.

[http://www.pbs.org/newshour/rundown/dutch-retirement-home-offers-rent-free-housing-students-one-condition/](http://www.pbs.org/newshour/rundown/dutch-retirement-home-offers-rent-free-housing-students-one-condition/)

------
hxegon
Really like this, but wish it weren't via Facebook Messenger. Maybe I'll make
a one-off profile to use it.

~~~
pamelafox
(Pamela from Woebot here) Yes, I'm afraid it's only FB right now. We plan to
develop non-FB options in the future, mobile apps with minimal server data
interaction/storage. There's a mailing list here if you'd like to find out
when those are offered: [https://www.woebot.io/#besides-facebook](https://www.woebot.io/#besides-facebook)

------
callesgg
Seems very good. Will be interesting to see if it helps me uncover any
insights about myself :)

------
contingencies
Brings new perspective to the modern American adage "find someone who cares".

------
diimdeep
Is there automated online diagnostic tests comparable to one might take in
person?

