
A therapy chatbot and app for depression and anxiety - nopinsight
http://www.businessinsider.com/stanford-therapy-chatbot-app-depression-anxiety-woebot-2018-1
======
kingkawn
The notion that the impact of talk therapy comes purely from the intellectual
discourse seems to me severely misguided. It is the experience of being with
the other person and their acceptance of your pain that does so much of the
work. To reduce it to the intellect divorced from all physicality seems like a
travesty that will only alleviate symptoms experienced while alone, and will
do nothing to reacclimate you to being open and comfortable in the presence of
others.

~~~
jdietrich
>The notion that the impact of talk therapy comes purely from the intellectual
discourse seems to me severely misguided.

The evidence suggests otherwise. Internet-based CBT does not seem to be
markedly inferior to face-to-face CBT, which is the current gold-standard talk
therapy. We really don't have any evidence that the human contact element of
psychotherapy is integral or even significant to the effectiveness of the
therapy.

We should also be careful to avoid letting the perfect be the enemy of the
good. There are many barriers to psychological care, from cost to stigma to
simple availability. For the overwhelming majority of people worldwide, it
isn't a choice between internet-based CBT or face-to-face psychotherapy - it's
a choice between internet-based CBT or no treatment at all. Even if internet-
based CBT is significantly inferior, it still has the potential to prevent
immense amounts of suffering and death.

[https://www.ncbi.nlm.nih.gov/pubmed/25273302](https://www.ncbi.nlm.nih.gov/pubmed/25273302)

[https://www.ncbi.nlm.nih.gov/pubmed/25590119](https://www.ncbi.nlm.nih.gov/pubmed/25590119)

~~~
stephth
The question here may not be face-to-face vs. remote, but a person vs. a
rudimentary AI. Can one feel acceptance and support from a chatbot that has no
sense of self?

~~~
tinymollusk
It seems possible that the external is the trigger that allows oneself to feel
acceptance and support internally. For me, that was the difference-maker in
talk therapy; I learned how to treat myself better, tutored by an external
resource.

------
emptybits
> "Woebot has moved away from Facebook and launched as a stand-alone app that
> only requires a first name to sign up; ... Users can also ask Woebot to
> delete their account history, wiping all conversations. ... Even when people
> email us, we're like, 'We don't know who you are!' That was a really clear
> decision from the outset."

So it _sounds_ like they're doing the right thing. They're helping people with
mental health issues so I want to believe good privacy practices will
continue.

Time will tell. Until then, this sounds like a wonderful project.

[UPDATE] Okay, I just tried the app. The first thing it asks for is your name.
Fine. Pseudonym, no problem. The second request is, "I need to send you a
code. What's your phone number?"

So much for "we don't know who you are". >:

[SOLUTION/PSA] Use a service like Twilio for signup. For $1 you can pull a
random SMS-compatible phone number out of a pool, get your signup code from
the Twilio web console, and then release the phone number. Not anonymous but
obscured identity, at least.

~~~
EADGBE
Perhaps that combination is collected in case the chatbot becomes aware that
one wants to take drastic action as a result of mental health side effects
(hurting oneself or others).

At least with a name (maybe) and a number they have a way to contact you. Or
the police. Or anything that will take you off the brink.

Also, the poor sap down the road who wants to try it but inherits your random
number will be devastated. Hypothetically, it could have been their last resort.

------
spuz
> The students were split into two groups — one was assigned to chat with
> Woebot over two weeks, while the other was directed to read an e-book about
> depression. Unlike the students in the e-book group, those using Woebot said
> they saw a significant reduction in their depressive symptoms.

It sounds interesting. It also reminds me of ELIZA [1], a chatbot designed in
the '60s to demonstrate the superficiality of human conversation, which ended
up being a surprisingly effective therapist to its users.

[1][https://en.wikipedia.org/wiki/ELIZA](https://en.wikipedia.org/wiki/ELIZA)

~~~
yathern
I wish there were a control group - it could very well be that merely reading
an e-book about depression reduces depressive symptoms on its own.

~~~
jdietrich
We've run quite a number of randomised controlled trials on self-help books.
They work.

[https://www.ncbi.nlm.nih.gov/pubmed/22215865](https://www.ncbi.nlm.nih.gov/pubmed/22215865)

------
moron4hire
The efficacy of drugs, specifically Lexapro, on my anxiety cannot be
overstated. I had attempted lots of different "mindfulness" techniques and
other types of talk therapy. They seemed like they worked, sort of, at the
time. But being on Lexapro is such a night-and-day difference that it makes
all those other attempts look like an exercise in futility.

Like with everything, if you can't tell it's working, it's not working.

Stop guessing. Go talk to your doctor.

~~~
jdietrich
>Stop guessing. Go talk to your doctor.

Your doctor will also guess. There are about 20 drug treatments and a dozen or
so talk therapies that are all about equally effective at treating depression
and anxiety. We have no evidence-based way of figuring out which treatment
will work better for which patient. It's a total crapshoot, even for
psychiatrists.

Anecdotally, escitalopram worked for you. A thirty second Google search will
turn up hundreds of anecdotal reports saying that escitalopram is worse than
the devil incarnate. The evidence says that it's no better or worse than a
laundry-list of other treatments. You got lucky.

With that said, talking to your doctor is a sensible decision.

~~~
Domenic_S
I don't understand your point. Surely approaching a solution, even after a few
false starts, is better than unmitigated suffering.

Besides, there's family history to consider, which may help rule out some
drugs. The doc isn't always going in totally blind.

------
andrew_
Anyone else find this ironic, given the potential link [1] between
screen/device use and depression?

[1] [https://qz.com/1190151/why-am-i-unhappy-a-new-study-explains...](https://qz.com/1190151/why-am-i-unhappy-a-new-study-explains-americas-unhappiness-epidemic/)

~~~
petercooper
I don't see it. There's a link between being in water and drowning, but
swimming has huge health benefits.

------
TeMPOraL
Ok, so it's an app. An iPhone one. I would love to try it out (I could use
some CBT), but I don't have an iPhone. Is there an Android version? (Edit: no,
coming soon.) Or a web one? (Edit: the messenger bot seems up, assuming
[https://woebot.io/](https://woebot.io/) is the official site.)

Also, going through the comments, it seems to be an ELIZA-like chatbot, i.e.
not very complicated on the algorithmic front. I wonder what it would take to
have it instead of M-x doctor in my Emacs?
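
For reference, a minimal sketch of the ELIZA-style technique (keyword pattern matching plus pronoun reflection) that such bots are often assumed to use - the patterns and canned responses here are purely illustrative, not Woebot's actual script:

```python
import re

# Pronoun swaps applied to the captured fragment of the user's input.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

# (pattern, response template) pairs; {0} receives the reflected capture.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]

def reflect(fragment):
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text):
    """Return the first matching rule's response, or a generic prompt."""
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(reflect(m.group(1)))
    return "Tell me more."

print(respond("I feel anxious about my exams"))
# → Why do you feel anxious about your exams?
```

The whole trick is a rule table plus pronoun reflection - which is roughly what M-x doctor does too.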

EDIT: trying out the Messenger bot. God, I'm already in love with the humour
and the flow. Nicely done!

------
zitterbewegung
I would say that therapy chatbots (including ELIZA) would work at least as
well as journaling what you do every day. See
[https://www.urmc.rochester.edu/encyclopedia/content.aspx?Con...](https://www.urmc.rochester.edu/encyclopedia/content.aspx?ContentID=4552&ContentTypeID=1)

A nice UI that makes this more convenient, or that encourages people to
participate, would probably work wonders. Also, if the user believes that the
system on the other end is human enough, I think you could see better results.

------
hmhrex
This reminds me of another app I saw recently, Calm Harm
([https://itunes.apple.com/gb/app/calm-harm/id961611581?mt=8](https://itunes.apple.com/gb/app/calm-harm/id961611581?mt=8)).
At first I thought that was a strange approach, but
as I looked through the screenshots, and read the reviews, I realized that
this really seemed to be helping people. I don't struggle with self-harm, but
I do struggle with depression and anxiety, so I'm glad to see apps like this
being created.

I was pretty bored with the app ecosystem in 2017, but Calm Harm gave me hope
that there are still some people out there innovating and actually working to
use our existing technology to help people, rather than just making some money
off advertising or Yet Another To Do App.

The only thing I get concerned about with these apps is the storage of this
data and how it can be made personally identifiable. Hopefully these angles
are tackled with people's privacy in mind.

------
n1000
> So Woebot has moved away from Facebook and launched as a stand-alone app
> that only requires a first name to sign up; Darcy described the app as
> anonymous.

Err, so why do I have to register my phone number before I can do anything
with the app?

------
gtirloni
An alternative is sites like [https://7cups.com](https://7cups.com), where you
talk to real people who have undergone training for this. I can't comment on
the efficacy, though.

------
pnathan
Big PR coup here, well done Woebot Inc!

I look forward to computational therapists; they should help with a lot of the
oddities of traditional therapy.

I'd love to see it turned into a non-cloud service, though. Mental health
stuff can be incredibly personally sensitive, and a non-cloud solution would
be far preferable.

------
meesterdude
I tried it. It clearly had a script, and my answers were all emoji or one word
(they render for you as options to press). So it really was a
"next... next... next..." kind of experience. And then it was done with me for
the day, after it rambled about making mistakes.

There is a vibe of "I'm one of the cool kids", like when parents try to
"level" with their teenagers. And the machine learning appears rudimentary, or
perhaps tastelessly implemented.

But it gets CBT into the hands of more people. Maybe people wouldn't pick up a
book but are willing to check out an app. So I think that's an important win,
even if the app itself isn't all that.

~~~
hmhrex
I think this year we will see more therapeutic apps that deal with mental
illness. I also think this is just a simple beginning and that we'll see
bigger and better apps; maybe even this one will greatly improve with time.

------
lalos
Great, then they can sell 'anonymized' data to ad, insurance and recruitment
companies. They will surely not be able to correlate that data with any other
data points they may have.

------
dotdi
They seem to have made it free recently. Last time I saw it, maybe six months
ago, they charged an unholy $35 per week or so.

Now, let's chat up that bot...

------
akvadrako
If you like this idea, definitely watch Space Station 76 - a '70s version of
the future with a very smart but realistic therapy bot.

------
mycomments2017
What about the irc.freenode.net ##psychology channel, for example (as a last
resort before resorting to a therapy chatbot)?

------
randomdrake
Study: Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of
Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot):
A Randomized Controlled Trial

Citation: Fitzpatrick KK, Darcy A, Vierhile M. JMIR Ment Health 2017;
4(2):e19.

Link:
[https://doi.org/10.2196/mental.7785](https://doi.org/10.2196/mental.7785)

DOI: 10.2196/mental.7785

Abstract:

Background: Web-based cognitive-behavioral therapeutic (CBT) apps have
demonstrated efficacy but are characterized by poor adherence. Conversational
agents may offer a convenient, engaging way of getting support at any time.

Objective: The objective of the study was to determine the feasibility,
acceptability, and preliminary efficacy of a fully automated conversational
agent to deliver a self-help program for college students who self-identify as
having symptoms of anxiety and depression.

Methods: In an unblinded trial, 70 individuals age 18-28 years were recruited
online from a university community social media site and were randomized to
receive either 2 weeks (up to 20 sessions) of self-help content derived from
CBT principles in a conversational format with a text-based conversational
agent (Woebot) (n=34) or were directed to the National Institute of Mental
Health ebook, “Depression in College Students,” as an information-only control
group (n=36). All participants completed Web-based versions of the 9-item
Patient Health Questionnaire (PHQ-9), the 7-item Generalized Anxiety Disorder
scale (GAD-7), and the Positive and Negative Affect Scale at baseline and 2-3
weeks later (T2).

Results: Participants were on average 22.2 years old (SD 2.33), 67% female
(47/70), mostly non-Hispanic (93%, 54/58), and Caucasian (79%, 46/58).
Participants in the Woebot group engaged with the conversational agent an
average of 12.14 (SD 2.23) times over the study period. No significant
differences existed between the groups at baseline, and 83% (58/70) of
participants provided data at T2 (17% attrition). Intent-to-treat univariate
analysis of covariance revealed a significant group difference on depression
such that those in the Woebot group significantly reduced their symptoms of
depression over the study period as measured by the PHQ-9 (F=6.47; P=.01)
while those in the information control group did not. In an analysis of
completers, participants in both groups significantly reduced anxiety as
measured by the GAD-7 (F(1,54)=9.24; P=.004). Participants’ comments suggest
that process factors were more influential on their acceptability of the
program than content factors mirroring traditional therapy.

Conclusions: Conversational agents appear to be a feasible, engaging, and
effective way to deliver CBT.

------
ChrisClark
> My name is Dr. Sbaitso. I am here to help you. Say whatever is in your mind
> freely, our conversation will be kept in strict confidence. Memory contents
> will be wiped off after you leave. So, tell me about your problems.

~~~
schoen
In retrospect, I've wondered if Dr. Sbaitso _actually_ wiped the memory
afterward, or just said it would as a joke.

