
Show HN: Pacifica, daily tools for anxiety based on Cognitive Behavioral Therapy - beermann
http://thinkpacifica.com
======
japhyr
Does an app like this need to be HIPAA-compliant?

I would guess that many people would consider building an app like this
without thinking too much about HIPAA. "We're not doctors, we're just building
an app that will help people manage their anxiety." But the app clearly asks
questions of its users that are focused on mental health. If a situation
arises in which a user has a bad experience related to how their information
was shared, it seems quite reasonable to consider whether their right to
medical privacy was violated. This seems particularly important with the
unfortunate stigma associated with mental health issues.

I don't ask this just to nitpick. I'm looking at building some projects
related to education, and in education there's a comparable act called FERPA -
Family Educational Rights and Privacy Act. It seems convenient to ignore these
kinds of regulations when building projects that are meant to be really helpful
to everyone, but once a project like this takes off, compliance with privacy
acts seems critical.

I'm quite curious to hear from the developers what their take on HIPAA has
been.

~~~
beermann
This is a great question. Technically, we don't store Protected Health
Information ("Personal health information (PHI), also referred to as protected
health information, generally refers to demographic information, medical
history, test and laboratory results, insurance information and other data
that is collected by a health care professional to identify an individual and
determine appropriate care."). Everything is self-reported in Pacifica.

That said, we treat our data as if it were PHI. We have a Business Associate
Agreement signed with AWS, and take all of the precautions they require for an
app that would claim it is HIPAA compliant. Technically, we could claim that
we are HIPAA compliant, as we don't store PHI, but we didn't want to say that
just for the sake of saying it.

The bigger question, in my mind, is about whether or not a situation would
arise as you mention. The FDA recently provided a little more clarity on some
of this ([http://mobihealthnews.com/39775/fda-clarifies-the-line-between-wellness-and-regulated-medical-devices/](http://mobihealthnews.com/39775/fda-clarifies-the-line-between-wellness-and-regulated-medical-devices/)). Specifically, Pacifica seems to
fall outside regulation as it "Claims to promote relaxation or manage stress
when there is no reference to anxiety disorders or other reference to a
disease or condition." We try to be pretty careful about the language that we
use. We don't mention things like Generalized Anxiety Disorder, Panic
Disorder, OCD, etc.

The truth is that it still seems like a grey area. That same article mentions
that we should not claim that we treat anxiety if we want to stay unregulated.
I think that we're on the fence here. In the future, we will go after FDA
clearance in any case. We just need the means to do so.

~~~
markolschesky
There's a bit more to HIPAA than signing a BAA with your infrastructure
vendor. I agree that you currently aren't storing PHI, so you're in the clear
for now, though I imagine that in the future your business will require you to.
There are a bunch of things like auditing, logging, vulnerability scanning,
disaster recovery, training, and having policies in place for when you do need
to fully account for protecting PHI.

We open-sourced our HIPAA policies where I work at Catalyze recently. Check
'em out and good luck!
[http://catalyzeio.github.io/policies/](http://catalyzeio.github.io/policies/)

~~~
beermann
Thanks Mark, we've actually read your HIPAA policies. And yes, you're correct,
there's a lot more to HIPAA compliance than how you store your data. We will
continue to move down the path of treating everything as being PHI even though
it isn't currently.

------
falcolas
Being married to someone with an anxiety disorder, the best second-hand advice
I can offer is: see a professional. If they recommend CBT, great, perhaps this
tool can help. If they prescribe medication, pursue that as well.

The medication they can prescribe really does help: it stops your heart from
racing, it helps you stop the cyclic negative thinking... it just plain helps.

~~~
beermann
Thanks for the comment. We actually agree entirely. One of the most effective
ways to combat anxiety is through a combination of psychiatric drugs
administered alongside CBT. Unfortunately, not everyone has the means to
pursue professional help, for a variety of reasons. Of the 40 million people
in the U.S. with anxiety disorders, only 1/3 of them are actually seeking
help. We're hoping that Pacifica can help bridge that gap, ideally helping
people find the care that they really need.

~~~
falcolas
I understand the challenges, they are similar to those faced by the adult ADHD
community. Less than 1/5 of adults with ADHD are being treated.

Our society puts so many negative connotations on the use of drugs to address
mental deficiencies that even after learning how much they could help, we
hesitate for fear of the peer pressure and social stigma of "popping pills" in
favor of "shut up and man up".

> We're hoping that Pacifica can help bridge that gap, ideally helping people
> find the care that they really need.

Great! Might I recommend using your app to provide advice on how to approach
discussing this issue with a GP, or even giving links to professionals in the
user's area? Or even providing a "doctor's view" which can be passed to a GP
to assist in their diagnosis?

I make these suggestions because the emphasis on the word "privately" across
your website concerns me. It seems to reinforce the negative mindset of "this
is your problem, don't bother anybody else" that I saw so profoundly affect my
wife.

~~~
beermann
I love the idea of recommending how to approach this topic with a GP. We've
been a little careful, as described in another comment, about crossing the
line of providing medical advice. Currently, we see Pacifica as a self-help
tool. We have to find the right ways to do some of this because we don't want
to upset the FDA. The truth is that they have the same goals that we do, and
we will try to work with them to provide the best application we can, while
making sure it is safe to use.

Regarding "privately," we had a slightly different take. Because of the
stigma, there are so many people that aren't seeking help. We think that
providing an application that can help you understand where some of your
thoughts come from might help you realize that you can start talking about
this. And ultimately, our hope is that creating a consumer application that is
widely used might contribute to breaking down some of those walls. It's a
lofty goal, but our mission is to do what we can to combat the stigmas around
mental health.

------
graycat
Looking at the OP, at least in part they are addressing _anxiety disease_.
Their suggestions sound good.

If the suggestions work, terrific.

But: As someone who had a close family member die from anxiety disease, I have
to say that during the long course of the disease we thought of all those
suggestions, especially _cognitive_ ones, and many more, and they were all
like a BB gun against a Russian tank. The real problems were much deeper.

And the _cognitive_ approaches, that didn't work, were being tried by a
genuinely brilliant patient -- Valedictorian, PBK, _Summa Cum Laude_ , world
famous research university Ph.D. _Cognitive_? No shortage of _cognitive_
ability: The patient saw and understood the _cognitive_ ideas, maybe more
deeply, and certainly faster than the professional could present them. At one
point, the professional had the patient write a paper describing the cognitive
approaches then exclaimed that the paper was "brilliant". Yes, it was -- very
clear, etc. And the _cognitive_ approaches? Total flop.

So, after considering such _suggestions_ , _good ideas_ , and _face validity_
, I get led to consider also the old, two criteria -- safety and efficacy.

Again, if the suggestions work, terrific. But I would suggest for such
patients and their families, ASAP, and maybe not in this order, (1), if only
to be a better, loving family member, learn as much of the Clinical Psychology
101 level material you can and (2) get the best professional help you can. And
for (2), if at first the _treatment_ doesn't look quite promising and/or
fairly soon there is no significant progress, which in my small sample size
seems quite likely, then get some better professional help.

Be careful with anxiety disease: Else members of the close family can throw
away significant parts of their lives, and the patient, all of theirs.

In K-12 or even in a college STEM BS you may not have been taught good
information about anxiety disease -- so, at first symptoms, and you need to
know about such symptoms, get caught up.

------
beermann
Thanks for your comments everyone. I'm trying to keep up with them all. A
little more information:

Pacifica is a hybrid application, built on the Ionic Framework
([http://ionicframework.com/](http://ionicframework.com/)). We've been pretty
happy with Ionic; it's the main reason we were able to release on Android and
iOS simultaneously. Thanks to Max and the Drifty team for creating a great
platform.

There are a lot of comments and questions about privacy and compliance. I'll
try to summarize some of my answers:

We don't technically store what's called Protected (or Private) Health
Information. This is because Pacifica is a self-help tool and PHI is defined
as originating from a healthcare professional. That being said, we are taking
steps to treat our data as if it were PHI. We have a signed Business Associate
Agreement with Amazon and are trying to operate as if we were HIPAA compliant
(we technically are, in the same way that any company that doesn't store PHI
is HIPAA compliant).

Regarding privacy and security: yes, we're in the cloud. Specifically, on AWS.
While this may be contentious, we believe there's no reason this is less
secure than if we were hosted in a local colocation facility. Amazon has
pretty rigorous requirements for who has access to machines and who can access
data on those machines. Many of their services are HIPAA compliant, and they
certainly take this extremely seriously.

In addition, we do try to make sure everyone's data is as safe as possible.
The mobile applications communicate with our servers over HTTPS. We're using
Elastic Load Balancers but don't terminate SSL at the ELB; it passes through
to our own servers, so Amazon doesn't have the private keys. Recordings are
stored encrypted in S3, and our RDS instances are also encrypted. There's more
that we can do (as there always is), but we wanted to provide a little
information about what we are currently trying to do to protect things. We
welcome any additional suggestions.
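
To make the at-rest part concrete, here's a tiny sketch of the
encrypt-before-store / decrypt-on-read pattern. This is purely illustrative
(it assumes Python's `cryptography` library and is not our actual server
code):

```python
from cryptography.fernet import Fernet

# The key lives on our servers, never on the client or the load balancer.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a recording before it is written to storage (e.g. S3).
recording = b"raw audio bytes"
stored_blob = fernet.encrypt(recording)

# Decrypt on the way back out; tampered blobs raise InvalidToken.
assert fernet.decrypt(stored_blob) == recording
```

S3's server-side encryption and RDS encryption do the equivalent of this
transparently; the sketch just shows why someone who gets the blob without the
key gets nothing useful.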

------
airza
I would love to disclose my extremely disturbing and potentially violent
intrusive thoughts to an application with a completely murky privacy policy...

~~~
UrMomReadsHN
Totally agree.

Looks like you have to log in, and your private medical data is stored "in
the cloud." I have no idea why anyone would think this would be a good idea.
Storing it locally would be the only remotely sane solution.

I mean, I don't mind telling someone I have an anxiety disorder (and I do have
one). But the content of some of those intrusive thoughts is something I don't
want anyone to know. Especially not some startup who may sell their company
to, say, Facebook one day. Some of my intrusive thoughts are things that I did
wrong (10 years ago...). So logging them would mean a third party would have
an entire database of almost every mistake I made in my life, out of context.
The potential for that to go wrong is... extreme. I'd rather have nudes be
leaked.

And the stupid embedded video in a jQuery modal. Stop it, you're making your
videos unusable. Let me pause. Oh wait, I can't. I need to pause the video
because it runs at lightning speed, so fast it is impossible to gain any
information at all from it unless you pause it.

~~~
UrMomReadsHN
Ok, while I'm crapping all over your app (sorry...you were looking for
feedback, right?)

The website rendered so badly on my phone that I went to check how it looked
on the desktop. Well, the video pause issue is fixed... (I still think putting
a video in a jQuery modal is incredibly annoying... just embed the video.)

You need to fire your "usability consultant." Your color scheme is absolutely
horrible for anyone who is older or has less than perfect vision. It is very
difficult for me to read.

Any time you ask yourself "should I display my main content* in white text,"
the answer should be no. Doubly so if your background is light pastel green.

Light grey on white is another combo that is extremely hard to read. So is
white on very light picture of a beach.

Animated backgrounds(?) (the background is not animated in my mobile browser)
are not only incredibly distracting but also, ironically, actually cause
anxiety for me. I'm sure I'm not the only one.

I know different people have different preferences, but at LEAST give your
color scheme some halfway sane contrast. And fix the rendering errors on
mobile. Both issues, besides affecting usability, make you look amateurish. In
my opinion.

*not saying you should never use white text, ever. Just not in paragraph form.

~~~
gqvijay
Most of what you mentioned is design-related, don't you think? Don't you mean
"fire the designer," not the usability consultant?

~~~
UrMomReadsHN
Yeah, it's all design, but they list a usability consultant on their staff
page. Usability is part of design, and if a usability consultant can't pick
up such glaring problems with how the design affects usability, I'm not sure
what they are supposed to do.

------
DanBC
I'm always pleased to see new tools.

Did you check the NICE guidance for Computer based CBT?
[http://www.nice.org.uk/guidance/TA97](http://www.nice.org.uk/guidance/TA97)

They recommend some software; they do not recommend others. It would be
interesting to see whether your app avoids the mistakes made in the software
they don't recommend.

(NICE is one of the English "DEATH PANELS" - commissioners of health services
need to pay attention to what NICE says.)

------
addydev
I am a behavioural scientist and really liked your app. Good work. Get in
touch.

~~~
japhyr
You might want to modify your HN profile to include your email or some other
contact information.

~~~
addydev
Contact me on twitter @gopcruise

------
foolinaround
1) What do you do with the data that you collect? Can it be turned over to
other organizations? Can it be used for identification?

2) How long do you retain data? How long is it needed for the proper
functionality of the app, and how much longer do you store it for your
internal research and data-mining purposes?

3) Can the data be destroyed on demand, when an account is closed?

4) Can much of the data be stored on the device itself? Will this be on the
roadmap?

(I have more questions, but this is a start :) )

~~~
beermann
Thanks for the questions. Hopefully I can answer them satisfactorily:

1) We don't do anything with it other than use it to provide the best user
experience we can. We won't turn it over to other organizations. Technically,
the thought records could be used to identify an individual based on your
voice, but someone would need to gain access to them. They are stored and
transmitted encrypted.

2) Currently we retain all data. This may change in the future. For the
progress feature to work correctly you'd want to retain a month's worth of
data. The idea is that thought records help you analyze your thinking over
time. This may take months, or longer. I kind of like the idea of allowing
the user to set their own retention policy, though. It would take a little
work to implement, but I think it's reasonable.
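
At its simplest, a user-set retention policy is just a cutoff-date filter over
stored records. This is a hypothetical Python sketch (the record shape and
function name are made up for illustration, not code from the app):

```python
from datetime import datetime, timedelta, timezone

def apply_retention(records, retention_days):
    """Keep only records newer than the user's chosen retention window.

    Each record is assumed to be a dict carrying a timezone-aware
    "created_at" datetime.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    return [r for r in records if r["created_at"] >= cutoff]
```

A periodic job would run something like this per user and delete whatever
falls outside the window.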

3) Yes, if you close your account we will destroy everything (except for what
we need to retain for purchasing records, but that is anonymized when we
delete an account). You can't currently request that through the app though,
you have to email us at info@thinkpacifica.com.
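
To illustrate what "anonymized" can mean for the retained purchasing records:
replace the account ID with a one-way pseudonym whose salt is discarded, so
the record can no longer be linked back to a person. A hypothetical Python
sketch (not our actual code):

```python
import hashlib
import secrets

def anonymize_account_id(account_id: str) -> str:
    """Replace an account ID with an unlinkable one-way pseudonym.

    A fresh random salt is generated and thrown away after hashing,
    so the original ID cannot be recovered or re-linked later.
    """
    salt = secrets.token_bytes(16)  # discarded after this call returns
    return hashlib.sha256(salt + account_id.encode()).hexdigest()
```

Because the salt is never stored, even two deletions of the same account
produce different pseudonyms.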

4) We actually have implemented a lot of offline functionality. We just
couldn't get it to where we wanted to before launching. As a bootstrapped
company, we need to try to ensure our longevity in order to provide this. But
yes, it's definitely on the roadmap.

Happy to answer anything else!

~~~
foolinaround
Thank you for patiently answering the questions, as I believe that this app
can be valuable for many of us, as soon as details get clearer.

I would request that you publish a roadmap on your webpage, stating your
various plans and their current prioritization. If possible, allow registered
users to add comments/votes.

My comments on your answers above:

1) Am I correct in assuming that this data as such is not valuable without
voice identification? Are there ways to anonymize it? Just asking.

The website talks vaguely about turning data over to authorities, but more
details in the privacy policy would assuage some of our fears.

2) You might want to let the user download that data and then get it off your
system. It saves you storage, liability, etc., without necessarily impacting
the app.

If the user wants highly accurate data, he leaves it there. Otherwise he
deletes it. The onus should be on the user.

3) I don't think this is mentioned on your site?

4) Awesome!

~~~
beermann
I like the idea of having a roadmap. We'll try to get something up once things
die down a bit.

1) Right, unless you can perform voice identification having the recording
doesn't matter. Your voice could be used as an identifier though, which is why
I mention it. I'm sure we could do something to modulate the recording with
some filters, but part of the therapy is in hearing yourself talk, so it would
be a bit self-defeating. As for privacy, yes, it's clear we're going to need
to make a few tweaks.

2) I think allowing the user to download the data would be good for the
roadmap :)

3) No, it isn't. It was actually something I put in place this week and we
just haven't gotten any info on it up yet.

4) Hopefully sometime soon :)

------
fluidcruft
Maybe it's just me, but it seems like the person with the PhD is buried in the
"About us" page, and that made me instantly less interested, particularly
given the heavy emphasis on CBT. So, to me, the entire effort is now sorted
into the marketing-bullshit/MBA-lipstick/snake-oil category.

I mean Dr Moberg seems bolted on as an afterthought when you realized "oh,
shit people might actually expect to see relevant credentials". Apparently you
don't even know what she does there besides "contributes to Pacifica’s
development on a regular basis". The site is already dripping with Valley
happy-derp marketing speak, and that's the best spin you can do for someone
who should be at the center of the project? If that's not the case, you
really need to fix your messaging.

~~~
beermann
Christine is our advisor, and an awesome one at that. We're certainly not
trying to hide her.

We created Pacifica because my co-founder, Chris, has struggled with anxiety
his whole life. He came to me saying that we should try to do an app based on
CBT. Personally, I've had insomnia for quite a while and I was really
interested in how CBT might apply to both of our situations. We did a ton of
research and Christine was one of the people we reached out to to make sure
that we were adhering to the best practices in the profession.

I'm sorry you see this as snake oil. We've really tried to build an app that
fits into the daily lives of people with anxiety. Furthermore, it's designed
from the perspective of someone with anxiety to provide tools that you can
actually use throughout your day. Is there a particular aspect of the app that
you don't like or is it simply the site itself?

------
mcfist
That's what I'm getting as an opening screen: "The application Pacifica
(process com.pacificalabs.pacifica) has stopped unexpectedly. Please try
again."

~~~
beermann
Would you be able to send an email to info (at) thinkpacifica (dot) com from
the email address you created the account with? If you could let us know which
device you're on, we'll check it out as soon as we can. Thanks and sorry for
the trouble.

------
Deebot
Storing mental health data in the cloud with a random "startup" type company
sounds like it would cause more anxiety than the app could possibly solve. If
it doesn't, it should.

What an awful idea. I can't help but wonder what their goal is with the data
they're collecting via this app.

