
Facebook brings suicide prevention tools to Live and Messenger - tooba
https://techcrunch.com/2017/03/01/facebook-brings-suicide-prevention-tools-to-live-and-messenger/
======
pugio
To those commenting that this somehow absolves friends from the responsibility
of connection (and that's a bad thing), you're missing the point:

- Not everyone HAS friends who are willing or able to connect with them
sufficiently. Saying "yeah, but they should" does NOT help the actual
situation. These resources might.

- It is hard to know what to do with a (near) suicidal friend. Even if you
truly wish to help, "connecting" or reaching out to them often doesn't help
sufficiently. It can also be extremely draining on the helpful friend, who
thinks they have a responsibility as a friend and good person to be there for
someone in need. For normal gloom and doldrums, sure, that's what friends are
for. For severe depression and suicidal thoughts, a friend can very quickly
get in over their head, becoming the sole emotional anchor or lifeline for the
depressed person. This can also lead to the helpful person's becoming burnt
out, depressed, and upset at their perceived failure in helping this person
they care about.

The more technology can do in identifying this negative emotional vortex and
prompting people with the appropriate resources, the better.

If you're commenting about a friend's responsibility, have you ever been
severely (suicidally) depressed, or the prime friend of someone who is? I
have. I support Facebook's actions in this wholeheartedly.

~~~
Nuzzerino
It's nice to see them trying to do something about this. But I'd hold off on
judging the merits of this approach until some reliable data comes in with
regard to its efficacy.

------
contingencies
Reminiscent of the automated religious booths in THX 1138:
[http://www.imdb.com/title/tt0066434/](http://www.imdb.com/title/tt0066434/)
[https://en.wikipedia.org/wiki/THX_1138](https://en.wikipedia.org/wiki/THX_1138)

~~~
imglorp
That was my first thought too. It's a very small step from

    It sounds like you might hurt yourself, sending help.
to

    It sounds like you don't agree with the Party, comrade. We're
    sending the happiness team to collect your family for retraining.

~~~
zardo
That seems like a pretty big step to me. Would you oppose 911 systems on the
same grounds?

It's such a small step from: Your house is on fire, let's do something about
that.

to

Forced fire safety re-education camps.

~~~
imglorp
Both of the increments in my example depend on sentiment analysis of private
communications. That tech opens a whole brave new world.

~~~
zardo
Both of my examples involve your neighbor using telecommunications to report
you to the government.

That a technology could be used to do bad things is not a very good reason
not to use it to do good things.

Would not providing automated suicide support prevent malicious use of
sentiment analysis? Just because they use the same underlying technology
doesn't mean one use causes the other use.

~~~
imglorp
Yeah. I guess it's more a social issue: if a tech can be abused to support
the state's goals, we are quickly finding that it will be abused.

Quick example: Alexa has already gotten its first subpoena for a presumed
private voice conversation in someone's home.

------
forgottenpass
It's impossible to argue that preventing suicides is a bad thing. But it is
inseparable from the fact that Facebook is now _actively pursuing_ its
ability to use the platform for social control.

I'm willing to grant them the benefit of the doubt that they just sort of
bumbled into an interaction loop with downsides that might make facebook use a
net-negative for the mental state of some users. They were trying to provide
communication tools and it evolved into something that (at least correlates
with) a decline in well being [0][1].

As more and more "social" companies cultivate userbases larger than they can
moderate and start feeling justified in shaping their users' behavior en
masse, I worry that we're slowly stumbling towards human-scale Skinner boxes
that no one fully understands or controls.

[0]
[http://journals.plos.org/plosone/article?id=10.1371/journal....](http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0069841#s3)

[1] jimmies' comment
[https://news.ycombinator.com/item?id=13772729](https://news.ycombinator.com/item?id=13772729)
is a fairly typical example of someone figuring out facebook use is harming
their psyche.

------
dredmorbius
While I don't use Facebook, I've been on several other platforms and seen
either threats, or actual suicides, by multiple contacts amongst them.

Facebook has, by some measures, 1.8 billion active users, roughly 1/4 of the
global population. If there is a suicide every 40 seconds, there's a good
chance that there's one every 2-3 minutes among Facebook's users. Strictly by
standard mortality tables, there are tens of thousands of deaths amongst FB
users _per day_.
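The arithmetic above can be sanity-checked with a quick sketch. The inputs below are rough 2017-era public estimates (user count, world population, WHO's "one suicide every 40 seconds", global crude death rate), not figures from the article, and the naive scaling ignores that Facebook's userbase skews younger than the world at large:

```python
# Rough sanity check of the back-of-envelope figures in the comment above.
fb_users = 1.8e9                  # Facebook monthly active users (approx.)
world_pop = 7.4e9                 # global population (approx., 2017)
global_suicide_interval_s = 40    # WHO estimate: one suicide every ~40 s
crude_death_rate = 7.7 / 1000     # global deaths per person per year (approx.)

# Fraction of humanity on Facebook -- roughly 1/4.
fb_fraction = fb_users / world_pop

# If suicides were spread evenly, the interval between suicides
# among Facebook users alone stretches proportionally.
fb_suicide_interval_s = global_suicide_interval_s / fb_fraction

# All-cause deaths per day among a population this size.
fb_deaths_per_day = fb_users * crude_death_rate / 365

print(f"~{fb_suicide_interval_s / 60:.1f} min between suicides among users")
print(f"~{fb_deaths_per_day:,.0f} deaths per day among users")
```

This lands at roughly one suicide every 2-3 minutes and several tens of thousands of deaths per day, consistent with the claim above.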

In the case of one friend -- deeply troubled, I knew -- there were two periods
in which she'd threatened suicide. The first time, I managed to get through to
local authorities in her area, though that was surprisingly difficult itself.
The second time, it was simply too late. For another friend, I'd had no idea
until after it was all over.

The feeling of absolute impotence one can have behind a screen, and the
realisation of how impractical it is for a large systems provider to reach
out directly, is numbing. Heroic interventions, particularly via online
systems, are unlikely at best.

Far more critical would be to strengthen mental health, physical health, and
social welfare (in the broadest sense) systems. There are so many people, in
all parts of the world and all stations of life, in precarious straits, and
often only the flimsiest of support, if that, available.

I appreciate Facebook's efforts, but a commitment to substantive, early, and
ongoing support strikes me as vastly more meaningful and effective. This
initiative could have some impact, but without a deeper commitment it
strongly risks being seen as cosmetic and ultimately self-serving.

[http://expandedramblings.com/index.php/resource-how-many-peo...](http://expandedramblings.com/index.php/resource-how-many-people-use-the-top-social-media/)

[http://www.who.int/gho/mortality_burden_disease/mortality_ad...](http://www.who.int/gho/mortality_burden_disease/mortality_adult/situation_trends/en/)

~~~
Bartweiss
To a depressing degree, most of our society's efforts against suicide work
like this. The issue gets ignored altogether until someone is perceived to be
in immediate danger, and then they either get support or (all too often) a
three day psychiatric hold.

This process is terrible on so many levels. The psych hold in particular is
both excessive (if someone isn't suicidal, it's a pointless kidnapping) and
insufficient (if someone just attempted suicide, they're at _low_ risk right
after; if someone gets antidepressants, three days isn't long enough for
them to kick in).

I suppose Facebook isn't in a position to help earlier on (at least, not
without even more privacy invasion), but we definitely don't intervene in this
issue at sensible times.

~~~
dredmorbius
The kinds of interventions I'm looking for would be deep and social (see my
other reply in this thread). The thing about healthcare, physical or mental,
is that early _and sustained_ efforts pay off _vastly_ more than heroic
acute efforts.

I'm increasingly convinced that the traditional Chinese concept of paying the
doctor _when you are well_ is an approach well worth considering, and possibly
adopting.

It's also quite helpful to realise that not all ills _can_ be healed. Almost
all can be made far more tolerable, though, at the very least.

------
jimmies
I don't know about the claim "Facebook is in a unique position to help prevent
people from doing harm to themselves." Oh, Big Brother is our brother now?

If harm means physical harm, maybe -- or maybe not: apparently many people
think Facebook Live is a popularity contest and start streaming their
suicides on it [OP, and Google - 0]. To understand why fb might be the
reason people did that in the first place, ask: if live streaming
functionality were all that people wanted, why wasn't streaming suicide
attempts on YouTube a problem (or as big of a problem)?

But if, on a subtler level, harm also means slowly sinking into depression,
then to me Facebook is a drug that does exactly that.

There are studies which concluded that people are actually happier without fb
[1]. Personally, I found that to be true: I feel depressed browsing the
meaningless statuses of people bragging about trivial things or sharing
political news. I find fb isn't even a platform that I can share "what's on my
mind" (like it always prompts) anymore. Everything is tied to my real name
with no other option. I can't share the stories at work because co-workers on
facebook will see. I can't share anything about my relationship because our
mutual friends will see, and my girlfriend gets pissed off. I can't share my
good news because people will think I am a dick who wants to brag. I can't
share my bad news because people will get more worried than me. I can't
share hobbies because my friends don't have the same hobbies and don't care,
so no reaction, so I get depressed because no one cares.

So all I can share is shitty vacation pictures with me smiling like an
idiot, and inconsequential news. Those are the posts that please everyone:
the lowest common denominator of me and everyone on fb. Fuck that. I also
found that my behavior on fb is extremely similar to addictive behavior:
finding it not useful but coming back anyway, repeatedly checking it for new
stuff, coming back after deactivating.

I used to think Facebook Messenger was essential -- until I found myself not
reading the messages and not caring what people say in the 1000s of groups I
am in. I see that in my friends too: they leave groups without saying a
word, which means FU, don't add me to stupid group chats again. How about
events? People have started not responding to event invites as well.

So I decided enough was enough -- given the numerous ethical problems of
facebook [2], fb does more harm than good to me. A couple of weeks ago I
deleted my fb account. This time I didn't announce that I would be gone; I
just silently deleted it. I'd had enough, and I don't care who will miss me,
thank you facebook. That's a lie anyway: most of them won't, and if they or
I do, we already have each other's contact information.

At the beginning of 2017, I also swapped my top-of-the-line smartphone for a
dumb phone, with little to no distraction. Data caps are no longer a
problem, privacy is no longer a question (I just assume everyone hears),
features are no longer a problem, losing sleep is no longer a problem, apps
are definitely no longer a problem, and I no longer question or have any
problem when someone defriends me. I check my emails when I want. I come to
people. I no longer have the urge to check my phone for the facebook feed in
the middle of the night. I lug around a big-ass mirrorless camera that takes
pictures not for likes and navigate by my instincts. I feel great. I feel
productive. I feel creative. I feel free. I started commenting on websites
where people don't know or care who the fuck I am. But they have the same
hobbies and interests, so I feel connected, I feel togetherness.

There is definitely more when I have less of facebook.

0:
[https://www.google.com/search?q=girl+live+streams+sucide+on+...](https://www.google.com/search?q=girl+live+streams+sucide+on+facebook)

1:
[http://journals.plos.org/plosone/article?id=10.1371/journal....](http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0069841#s3)

2: [https://stallman.org/facebook.html](https://stallman.org/facebook.html)

~~~
M_Grey
I've found Facebook almost comically easy to avoid since day one. The only
thing that never changes is how intensely people on Facebook want me to join
them, but their arguments never improve. The only friends I know still on FB
are the ones who only ever used it to keep in touch with people halfway
around the world, and nothing else. Even then, most of them have migrated to
other services.

Facebook is a crazy invasion of privacy, but you literally have to invite them
in.

~~~
Bartweiss
> Facebook is a crazy invasion of privacy, but you literally have to invite
> them in.

Yep. I have my Facebook privacy settings cranked up to max, but I'm honestly
not sure why I bother. My actual policy is "this is a public space, only put
things here I'm happy to publicize", so there's really no privacy to invade.

~~~
M_Grey
That's the sane way to do it, definitely.

------
cJ0th
> In Facebook CEO Mark Zuckerberg’s recent manifesto, he wrote about how
> Facebook is in a unique position to help prevent people from doing harm to
> themselves.

This makes you wonder how often a post on facebook (say, school kids
bullying a classmate) is the final straw for someone who is already mentally
suffering.

In other words, FB is probably quite responsible for many suicide attempts
by people who are at risk of committing suicide.

~~~
wu-ikkyu
I hate Facebook, but hastily placing blame and responsibility on them after
a suicide is short-sighted -- unless it was related to the emotional
manipulation experiment they ran a while back.

[https://www.forbes.com/sites/gregorymcneal/2014/06/28/facebo...](https://www.forbes.com/sites/gregorymcneal/2014/06/28/facebook-manipulated-user-news-feeds-to-create-emotional-contagion/#482e8ef039dc)

~~~
cJ0th
"Responsibility" may be too strong a word but I guess it is fair to say that
technically they've played a role in many cases. And because of that it rubs
me the wrong way when they say they are in a unique position to "help" people.
It's rather an attempt at saving face.

------
6stringmerc
What a sticky situation. I applaud the groups concerned with well-being for
being open to a partnership with Facebook. I also want to see the tools in a
positive light, for the good they can do. Deploying such tools is a risk on
Facebook's part, and I think it's the good type to take. Harm prevention is
probably a consensus point to work on; I see it as a good.

My imagination did give me a little pause, though: if somebody were going
through some kind of paranoid state and the Facebook AI suddenly scooted
them over to one of the crisis providers -- you know, AI being the kind of
technology that's indistinguishable from magic/witchcraft -- that could
compound the situation. Granted, a drastic hypothetical, but perhaps not an
entirely useless one.

------
DanBC
This is probably a moderately useful suicide and self harm prevention measure.

We know that of deaths by suicide among people under 18, 1 in 7 posted
messages to social media before they died:
[https://twitter.com/ProfLAppleby/status/837053501243023364](https://twitter.com/ProfLAppleby/status/837053501243023364)

We have small amounts of weak evidence, mostly from interviews with people
who survived suicide attempts and people who self-harm, that this kind of
intervention is helpful.

See also signs in multistory carparks and different packaging (and reduced
pack sizes) for paracetamol.

I do a lot of searching for suicide related stuff, and I already see quite a
lot of similar advice. I guess I'm about to see a lot more of it.

[https://twitter.com/actioncookbook/status/834439563032555521](https://twitter.com/actioncookbook/status/834439563032555521)

~~~
gcr
But I think what makes these kinds of interventions impactful is that they
come from one of your friends.

In the intervention Facebook proposes, the victim only sees a popup from
Facebook, not a friend. The person who triggered that popup isn't even
named. It's very impersonal.

If I were suicidal, the only thing this popup would tell me is that I made
one of my viewers uncomfortable, _and that they weren't willing to connect
in person_. I imagine that would only serve to amplify my shame and make me
feel even worse.

------
Existenceblinks
I suddenly think of the fake bomb alert last year:
[http://www.theverge.com/2016/12/27/14088982/fake-news-safety-check-bangkok-thailand](http://www.theverge.com/2016/12/27/14088982/fake-news-safety-check-bangkok-thailand).
I can't tell whether this will produce more false negatives or false
positives; it might be worse. How do we prevent the former? I don't think
there is a way. It reminds me of kids who keep calling 911 for fun. That
used to be a thing in my country.

~~~
Bartweiss
And false positives for suicide are comparably bad to ones for bombs. No SWAT
teams perhaps, but it can easily get someone held on a closed psych ward for a
week (with lasting legal consequences for them). Someone is going to sue the
hell out of FB when they get held on a false positive.

------
wooshy
This article leaves me with a very sour taste in my mouth. I echo the
sentiment others have expressed that pawning off the work of connecting with
the person in need is a bad thing. I also don't think that just because
Facebook is in a position to do something about it, they should. It seems
like a far too personal matter for them to be sticking their hands into.

~~~
killjoywashere
Having participated in a couple of teen suicide autopsies, I submit that, at
least in the meetings I was in, there was broad support for recruiting
social networks to help both fight cyber-bullying and detect suicidal
tendencies. If anything, let's screw up in the other direction for a while
and see how the two approaches compare.

------
gcr
This really strikes me as wrong. I can't quite put my finger on why.

It's like if I'm watching my friend go through a hard time, this removes my
responsibility to connect with them? I can just ask robots at Facebook to do
the dirty emotional work for me. How messed up is that?

We're outsourcing emotional labor to robots now? Like, if I get
uncomfortable supporting my friend, I can click a button to have robots do
it instead?

An actual handcrafted message from someone saying "hey, it sounds like you're
hurting. what do you need?" would mean far more to me than a popup from
Facebook that says "An Unnamed Friend has Activated Crisis Response Protocol!
Deploying Support Resources ..."

~~~
pugio
You're describing the world in terms of "should/ought" rather than "is". Yes,
it would be nice to live in a world where every person has friends who are
capable of connecting with and supporting them, but that is not the present
world. The point of tech is scalability - you can implement something which
can rapidly help _everyone_, especially those for whom other systems have
failed.

Additionally, you're ignoring how hard it can be to adequately support a
suicidal friend. Often, people want to help but feel out of their depth and
don't know how, and many people are unaware of resources such as the Crisis
Text Line or the suicide prevention hotline (which is what FB connects you
with).

As for the "responsibility to connect with them", at a certain point your
responsibility as a friend is to get someone professional help and
intervention. Trying to keep a depressed friend afloat as a layperson is
enormously taxing, and often unsuccessful. Yet people think they have a
"responsibility" to keep trying, as a good friend, and can easily burn
themselves out, with disastrous consequences for both themselves and their
depressed friend.

I welcome any extra support technology and AI can provide, at least in terms
of connecting people with the correct resources.

------
gydfi
How long before facebook gets sued over a suicide it failed to prevent?

~~~
kolokolo
How long before Facebook starts grading people and choosing whom to prevent
from suicide and whom to push towards it?

~~~
type0
They could always sell that data to companies that are interested in not
employing persons with mental health issues.

~~~
lawless123
Something like Cambridge Analytica could use it to do terrible things.

~~~
nkrisc
I can think of quite a few awful things data like this could be very useful
for. I feel just a little gross that I had these ideas.

------
funnyfacts365
Did Facebook even stop to think and realize they are the reason those persons
are depressed and wish to die?

------
LoonyBalloony
Fix the inequity in society that is the cause of most depression? Nahhh,
let's just virtue signal.

~~~
grzm
Specifically what role do you think Facebook should play in reducing
inequality?

~~~
LoonyBalloony
Reorganize as a worker cooperative.

