
Facebook is predicting if you'll kill yourself. That's wrong - pseudolus
https://www.theguardian.com/commentisfree/2019/jan/30/facebook-is-predicting-if-youll-kill-yourself-thats-wrong
======
lazzlazzlazz
The never-ending seesaw of responsibility: years ago, people complained that
Facebook was doing nothing despite having access to critical information, at
the right time, that could save lives by preventing suicide.

Now that Facebook responds to those criticisms - that's wrong too.

Over the last few years, the criticism aimed at Facebook has been the clearest
demonstration of "damned if you do, damned if you don't" I've ever seen.

~~~
e1ven
Are you seeing these criticisms from the same people, or from different
groups, who might have differing opinions?

When you become as big as Facebook, there are going to be arguments about your
moral responsibility, and not everyone will agree.

------
dannykwells
I think the point about HIPAA is particularly important: hospitals and other
entities collecting health information are required to meet extremely strict
data privacy standards. Suppose Facebook started selling your suicide risk
score to insurance companies? Suppose the data leaked? For the individuals
affected, it would be a disaster, and one they had no control over. Regulation
must come to Facebook and the other major tech companies, forcing them to
treat this kind of data like personal health data.

------
DanBC
> Some of those risks include high false positive rates leading to unnecessary
> hospitalization and forced medication, potentially violent confrontations
> with police, warrantless searches of one’s home,

None of these are a problem of Facebook identifying people at risk for
suicide.

All of them are American problems that are outside Facebook's control.

It's ridiculous to me to suggest someone is going to be forced into hospital
against their will because of a Facebook algorithm. That cannot happen in
England. The US system is hopelessly broken if it happens over there.

------
marblar
> "Ever since they've introduced livestreaming on their platform, they've had
> a real problem with people livestreaming suicides," Marks says. "Facebook
> has a real interest in stopping that."

[https://www.npr.org/2018/11/17/668408122/facebook-increasingly-reliant-on-a-i-to-predict-suicide-risk](https://www.npr.org/2018/11/17/668408122/facebook-increasingly-reliant-on-a-i-to-predict-suicide-risk)

~~~
masonic

      "Facebook has a real interest in stopping that."
    

Stopping the _live streaming_, not the suicides themselves.

------
Udik
Next step: Facebook detects you're at risk of committing suicide and starts
increasing the influx of positive posts to your timeline.

~~~
DanBC
They already target adverts for talking therapies to this group.

~~~
tropo
I would think that adverts for talking therapies should go to any friends of
the suicidal person.

The suicidal person would be much more interested in purchasing stuff like
rope, pills, life insurance, inert gas, and firearms.

~~~
DanBC
Weirdly, because I research suicide a lot I do sometimes get adverts for rope.

There doesn't seem to be a way to tell ad-tech companies about this.

Here's one example: [https://imgur.com/hhOYUJb](https://imgur.com/hhOYUJb)

------
woah
This sounds similar to “swatting”. Facebook can now order the police to come
into your home without a warrant.

