
Can artificial intelligence help prevent suicides? - aspen97
https://www.dqindia.com/can-artificial-intelligence-help-prevent-suicides/
======
user748191
> It harnesses data to look for words or terms or emojis that point towards
> suicidal thoughts and immediately connect these individuals to counsellors

All AI rules MUST be crystal clear for the public.

In this case, all words, terms and emojis MUST be released to everybody. Not
only the ones using the app, or whatever, but parents, relatives, everyone.

People will say, oh, but users will try to game the system and this and that.
Well, that's another problem.

The fact is, we can't live in a world where AI rules and only a dozen people
know the rules.

------
gus_massa
I'm very worried about a nightmare scenario where a false positive makes
Clippy send you to a week in the suicide watch ward because you are not
smiling enough or you don't enjoy Valentine's Day.

I remember a story from last year where a graduate student had a big argument
with his advisor, and the advisor reported him to the suicide service. And
they came and forced the graduate student to go in an ambulance with a
straitjacket to have a friendly interview. (I can't find the story, but my
search history now looks suspicious ...)

Remember that no one will want to override the AI. If you make a mistake and
the subject commits suicide, you will get a lawsuit or go to jail. If you
follow the orders of the AI, you are just doing your job.

~~~
krapp
>I'm very worried about a nightmare scenario where a false positive makes
Clippy send you to a week in the suicide watch ward because you are not
smiling enough or you don't enjoy Valentine's Day.

No one is suggesting ever giving "Clippy" the ability to make diagnoses, much
less to automatically assume power of attorney and have people committed
against their will based on criteria no human doctor would ever accept. Your
nightmare scenario is too absurd to be worth worrying about.

~~~
gus_massa
From the article:

> _It harnesses data to look for words or terms or emojis that point towards
> suicidal thoughts and immediately connect these individuals to counsellors
> to prevent loss of lives due to suicides._

The problem is that Clippy will send you to the counselor. You will get a
generic notice, but the counselor will see the suicide risk in big red
letters.

Then the counselor will have a protocol, and send you to a real doctor, also
with a hidden warning about the suicide risk.

The doctor will also have a protocol, and put you under watch for a week just
in case.

Where you see two thinking humans who could override Clippy, I see two
rubber-stampers who are afraid to make a mistake, so they just follow the
protocol.

I don't expect to see this here in Argentina; sometimes the language barrier
and the cost are an advantage. But I expect to read a few horror stories
about this from universities in the US over the next 10 years. Let's verify
our guess on April 1, 2030.

