Ask YC: Why not prevent suicides by automatically spying on people's emails?
2 points by amichail on Nov 21, 2007 | 26 comments
One can create AIish algorithms to estimate how depressed someone is likely to be by analyzing their emails. Something like that could be done with Gmail, for example.

If some depression threshold is passed, mental health authorities would be contacted automatically and immediately. Surely the lives saved would be worth any loss of privacy?
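A minimal sketch of what the proposal seems to describe: score each email against a keyword list and alert when a threshold is crossed. Everything here (the keyword list, the weights, the cutoff) is a made-up illustration, not a real or recommended detector; genuine risk classification is far harder than this.

```python
# Hypothetical keyword-based "depression score" with a hard alert
# threshold. Purely illustrative of the idea being debated.
DEPRESSION_KEYWORDS = {"hopeless": 3, "worthless": 3, "alone": 1,
                       "tired": 1, "goodbye": 2}
ALERT_THRESHOLD = 5  # arbitrary cutoff, chosen for the example

def depression_score(email_body: str) -> int:
    # Sum weights for any keyword appearing in the message body.
    words = (w.strip(".,!?") for w in email_body.lower().split())
    return sum(DEPRESSION_KEYWORDS.get(w, 0) for w in words)

def should_alert(email_body: str) -> bool:
    return depression_score(email_body) >= ALERT_THRESHOLD
```

Even this toy version shows the obvious failure modes: false positives on ordinary venting, and trivially evadable or spoofable input.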




Why not have televisions that watch back and make sure we're not making bombs? Why not have random searches and seizures if there's a chance we could save a life? Why not put GPS tracking into every car, every person, so we could find out where they are if they go missing?

Slippery slopes.

And don't call me Shirley.


> Why not put GPS tracking into every car

This is already here... GM offers use of its OnStar system to law enforcement so they can hear what you are saying while in your car; OnStar also has GPS tracking.

> every person

Your cellular phone already does, even when it is turned off.


OnStar is designed to activate in only two instances: 1. someone presses the blue button in the car manually, or 2. the car detects an accident and automatically calls OnStar and the police.

It doesn't "hear what you're saying while in your car" just randomly.


http://www.news.com/2100-1029_3-5109435.html

OnStar initially helped the FBI in a case by letting them listen in on a conversation inside a vehicle, then decided to sue the FBI over it. That doesn't mean OnStar won't change its mind later...


You probably think your phone only listens when you take it off the hook, too.


"AIish" is my new favorite word. Let's apply this principle to some other words.

"I'm sorry that your father died, but we aren't really a hospital... we're hospitalish."

"I'm not a plumber, but I am plumberish."

"If I just came right out and said that strong AI is a joke, and that I wouldn't necessarily trust an AI to sort my socks, let alone recommend to government authorities that I be committed to a mental institution, it would not be subtle. But it might be subtleish."

Also, you do realize that your plan would permit anyone who can forge a FROM header to conduct a denial-of-service attack on your entire life, don't you?
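The forgery point is easy to demonstrate: an email's From header is just text chosen by whoever sends the message, with no authentication at this layer. A standard-library sketch (the addresses are made up):

```python
# The "From:" header is attacker-controlled text, so any monitoring
# system keyed to the apparent author can be pointed at a victim.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "victim@example.com"   # forged; nothing verifies this
msg["To"] = "anyone@example.com"
msg["Subject"] = "hello"
msg.set_content("Text crafted to trip the hypothetical depression filter.")

print(msg["From"])  # → victim@example.com
```

Flood a monitored inbox with messages like this and the "authorities" come for the person named in the forged header, not the sender.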


Amichail, you have missed your calling -- you shoulda been a hacker version of Jerry Springer. "Live! This afternoon! Gay Midget Nun LISP programmers and the System Administrators they love!"

Here's a better follow-up question: if Google currently analyzes emails to determine what advertisements to place, and it can be shown that similar email processing could prevent suicides, is Google liable for not either providing the service or notifying people that the service is not included?


And what if, by analyzing your email, Google aided in your suicide by serving advertisements that led you to what you used to commit suicide with? Google might show an advertisement like "Search eBay for 'commit suicide' now!" or "Find books on Amazon about 'commit suicide'!"


It could be worse. It could be Microsoft, and that little animated paperclip pops out. "Hey! Looks like someone is wanting to commit suicide today! Have you written your suicide note yet? Studies show that suicide notes written using AllIsLost letter-creating software are 22% more poignant! Thought about creating an online will? How about some overseas prescriptions at really low prices? Did you know that the nearest gun store is 7 miles NE of your location? Like directions? There's a Taco Bell on your way in case you're hungry. I'll just keep chatting down here until you finally off yourself."


Turns out many people have associated Clippy with suicide notes [+]. Something about Microsoft's 'help' just brings out those feelings!

[+] http://www.visar.com/AssistedSuicide.html


I have a feeling that the person receiving the email will be a lot better than your AI is at knowing if the person is depressed or suicidal.


But maybe he/she would not be willing to contact mental health professionals without your permission?


Maybe we should put cameras in your bathroom that monitor your bowel movement habits. When you're severely constipated with a bowel obstruction, should it call the ambulance for you?


Every time I see a post like this I'm sure amichail wrote it. Does that happen to you too?

I'm sure Google ads will show some health-club advertising alongside that kind of content in an email.


> Every time I see a post like this I'm sure amichail wrote it. Does that happen to you too?

Yes. I guess he has a severe case of collectivism, which is quite strange for someone hanging out here.


Why not? Have you read "1984"?


Double plus goodthinkful, comrade.


"Surely the lives saved would be worth any loss of privacy?"

Very scary mindset you've got there.


Hey, this is the modern world. We don't expect to take personal risk for anything. If there is any death at all from doing something, there's somebody out there who wants to take away our freedom to do it in the name of "safety". This is just the logical continuation of that trend (which has been going on for many decades, I might add).


Clippy pops up

It looks like you're going to commit suicide.

Would you like to:

- Buy some pills.

- Buy some ammo. ...


The estate tax is a better monetization plan.


What's wrong with suicide? All of us die at some point. I'd rather die at a point of my own choosing, so suicide is the best death option (for me!), and since I'm going to die, it's best if I (eventually) commit suicide. Now there's the problem of committing suicide "too early"... presumably the right cutoff is the point when life stops being worth living. Which then raises the problem of knowing when that is (i.e. teenagers getting dumped and assuming, erroneously, that life is no longer worth living). Thus, having a web app for calculating that point would be REALLY convenient (hint hint!)


You are the first hit in my suicide filter, and I didn't even need access to your mail client. Please pay up.

If you were being remotely serious about suicide, here is what I decided at a young age, when I first thought about the subject while trying to decide whether a religion was true: If this whole existence is worthless and meaningless, make the best of it while you are here. If this whole existence is part of something bigger, make the best of it while you are here. There really is no problem or ailment that makes suicide a viable option.


No, you can't create AIish algorithms to detect depression. Not with the current state of the art in AI.


Heh, this is kind of a bad idea for a lot of reasons.

But rest assured, the NSA already monitors all domestic internet traffic.


You sure have a lot of faith in "the authorities". Should they be able to forcibly medicate you into a vegetative state for life so that you never kill yourself?

A better use of technology is to make people's lives better so they don't feel suicidal.




