
How Artificial Intelligence Could Help Diagnose Mental Disorders - jonbaer
http://www.theatlantic.com/health/archive/2016/08/could-artificial-intelligence-improve-psychiatry/496964/?single_page=true
======
tcj_phx
The problem with Psychiatry is that it treats symptoms, not causes. The hope
represented by this article is that by catching symptoms earlier, or by better
pinpointing linguistic markers, vulnerable patients can be treated earlier
with more-appropriate medications.

I think that it is much more important to find causes behind "mental
disorders" than to describe them better.

From the article:

> According to Schwoebel, it took over 10 primary-care appointments before his
> brother was referred to a psychiatrist and eventually received a diagnosis.
> After that, he was put on one medication that didn’t work for him, and then
> another. In the years it took to get Schwoebel’s brother diagnosed and on an
> effective regimen, he experienced three psychotic breaks. For cases that
> call for medication, this led Schwoebel to wonder how to get a person on the
> right prescription, and at the right dose, faster.

What are the conditions that led this person to have a psychotic break? They
didn't care, and just put Mr. Schwoebel's brother on maintenance medication
for life. I suspect that such patients are usually 'exhausted' [1].

[1]
[https://news.ycombinator.com/item?id=12331317](https://news.ycombinator.com/item?id=12331317)
(my comment from this past weekend)

"How Psychiatry Lost Its Way: Identifying new mental disorders and their
"curse" had become a productive industry; it is also bad medicine" [2], by
Paul McHugh M.D., is very helpful for giving context to the mental health
industry's status quo.

[2] [https://www.commentarymagazine.com/articles/how-psychiatry-lost-its-way/](https://www.commentarymagazine.com/articles/how-psychiatry-lost-its-way/)

"The Doctor Isn't In" (same author) is also quite helpful:
[https://news.ycombinator.com/item?id=10315705](https://news.ycombinator.com/item?id=10315705)

~~~
dragonwriter
> The problem with Psychiatry is that it treats symptoms, not causes.

No, the problem is that we know a lot more about symptoms than causes, so we
diagnose by and treat symptoms, which may or may not address the causes.
(Many treatments are chosen because they target something that is at least a
plausible hypothesized cause of the symptoms, and the fact that they relieve
the symptoms is at least suggestive evidence both that the hypothesized cause
is correct and that it is being treated.)

This is also far _less_ different from many other areas of medicine than
critics of psychiatry make it out to be.

~~~
tcj_phx
Robert Whitaker [1] has looked into the "mental health" industry and argues
that today's patients do worse on their medications than their unmedicated
predecessors did. It used to be that people had "episodes"; now they have
chronic diseases from which they never recover (unless they stop taking their
pills).

[http://www.MadInAmerica.com/](http://www.MadInAmerica.com/)

~~~
keyboardhitter
Thank you for this link. this seems right up my alley - I'm trying to avoid
being an uninformed critic. i've had bad experiences in my life with
psychiatry, been in and out of the system since i was 12 and witnessed a LOT
of ... let's say bad practices. do you have any other suggested reading /
sites for someone in my position?

~~~
tcj_phx
> Thank you for this link. this seems right up my alley - I'm trying to avoid
> being an uninformed critic.

You're welcome, thanks for responding.

> i've had bad experiences in my life with psychiatry, been in and out of the
> system since i was 12 and witnessed a LOT of ... let's say bad practices.

Having seen the inside of this world (as a visitor), I can sort of imagine
what you've been through. People do have problems; the system does not
recognize that it does not always help.

> do you have any other suggested reading / sites for someone in my position?

It depends on what you're interested in. I switch to this account whenever I
have something to say about mental health, so there might be some other links
there. (I just surveyed my comments; there aren't too many. I post the link to
Robert Whitaker's work fairly regularly, but there are a bunch of others,
depending on the context...)

[https://news.ycombinator.com/item?id=11974769](https://news.ycombinator.com/item?id=11974769)
\- this has a link about the monopolists' approach to medicine. I posted that
comment from my phone and didn't feel like fishing out the full link, but the
article is quite helpful for illuminating the source of bias in the "Medical"
world.

[https://news.ycombinator.com/item?id=10974230](https://news.ycombinator.com/item?id=10974230)
\- I responded to this person's comment to thank them for their link about
SSRIs.

What are you interested in? Anti-psychiatry? Self-hacking mental health? The
history of the Medical Monopoly?

------
vonnik
One worry here is that AI only classifies as well as the labeled datasets it
trains on, and my concern is the way humans apply labels to other humans.
Diagnosing mental illness is notoriously messy: DSM-IV descriptions of various
illnesses, including borderline personality disorder, are disputed, not
settled. If we can catch psychosis early, great -- but what counts as
psychosis?
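The label-quality point can be made concrete with a toy sketch (all numbers
here are made up for illustration): if the recorded diagnoses used as labels
are wrong some fraction of the time, even a hypothetically perfect model
cannot score above the labels' own accuracy when evaluated against them.

```python
import random

random.seed(0)

n = 10_000
# ground-truth states, which in reality we never observe directly
true = [random.randint(0, 1) for _ in range(n)]

# a hypothetical "perfect" model that always recovers the true state
preds = list(true)

# but the recorded diagnoses (our labels) are wrong 30% of the time
flip_rate = 0.30
recorded = [1 - y if random.random() < flip_rate else y for y in true]

# measured accuracy is capped near 1 - flip_rate, not 1.0
measured = sum(p == y for p, y in zip(preds, recorded)) / n
print(f"measured accuracy of a perfect model: {measured:.2f}")
```

So a model that looks mediocre on messy labels may be fine, and a model that
looks great may just be reproducing the labelers' mistakes -- you can't tell
from the score alone.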

------
elizabethanera
I was going to write something about how ominous it is to defer to a machine's
judgment of a person's sanity, but the machine of psychiatry to which we
already defer is no less subject to the weaknesses of that kind of deference.

~~~
T0T0R0
You've identified that a group of humans will make mistakes, possibly as often
as machines do, and that's not incorrect.

The real hazard, on the other hand, is the unrestrained laziness humans might
adopt, once the norm of machine assessment matures. Always trusting the
machine answer. Never (or rarely ever) contemplating the judged person.
Optical sorting. Rubber stamps.

No longer simple stigma. But endemic callousness, as people stop trying to
help anyone, because machines now adjudicate who to simply avoid, until those
untouchables rectify their unit tests.

The isolation induced by a machine-endorsed judgement could be far more rigid
than in times when people had to get to know you, to feel you out -- a process
that could be therapeutic for at least one side of the conversation. Maybe
that goes away now.

------
doctorstupid
And we're just feeding our devices all of this information, permitting its new
owners to freely build psychological profiles of us.

------
dbreunig
In a recent Harvard study on predicting depression from Instagram posts with
trained algorithms, the authors reported very high depression rates among the
Mechanical Turk workers they recruited. We should be doing more to investigate
this, since decisions made on Mechanical Turk fuel much current work.

Expanded thoughts here: [https://medium.com/@dbreunig/do-algorithms-find-depression-or-cause-depression-2e047ef84cda#.n01ls254a](https://medium.com/@dbreunig/do-algorithms-find-depression-or-cause-depression-2e047ef84cda#.n01ls254a)

------
lighttower
Yes. Our startup ran into these problems. Even worse, you don't even know
whether the labels in your ground-truth set are meaningless.

