
California tests mental health app which tracks patients' usage of their phones - cfitz
https://gizmodo.com/california-tests-mental-health-app-that-tracks-everythi-1835583585
======
ziddoap
> _California mental health officials have been in talks with two startups
> about developing a digital system that identifies when a smartphone user is
> about to have an emotional crisis, characterizing it as a “fire alarm,”_

This whole concept of predictive software for mental health (or otherwise) is
a new level of disconcerting.

In an era where both of the following are true, we should not be pursuing
these widespread predictive applications:

- Companies and agencies are unable (or simply unwilling) to properly secure
their data.

- Systemic inaccuracies persist in what is still a nascent field.

We're struggling to hold companies accountable for the data they lose. Some of
the biggest companies and agencies in the world are suffering data breaches
(or allowing inappropriate access to data, etc.). We are only just learning
how to identify and remove bias from predictive software. And yet they want
to roll this out?

> _But it’s still unclear how effective an algorithm can be for someone at
> their most distressed, and in the meantime, the distressed are providing a
> tech company with its most desired asset—a wealth of deeply intimate data._

That paragraph, at least, sums up how I feel.

Noble goal, I think. Terrifying execution.

