> a crappy but cost-effective approach with a high margin of error may rise to prominence.
The tech startup that eventually builds it will call this "efficiency". That kind of 'solution' is exactly what capitalism produces.
> I would love to have a real-time dashboard/HUD with measures of disorganized speech patterns or affective intensity
> It would be hard to make it not distracting
Not only would it be distracting, it could also bias you in unexpected ways.
> I would love to have a real-time dashboard/HUD with measures of disorganized speech patterns or affective intensity,
I already don't trust a lot of the mental health industry because of the very bad experiences[1] I've had in the past. The easiest/fastest way to guarantee I never visit a psychiatrist again is to start using that kind of "AI" tech without first showing me the source code. "Magic" hidden algorithms are already a problem in other medical situations like pacemakers[2] and CPAP[3] devices.
> make up for the trade-off of not being in the room
Maybe what you need isn't some sort of "AI" or other tech buzzword. It sounds like you need better communication technology that doesn't lose as much information.
--
On the more general topic of "AI in psychiatry", I strongly encourage you to play the visual novel Eliza[4] by Zachtronics. It depicts exactly the fear you describe, a cheap, high-error-rate system, with an additional twist: the same system also reduces your role to a "gig economy" job.
[1] a brief description of one of those experiences: https://news.ycombinator.com/item?id=26035775
[2] https://www.youtube.com/watch?v=k2FNqXhr4c8
[3] https://www.vice.com/en/article/xwjd4w/im-possibly-alive-bec...
[4] https://www.zachtronics.com/eliza/