
And I fear enthusiastic technology adopters will blindly accept these tools as gospel because “the computer says so,” forcing patients to do the same without the opportunity to give informed consent.

Marginalized groups will receive worse care because they weren’t in the training set.

Lazy doctors will get lazier.

Errors will be deeply buried in a bureaucracy with no available remedy because nobody at the ground level understands the technology.

Scheduling will become less transparent. Escalations will be automatically denied.

Every v1.0 of a technology has issues. How useful these tools become to patients will depend mainly on how much society forces them to be patient-value focused rather than provider-value focused, since a provider-value focus generally optimizes for cost rather than health and life outcomes.

The debate about how good these tools can become is basically over at this point. Humans really can't keep up. In a year we have gone from 'you must be dreaming' to 'the note is too verbose'. Soon we will be asking why the note is being generated at all, since every person in the healthcare chain will want their own specific view built from the raw source information, not a note that loses a massive amount of the ground truth. I think these tools will eventually, and likely soon, mean that healthcare won't be able to function on the old note system, just as you really can't provide services anymore without integrating with an EHR.

If the profit motive is the only motive, then it is possible, if not likely, that patient outcomes will suffer for all but the wealthiest as this technology optimizes for them and not for society as a whole. We have been seeing that trend in US healthcare for a while now, but this disruptive technology has a chance to upend it all. We will have to see how it actually goes, though, because this is just the beginning.


Should doctors also require patient consent to Google disease symptoms?

Not a blind AI fanboy, but:

Marginalized groups are already subject to bias because they are not in the doctors' own training set either.

If anything, using AI can help actively counter this bias.
