Yep, as a medical student, I strongly believe we need explainable AI for healthcare, and that AI should only empower clinicians to make the final call, not make medical decisions autonomously. I really hope other players in the space like Hippocratic AI [1] and Google Med-PaLM 2 [2] understand this as well.
Do you even need AI? What about just a basic system where you type in some keywords like the symptoms presented, it queries a known good source like the Physicians' Desk Reference or something, and returns results with a likelihood score matching your keywords? Such a system seems like it would be useful and also would not need an AI model or expensive training to create.
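The keyword-lookup idea above could be sketched in a few lines. This is a minimal illustration with made-up placeholder entries (not real medical reference data, and the condition/symptom pairs are assumptions for demonstration only); the "likelihood score" here is just Jaccard overlap between the query keywords and each entry's keywords.

```python
# Hypothetical sketch of a keyword-lookup system with a match score.
# The reference entries are illustrative placeholders, NOT real medical data.

REFERENCE = {
    "common cold": {"cough", "sneezing", "runny nose", "sore throat"},
    "influenza": {"fever", "cough", "fatigue", "body aches"},
    "allergic rhinitis": {"sneezing", "runny nose", "itchy eyes"},
}

def lookup(symptoms):
    """Return (condition, score) pairs ranked by keyword overlap (Jaccard index)."""
    query = {s.lower() for s in symptoms}
    results = []
    for condition, keywords in REFERENCE.items():
        overlap = query & keywords
        if overlap:
            # Score = |intersection| / |union|, so more shared keywords rank higher
            score = len(overlap) / len(query | keywords)
            results.append((condition, round(score, 2)))
    return sorted(results, key=lambda r: r[1], reverse=True)

print(lookup(["fever", "cough"]))
```

Of course, a real system would need a vetted source and careful keyword normalization, but the ranking mechanism itself needs no trained model.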
Please be careful with things like this before someone gets hurt or worse.
I have friends in the medical profession who have tested GPT-4, and it’s good, but not quite good enough for them.
I wouldn’t touch something that claims to give actual medical advice like this with a twenty-mile stick.