IF (and ONLY if) you are fully aware of what you're doing and what you're talking to, an LLM can be a great help. I've been using a local model to help me work through some trauma that I've never felt comfortable telling a human therapist about.
But for the majority of people who haven't seriously studied psychology, I can very easily see this becoming extremely dangerous and harmful.
Really, that's LLMs in general. If you already know what you're doing and have enough experience to tell good output from bad, an LLM can be stupendously powerful and useful. But if you don't, you get output anywhere from useless to outright dangerous.
I have no idea what, if anything, can or should be done about this. I'm not sure if LLMs are really fit for public consumption. The dangers of the average person blindly trusting the hallucinatory oracle in their pocket are really too much to think about.
My personal view is that we humans are all too easily drawn into thinking "this would be a danger to other people, but I can handle it".
I believe that if you are in a psychological state such that the input from an LLM could pose a risk, you would also have a much reduced ability to detect and handle that risk, precisely because of that state.
That's how people dig deeper and deeper holes, and the deeper they get, the harder they are to exit. It's the "I'm immune to propaganda" person who then goes out and buys a Disney-themed shirt.
Therapy is a bit different though. It's meant to make you think, to get your mind unstuck from the loop or spiral it's in. Generally you know what's wrong, but your mind keeps dancing around it. There are a lot of elephants in the room. In that sense it doesn't matter much if it tells you to do something outrageous. It's not like you're actually going to do it; it's just food for thought. And even an outrageous proposition can break the loop. You start thinking: oh no, that's crazy. Maybe my situation isn't so bad.
The problem is when you start seeing it as an all-knowing oracle rather than a simulated blabbermouth with too much imagination.
In general it's been very positive for me anyway. And besides I use it on myself only. I can do whatever I want. Nobody can tell me not to use it for this.
Even if it just tells you (sometimes incorrectly) that nothing is wrong and simply sides with you like a friend, even that is good, because it takes the pressure off the situation so reality can kick in. That doesn't work when stress is dialed up to the maximum.
It also helps to be the one tuning the AI and the prompt. That keeps your mind in "evaluation mode", questioning its responses and trying to improve them.
But like I said before, to me it's just an augmentation to a real therapist.
Getting therapy is part of the job. Not sure about 'psychology as a discipline' but the therapists I know definitely get therapy and LLM exposure as well.
As I was told by one: the fact that you're able to tell your LLM to be more critical or less critical when you're seeking advice, that in itself means you're psychologically an adult and self-aware. I.e. mostly healthy.
She basically told me I don't look like a dork with my new DIY haircut. (Though I *did* complete CBT, so I kinda knew how to use the scissors.)
But they work with sick people. And that can mean a range of things depending on the clinical context. Usually sick things.
I think the main point people should take away is this: those who know the truth about psychology and psychotherapy know that it's a very vulnerable state, one in which the participant isn't in control, has little ability to discern, and is highly malleable.
If the guide is benevolent, you may move towards better actions, but the opposite is equally true. The more isolated you are the more powerful the effect in either direction.
People have psychological blind spots, some with no real mitigation possible aside from reducing exposure. Distorted reflected appraisal is one such blind spot, and it's one that cults have exploited for decades.
The people behind the Oracle are incentivized to make you dependent, malleable, cede agency/control, and be in a state of complete compromise. A state of being where you have no future because you gave it away in exchange for glass beads.
The dangers are quite clear, and I would imagine there will eventually be strict exposure limits, just as there are safe-handling limits for chemicals. It's not a leap to imagine harsh penalties within communities of like-minded, intelligent people who have hope for a future.
You either make choices toward a better future, or you are just waiting to die, or you move toward outcomes that impose that fate on everyone else.