I am still slightly worried about accepting emotional support from a bot. I don't know if that slope is slippery enough to end in permanent damage to my relationships, and honestly I'm not even willing to try and find out.
That being said, I am fairly healthy in this regard. I can't imagine how it would go for other people with serious problems.
A friend broke up with her partner. She said she had been using ChatGPT as a therapist. She showed me a screenshot in which ChatGPT wrote, "Oh [name], I can feel how raw the pain is!"
All humans want sometimes is to be told that what they're feeling is real. A sense of validation. It doesn't necessarily matter that much whether it's an actual person doing it or not.
Yes, it really, truly does. It's especially helpful if that person has some human experience, or even better, up-to-date training in the study of human psychology.
An LLM chat bot has no agency, understanding, empathy, accountability, etc. etc.
I completely agree that it is certainly something to be mindful of.
It's just that I found the people there were a lot less delusional than the people from e.g. r/artificialsentience, who always believed that AI Moses was giving them some kind of tech revelation through magical alchemical AI symbols.