These people need to be given a button that calls an inference URL with just text. When you realize that's all a "model" is doing, it's easy to understand that it's not sentient.
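To make that concrete, here's a minimal sketch of what such a button's backend would do: one HTTP POST with text in, text out. This assumes an OpenAI-style chat-completions endpoint; the URL, model name, and API key below are placeholders, not any specific provider's API.

```rust
// Minimal sketch: "a button that calls an inference URL with just text".
// Assumes an OpenAI-style chat-completions endpoint; the URL, model name,
// and key are hypothetical placeholders.
// Cargo.toml: reqwest = { version = "0.12", features = ["blocking", "json"] },
//             serde_json = "1"

use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // Text goes in...
    let body = json!({
        "model": "example-model",
        "messages": [{ "role": "user", "content": "Hello" }]
    });

    let resp = client
        .post("https://example.com/v1/chat/completions") // hypothetical endpoint
        .bearer_auth("YOUR_API_KEY") // placeholder credential
        .json(&body)
        .send()?;

    // ...text comes out. That round trip is the entire interaction.
    println!("{}", resp.text()?);
    Ok(())
}
```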
If you go to the "MyBoyfriendIsAI" subreddit, or whatever it's called now, you'll see that many people claim to understand perfectly well how it works, some of them even being software developers themselves, yet they still describe what they feel as "love", even though they know it's just numbers being activated in different ways.
I'm not sure how to explain it either, for the folks who seem to understand yet "believe" anyway. I've also stopped caring much about it; if they say they feel "love", then who am I to say it is or isn't real? They feel what they feel, and it's as real for them as for anyone else, regardless of what the thing they're loving actually is.
I think that's almost unfair, to say someone can't feel feelings without being labeled as part of a "mental health crisis". Just because you and I don't understand it doesn't mean it's inherently bad. I think it probably could be bad, but not simply because I don't understand how they're feeling those feelings. I wouldn't label those people "sick"; that feels borderline disrespectful.
I think it’s pretty fair, and I was not being quite as absolutist as you’re making my statement out to be. I said that, by and large, it is a possible mental health crisis developing (or an existing one being expressed, which I omitted). There are other possibilities, most of which I would say fall under “this person just doesn’t understand what an LLM is/isn’t and why it can’t engage in a consensual relationship.” I also did not call anybody “sick.” That’s a very loaded term when we are talking about mental wellness, and one I would never use in this context. This may all feel nitpicky to you, but the way I’m talking about this issue is intentional. All that being said, I can acknowledge that it was kind of glib, that it is my stance based on pretty clear evidence that you can’t have a romantic relationship with a large language model, and that I’m happy to elaborate on that stance.
An LLM cannot love somebody because it is not a person or otherwise sentient/capable of a relationship. You cannot be in love with it. Loving your dog is one thing; being in love with your dog is another. This is because nearly everyone understands that that kind of love cannot be reciprocated and that a human being cannot be in romantic love with a dog. A dog, for its part, can’t even consent to that relationship. Neither can a computer (possibly “yet”).
I would say, generally speaking, somebody who does not understand that an LLM is incapable of reciprocating love (or any real “feelings” indicating a real relationship), and who has been told what an LLM is (and understands it more or less), is likely somebody who needs to talk to a therapist. If I said this about somebody being in love with their pet, nobody would call it “borderline disrespectful.”
> you'll see that many people claim to understand perfectly well how it works, some of them even being software developers themselves, yet they still describe what they feel as "love"
This statement is what prompted me to comment. Like I said above, if someone knows what an LLM is (and presumably isn’t), then it’s very concerning that they still believe a romantic, consensual, reciprocated relationship is possible. If you hadn’t included that part, then I would say “it can also be an education problem.” But the premise you set entirely removes any need to qualify that and makes this situation all the more concerning. Your phrasing makes me think you believe that makes it better, but IMO it makes the situation worse.
For emphasis: you established that these people more or less understand what they are interacting with, yet choose to pursue a “relationship” with an LLM anyway. This is incredibly troubling behavior in this context, with far-reaching mental health implications.
Let me just ask you point blank: do you think LLMs are sentient/akin to people? Do you think someone is capable of being in a loving, healthy relationship with an LLM today? Because to me it’s at best a potentially harmful misunderstanding that can be clarified with education, and at worst… well, like I said, the possibilities can be very deeply troubling. But ultimately my point is it can’t be a real, consensual, reciprocated relationship. It simply can’t. That’s not “lack of understanding,” that’s reality.
> Let me just ask you point blank: do you think LLMs are sentient/akin to people?
I think you have to ask those questions of someone else, somewhere else. I'm not saying I'm in love with an LLM, or even that I understand the people who say they are; you're gonna have to ask them those questions. I was merely describing the thoughts and writings of others, and my perspective on what I've read. I don't have personal experience of those feelings.
And seemingly it's the same for you, and neither of us can tell another human "You cannot be in love with it"; that's just not how feelings work. You can say you don't understand it, ask them questions about it, or whatever, but you cannot prescribe what feelings they should or shouldn't have.
I think I have a moderate understanding of how LLMs work; I'm currently building my own GPT-OSS implementation in Rust and CUDA, so I like to think I know bits and bobs about it. But even so, I'm not going around telling people what is or isn't true in regards to their feelings, especially not when I have zero experience of what they're going through. You might want to take a step back and ponder whether it would be wise for you to do the same.
> But ultimately my point is it can’t be a real, consensual, reciprocated relationship.
I do agree with that, but that doesn't mean someone can't still feel like they're in love with something, even though they know it cannot be reciprocated. If they feel that they're in love with something, then that's what they're feeling.
At the risk of being blunt, I think a lot of that comes across as hand-wavy and dismissive. These read to me like vaguely emotional appeals designed to say I lack empathy. I don’t feel like you’re engaging with most of the questions I asked at all.
No, it's true, I didn't, because the questions seemed to be aimed at people who currently think LLMs are sentient, or who currently feel like they're in love with LLMs, which I don't and I'm not, so I skipped them.
Let's try again, if you want. What specific questions do you want me to answer?
I asked the relevant questions I had. You can have an opinion on something you don’t actively participate in, man; you’ve been expressing one up until this point, after all.
If you don’t want to answer, so be it, but we’re now entering “bad faith” territory, so I’m not interested in participating further. Have a good one.