There haven't been any social ruptures over the possible sentience of whales hunted to near extinction, dolphins caught in fishing nets, or of cows and pigs. There will be no uproar over the feelings of chatbots.
Those situations aren't really comparable. Nobody has conversations with dolphins or whales in human languages, whereas "talking" with a chatbot is fairly trivial and is becoming a near-universal experience, at least in the global north.
Is written language the only way you can conceive of empathizing?
Besides, people seem more than happy to ignore the suffering of fellow humans, even as they profit from it. There would be no cheap electronics without great human suffering in the mining industry, for a quick example.
I wasn’t trying to emphasize the written aspect, I was trying to emphasize that we interact with the chatbots in a way that’s natural to us.
And yes, we do. It’s very easy to do that precisely because we don’t talk with them. We don’t have to see what they live through or how they feel about it. If every electronic device came with an hour or two of voice/video call with even one of the people who slaved away in the factory to build it, that might change.
>'social ruptures' between people who disagree on its sentience
You think?
There's already real bad ‘social ruptures’ between people who disagree on the sentience of the actual primates they are electing to leadership positions.
The arguments could be settled if people were to carefully define the words they're using, along with any controversial words in those definitions. Doing so would show either that they're talking about two different things or, as is more often the case here, that they truly don't know what they're talking about.
I don’t understand, why would the overnight emergence of a handful of groups claiming Minority Report-level foreknowledge be a cause for social awkwardness?