It would seem to me that there's a different class of knowledge at play with knowing e.g. that someone is speaking to you. I mean, Google is pretty good at answering questions you type into the search bar these days, but it doesn't give me the creepy feeling that ChatGPT gives me, precisely because its answers seem to be based on association and relation. Google does not raise my neck-hairs, and ChatGPT does; I believe it is because the latter does seem to understand indexicals, whereas the former does not: when I say 'you' to ChatGPT, it reacts *as though it understood that it was being spoken to*.
Its powers of imagination (e.g. simulating a linux box) are pretty amazing, but it's the eerie feeling that it's playing along because I asked it to that, for me, puts my jaw on the floor.
I guess I'm groping towards a reductive theory of the uncanniness that ChatGPT occasions --- that it's the indexicals.
Basically I want to know what the magic looked like that took it from a mere distinction between two things to the (apparent, pseudo-) understanding that it itself is one of those two things. Self and other. Do you follow?
Based on the answer at throwaway81523's link, it appears to me that the answer is this: ChatGPT has a drive to process input, but no means to process it, and the programmer gives ChatGPT a processing template that it can use to do so. And that particular processing template says that processing input using this template means that the processor's referent is the ID "ChatGPT".
So the "other" is the input, while the "self" is the processing template. And the actual ChatGPT is a machine-learning database with an impulse to apply a pre-loaded processing template to input. The ins and outs of the processing template and the machine-learning database (i.e. the actual code) with respect to driving both the output to the user and the adaptation of the database and/or template, I do not know.
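To make the idea concrete, here is a minimal sketch of that "processing template" picture. Everything in it is hypothetical: `language_model` is a stand-in for the actual trained model, and `SYSTEM_TEMPLATE` is an invented example, not OpenAI's real prompt. The point is only that the template, applied to every turn, is what fixes the referent of the indexical "you".

```python
# Hypothetical sketch: the model itself is a stateless text-continuation
# function; the deployed assistant wraps each user input in a pre-loaded
# template that binds "you" to the identity "ChatGPT".

def language_model(prompt: str) -> str:
    # Stand-in for the actual ML model; it just echoes its prompt here.
    return f"<continuation of: {prompt!r}>"

# Invented example template -- not the real system prompt.
SYSTEM_TEMPLATE = (
    "You are ChatGPT, an AI assistant. "
    "The text after 'User:' is input addressed to you.\n"
    "User: {user_input}\n"
    "Assistant:"
)

def assistant(user_input: str) -> str:
    # The template, not the model, supplies the self/other framing:
    # the input is the "other", the template's persona is the "self".
    return language_model(SYSTEM_TEMPLATE.format(user_input=user_input))

print(assistant("Are you listening to me?"))
```

On this sketch, the apparent grasp of "you" needn't live in the model's weights at all; it can be injected from outside, once per turn, by plain string templating.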
Please allow me to think further on this matter