>Psychologist Sarita Robinson at the University of Central Lancashire, UK, says that hallucinations are common when people are in isolation, usually occurring if there is also sensory deprivation, such as being in a dark room.
Are AI model hallucinations somehow linked to this? Does a computer AI model also need some sort of socializing?
That's an interesting conjecture. Many of the attributes of AI that people object to (hallucination, sycophancy, psychosis, plagiarism, etc.) originate from human behavior. We see ourselves in the system and, in a way, are the ghost in the machine. So yeah, it could be that AI systems need socialization, and that may be a job for humans in the future. I'm now waiting for the first Dr. Susan Calvin.
Congratulations on building and deploying that. You could register a proper domain name; pillu.ai is available. Also, `anyone who works with their brain` in the tagline sounds a bit harsh, or like sarcasm; you could change that. Just my input.
Because streets are not controlled environments. Planes have had autopilots for a long time because the air is a highly controlled environment, with professionals agreeing to cooperate and make logical decisions (most of the time).
After reading the article, I was wondering: are AI-generated images art? We showed them millions of images and they're reinterpreting those; in a way, that's art.
Oh, I had some really good times with the Ohm editor when I was evaluating Ohm. Although I couldn't use Ohm for that project, I do miss having such an editor/visualization for other parser toolkits.