
Okay, but if you asked me the same thing and I used, or gave you the name of, a library that does not exist (although the name comes close), would you say I am hallucinating? If not, then how can we apply the term to AI when we cannot even apply it to me, a human?



OK, personally, if a friend recommended a library that didn't exist, I would reply "you made that up".

So, ChatGPT is "making stuff up".

To me, "hallucinating" means much the same thing as "making something up", so the term works perfectly well for me.

I mean, the definition on Google at least is: "an experience involving the apparent perception of something not present."

Isn't that what's happening? ChatGPT is experiencing (writing code using) the perception of something (the library) that is not present.



