
The Gell-Mann Amnesia effect is strong.

LLMs constantly hallucinate JS built-in functions and build code with massive foot-guns, which I catch, but it's probably fine for pizza recipes ... right?
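For a concrete (invented) illustration of the kind of code I mean: sortBy below is not a real Array built-in, which is exactly the hallucination, and the default behaviour of sort is the foot-gun.

    // Hallucinated built-in: Array.prototype.sortBy does not exist in
    // standard JavaScript (it's a lodash method), but models emit it anyway.
    // users.sortBy(u => u.name);   // TypeError: users.sortBy is not a function

    // The foot-gun: Array.prototype.sort mutates the array in place and,
    // with no comparator, compares numbers as strings.
    const scores = [10, 2, 1];
    scores.sort();                                     // [1, 10, 2]  (lexicographic, and scores is mutated)
    const sorted = [...scores].sort((a, b) => a - b);  // [1, 2, 10]  (copy first to leave the source array alone)
    console.log(sorted);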



Maybe it is me being pedantic, but AIs don't hallucinate. They make stuff up (it is pretty much all they do), but to claim that is a hallucination attributes a trait to them that they don't have (shoot, calling them AI does the same thing).


That ship has sailed.

> hallucinate - When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information

- Cambridge dictionary

> hallucination - computing : a plausible but false or misleading response generated by an artificial intelligence algorithm

- Webster's dictionary


I dunno, I think even the expression “make stuff up” is anthropomorphising.


So long as you don't mind glue on them, apparently.

(I've never seen this category of mistake on ChatGPT, but it does have a really hard time understanding that I want metric, not imperial.)



