The LLM constantly hallucinates JS built-in functions and builds code with massive foot-guns, which I catch, but it's probably fine for pizza recipes ... right?
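To make that concrete, here's a sketch of the kind of thing I mean (hypothetical examples, not from a real transcript): a "built-in" that doesn't exist, plus a classic foot-gun the model will happily generate.

```js
// Hypothetical illustrations, not actual model output:

// 1. A hallucinated built-in: Array.prototype has reverse(), String.prototype does not
"hello".reverse(); // TypeError: "hello".reverse is not a function

// 2. A genuine foot-gun: Array.prototype.sort() compares elements as strings by default
[10, 9, 1].sort();                // [1, 10, 9] -- lexicographic order, silently wrong
[10, 9, 1].sort((a, b) => a - b); // [1, 9, 10] -- numeric sort needs an explicit comparator
```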
Maybe it is me being pedantic, but AIs don't hallucinate. They make stuff up (it is pretty much all they do), but to claim that is a hallucination attributes a trait to them that they don't have (shoot, calling them "AI" does the same thing).
> hallucinate - When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information
- Cambridge Dictionary
> hallucination - computing : a plausible but false or misleading response generated by an artificial intelligence algorithm
- Merriam-Webster