Hallucination can't be solved because bogus output is categorically the same sort of thing as useful output.

It has no world model. It doesn't know truth any more than it knows bullshit; all it has is a statistical relationship between words.
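
As a toy sketch of that point (the numbers are invented and no real model is queried; this just mimics the next-token step): candidate tokens get scores, the scores get normalized into probabilities, and a false continuation passes through exactly the same arithmetic as a true one.

    import math

    # Hypothetical logits a trained model might assign after the prompt
    # "The capital of Australia is". Invented numbers, for illustration only.
    logits = {
        "Canberra":  2.1,   # the true continuation
        "Sydney":    1.8,   # a plausible-sounding false continuation
        "Melbourne": 1.2,
    }

    def softmax(scores):
        """Turn raw scores into a probability distribution."""
        exps = {tok: math.exp(s) for tok, s in scores.items()}
        total = sum(exps.values())
        return {tok: e / total for tok, e in exps.items()}

    probs = softmax(logits)
    for token, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        print(f"{token}: {p:.2f}")

    # Prints roughly: Canberra: 0.47, Sydney: 0.34, Melbourne: 0.19.
    # Nothing in the computation distinguishes true from false; sampling
    # "Sydney" would be a hallucination, but it is not a different kind
    # of event than sampling "Canberra".

The point of the sketch: there is no separate code path for errors. "Hallucination" is just the same sampling process landing on a token we happen to know is wrong.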