Hacker News
I stumbled upon LLM Kryptonite and no one wants to fix this model-breaking bug (theregister.com)
2 points by nbmk on May 24, 2024 | 3 comments


It is well known that it is impossible to eliminate hallucination in LLMs.

Here is a formal proof.

https://arxiv.org/abs/2401.11817
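For anyone who doesn't want to read the whole paper: the core of the argument, as I understand it, is a diagonalization (my notation below, not the paper's exact formulation). Assume, as the paper does, that LLMs are total computable functions from prompts to outputs and that the set of all LLMs is computably enumerable. Enumerate the LLMs as h_1, h_2, ... and the prompts as s_1, s_2, ..., then define a ground truth f by

  f(s_i) := "0"  if h_i(s_i) ≠ "0"
  f(s_i) := "1"  if h_i(s_i) = "0"

f is computable under the assumptions above, yet f(s_i) ≠ h_i(s_i) for every i, so every LLM disagrees with this ground truth (i.e. hallucinates) on at least one prompt. I believe the paper strengthens this to infinitely many prompts, but even the one-prompt version means no amount of training can drive the failure rate to zero.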

Hopefully this article helps spread the word, but hallucination is something that will have to be mitigated to within acceptable limits rather than fixed outright.

LLMs are NLP, not NLU.


> NLU

Natural Language Understanding

(I had to look it up. >_>)


> It would appear that the biggest technological innovation since the introduction of the world wide web a generation ago has been productized by a collection of fundamentally unserious people and organizations who appear to have no grasp of what it means to run a software business, nor any desire to implement any of the systems or processes needed to effect that seriousness.

Yes, venture capitalists. They've been doing that since there was an Internet to Venture Capital in.



