
> In fact, a tremendous amount of money is being invested right now to reduce/control hallucinations.

Ok, I can only laugh at this.

Those "hallucinations" are the one main feature neural networks. It's imposed by their fundamental structure, and what gives it all if its usefulness... when the people making use of it knows what they are doing.

The way to reduce/control the hallucinations of a neural network is to unplug it from the power outlet.

The hallucinations are a consequence of sampling from a probability distribution over all possible tokens at each step. There are a lot of very smart people trying to figure out how to sample for generative purposes while "grounding" the model so it hallucinates less. It's an active area of research.
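A minimal sketch of that sampling step, in Python with made-up numbers (the function and logits here are illustrative, not any particular model's API): the model emits logits over the vocabulary, and the next token is drawn from the softmax of those logits. A higher temperature flattens the distribution, so low-probability (i.e., implausible) tokens get picked more often, which is one route by which hallucinated content enters the output.

    import numpy as np

    def sample_next_token(logits, temperature=1.0, rng=None):
        """Draw one token id from a distribution over the vocabulary.

        logits: raw scores, one per vocabulary token (hypothetical values).
        temperature > 0 sharpens/flattens the distribution; higher values
        make unlikely tokens more probable.
        """
        rng = rng or np.random.default_rng()
        scaled = logits / temperature
        # Softmax with a max-shift for numerical stability.
        probs = np.exp(scaled - scaled.max())
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    # Toy 5-token vocabulary with invented logits.
    logits = np.array([2.0, 1.0, 0.5, -1.0, -3.0])
    print(sample_next_token(logits, temperature=0.7))  # almost always token 0
    print(sample_next_token(logits, temperature=2.0))  # tail tokens show up far more often

Grounding techniques, roughly speaking, try to reshape or constrain that distribution (or the context feeding it) so the high-probability mass sits on tokens supported by evidence, rather than eliminating sampling altogether.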


Absolutely agree. I use LLMs for their hallucinations and barely for their recall of "facts".



