
“…the outcomes of Large Language Models are not designed to be true — they are merely designed to be statistically likely.”

yep!




A search engine does not return a true answer. It returns a set of possibilities ranked by likelihood.

An LLM is just an aggregation of search results.
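The contrast being drawn here can be sketched in a few lines of Python. This is purely illustrative (the vocabulary and probabilities are invented for the example; a real LLM scores tens of thousands of tokens with a neural network): a generator-style system commits to one weighted sample, while a search-style system returns every candidate, ranked.

```python
import random

def sample_next_token(probs, rng):
    """LLM-style: commit to ONE answer, weighted by probability.

    The result is statistically likely, not guaranteed true.
    """
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

def rank_results(probs):
    """Search-style: return EVERY candidate, ranked by score.

    The reader, not the system, picks which one is true.
    """
    return sorted(probs, key=probs.get, reverse=True)

# Toy distribution for "The capital of France is ___" (made up for illustration)
probs = {"Paris": 0.90, "Lyon": 0.06, "Marseille": 0.04}

rng = random.Random(0)
print(sample_next_token(probs, rng))  # one likely answer, occasionally a wrong one
print(rank_results(probs))            # all candidates, in ranked order
```

The toy makes the structural difference visible: `sample_next_token` can occasionally emit "Lyon" and presents it with the same confidence as "Paris", whereas `rank_results` always hands you the full ranked list to judge for yourself.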


No, that misses the nuance entirely. An LLM returns one answer that it thinks is likely to be true. A search engine returns many answers, one of which is either certainly true or (depending on what you're searching for) the literal source of truth.

There is a world of difference between these two results.

Ultimately the problem is that a single repository of all verifiably true human knowledge just doesn't exist, and both search engines and LLMs are copes for dealing with this fact.


a set of possibilities ranked by likelihood… written by humans who have the capacity to be interested in truth.



