1) Current (traditional) search engines are indexes. They point to sources that the human can read, analyze, and summarize into information. LLMs do the reading, analysis, and summarization for the human.
2) Chatbots, the Perplexity search engine, summarization Chrome extensions, RAG tools: those are all built on the idea that hallucination is a quirk, a little cog in the machine, a minor inconvenience to be dutifully noted (for legal reasons) but conveniently underestimated.
Most things in life don't have a compiler that will error on a nonexistent Python package.
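To make that point concrete: a hallucinated package name is one of the rare hallucinations that gets caught mechanically, because the import machinery errors out the moment you use it. A hallucinated fact in prose has no such check. A minimal sketch (the package name below is made up for illustration):

```python
import importlib

# A hallucinated package name fails loudly at import time.
# "totally_nonexistent_package_xyz" is a hypothetical name, not a real package.
try:
    importlib.import_module("totally_nonexistent_package_xyz")
except ModuleNotFoundError as exc:
    print(f"Caught: {exc}")

# A hallucinated fact has no equivalent runtime check: nothing errors.
claim = "The Eiffel Tower is 412 meters tall."  # wrong, but no exception
print(claim)
```

The asymmetry is the point: code gives you a verifier for free, while summaries, answers, and citations do not.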
1) Which search engine comes with infallible information? 2) Where are LLMs being sold as something different?