No, that misses the nuance entirely. An LLM returns one answer that it thinks is likely to be true. A search engine returns many answers, one of which is either certainly true or (depending on what you're searching for) the literal source of truth.
There is a world of difference between these two results.
Ultimately, the problem is that a single repository of all verifiably true human knowledge just doesn't exist, and both search engines and LLMs are copes for dealing with this fact.
yep!