> For example, when you know the context but can only give a poor explanation of what you're after. Googling will take you nowhere, while LLMs will give you the right answer 95% of the time.
This works nicely when the LLM either has a large knowledge base to draw upon (it may know the formal terms for what you're trying to find, which you might not) or can generate good search queries and summarize the results quickly, with an actual search engine in the loop.
Most large LLM providers offer this, and even something like OpenWebUI can have search engines integrated (though I'll admit that smaller models kinda struggle: I couldn't get much useful stuff out of DuckDuckGo-backed searches, nor Brave AI searches, but it might have been an obscure topic).
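The "search engine in the loop" idea boils down to three steps: have the LLM turn your vague description into a precise query, run that query through a real search engine, then have the LLM summarize the snippets. A minimal sketch, where `llm()` and `web_search()` are hypothetical stand-ins for a chat-completion API and a search API (the kind of wiring OpenWebUI does for you):

```python
def llm(prompt: str) -> str:
    # Stand-in for a real chat-completion call (hypothetical).
    if prompt.startswith("Rewrite"):
        return "onomatopoeia definition linguistics"
    return "Summary: the formal term you were after is 'onomatopoeia'."

def web_search(query: str) -> list[str]:
    # Stand-in for a DuckDuckGo/Brave search call; returns result snippets.
    return [f"snippet about {query}", f"another snippet about {query}"]

def answer(vague_question: str) -> str:
    # Step 1: LLM reformulates the vague description into a search query.
    query = llm(f"Rewrite as a search query: {vague_question}")
    # Step 2: run the query through an actual search engine.
    snippets = web_search(query)
    # Step 3: LLM summarizes the retrieved snippets into an answer.
    return llm("Summarize these results:\n" + "\n".join(snippets))

print(answer("that thing where a word sounds like what it means"))
```

The point of the indirection is step 1: a model that knows the formal term can bridge your fuzzy phrasing to a query a search engine can actually do something with.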