This is what bothers me about most voice assistants. I think the Amazon one may have finally gotten an upgrade to modern LLM capabilities? I don't know about the Google one.
I assume the cost is too high, but I don't expect ChatGPT / Grok / Claude level of knowledge from a voice assistant LLM. If they can run a drastically smaller model that doesn't cost an arm and a leg at scale, I would be okay with that. They would definitely have to cache some of the responses when viral events happen.
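The caching idea could be as simple as a TTL cache keyed on a normalized query, so that a burst of near-identical questions about a viral event hits the cache instead of the model. A minimal sketch (all names here are hypothetical, not any vendor's actual API):

```python
import time


class TTLCache:
    """Tiny TTL cache for assistant responses (illustrative sketch)."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # normalized query -> (expiry, response)

    @staticmethod
    def _normalize(query):
        # Collapse casing, whitespace, and trailing punctuation so
        # "Who won the game?" and "who won the game" share one entry.
        return " ".join(query.lower().split()).rstrip("?!. ")

    def get(self, query):
        entry = self._store.get(self._normalize(query))
        if entry is None:
            return None
        expiry, response = entry
        if time.monotonic() > expiry:
            del self._store[self._normalize(query)]
            return None
        return response

    def put(self, query, response):
        self._store[self._normalize(query)] = (
            time.monotonic() + self.ttl,
            response,
        )


def answer(query, cache, model_call):
    """Serve from cache when possible; only call the model on a miss."""
    cached = cache.get(query)
    if cached is not None:
        return cached
    response = model_call(query)
    cache.put(query, response)
    return response
```

During a viral spike, the expensive `model_call` runs once per distinct question per TTL window; everything else is a dictionary lookup. Real deployments would add freshness checks, but the cost argument is the same.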