LLMs are only 3-5 years old (NLP is much older, of course), and for all we know they'll be a dead end in research the way LSTMs are today - LLMs/multimodal models just look super hot right now. "Attention Is All You Need" was released in 2017 and it took 5 years to prove it was useful; for all we know the next hot thing has already been published and LLMs are already obsolete - Google might have been right to wait.
Besides, I don't think the top people at Google's DeepMind - and I can only infer this from watching them speak online - actually think LLMs are "the one".
Is the goal AGI or to add as much as possible to the market cap? I was specifically talking about the latter, and why Google got leapfrogged by OpenAI, which has been valued at a substantial fraction of Google's overall value despite having a fraction of the revenue. If Google had managed to generate this much value for themselves they'd be respected differently, but for now it seems like they missed today's AI tech stack and will be playing catch-up for the next 5 years or so regardless of where AI evolves later.
Larry's original goal for Google was always for it to be a revenue vehicle to reach AGI, though I don't think Sundar is interested in anything except revenue/profit.
Note also that many of Google's previous attempts with LLMs generated significant press controversy, and it was in Google's interest to let other groups take the heat for a while as the Overton window shifted.