Having worked in GCP, it is not surprising at all to me that Google, despite having the resources, can’t productize an LLM.
It feels like OP's question stems from a belief that all there is to training a good model is throwing lots of compute at it. As with any product, there's polishing and fine-tuning you need to do, both before and after launch. Google can't do that. You also have to accept imperfection and clever tricks to this end, a.k.a. the startup / hacker mentality, which Google is also not positioned to do. I think Meta has a good chance though.