They didn't try particularly hard to find out why the biggest model didn't perform as well, or at all, as far as they report. They give us one sentence speculating "maybe not enough training".
Come on. This is Google. They have unlimited compute resources, biomedical AI is a core strategic objective, and they've spent untold $$$ getting data and working on this for years. There are news articles on their efforts going back a decade. And all they could do is wave their hands at "maybe not enough training".