By all the standards I had growing up, ChatGPT is already AGI. It's almost certainly not as economically transformative as it needs to be to meet OpenAI's stated definition.
OTOH that may be due to limited availability rather than limited quality: if all of the 20 USD/month for Plus gets spent on electricity to run the servers, at $0.10/kWh that's about 274 W of average consumption. Scaled up to the world population, that's on the order of the entire global electricity supply. Which is kinda why there are also all those stories about AI data centres getting dedicated power plants.
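A quick sketch of the back-of-envelope arithmetic above. The $20/month fee, $0.10/kWh price, ~8 billion people, and the ~30,000 TWh/yr figure for global electricity generation are all assumed round numbers, not sourced data:

```python
# Back-of-envelope: how much power does a $20/month subscription buy,
# and how does that scale to the whole world? All inputs are rough
# assumptions: $20/month, $0.10/kWh, ~730 h/month, 8e9 people,
# ~30,000 TWh/yr global generation.

plus_fee_usd = 20.0
price_per_kwh = 0.10
hours_per_month = 730  # average month length in hours

kwh_per_month = plus_fee_usd / price_per_kwh          # 200 kWh/month
avg_watts = kwh_per_month * 1000 / hours_per_month    # ~274 W continuous

world_pop = 8e9
scaled_tw = avg_watts * world_pop / 1e12              # ~2.2 TW

global_gen_twh_per_year = 30_000                      # rough recent figure
global_avg_tw = global_gen_twh_per_year / 8766        # ~3.4 TW average

print(round(avg_watts), round(scaled_tw, 1), round(global_avg_tw, 1))
```

So ~2.2 TW against a global average supply of ~3.4 TW: the same order of magnitude, which is the point of the comparison.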
Don't know why you're being downvoted; these models meet the definition of AGI. It just looks different than we perhaps expected.
We made a thing that exhibits the emergent property of intelligence, a level of intelligence that trades blows with humans. The fact that our brains do lots of other things to make us into self-contained autonomous beings is cool, and maybe answers some questions about what being sentient means, but memory and self-learning aren't the same thing as intelligence.
I think it's cool that we got there before simulating an already existing brain and that intelligence can exist separate from consciousness.
Old-school AI was already specialised. Nobody can agree on what "sentient" means, and if sentience includes a capacity to feel emotions/qualia etc., then we'd only willingly choose sentient over non-sentient for brain uploading, not for "mere" assistants.