What the hype machine still doesn't understand is that it's a language model, not a knowledge model.
It is optimized to generate output that looks as much like language as possible, not like knowledge. It may sometimes regurgitate knowledge when that knowledge is simple or well-trodden enough, or when language trivially encodes it.
But as the knowledge gets more complex and experiential, the model will simply generate words with no attachment to meaning or truth, because fundamentally it only knows how to generate language, and it doesn't know how to say "I don't know that" or "I don't understand that".