I disagree, but that's beside the point here. You yourself narrowed the scope to:
"imitating even the most primitive brains, even though computationally we should have enough juice by now"
Which is kind of weird to claim today. GPT-4 may be the strongest counterexample to date, but it's far from the only one.
Of course, you need to remember not to confuse the brain with attached peripherals. Just because we can't replicate a perfect worm or fly body, complete with bioelectrical and biomechanical components, doesn't mean we can't do better than their brains in silico.
I'd also say that some of it is a matter of needing more computing power, but much of it is us cracking the "puzzle" itself: we haven't figured out the right architecture/structure for building, say, an AGI.
Just like transformers revolutionised text generation, and things like LoRA and other fine-tuning methods are now helping us refine that solution, the same will happen for the development of AGI.
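To make that concrete: the core trick in LoRA is small enough to sketch in a few lines. Instead of updating a full weight matrix W during fine-tuning, you freeze W and learn a low-rank correction B·A on top of it. A rough numpy illustration, with shapes and hyperparameters made up for the example:

```python
import numpy as np

# Minimal LoRA-style sketch: a frozen weight matrix W gets a trainable
# low-rank update B @ A, so only r * (d_in + d_out) parameters are learned
# instead of d_in * d_out. All sizes here are illustrative.
d_out, d_in, r = 512, 512, 8              # rank r << d_in, d_out
alpha = 16                                # LoRA scaling hyperparameter

W = np.random.randn(d_out, d_in) * 0.02   # pretrained weights (frozen)
A = np.random.randn(r, d_in) * 0.01       # trainable down-projection
B = np.zeros((d_out, r))                  # trainable up-projection, zero-init
                                          # so the update starts as a no-op

def forward(x):
    # Effective weight is W + (alpha / r) * B @ A, applied per input.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = np.random.randn(d_in)
y = forward(x)    # identical to W @ x until B is trained away from zero
print(y.shape)    # (512,)
```

The point is that the expensive pretrained computation stays untouched; the "puzzle-solving" happens in a tiny, cheap add-on.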
GPT-4 does not "imitate" a "brain"; it does not function like a brain, nor is it even really analogous to a brain in any useful sense. What it imitates is human speech.
"Imitating human speech" is not a trivial thing. You can't do it by a lookup table, or by a Markov chain. Not properly, not in open-ended, unscripted situations. It requires capabilities and structures that, if they aren't a world model and basic abstract reasoning skill, then they at least start to look strikingly similar in practice. This is where we are with GPT-4. It doesn't imitate speech. It imitates reasoning.
And if it walks like a duck, and quacks like a duck, ...
GPT-4 is a good example because it's pretty clear the model isn't merely a stochastic parrot (or, if it is in some sense, then in that sense so are we). But it's not the only game in town. Not all generative transformers deal with language, yet all seem to be powerful association machines, drawing their capabilities from simple algorithms running in absurdly high-dimensional spaces. There are many parallels you can draw to brains here, not the least of which is that the overall architecture is simple and scalable enough that it's exactly the kind of thing evolution could reach and then get railroaded into building on.
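For a taste of what "simple algorithms in absurdly high-dimensional spaces" buys you, here's a toy associative memory (my own illustration, not a claim about transformer internals): because random high-dimensional vectors are nearly orthogonal, a single matrix built from summed outer products can store and cleanly recall dozens of key-value pairs.

```python
import numpy as np

# Toy linear associative memory: store key->value pairs as summed outer
# products. In high dimensions random keys are nearly orthogonal, so recall
# by a single matrix-vector product stays clean even with many stored pairs.
rng = np.random.default_rng(0)
d, n_pairs = 2048, 50

keys = rng.standard_normal((n_pairs, d)) / np.sqrt(d)   # ~unit-norm keys
values = rng.standard_normal((n_pairs, d))

M = sum(np.outer(v, k) for k, v in zip(keys, values))   # one-shot "learning"

recalled = M @ keys[7]                                  # query a stored key
cos = recalled @ values[7] / (
    np.linalg.norm(recalled) * np.linalg.norm(values[7])
)
print(f"cosine similarity to stored value: {cos:.3f}")  # close to 1.0
```

Nothing clever is happening per step; the capability comes almost entirely from the dimensionality, which is the same flavour of trick that makes these architectures plausibly reachable by evolution.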
"imitating even the most primitive brains, even though computationally we should have enough juice by now"
Which is kind of weird to claim today. GPT-4 may be the strongest counterexample to date, but it's far from the only one.
Of course, you need to remember not to confuse the brain with attached peripherals. Just because we can't replicate a perfect worm or fly body, complete with bioelectrical and biomechanical components, doesn't mean we can't do better than their brains in silico.