An argument against AGI might be that brains are extremely efficient for what they do. Maybe we could make a computer that's as powerful as a brain, but if it consumes 100 MW of power, what's the point?
There are many industrial processes that use tons of power and are far less efficient than a human doing those tasks. Yet they're still viable because they scale, are faster, are more consistent, etc., than humans.
For AGI, it's really about replication, density, and ease of operation.
At the moment, we certainly can't mass produce "brains-on-a-chip" that provide a guaranteed level of human-like intelligence across various tasks.
But imagine a world in which you could install racks of "brains-on-a-chip", powered by electricity (easily distributed, stored, and fungible compared to food-powered brains), and have Moore's-Law-like scaling of "brain density". That would change everything, even if those brains consumed 1000 W a pop.
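To see why even 1000 W per unit wouldn't be a deal-breaker, here's a rough back-of-envelope sketch. The ~20 W human-brain figure and the $0.10/kWh electricity price are illustrative assumptions I'm plugging in, not numbers from the argument above:

```python
# Back-of-envelope: running cost of a hypothetical 1000 W "brain-on-a-chip"
# versus a biological brain. All figures are illustrative assumptions.

CHIP_POWER_W = 1000              # assumed draw per artificial "brain" (from the comment)
HUMAN_BRAIN_POWER_W = 20         # rough textbook figure for a human brain
ELECTRICITY_USD_PER_KWH = 0.10   # assumed electricity price

hours_per_day = 24
chip_kwh_per_day = CHIP_POWER_W / 1000 * hours_per_day
chip_cost_per_day = chip_kwh_per_day * ELECTRICITY_USD_PER_KWH

print(f"Artificial brain: {chip_kwh_per_day:.0f} kWh/day, ~${chip_cost_per_day:.2f}/day")
print(f"Power gap vs. human brain: {CHIP_POWER_W / HUMAN_BRAIN_POWER_W:.0f}x")
# Even at ~50x the brain's power draw, ~$2.40/day of electricity for 24/7
# human-level work would be negligible next to the cost of human labor.
```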
Obviously, a literal brain is probably not the way this will pan out (hopefully not! "brains-on-a-chip" is rather creepy...), but you get the idea.
If something is generally intelligent, at the level of a human brain, and is forced to work in "industrial processes", isn't that a form of subjugation?
There seems to be a moral implication here that a lot of people are neglecting...