
Why strange loops could be an argument for artificial intelligence - ColinWright
http://io9.com/why-strange-loops-could-be-an-argument-for-artificial-i-1524245285
======
simonh
>So what? So that means that the sense of self will arise organically, from
enough data input on sufficiently complex machinery.

Simply being complex isn't enough though. The non-brain parts of our bodies
are as complex as you like, vastly more complex than all the computers on
earth put together, and quite possibly more complex than our brains depending
on how you measure complexity, yet they are not themselves intelligent.

I don't believe intelligence or a sense of self spontaneously emerges from
just any complex system. If that were true, all we'd need to do is build some
huge computer with billions of neural network components jammed together any
old way, give it a decent internet connection, switch it on, and wait. I
suspect there's going to be more to it than that.

Our own brains are extremely fragile. Even tiny amounts of damage to very
localized parts of the brain can render us completely helpless. That damage
doesn't significantly reduce the complexity of our brains, in fact it might
even increase it, but it can catastrophically disrupt the delicate
relationships and feedback loops that make us sentient and conscious. I don't
see how such a delicate, easily disrupted pattern of behavior can arise and
sustain itself in a chaotic, unordered system. I'm afraid that if we want to
create sentient computers, we're going to have to do the hard work of
designing them, or at least of constructing something like a carefully tuned
evolutionary algorithm that leads to one emerging. That's not going to be easy.

