

The Moral of the Kurzweil v. Myers Spat - ewjordan
http://ewjordan.blogspot.com/2010/09/complexity-of-intelligence-kurzweil-v.html

======
SeanLuke
I don't agree with Kurzweil on much, but the more I read the back and forth,
the more it seemed to me that Myers was totally out of his depth. He
completely misquoted what Kurzweil said, and then when that came to light he
followed it up with fairly thorough misinterpretations of what Kurzweil was
saying. I don't think he was doing that intentionally: he just didn't
understand what Kurzweil was saying. On my reading, Kurzweil wasn't really
delving into genomics (that was entirely a red herring): he was staying on
firm ground in information theory, cognitive science, and computer science,
and in these areas it was Myers who showed a fair lack of acknowledgment of
his own ignorance of the field.

------
donaldc
50 million bits is enough to specify a human-level intelligence, but only if
one has the appropriate machine. In terms of constructing an AI, the human
genome is doing little more than taking clever advantage of ways to cause
things in the physical world to self-organize (as per the book, Notes on the
Synthesis of Form). Physical reality itself is the execution machine, is
responsible for much of the end product, and has _way_ more than 50 million
bits of complexity.
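The point about the machine is essentially about conditional description
length: how many bits you need depends on what the execution environment
already contains. A minimal sketch of this idea in Python, using zlib's
preset-dictionary feature as a stand-in for an information-rich environment
(the variable names here are my own, just for illustration):

```python
import zlib

# A short "specification" to transmit: with little internal repetition,
# plain compression cannot shrink it much.
message = b"the quick brown fox jumps over the lazy dog"

# Machine A: a bare decompressor; the description must carry everything.
plain = zlib.compress(message, 9)

# Machine B: the execution environment already contains relevant structure
# (a preset dictionary), so the description itself can be much shorter.
env = b"the quick brown fox jumps over the lazy dog"
c = zlib.compressobj(9, zlib.DEFLATED, zlib.MAX_WBITS,
                     zlib.DEF_MEM_LEVEL, zlib.Z_DEFAULT_STRATEGY,
                     zdict=env)
short = c.compress(message) + c.flush()

# The richer machine reconstructs the same message from fewer bits.
d = zlib.decompressobj(zlib.MAX_WBITS, zdict=env)
restored = d.decompress(short) + d.flush()

print(len(plain), len(short))  # the dictionary-aware description is smaller
```

Same output, two very different description lengths; the difference was
paid for by the complexity already sitting in the receiving machine.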

If we are thinking of running our first AI on something resembling current
computer hardware, then we are going to run it in an extremely impoverished
environment compared with actual physical reality. This means that the
AI-specifying algorithm will have less environmental complexity to work with
than the human-intelligence algorithm does, and this in turn means that the
AI specification will need to be longer and more complex.

So from a practical standpoint, while one could _in theory_ specify an AI in
50 million bits, in practice the first AI will need a much larger
specification, and/or will need to run on some very powerful computer
hardware.

Just how powerful can probably be estimated, but it has little to do with the
fact that the human genome contains 50 million bits of information. In terms
of determining when AI will happen, the 50 million bit argument is a red
herring.

