
You’re smarter than you think - pw
http://scottlocklin.wordpress.com/2010/05/18/youre-smarter-than-you-think/
======
chroma
_assume the brain dissipates 1/10 of this; 200 kilocals per day._

The human brain is pretty hungry.
<http://www.pnas.org/content/99/16/10237.full> says around 1/5th of resting
human metabolism goes to powering the brain. So double all the numbers in that
post.

 _This puts an upper limit on how much the brain can calculate, assuming it is
an irreversible computer: 5×10^19 64-bit ops per second._

It's one thing to put a theoretical upper limit on the human brain's
computational abilities. It's another thing to claim the human brain actually
performs anywhere near that level. Using the same math, a lawnmower engine
puts out 1,000 joules per second, so the upper bound on its computational
ability is 10^22 ops/sec. I think more than one equation is needed to get a
good idea of the brain's performance.
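The figure in question is just the Landauer limit applied to a power budget. A minimal sketch of the arithmetic (assuming ~20 W for the brain at body temperature ~310 K; the wattages are round numbers for illustration, not measurements):

```python
import math

k_B = 1.380649e-23                  # Boltzmann constant, J/K (exact since 2019 SI)
T = 310.0                           # body temperature, K
landauer = k_B * T * math.log(2)    # minimum energy to erase one bit, ~3e-21 J

brain_watts = 20.0                  # ~1/5 of a ~100 W resting metabolism
bit_ops = brain_watts / landauer    # irreversible bit erasures per second
ops_64 = bit_ops / 64               # crude conversion to 64-bit ops

lawnmower_watts = 1000.0            # the lawnmower engine from above
lawnmower_ops = lawnmower_watts / landauer / 64

print(f"brain bound:     {ops_64:.1e} 64-bit ops/s")        # on the order of 1e20
print(f"lawnmower bound: {lawnmower_ops:.1e} 64-bit ops/s") # on the order of 1e21-1e22
```

Note the bound says nothing about whether the system in question computes anything at all, which is exactly the lawnmower objection.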

Evolved brains are highly suboptimal. There's no evolution fairy waving a wand
and selecting solely for brainpower. Much of the selection pressure is for
basic things like famine resistance. Even when selecting for intelligence,
evolution can only make incremental improvements. It's going to have trouble
stumbling on something like the wheel or the impeller. And it shows. Brains
have no API, no debugger, and they certainly aren't end-user modifiable. I
can't, by thinking, dedicate a small group of neurons to mental arithmetic. I
have to expend significant effort to do the math in my head, and even then I'm
not always correct.

Neurons fire at a rate of only 20 hertz or so, meaning any mental computation
has to finish in 100-200 serial steps for a person to respond in real time.
The signals neurons send travel at only around 100 meters per second. That's
about 0.0000003c, less than one one-millionth the speed of light! And that's
not going to change soon, because evolution can't change substrates. It's
forced to use neurons instead of silicon or carbon.
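A quick sanity check on that fraction, taking the 100 m/s conduction figure above at face value:

```python
c = 299_792_458.0     # speed of light, m/s
v_neuron = 100.0      # fast myelinated axon conduction, m/s

ratio = v_neuron / c
print(f"{ratio:.1e} c")   # roughly 3.3e-07, well under one millionth of c
```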

>Even if brains are computationally only 1/1000 of their theoretical
efficiency

Based on what I've said above, I think brains are far, far less efficient than
that.

------
kmavm
"People who get degrees in computer science like to think of computers as
“Turing machines.” I suspect this is because it is easier to prove theorems
about Turing machines than Von Neumann machines. That’s fine by me, though I
will note in passing that I have never read a proof (or anything like a proof)
that VN machines like you write code on are computationally equivalent to
Turing machines. Someone smart please point me to the proof if it exists."

This trivial proof is usually left as an exercise in the first couple of weeks
of an undergraduate theory-of-computation course.

<http://en.wikipedia.org/wiki/Turing_completeness>
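One direction of that equivalence is easy to exhibit directly: a few lines of ordinary imperative code, running on a von Neumann machine, can simulate any single-tape Turing machine. A minimal sketch (the transition-table encoding and the bit-flipping example machine are my own illustration, not from the article):

```python
def run_tm(transitions, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a single-tape Turing machine on a von Neumann machine.

    transitions: {(state, symbol): (new_state, write_symbol, move)}
    where move is -1 (left), 0, or +1 (right). Halts on state 'halt'.
    """
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = transitions[(state, symbol)]
        head += move
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, blank) for i in range(lo, hi + 1))

# Example machine: invert a binary string, halting at the first blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_tm(flip, "1011"))   # -> 0100_
```

The other direction, a Turing machine simulating a RAM-style machine, is the part usually left as the textbook exercise.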

This quote rhetorically implies that the author is deeply versed in the theory
of computation and has made a penetrating breakthrough that undercuts its
entire edifice. In reality, he's "not aware" of such a proof because he is
aware of very close to zero proofs in the theory of computation. It makes me
reluctant to engage with the remainder of the article, which makes assertions
in fields I'm less capable of evaluating skeptically.

------
Kliment
The title is somewhat misleading, as the article is about entropy, heat
dissipation, and the limits of irreversible computing. It sounds like one of
the silly motivational articles we've seen far too many of here lately, so I
was pleasantly surprised.

~~~
sethg
Cf. the Fredkin gate, which has three inputs and three outputs, and which
conserves the number of 1-bits from input to output.

<http://en.wikipedia.org/wiki/Fredkin_gate>
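A minimal sketch of the gate itself, as a controlled swap (this is the standard truth table from the linked page; the Python encoding is mine):

```python
from itertools import product

def fredkin(c, a, b):
    """Fredkin gate: if the control bit c is 1, swap a and b."""
    return (c, b, a) if c else (c, a, b)

for bits in product((0, 1), repeat=3):
    out = fredkin(*bits)
    assert fredkin(*out) == bits   # applying it twice undoes it
    assert sum(out) == sum(bits)   # conserves the number of 1-bits
    print(bits, "->", out)
```

Because it's a permutation of its inputs, the gate is its own inverse and destroys no information, which is what lets it (in principle) compute without paying the Landauer erasure cost.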

------
openfly
A++++++++++++++ WOULD READ AGAIN!!!!

