
Computers vs. Brains - robg
http://judson.blogs.nytimes.com/2009/03/31/guest-column-computers-vs-brains/
======
swombat
Wow, way to spectacularly miss the point!

1) No intelligent person is suggesting that a computer will directly map to a
human brain. We're not trying to reproduce a brain in silicon. However, a
sufficiently powerful computer might be able to model the workings of a brain
- or might not even need to, if we find another path to intelligence.

2) If anything, the complexity and _existence_ of the human brain are an
indication that such a system _can_ be built efficiently. After all, the brain
exists, therefore it should be possible to at least build something of
comparable complexity and power, at some point in the future. Maybe it won't
be as rapid as Moore's law predicts, but there's no reason to think that we
won't eventually be able to reproduce what nature has been able to do through
blind selection.

3) There's no reason why a singularity can only come via wholly artificial
intelligence. Intelligence augmentation could easily achieve the same
objective (and is perhaps even more likely - after all, it's happening today
already, albeit on a smaller scale).

Did they even bother to read up on Singularity stuff before writing this
article?

~~~
robg
Your criticisms are fair, but:

1) <http://en.wikipedia.org/wiki/Blue_Brain>

2) How long do you think we'll need? "At some point in the future" is an
awfully long time scale, especially relative to how long it took natural
selection to create the human brain.

3) I don't think they'd deny that augmentation is already happening. But that
piecemeal progress is still a long way from _The Singularity_.

~~~
swombat
1) is exactly what I suggested - a virtual model of the brain. I don't see any
"but" there...

2) I'm no AI expert, so my opinion of this is pretty irrelevant, but I would
be extremely surprised if we didn't have some form of human-level intelligence
within the next 100 years, and I would not be surprised if we had it in the
next 20 years.

3) Piecemeal progress that feeds back on itself is the essence of the whole
Singularity hypothesis. Better brains come up with even better ways to enhance
themselves, which makes them even better, etc. Technological progress can grow
exponentially in ways not subject to Malthusian limits.
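That feedback loop is easy to sketch numerically (a toy model of my own, not anything from the article): if each generation's improvement is proportional to its current capability, the gains compound exponentially rather than adding up linearly.

```python
# Toy model (my own illustration): each "generation" improves itself by a
# fixed fraction of its current capability, so the gains compound.
def self_improving(capability: float, rate: float, generations: int) -> float:
    """Run the feedback loop: each step's gain scales with current capability."""
    for _ in range(generations):
        capability += rate * capability  # improvement proportional to capability
    return capability

# Even a modest 10% gain per generation compounds to ~117x after 50 generations:
print(self_improving(1.0, 0.10, 50))
```

The point of the sketch is just the shape of the curve: a fixed-size gain per step gives linear growth, but a gain that scales with the current level gives exponential growth.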

------
danbmil99
Why do neuroscientists get so hot & bothered over the prospect of AI? It
doesn't diminish the brain's complexity to think that one day we can rival it
with alternate technology. There's a whiff of dualism/magical thinking in this
line of argument.

------
dejb
Be warned: you may lose several IQ points reading this article if you are
familiar with the concept of strong AI and the singularity. Short version -

- Strong AI isn't possible because it will require too much power, according
to Moore's law.

- Even if we could build Strong AI, why bother?

------
warfangle
"To put this in perspective, the entire archived contents of the Internet fill
just three petabytes."

This is categorically false. While the "Internet Archive" currently has a
capacity of 4.5 petabytes, the sheer amount of data available over the HTTP
protocol alone is more likely measured in exabytes (especially once you take
into consideration high-volume services like YouTube, plus BitTorrent and
other file-sharing protocols).
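The scale gap is easy to check with back-of-the-envelope unit arithmetic (my own conversion; the 3 PB claim and 4.5 PB capacity figures come from this thread):

```python
# SI storage units, in bytes
PB = 10**15  # petabyte
EB = 10**18  # exabyte

article_claim = 3 * PB       # article: "entire archived contents of the Internet"
archive_capacity = 4.5 * PB  # Internet Archive's stated capacity
assert archive_capacity > article_claim  # the Archive alone already exceeds the claim

# A single exabyte dwarfs the article's figure by a factor of ~333:
print(round(EB / article_claim))  # → 333
```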

------
ilaksh
This points out how strong status quo bias is, even among neuroscientists.
That's an indication that we may be in for a very significant collision
between reality (artificial general intelligence) and mainstream humanity. I
think it is going to be quite some time (years, or maybe decades) before most
people accept and comprehend the changes brought on by AGI and start to modify
social and other systems to adjust to this reality.

------
varjag
Give us the million years evolution had to develop the brain, and we would
surely come up with something better.

------
niels_olson
Um, I rather doubt a computer is going to consume a gigawatt in 15 years.

