

New Computer Programming Language Imitates The Human Brain - amerika_blog
http://io9.com/new-computer-programming-language-imitates-the-human-br-1080026417

======
wwweston
> But the human brain, which most certainly must be a kind of computer

 _Must_?

It's one thing to observe that the brain appears to do computation, and to
have as a working hypothesis that everything the brain does is a form of
computation.

It's something else to say that it _must_ be true, when, for the moment, the
brain also appears to do things that we have no idea how to make computers do,
that we have significant difficulty getting computers to even _pretend_ to do.

~~~
dchichkov
On which sources are you basing your statements? What else could it be, if not
a kind of associative/predictive/computation+memory system?

And as to the IBM effort, I think they are doing the right thing. Whatever
brings us closer to having hardware that can do the job (it needs to be at
least on the same scale as the human brain, which has on the order of a
petasynapse of connectivity/memory)...

~~~
wwweston
> What else could it be

It doesn't matter what else it could be. Either there's convincing
evidence/argument that we can get computers to do everything the brain
apparently does, or there isn't.

Right now, we're in the latter state.

~~~
dchichkov
I dunno. But after I saw a neural network dreaming about hand-written
digits, I couldn't think of the brain as anything other than a large
neural network any more. It is still beautiful and magical to me, but it is
magic without magic.

The video I mentioned is still available here:
[http://www.cs.toronto.edu/~hinton/digits.html](http://www.cs.toronto.edu/~hinton/digits.html)
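The "dreaming" in that demo comes from running a generative model (a restricted Boltzmann machine) backwards via alternating Gibbs sampling. A minimal sketch of that sampling loop, using an untrained toy model with made-up sizes (a trained weight matrix would be needed for the samples to actually look like digits):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_bernoulli(p, rng):
    # Draw binary states with the given per-unit probabilities.
    return (rng.random(p.shape) < p).astype(float)

def rbm_dream(W, b_vis, b_hid, n_steps=200, rng=None):
    """Alternating Gibbs sampling in a binary RBM.

    Starting from random visible units, repeatedly sample the hidden
    layer given the visibles, then the visibles given the hiddens.
    The chain's visible states are the network's "fantasies" -- the
    kind of thing Hinton's demo visualizes for a model trained on
    handwritten digits.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    v = sample_bernoulli(np.full(b_vis.shape, 0.5), rng)
    for _ in range(n_steps):
        h = sample_bernoulli(sigmoid(v @ W + b_hid), rng)
        v = sample_bernoulli(sigmoid(h @ W.T + b_vis), rng)
    return v

# Hypothetical toy dimensions: a 28x28 "image" (784 visibles), 64 hiddens.
rng = np.random.default_rng(0)
W = rng.normal(0, 0.01, size=(784, 64))
fantasy = rbm_dream(W, np.zeros(784), np.zeros(64), rng=rng)
```

With random weights the fantasies are just noise; the point is only that "dreaming" here is an ordinary sampling procedure, not anything mysterious.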

~~~
seanmcdirmid
If you read the paper*, the IBM programming language's model really isn't
far from that. Even the article indicates that you are basically building a
modular neural network.

* Find the link somewhere at [http://lambda-the-ultimate.org/node/4797](http://lambda-the-ultimate.org/node/4797)

------
aa0
Seems somewhat like hype. FPGAs and ASICs are currently the only solutions
fast enough for directly parallel networks of inputs and outputs. There is
no way a processor is going to perform as well, especially if it's running
instructions from a memory chip.

