

A neuromorphic-computing ‘roadmap’ - aburan28
http://www.kurzweilai.net/neuromorphic-computing-roadmap-envisions-analog-path-to-simulating-human-brain

======
arethuza
Isn't that article just blogspam? I don't see what value it is adding over the
original publication:

[http://journal.frontiersin.org/Journal/10.3389/fnins.2013.00...](http://journal.frontiersin.org/Journal/10.3389/fnins.2013.00118/full)

[Even the tl;dr crowd could skim the introduction and the conclusions].

------
SideburnsOfDoom
"Because analog devices do not have to process binary codes as digital
computers do, their performance can be both faster and much less power hungry"

That's either a non-sequitur or an oversimplification.

~~~
tlarkworthy
The idea is that neuro chips are error tolerant and as such can deal with
manufacturing tolerances better. As we shrink silicon further, perhaps we will
abandon fragile digital design and go for self-organising, neuro-inspired
design.

I am somewhat skeptical, as we don't do that kind of computing in software. I
can't yet imagine building a whole application out of 90% neural nets.

~~~
neolefty
Rather than "fragile digital" it may be more apropos to say "efficient
analog". If anything, analog is more fragile -- affected by changes in
temperature, drive voltage, chemical gradients across individual chips.
Digital is an escape from that fragility, but at the expense of power
efficiency and speed.

~~~
tlarkworthy
I think of digital as a logical house of cards: a single flipped bit can have
a cascade of effects. Analogue is affected by lots of things, but good
analogue design is robust to these perturbations, whereas digital circuits
fail once perturbed outside their tolerances. Digital design can't generally
cope with a noisy AND gate. It is this kind of manufacturing noise they want
to address with neuromorphic chips. As I said, I am sceptical myself, but
that's the justification I have heard from this research domain.
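The "house of cards" point can be made concrete with a toy sketch (my own
illustration, not from the article): flipping a single bit in a float's
IEEE-754 representation can shift the value by 50%, while an analog-style
perturbation of similar physical scale barely moves it.

```python
import struct
import random


def flip_bit(x: float, bit: int) -> float:
    """Flip one bit of a float's 64-bit IEEE-754 representation."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    (y,) = struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))
    return y


x = 1.0

# Digital failure mode: one flipped bit (here the top mantissa bit)
# changes the value from 1.0 to 1.5 -- a 50% error from a single event.
digital_error = abs(flip_bit(x, 51) - x)

# Analog failure mode: a small additive perturbation stays proportional
# to the noise, so a well-designed circuit can absorb it.
analog_error = abs((x + random.gauss(0.0, 0.01)) - x)

print(digital_error)  # 0.5
print(analog_error)   # typically around 0.01
```

Flipping a high exponent bit instead would be even more dramatic (the value
jumps by hundreds of orders of magnitude), which is the sense in which a
single bit error "cascades" in a digital representation.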

