
Neuromorphic Promises Better AI - JoachimS
https://www.eetimes.com/document.asp?doc_id=1335207
======
m0zg
Have they even solved MNIST at state-of-the-art accuracy with it yet, let
alone ImageNet image classification? Because until they do, this is just a
bunch of speculation and hot air. Power consumption can also be reduced by
co-locating memory with compute and eliminating the need to hit the memory
bus for the majority of what the chip needs to do. That strikes me as a more
immediate and promising approach than trying to model neurons.

I actually took a computational neuroscience course on Coursera a few years
back. I was disappointed to discover that accurately modeling more than a
handful of neurons in the time domain was intractable at the time. It's likely
still intractable now, except maybe the handful has gotten a little larger.
Another surprise was that we have no clue whatsoever how the brain actually
learns. We know how it works at the cellular level (ion exchange, spikes,
etc.), we sorta know how it operates at a very high level, and the visual
cortex has seen (see what I did there?) quite a bit of study as well, but we
don't really know much else. In particular, we don't know how memory works, or
even how the visual cortex learns low-level features without backpropagation.
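
To give a sense of what "modeling neurons in the time domain" means: even the
simplest spiking model, a leaky integrate-and-fire neuron, has to be stepped
at sub-millisecond resolution, and biologically detailed models
(Hodgkin-Huxley and beyond) add several stiff differential equations per
compartment on top of that. A minimal sketch, with all parameters purely
illustrative:

```python
# Leaky integrate-and-fire (LIF) neuron, stepped with Euler integration.
# Parameter values are illustrative, not taken from any particular study.
def simulate_lif(i_ext=1.5, t_total=100.0, dt=0.1,
                 tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return spike times (ms) for a constant input current i_ext."""
    v = v_rest
    spikes = []
    for step in range(int(t_total / dt)):
        # Membrane dynamics: dv/dt = (-(v - v_rest) + i_ext) / tau
        v += dt * (-(v - v_rest) + i_ext) / tau
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset            # hard reset after the spike
    return spikes

spikes = simulate_lif()
print(f"{len(spikes)} spikes in 100 ms")
```

Even this toy model needs ~1000 steps to cover 100 ms of simulated time for
one neuron; scaling that to realistic models of millions of coupled neurons is
where the intractability comes from.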

The reason we don't know this is that we can't really know it with today's
research approaches. To see why, try applying those same approaches
(i.e. sticking needles in the brain and collecting signals) to something much
simpler, like a digital CPU: you'll never be able to figure out how it works
no matter how hard you try.

And it's not at all clear what a "better" approach would look like. These
neuromorphic approaches, therefore, are akin to a cargo cult: they've
replicated the hardware, sorta, and now they're expecting that gods will
deliver the miracle and high-grade intelligence will "emerge" somehow. But I'm
pretty sure it won't. You can't fix what you don't understand.

------
bigred100
Do neuromorphic chips have any practical advantages beyond power consumption?

~~~
__s
Isn't that all that's necessary? Reduced power consumption implies less heat,
which implies potential for further miniaturization.

~~~
p1esk
The problem is they don’t even provide that. Spiking hardware does not have
better power consumption than regular analog DL accelerators.

Floating-gate (FG) transistor or memristor-based crossbars are extremely
efficient as GEMM engines.
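
To unpack the crossbar-as-GEMM point: input voltages drive the rows, each
cross-point conductance contributes a current V·G by Ohm's law, and the column
wires sum those currents for free by Kirchhoff's current law, so a whole
matrix-vector product happens in one analog step. An idealized numerical
sketch (values illustrative; real devices add wire resistance, noise, and
limited precision):

```python
import numpy as np

# Idealized resistive crossbar: G[i, j] is the conductance programmed at the
# cross-point of row i and column j (the "weight"), V[i] is the voltage on
# row i (the "activation").
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # conductances, stored in place
V = rng.uniform(-1.0, 1.0, size=4)       # input voltages

# Column currents: I[j] = sum_i V[i] * G[i, j]
# Ohm's law gives each term; Kirchhoff's current law does the summation
# on the column wire, with no memory traffic at all.
I = np.zeros(3)
for j in range(3):
    for i in range(4):
        I[j] += V[i] * G[i, j]

assert np.allclose(I, V @ G)             # same result as a matrix product
print(I)
```

The efficiency claim follows from the physics: the multiply-accumulates happen
where the weights are stored, which is exactly the co-location argument from
the top comment, no spikes required.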

