
Brains scale better than CPUs. So Intel is building brains - truth_seeker
https://arstechnica.com/science/2019/07/brains-scale-better-than-cpus-so-intel-is-building-brains/
======
arathore
Though neuromorphic architectures are interesting, Loihi is essentially an
ASIC for spiking neural networks [1]. Its lower power consumption relative to
GPUs comes from the limited range of computation these chips can perform
(which is generally true of ASICs).

[1]
[https://en.wikichip.org/wiki/intel/loihi](https://en.wikichip.org/wiki/intel/loihi)

------
billconan
I don't quite understand brain simulation. Most articles talk about how
neurons work and how they are connected. It seems like I could write a
simulation easily.
[https://medium.com/@bennashman/modeling-your-brain-on-a-comp...](https://medium.com/@bennashman/modeling-your-brain-on-a-computer-with-computational-neuroscience-57596c919c70)

What I don't get is that the interconnected neurons don't seem to be
differentiable. How do they learn? How do you provide feedback to the system?
How do you train them the way we train an ANN?
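For what it's worth, simulating a single neuron really is only a few lines. Here's a toy leaky integrate-and-fire model (the parameters are made up for illustration, and this is not what Loihi actually runs): the membrane voltage leaks toward rest, integrates input current, and emits a spike and resets when it crosses threshold.

```python
# Toy leaky integrate-and-fire neuron. All constants are illustrative,
# not taken from any real chip or paper.
def simulate_lif(input_current, v_rest=-65.0, v_thresh=-50.0,
                 v_reset=-65.0, tau=10.0, dt=1.0):
    v = v_rest
    spikes = []
    for i, current in enumerate(input_current):
        # Leak toward resting potential plus input drive.
        v += dt / tau * (-(v - v_rest) + current)
        if v >= v_thresh:   # threshold crossing: spike and reset
            spikes.append(i)
            v = v_reset
    return spikes

# Constant 20 pA-ish drive for 50 steps produces a regular spike train.
spikes = simulate_lif([20.0] * 50)
print(spikes)  # → [13, 27, 41]
```

The hard part, as you say, isn't the dynamics, it's that this threshold nonlinearity is non-differentiable, so plain backprop doesn't apply directly.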

~~~
tormeh
[https://en.wikipedia.org/wiki/Hebbian_theory](https://en.wikipedia.org/wiki/Hebbian_theory)

 _or_

"Neurons that fire together, wire together"
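In code form, the simplest Hebbian rule just strengthens a weight in proportion to the product of pre- and postsynaptic activity; no gradient needed. A minimal sketch (toy binary activities, made-up learning rate):

```python
import numpy as np

def hebbian_update(weights, pre, post, eta=0.1):
    """One Hebbian step: w_ij grows when post neuron i and pre neuron j
    are both active. eta is a toy learning rate."""
    return weights + eta * np.outer(post, pre)

W = np.zeros((2, 3))            # 2 postsynaptic, 3 presynaptic neurons
pre = np.array([1, 0, 1])       # presynaptic spikes this step
post = np.array([1, 1])         # postsynaptic spikes this step
W = hebbian_update(W, pre, post)
print(W)  # → [[0.1 0.  0.1]
          #    [0.1 0.  0.1]]
```

Real neuromorphic hardware typically uses timing-dependent variants (STDP), where the sign of the update depends on whether the presynaptic spike arrives before or after the postsynaptic one, but it's the same local, gradient-free idea.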

------
smachiz
I look forward to my brain leaking memory to other threads.

