
Researchers create “neuromorphic chip” that is modeled after the brain - jonbaer
https://news.yale.edu/2017/11/28/new-research-creates-computer-chip-emulates-human-cognition
======
aalleavitch
I honestly think that focusing on asynchronicity is almost completely the
wrong way to go about designing neuromorphic systems. One thing that engineers
almost never focus on, but which is a well-researched, vital aspect of the
brain, is firing frequency and synchronicity: "cells that fire together wire
together". This synchronized, timed firing is so strong that it creates neural
oscillations on a large scale that are consistently characteristic of
different brain states (brain waves) and widely theorized to be important to
how information is encoded in the brain.

It's also very commonly observed that, for instance, the intensity of a
stimulus is encoded not in the amplitude of a particular neuron's response but
rather in the frequency of its firing. This oscillatory activity has been
shown to be active at all different levels of organization, too.

There's basically an entire dimension to how the brain works (in terms of
positively and negatively interfering wave patterns localized in different
parts of the brain, with neurons that serve to speed up or slow down a
continuous pattern of firing) that people looking to reproduce its function
usually totally ignore. They think of neuronal firings as discrete
calculations, when really they're transformations on an ongoing pattern of
activity that's evolving over time. I feel this way of looking at the brain is
far more consistent with how biological processes generally function: the
point is not that each neuron accomplishes some precise operation
independently of the other neurons, but rather that with a large enough
population of imprecise neurons firing individually, an overall pattern of
useful activity emerges.
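The rate-coding idea (intensity in firing frequency, not spike amplitude) can
be sketched with a toy leaky integrate-and-fire neuron. This is purely an
illustration, not anything from the article; every constant here is arbitrary:

```python
# Toy leaky integrate-and-fire neuron: a stronger constant input produces
# more spikes in the same window, while each spike itself is identical
# ("all or nothing"). All parameters are made up for illustration.

def spike_count(input_current, steps=1000, dt=0.001,
                tau=0.02, threshold=1.0):
    """Count spikes over `steps` timesteps for a constant input current."""
    v = 0.0       # membrane potential (arbitrary units)
    spikes = 0
    for _ in range(steps):
        # Leaky integration: potential decays toward 0, driven by the input.
        v += dt * (-v / tau + input_current)
        if v >= threshold:  # fire and reset
            spikes += 1
            v = 0.0
    return spikes

weak, strong = spike_count(60.0), spike_count(120.0)
print(weak, strong)  # the stronger stimulus fires more often
```

The information is carried entirely by the spike rate, which is the point:
downstream neurons would read intensity off an ongoing pattern of activity
rather than off any single discrete event.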

~~~
shahbaby
This is like saying that all flying birds have feathers, therefore we must put
something similar to feathers on a flying machine.

Simply emulating things just because it feels like they should be important is
not a particularly promising approach when you're designing circuits.

~~~
aalleavitch
To torture your analogy: in my mind, if the brain is like a bird, then modern
approaches to machine learning are like helicopters. Yes, these things can
both fly, but they don't fly in nearly the same way, and if your helicopter
isn't getting off the ground then looking at the design of a bird isn't going
to help you very much with many of the challenges you're going to face in
getting it to work. You wouldn't claim that a helicopter is bird-shaped, even
though there are some basic relationships between the aerodynamic principles
of a bird's wings and a helicopter rotor.

If we're having trouble getting our helicopter to work, maybe we should be
trying to make a working airplane first, since then we can base more of it off
the design of a bird and use this to help us to better understand the basic
principles of aerodynamics.

Do you get where I'm coming from here?

~~~
shahbaby
To be clear: once we understand the principles behind the neocortex, only then
will it make sense to ask questions about the importance of oscillations, how
they relate to asynchronous vs synchronous systems, and the many other
phenomena we have observed but do not really understand.

Without an underlying theory to tie it all together, it's difficult to make
sense of it.

I personally think that oscillations are more likely to be an emergent
property, which is a common theme in nature.

------
cVwEq
TL;DR: Researchers at Yale invent a chip that is similar to a neural network.
It's good at object recognition and operates asynchronously.

------
GuiA
_1 million “neurons” that communicate via 256 million “synapses.”_

Human brain = an estimated 100 billion neurons for over 100 trillion synapses
according to Wikipedia

So a 5-orders-of-magnitude difference in neuron count, but less than one order
of magnitude in average synapses per neuron (256 in the former, ~1,000 in the
latter)
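For anyone who wants to sanity-check the arithmetic, a quick sketch using the
figures as quoted above:

```python
import math

# Figures quoted above: the chip (per the article) and the human brain
# (per Wikipedia).
chip_neurons, chip_synapses = 1e6, 256e6
brain_neurons, brain_synapses = 100e9, 100e12

neuron_gap = math.log10(brain_neurons / chip_neurons)  # orders of magnitude
chip_fanout = chip_synapses / chip_neurons             # synapses per neuron
brain_fanout = brain_synapses / brain_neurons

print(neuron_gap, chip_fanout, brain_fanout)  # 5.0 256.0 1000.0
```

So the neuron gap is a clean 5 orders of magnitude, while the fan-out per
neuron differs by only a factor of ~4.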

I wish scientific news were better at framing the numbers they quote from
press releases.

It’s interesting to keep track of these sorts of numbers, because while we
clearly won’t have a chip that can do things remotely near what a human brain
can when the numbers are at parity, at least it’ll help focus people more on
exploring changes in architecture, rather than throwing more and more
computational units at the problem.

~~~
crusso
_while we clearly won’t have a chip that can do things remotely near what a
human brain can when the numbers are at parity_

That seems hard to say at this point. Certainly quantity won't be the sole
determining factor; but quantity combined with behavior and interactions might
mean that on-a-chip neurons function better than human/animal brains.

------
aoeusnth1
I find that neuromorphic chips are in this awkward valley where they aren’t
trying to be state of the art machine learning systems (TPUs and GPUs do ML
much better), but they’re also not trying to emulate biology faithfully. I
think experimenting with new hardware and technologies is great, and I hope we
discover something really useful from these. Still, I can't help but wonder
whether they should focus more on either practicality or pure science.

~~~
crusso
But there are people working on the pure science of learning how the brain
works.

Taking bits and pieces of strategies, chemicals, or structures found in nature
and adapting them to other technologies has a long history of success.

If I had some bitcoins to bet, I'd put them on these emulated brain/neuron-on-
a-chip techniques when it comes to our pursuit of general purpose AI.

------
Exo_Tartarus
I wonder how they train it? The network is built into the silicon, so I don't
see how training could occur after its creation... But maybe they build a
simulated network, train it, and then construct a faithful hardware
representation.
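That train-offline-then-map idea can be sketched as: learn continuous weights
in software, then snap them onto whatever discrete values a fixed hardware
synapse can realize. Purely hypothetical here; nothing in the article says
this is what they do, and the 16 levels and weight values below are made up:

```python
# Hypothetical sketch: quantize software-trained weights onto the discrete,
# bounded levels a fixed hardware synapse could implement. The 16 levels
# (~4-bit synapses) and the example weights are illustrative assumptions.

def quantize(weights, levels=16, w_max=1.0):
    """Map continuous weights onto `levels` evenly spaced hardware values."""
    step = 2 * w_max / (levels - 1)
    out = []
    for w in weights:
        w = max(-w_max, min(w_max, w))      # clip to the hardware range
        out.append(round(w / step) * step)  # snap to the nearest level
    return out

trained = [0.73, -0.12, 0.05, -0.98]  # pretend these came from training
print(quantize(trained))
```

The interesting question is then how much accuracy survives the snapping,
which is presumably something you'd measure in the simulator before
committing anything to silicon.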

