
Brain Computation Is Organized via Power-of-Two-Based Permutation Logic - aburan28
http://journal.frontiersin.org/article/10.3389/fnsys.2016.00095/full
======
MrQuincle
There are only a few organizational principles that allow the number of groups
(K) to scale beyond the number of neurons (N).

Polychronization
([http://www.izhikevich.org/publications/spnet.pdf](http://www.izhikevich.org/publications/spnet.pdf)).

This is different from, e.g., synfire chains, in which the number of
groups equals the number of neurons. I don't remember much about the
capacity analysis of standard reservoir-computing methods (ESN, LSM,
ELM), but I thought it was also limited to N.

I'm looking forward to more research in this direction! We as humans build up
such long-term memories. This must be a driving factor in AI. A lot of
prediction tasks hinge on being able to handle this huge temporal scale.

Currently I'm trying to see if I can formulate chains of transition matrices
in MCMC that allow simultaneous search on multiple scales. I feel like the
math from multiple angles is converging, exciting! We're gonna solve this in
our lifetimes!
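
For what it's worth, here is one illustrative reading of "simultaneous
search on multiple scales" in MCMC, sketched as a toy Metropolis sampler
whose proposal mixes a coarse and a fine step size. This is my own toy
construction, not MrQuincle's actual formulation; the bimodal target and
the scale values are arbitrary choices.

```python
import math
import random

# Toy target: an equal mixture of two unit Gaussians at -3 and +3.
def log_target(x):
    p = (0.5 * math.exp(-0.5 * (x + 3) ** 2)
         + 0.5 * math.exp(-0.5 * (x - 3) ** 2))
    return math.log(max(p, 1e-300))  # guard against underflow far out

# Metropolis step with a mixture proposal: a fine scale for local
# refinement and a coarse scale for hopping between distant modes.
def step(x, scales=(0.2, 5.0)):
    prop = x + random.gauss(0, random.choice(scales))
    if math.log(random.random()) < log_target(prop) - log_target(x):
        return prop  # accept
    return x         # reject, stay put

random.seed(1)
x = 0.0
samples = []
for _ in range(20000):
    x = step(x)
    samples.append(x)

# With only the fine scale the chain would get stuck in one mode;
# the coarse proposals let it visit both.
frac = sum(s > 0 for s in samples) / len(samples)
print(round(frac, 2))
```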

~~~
whenov
Is ELM reservoir computing?

~~~
MrQuincle
Yes, if you consider the most salient part of a reservoir computing
method to be a random projection matrix that is quenched (kept fixed).

[http://www.nature.com/articles/srep14945](http://www.nature.com/articles/srep14945)
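
To make the "quenched random projection" point concrete, here is a
minimal ELM-style sketch in NumPy. The layer sizes and the toy XOR task
are my own illustrative choices, not from the linked paper: a random
hidden projection is drawn once and never trained, and only the linear
readout is fit by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 2, 50
W = rng.normal(size=(n_hidden, n_in))  # random projection, never trained
b = rng.normal(size=n_hidden)

def hidden(X):
    return np.tanh(X @ W.T + b)  # fixed nonlinear expansion

# Toy task: XOR, which a linear readout on the raw inputs cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

H = hidden(X)
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # train the readout only

pred = hidden(X) @ beta
print(np.round(pred))  # close to [0, 1, 1, 0]
```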

------
abeppu
I spent way too long staring at figure 1, before realizing that they're not
using the word "permutation" in the way that people with exposure to CS or
discrete math would use it. They're assigning a "neural clique" (possibly not
exactly a clique in the graph theory sense) to each subset in the powerset of
inputs, and then ignoring the empty set. _Now_ 2^i - 1 makes sense; ordering
does not matter to the authors.
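
To make that counting concrete, a small Python sketch (the input labels
are illustrative) that enumerates the non-empty subsets of i inputs;
since ordering never enters, the count is 2^i - 1 rather than anything
permutation-like:

```python
from itertools import combinations

def cliques(inputs):
    """All non-empty subsets of the inputs, one 'clique' per subset."""
    subs = []
    for r in range(1, len(inputs) + 1):
        subs.extend(combinations(inputs, r))
    return subs

inputs = ["A", "B", "C", "D"]  # i = 4 illustrative inputs
groups = cliques(inputs)
print(len(groups))  # 2^4 - 1 = 15
```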

~~~
harryjo
It seems like you are saying that the headline is nonsense because the authors
of the article are using non-standard meanings for their words.

Any system with optional components has a factor of 2^n in its count of
configurations -- each optional component might be absent. That's an
interesting fact of math, not of any particular system.

~~~
abeppu
I am not saying it is nonsense -- just that it was hard for me to understand
their claim, in part due to a background that I expect many other readers here
share.

------
silos372
It seems that by "permutation" they mean what a mathematician calls a
"subset", or a computer scientist calls a "bitmask". Essentially they
found lots of evidence that animal brains implement demultiplexers?
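
Under that reading, a demultiplexer is easy to sketch (this is entirely
illustrative, not anything from the paper): each non-empty n-bit input
pattern selects exactly one of 2^n - 1 output lines.

```python
def demux(bits):
    """Map an n-bit input pattern to the index of the single active
    output line; the all-zero pattern activates no line at all."""
    idx = 0
    for bit in bits:
        idx = (idx << 1) | bit
    return idx - 1 if idx else None  # skip the empty pattern

print(demux((1, 1, 1)))  # index 6 of the 7 (= 2^3 - 1) lines
print(demux((0, 0, 0)))  # None: no clique for the empty subset
```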

------
dzdt
This sets off my BS detector.

The approach is to make a "thought experiment" about how neural computation
might work mathematically, then look for evidence that the guess is correct.
Their thought experiment leads them to guess that powers of 2 are
special, and that they should expect to see organization at sizes 1, 3,
7, 15.

But when they analyze the data, I don't see any comparison showing that
7 is really more prevalent than 6 or 8, or that 15 is more prevalent
than 14 or 16. Indeed, 15 seems to be the largest size they looked at.

The whole numerology theory driving the headline looks like unlikely,
unsupported nonsense to me!

------
mrfusion
This seems pretty hand wavy. I'm not seeing any actionable insights from this.

~~~
kpmcc
For real though. We already knew that neurons can do binary logic: they
either fire or they don't, and neurons higher up the chain fire based on
converging inputs.

Also, what do those plots actually 'show'? The authors repeatedly claim
that the plots show the existence of 15 clusters, but for many of the
plots they don't state p-values for any sort of classification.
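
The fire-or-don't point above can be sketched as a classic threshold
unit (the weights and thresholds here are illustrative, not from the
article): converging inputs implement AND/OR depending on the threshold.

```python
def fires(inputs, weights, threshold):
    """A neuron 'fires' iff its weighted input sum reaches threshold."""
    return sum(i * w for i, w in zip(inputs, weights)) >= threshold

AND = lambda a, b: fires((a, b), (1, 1), 2)  # needs both inputs
OR = lambda a, b: fires((a, b), (1, 1), 1)   # needs either input

print(AND(1, 1), AND(1, 0))  # True False
print(OR(0, 1), OR(0, 0))    # True False
```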

------
dkdkdjfjf
Would anyone here be able to offer some book suggestions for tackling AI
from the neuroscience/biological perspective, as opposed to the computer
science/computational perspective? I want to expand my horizons.

~~~
DocSavage
I would suggest "Principles of Neural Design", a book that tries to explain
why things might be engineered the way they are in biology:

[https://mitpress.mit.edu/neuraldesign](https://mitpress.mit.edu/neuraldesign)

~~~
posterboy
"explains" and "might" are kind of at odds, here

------
joggery
Wai H. Tsang:

[http://www.academia.edu/27961545/The_Fractal_Brain_Theory_-_...](http://www.academia.edu/27961545/The_Fractal_Brain_Theory_-_Chapter_4_-_Binary_Trees)

------
kensai
Except for the dopaminergic neurons, among other "subsystems".

~~~
aisofteng
That is explicitly mentioned in the article.

------
jessaustin
_This simple mathematical logic can account for brain computation across the
entire evolutionary spectrum, ranging from the simplest neural networks to the
most complex._

That is, further research will investigate how e.g. a mouse turns in a
particular direction in a maze before investigating how high-performing humans
perform amazing mental feats. It may all be the same at some level, but we
have to walk before we can run.

------
goldenkek
Some questions:

We act like stimuli are exact things, but the brain is all about
abstraction. In fact, a certain amount of entropy/information enters
your brain every day from vision, smell, touch, etc. It's not stored in
full fidelity; there aren't enough neurons or energy for that. So what
is lost? Well... whatever isn't abstractable, right? You celebrate your
birthday. You remember blowing out the candles in pristine vision. But
how much of that 'movie reel' inside your head is actually a
superposition of abstracted reality vs. actually observed reality?

All I'm asking is: is it even worth it to think of a human being in
terms of absolute information when their memories are so sparse relative
to reality? We remember what's important. And the things that are
important, we've remembered to remember. But ultimately, the actual
holding power of the brain is quite small. The brain is a master of deep
abstraction from sparsity.

Is it like a quantum system, where the superposition values of one
subcortical system are flattened by another? So that a whole-brain
belief/memory is a function of the fuzziness of each subsystem being
exacted/wave-function-collapsed by the heuristic
correlations/connections between all the shitty data? I think we need to
start thinking about sparsity. The universe is sparse when it comes to
using 3D space. The holographic principle says physics is so symmetrical
that we use 3D space sparsely, such that our entire universe could take
place in 2D space. Max Tegmark gave a talk about how deep learning works
so well because of the sparsity of faces or music or voices, the
overwhelming symmetry/redundancy in physics and nature. And,
transitively, humans.

~~~
dualogy
> _So what is lost? Well... whatever isn't abstractable, right?_

Raises a most curious question: what ever _isn't_ abstractable?

~~~
visarga
We're just systems that transport our genes into the next generation.
Our perception exists mostly to help us find food, avoid danger, and
reproduce. Of course we lose a lot of information, because it is
irrelevant to us.

~~~
goldenkek
Fortunately, evolution selected for intellect even though it might
initially have been past a minimum fitness/energy valley. I'd encourage
you to rethink this simplistic view, which ignores the general tool that
developed to handle random and abstract threats rather than only
specific ones.

~~~
fpoling
There is no proof that a smarter brain was what evolution targeted. The
size of the human brain could have been either a purely accidental
consequence of other factors, or at least driven by needs like balancing
when walking/running on two legs.

~~~
berntb
A simple reality check:

Considering the large energy needs of the brain, there is an evolutionary
pressure to make it smaller... so something pushed it the other way.

~~~
fpoling
Compared with apes, humans are much more efficient at digesting starch.
However, to get enough protein and micronutrients from starchy root
vegetables one has to consume far more calories. That effectively
provided "free" energy to keep a bigger brain. Of course this is rather
speculative, but it just emphasises that we really do not know why
humans developed bigger brains.

~~~
VLM
Another abundant energy source is fish protein, which always existed,
but our ancestor species is the only one that went nuts over fishing.
It's pretty easy for hominids, or scavengers in general, to get fish
protein at a net loss, or in very small quantities, but once the brain
was big enough, our species really got into eating fish, starting about
100K years ago.

Apparently there is evidence you need a pretty big brain and speech
abilities to run a net-energy-positive fishing village. A Neanderthal
should be able to catch fish, just not efficiently enough to live off a
fishing culture. Their trash piles have all kinds of bones, just not
fish bones. Ours are/were full of fish bones, anywhere there's fish.

Fishing might have been "the niche" that led to our species... starting
with older hominids who were better than average at gathering fish due
to some local geologic peculiarity (the perfect river to occasionally
catch salmon by hand, or whatever); a zillion generations later of
ever-improving, therefore ever-fatter, fishermen, and we got a
prehistoric Captain Ahab filling the tribe's trash piles with fish
bones, while the non-Homo-sapiens cousins, unable to fish as well as us,
die off because we're fat from fish and they aren't.

Large amounts of starch would seem to require agriculture, which is
pretty recent compared to fishing.

~~~
dualogy
@VLM, ice-age ancestors needed big game. Small fish are a joke for
sustained energy and nourishment, maybe a condiment to the main meal of
mammoth or aurochs.

Our "ancestors" _really_ thrived, for the most part, on the very
mega-fauna we helped extinguish.

> Their trash piles have all kinds of bones, just not fish bones. Ours
> are/were full of fish bones, anywhere there's fish.

Your sources seem to differ wildly from my sources regarding the bulk of
ancestral Homo sapiens evolution during the ice ages. Feel free to share
them.

------
pmontra
Why that PDF in the title? The page is a long HTML article.

