
A computational model of the Moth Olfactory Network learns to read MNIST [pdf] - higgsfield
https://openreview.net/pdf?id=HyYuqoCUz
======
carbocation
For some reason, Figure 2 doesn't continue out beyond the few-training-samples
regime. Therefore, I think we're left to assume that MothNet underperforms the
other techniques in the many-samples regime. Is there something I'm missing?

~~~
chestervonwinch
It is ambiguous. It's not clear whether they performed the experiments with
more than 20 samples/class.

~~~
cdelahunt
(paper author) You are correct that the 'natural' moth maxes out after about
20 samples/class. It is not yet clear whether this is an intrinsic limitation
of the architecture (the competitive pressure on an insect is for fast and
rough learning), or whether it is just an artifact of the parameters of the
natural moth. For example, slowing the Hebbian growth parameters would allow
the system to respond to more training samples, which should give better test-
set accuracy. We're still running experiments.
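
To make "slowing the Hebbian growth parameters" concrete: here is a minimal
toy sketch (not the actual MothNet code; growth_rate and decay are
illustrative names) of a Hebbian update where a smaller growth rate saturates
the weights later, leaving headroom to keep learning from additional training
samples:

    import numpy as np

    def hebbian_update(W, pre, post, growth_rate=0.05, decay=0.01):
        # Hebb's rule: strengthen weights between co-active units.
        # A smaller growth_rate means each sample moves the weights less,
        # so the network takes longer to saturate.
        W = W + growth_rate * np.outer(post, pre)
        W = W - decay * W        # mild decay keeps weights bounded
        return W

    # toy usage: a 10 -> 5 projection trained on 20 random "samples"
    rng = np.random.default_rng(0)
    W = np.zeros((5, 10))
    for _ in range(20):
        pre, post = rng.random(10), rng.random(5)
        W = hebbian_update(W, pre, post)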

~~~
fpgaminer
It sounds like you ran experiments on the BNN with >20 samples/class. Why were
those data points not included in Figure 2?

------
lootsauce
Yet to read this paper, but I'm wondering if the authors are familiar with
the work of Dasgupta et al. on a fly olfactory model for locality-sensitive
hashing?

[https://www.biorxiv.org/content/biorxiv/early/2017/08/25/180...](https://www.biorxiv.org/content/biorxiv/early/2017/08/25/180471.full.pdf?%3Fcollection=)

I have been contemplating the relationship between random projections and
compressive sensing since reading it, and I'm curious to read this paper for
any insights on compressive sensing.
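
As I understand it, the core of that paper is a kind of inverted LSH: expand
the input into a much higher dimension with a sparse binary random
projection, then keep only the top-k most active units (winner-take-all) as a
sparse tag. A rough sketch of the idea, with made-up dimensions:

    import numpy as np

    rng = np.random.default_rng(0)
    d, m, k = 50, 2000, 32                       # input dim, expanded dim, tag sparsity
    # sparse binary projection: each expansion unit samples ~10% of the inputs
    proj = (rng.random((m, d)) < 0.1).astype(float)

    def fly_hash(x):
        activations = proj @ x                   # random expansion to m dims
        tag = np.zeros(m)
        tag[np.argsort(activations)[-k:]] = 1.0  # winner-take-all: keep the top k
        return tag                               # similar inputs share active units

    print(int(fly_hash(rng.random(d)).sum()))    # -> 32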

------
memebox3v
This is absolutely brilliant. I have been looking for a way into
understanding learning within biological neural nets. I don't suppose there
is source code around?

------
robinduckett
So are we learning that brains and neurons are general-purpose computational
goo that can be applied to many different areas of signal processing yet?

~~~
bbctol
Kind of feel like we already know that brains can do general purpose
computation...

------
fovc
Needs a [pdf] flag

~~~
Aardwolf
Why is it, by the way, that papers have the author names at the top but not
the date? Dates are added to papers in references, so why not the date of the
paper itself too?

This one happens to have "Workshop track - ICLR 2018" at the top, so it has
some dating, but most don't even have that.

~~~
tsomctl
That's the nice thing about arXiv. The first four digits of the paper's number
tell you the month and year it was first published.
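
For example, with a made-up identifier in the YYMM.NNNNN format:

    arxiv_id = "1802.99999"    # hypothetical identifier, just to show the prefix
    year, month = 2000 + int(arxiv_id[:2]), int(arxiv_id[2:4])
    print(year, month)         # -> 2018 2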

~~~
JadeNB
> The first four digits of the paper's number tell you the month and year it
> was first published.

I think "published" should be "submitted" there. (I suppose that one could
argue for regarding submission to the arXiv as publication, especially given
the presence of overlay journals—but probably that's not what you meant.)

