
PyCNN: Cellular Neural Networks Image Processing Python Library - bane
https://github.com/ankitaggarwal011/PyCNN
======
iraphael
I'm reading up on Cellular Neural Networks and it seems like they are simple
convolutions whose kernel you can specify. In fact, this library just calls
scipy.signal.convolve2d() with different kernels [2].

The research (from the original 1988 paper [1]) apparently tried tackling the
signal processing problem from a different angle (namely, having connected
neighboring 'cells'), but the end result is the same as what we know today. I
actually wouldn't be surprised if CNNs were a precursor to today's ConvNets in
some way.

TLDR: these 'CNN's are convolutions. They aren't ConvNets because they don't
have pooling or fully connected layers.

[1]
[http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7600](http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7600)
[2]
[https://github.com/ankitaggarwal011/PyCNN/blob/master/cnnimg...](https://github.com/ankitaggarwal011/PyCNN/blob/master/cnnimg.py#L68)
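To make the claim concrete, here is a hedged sketch of what "a convolution with a specified kernel" looks like in scipy. The Laplacian edge-detection kernel below is purely illustrative and is not claimed to be one of PyCNN's actual templates:

```python
# Illustrative only: a 2D convolution with a hand-picked kernel, the
# operation the comment says underlies this library. The Laplacian kernel
# is an example, not necessarily a PyCNN template.
import numpy as np
from scipy.signal import convolve2d

image = np.zeros((8, 8))
image[2:6, 2:6] = 1.0  # white square on a black background

edge_kernel = np.array([[ 0, -1,  0],
                        [-1,  4, -1],
                        [ 0, -1,  0]], dtype=float)

# The kernel is symmetric, so convolution and correlation coincide here.
edges = convolve2d(image, edge_kernel, mode="same", boundary="fill")
# Flat interior pixels convolve to 0; pixels on the square's border do not.
```

Swapping the kernel swaps the effect (blur, sharpen, edge detect), which is the sense in which these templates are "just" configurable convolutions.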

~~~
aaggarwal
Hi, author here. Even though I was on HN, I didn't realize this had been
posted here :) I'm glad to see all the feedback. Thank you.

> I'm reading up on Cellular Neural Networks and it seems like they are simple
> convolutions, of which you can specify the kernel.

Actually, it's more than that. Simply put, cellular neural networks are a
parallel computing paradigm similar to neural networks, with the difference
that communication is allowed only between neighboring units [1].

> In fact, this library is just calling scipy.signal.convolve2d() with
> different kernels.

The part you're referring to performs the convolution between the kernel
function and the feedback template to get the result of the feedback loop.
Note that the kernel function is a sigmoid (or an approximation of one) and
remains unchanged.

It is easier to understand if you visualize it as a control system, as shown
in [2], with a feedback template and a control template. These templates
(coefficients) are configurable, and different configurations produce
different results.
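As a rough sketch of the control-system view: in the standard Chua–Yang formulation, each cell's state evolves as dx/dt = -x + A ⊛ f(x) + B ⊛ u + z, where A is the feedback template, B the control template, z a bias, and f the piecewise-linear "sigmoid" output. The template values and step count below are placeholders, not PyCNN's actual configuration:

```python
# Hedged sketch of one Euler step of the Chua-Yang CNN dynamics:
# feedback template A acts on the output f(x), control template B acts on
# the input u, plus a bias z. Templates here are placeholders.
import numpy as np
from scipy.signal import convolve2d

def output(x):
    # Piecewise-linear output function from the original CNN formulation:
    # f(x) = 0.5 * (|x + 1| - |x - 1|), which saturates at -1 and +1.
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def cnn_step(x, u, A, B, z, dt=0.1):
    # dx/dt = -x + A * f(x) + B * u + z   (state equation, Euler-integrated)
    y = output(x)
    dx = -x + convolve2d(y, A, mode="same") + convolve2d(u, B, mode="same") + z
    return x + dt * dx

# Toy 4x4 example with placeholder templates.
u = np.random.rand(4, 4)          # input image (control input)
x = np.zeros_like(u)              # initial state
A = np.array([[0, 0, 0], [0, 2, 0], [0, 0, 0]], float)  # feedback template
B = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float)  # control template
z = -0.5                          # bias

for _ in range(50):
    x = cnn_step(x, u, A, B, z)
```

The point of the sketch is that the convolve2d calls are only one piece: the feedback loop and the saturating output are what make the system dynamical rather than a single convolution pass.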

One of the applications of these networks is image processing. As stated in
[3]: "CNN processors were designed to perform image processing; specifically,
the original application of CNN processors was to perform real-time
ultra-high frame-rate (>10,000 frame/s) processing unachievable by digital
processors needed for applications like particle detection in jet engine
fluids and spark-plug detection."

[1]
[https://en.wikipedia.org/wiki/Cellular_neural_network](https://en.wikipedia.org/wiki/Cellular_neural_network)

[2]
[http://www.isiweb.ee.ethz.ch/haenggi/CNN_web/CNN_figures/blo...](http://www.isiweb.ee.ethz.ch/haenggi/CNN_web/CNN_figures/blockdiagram.gif)

[3]
[https://en.wikipedia.org/wiki/Cellular_neural_network#Applic...](https://en.wikipedia.org/wiki/Cellular_neural_network#Applications)

~~~
frikk
Hi there. I'm completely unfamiliar with CNNs. Do they relate to cellular
automata in any way other than sharing part of a name (and, of course, the
fact that both communicate only with neighbors)? Are there patterns of
emergent behavior in the classical sense, or are they more closely related to
neural networks?

~~~
aaggarwal
Nice catch! It is indeed closely related to cellular automata. CNN processors
can be thought of as a hybrid between ANN (artificial neural networks) and CA
(cellular automata) [1]. Like neural networks, they are large-scale nonlinear
analog circuits that process signals in real time. Like cellular automata,
they consist of a massive aggregate of regularly spaced circuit clones,
called cells, which communicate with each other directly only through their
nearest neighbors [2].

The topology and dynamics of CNN processors closely resemble those of CA.
Like most CNN processors, CA consist of a fixed number of identical
processors that are spatially discrete and topologically uniform. The
difference is that most CNN processors are continuous-valued, whereas CA are
discrete-valued [1].
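The contrast can be illustrated with a toy comparison, assuming nothing about PyCNN itself: both updates below read the same 3x3 neighborhood sum, but the CA (Conway's Life, used here only as a stand-in) carries binary states while the CNN-style cell carries a continuous value:

```python
# Hedged illustration of the CA/CNN contrast: same neighborhood coupling,
# discrete-valued vs continuous-valued state. Conway's Life rules stand in
# for "a" cellular automaton; neither update is part of PyCNN.
import numpy as np
from scipy.signal import convolve2d

neighbors = np.ones((3, 3))
neighbors[1, 1] = 0  # kernel that counts the 8 nearest neighbors

def life_step(grid):
    # Discrete-valued CA update: every cell is 0 or 1.
    n = convolve2d(grid, neighbors, mode="same")
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

def cnn_like_step(x, dt=0.1):
    # Continuous-valued update driven by the same neighborhood sum.
    n = convolve2d(np.tanh(x), neighbors, mode="same")
    return x + dt * (-x + 0.2 * n)

blinker = np.zeros((5, 5), int)
blinker[2, 1:4] = 1  # horizontal 3-cell bar; Life flips it to vertical
```

One step of `life_step(blinker)` rotates the bar from horizontal to vertical, the classic "blinker" oscillation; the continuous update, by contrast, relaxes smoothly rather than jumping between configurations.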

You can also read this:
[http://www.nims.go.jp/nanophys6/Anirban%20Bandyopadhyay/site...](http://www.nims.go.jp/nanophys6/Anirban%20Bandyopadhyay/site/CellularAutomataCA.htm)

[1]
[https://en.wikipedia.org/wiki/Cellular_neural_network#Relate...](https://en.wikipedia.org/wiki/Cellular_neural_network#Related_processing_architectures)

[2]
[http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7600](http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7600)

------
dancsi
The name CNN is quite unfortunate, as it is most often used for convolutional
neural networks.

~~~
wodenokoto
Speaking of unfortunate naming, there already exists a neural network package
called PyCNN, which binds Python to the neural network library CNN, named so
because it is a neural network library written in C++.

[https://github.com/clab/cnn](https://github.com/clab/cnn)

------
anc84
Those examples look like standard image processing to me. Could someone
explain why a neural network is useful for them?

~~~
aaggarwal
Please take a look at the image processing specific applications [1] and
advantages of the cellular neural networks.

[1]
[https://en.wikipedia.org/wiki/Cellular_neural_network#Applic...](https://en.wikipedia.org/wiki/Cellular_neural_network#Applications)

~~~
bobosha
the tl;dr version is: cnns, or cellnets (how's that for a name), offer
speedier image processing with lower computational costs?

am i right?

~~~
aaggarwal
Yes, this is one of the applications of these networks.

~~~
bobosha
> Yes, this is one of the applications of these networks.

Thanks, would you mind elaborating on the other advantages? I did read the
Wikipedia link and the other links you posted, but it's not entirely clear
what other benefits exist. May I suggest writing a blog post about what you
perceive to be the benefits of CNNs (cellular networks) vis-a-vis the other
CNNs?

------
blennon
For those interested in studying the dynamics of networks with
lateral/recurrent connections, Stephen Grossberg pretty much wrote the book on
mathematically analyzing these systems:
[http://www.scholarpedia.org/article/Recurrent_neural_network...](http://www.scholarpedia.org/article/Recurrent_neural_networks).

A number of really interesting properties emerge like automatic gain control
and contrast enhancement when you include network properties similar to what
is seen in the brain.

------
Noctem
Hmm, I wonder if this could be adapted to be a good VapourSynth filter.

------
nzjrs
Making a play for the CNN TLA I see...

