Ask HN: What's the 'Hello World' program of neural networks? - essofluffy
======
mimo777
MNIST, the handwritten digit dataset. Download TensorFlow and work through the
examples; you will find it there.

------
wayn3
TensorFlow has a whole bunch of tutorials, but those are the "Hello World"s of
TensorFlow, not of neural networks.

To get started with neural networks, begin by drawing simple neural nets for
basic operations like addition, multiplication, and XOR. Just represent
Boolean truth tables as neural networks.

Once you can do that, move on to implementing the algorithm yourself. A simple
three-layer network is enough to understand how the concept works; 4/2/2 nodes
is plenty. Just understand how the calculations work.
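One way to do the "represent Boolean tables" step: a minimal hand-wired sketch of XOR as a network. The 2/2/1 shape, step activations, and weights here are my own choices for illustration, not anything canonical (the 4/2/2 shape from above works just as well):

```python
import numpy as np

def step(z):
    # Heaviside step activation: 1 if z > 0, else 0
    return (z > 0).astype(int)

# Hidden layer: first unit computes OR, second computes NAND
W1 = np.array([[ 1.0,  1.0],
               [-1.0, -1.0]])
b1 = np.array([-0.5, 1.5])

# Output unit computes AND of the two hidden units:
# XOR(a, b) = AND(OR(a, b), NAND(a, b))
W2 = np.array([1.0, 1.0])
b2 = -1.5

def xor_net(x):
    h = step(W1 @ x + b1)
    return int(step(W2 @ h + b2))

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", xor_net(np.array(x)))  # prints 0, 1, 1, 0
```

No training involved; the point is just to see that a truth table fits into the weights-and-activations picture.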

Then move on to a framework, but only after you understand the math. Andrew
Ng's machine learning course on Coursera explains the algorithms.

------
abee1331
XOR for fully connected networks, and MNIST for convolutional networks.

------
argonaut
Make sure to implement backprop yourself: don't use TensorFlow, just use NumPy
and write out the matrix multiplications yourself.
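A minimal sketch of that in NumPy, training a tiny sigmoid network on XOR with hand-written backprop (layer sizes, learning rate, and iteration count are arbitrary choices, not a recommendation):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

lr = 1.0
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # predictions, shape (4, 1)

    # Backward pass: gradients of mean squared error,
    # using sigmoid'(z) = s * (1 - s)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out).ravel())
```

With this setup the rounded predictions typically converge to [0, 1, 1, 0]; once the two chained applications of the chain rule (`d_out`, `d_h`) make sense, deeper networks are just more of the same.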

~~~
mimo777
Not bad advice, but to get a feel before you dive into the deep end, use the
libraries first and then, by all means, follow this advice. You should always
know what underlies the functionality a library provides, but don't hesitate
to start simple. I remember reading "C++ Neural Networks and Fuzzy Logic" at
16 and being intimidated because I had to develop and understand the code
without any background in statistics, calculus, or linear algebra. Don't let
it intimidate you.

~~~
argonaut
If backprop is "the deep end," then I really don't know what you're doing...

------
malux85
Probably the XOR dataset.

------
Wonnk13
Not trying to be a smartass, but if "Hello, World" is the most basic program,
why wouldn't a single-layer perceptron be what the OP is looking for?

~~~
wayn3
Because that would be the equivalent of telling someone to write "Hello World"
on a piece of paper in order to show them how programming works.

A single layer isn't a "neural network". The difference between logistic
regression and a really simple neural network is the ability to behave in a
non-linear way.

A single layer amounts to a matrix multiplication. Matrices are linear
operators.
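
That claim is easy to check numerically: stacking bias-free linear layers collapses into a single matrix multiply, which is exactly why a nonlinearity between layers is what makes extra depth do anything. A quick sketch:

```python
import numpy as np

rng = np.random.default_rng(42)
W1 = rng.normal(size=(3, 5))  # "layer 1" weights
W2 = rng.normal(size=(2, 3))  # "layer 2" weights
x = rng.normal(size=5)

# Two stacked linear layers...
deep = W2 @ (W1 @ x)
# ...are the same map as one layer with the combined matrix W2 @ W1
shallow = (W2 @ W1) @ x

print(np.allclose(deep, shallow))  # True
```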

------
billconan
1\. Using a simple NN to do the XOR bit operation

2\. MNIST handwritten digit recognition

------
max_
XOR

------
imakesnowflakes
I once implemented a simple backpropagation algorithm in Haskell (without any
libraries) that could identify the pattern (one among 'A', 'B', 'C', or 'D')
represented on an 8x8 matrix...

Here is the code..

[https://bitbucket.org/sras/haskell-stuff/src/b58f3fc017ce303...](https://bitbucket.org/sras/haskell-stuff/src/b58f3fc017ce303fd9733184f599e916739f6ed2/snn.hs?at=default&fileviewer=file-view-default)

