
How to build a simple neural network in 9 lines of Python code - bryanrasmussen
https://medium.com/technology-invention-and-more/how-to-build-a-simple-neural-network-in-9-lines-of-python-code-cc8f23647ca1
======
matt_wulfeck
How to draw an owl in two lines of code:

    
    
        import owl
        owl.draw()

~~~
ggambetta
I chuckled, but I don't think the criticism is fair. The imports are for
generic math tools that aren't specific to neural networks, the most complex
one being a dot product, so I'd say there's no trickery here.
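
For context, a minimal sketch of what those imports look like in use. The
exact names are reconstructed from the linked article, so treat them as an
assumption rather than a quote:

    
    
        # Generic math tools, not NN-specific: names assumed from the
        # linked article, not quoted from this thread.
        from numpy import array, dot, exp
        
        weights = array([0.5, -0.2, 0.1])
        inputs = array([1.0, 0.0, 1.0])
        
        # The most complex piece is just a dot product...
        z = dot(inputs, weights)
        # ...plus an elementwise sigmoid built from exp.
        print(1 / (1 + exp(-z)))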

------
hprotagonist
>The human brain consists of 100 billion cells called neurons, connected
together by synapses. If sufficient synaptic inputs to a neuron fire, that
neuron will also fire. We call this process “thinking”.

86 billion neurons, each with many thousands of synapses.

If enough action potentials from presynaptic neurons arrive within a short
enough window of time, the postsynaptic neuron will _probably_ also fire.

We do not call this process "thinking".

------
bllguo
Yeah, in general I find that cleaning the data, doing exploratory analysis,
doing sensible feature engineering, etc. are much more involved tasks than
running ML algorithm XYZ.
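
A toy sketch of what I mean (made-up data and column names; assumes pandas
and scikit-learn):

    
    
        import pandas as pd
        from sklearn.linear_model import LinearRegression
        
        # Toy frame with the usual messiness: missing values, strings
        # where numbers should be. All names here are made up.
        df = pd.DataFrame({
            "age": [25, None, 40, 31],
            "income": ["50k", "62k", None, "58k"],
            "target": [0.2, 0.5, 0.9, 0.4],
        })
        
        # Cleaning and feature engineering: the involved part.
        df["age"] = df["age"].fillna(df["age"].median())
        df["income"] = (df["income"].str.rstrip("k")
                        .astype(float).fillna(0.0) * 1000)
        df["income_per_year"] = df["income"] / df["age"]
        
        # Running "ML algorithm XYZ": a single line.
        model = LinearRegression().fit(
            df[["age", "income", "income_per_year"]], df["target"])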

------
gravypod
I've seen many fixed-input posts like this. I've been looking for something
that will let me train a model that takes a varying number of inputs and
produces a single output.

So, for instance:

    
    
          [list of 10 items] -> Some Number
          [list of 500 items] -> Some Number
    
    

These items are not reducible to a single scalar value. They have too many
widely varying meanings.

Can anyone point me in the right direction?

~~~
michaf
I am not an expert, but maybe look into recurrent neural networks [0]? These
things can turn a sequence of inputs of possibly arbitrary length into a
fixed-size internal representation, and can output a single value derived from
this
representation.

[0]
[https://en.wikipedia.org/wiki/Recurrent_neural_network](https://en.wikipedia.org/wiki/Recurrent_neural_network)
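
A minimal, untrained sketch of the idea in pure numpy (weights are random;
this only shows the shape of the computation, not a trained model):

    
    
        import numpy as np
        
        rng = np.random.default_rng(0)
        hidden_size = 8
        
        W_in = rng.normal(size=(hidden_size, 1))           # input -> hidden
        W_h = rng.normal(size=(hidden_size, hidden_size))  # hidden -> hidden
        w_out = rng.normal(size=(hidden_size,))            # hidden -> scalar
        
        def rnn_to_scalar(items):
            h = np.zeros(hidden_size)       # fixed-size internal state
            for x in items:                 # one step per item, any length
                h = np.tanh(W_in @ np.array([x]) + W_h @ h)
            return w_out @ h                # single number out
        
        print(rnn_to_scalar([0.1, 0.5, 0.9]))         # 3 inputs -> one number
        print(rnn_to_scalar(np.linspace(0, 1, 500)))  # 500 inputs -> one number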

~~~
anon1253
Most implementations are not arbitrary-length, but instead rely on padding to
make the input a fixed-length sequence.
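
For example, a batch is typically padded to one fixed length, roughly like
this (illustrative sketch, not from any particular library):

    
    
        import numpy as np
        
        def pad_batch(sequences, max_len, pad_value=0.0):
            # Right-pad (or truncate) every sequence to max_len so the
            # batch fits in one rectangular array.
            batch = np.full((len(sequences), max_len), pad_value)
            for i, seq in enumerate(sequences):
                seq = seq[:max_len]
                batch[i, :len(seq)] = seq
            return batch
        
        print(pad_batch([[1, 2], [3, 4, 5]], max_len=4))
        # [[1. 2. 0. 0.]
        #  [3. 4. 5. 0.]]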

------
harel
I've been meaning to do a similar exercise, and this one is very helpful.
Thanks.

------
rsrsrs86
What he calls back propagation is actually gradient descent.

~~~
majewsky
Backpropagation (as in, the training method for NNs) is an instance of
gradient descent. There are many other instances (e.g. any EM algorithm does
gradient descent, as does minimum search on a simple real function), so using
the more specific term is appropriate.
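
To make the "simple real function" case concrete: minimizing f(x) = (x - 3)^2
by repeatedly stepping against its gradient f'(x) = 2(x - 3) is gradient
descent too, with no network or backpropagation in sight:

    
    
        # Gradient descent on f(x) = (x - 3)**2; f'(x) = 2 * (x - 3).
        x, lr = 0.0, 0.1
        for _ in range(100):
            x -= lr * 2 * (x - 3)
        print(x)  # converges toward the minimum at x = 3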

------
vonnik
For the 400th time.

