
Build a Neural Network - shamdasani
https://enlight.nyc/projects/neural-network
======
fartcannon
As someone who has read a lot of "implement a neural network" articles, the
massive problem with all of them is that they import numpy. You may think it's
silly to reimplement the matrix math, but without that part of the code, you
can't easily port it to other
languages/microcontrollers/microwaves/badgers.

It's a legitimately valid part of machine learning, and it's not easy for
novices to do.

And I need help putting it on my badger damn it!
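The matrix math in question really is portable in the way described: a
dependency-free multiply in plain Python translates almost line for line to C
or a microcontroller. A minimal sketch (function names here are illustrative):

```python
# Minimal pure-Python matrix multiply -- no numpy, so it runs anywhere
# Python does and ports easily to other languages.
def matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    assert len(a[0]) == inner, "inner dimensions must match"
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

# 2x3 times 3x2 -> 2x2
print(matmul([[1, 2, 3], [4, 5, 6]], [[7, 8], [9, 10], [11, 12]]))
# -> [[58, 64], [139, 154]]
```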

~~~
Anon84
As someone who teaches tutorials as a side gig, I would argue that
implementing matrix operations in a tutorial on neural networks is overkill.
No matter the level of the tutorial, you always need to draw a line and
assume a certain amount of background knowledge, and knowing how to use
standard tools isn't too much to ask. (Yes, I know numpy isn't part of
Python's standard library, but it comes with pretty much any Python
distribution, as many other libraries depend on it.)

If we're talking about a longer format, such as a book, then we might consider
digging deeper and implementing as much as possible using the barest of Python
requirements. Indeed, Joel Grus does implement _everything_ from scratch in
his great (although a bit dated) book [https://www.amazon.com/Data-Science-
Scratch-Principles-Pytho...](https://www.amazon.com/Data-Science-Scratch-
Principles-Python/dp/149190142X).

EDIT: This is still a work in progress (and relies on numpy and matplotlib),
but here is my version:
[https://github.com/DataForScience/DeepLearning](https://github.com/DataForScience/DeepLearning)
These notebooks are meant as support for a webinar, so they might not be the
clearest as standalone material, but the slides are there as well.

~~~
HuShifang
A new edition of Grus comes out next week actually...

[https://www.amazon.com/Data-Science-Scratch-Principles-
Pytho...](https://www.amazon.com/Data-Science-Scratch-Principles-
Python/dp/1492041130)

~~~
Anon84
Nice! He mentioned he was working on it when I met him at Strata last year,
but I didn't know it was coming out already.

------
cwt137
If you think this blog article is lacking, get "Make Your Own Neural Network"
by Tariq Rashid[1]. It is way more comprehensive, but still easy to follow,
and it also uses Python to build a NN from scratch.

1\. [https://www.amazon.com/Make-Your-Own-Neural-
Network/dp/15308...](https://www.amazon.com/Make-Your-Own-Neural-
Network/dp/1530826608)

~~~
asdfman123
Also, Andrew Ng's course on Coursera is free if you want to really learn it
and have a few weeks to throw at it.

~~~
cr0sh
I second this suggestion; I took that course when it was called "ML Class"
during the Fall of 2011 (yep, I was one of the guinea pigs for what became one
of the first courses of Coursera). It was an excellent course.

Here's an example of what one student of the ML Class built, after being
inspired by what he was learning and videos that played during the course:

[https://blog.davidsingleton.org/nnrccar/](https://blog.davidsingleton.org/nnrccar/)

It kinda shocked me at the time, because I knew quite a bit about ALVINN from
books and articles I had read as a teenager in the 80s and 90s. This guy had
created the same thing using a cell phone and a cheap RC vehicle! Ok, there
was also an Arduino and computer involved - but it really hit home the fact
that technology around neural networks had advanced quite a bit!

I also took the other course, "AI Class", but due to personal issues I had to
drop out about halfway through.

The next year, after Udacity started, they introduced a course similar to AI
Class called "How to Build Your Own Self-Driving Vehicle" (it goes by a
different name today - something like "Robotics and Artificial Intelligence
302").

That class was done in Python, and taught me even more about AI/ML - with a
focus towards self-driving vehicles of course. Things I learned about that I
struggled with or had no real concepts of before:

1\. SLAM (Simultaneous Localization and Mapping)

2\. Path-finding algorithms (A* and the like)

3\. Kalman filtering (what it is for, how it works)

4\. The PID algorithm (how to implement and tune it)

5\. More neural network stuff...

...and many other things. Another excellent, free course to take if you're
interested in learning this stuff.
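As a taste of item 4 in that list, a textbook discrete PID controller fits in
a few lines of Python. This is a generic sketch, not code from the course;
the gains and the toy plant model are made up for illustration:

```python
# Textbook discrete PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt.
class PID:
    def __init__(self, kp, ki, kd, dt=0.1):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Drive a toy 1-D "vehicle" toward position 10 (crude plant model).
pid = PID(kp=1.0, ki=0.1, kd=0.05)
position = 0.0
for _ in range(50):
    position += 0.1 * pid.update(10.0 - position)
print(round(position, 2))  # settles near the target of 10
```

Tuning the three gains against each other is most of the work in practice,
which is presumably why the course spent time on it.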

~~~
JabavuAdams
I second those recommendations. I took the same classes. While Thrun and
Norvig's AI class had some neat teaching / quiz tools, I found that Andrew Ng
was a much better teacher. Very thorough and clear. Thrun and Norvig felt
rushed and seemed to assume a lot when asking questions.

------
melling
Here's another Neural Network from scratch that I found useful:

[https://victorzhou.com/blog/intro-to-neural-
networks/](https://victorzhou.com/blog/intro-to-neural-networks/)

~~~
samsonradu
Thanks a lot for this, it is indeed very clear and easy to follow! Good
walkthrough of the partial-derivative calculations, which IMO are the hardest
part.
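What makes those derivative calculations tractable is that sigmoid's
derivative has a closed form in terms of its own output, the identity
sigma'(x) = sigma(x) * (1 - sigma(x)). A quick sketch (generic, not code from
the linked post):

```python
import math

# Sigmoid and its derivative; the s*(1-s) form is what keeps the
# chain-rule algebra in these walkthroughs manageable.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# Sanity check against a central-difference numerical derivative.
h = 1e-6
numeric = (sigmoid(1.0 + h) - sigmoid(1.0 - h)) / (2 * h)
print(abs(sigmoid_prime(1.0) - numeric) < 1e-6)  # True
```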

------
jorgeleo
This tutorial explained it to me at exactly the right level of detail:

[https://mattmazur.com/2015/03/17/a-step-by-step-
backpropagat...](https://mattmazur.com/2015/03/17/a-step-by-step-
backpropagation-example/)

It was detailed enough for me to do all the calculations in an Excel
workbook - one complete cycle (forward, backward, and forward again with the
learned weights):

[https://1drv.ms/x/s!Ar06sKFtc9d7goR5WQLo-
RkB0XvWAA](https://1drv.ms/x/s!Ar06sKFtc9d7goR5WQLo-RkB0XvWAA)

That let me play with the values and factors to better understand how they
impact the network as a whole.
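One such cycle can also be sketched in plain Python. The shape here mirrors
the spreadsheet exercise (a 2-2-2 network with sigmoid activations and
squared error, no biases), but the weights and targets below are placeholder
values, not necessarily the post's numbers:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_hid, w_out):
    h = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hid]
    o = [sigmoid(sum(wi * hi for wi, hi in zip(row, h))) for row in w_out]
    return h, o

x, target = [0.05, 0.10], [0.01, 0.99]
w_hid = [[0.15, 0.20], [0.25, 0.30]]   # hidden-layer weights (2x2)
w_out = [[0.40, 0.45], [0.50, 0.55]]   # output-layer weights (2x2)
lr = 0.5

h, o = forward(x, w_hid, w_out)
err_before = sum(0.5 * (t - oi) ** 2 for t, oi in zip(target, o))

# Backward pass: delta = dE/dnet per unit, then one gradient step.
d_out = [(oi - t) * oi * (1 - oi) for oi, t in zip(o, target)]
d_hid = [hi * (1 - hi) * sum(d_out[k] * w_out[k][j] for k in range(2))
         for j, hi in enumerate(h)]
w_out = [[w - lr * d_out[k] * h[j] for j, w in enumerate(row)]
         for k, row in enumerate(w_out)]
w_hid = [[w - lr * d_hid[j] * x[i] for i, w in enumerate(row)]
         for j, row in enumerate(w_hid)]

# Forward again with the learned weights: the error should shrink.
_, o = forward(x, w_hid, w_out)
err_after = sum(0.5 * (t - oi) ** 2 for t, oi in zip(target, o))
print(err_after < err_before)  # True
```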

~~~
inertiatic
Having spent a lot of time hunting for the best way to figure out backprop,
that is the best resource I've found and the one that finally made everything
I've read click.

------
markbnj
Seems like a good intro and I plan to work through it later. I've been
learning a lot from Michael Nielsen's book, available at
[http://neuralnetworksanddeeplearning.com/index.html](http://neuralnetworksanddeeplearning.com/index.html).
He doesn't shy away from the underlying math, and his appreciation for it
comes through in the writing. Even without a strong math background I was able
to punch through the notation and figure things out.

------
_jsdw
In case it helps, I also had a go at an introductory neural net tutorial which
I probably never shared anywhere:

[https://jsdw.me/posts/neural-nets/](https://jsdw.me/posts/neural-nets/)

I found that I had to read a bunch of these things to really grasp them
myself.

------
rrggrr
Would be great if this included real-world data or an application, to give
some context.

------
codesternews
Why no biases?
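For context on what a bias buys you: without one, a sigmoid neuron's
activation is pinned at 0.5 whenever its input is zero, no matter what the
weights are. A minimal illustration (the `neuron` helper is hypothetical, not
from the article):

```python
import math

# A single sigmoid neuron; the bias b shifts the activation curve
# left or right, which weights alone cannot do at x = 0.
def neuron(x, w, b=0.0):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

print(neuron(0.0, w=5.0))          # 0.5 -- stuck, regardless of w
print(neuron(0.0, w=5.0, b=2.0))   # bias moves the activation off 0.5
```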

