
A blog I started on Neural Networks and Probability - jsinai
https://jontysinai.github.io
======
visarga
You know what would be great for learning probability? A set of easy problems
(graded in difficulty) and their answers. There are many introductions in
text, but the best presentation would be one where you also play with the
notions, and initially this must be easy enough.

It's like the difference between supervised learning and RL. In one you just
get a dataset; in the other you get a playground - a dynamic dataset where you
can explore and test hypotheses.

Selecting the problems requires real flair: it's one thing to understand the
subject, and quite another to understand the usual pitfalls and stumbling
blocks of students learning it.

~~~
196883
I think somebody should make this for many subjects in math

~~~
aisofteng
That's what textbooks are.

------
partycoder
Regarding the biology of neurons, it is important to know:

1) that neurons are a family of cells rather than a uniform, homogeneous
thing.

2) initially it was believed that dendrites only propagated spikes. now it is
accepted that dendrites generate spikes of their own.
[https://en.wikipedia.org/wiki/Dendritic_spike](https://en.wikipedia.org/wiki/Dendritic_spike)

3) the concept of threshold has been challenged. eugene izhikevich for example
presents some counterexamples to the idea of the threshold (where a spike was
expected and doesn't happen, or where a spike was not expected and happens),
and provides an alternative explanation involving dynamical systems.

so, while useful, it is important to understand that most artificial neural
models are simplified abstractions. eventually we will figure out why biology
is doing what it is doing. e.g: how much of the biology has to do with
information processing, how much of it has to do with keeping neurons alive,
and how much of it has to do with passing information around.
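
The dynamical-systems view mentioned above is easy to play with: Izhikevich's
published two-variable model replaces a fixed firing threshold with the
dynamics of a membrane potential v and a recovery variable u. A minimal sketch
(my own Euler-integration toy, not code from the blog; parameter values are the
standard "regular spiking" ones from Izhikevich's papers):

```python
def izhikevich(I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, steps=400):
    """Simulate one Izhikevich neuron under constant input current I.

    Returns the membrane-potential trace (one value per step) and a
    list of spike times (in ms). Instead of a fixed threshold, a spike
    is the trajectory escaping to v >= 30 mV, after which v and u are
    reset -- the 'dynamical systems' picture described above.
    """
    v, u = c, b * c          # membrane potential (mV) and recovery variable
    trace, spikes = [], []
    for t in range(steps):
        if v >= 30.0:        # spike detected: reset, don't clip
            spikes.append(t * dt)
            v, u = c, u + d
        # forward-Euler step of the two coupled ODEs
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        trace.append(v)
    return trace, spikes
```

With I = 10 the neuron fires tonically; with I = 0 the trajectory settles into
a stable fixed point near -70 mV and never spikes, even though no explicit
threshold value appears anywhere in the update rule.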

~~~
aisofteng
What do you mean when you say a neuron is a family of cells?

~~~
partycoder
e.g: a pyramidal cell is different from a purkinje cell.

~~~
aisofteng
Ah - it sounded like you meant that a neuron consists of more than one cell,
not that there are different types of neurons.

------
syphilis2
I'm enjoying the Probability for Everyone post. I'm also quickly reminded how
complex the notation used in Probability Theory is. Lowercase, uppercase,
script uppercase, double-struck uppercase, and Greek uppercase letters all have
specific meanings (sometimes multiple meanings), and they get held together
with the set theory symbols. It's even worse since some people use different
characters for inconsequential things, and then there are situations (online
forums) where the notation can't be used and substitutes are made. I even
recall some books using bold letters, which made note-taking a real hassle.

~~~
T_D_K
Well, all of that is really just Hungarian Notation [0] with typesetting
rather than extra characters, so it's not 100% necessary to fully understand
-- just look at the definition again. The exception is standard sets, like the
Reals or Natural Numbers etc. (which use the double-struck letters), but
that's pretty standardized.

[0]:
[https://en.wikipedia.org/wiki/Hungarian_notation](https://en.wikipedia.org/wiki/Hungarian_notation)

