Hacker News
A blog I started on Neural Networks and Probability (jontysinai.github.io)
88 points by lhomdee on Nov 30, 2017 | 11 comments


You know what would be great for learning probability? A set of easy problems (graded in difficulty) and their answers. There are many written introductions, but the best presentation would be one where you also get to play with the notions, and initially it must be easy enough.

It's like the difference between supervised learning and RL. In one you just get a dataset; in the other you get a playground - a dynamic dataset where you can explore and test hypotheses.

Selecting the problems requires real flair: it's one thing to understand the subject, and quite another to understand the usual pitfalls and stumbling blocks of students learning it.
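As a hedged sketch of what such a "playground" might look like for one easy, classic graded problem (de Méré's question: the chance of at least one six in four rolls of a fair die - the problem choice and code here are illustrative, not from the post), you could let the student compare a Monte Carlo estimate against the exact answer:

```python
import random

random.seed(0)  # reproducible runs

def at_least_one_six(n_rolls=4, trials=100_000):
    """Monte Carlo estimate of P(at least one six in n_rolls of a fair die)."""
    hits = sum(
        any(random.randint(1, 6) == 6 for _ in range(n_rolls))
        for _ in range(trials)
    )
    return hits / trials

exact = 1 - (5 / 6) ** 4   # complement rule: ~0.5177
estimate = at_least_one_six()
print(round(exact, 4), round(estimate, 4))
```

The student can then "play": change `n_rolls`, change the target face, and watch the estimate track the closed-form answer.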


This might be of interest to you :) -- Probability Through Problems[http://www.springer.com/in/book/9780387950631]


I think somebody should make this for many subjects in math


That's what textbooks are.


Regarding the biology of neurons, it is important to know:

1) that neurons are a family of cells rather than a uniform, homogeneous thing.

2) initially it was believed that dendrites only propagated spikes. Now it is accepted that dendrites generate spikes of their own. https://en.wikipedia.org/wiki/Dendritic_spike

3) the concept of a threshold has been challenged. Eugene Izhikevich, for example, presents some counterexamples to the idea of a threshold (where a spike was expected and doesn't happen, or where a spike was not expected and does), and provides an alternative explanation involving dynamical systems.

So, while useful, it is important to understand that most artificial neural models are simplified abstractions. Eventually we will figure out why biology is doing what it is doing, e.g. how much of the biology has to do with information processing, how much with keeping neurons alive, and how much with passing information around.
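For the curious, Izhikevich's dynamical-systems view can be sketched in a few lines: his simple model replaces a fixed firing threshold with two coupled differential equations and a reset at the spike peak. A minimal Euler-integration sketch (parameter values are the standard "regular spiking" set from his 2003 paper; the input current `I` and step size are assumptions for illustration):

```python
# Izhikevich simple neuron model (2003), "regular spiking" parameters.
# v: membrane potential (mV), u: recovery variable.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
I = 10.0        # assumed constant input current
dt = 0.5        # ms, Euler step

v, u = -65.0, b * -65.0
spikes = []
for step in range(2000):            # simulate 1000 ms
    t = step * dt
    # quadratic voltage dynamics plus slow recovery - no explicit threshold
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                   # spike *peak*, then reset
        spikes.append(t)
        v, u = c, u + d

print(len(spikes))
```

The point of the model is that "firing" emerges from the trajectory of (v, u), not from v crossing a magic constant, which is how it accommodates the counterexamples mentioned above.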


What do you mean when you say a neuron is a family of cells?


e.g: a pyramidal cell is different from a purkinje cell.


Ah - it sounded like you meant that a neuron consists of more than one cell, not that there are different types of neurons.


I'm enjoying the Probability for Everyone post. I'm also quickly reminded how complex the notation used in probability theory is. Lowercase, uppercase, script uppercase, double-struck uppercase, and Greek uppercase all have specific meanings (sometimes multiple meanings), and they get held together with the set theory symbols. It's even worse because some people use different characters for inconsequential things; then there are situations (online forums) where the notation can't be used and substitutes are made. I even recall some books using bold letters, which made note taking a real hassle.


Well, all of that is really just Hungarian notation [0] expressed through typesetting rather than extra characters, so it's not 100% necessary to fully understand - just look at the definition again. The exception is standard sets, like the reals or the natural numbers (which use the double-struck letters). But that's pretty standardized.

[0]: https://en.wikipedia.org/wiki/Hungarian_notation
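To make the "typesetting as Hungarian notation" point concrete, here is a small LaTeX sketch of the usual conventions (the particular symbol assignments are the common ones, shown for illustration - they are not taken from the post):

```latex
% Common probability-notation conventions (illustrative)
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
Let $\Omega$ be the sample space, $\mathcal{F}$ (script uppercase) a
$\sigma$-algebra of events, and $\mathbb{P}$ (double-struck) a
probability measure. A random variable $X \colon \Omega \to \mathbb{R}$
(uppercase) takes values $x \in \mathbb{R}$ (lowercase), and
$\mathbb{E}[X]$ denotes its expectation.
\end{document}
```

Each typeface change signals the "type" of the object, which is exactly the role prefixes play in Hungarian notation.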


I think any field with so many consumers will struggle to standardize notation, but for boldface I can offer this: it's typically one more stroke.

https://en.wikipedia.org/wiki/Blackboard_bold



