
If you take the sum of a large but finite number n of independent random variables from a probability distribution, you can show that the resulting distribution is approximately a Gaussian perturbed by Hermite polynomials. The perturbation proportional to the order-k Hermite polynomial (the one contributed by the k-th cumulant) decays as n^-(k/2 - 1), so the skewness term shrinks like n^-1/2, the kurtosis term like n^-1, and so on. This is the so-called Edgeworth expansion. I found this chapter to have a good explanation of the Edgeworth expansion if you're interested in more detail: http://web.math.ku.dk/~erhansen/bootstrap_05/doku/noter/Edge...
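A quick numerical sketch of the idea (my own illustration, not from the linked chapter): for a sum of n Exp(1) variables the exact standardized density is a shifted Gamma(n, 1), so we can check that adding the leading n^-1/2 Edgeworth term, which multiplies the Gaussian by the third Hermite polynomial He3(x) = x^3 - 3x weighted by the skewness, beats the plain Gaussian approximation.

```python
import math

def normal_pdf(x):
    # Standard Gaussian density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def edgeworth_pdf(x, n, skew=2.0):
    # Gaussian plus the leading O(n^-1/2) Edgeworth correction;
    # skew=2 is the skewness of the Exp(1) distribution.
    he3 = x ** 3 - 3 * x  # 3rd (probabilists') Hermite polynomial
    return normal_pdf(x) * (1 + skew / (6 * math.sqrt(n)) * he3)

def exact_pdf(x, n):
    # Exact density of (sum - n) / sqrt(n) for a sum of n Exp(1)
    # variables: a standardized Gamma(n, 1).
    s = n + math.sqrt(n) * x  # back-transform to the Gamma scale
    return math.sqrt(n) * math.exp((n - 1) * math.log(s) - s - math.lgamma(n))

n, x = 100, 1.0
err_edgeworth = abs(edgeworth_pdf(x, n) - exact_pdf(x, n))
err_gaussian = abs(normal_pdf(x) - exact_pdf(x, n))
print(err_edgeworth, err_gaussian)
```

At n = 100 the one-term Edgeworth correction already cuts the pointwise error by roughly an order of magnitude relative to the bare Gaussian.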

I had a little workshop paper earlier this year showing that you can apply the Edgeworth expansion to wide but finite neural networks: https://arxiv.org/abs/1908.10030
