
Engineering Uncertainty Estimation in Neural Networks - dsr12
https://eng.uber.com/neural-networks-uncertainty-estimation/
======
tony_cannistra
It's cool how "bayesian neural networks define a distribution over neural
networks" -- so you can "sample from the posterior." Reminds me of the
Gaussian Process learning framework, which seems quite similar (distributions
over functions). Has anyone thought of this overlap?

edit: posted before reading other comments; seems like @syntaxing's link is
what I'm looking for: "A network with infinitely many weights with a
distribution on each weight is a Gaussian process"

~~~
clircle
That a Bayesian NN "defines a distribution over neural networks" means that a
Bayesian NN is a NN with priors on the weights. So it's not that there is some
special similarity to GPs. BNNs are like any other Bayesian model, but now a
NN is the likelihood.

The situation is usually flipped with Bayesian GP models -- a GP is usually
used as a prior on the linear predictor.
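The "prior on the weights" view can be illustrated with a toy sketch in plain numpy (the network shape, the standard-normal prior, and the input point are all made up for illustration): sampling a weight vector from the prior is sampling a network, and pushing an input through many sampled networks gives a distribution over outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Prior over the weights of a tiny 1-hidden-layer network:
# each weight ~ N(0, 1). Sampling weights = sampling a network.
def sample_network():
    W1 = rng.normal(0.0, 1.0, size=(1, 8))
    b1 = rng.normal(0.0, 1.0, size=8)
    W2 = rng.normal(0.0, 1.0, size=(8, 1))
    return W1, b1, W2

def forward(x, params):
    W1, b1, W2 = params
    return np.tanh(x @ W1 + b1) @ W2

x = np.array([[0.5]])

# Prior predictive at x: a distribution over function values,
# induced by the distribution over networks.
samples = np.array([forward(x, sample_network()).item() for _ in range(2000)])
mean, std = samples.mean(), samples.std()  # spread = prior uncertainty at x
```

Conditioning those weights on data (the hard part, via MCMC or variational inference) would turn this prior predictive into the posterior predictive.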

------
no_identd
If anyone wants to learn about more advanced, bleeding-edge uncertainty
estimation methods that aren't tied to neural nets, I can highly recommend
Andrew Pownuk & Vladik Kreinovich's 2018 book "Combining Interval,
Probabilistic, and Other Types of Uncertainty in Engineering Applications":

[https://link.springer.com/book/10.1007%2F978-3-319-91026-0](https://link.springer.com/book/10.1007%2F978-3-319-91026-0)

It's basically a textbook summary of Pownuk's PhD thesis:

[http://www.cs.utep.edu/vladik/pownukPhD.pdf](http://www.cs.utep.edu/vladik/pownukPhD.pdf)

------
srean
David MacKay, 1992, Bayesian Methods for Adaptive Models
[http://www.inference.org.uk/mackay/thesis.pdf](http://www.inference.org.uk/mackay/thesis.pdf)

Radford Neal, 1995, Bayesian Learning for Neural Networks
[https://www.cs.toronto.edu/~radford/ftp/thesis.pdf](https://www.cs.toronto.edu/~radford/ftp/thesis.pdf)

------
syntaxing
Super interesting. I've been wanting to play around with BNNs after reading
Yarin Gal's post on them [1], but my knowledge of how to make them work is
limited. Does anyone here have a library or tutorial they recommend that uses
PyTorch?

[1]
[http://mlg.eng.cam.ac.uk/yarin/blog_3d801aa532c1ce.html](http://mlg.eng.cam.ac.uk/yarin/blog_3d801aa532c1ce.html)
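The core trick in Gal's post (dropout kept active at test time, with many stochastic forward passes averaged into a predictive mean and spread) can be sketched in plain numpy; the weights here are random stand-ins for a trained network, and the shapes and dropout rate are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for pretrained weights (random here, purely illustrative).
W1 = rng.normal(size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1))

def forward(x, p_drop=0.5):
    h = np.tanh(x @ W1 + b1)
    mask = rng.random(h.shape) > p_drop  # dropout stays ON at test time
    h = h * mask / (1.0 - p_drop)        # inverted-dropout scaling
    return h @ W2

x = np.array([[0.3]])

# Each stochastic forward pass is one sample from the approximate posterior.
preds = np.array([forward(x).item() for _ in range(1000)])
mean, std = preds.mean(), preds.std()    # predictive mean and uncertainty
```

In a real PyTorch model the same effect comes from leaving the dropout layers in training mode during inference and looping over forward passes.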

~~~
maiybe
I'd check out Pyro, a probabilistic programming language built on PyTorch.

You can find Bayesian Neural Network examples starting here:
[http://pyro.ai/examples/bayesian_regression.html](http://pyro.ai/examples/bayesian_regression.html)

I think the documentation and tutorials are thorough and well laid out to ease
you into Bayesian NNs and, more generally, handling uncertainty with neural
networks + distributions. There are some Pyro-specific constructs in there,
but it's the easiest way to get into BNNs without lots of prior knowledge.

------
mlthoughts2018
This PyMC tutorial is a nice place to start on this topic:

[https://docs.pymc.io/notebooks/bayesian_neural_network_with_sgfs.html](https://docs.pymc.io/notebooks/bayesian_neural_network_with_sgfs.html)

