edit: posted before reading other comments; seems like @syntaxing's link is what I'm looking for: "A network with infinitely many weights with a distribution on each weight is a Gaussian process"
With Bayesian GP models the situation is usually flipped: the GP serves as a prior on the linear predictor.
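The quoted claim (infinitely wide network with a prior on each weight → Gaussian process) can be checked empirically. A minimal numpy sketch, under my own assumptions about priors (standard normal weights, tanh units, 1/√width output scaling): as width grows, the prior over the network output at a fixed input stays Gaussian-shaped with a width-independent variance.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_network_outputs(x, width, n_samples, rng):
    """Draw f(x) from the prior of a one-hidden-layer net:
    f(x) = (1/sqrt(width)) * sum_j v_j * tanh(w_j * x + b_j),
    with w, b, v ~ N(0, 1). The 1/sqrt(width) scaling keeps the
    output variance finite as width -> infinity (the NN->GP limit)."""
    w = rng.standard_normal((n_samples, width))
    b = rng.standard_normal((n_samples, width))
    v = rng.standard_normal((n_samples, width))
    h = np.tanh(w * x + b)                 # hidden activations, (n_samples, width)
    return (v * h).sum(axis=1) / np.sqrt(width)

# Output variance at x = 0.5 is essentially the same at width 10 and 1000;
# only the shape of the distribution becomes more exactly Gaussian.
f_small = sample_network_outputs(0.5, width=10, n_samples=20000, rng=rng)
f_large = sample_network_outputs(0.5, width=1000, n_samples=20000, rng=rng)
print(f_small.var(), f_large.var())
```

Evaluating this at several inputs and looking at the joint covariance gives the corresponding GP kernel, which is what the linked result formalizes.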
Which basically restates the main result of Neal's PhD thesis:

Radford Neal, 1995, Bayesian Learning for Neural Networks: https://www.cs.toronto.edu/~radford/ftp/thesis.pdf
You can find Bayesian Neural Network examples starting here: http://pyro.ai/examples/bayesian_regression.html
I think the documentation and tutorials are thorough and well laid out to ease you into Bayesian NNs and, more generally, into handling uncertainty with neural networks plus distributions. There are some Pyro-specific constructs in there, but it's the easiest way to get into BNNs without a lot of prior knowledge.
This is a good example. There’s not much info I’ve been able to find - I’d be interested if anyone else has a solid tutorial.