
Ask HN: Simple Explanation of Bayesian Logic - DanielBMarkham
Anybody know of a good tutorial/primer for programmers about how to use Bayesian constructs? I looked on Wiki and my eyes immediately started to glaze over.
======
michael_dorfman
I think the best introduction remains <http://yudkowsky.net/rational/bayes>,
by our own Eliezer.

~~~
tokenadult
Fully agreed. I've been very impressed by the clarity of that article and
recommend it often in online discussions of research methodology.

------
physcab
I always like to think of an application when learning new concepts. Bayesian
theory is often used in machine learning, for example, where the "learning" is
simply repeated application of Bayes' rule. In ML you often have this:

posterior probability = likelihood * prior probability / marginal probability
of the evidence.

where,

the prior is an initial guess (ie you have two boxes and the probability of
choosing either one is 1/2)

the likelihood is the probability of an observation given a hypothesis (ie you
look inside box 1 and there are 4 red balls and 2 green balls, so the
probability of drawing a red ball from box 1 is 4/6 or 2/3)

the posterior is the product of those two, normalized by the marginal
probability of the observation so the posteriors across all hypotheses sum to
1 (ie the unnormalized weight for box 1 is 2/3 * 1/2 = 1/3, which you then
divide by the total probability of drawing a red ball).

Now, in ML, because of a property called conjugacy (the posterior has the same
functional form as the prior), this updating stays tractable over many
observations: the posterior from one observation simply becomes the prior for
the next iteration.
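The two-box update above can be sketched in a few lines of Python. Box 1's contents (4 red, 2 green) come from the example; box 2's contents (say, 1 red and 5 green) are an assumption made up here just so the example runs end to end:

```python
# Bayes' rule for the two-box example: a ball is drawn and it turns out red.
# Which box did it most likely come from?

priors = {"box1": 0.5, "box2": 0.5}              # initial guess: boxes equally likely
likelihood_red = {"box1": 4 / 6, "box2": 1 / 6}  # P(red ball | box); box2 is assumed

# Unnormalized posterior: likelihood * prior for each hypothesis.
# For box1 this is 2/3 * 1/2 = 1/3, the number in the comment above.
unnorm = {box: likelihood_red[box] * priors[box] for box in priors}

# Normalize by the evidence P(red) so the posteriors sum to 1.
evidence = sum(unnorm.values())
posterior = {box: p / evidence for box, p in unnorm.items()}

print(posterior)  # box1: 0.8, box2: 0.2
```

To keep learning from more draws, feed `posterior` back in as the next round's `priors` and repeat with the new observation's likelihoods.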

If you're looking for good tutorials, I would suggest:

Paul Graham's Spam Bayes filter:

<http://www.paulgraham.com/better.html>

Andrew Moore's Statistical learning tutorials:

<http://www.autonlab.org/tutorials/>

Christopher Bishop's Pattern Recognition and Machine Learning, Chapter 1:
Probability Theory

<http://research.microsoft.com/en-us/um/people/cmbishop/prml/slides/prml-slides-1.pdf>

------
raffi
Programming Collective Intelligence by Toby Segaran is a gentle introduction
to many ML topics, including Bayesian classifiers.

