
A Gentle Introduction to Bayes’ Theorem for Machine Learning - headalgorithm
https://machinelearningmastery.com/bayes-theorem-for-machine-learning/
======
hnhg
This is a far gentler introduction, and the rest of the blog is pretty good
too: [https://www.countbayesie.com/blog/2016/5/1/a-guide-to-bayesi...](https://www.countbayesie.com/blog/2016/5/1/a-guide-to-bayesian-statistics)

Edit: this is a different take on the subject but an enjoyable and accessible
read too: [http://mbmlbook.com/toc.html](http://mbmlbook.com/toc.html)

~~~
laacz
Thank you for the second link. Apart from the great content, it is really
well presented for the web.

------
raibosome
Shameless plug, but if anybody wants to look at Naive Bayes classifiers for
continuous and categorical data, I wrote a library here:

[https://github.com/remykarem/mixed-naive-bayes](https://github.com/remykarem/mixed-naive-bayes)
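
The idea in a nutshell, as a rough sketch built from scikit-learn pieces on
made-up toy data (this isn't my library's actual API): continuous features
get a Gaussian likelihood, categorical features get a categorical likelihood,
and the per-class scores multiply because naive Bayes assumes the features
are independent given the class.

    import numpy as np
    from sklearn.naive_bayes import GaussianNB, CategoricalNB

    rng = np.random.default_rng(0)
    n = 200
    y = rng.integers(0, 2, size=n)                             # binary labels
    x_cont = rng.normal(loc=y.astype(float)).reshape(-1, 1)    # continuous feature
    x_cat = ((y + rng.integers(0, 2, size=n)) % 3).reshape(-1, 1)  # categories {0,1,2}

    gnb = GaussianNB().fit(x_cont, y)
    cnb = CategoricalNB().fit(x_cat, y)

    # log P(y|x) is, up to a constant, log P(y) plus the per-feature
    # log-likelihoods. Each predict_proba already folds in the prior,
    # so subtract one copy of it before combining.
    log_prior = np.log(np.bincount(y) / n)
    log_post = (np.log(gnb.predict_proba(x_cont))
                + np.log(cnb.predict_proba(x_cat))
                - log_prior)
    pred = log_post.argmax(axis=1)
    print("combined accuracy:", (pred == y).mean())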

------
Nicci00
I appreciate the lack of math notation; for many with a poor mathematics
background it feels like a huge wall to getting into interesting and useful
theories.

~~~
codesushi42
Bayes' Theorem hardly requires any math notation at all. It would literally
take you less than a minute to understand conditional probability.

Yikes.
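
For the record, here's the whole theorem applied in well under a minute; the
1% / 90% / 5% figures below are made up purely for illustration:

    # Bayes' theorem on a toy diagnostic-test example; all numbers invented.
    # P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
    p_disease = 0.01              # prior: 1% of people have the disease
    p_pos_given_disease = 0.90    # sensitivity
    p_pos_given_healthy = 0.05    # false-positive rate

    # P(positive) by the law of total probability
    p_pos = (p_pos_given_disease * p_disease
             + p_pos_given_healthy * (1 - p_disease))

    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print(p_disease_given_pos)    # ~0.154: a positive test is far from certain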

~~~
freehunter
Possibly true, but just looking at the Wikipedia page for Bayes' Theorem,
more than half the text on the page is math notation:
[https://en.wikipedia.org/wiki/Bayes%27_theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem)

It doesn't matter how simple the math actually is, if someone is unfamiliar
with mathematical notation it's going to be overwhelming to read.

~~~
codesushi42
Then learn the notation or find another source.

This is a bit like complaining about the existence of books because you never
learned how to read.

------
WilliamEdward
Always nice to see a little piece of Python code show up whenever one of
these ML blogs gets posted to HN. It's becoming, or has become, the lingua
franca of this field.

~~~
usmannk
I'm very excited for Swift for TensorFlow, personally. The type system will
make development so much more pleasant IMO. Judging by the way things have
been going lately though, I imagine that by the time Swift for TensorFlow is
ready I'll be reaching for PyTorch at every opportunity anyway...

~~~
arman_ashrafian
I think the fact that Swift has first-class automatic differentiation is an
even bigger deal than the fact that it is strongly-typed. Here is an
interesting write-up about it...

[https://gist.github.com/rxwei/30ba75ce092ab3b0dce4bde1fc2c9f...](https://gist.github.com/rxwei/30ba75ce092ab3b0dce4bde1fc2c9f1d)
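
For anyone wondering what automatic differentiation actually buys you, here
is the library-level version of the idea in PyTorch (a minimal sketch; the
point of the Swift work is to push this into the language and compiler rather
than a tensor library):

    # Autodiff in a nutshell: record the ops used to compute a value, then
    # walk the graph backwards to get exact gradients, with no symbolic math
    # or finite differences. PyTorch does this at the library level.
    import torch

    x = torch.tensor(2.0, requires_grad=True)
    y = x ** 3 + 3 * x        # y = x^3 + 3x
    y.backward()              # populates x.grad with dy/dx
    print(x.grad)             # tensor(15.) since dy/dx = 3x^2 + 3 = 15 at x = 2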

------
allcentury
I've used Bayes for classification before, is it still the go to for that?

~~~
sacado2
Naive Bayes is a good baseline since it's both very fast and quite efficient,
and it doesn't need a big training set. But it's not often the best model you
can find, and since it treats the features as independent given the class, it
can't capture feature interactions, e.g. it can't model an XOR.
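
A quick sketch of the XOR point on made-up data: Gaussian naive Bayes sits at
chance, while a model that can look at the features jointly separates the
classes easily.

    # Naive Bayes can't model XOR: each feature on its own carries no
    # information about the class, and NB never looks at the features jointly.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 2))
    y = ((X[:, 0] > 0) ^ (X[:, 1] > 0)).astype(int)   # XOR of the two signs

    nb = GaussianNB().fit(X, y)
    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

    print("naive Bayes accuracy:", nb.score(X, y))      # ~0.5, i.e. chance
    print("decision tree accuracy:", tree.score(X, y))  # close to 1.0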

------
mlevental
Change my mind: Bayes in practice is just a way to regularize your model, and
the language of Bayes makes it seem principled, but really you could use
literally any regularizer and it would work almost as well. I believe this
because ultimately you're always going to minimize the negative log-likelihood
anyway (and so the prior becomes the regularization term).
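
A minimal sketch of what I mean on toy data, assuming a Gaussian prior on the
weights of a linear model: the MAP estimate is literally ridge regression,
with the prior showing up as the L2 penalty.

    # MAP for linear regression with a Gaussian prior on w reduces to ridge:
    # minimizing -log p(y|X,w) - log p(w) gives ||y - Xw||^2 + alpha*||w||^2,
    # so the prior is exactly the regularization term.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + rng.normal(scale=0.1, size=100)

    alpha = 2.0  # = noise_variance / prior_variance in the Bayesian reading

    # MAP / ridge closed form: (X^T X + alpha I)^{-1} X^T y
    w_map = np.linalg.solve(X.T @ X + alpha * np.eye(3), X.T @ y)
    w_ridge = Ridge(alpha=alpha, fit_intercept=False).fit(X, y).coef_

    print(np.allclose(w_map, w_ridge))  # True: same estimator, two vocabularies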

~~~
rwilson4
Counterpoint: regularization is just a way of specifying a Bayesian prior for
maximum a posteriori estimation.

~~~
mlevental
but what value is that perspective? how do i use this to actually fit a model?

------
atum47
I remember the first time I heard of Bayes' Theorem; it took me a while to
grasp it.

------
panpanna
His alternative formulation of P(A|B) is really confusing me.

Is there a typo in there?
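
If the alternative formulation is the usual one that expands P(B) with the
law of total probability (I'm guessing here), then it isn't a typo; checking
both forms on an arbitrary made-up joint table, they agree:

    # Check that P(A|B) = P(B|A)P(A)/P(B) equals the expanded form
    # P(B|A)P(A) / (P(B|A)P(A) + P(B|not A)P(not A)).
    p_a_and_b = 0.12        # made-up joint probabilities, summing to 1
    p_a_and_not_b = 0.18
    p_not_a_and_b = 0.28
    p_not_a_and_not_b = 0.42

    p_a = p_a_and_b + p_a_and_not_b
    p_b = p_a_and_b + p_not_a_and_b
    p_b_given_a = p_a_and_b / p_a
    p_b_given_not_a = p_not_a_and_b / (1 - p_a)

    direct = p_b_given_a * p_a / p_b
    expanded = p_b_given_a * p_a / (p_b_given_a * p_a
                                    + p_b_given_not_a * (1 - p_a))
    print(direct, expanded)  # both 0.3, i.e. P(A|B) = p_a_and_b / p_b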

