

How to teach a Bayesian spam filter to play chess - michael_nielsen
http://dbacl.sourceforge.net/spam_chess-1.html

======
npk
On another random note, was anyone else disappointed when they found out that
a neural net is a fancy way of drawing a line through some points, i.e.
regression?

~~~
pixcavator
I was mostly relieved. I wasn't insane after all. Some problems are simple,
some hard, and some can't be solved. These approaches are very indirect and
people may be tempted to use them without first asking the question: can this
problem be solved (in principle!) with the data that we have?

------
pixcavator
How about teaching it how to do ADDITION?

This would be a better experiment because (1) it is simpler and faster, (2)
the feedback is unambiguous, (3) the ability to add is verifiable.

Essentially you supply it with all sums of all pairs of numbers from 0 to 99
and then see if it can compute 100+100.

~~~
zyroth
Bayesian classification works a bit like this: you have a set of inputs and a
set of target classes. Having seen a history of subsets of the inputs together
with their manually tagged classes, the Bayesian classifier learns how to
classify new subsets of the inputs.

Thus, a Bayesian classifier has to know the classes and the inputs beforehand.
It cannot extrapolate beyond the sets it was trained on, since the Bayesian
approach does not see numbers as things that operations are defined on, but
only as opaque symbols.

A simple backprop neural network can learn addition, though.
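To make that concrete, here is a minimal sketch of my own (not anything from the article): a single linear neuron, the one-layer case of backprop, trained by stochastic gradient descent on sums of pairs drawn only from 0 to 99, then asked for 100+100.

```python
import random

random.seed(0)

# Single linear neuron: y = w1*a + w2*b + bias, trained with plain
# stochastic gradient descent on squared error (the one-layer case
# of backprop). Training pairs are drawn from 0..99 only.
w1, w2, bias = random.random(), random.random(), 0.0
lr = 0.0002

for _ in range(50000):
    a, b = random.randint(0, 99), random.randint(0, 99)
    err = (w1 * a + w2 * b + bias) - (a + b)  # prediction minus target
    w1 -= lr * err * a
    w2 -= lr * err * b
    bias -= lr * err

# 100 never appeared in training, yet the learned weights extrapolate:
print(round(w1 * 100 + w2 * 100 + bias))  # 200
```

Because the net represents the inputs as magnitudes rather than opaque symbols, it generalizes past the training range, which is exactly what the symbol-based Bayesian approach cannot do.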

~~~
pixcavator
Are you saying that a neural network can learn addition symbolically? Can you
recommend a book or site to read about this?

~~~
dpapathanasiou
_Can you recommend a book or site to read about this?_

David MacKay wrote a good intro book on Bayesian inference, neural networks,
and related topics: <http://www.inference.phy.cam.ac.uk/mackay/itila/>

~~~
pixcavator
I've looked at the chapter on neural networks. It does not seem to address
operations with symbols, only numbers...

~~~
hhm
You can always encode symbols as numbers, and vice versa. That's what computers
do, that's what you do when you do a sum by yourself, and that's what such a
neural network should do too. (You encode the symbols into binary inputs, then
decode the neural network's outputs back into symbols.)
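A minimal sketch of the codec hhm describes (the function names and the little-endian, fixed-width bit layout are my own choices): numerals become bit vectors on the way into the net, and thresholded outputs become numerals on the way out.

```python
def encode(symbol, width=8):
    """Map a numeral like '5' to a fixed-width little-endian bit vector."""
    n = int(symbol)
    return [(n >> i) & 1 for i in range(width)]

def decode(bits):
    """Map the net's thresholded 0/1 outputs back to a numeral."""
    return str(sum(b << i for i, b in enumerate(bits)))

print(encode("5", 4))        # [1, 0, 1, 0]
print(decode([1, 0, 1, 0]))  # '5'
```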

~~~
pixcavator
So given a function with f('1','1')='2', the computer will figure out that
f('1','2')='3', right?

~~~
hhm
Yes, that's what I'm talking about.

~~~
pixcavator
Wow!

------
leecs1
If chess playing is a matter of classification (ignoring the complexity of
input vector encoding and computational feasibility), then a Bayesian filter
can surely achieve this goal.

------
zandorg
On a random note, I was first introduced to Paul Graham's writings by his
article on the Bayesian spam filter he wrote.

