

Statistical Modeling: The Two Cultures - ulvund
http://www.cis.upenn.edu/group/datamining/ReadingGroup/papers/breiman2001.pdf

======
jibiki
I thought this would be about Bayesian vs. Frequentist, but it really isn't.

The sort of statistical problems he is talking about are basically predictive:
what is the distribution of variable y if we know the values of variables x_i?
How high can we expect ozone levels to be, given the previous weeks'
meteorological data?

His claim is that statistics is dominated by people who begin with a data
model already decided upon. These models tend to be simplistic, but can be
tuned to fit any data set by including enough parameters. His suggested
replacement seems to involve decision trees and neural nets, what he calls
"algorithmic modeling" (on the other hand, he doesn't seem to like MCMC, which
is usually based on a data model, albeit a more sophisticated one.) This is
not exactly a frequentist position, since it assumes that data can be modeled
by an NN.
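To make the contrast concrete, here is a toy sketch of the two cultures on an invented ozone-style prediction problem. The data, variable names, and the specific models (a one-parameter linear fit vs. a depth-1 regression tree) are my own illustrative assumptions, not anything from the paper; the point is only that the "data model" assumes a functional form while the "algorithmic model" does not.

```python
def fit_linear(xs, ys):
    """Data-modeling culture: assume y = a*x + noise, estimate a by
    least squares, then reason about the fitted parameter a."""
    a = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return lambda x: a * x

def fit_stump(xs, ys):
    """Algorithmic-modeling culture: a depth-1 regression tree (stump).
    Pick the split on x minimizing squared error and predict the mean
    of y on each side -- no assumed functional form for y(x)."""
    best = None
    for split in xs:
        left = [y for x, y in zip(xs, ys) if x <= split]
        right = [y for x, y in zip(xs, ys) if x > split]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, split, ml, mr)
    _, split, ml, mr = best
    return lambda x: ml if x <= split else mr

# e.g. x = yesterday's temperature, y = today's ozone level (made-up numbers)
xs = [10, 12, 14, 20, 22, 24]
ys = [1.1, 1.2, 1.1, 3.0, 3.1, 2.9]

linear = fit_linear(xs, ys)
stump = fit_stump(xs, ys)
```

The stump discovers the jump in the data on its own, while the linear model smears it out; Breiman's argument is roughly that tuning the latter's parameters to fit better tells you about the model, not the mechanism.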

Anyway, the article is from 2001, and I don't think statistics people have
abandoned data models, since such models are quite in vogue even among AI people.

