Hacker News

Cached text-only version (apparently his server can't handle the traffic): http://webcache.googleusercontent.com/search?q=cache:timharf...

Key point: "It seems that doctors may need a good deal of help interpreting the evidence they are likely to be exposed to on clinical effectiveness, while epidemiologists and statisticians need to think hard about how they present their discoveries." This is based on the observation that an improved "five-year survival rate" doesn't necessarily mean the screening is helpful; it may just mean that you're learning about an untreatable disease 6 years before it kills you instead of 4 years before it kills you. Doctors don't seem to understand this.
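A quick simulation makes the lead-time bias concrete. All the numbers below (a disease that kills ~10 years after onset, diagnosis at year 6 vs. year 4) are made up for illustration, not taken from the article:

```python
import random

random.seed(0)

# Toy model: an untreatable disease kills ~10 years after onset,
# regardless of what anyone does.
n = 100_000
death = [random.gauss(10.0, 1.0) for _ in range(n)]  # years from onset to death

# Without screening the disease is found at year 6; screening finds it
# at year 4. The diagnosis moves earlier; the death does not move at all.
surv_usual = [d - 6.0 for d in death]
surv_screened = [d - 4.0 for d in death]

def five_year_rate(survival_times):
    """Fraction of patients still alive five years after diagnosis."""
    return sum(t >= 5.0 for t in survival_times) / len(survival_times)

print(round(five_year_rate(surv_usual), 2))     # ≈ 0.16
print(round(five_year_rate(surv_screened), 2))  # ≈ 0.84
```

The screened group's "five-year survival" quintuples, yet every patient dies at exactly the same time in both arms.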

(Also at: http://www.ft.com/cms/s/2/118169b6-1d74-11e2-869b-00144feabd...)

I have a thesis that the kind of thinking required to survive med school is diametrically opposed to the kind of thinking required to do statistics well. It's the "rote pattern matching" versus "mathetic language fluency" issue at the heart of things like Papert's constructivist learning theory[1], and it leaves me unsurprised by an article like this. Doctors are (usually) viciously smart people who have to make a wide array of difficult decisions daily, but operating at that level requires intuition built on a lot of cached knowledge, which I feel is basically the opposite of statistical thought.

I don't think this is unique, either. It's the heart of Fisher's program to provide statistical tests as tools for decision-makers[2]. That program was an undoubted success at giving a wide audience a general defense against coincidence, but it casts the deductive process underneath in a pale light.

I think a principal component of the computer revolution is to provide more people with better insight into mathetic thought. Papert focuses on combinatorial examples in children in Mindstorms[3], but I think the next level is understanding information theory, distributions, and correlation on an intuitive level. MCMC sampling went an incredible way toward helping me understand these ideas, and probabilistic programming languages are a great step toward making them available to the general public, but we also need great visualization (something far removed from today's often lazy "data viz").
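For a taste of what MCMC offers as an intuition-builder, here is a minimal random-walk Metropolis sampler aimed at a standard normal target. This is a toy sketch of the general technique, not any particular library's API, and the target was chosen because its known mean and variance let you sanity-check the chain:

```python
import math
import random

random.seed(1)

def metropolis(logpdf, x0, steps, scale=1.0):
    """Minimal random-walk Metropolis sampler (a toy, not production MCMC)."""
    x = x0
    samples = []
    for _ in range(steps):
        proposal = x + random.gauss(0.0, scale)
        # Accept with probability min(1, p(proposal) / p(x)),
        # done in log space for numerical stability.
        if math.log(random.random()) < logpdf(proposal) - logpdf(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, up to a constant.
log_target = lambda x: -0.5 * x * x
chain = metropolis(log_target, x0=0.0, steps=50_000)
burned = chain[5_000:]  # discard burn-in

mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(round(mean, 1), round(var, 1))  # should land near 0.0 and 1.0
```

Watching the chain wander and still recover the right distribution is, in my experience, exactly the kind of hands-on exposure that makes distributions feel real.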

Ideally, means and variances would become concepts stronger than mere parameters of the normal distribution---which I feel is about as far as a good student in a typical college statistics class in a science or engineering major gets---and instead be tightly connected to using distributions accurately when reasoning about complex systems of many interacting parts, with concentration inequalities to guide intuition.
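Concentration inequalities like Chebyshev's are one concrete form of that intuition: they bound tail probability from the variance alone, for any distribution. A small check (the exponential distribution here is my choice of example):

```python
import random

random.seed(3)

# Chebyshev's inequality: P(|X - mu| >= k*sigma) <= 1/k^2 for ANY
# distribution with finite variance -- a distribution-free intuition pump.
# Example: exponential with rate 1, so mean = standard deviation = 1.
n, k = 200_000, 3.0
mu = sigma = 1.0
xs = [random.expovariate(1.0) for _ in range(n)]

tail = sum(abs(x - mu) >= k * sigma for x in xs) / n
bound = 1.0 / k ** 2
print(tail <= bound)  # True: the empirical tail (~0.018) sits under 1/9
```

The bound is loose here, and that looseness is itself instructive: knowing only the variance, 1/9 is the best you can guarantee three sigmas out.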

I think the biggest driver of the recent popularization of Bayesian statistics is that thinking in distributions is quite natural to the human brain, but also rather unrefined. People can roughly understand uncertainty about an outcome, but have a harder time with conjunctions or risk. How can we build tools that teach people greater refinement of these intuitions?

[1] http://en.wikipedia.org/wiki/Constructivism_(learning_theory... [2] http://en.wikipedia.org/wiki/Statistical_Methods_for_Researc... [3] http://www.amazon.com/dp/0465046746
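The conjunction point is easy to make concrete: however plausible a detailed story sounds, P(A and B) can never exceed P(A). The 0.3 and 0.5 below are arbitrary numbers chosen for the demo:

```python
import random

random.seed(2)

# Monte Carlo illustration of the conjunction rule.
n = 100_000
p_a, p_b_given_a = 0.3, 0.5
a = [random.random() < p_a for _ in range(n)]
b = [ai and random.random() < p_b_given_a for ai in a]

p_a_hat = sum(a) / n                                   # ≈ 0.30
p_ab_hat = sum(ai and bi for ai, bi in zip(a, b)) / n  # ≈ 0.15
print(p_ab_hat <= p_a_hat)  # True, always: B-and-A events are a subset of A
```

People routinely rate the conjunction as more probable when B adds vivid detail, which is exactly the kind of unrefined intuition good tools could train away.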

Med/Law = pattern recognition machines to detect statistical regularities. You show them a plane. Then give them another object and ask them whether or not it is a plane.

Math/Eng/Science = use of pattern recognition over a multitude of composable machines to create something new. You show them a combustion engine, steel frames, gears, and vulcanised rubber wheels, and they connect those to the invention of bikes/trains to make a car.

Or "pattern recognition" versus "model building".

Medicine took a huge leap forward around 1800 or so when people started collecting statistics on what worked and what didn't. Evidently the doctors' intuition was very, very wrong.

Impossible. The whole lesson of statistics is that computing probabilities is an intricate process. It will never be intuitive. I can learn to throw a ball at a target on intuition, but I will never learn to launch a rocket at Mars on intuition.

At best, it can become intuitive to ask the right skeptical questions when being shown a claim.

That's an interesting viewpoint that I'd love to discuss more. I disagree, obviously, but I'd like to know: why do you feel so strongly that statistical thought can never be intuitive?

I feel like it's closely related to combinatorial thought. To again steal an example from Papert, he often talks about asking children to count the number of possible pairs of colors among marbles given to them. With some formal training it's easy to pare the problem down to the right information and to visualize the process. Given a variety of colored marbles, I imagine you could easily estimate the number of possible color pairs. Children cannot, and must at some point learn to think that way.
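The marble exercise fits in a few lines once you have the combinatorial frame. Five colors is my choice here; any set works:

```python
from itertools import combinations

# Papert's marble exercise: how many distinct pairs of colors?
colors = ["red", "blue", "green", "yellow", "white"]
pairs = list(combinations(colors, 2))

print(len(pairs))  # 10, i.e. C(5, 2) = 5 * 4 / 2
```

The formal answer C(n, 2) is trivial once seen, but the point is the shift in perspective: from enumerating marbles to enumerating the space of pairs.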

In the same way, conceptualizing uncertain events within the larger space of things that could happen, and becoming familiar with the extents and limitations of the causal models we all use, is a way of thinking that takes (today) a great deal of effort to acquire, but feels intuitive once you have it. I believe there's nothing inherently impossible about teaching it if the appropriate tools are available.

"Impossible" seems a broad claim - I don't see why it shouldn't be possible to put the information in a form, possibly decorated with details from a rigorous analysis, that makes pattern matching work. If the pattern matching is otherwise proving effective (itself an empirical claim, to be sure), we should be careful about teaching doctors not to pattern match.
