
Let's suppose we are doing a computer simulation.

1. We simulate a million trials where the contestant chooses one of three doors, one hiding a car and the other two hiding goats.

2. We simulate the host randomly choosing one of the remaining doors to open.

3. We discard every trial in which the host's door revealed the car. We count the remaining trials (the ones where the host revealed a goat) and store that number in a variable 'total'.

4. In the remaining trials we switch to the other unopened door and reveal what was behind it. We store the number of times we saw a car in a variable called 'wins'.

5. The probability of winning after switching is 'wins' / 'total'

It doesn't matter how many times the host shows a car because those trials are discarded.
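
Something like this Python sketch captures those steps (the variable names 'total' and 'wins' come from the steps above; the door indexing and the million-trial count are just one way to set it up):

    import random

    TRIALS = 1_000_000
    total = 0  # trials where the random host happened to reveal a goat
    wins = 0   # of those, trials where switching found the car

    for _ in range(TRIALS):
        car = random.randrange(3)       # door hiding the car
        choice = random.randrange(3)    # contestant's initial pick
        # host opens one of the two remaining doors at random
        host = random.choice([d for d in range(3) if d != choice])
        if host == car:
            continue                    # host showed the car: discard this trial
        total += 1
        # switch to the one door that is neither picked nor opened
        switched = next(d for d in range(3) if d not in (choice, host))
        if switched == car:
            wins += 1

    print(wins / total)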




Did you actually write that simulation? Because once upon a time (many years ago) I followed the same line of reasoning, wrote a simulation to show that I was right, and that simulation showed me I was wrong.

If the host picks randomly and you discard rounds with a car, your odds switching are 50/50. Discarding the car reveals throws away half of the trials where your first pick was a goat, and that is exactly what drags the switching odds back down to even. If the host uses knowledge of where the car is to always reveal a goat, your odds switching are better (2/3).
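
For comparison, here is roughly the same sketch but with a host who knows where the car is and always opens a goat door, so no trials get discarded (again, the variable names and setup are just one way to write it):

    import random

    TRIALS = 1_000_000
    wins = 0

    for _ in range(TRIALS):
        car = random.randrange(3)
        choice = random.randrange(3)
        # host deliberately opens a remaining door that hides a goat
        host = random.choice([d for d in range(3) if d != choice and d != car])
        switched = next(d for d in range(3) if d not in (choice, host))
        if switched == car:
            wins += 1

    print(wins / TRIALS)   # comes out close to 2/3, not 1/2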


@dllthomas

I've written the simulation, and you're right. It is 50% when the host chooses randomly!



