
> If you test positive, there is a 10% chance it's a false positive.

Well, don't misunderstand -- that figure isn't a property of the test alone; it also depends on the probability that you had the disease in the first place.

The test itself has two probabilities:

1. If you've had COVID-19, the probability that it will report positive (sensitivity)

2. If you haven't had COVID-19, the probability that it will report negative (specificity)

But those probabilities give you a mapping from reality -> test_result. What you want is the reverse: the probability of reality given a test_result. To invert it (this is just Bayes' theorem), you have to factor in the probability that you had the disease in the first place.

Suppose the test is, say, 99.9% sensitive and 99.9% specific. If 50% of the population have had COVID-19, then a positive test means a 99.9% probability that you've had the virus. If only 1% have had it, a positive test means you're about 91% likely to have had it. And if only 1 in a million people had had COVID-19, the false positives would completely overwhelm the true positives.
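Here's a minimal sketch of that calculation with Bayes' rule. The 99.9% sensitivity and specificity are assumed figures, chosen only to reproduce the numbers above:

    # P(had disease | positive test) via Bayes' rule.
    # Sensitivity/specificity of 99.9% are illustrative assumptions.
    def p_disease_given_positive(prevalence, sensitivity=0.999, specificity=0.999):
        true_positives = sensitivity * prevalence
        false_positives = (1 - specificity) * (1 - prevalence)
        return true_positives / (true_positives + false_positives)

    for prevalence in (0.5, 0.01, 1e-6):
        print(f"prevalence {prevalence:>8}: "
              f"P(had COVID-19 | positive) = {p_disease_given_positive(prevalence):.4f}")
    # -> 0.9990, 0.9098, 0.0010

The only thing that changes between the three cases is the prevalence; the test itself stays exactly as accurate.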

This is sometimes called the "base rate fallacy": forgetting to factor in the base rate when reasoning backwards from a test result like this.

It's important for things like, say, systems that automatically detect terrorists at airports. How many travelers at an airport are actually terrorists planning to attack a plane? It's got to be one in hundreds of millions, if not billions. With a base rate that low, even if you had a system that was 99.999% accurate, the vast majority of the people it flagged would be innocent.
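A rough back-of-the-envelope sketch of that, assuming for illustration a screening system with a 0.001% false-positive rate (99.999% specificity), perfect sensitivity, and one actual attacker per 100 million travelers:

    # Illustrative assumptions only: 1 attacker per 100 million travelers,
    # 99.999% specificity (0.001% false positives), perfect sensitivity.
    travelers = 100_000_000
    base_rate = 1 / travelers
    false_positive_rate = 1 - 0.99999

    true_positives = base_rate * travelers              # ~1 actual attacker flagged
    false_positives = false_positive_rate * travelers   # ~1,000 innocent travelers flagged

    print(f"flagged travelers: {true_positives + false_positives:.0f}")
    print(f"fraction actually guilty: {true_positives / (true_positives + false_positives):.2%}")

So even with those generous assumptions, roughly a thousand innocent travelers get flagged for every real attacker.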
