> The article for this post is a pretty good source: 100% rate for the dog indicating the presence of drugs is pretty much a rubber stamp.
If you read the article you'll find that the dog didn't alert the first time around the car, didn't alert the second time around the car, and only alerted after the cop maliciously signaled to the dog that they wanted an alert. To me that doesn't sound like a bad dog; it sounds like a bad cop.
>Similar patterns abound nationwide, suggesting that Karma's career was not unusual. Lex, a drug detection dog in Illinois, alerted for narcotics 93 percent of the time during roadside sniffs, but was wrong in more than 40 percent of cases. Sella, a drug detection dog in Florida, gave false alerts 53 percent of the time. Bono, a drug detection dog in Virginia, incorrectly indicated the presence of drugs 74 percent of the time.
These are concerning false positive rates, but I don't think anything in the article supports the universal claim the parent made:
>They're basically a rubber-stamp. Dogs want to please their handlers. If the handler wants the dog to alert, the dog will alert. The presence or absence of drugs isn't relevant to the dog.
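To make those rates concrete, here's a rough sketch of what a profile like Lex's implies. The 93% alert rate and the 40%+ error rate are the figures quoted above; the 1,000-stop count is just an assumed round number for illustration:

```python
# Illustrative arithmetic only; the stop count is an assumption, not from the article.
stops = 1000                 # assumed number of roadside sniffs
alert_rate = 0.93            # Lex alerted on 93% of sniffs (from the quoted article)
wrong_given_alert = 0.40     # at least 40% of those alerts found no drugs (from the quoted article)

alerts = stops * alert_rate
false_alerts = alerts * wrong_given_alert

print(f"{alerts:.0f} alerts out of {stops} stops")
print(f"at least {false_alerts:.0f} of those alerts led to a search that found nothing")
```

That works out to roughly 930 alerts per 1,000 stops, of which 370 or more justified a search that turned up nothing.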
Sure, but that's one particular dog. I'm asking about the statistics for all of the drug dogs in the US.
I agree that if a particular dog is wrong more than 10-20% of the time, the dog and/or the handler should be taken off the job. And I agree that police probably shouldn't be allowed to use drug dogs unless they already have probable cause through some separate means.
I think it was seen as me slyly trying to defend the use of drug-sniffing dogs. I'm really not; I was just curious about the data, and wondered how much variation there might be.
It's interesting to see the difference in testing between the US and Europe (or the Netherlands, at least): here, getting 60% correct is a passing grade (usually 50% is the minimum).