
Face recognition police tools 'staggeringly inaccurate' - pmoriarty
http://www.bbc.co.uk/news/technology-44089161
======
godelski
I know a lot of people here are talking about the tech and how of course it
makes too many false positives. But are we not going to talk about IF this
type of thing is even okay? I for one don't think so. Too Orwellian.

~~~
bsenftner
Ya know, that conversation already happened, and the "powers that be" chose
public safety over personal privacy. I work in facial recognition. You're
gonna see it blanket this civilization, completely. We are selling systems to
every school district, shopping center, sports stadium, and municipal district
nationwide. In the 2nd world it is even more active. The time to debate FR is
past.

~~~
WhompingWindows
When was that time, precisely? I don't recall there being a debate about this.
Will we just blow through the "time for debates" on every new technology?

~~~
bsenftner
It's been building with every school shooting, public gun violence incident,
and high-tension political gathering. The conversation takes place at the
local level, often at the specific school itself, the specific campus, stadium
or shopping district. You see, many of these places are private, so they can
do whatever they want to a large degree; and many of the public spaces are
patrolled by private security companies, who use the most cost-effective method to cover
large areas. The police are the end of the line as far as deploying facial
recognition.

------
stochastic_monk
Police add yet another massively inaccurate tool to their arsenal, one with
which they can easily put innocent people behind bars.

For example, bite mark “evidence” has long been shown to have no basis in
fact, but that doesn’t stop prosecutors from using it.

~~~
ChristianBundy
Maybe I'm misremembering, but wasn't bite mark evidence used to catch Ted
Bundy?

~~~
dbasedweeb
Not really, and frankly, as with many prolific serial killers, the active
ingredient was police incompetence rather than the rare genius of the killer.

[https://en.m.wikipedia.org/wiki/Ted_Bundy#Arrest_and_first_trial](https://en.m.wikipedia.org/wiki/Ted_Bundy#Arrest_and_first_trial)

------
softbuilder
I once helped build a fingerprint processing system. One of the things I
learned is that police technicians essentially photoshop fingerprints before
searching to ensure the best possible match. It took a lot of the shine off of
the process and made it clear that as high-tech as things seemed, there were a
lot of organic factors at play that were hard or impossible for a computer to
sort out by itself. I expect facial recognition has similar constraints.

Edit: Conjugation

------
IloveHN84
Computer vision is hard.

Face recognition, autonomous driving, whatever works with dynamic environments
will fail terribly for at least another 5 years.

Recognition and classification work fine in static environments (pipe
checking, train rail track checking, etc), but people and cars are hard to
track: too many variables (lighting, day/night transitions, skin tone).

~~~
bsenftner
Yes, it is very hard. That is why the leaders in the industry are not recent
startups, rarely have anyone under 40 on their staff, and have been working in
the field for 15 to 20 years. This is not casual technology; it is intended as
a public safety technology, and the industry takes that seriously.

------
trophycase
Let's not let the ineffectiveness of this technology distract us from the fact
that it is unethical. If it worked properly it would still be horrible.

~~~
iblo66
Why is it unethical? Because it allows us to catch criminals?

~~~
bigiain
Perhaps because it allows "them" to catch whoever they feel like, and claim
"oh, the computer said they were a criminal".

I'm astounded that anybody who lives in a country where police can and do get
away with shooting and killing people over faulty taillights could possibly
think this is a good idea...

~~~
iblo66
> Perhaps because it allows "them" to catch whoever they feel like, and claim
> "oh, the computer said they were a criminal".

No, because they have to prove it's the actual criminal; otherwise they have
to release the guy.

~~~
Clubber
Resisting arrest is a typical workaround for that. The antidote is, "well why
were you arresting him?" This dilutes the antidote with, "the computer said he
was a criminal."

Also, we make way too many things illegal in the US and the punishments are
often harsh for a country founded on "freedom." I wish more people would shed
this archaic delusion.

~~~
iblo66
Why would you resist arrest even if you're innocent?

~~~
bigiain
Ask anyone from any sort of minority community how often their members get
charged with resisting arrest vs how often they _actually_ resist arrest. Even
allowing for self-reporting skewing the comparison, I suspect most
non-minority people won't even believe the difference.

Over here (Australia) they call it "the trifecta of charges" - offensive
language, resist arrest, assault a police officer - and it's well known to be
used as "arrest as a method of oppression".

These regularly get thrown out by victims capable of fighting them in court
(eg: [https://www.smh.com.au/national/nsw/arrest-of-student-for-offensive-language-found-unlawful-20171120-gzp21t.html](https://www.smh.com.au/national/nsw/arrest-of-student-for-offensive-language-found-unlawful-20171120-gzp21t.html)) - but they're overwhelmingly
used against the groups least likely to be able to do that: "Indigenous
Australians account for 15 times as many offensive language offences as would
be expected for their population."
[http://www5.austlii.edu.au/au/journals/AltLawJl/2004/53.html](http://www5.austlii.edu.au/au/journals/AltLawJl/2004/53.html)
That's not because white folk are any less likely to tell a cop to fuck off;
it's because there's systemic racism built into the police force here - they
_know_ they'll get away with "the trifecta" against "people who look a certain
way".

------
Taniwha
We've had a lot of press this week about the police here in New Zealand
rolling out similar stuff - really we need to have a wide-ranging public
discussion on the issues around this before it's done; sadly, the police have
largely been doing it under the radar.

My take is that - yes we should have this stuff, but with the following
caveats:

1) it only gets loaded with the faces of people for whom there is an active
warrant out there - ie you need a judge to sign something

2) all data collected that is not relevant should be discarded asap (including
false positives the moment you know they are false)

3) you don't deploy something like this until the false positive rate is
appropriately low, for everyone - it has to deal equally with people of
different races, genders, haircuts, with and without beards, makeup, hats,
helmets, zinc oxide sunscreen, green St Paddy's day faces, actual smurfs, etc
etc - all the diversity of normal street life

~~~
flashman
Even if you get the false positive rate low, the low prevalence of criminals
versus innocents will lead to a large number of false positives.

Let's say one person in 1000 has an outstanding warrant. Let's say that when
the camera sees this person, it has a 99% chance of recognising them
correctly. Let's also say that it has only a 1% chance of recognising an
innocent person as a suspect.

Under these generous conditions, 10 out of every 11 hits will be false
positives. Put another way, if you run the hypothetical system on a football
match with 60,000 people in the crowd, you'll find 59 or 60 of the ones with
outstanding warrants, and several hundred without.

This is just the same math that was used to show how ineffective PRISM mass
surveillance would be:
[https://bayesianbiologist.com/2013/06/06/how-likely-is-the-nsa-prism-program-to-catch-a-terrorist/](https://bayesianbiologist.com/2013/06/06/how-likely-is-the-nsa-prism-program-to-catch-a-terrorist/)
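
A quick sanity check of that arithmetic (all values are the hypothetical ones
from this comment, not measurements from any real deployment):

```python
# Back-of-envelope Bayes check using the assumed numbers above:
# 1-in-1000 prevalence, 99% true positive rate, 1% false positive rate.
crowd = 60_000
prevalence = 1 / 1000
tpr = 0.99   # chance a wanted person is flagged
fpr = 0.01   # chance an innocent person is flagged

wanted = crowd * prevalence    # 60 people with warrants
innocent = crowd - wanted      # 59,940 without

true_pos = wanted * tpr        # correct hits
false_pos = innocent * fpr     # wrong hits

print(f"true positives:  {true_pos:.0f}")
print(f"false positives: {false_pos:.0f}")
print(f"share of hits that are false: {false_pos / (true_pos + false_pos):.0%}")
```

The last line comes out at roughly 10 in 11, matching the claim above.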

~~~
a1369209993
I think this may be a problem of terminology. When I (and, I think, others)
say "low false positive rate" what we're actually talking about is the (more
useful for approximation) ratio "fraction of negatives classified positive"
over "fraction of population that is positive".

You have described a false positive 'rate' of _one thousand percent_ (10 to
1). A genuinely low false positive 'rate' of 1 to 10 would require an actual
rate around 0.01% chance of recognising an innocent person as a suspect.

It might help if there were a widely known term for 'rate's, rather than using
wrong-term-in-scare-quotes.

Edit: Also, it's increasingly impossible to have a low false positive 'rate'
for one-in-a-million or one-in-a-billion events, and not-technically-lying
about how good your detectors are is probably a significant secondary factor
in why this sort of thing gets so much flak for "staggering inaccuracy".
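
That ~0.01% figure can be derived directly; a sketch, assuming the same
1-in-1000 prevalence and 99% true positive rate used upthread:

```python
# What per-person false positive rate keeps false hits at 1/10th of
# true hits, given the 1-in-1000 prevalence assumed upthread?
prevalence = 1 / 1000
tpr = 0.99               # assumed true positive rate
target_ratio = 1 / 10    # desired FP : TP ratio

# FP/TP = fpr * (1 - prev) / (tpr * prev)  =>  solve for fpr
required_fpr = target_ratio * tpr * prevalence / (1 - prevalence)
print(f"required false positive rate: {required_fpr:.4%}")
```

This prints about 0.0099%, i.e. the detector must misfire on roughly one
innocent face in ten thousand.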

~~~
jessaustin
You can't get around the fact that rare events happen rarely. False positive
rate is the rate of false positives: given that this one event is a positive,
what is the chance that it is a false positive? This is important for
decision-making in a real situation. Your suggested definition is for a much
less helpful statistic, no matter how comforting it might seem to accuse the
field of statistics of "not-technically-lying". In fact your idiosyncratic
definition has the opposite effect from the one you seem to seek: when "true"
negatives are far more common than "true" positives, false positives typically
will also be far more common than true positives.

~~~
a1369209993
Yes, I know what a false positive rate is, hence the scare-quotes everywhere I
misused "rate". And I'm (obviously) not accusing the _field of statistics_ of
not-technically-lying; I'm accusing supporters of face recognition police
tools of not-technically-lying. False positive rates are actively misleading
when (trying to avoid) answering the question "given that this one event is
_reported_ positive, what is the chance that it actually _is_?".

------
petermcneeley
Perhaps humans will reapply for this job:
[http://www.rcmp-grc.gc.ca/en/gazette/never-forget-a-face](http://www.rcmp-grc.gc.ca/en/gazette/never-forget-a-face)

~~~
thisacctforreal
People with rare abilities have a chance to refuse or whistleblow unethical
work.

Automated facial recognition systems will listen to whatever they’re told
without question, and can watch all cameras at all times.

Add gait recognition and you won’t be able to go anywhere within CCTV without
it being known.

~~~
AstralStorm
Not with current failure rates, where they are unable to generalise across
glasses, facial hair, bad lighting, different clothing, etc.

------
camel_gopher
The Nest doorbell is fairly good at this, but it has a relatively narrow field
of view and 1080p resolution. Most of the CCTVs out there have a very wide
field of view, which makes the number of pixels per face low.

~~~
jjxw
As another commenter pointed out, it also doesn't have to sift through
thousands of faces. Chances are the doorbell is only exposed to a handful of
faces a day which greatly reduces the volume of false positives regardless of
the field of view and resolution.

~~~
bsenftner
Modern FR systems can compare tens of millions of faces per second, per core,
on a $99 Intel Compute Stick. The thing that sucks about Nest and similar
"security cameras" is their wide angles - that makes the distance of
recognition far too close. To detect a face, FR needs between 28 and 54 pixels
of head height, but to recognize with accuracy it requires a minimum of 120
pixels of head height. A Nest camera will not produce a head that large until
it is 3 feet from the camera. I'd prefer to make recognition at 30 feet or
more, preferably when a bad actor is not yet inside the property.
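
A rough pinhole-camera sketch of why a wide-angle lens kills recognition
range. The field of view and head size below are illustrative assumptions
(a ~130° horizontal FOV doorbell-style camera at 1080p, 0.25 m head height),
not specifications of any actual product:

```python
import math

def head_pixels(distance_m, fov_deg=130, h_res=1920, head_m=0.25):
    """Approximate pixels spanned by a head at a given distance,
    assuming square pixels and a simple pinhole model."""
    # Width of the scene visible at this distance:
    scene_width = 2 * distance_m * math.tan(math.radians(fov_deg) / 2)
    # Pixels per metre, times physical head height:
    return head_m * h_res / scene_width

for d in (0.9, 3.0, 9.0):  # metres
    print(f"{d:4.1f} m -> {head_pixels(d):6.1f} px of head height")
```

Under these assumptions, the 120-pixel recognition threshold is crossed only
inside roughly a metre (about 3 feet), while at 3 m the head is in the
detect-but-not-recognize range, consistent with the figures in the comment
above.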

------
jfasi
> Police facial recognition cameras have been trialled at events such as
> football matches, festivals and parades... High-definition cameras detect
> all the faces in a crowd and compare them with existing police photographs,
> such as mugshots from previous arrests.

> "When we first deployed and we were learning how to use it... some of the
> digital images we used weren't of sufficient quality," said Deputy Chief
> Constable Richard Lewis. "Because of the poor quality, it was identifying
> people wrongly. They weren't able to get the detail from the picture."

> Information Commissioner Elizabeth Denham said the issue had become a
> "priority" for her office.

When you roll out facial recognition technology at an event with ten thousand
people and expect your magical facial recognition system to pick out just the
bad guys, you quickly discover that even a system that's 99.99 percent
accurate has a 63 percent chance of giving you at least one false positive in
that crowd. In reality these systems are probably closer to 95 percent
accurate, in which case you're looking at _vastly_ more false positives. Like
hundreds per event.
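
The 63 percent figure is just 1 − 0.9999^10000; a quick check of both accuracy
levels (hypothetical numbers from the paragraph above):

```python
# Chance of at least one false positive among n innocent faces,
# for a given per-face accuracy.
n = 10_000
for accuracy in (0.9999, 0.95):
    p_at_least_one_fp = 1 - accuracy ** n
    expected_fps = n * (1 - accuracy)
    print(f"{accuracy:.2%} accurate: P(>=1 FP) = {p_at_least_one_fp:.1%}, "
          f"expected FPs ~ {expected_fps:.0f}")
```

At 99.99% accuracy you expect about one false hit per event; at 95% you expect
around five hundred.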

This is the exact same statement as "breast cancer screening is staggeringly
inaccurate." Do you know why annual breast cancer screenings aren't
recommended for women under 40? Because breast cancer is so rare in that
population that the high false positive rate outweighs the benefit of early
detection. The correct thing to do is to screen _only if there is already a
suspicion of cancer._

Similarly, pervasive surveillance where you (statistically) consider everyone
a possible criminal is a waste of time. Anyone who's ever taken a probability
course will tell you that unless your facial recognition tech is fantastically
accurate you'll waste all your police officers' valuable time sorting through
false positives.

This is not an image quality issue. It's not a data quality issue. It's not a
deployment issue. It's a Bayesian statistics issue. Unless your tech has
superhuman 99.999% accuracy, it will drown you in false positives when you're
looking for a needle in a haystack. I'm sure the facial recognition is just as
wonderfully accurate as breast cancer screening; it's just being applied to a
population with a stupidly rare prior.

This is a statistics 101, freshman undergrad-level mistake, and frankly it
makes me mad as hell that authorities are stupid enough to make it and
boneheaded enough to endanger their entire population's civil liberties in the
process.

For more info, check Wikipedia's page on the aptly-named prosecutor's fallacy:
[https://en.wikipedia.org/wiki/Prosecutor%27s_fallacy](https://en.wikipedia.org/wiki/Prosecutor%27s_fallacy)

~~~
tzs
The base rate fallacy [1] and the false positive paradox [2] seem to be much
more on point for using face recognition for screening than the prosecutor's
fallacy.

[1]
[https://en.wikipedia.org/wiki/Base_rate_fallacy](https://en.wikipedia.org/wiki/Base_rate_fallacy)

[2]
[https://en.wikipedia.org/wiki/False_positive_paradox](https://en.wikipedia.org/wiki/False_positive_paradox)

~~~
thaumasiotes
What exactly are you trying to correct? Read your own link.

[https://en.wikipedia.org/wiki/False_positive_paradox](https://en.wikipedia.org/wiki/False_positive_paradox)
says:

> concluding that a positive test result probably indicates a positive
> subject, even though population incidence is below the false positive rate,
> is a "base rate fallacy".

Whereas
[https://en.wikipedia.org/wiki/Prosecutor%27s_fallacy](https://en.wikipedia.org/wiki/Prosecutor%27s_fallacy)
says:

> At its heart, the fallacy involves assuming that the prior probability of a
> random match [i.e. the odds of a positive test result] is equal to the
> probability that the defendant is innocent [i.e. the odds of a positive
> subject]. For instance, if a perpetrator is known to have the same blood
> type as a defendant and 10% of the population share that blood type, then to
> argue on that basis alone that the probability of the defendant being guilty
> is 90% makes the prosecutor's fallacy

Those are the same mistake. The prosecutor's fallacy is a base rate fallacy.

~~~
tzs
False positive paradox: "The false positive paradox is a statistical result
where false positive tests are more probable than true positive tests,
occurring when the overall population has a low incidence of a condition and
the incidence rate is lower than the false positive rate"

Prosecutor's fallacy: "The prosecutor's fallacy is a fallacy of statistical
reasoning, typically used by the prosecution to argue for the guilt of a
defendant during a criminal trial".

The false positive paradox is a result. To get to a prosecutor's fallacy or
base rate fallacy from it, you have to use the result in a way that involves
certain fallacious reasoning.

There's nothing necessarily wrong with using a screening test that the false
positive paradox applies to, as long as you recognize its limitations. When
you are looking for a needle in a haystack (which is typically when you hit
the false positive paradox) it gives you a smaller haystack to search for the
needle, but that can still be a big improvement over not using it as long as
it has a low false _negative_ rate.

~~~
jessaustin
That is simple arithmetic, not a "paradox". Those who can't do that arithmetic
will be ill-equipped to use this "smaller haystack" in a sensible or just way.
Instead, they will simply arrest the first member of that smaller haystack
they find.

------
rqs
Remember how a few weeks ago or so people were talking about China's
jaywalker-shaming thing?

I wonder how many people got their face, address and ID publicly displayed
there as false positives, and maybe it even affects their "Social Credit
Score".

I believe the boldness to blindly implement such a system, with false positive
rates like these, clearly shows how incredibly "responsible" our government
really is.

~~~
nayuki
I'm curious about how the China facial recognition experiment works too.
Granting the generous assumption that the matches are correct, I wonder if
they are using mobile phone tracking or RFID card reading to supplement the
visual recognition system.

~~~
rqs
I didn't find any mention of supplementary systems, so I don't actually know.

The only thing I did find is mentions of the accuracy rate: some over 90%,
some over 99%, none 100%.

------
jimnotgym
So the UK police, under huge budgetary pressures (there are 21,000 fewer
police than 7 years ago, at the start of "the cuts"), were duped into
investing in an expensive technology that promised to save them resources in
catching criminals. The pressure to get results overrode common sense and
concerns about civil liberties. A bit like the DNA "fingerprint" craze, which
is now unraveling with a number of convictions being challenged. If it had
been treated from the beginning as corroborating evidence, rather than The
Answer, this would never have happened.

I am amazed every day at work that everyone expects technology to be a
"silver bullet", and is outraged if they still have to do anything themselves.

------
Maybestring
There's an easy fix here. Criminalize looking like a criminal.

~~~
bigiain
"It's a crime to be broke in America! And it's a crime to be Black in
America!" -- Spearhead/Michael Franti, 1994.

