
California Police Are Sharing Facial Recognition Databases to ID Suspects - jmsflknr
https://onezero.medium.com/california-police-are-sharing-facial-recognition-databases-to-id-suspects-3317726d31ad
======
landcoctos
My local PD wants to do this. They need the City Commission to pass a
resolution allowing them to sign this agreement.

I spoke with one of my commissioners, who then had a meeting with the Police
Department. The end result was that the Police removed the item from the
Commission meeting because they needed more time to prepare, justify, and lay
out policies for its use.

Had I not reached out, this likely would have been rubber-stamped.

~~~
landcoctos
For the record, I am against any use of facial recognition; however, I do
acknowledge that many are not.

Thus, I would at a minimum like to see a clear policy for when it can be used,
image retention (if any), image sharing, etc.

~~~
nerdponx
_Thus, I would at a minimum like to see a clear policy for when it can be
used, image retention (if any), image sharing, etc._

This is what's important. There need to be clear legal expectations set about
the security of this data. It has legitimate uses, but it should be a pain in
the ass to access and share, like HIPAA. You have to really need it and make a
case for needing it.

------
kbos87
Facial recognition technology has all sorts of situations where one can
imagine it would be useful. Unfortunately, all of the ways you could use
it to suppress, discriminate, silence, and intimidate people outweigh any
upsides by miles. Facial recognition tech should be outlawed, unequivocally,
plain and simple. I’m not holding my breath that will happen.

~~~
SN76477
Any profiling or identifying should require a warrant or reasonable (and well
documented) suspicion.

------
gumby
I have a hard time getting too upset about this _as described in the article_.
Sharing _police_ fingerprints and mug shots doesn’t sound too bad.

There are of course “but”s: \- should mug shots be retained for people later
freed/not convicted? Certainly. It is the case today for photos, prints, and
(in California) DNA.

\- expanding into the DL database, as mentioned in the article, seems like
dangerous scope creep — though what if they are investigating a driving
offense like a hit and run?

Anyway, a useful article. But not _necessarily_ a bad thing, for a change.

------
rhacker
San Jose, 2010

Me to cop: That's him. You can see him breaking into the car here at 10:15,
then at 12:05 we have a perfectly clear view of his face.

Cop: We can't really do anything with this, I mean we can take the copy of the
video but we don't have any way to identify him.

Me: So we can't really do anything?

Cop: Yep, sorry.

~~~
SN76477
I hate to say this, but I believe that petty crime has to be accepted as a
cost of a truly free society.

I would not want to be in your situation, but the needs of the many outweigh
the needs of the few.

~~~
unreal37
Um, no thank you.

I've seen a mugging in Brazil. They are so frequent there that everyone I
talked to has been mugged on the street. Friends of mine have been carjacked
and kidnapped for 8 hours.

The police won't investigate a mugging. You're wasting your time to report it.

Let's not "accept" crimes that we think are not serious. All crime needs to be
investigated and appropriate punishment given. Living in a society where some
crime is OK is not a great place to live.

~~~
sixothree
I wouldn't consider mugging a petty crime.

~~~
dsfyu404ed
I wouldn't consider anything violent or premeditated "petty".

~~~
SN76477
Correct.

And I say over and over, it is time to evaluate the true cost of crime.

~~~
rhacker
I think that's a different statement than the one you said earlier.

Society should accept a little crime to be free != evaluate the true cost of
crime.

So if there's a neighborhood that has the potential to sell Condos for $1.7M
each but currently has houses that are $50k and we find out that the crime in
the area is keeping that value so low - should we go after the thieves for
stealing $1.7M from homeowners?

I'm just saying because that's one of the ways you add up the "true cost of
crime".

~~~
SN76477
I'm thinking on a very small level. Policing efforts should cost less than the
crime. Too often it is the opposite.

I would rather someone speed through my neighborhood than have 3 police cars
make a spectacle and spend thousands of dollars.

Speeding is a problem fixed by proper education (in theory). It is not fixed
by chasing someone down and giving them a speeding ticket.

------
User23
I came to a realization on this subject while visiting Las Vegas:
Pragmatically, a maximally high trust society and a maximally low trust
society permit the same behavior patterns. In Las Vegas casinos, which are low
trust and have high surveillance, one can comfortably leave tens of thousands
of dollars or more on a gaming table while one goes to use the bathroom or
whatever. In a hypothetical high trust low surveillance environment one could
do the same.

The difference is, we know how to engineer for low trust and high
surveillance. We don't really know how to engineer for high trust and low
surveillance. So our practical choices are either to live in a low trust low
surveillance society, constraining ourselves as that requires, or to
live in a low trust high surveillance society that simulates a much more
pleasant high trust society.

~~~
carapace
> live in a low trust high surveillance society that simulates a much more
> pleasant high trust society

And in a high surveillance society, whatever the initial conditions, the
cost/benefit ratio of cooperation vs. defection is biased towards trust. In
other words, the _ersatz_ high trust society would become the real thing, of
course only provided the surveillance is trustworthy. For example, China's
social credit system _might work_ if it is administered _sans_ corruption. It
has to be self-reflexive.

As incredible as it seems, _Mrs. Grundy_ will save us.

------
mcny
Previously on HN:

[https://news.ycombinator.com/item?id=20563430](https://news.ycombinator.com/item?id=20563430)

> Today, you are not you, you are your data, a persona. And you are somehow
> responsible for it or anything that casts a similar shadow.

I think it is a similar case here. The main problem won't be that my face is
out there. The problem will be that the police will stop you or stop by at
your house to "just have a chat" because some algorithm matched your face to
some input somewhere with a 90% confidence.

By the way, who owns the photographs, and how do the departments have authority
to share them with some company? If they can share them with that company, why
not make the data public and let everyone have access to it? I'd like to play
with the data as well.

~~~
johnday
Do you think there's more chance of this happening, versus Barbara down the
shops being shown a photograph and saying "oh yes, that looks just like mcny
from two doors down"?

~~~
pavel_lishin
I'm not mcny, but yes, of course. It'll be massively cheaper to just yank the
confidence slider down until results pop out, vs. sending meatbags walking
around an area, showing photographs to neighbors.

------
justchilly
A question for people who are against facial recognition: what methods of
identifying people are you OK with? Are you OK with the police/news asking
other people to help identify someone? To me that sounds like facial
recognition with extra steps.

The arguments against facial recognition, like that there can be false
positives, or that it can affect some groups more than others: don't those
also apply when people are identifying people? If so, isn't the real solution
to require more evidence than just a facial match, rather than to ban an
effective way of narrowing a suspect pool? That way police can spend less time
manually identifying people and more time gathering other evidence.

~~~
sjaknanxnnx
I’m concerned because probability is not intuitive.

Suppose a store is robbed, and there’s a video.

The police identify some suspects - the guy who just got out of jail for
robbing the same store, and another person the store owner had a dispute with.
Neither of them look like the robber in the video. Then the police take a
still from the video and knock on some doors around the block. Somebody
recognizes the person in the video, and the police investigate that person.
This scenario seems pretty fair to me.

Now suppose the police run it through the facial recognition system. It
identifies one person as a 99% match, and the police go investigate this
person. This scenario does not seem so fair to me.

Here’s how I see the math:

P(A) = P(robber has a doppelgänger living on the same block) = .01

P(B) = P(robber had a doppelgänger somewhere in the database) = .9

P(X) = P(police screw up investigation, and will convict the suspect whether
or not they are guilty) = .2

P(AX) = .002

P(BX) = .18

The exact numbers are made up, but as long as P(A) << P(B), you can see how
this tech will result in a huge increase in false convictions. Even if P(X) is
low, the number of false convictions increases by a factor of P(B)/P(A).
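The arithmetic here can be sanity-checked with a short script. This is only a
sketch using the made-up probabilities from the comment above, not real data,
and it assumes the events are independent:

```python
# Illustrative sketch of the base-rate argument. The probabilities are the
# commenter's made-up figures, not measured data.
p_a = 0.01  # P(A): robber has a doppelganger living on the same block
p_b = 0.90  # P(B): robber has a doppelganger somewhere in the database
p_x = 0.20  # P(X): police convict the suspect whether or not they are guilty

# False-conviction probability for each method, assuming independence.
p_ax = p_a * p_x  # door-to-door canvassing around the block
p_bx = p_b * p_x  # database-wide face search

print(round(p_ax, 4))      # 0.002
print(round(p_bx, 4))      # 0.18
print(round(p_bx / p_ax))  # 90: false convictions scale by P(B)/P(A)
```

However loose the inputs, the ratio P(B)/P(A) is what drives the blow-up: the
wider the search pool, the more likely a look-alike exists in it.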

~~~
justchilly
I'd argue the %s are not intuitive either way. In fact, if P(A) does happen,
and there really is an unlucky doppelganger, that person is very likely to be
charged. That could have been avoided if the technology had produced 5 other
suspects who don't live on that block. Should it therefore be allowed for
criminal defense, if not prosecution?

The issue with the P(A) and P(B) argument is that police already use databases
heavily, and most people don't have any problem with it. But why, when it
comes to facial recognition, is it too dangerous to use technology to drive
efficiency?

If they're looking for somebody named Jane Doe, anybody with that name shows
up on a list and police investigate. Of course, if there are Jane Does in a
2-mile radius, they start with those. So why not just say that if the system
delivers a match within x accuracy and the person is within y residents (plus
a variety of other variables), and x/y is below a threshold, then the match
can be presented to police for further investigation?
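The gating rule being proposed could look something like the following sketch;
every name and threshold here is hypothetical, chosen only to illustrate the
idea of combining match confidence with proximity before a match is surfaced:

```python
# Hypothetical sketch of the gating rule described above: a face match is only
# surfaced to investigators if the match score is high enough AND the person
# is close enough to the scene. All names and thresholds are illustrative.
def should_surface(match_score: float, distance_miles: float,
                   min_score: float = 0.95, max_distance: float = 2.0) -> bool:
    """Return True if a candidate match may be shown to investigators."""
    return match_score >= min_score and distance_miles <= max_distance

print(should_surface(0.97, 1.5))   # True: strong match, lives nearby
print(should_surface(0.97, 40.0))  # False: strong match, but far away
print(should_surface(0.80, 0.5))   # False: weak match, even though nearby
```

The point of such a rule is that neither signal alone is enough; a threshold on
each keeps a weak-but-nearby or strong-but-distant match off the list.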

Searching databases for matches is fine for names, fingerprints, shoe prints,
tire tracks, and fiber analysis - but not faces? I personally wonder if it's
really any different, or if it's just better tailored for the media outrage
machine because "China does it", or because "facial recognition targets
minorities".

~~~
sjaknanxnnx
I think police searching databases for low-quality evidence like tire tracks,
shoe prints, and fiber analysis is a very dodgy practice, for these exact
reasons. The key is the specificity of the match, and I don’t think facial
recognition is good enough. Fingerprints and DNA can be, but there are still
known cases of people being falsely charged based on database searches with a
partial match.

This tech can be good if applied to a narrow range of people like you suggest
(e.g. only searching people who live in neighboring blocks), but nobody is
actually doing that. We should pass laws requiring a rigorous analysis of
these probabilities for such databases to be used, including a conversation
about what rate of false positives we are willing to tolerate. Guardrails
should be put in place to enforce those limits. If this is too hard, we don’t
have a strong enough handle on this technology to be using it.

Here’s the scenario that scares me the most:

Police identify a suspect using facial recognition. Then they put that person
in a lineup for a witness. Of course the witness is going to say “that’s the one!”
because the suspect actually looks like the perpetrator. The witness will be
sure, the cops will be sure, and a jury will convict. And this scenario is
completely determined by the use of the facial recognition database. This will
happen unless we pass laws to prevent it.

------
alfromspace
This is something I think is justifiable in time-sensitive cases like tracking
down a kidnapper, but I fear there's no way to grant such power and expect its
use to always remain narrow. In my opinion, most issues in our country come
from the compounding of lots of well-intentioned systems built up over several
centuries that no longer even remotely serve their original purposes.

~~~
icxa
That's how every infringement of our civil liberties has come into existence:
It has one or two very limited justifiable cases, and everyone goes 100% all
in on those and focuses on them, but not on what we will lose. Everyone always
dismisses it as a slippery slope fallacy, but here we are, with our rights
against illegal search and seizures almost completely chipped away. Asking an
officer for a search warrant is almost a formality at this point. Cops
routinely exercise civil asset forfeiture and make it impossible to regain the
goods lost, the 3 letter agencies are downloading and analyzing anything we
type online and building comprehensive profiles and behavior models and
running our actions online against said models, etc, etc.

~~~
alfromspace
I see far more slippery slopes than slippery slope fallacies. It makes me want
to reject these kinds of ideas from the get-go, because the track records of
our bureaucracies and representatives have given us no good reason to believe
any great new idea won't be expanded far beyond its scope sooner or later.

~~~
close04
> every great new idea won't be expanded far beyond its scope sooner or later

And when the expansion is justified by shocking statements ("stop terrorists
and pedophiles, think of the children" type stuff) that's when people should
stop and think. Because anyone trying to convince you by shocking you probably
wants to hide something in that shock.

------
ThrustVectoring
The cat's out of the bag with this technology - it exists, and citizens will
demand that the police use it to help solve crimes. The way forward is to
ensure that it gets used as an investigative tool, rather than as part of mass
surveillance.

The way to do that is to make sure that querying a facial recognition database
is too expensive for ubiquitous use. A court order per face you want to
identify is likely enough, though I'd like an additional $100-$1000 fee to
discourage rubber-stamping.

IIRC, the police have hired people who are just _really_ good at recognizing
faces and given them a giant stack of photographs of criminals. Driving down
the difficulty of that sort of process - identifying suspects with known faces
- seems actually reasonable. Driving down the cost of identifying faces to the
point where you can have a historical database of identity-annotated sidewalk
footage of an entire city is the sort of thing we need to fight.

~~~
dsfyu404ed
>IIRC, the police have hired people who are just really good at recognizing
faces and given them a giant stack of photographs of criminals. Driving down
the difficulty of that sort of process - identifying suspects with known faces

It's believed in some circles that this is at least sometimes just BS and a
cover for parallel construction.

------
dannykwells
I often criticize SF for being overly zealous around weird issues...but their
ban on facial recognition is looking more prescient by the day. I hope other
cities follow suit.

Also, non sequitur, have there been any proposals for how to detect when your
country has become an authoritarian dystopia?

------
raven105x
These are essentially 'super-technologies', much like government-requested
back doors and master keys. It's not an overstatement to say that the
potential they hold is on par with human genetic engineering - something the
civilized world has thus far agreed is a road best avoided. Sure, the ethical
dilemmas they pose are enough to give pause - but are the following not much
greater risks?

* These systems will crowdsource "evidence", and by proxy, accountability - what happens when a mistake is made?

* Increased centralization will eventually enable 'single point of failure' scenarios in systems of a national/international scale.

* What happens when someone misuses or gains unauthorized access to such systems? Will the damage be reversible?

Such technologies will be deployed before accountability for them is defined,
leading to a temporary absence thereof. Systems of the past (digital or
otherwise) were inherently tolerant to misuse, malfunctions & misfires because
only so much damage could be done before the errant behavior was discovered &
handled. Today, how many seconds would it take to plunge a country into chaos?

------
proc0
I'm glad SF outlawed facial recognition. At least they got one law right.

~~~
justchilly
San Francisco has the #1 highest property crime rate in the country. Can
hardly compliment their policing policies.
[https://www.sfchronicle.com/crime/article/The-Scanner-San-
Fr...](https://www.sfchronicle.com/crime/article/The-Scanner-San-Francisco-
ranks-No-1-in-13267113.php)

~~~
proc0
SFPD don't even enforce homelessness laws properly and let people live in
tents on the same street for years. However, facial recognition won't fix any
of this, and will just make things worse.

------
throwawaysea
Good. This means police will be more efficient in identifying and catching
suspects. Facial recognition is always used as a first-pass filter, so the
false positive rate is not relevant - it simply narrows the search space so
that limited police resources are used efficiently.

~~~
inetknght
> _This means police will be more efficient in identifying and catching
> suspects._

That's quite a claim. Prove it.

> _Facial recognition is always used as a first-pass filter, so the false
> positive rate is not relevant_

I don't understand. Are you saying that using a filter negates a false
positive? I don't think that's accurate at all.

> _it simply narrows the search space so that limited police resources are
> used efficiently._

What about the rights of someone who's been falsely implicated?

~~~
throwawaysea
> That's quite a claim. Prove it.

Why does it need to be proven before it is implemented? Did you ask for proof
that use of Microsoft Office or electricity by police departments would make
them more efficient?

It seems inherently obvious and frankly, common sense, that a network of
cameras and facial recognition software are going to be more effective at
identifying perps in the wild than a handful of officers patrolling the
streets, needing to remember the faces of everyone they're on the lookout for,
and scanning their surroundings constantly for matches.

> I don't understand. Are you saying that using a filter negates a false
> positive? I don't think that's accurate at all.

The concern often raised is that trusting an imprecise algorithm is risky. But
in reality, matches from facial recognition software are reviewed by humans as
a second-pass filter prior to taking action (like dispatching officers). We
already trust humans to perform policing by the very act of having police
forces exist. So there is no _elevated_ risk introduced by the use of facial
recognition that isn't proportionally offset by a higher rate of true positive
matches, since any elevated false positive rate from the algorithm is
contained by the humans reviewing potential matches first.

> What about the rights of someone who's been falsely implicated?

They still have rights, and can fight those claims. Why would you expect this
to be any different due to the use of facial recognition? If the suspect looks
like whoever the police are seeking, then they would be exposed to the risk of
implication independently of the existence of facial recognition. And there are
ways (even if not perfect) to fight those claims. We retain policing as a
necessary part of society despite its imperfections because it is necessary
and hugely better than the alternative of not having police forces. And all
this holds true even with police departments adopting technology - whether
that is electricity, Microsoft Office, or facial recognition.

~~~
inetknght
> _Why does it need to be proven before it is implemented?_

We're talking about peoples' safety and justice. It's not just some product
that people can return if they don't like it.

> _Did you ask for proof that use of Microsoft Office or electricity by police
> departments would make them more efficient?_

I didn't have that opportunity. But yes, I would have.

> _It seems inherently obvious and frankly, common sense, that a network of
> cameras and facial recognition software are going to be more effective at
> identifying perps in the wild_

I don't think it's obvious. Indeed, I think the potential for abuse is very
high. It would be easy for someone to say "the algorithm said so" to someone
who's been accused. How would someone without specific knowledge of the
algorithm be able to dispute that?

> _But in reality matches from facial recognition software are reviewed by
> humans as a second-pass filter prior to taking action (like dispatching
> officers). We already trust humans to perform policing inherently by the act
> of having police forces exist, so there is no elevated risk introduced by
> use of facial recognition_

You are wrong.

Matches from facial recognition software might be reviewed by humans. But then
again, payroll is reviewed by humans too and that has tons of problems. If we
can't fix our own money-tracking systems, then how exactly do you think we can
keep our facial recognition software working correctly?

To claim that "_there is no elevated risk introduced by use of facial
recognition_" makes me wonder whether you've thought about how it could even
_unintentionally_ cause problems. That's a very dangerous line of thought to
ignore, by the way.

> _They still have rights, and can fight those claims._

Bullshit. Not just bullshit, but _fucking_ bullshit. People who are accused
_today_ don't have the knowledge and money necessary to fight claims that
don't even involve magic hidden behind the guise of an algorithm. When [1]
officers [2] plant [3] evidence [4] and prosecutors [5] go [6] for [7] plea
deals, the public suffers. It's especially perverse when the people who are
supposed to be a part of the justice system argue that "they're guilty of
_something_". Innocent people's lives can be _ruined_ when they don't have the
ability to demonstrate their innocence [8]. I'm going to stop adding links
because these ones were found with literally just thirty seconds of searching.
It's _that_ common of a problem.

I could see you then arguing that the goalpost is moving: solve the officer
problem or solve the prosecutorial problem. Of course you're right. But you'd
also miss the point: adding _more_ abilities for "law enforcement" to put
people behind bars is _not_ what the world needs today.

You support adding a new, ever more complex, way for people to be caught up by
the law for reasons completely foreign to them. Your love for computers and
algorithms and AI, and desire to see them in action, blinds you to the real
world. I challenge you to meet people, innocent and not, who've been affected
by law enforcement. I challenge you to really _think hard_ about what you want
from facial recognition.

[1]: [https://www.nbcnews.com/news/us-news/st-louis-officer-
execut...](https://www.nbcnews.com/news/us-news/st-louis-officer-executed-
suspect-planted-gun-his-car-prosecutor-n788626)

[2]: [https://www.cbsnews.com/news/body-cam-video-baltimore-
police...](https://www.cbsnews.com/news/body-cam-video-baltimore-police-
department-officer-planted-drugs/)

[3]: [https://news.yahoo.com/former-florida-officer-allegedly-
plan...](https://news.yahoo.com/former-florida-officer-allegedly-
planted-180952213.html)

[4]: [https://mynewsla.com/crime/2017/11/12/lapd-officer-
planted-c...](https://mynewsla.com/crime/2017/11/12/lapd-officer-planted-
cocaine-suspect-department-examining-body-camera-footage/)

[5]:
[https://www.theatlantic.com/magazine/archive/2017/09/innocen...](https://www.theatlantic.com/magazine/archive/2017/09/innocence-
is-irrelevant/534171/)

[6]:
[https://digitalcommons.law.byu.edu/cgi/viewcontent.cgi?artic...](https://digitalcommons.law.byu.edu/cgi/viewcontent.cgi?article=1290&context=jpl)

[7]: [https://www.theatlantic.com/politics/archive/2017/05/plea-
ba...](https://www.theatlantic.com/politics/archive/2017/05/plea-bargaining-
courts-prosecutors/524112/)

[8]: [https://nvcopblock.org/163026/innocent-mans-life-ruined-
by-f...](https://nvcopblock.org/163026/innocent-mans-life-ruined-by-false-dui-
charges/)

------
doc_gunthrop
Perhaps this can lead to the birth of an industry of clothing and facial-wear
as a countermeasure against facial recognition.

~~~
openforce
Burqas?

------
sadris
This is great. More sharing between departments will help catch criminals
faster.

~~~
elliekelly
And help catch innocent people faster, which is not so great.

~~~
sodosopa
They’ve shared fingerprint data ever since AFIS went online in '84. This is an
advancement of the same process.

How many innocent people are caught with fingerprint data?

~~~
lightbyte
Considering fingerprint analysis is pseudoscience and almost entirely made up
(along with most other "forensic science") [1][2], we can assume LOTS AND LOTS
of innocent people are caught with "fingerprint data".

[1]
[http://www8.nationalacademies.org/onpinews/newsitem.aspx?rec...](http://www8.nationalacademies.org/onpinews/newsitem.aspx?recordid=12589)

[2] [https://www.nap.edu/catalog/12589/strengthening-forensic-
sci...](https://www.nap.edu/catalog/12589/strengthening-forensic-science-in-
the-united-states-a-path-forward)

>"There is some evidence that fingerprints are unique to each person, and it
is plausible that careful analysis could accurately discern whether two prints
have a common source, the report says. However, claims that these analyses
have zero-error rates are not plausible; uniqueness does not guarantee that
two individuals' prints are always sufficiently different that they could not
be confused, for example. Studies should accumulate data on how much a
person's fingerprints vary from impression to impression, as well as the
degree to which fingerprints vary across a population. With this kind of
research, examiners could begin to attach confidence limits to conclusions
about whether a print is linked to a particular person."

~~~
sodosopa
I would still want to look at the rate. Is it 1 in 10 or 1 in a million? At
what number are we comfortable for our safety?

~~~
wtetzner
How would you know? If an innocent person is convicted, then they’re assumed
to be guilty.

~~~
rrauenza
Or even more common, a plea bargain recommended by their public defender.

