
Police Feeding Celebrity Photos into Facial Recognition Software to Solve Crimes - saravana85
https://www.vice.com/en_us/article/xwngn3/police-are-feeding-celebrity-photos-into-facial-recognition-software-to-solve-crimes
======
dsfyu404ed
Feeding doctored images, grainy social-media pics of people who look
"kinda similar", and other garbage data into these tools just to generate a
list of possible suspects is a surefire way to generate garbage matches (i.e.
non-matches).

I don't see how this is any better than just putting all the names of people
who have the same demographic info into a hat and picking a set of them.

I guess incompetence might be the thing that saves us from a truly effective
Orwellian state.

~~~
bsenftner
I write FR software, and this practice irritates the fuck out of me. A common
phrase in the industry is that you can't anticipate what stupid things users
will do, and this is a clear case of end-users sticking their fingers into
powered light sockets, at the expense of the general public's safety, by the
very agency enlisted to protect the public. The stupidity is amazing. If there
are ever any regulations around FR's use, probing with doctored or "looks
like" images needs to be illegal, or at minimum no grounds for any warrant of
any kind.

~~~
adossi
If the FR were to learn from its input, I could see how feeding it doctored
images would screw it up in the long run. But if it's just the FR utility,
minus any ongoing machine learning, how does this affect the public's safety
exactly?

Not disagreeing just curious.

------
mabbo
I had thought they were going to say that they use celebrity images to train
the systems. Because celebrities are an _incredible_ training set for facial
recognition.

Consider: you want to train a one-shot matching system for taking photos from
crime-scene cameras and searching a database of driver's licence photos. You
need a training set of different images of the same person, and then lots of
those sets for different people. Boom: celebrities! They have thousands of
photos from thousands of angles. Poorly taken shots by paparazzi. Stills from
films at a zillion different angles. And some nice, face-on shots like a
driver's licence photo. (Heck, if it's the police, they have access to that
celebrity's actual DL photo.)
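The data-collection idea above can be sketched as pair generation for a
verification model. A minimal illustration; the identity names and photo
filenames are hypothetical:

```python
import itertools
import random

def make_training_pairs(photos_by_identity, n_negatives=1, seed=0):
    """Build (photo_a, photo_b, same_person) training pairs for a
    one-shot face matcher from per-identity photo collections.

    photos_by_identity: dict mapping an identity to a list of photo
    paths of that one person (needs at least two identities).
    """
    rng = random.Random(seed)
    identities = list(photos_by_identity)
    pairs = []
    for ident, photos in photos_by_identity.items():
        # Positive pairs: every combination of two photos of the same person.
        for a, b in itertools.combinations(photos, 2):
            pairs.append((a, b, 1))
        # Negative pairs: each photo against a random other identity's photo.
        others = [i for i in identities if i != ident]
        for a in photos:
            for _ in range(n_negatives):
                other = rng.choice(others)
                pairs.append((a, rng.choice(photos_by_identity[other]), 0))
    return pairs

# Hypothetical example data: many shots of the same face, varied quality.
photos = {
    "celebrity_a": ["a_paparazzi.jpg", "a_film_still.jpg", "a_headshot.jpg"],
    "celebrity_b": ["b_red_carpet.jpg", "b_headshot.jpg"],
}
pairs = make_training_pairs(photos)
```

The variety mabbo describes (paparazzi shots, film stills, face-on
headshots) is exactly what makes the positive pairs informative: the model
has to match the same face across wildly different conditions.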

The one nice side effect of training that way is that most celebrities are
really good-looking people, so only the really pretty criminals will be
caught, thanks to training bias.

~~~
Scoundreller
It’s also bad news for people who have stunt doubles, or who thought cosmetic
surgery would keep them free.

------
CoffeeDregs
While I've never been detained, I've been around police enough to know that
they generally know the likely suspects immediately after many crimes (turning
in a statement after an SF Mission smash-grab; talking with a friend about a
-significant- tool theft from their barn). I suspect the reason this
celebrity-photo thing is happening is that the police already know their
primary suspect and they're looking for "modern", crappy supporting evidence.
It's like the JS/PHP of police work...

------
brohee
I cannot fathom how they put obvious mistakes in the training material (e.g.
the hairline in the last example). How are they selling something when their
own training material shows it's total bunk?

The time and money spent on that could be spent on actual police work...

~~~
dsfyu404ed
Because it's not about accuracy. It's about generating a match, no matter how
crappy. It's no different than back when they used to pick up random known
criminals (usually non-white) and charge them with whatever crimes they had no
leads for in order to clean up the number of open cases. Selecting the random
person to pick up is now outsourced to the software. The cops can claim not to
be responsible because they're just doing what the computer says. The company
selling the crap can claim not to be responsible because the cops are using it
wrong.

~~~
taneq
> It's about generating a match, no matter how crappy.

The worst thing about this kind of dragnet search is that the more
comprehensive the surveillance and the more accurate the picture of the
offender, the more likely it is that when you're picked up as the prime
suspect in something you had nothing to do with, you look identical to the
culprit, you were in the area at that exact time, and so on.

And so if you _are_ the unlucky bugger that gets caught up in it, the better
the system works, the harder it is for you to argue that you actually WERE
innocent. Juries aren't good at the Law of Large Numbers.
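To put rough numbers on that base-rate problem, here is a back-of-the-envelope
sketch; the database size and error rate are invented for illustration:

```python
def expected_false_matches(database_size, false_positive_rate):
    """Expected number of innocent people flagged when one probe image
    is searched against an entire database."""
    return database_size * false_positive_rate

# Hypothetical: a 10-million-face licence database searched by a matcher
# with a 0.1% false-positive rate -- very accurate by most standards.
flagged = expected_false_matches(10_000_000, 0.001)  # ~10,000 innocent hits

# Assuming the one true culprit is in the database and is matched, the
# chance that any given flagged person is actually guilty:
p_guilty_given_match = 1 / (1 + flagged)  # roughly 1 in 10,000
```

Even a matcher that is "99.9% accurate" per comparison flags thousands of
innocent look-alikes when run against millions of faces, which is exactly the
Law of Large Numbers point juries tend to miss.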

~~~
Shivetya
Well, we can look at this from the opposite side: it eliminates more people
far faster, giving the authorities fewer people to check. Now, this is fine
provided there are sufficient rules governing that second step.

That is where the process currently fails. There are no published checks and
balances.

Look, we don't have to like this one bit, but we are kidding ourselves to
think it can be stopped. Even the actions of San Francisco can be overridden
at the state or federal level when push comes to shove, if not by a shift in
the electorate. So the task ahead is to regulate strictly how it is used and
what can happen from the beginning to the end of the data's life. The first
rule being: it can never establish guilt on its own.

~~~
sfink
There will be some percentage of false positives. If you're feeding the system
with a biased, hand-picked population, you will get false-positive matches
from that population and not from other populations. If you are on the shit
list, your odds of being accused when you're actually innocent are far, far
higher than those of someone not on the shit list. If the shit list is
predominantly black, then innocent black people will be sent to jail more
often than innocent white people. It's profiling.
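That concentration effect falls straight out of the arithmetic. A sketch with
invented numbers; the group shares, list size, and error rate are all
hypothetical:

```python
def false_positives_by_group(list_composition, list_size, fp_rate):
    """Expected innocent matches per group when searches run only
    against a hand-picked watchlist.

    list_composition: dict of group -> share of the list (shares sum to 1).
    """
    return {group: share * list_size * fp_rate
            for group, share in list_composition.items()}

# If the watchlist is 80% group A and 20% group B, innocent matches land
# on group A four times as often -- regardless of who actually offends.
hits = false_positives_by_group({"group_a": 0.8, "group_b": 0.2},
                                list_size=50_000, fp_rate=0.001)
```

The false positives are distributed in proportion to the list's composition,
not to guilt, so whoever curates the list decides who gets wrongly accused.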

------
anigbrowl
Fantastic, because I look like the literal twin of a very famous actor who
appears in multiple current blockbusters. It's not that either of us is
especially good looking, but we have that sort of face that is bland enough to
take on many different looks.

------
thinkcontext
It will be interesting to see how facial recognition is treated as evidence.
Things like: will judges grant warrants based on an 80% match but not a 70%
match? Will juries consider it more or less reliable than recognition by eye
witnesses? Will defense experts be able to cast doubt on certain algorithms or
techniques vs others?

------
josinalvo
I disagree that this is, per se, a violation of due process.

As long as the matches are not being used as evidence at trial or as evidence
to get warrants, the police can generate their list of leads by consulting a
hamster, or by taking names out of a hat. Right? Or is there (or should there
be) a right not to be considered a suspect on foolish grounds?

Also, maybe these techniques are useful, in the sense that they allow the
police to use a bad image to reduce a pool of suspects to a manageable pile
that can be examined by hand.

~~~
drblast
So you would be comfortable with police essentially randomly selecting people
to arrest for various unsolved crimes? And we'll just let the courts sort it
out?

The reason the fourth amendment exists is exactly to prevent this sort of
thing.

~~~
josinalvo
No, but it is my understanding that even to detain a person you need to meet
some standard of evidence. I am saying: let the police use whatever
lead-generating method they can, as long as they are not using it as evidence
to detain, search, or imprison.

That is, their system can be whatever they want to generate a list of names.
Afterwards, they need evidence to search a person's house, or to detain them,
or to arrest them.

(In legal lingo: I am okay with them using these methods, as long as they are
not considered probable cause:
https://www.nolo.com/legal-encyclopedia/when-police-can-make-arrest-probable-cause.html)

------
drblast
This reminds me of the common practice of using drug-sniffing dogs not to
actually sniff out drugs but as an excuse to do a search.

I think there should be an examination and maybe some additional rules around
establishing probable cause using what amounts to a divining rod.

~~~
e40
Or the "you didn't use a turn signal" as an excuse for a traffic stop.

------
ggggtez
Considering that all the research points to image recognition having basically
no robustness to noise, I imagine the path here is going to be to at least
block its use in court, if not its use to generate leads to begin with.

------
LifeLiverTransp
Surely, with such an excellent track record, criminalistic techniques and
sciences will soon reach the level of a true science, such as psychology.

------
eriktrautman
Can’t get past that awful ad playing with no close button, taking up half the
screen.

------
np_tedious
So this is what actual True Detectives do?

