
Neural Net Trained on Mugshots Predicts Criminals - jastr
https://www.technologyreview.com/s/602955/neural-network-learns-to-identify-criminals-by-their-faces
======
thechao
Alternately: criminal prosecution is targeted at people who "look like"
criminals; thus, the NN is just selecting those people we think look like
criminals, rather than any inherent criminality.

~~~
ganeshkrishnan
This is so true. We are training the neural networks on our own bias and
cannot expect them to be fair. A supervised ML algorithm is only as good as
its training data, and in our current scenario the training data is almost
always flawed.

Judges receiving kickbacks to throw minors into juvenile jail, racial bias in
sentences, jail for marijuana, private prison etc. These are all hallmarks of
a broken penal system, and we cannot and should not train machines based on
this data.

~~~
daveguy
> Judges receiving kickbacks to throw minors into juvenile jail, racial bias
> in sentences, jail for marijuana, private prison etc.

I expect you have some solid evidence for this? Maybe just a citation?
Anything? Obviously there is institutional racial bias, but kickbacks for
private jails? for marijuana? for minors? Really?

~~~
woodruffw
> Judges receiving kickbacks to throw minors into juvenile jail

He's probably referring to "Kids For Cash" [0].

> racial bias in sentences

You can look at federal and state incarceration statistics for yourself.
African-Americans and Hispanic-Americans serve, on average, longer sentences
for the same crimes as their Caucasian counterparts. Here's one source.[1]

> jail for marijuana

Mandatory minimums for possession (especially "with intent to sell") have been
part of federal drug enforcement policy for the last 50 years. Here are a few
of them.[2]

> private prison etc

It was only _this year_ that the DOJ finally committed to eliminating private
prisons in the federal system.[3] They're still a massive part of state
penitentiary systems (see: kids for cash).

[0]:
[https://en.wikipedia.org/wiki/Kids_for_cash_scandal](https://en.wikipedia.org/wiki/Kids_for_cash_scandal)

[1]: [https://www.aclu.org/issues/mass-incarceration/racial-disparities-criminal-justice](https://www.aclu.org/issues/mass-incarceration/racial-disparities-criminal-justice)

[2]:
[http://www.pbs.org/wgbh/pages/frontline/shows/snitch/primer/](http://www.pbs.org/wgbh/pages/frontline/shows/snitch/primer/)

[3]: [https://www.justice.gov/opa/blog/phasing-out-our-use-private-prisons](https://www.justice.gov/opa/blog/phasing-out-our-use-private-prisons)

~~~
daveguy
I misread that as "Judges receiving kickbacks to:" 1) throw minors into
juvenile jail, 2) racial bias in sentences, 3) jail for marijuana, 4) private
prison 5) etc.

But the kickbacks were in the kids-for-cash scandal -- hopefully an isolated
case of criminal corruption among judges. The others are definitely negative
and indications of a broken system, but not at the level of "judges taking
cash for sentences" corruption. Thank you for the clarification.

~~~
bitJericho
Anybody who actually wants to be a judge in the American system is corrupt.
The American system of justice is so obviously flawed that only a person
corrupt in their thinking would want to become a judge.

~~~
grzm
[Removed]

~~~
SomeStupidPoint
Is a loving husband, devoted father, kind neighbor, etc. etc., who drove
trains for the Nazis a good person?

I feel that most people are in (considerably less extreme) versions of that
situation: kind, generous people in general, who simply don't think about how
the fairly banal things they do every day are part of really bad stuff. The
distributed, banal nature of modern evils -- and they were intentionally
structured that way so people _wouldn't_ think about it -- is really what
allows evil to flourish in modern society.

Yeah, it wasn't a big deal that you flipped a train switch (or filled out
mortgage loan paperwork a little shoddily, etc.), and likely someone else
would've done it instead, but if we all refused to do it, the world would be a
better place.

I think many US judges are similar: they're good, slightly selfish people who
just don't think about the evil they contribute to.

Enforcing unjust laws, even fairly, is evil. And (almost) all US judges are
guilty of that. (Eg, mandatory minimum drug laws, which empirically just cause
harm.)

------
decker
Their method had an 89.5% success rate, which might seem great, but is pretty
much worthless in real life. The US has the highest incarceration rate in the
world, so we can use it as an upper bound on the probability of randomly
selecting a criminal from the population (716 per 100k, P=0.00716). This means
that if we apply the same method at random to members of the general
population, there's actually at most a 5.79% chance that a result of
"criminal" is accurate.

    
    
      Maths:
      Let Pc = probability of criminal = 0.00716
      Let Pt = probability of test being accurate = .895
      Probability of criminal given criminal conclusion = Pc * Pt / (Pc * Pt + (1 - Pc) * (1 - Pt)) = 0.0579
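
The arithmetic above can be checked with a short Python sketch. It applies Bayes' rule under the same simplifying assumption as the comment: that the reported 89.5% accuracy holds symmetrically for both classes (i.e., the false-positive rate is 10.5%):

```python
# Bayes' rule: P(criminal | flagged as criminal), assuming the
# classifier's 89.5% accuracy applies to both classes equally.
p_criminal = 716 / 100_000   # US incarceration rate as base rate, 0.00716
p_accurate = 0.895           # reported classifier accuracy

true_pos = p_criminal * p_accurate                # criminal, correctly flagged
false_pos = (1 - p_criminal) * (1 - p_accurate)   # non-criminal, wrongly flagged

posterior = true_pos / (true_pos + false_pos)
print(f"P(criminal | flagged) = {posterior:.4f}")  # -> 0.0579
```

At the population base rate, false positives outnumber true positives roughly 16 to 1, which is where the 5.79% figure comes from.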

~~~
fractalwrench
If the method was specifically targeted at high-crime neighbourhoods, the
chance of an accurate match would presumably increase. You could also map the
entire population of a country and detect previously unknown "criminal"
hotspots. Both are awful, authoritarian ideas, but would potentially give
useful information to support regular policing.

Thankfully, the most likely application of this technology I can see in the
near-future is someone making an app that scores your face for criminality.

~~~
tremon
_Thankfully, the most likely application of this technology I can see in the
near-future is someone making an app that scores your face for criminality._

I don't share your positive outlook on law enforcement officials.

------
woodruffw
Give this to a despot to train on his political enemies (or ethnic/religious
minorities), and you suddenly have a very good NN for condemning innocent
people to jail.

Research should continue into this, but it's worth remembering that the
"criminals" being trained on aren't necessarily _bad people_ in the moral
sense. They're merely the recipients of judgment by some third entity (in this
case, China's legal system).

~~~
c0nducktr
Phrenology for the 21st century.

------
jbpetersen
"Their method is straightforward. They take ID photos of 1856 Chinese men
between the ages of 18 and 55 with no facial hair. Half of these men were
criminals.

They then used 90 percent of these images to train a convolutional neural
network to recognize the difference and then tested the neural net on the
remaining 10 percent of the images.

The results are unsettling. Xiaolin and Xi found that the neural network could
correctly identify criminals and noncriminals with an accuracy of 89.5
percent."
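
The protocol quoted above (train on 90% of 1,856 images, test on the held-out 10%) is a standard holdout split. A minimal sketch of the bookkeeping, with hypothetical placeholder labels standing in for the real dataset, also shows why the balanced 50/50 class mix makes ~50% the chance-level accuracy:

```python
import random

# Hypothetical stand-in for the 1856 ID photos: half labeled 1 ("criminal"),
# half 0 ("non-criminal"), as described in the article.
data = [(f"photo_{i}", i % 2) for i in range(1856)]

random.seed(0)
random.shuffle(data)

split = int(len(data) * 0.9)        # 90% train / 10% test holdout
train, test = data[:split], data[split:]

# A real CNN would be trained on `train`; here a trivial majority-class
# baseline illustrates that a balanced test set yields ~50% by chance.
majority = round(sum(label for _, label in train) / len(train))
accuracy = sum(majority == label for _, label in test) / len(test)
print(len(train), len(test), f"baseline accuracy ~ {accuracy:.2f}")
```

Against that ~50% chance level, 89.5% on the held-out images is the result the paper reports.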

~~~
rdlecler1
I'd like to see what human recognition would be as a null hypothesis. You
could also block out certain features to see what's driving the decision
making here.

~~~
jbpetersen
The article details the specific facial features that were focused on by the
trained model. So that's at least part of what you're looking for.

------
paulajohnson
This doesn't "predict criminals", it predicts those who will be convicted of a
crime, which is not the same thing. Suppose that people have an unconscious
prejudice against those with eyes set close together. They will be
disproportionately convicted, and this neural net will find the correlation.

~~~
bryanrasmussen
Since people assume beautiful people are good, what is the correlation between
these results and people who are considered ugly in their particular
population?

------
empath75
'Criminals' being defined as people who have been convicted by a judicial
system that is full of bias.

~~~
freyr
What is the bias in this case?

~~~
null0pointer
Every country has different laws so a 'criminal' in one jurisdiction may not
be a 'criminal' in another. The training data is fundamentally flawed as the
target classes (criminal and not-criminal) are defined by largely arbitrary
rules.

~~~
freyr
We can simply restrict our attention to one country, as they did in the study,
to eliminate that variability. What is the bias?

------
okonomiyaki3000
This concept is troubling, uncomfortable, and could potentially be the basis
for some very bad policy but none of that is a reason to dismiss it outright.
If these correlations are real, it's worthwhile to find out more about it with
an open mind.

~~~
Ace17
The Minority Report scenario hinges on the fact that the crime predictions are
not completely reliable.

But if we had a 100% sure way of detecting that someone's going to commit a
crime, should we use it? This would be a dangerous change of perspective about
free will and its relation to the law.

~~~
halomru
>if we had a 100% sure way of detecting that someone's going to commit a
crime, should we use it

If it was 100% correct, it seems morally justifiable to use it, as long as you
are a dictatorship. For a democracy, you would need not only correctness, but
independently verifiable correctness. If the algorithm is 100% correct today
but can be influenced tomorrow, you are giving absolute power over the state
to whoever can influence the algorithm. After all, it's impossible to prove
your innocence if you are convicted before the crime.

Basically, after solving the technical problems with creating the algorithm,
you also need a system of checks and balances that can replace the public
verifiability of our current court system.

~~~
oh_sigh
If it was 100%, all you would need to do is intervene in some manner so that
the likelihood drops to 0%. For example, if the system predicts you will beat
your spouse, then force the future-perp to come outside 10 minutes before and
change the scenario to the point where they wouldn't beat their spouse.

With minority-report scenarios, it's not like you actually need to punish
people as if they really did commit the crime, even though the crime is in the
future. All you need to do is prevent it.

------
KKKKkkkk1
I bet you could train a neural net to detect the exact moment when the AI
bubble has jumped the shark.

~~~
foota
Right when you turn on the shark jump detecting AI, would be my bet.

------
sdoering
Well, what goes around comes around, or some such. I remember, during my
studies (literature, culture and such), reading about methodologies used in
the 18th and 19th centuries to detect criminals by their physiological
features.

One person doing these studies was Francis Galton (who, by the way, did quite
a mix of things, from eugenics to statistics and "the wisdom of the
crowd")[1].

Lots of these ideas carried over from the old times into the modern era, like
physiognomy [2], which was also used for racial identification/discrimination
in the 20th century.

Have fun walking deeper into that rabbit hole of history. What I take from it
is that bad ideas never die, even after science has debunked them.

Or as @thechao already said:

> Alternately: criminal prosecution is targeted at people who "look like"
> criminals; thus, the NN is just selecting those people we think look like
> criminals, rather than any inherent criminality.

[1]
[https://en.wikipedia.org/wiki/Francis_Galton](https://en.wikipedia.org/wiki/Francis_Galton)
[2]
[https://en.wikipedia.org/wiki/Physiognomy](https://en.wikipedia.org/wiki/Physiognomy)

[Edit] Formatting

------
YeGoblynQueenne
The paper's conclusion claims that "Furthermore, we have discovered that a
law of normality for faces of non-criminals. After controlled for race,
gender and age, the general law-biding public have facial appearances that
vary in a significantly lesser degree than criminals."

Given the above, I move that the title of this post be changed to "Neural Net
trained on mugshots confirms the findings of Phrenology".

------
sundvor
So were faces of e.g. dodgy GFC bankers included in that pool? Or are we just
talking petty criminals here?

------
stewhuk
I looked at this paper the other day and it looked to me like the non-criminal
faces examples were men with shirt collars, whereas the criminal examples were
men in t-shirts. If they got above random accuracy, then I wonder if they
simply overfitted on that, and the lighting and colour differences between the
two styles of photos.

------
jlos
Did not think this would be the year phrenology made a comeback. . .

~~~
tropo
Why not? The inaccuracy problems were merely poor-quality measurements and
statistics. Prediction accuracy will continue to improve with technology.

------
rsl7
I wonder if it is the same before and after one becomes a criminal. Being a
criminal does not mean they don't feel regret or guilt. Perhaps that is what
is being detected.

------
kuroguro
"In other words, the faces of general law-biding public have a greater degree
of resemblance compared with the faces of criminals, or criminals have a
higher degree of dissimilarity in facial appearance than normal people"

Hmm... so.

You look weird -> people treat you worse -> you don't feel like working with
them -> higher chance of being a criminal.

I know that's a giant leap in reasoning, but that was the first thing that
came to mind.

------
spacemanmatt
Could it be used to identify those likely to commit election or securities
frauds? Or maybe those likely to poison an entire city by ruining their water
supply?

------
grzm
Previous discussion on arXiv paper:

[https://news.ycombinator.com/item?id=12983827](https://news.ycombinator.com/item?id=12983827)

~~~
mc32
So if the 2011 paper shows conclusions similar to this paper's, and if people
as well as this CNN can "tell" criminals apart from non-criminals, could
people's unconscious bias lead them to treat criminal-faced people as actual
criminals and thereby funnel them into criminality?

That is, non-criminal-faced people might show bias against criminal-faced
people such that these criminal-faced people resort to [petty or other] crime
to get by.

Of course, one question is why this bias might have emerged in the first
place. What was its genesis?

------
aktiur
"They take ID photos of 1856 Chinese men [...]. Half of these men were
criminals."

Because half of the general male population is criminal, of course.

The accuracy rate would be very different with a training/testing sample that
takes the base rate of criminality into account.

------
stolk
Let me guess... the NN was trained to spot tattoos? I bet you tattoos and
crime correlate.

------
KaiserPro
Phrenology is still phrenology even if it's digital.

------
mobiuscog
The future of this would make a great film...

