
Police Program Aims to Pinpoint Those Most Likely to Commit Crimes - charrisku
http://www.nytimes.com/2015/09/25/us/police-program-aims-to-pinpoint-those-most-likely-to-commit-crimes.html?_r=0
======
DanBC
We can see what the results of this style of policing are when we look at
juveniles.

[http://www.economist.com/blogs/democracyinamerica/2014/03/am...](http://www.economist.com/blogs/democracyinamerica/2014/03/americas-
prison-population)

> This overreliance on imprisonment can be seen most starkly, and sadly, by
> looking at the juvenile population, which is just under 71,000 nationally.
> Around 11,600 are imprisoned for "technical violations" of their probation
> or parole terms, rather than because they committed a new crime. In 11
> states such juvenile prisoners outnumber those in for crimes against other
> people. In only one state (Massachusetts) did juveniles imprisoned for
> crimes committed against people comprise a majority of juvenile prisoners.
> Around 3,000 are locked up for things that aren't crimes for adults, "such
> as running away, truancy and incorrigibility." Incarcerated children are
> less likely to graduate high-school and more likely to spend time in prison
> as adults. If America is interested in reducing its prison population,
> locking up fewer juveniles for silly reasons would be a good place to start.

------
jmnicolas
"In Chicago, the police have developed a “heat list” of 400 people who are
considered far more likely than the average person to be involved in violent
crime. Factors in compiling that list included their criminal records, social
circles and gang connections. Also a factor was whether they had been victims
of an assault or a shooting."

Yeah, and I'm sure that Chicago detectives had absolutely no clue about those
400 people; they really needed a piece of expensive software to tell them
where to look. The trend is to think that technology will solve all
problems, but it's just wishful thinking imho.

~~~
mentat
Instead of protecting and helping victims, expect them to retaliate and be
waiting when they do... Awesome.

~~~
riskneural
In the insurance industry, if you are the victim of a crash you may lose your
no-claims bonus. The situation is not as simple as the victim of an attack
simply being a victim.

Wouldn't it be the case that, for example, those in the drug business are far
more likely to harm their competition than their market?

~~~
mentat
For what types of attacks is it OK to "take away the no claims bonus" for fair
and equal protection (and scrutiny) by the law?

~~~
riskneural
If the goal is to prospectively keep the peace, then the monitoring should be
directed towards those most likely to break the peace. I think that is rather
orthogonal to fair and equal protection.

------
pdkl95
[https://en.wikipedia.org/wiki/Apophenia](https://en.wikipedia.org/wiki/Apophenia)

Given the problems in police departments (which have fortunately started
appearing in the news), giving the police a system that will essentially let
them see what they want to see is a terrible idea. Police work is already full
of "forensic tools" that don't actually work (like the idea that fingerprints
actually identify someone uniquely, or the various techniques that are
examples of the Birthday Paradox).

While I'm sure that it's possible to use modern techniques to estimate where
crime will occur, it won't work in practice. There are simply too many ways to
bias the results (intentionally or not). I suspect giving police this kind of
tool is simply a way to give legitimacy and cover to their bad behavior.

> including information about friendships, social media activity

COINTELPRO is a helluva drug.

> advocates say predictive policing can help improve police-community
> relations by focusing on the people most likely to become involved in
> violent crime.

That sounds _suspiciously_ like an excuse to improve _white_ communities, by
focusing on the _blacks_ (who have historically been seen as "violent savages"
by racists).

> because our predictive tool shows us you might commit a crime at some point
> in the future

The big question is how long until someone tries to use that "predictive tool"
as _probable cause_.

~~~
Baghard
Apophenia is the _human_ tendency to see patterns in random _data_. Predictive
_analytics_ is a machine that pulls non-random patterns out of data and
presents this as _information_.

Saying that other tools have a bad track record may count as a valid argument,
but to me it is a weak and fatalistic one. Between judging each tool on its
own merits and discarding all tools as useless, the choice seems easy.

Predicting crime works in practice and in theory. These models are not black
boxes; they can be introspected. Bias can be detected and removed.
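
As a toy illustration of what "detecting bias" can mean in practice, here is a
minimal sketch; the data, the two groups, and the choice of false-positive rate
as the fairness metric are all invented for illustration:

```python
# Toy bias audit for a risk model: compare false-positive rates across
# two demographic groups. Data, groups, and metric are all invented.

def false_positive_rate(flagged, offended):
    """Share of people who did NOT offend but were flagged as high risk."""
    innocents = [f for f, o in zip(flagged, offended) if not o]
    return sum(innocents) / len(innocents) if innocents else 0.0

# 1 = flagged by the model / actually offended, 0 = not
flagged_a  = [1, 0, 1, 1, 0, 1]
offended_a = [1, 0, 0, 1, 0, 0]
flagged_b  = [0, 0, 1, 0, 0, 0]
offended_b = [1, 0, 0, 1, 0, 0]

fpr_a = false_positive_rate(flagged_a, offended_a)
fpr_b = false_positive_rate(flagged_b, offended_b)

# Identical offending in both groups, very different flagging:
# a disparity that an auditor can at least measure.
print(f"false-positive rate: group A {fpr_a:.2f}, group B {fpr_b:.2f}")
```

Whether disparities like this can actually be removed is the point under
dispute; the sketch only shows that they can be measured.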

COINTELPRO was a program to infiltrate and disrupt organizations that the
state viewed as unwelcome. Monitoring social media activity is common
detective work, the modern equivalent of an officer peeking over the fence
into your back garden to see a stolen motorbike. Now they can use Google Maps
for that. This is public information: the criminals feel free and safe enough
to post and brag about their crimes on Facebook.

Removing or combating criminal elements in any community will improve that
community, regardless of skin color. Black youth is helped, not suppressed,
when gang recruiters are identified and punished.

Predictive tools are already used as probable cause. Prisoners in Guantanamo
Bay can get a brain-wave reader test. This device will tell you what someone
is thinking about and may reveal the plans of a future terrorist attack.

~~~
pdkl95
> Apophenia is the _human_ tendency to see patterns in random _data_.
> Predictive _analytics_ is a machine that pulls non-random patterns out of
> data and presents this as _information_.

Yes. I know that. Apophenia is the correct term. Your fancy machine that
predicts crime is only as good as the data it is fed and is made worse by the
person who interprets the results. Both of these are easily biased.

> Bias can be detected and removed.

Just like how they removed the numerous biases, assumptions, and bad
methodology that are well-known problems with breathalysers? Even if the model
were theoretically accurate, the implementation can (and will) be wrong. You
seem to be making a just-world assumption about a world without lazy,
incompetent, or malicious people.

> Monitoring social media is common detective work

It can be both. While these methods may be useful for going after _stupid
criminals_, you're ignoring that they are also useful when targeting
activists, political dissidents, etc. If you think this doesn't happen, you
haven't been paying attention.

Your problem is that you are assuming it is only "criminals" who are
targeted, but you live in a world where, to use an obvious example, some
people assume that any black person is a "criminal".

~~~
Baghard
> Both of these are easily biased.

Indeed. But remedies exist. Statisticians can examine the validity of the
data, analysts and detectives can be trained to interpret the results
correctly, and social scientists can point out the dangers of relying solely
on computer systems.

I have no strong view for or against breathalysers. I'll concede that there
may be some errors in those tests. Does that completely invalidate these tools
for judging whether someone is too drunk to drive and may cause harm to
themselves or others? Should we only opt for rigorous methods like drawing
blood samples? My view is: no, we should not. These are valuable tools that
work the large majority of the time and help save lives (at the inevitable
cost of some errors and inconvenience).

In my world I believe in just intentions. Breathalysers are not introduced to
imprison sober drivers; they are there to combat drunk ("lazy, incompetent or
malicious") drivers on our roads.

These methods are useful for catching the savvy criminals too. I am not
ignoring that these systems are also useful for targeting activists and
political dissidents. That's basically what they were built for in the first
place (well, that and the terrorists; see DARPA LifeLog). It's just that now
these tools are being adopted by local police departments.

A baton can be used to subdue a suspect through non-lethal force, and
it can be used to choke a peaceful protester. It will succeed in both tasks.
It's not a stupid, ineffective tool that we should take away just because it
can be used in bad ways. We should make sure to avoid the bad usage, and
provide police with the best baton possible for the good usage.

You assume that this system will be used to justify police brutality and that
this system will be used by people who think that any black person is a
"criminal". I have a higher opinion of the people who join the police. I'd
rather reserve such judgment for the criminals themselves.

~~~
pdkl95
> Does that completely invalidate these tools

Yes, when the error is this large and this easily manipulated. Breathalysers
are notorious for being incredibly broad in what they detect (a bad
false-positive rate), and they are required to assume a 2100:1 ratio when
estimating blood alcohol concentration from the measured breath
concentration. In reality, that ratio varies a lot (by up to +/-800 for some
people). There _is_ a good, science-based reason for that ratio, involving
the partial pressure of EtOH. The reason is valid; it simply ignores the
(large) minority of cases where other factors complicate the analysis.
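
To put rough numbers on the ratio argument, here is a sketch with illustrative
values only (not legal or medical advice):

```python
# A breathalyser measures alcohol in breath and multiplies by an assumed
# blood:breath partition ratio (2100:1 in the US) to report a blood
# alcohol concentration (BAC). If an individual's real ratio differs,
# the reported number is off by the same factor.

ASSUMED_RATIO = 2100

def reported_bac(breath_conc):
    """BAC the device reports for a measured breath concentration."""
    return breath_conc * ASSUMED_RATIO

def actual_bac(breath_conc, individual_ratio):
    """BAC the same breath sample really corresponds to for a person
    whose true partition ratio differs from the assumed 2100:1."""
    return breath_conc * individual_ratio

breath = 0.08 / ASSUMED_RATIO  # a sample the device will report as 0.08

for ratio in (1300, 2100, 2900):  # roughly 2100 +/- 800
    print(f"true ratio {ratio}:1 -> device says "
          f"{reported_bac(breath):.3f}, actual {actual_bac(breath, ratio):.3f}")
```

On these invented numbers, a driver whose true ratio is 1300:1 gets reported
at 0.080 while their actual BAC is closer to 0.050, which is the kind of
error the comment above is pointing at.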

You may suggest that it would be easy to use modern techniques to find a
better formula that accounts for these variations. That would work... but it
has always been possible. You don't need anything particularly fancy to add a
few corrections. These problems - and how to correct them - have been known
for _decades_, yet breathalysers haven't changed. Why? Because an inaccurate
tool gives police the leeway to target a much larger set of people (if they
want to - selective enforcement is a powerful tool).

> Should we only opt for rigorous methods like drawing blood samples?

Yes, absolutely, and I (and _many_ defence lawyers and civil rights
organizations) recommend insisting on a blood test should you ever be asked
to take a breathalyser, because of how inaccurate and manipulable the breath
test is in real-world situations. (Disclaimer: this can vary between states;
see a local lawyer for proper advice.)

> at the inevitable cost of some errors and inconvenience

A necessary feature of a free society is the assumption of innocence until
proven guilty. Law enforcement is _deliberately_ given a harder task, because
errors are not simply an _inconvenience_. Errors risk charging an innocent
person with a crime. You would not call being arrested because of a false
positive an _inconvenience_.

> You assume that this system will be used to justify police brutality and
> that this system will be used by people who think that any black person is a
> "criminal".

I don't need to _assume_ anything. These things are already extremely common
today without the need for advanced data processing techniques. If for some
reason you doubt these facts, you may want to look up the per-capita
incarceration rates by race and compare that to stuff like the drug use rates
for the same groups.

> In my world I believe in just intentions

[https://en.wikipedia.org/wiki/Just-
world_fallacy](https://en.wikipedia.org/wiki/Just-world_fallacy)

> I have a higher opinion of the people who join the police

I prefer to keep my opinions based in reality.

While I'm sure only a minority of police are abusive, the rest are aiding and
abetting by not reporting the crimes committed by other officers. Misprision of
felony is still a crime in the US (18 U.S. Code § 4) (
[https://www.law.cornell.edu/uscode/text/18/4](https://www.law.cornell.edu/uscode/text/18/4)
)

~~~
dragonwriter
> While I'm sure only a minority of police are abusive, the rest are aiding
> and abetting by not reporting the crimes committed other officers.
> Misprision of felony is still a crime in the US (18 U.S. Code § 4) (
> [https://www.law.cornell.edu/uscode/text/18/4](https://www.law.cornell.edu/uscode/text/18/4)
> )

Misprision requires concealment _and_ not reporting; not reporting by itself
is insufficient, and concealment is _active_. See United States v. Johnson,
546 F.2d 1225 (5th Cir. 1977) [0]

[0] [http://openjurist.org/546/f2d/1225/united-states-v-
johnson](http://openjurist.org/546/f2d/1225/united-states-v-johnson)

~~~
pdkl95
I should have been more precise; you're quite correct about the requirements
for misprision. Police absolutely are concealing the crimes of the "bad
apples" in the department. To be precise, I'm sure we can easily find
candidates for all levels of involvement (it's a big country).

There is no shortage of police that are giving cover for the _murders_ in
their department, and most of the rest still look the other way.

------
e12e
"During an August call-in, the speakers told the men that this was their last
chance. Tammy Dickinson, the United States attorney for the Western District
of Missouri, related the story of a man in the program who was given a 15-year
prison sentence for being caught with a bullet in his pocket."

So, yet again software is the new force multiplier? Strict (IMNHO _crazy_)
sentencing guidelines, arguably designed to reduce crime by acting as a
deterrent, now lead to even more filling up of prisons due to targeted
(ab)use against certain groups?

On a side note, I wonder how these algorithms handle police brutality etc. I
can just imagine sitting in such a meeting, and seeing a couple of police
officers in full uniform popping up on that mugshot wall of shame...

------
swframe
There is a thisamericanlife show (Crime Pays) about cops paying kids to give
up (or avoid) a life of crime. It is quite amazing how little it takes to
prevent someone from costing society $40K a year in jail.
[http://www.thisamericanlife.org/radio-
archives/episode/555/t...](http://www.thisamericanlife.org/radio-
archives/episode/555/the-incredible-rarity-of-changing-your-mind?act=2)

~~~
confluence
End of the day, it's cheaper to pay people to do nothing than it is to lock
them up when they become desperate and commit crimes.

------
Wonderdonkey
I started reading this hopefully. I liked the idea that seemed to be
developing at first — reaching out to people before crimes are committed,
potentially saving a human being from a life in prison (not to mention saving
any potential victims).

Instead, it developed into a story of what amounts to premeditated blind rage
against any and all associated with a given criminal.

This isn't new. And it's exactly what we don't need more of.

------
jellicle
Step 1: Discriminatory policing against disliked groups.

Step 2: Run the results of that through the computer.

Step 3: The unbiased computer tells me that disliked groups are more likely to
be charged with crimes! That justifies my discrimination! I knew they were up
to no good! And it's not me, the COMPUTER says it!
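
The three steps above can be sketched as a toy simulation (all numbers
invented): two areas have the same true crime rate, one starts with a few
extra records from biased policing, and patrols are then sent wherever the
records point:

```python
# Toy feedback-loop simulation: areas A and B have identical true crime
# rates, but B starts with slightly more *recorded* crime (step 1).
# Each week, patrols go to the area with the most records (step 2), and
# crime is only recorded where patrols are looking.

recorded = [10, 12]  # [area A, area B]

for week in range(20):
    target = recorded.index(max(recorded))  # "the computer says B"
    recorded[target] += 1  # equal true rates, but only B gets observed

print(recorded)  # the initial 2-record gap has compounded to [10, 32]
```

After 20 rounds the data "confirms" that B is the high-crime area (step 3),
even though both areas offended identically; the circularity needs nothing
more than this arithmetic.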

------
fijal
I'm a bit surprised no one has brought up Minority Report. Philip K. Dick is
living a good afterlife :-)

~~~
_nedR
I am more reminded of Psycho-Pass
([http://www.imdb.com/title/tt2379308/](http://www.imdb.com/title/tt2379308/)),
a terrific anime TV series. It basically explores a future in which a person's
criminal intent (called their crime coefficient), along with other mental
attributes (collectively referred to as one's psycho-pass), is judged by the
Sibyl System, a system that works so well that it acts as judge, jury and
executioner with little human input. It explores a lot of questions, such as
the role of free will and the potential for change in such a world.
Interestingly, a lot of people find it hard to lower their crime coefficients
after they 'stabilize' at a new high (like how it becomes easier to commit a
crime again after having done it once, the same judgement the police make in
this article). Also, as in this article, people who are victims of abuse and
crime often find their crime coefficients inevitably rise.

Don't let the cartoon characters fool you. IMO the complexity and depth of
this series (and a lot of sci-fi anime) far surpasses the likes of Minority
Report and Hollywood sci-fi in general. My description really doesn't do it
justice.

~~~
Raphmedia
The scene where the "cops" have to arrest (or was it kill?) the victim of a
rape because her crime coefficient increased was pretty harsh.

------
phkahler
>> But Mr. Brown, 29, got more than he had bargained for. A police captain
presented a slide show featuring mug shots of people they were cracking down
on. Up popped a picture of Mr. Brown linking him to a criminal group that had
been implicated in a homicide.

>> “I was disturbed,” said Mr. Brown

Sounds like intimidation.

------
e12e
"(...) an experiment taking place in dozens of police departments across the
country, one in which the authorities have turned to complex computer
algorithms to try to pinpoint the people most likely to be involved in future
violent crimes — as either predator or prey."

Interesting (eerie) parallel to the intro in the TV series "Person of
Interest", although supposedly this system doesn't get its data from the NSA
(as with parallel construction etc), but rather the information comes from
more-or-less open data (legal surveillance etc) -- and of course it isn't
vigilantes but police that will, 'victim or perpetrator, if your number's up
(...) find you':

"You are being watched. The government has a secret system: a machine that
spies on you every hour of every day. I know, because I built it. I designed
the machine to detect acts of terror, but it sees everything. Violent crimes
involving ordinary people; people like you. Crimes the government considered
'irrelevant'. They wouldn't act, so I decided I would. But I needed a partner,
someone with the skills to intervene. Hunted by the authorities, we work in
secret. You'll never find us, but victim or perpetrator, if your number's
up... we'll find you".

------
nsns
This illusion of efficacy, with often detrimental results, is nothing new[0];
a fatal naivety that ignores human agency.

[0]
[https://en.wikipedia.org/wiki/Criminal_Tribes_Act](https://en.wikipedia.org/wiki/Criminal_Tribes_Act)

------
microcolonel
I figure this will look good and "tough on crime" to the people seeing the
news, but intimidation and preemptive punishment will not give people the self
confidence they need to make a success of themselves in polite society.

Not to mention, I'm sure you can see how this is a breach of justice.

------
nobody_nowhere
What could possibly go wrong?

------
jrochkind1
PreCrime

------
knd775
There was a Captain America movie about this sort of thing.

------
sobkas
Ah, the famous "the computer said so" ass cover. Because the computer made
that decision, no one can be held responsible for it.

------
mtgx
How long until they just send armed drones automatically after them based on
the software's algorithm? 10-20 years?

~~~
krapp
Isn't this the sort of thing humans are supposed to be bad at but AI should be
good at?

An automated justice system wouldn't be biased by human prejudice, ignorance
or fear. It doesn't get tired, doesn't feel pain or pity or remorse. It would
be able to impartially and accurately process vast amounts of data - far more
than a human, and the actions of the drones could be completely auditable.
Drones won't lie on the stand to protect their fellow drones, or tamper with
evidence.

You could walk down the street surrounded by police drones and be confident
that you're not being profiled based on racist or religious bigotry, but pure
mathematics and statistics. In every conceivable way, an armed drone with a
license to kill is safer, faster, more reliable than a human. One only has to
look at the current justice system in any country to see that humans are
simply not capable of properly judging the motives of, or punishing, other
humans in any reasonable way.

People will simply have to accept that the day will come when human-applied
justice is viewed with the same ridicule and scorn that witch-burning is
today... as the vicious, superstitious barbarism of an ignorant past.

~~~
smtddr
_> >An automated justice system wouldn't be biased by human prejudice,
ignorance or fear_

An automated system is written by people and those prejudices can still sneak
in...

[http://www.nytimes.com/2015/07/10/upshot/when-algorithms-
dis...](http://www.nytimes.com/2015/07/10/upshot/when-algorithms-
discriminate.html?_r=0)

[http://www.salon.com/2013/02/04/online_advertisings_racism_m...](http://www.salon.com/2013/02/04/online_advertisings_racism_mess/)

~~~
yummyfajitas
Those articles show that a completely inhuman intelligence, with no intrinsic
biases of its own, reproduces the conclusions that allegedly come from human
bias. I.e., it turns out that (sometimes) racism and sexism are useful and
predictive heuristics.

Of course, being in the NYTimes and Salon, they need to obfuscate this point
and appeal to standard mood affiliation.

~~~
rmxt
Since you claim that "racism and sexism are useful and predictive heuristics,"
does that mean mainstream society should accept/tolerate/promote such belief
systems? Who is it exactly that you think would find these heuristics useful?

Also, describing the results of any code/program written by a human as a
"completely inhuman intelligence" is a tenuous claim at best.

~~~
smtddr
Just fyi, several people have tried engaging this person in similar
discussions before...
[https://news.ycombinator.com/item?id=8613711](https://news.ycombinator.com/item?id=8613711)

~~~
Baghard
Now is this an example of predictive analytics or of human prejudice? :)

------
alecco
This should be handled by something like social services, not police.
Prevention is not their game.

------
mathgeek
Seems to stand in rather stark contrast to a system designed to rehabilitate
criminals.

------
forgotmysn
Pro tip: it's the police.

------
draugadrotten
The picture illustrating the article seems to reinforce racist stereotypes.

~~~
louhike
It is a picture of a guy they are talking about in the article, so I am not
sure it is constructive to say it reinforces racist stereotypes. It is not
some random stock photo from Getty (or elsewhere) just used to fill space.

------
openfuture
This is incredible. Are people seriously ok with this?

