
Palantir deployed a predictive policing system in New Orleans - dsr12
https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd
======
tyfon
The game Watch Dogs 2 is actually quite on target regarding this. In the game,
the powers that be put a non-scientific bias into the predictive algorithms for
certain neighbourhoods, making the population of those areas pay more for
insurance and be targeted by police a lot more often.

It's a dangerous path to take: suddenly you can be labeled a criminal just
for living in the wrong place, or even worse, the police can put their own
bias into the algorithms.

~~~
galdosdi
History would be repeating itself. This was one of the factors in the burning
down of much of the south Bronx in the 1970s. The city was broke and asked
RAND Corporation to scientifically determine the least necessary firehouses to
close.

Their methods were in retrospect biased and just plain inaccurate, as
firehouses and police precincts in poor neighborhoods were closed just as
crime and fires were spiking, creating some sick sort of positive feedback
loop. There are zip codes in the south Bronx that lost 90%+ of their housing.
It's incredible.

Beware of those who would use the word "science" to justify their preferences
without explaining in detail.

~~~
bilbo0s
I think Warren Buffett put it best:

"Beware of geeks bearing formulas."

------
dmckeon
Is "network analysis" a better term for what this article describes?
Seriously, if taking a deep, hard look at connections among alleged drug
dealers and gang members is bad, what would be better?

~~~
mulmen
The word "alleged" certainly concerns me w.r.t. state surveillance, especially
in the hands of law enforcement.

Connecting dots between convicted criminals is one thing, _predicting_ crime
based on associations with people who are not criminals is a whole other
ballgame.

~~~
derefr
A predictive model isn’t always trying to predict the future. It can also be
used to predict the present—to find hidden connections in (perhaps
maliciously) incomplete data.

In this case, that’s to say: if an unnamed gang member committed a crime, but
all the known members of said gang have alibis... maybe you should be looking
for an _unknown_ member of said gang. And who’s likely to be secretly in a
gang? Well, someone whose friends are all in said gang would be a good first
guess.
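
As a toy sketch of that kind of inference (all names and connections here are invented, and real systems would use far richer features):

```python
# Score how likely each person is to be an "unknown member" of a group,
# based on what fraction of their contacts are known members.
contacts = {
    "alice": {"bob", "carol", "dan"},
    "bob":   {"alice", "carol"},
    "carol": {"alice", "bob", "dan"},
    "dan":   {"alice", "carol", "erin"},
    "erin":  {"dan"},
}
known_members = {"alice", "bob", "carol"}

def membership_score(person):
    """Fraction of a person's contacts who are known members."""
    friends = contacts[person]
    return len(friends & known_members) / len(friends)

suspects = {p: membership_score(p)
            for p in contacts if p not in known_members}
# dan scores 2/3 (two of three contacts are known members); erin scores 0.
```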

~~~
DoreenMichele
OMG.

What about outreach workers? How many of them are going to have connections to
junkies, homeless people and gang members?

Sincere good Christians often believe in trying to help the unfortunate. Some
good Christians spend all their time surrounded by clean living church goers.
Others spend a whole lot of time surrounded by folks who are from the wrong
side of town and the wrong side of the law.

~~~
ggg9990
The point of these analyses isn’t to say “arrest this man he’s a criminal” but
rather “closely observe this man and ascertain whether he’s a gangbanger or an
outreach worker.” Of course it doesn’t even need to get that far since it’s
trivial to tell the two apart from the network analysis alone (the outreach
worker will have a far higher ratio of legitimate-world contacts to criminal-
world contacts than the gangbanger).
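
A minimal sketch of that ratio heuristic (the labels and contact lists are invented for illustration):

```python
# An outreach worker and a gang member may both have many criminal-world
# contacts, but the ratio of legitimate to criminal contacts differs.
def legit_ratio(contact_labels):
    """contact_labels: dict of contact name -> 'legit' or 'criminal'."""
    legit = sum(1 for label in contact_labels.values() if label == "legit")
    crim = sum(1 for label in contact_labels.values() if label == "criminal")
    return legit / max(crim, 1)  # avoid division by zero

outreach_worker = {"pastor": "legit", "nurse": "legit", "teacher": "legit",
                   "client1": "criminal", "client2": "criminal"}
gang_member = {"cousin": "legit", "g1": "criminal", "g2": "criminal",
               "g3": "criminal"}
# outreach worker: ratio 1.5; gang member: ratio ~0.33
```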

~~~
alistairSH
And you don't think law enforcement agencies will quickly make the leap from
"closely observe" to "harass and arrest for any tiny transgression in an
effort to coerce a confession"?

~~~
Chriky
I don't, no, why would they?

---

Downvoters, care to explain?

~~~
Tyrek
Because law enforcement officers generally behave like people, and (like
people) often take cognitive shortcuts when making certain assessments. In
this case, the shortcut would be to assume that the individual _has_ to be a
criminal, and act accordingly.

Now, there are steps you can take to reduce these assumptions, but those are
steps _outside_ of the program, and they would have to be introduced in tandem
with it. Unfortunately, policy-makers often assume that any single idea that
is funded (i.e. buying a predictive model from Palantir) is a comprehensive
solution to the problem (a belief often encouraged by the sales reps peddling
whatever the 'solution' is) and fail to recognize that other programs will
have to be funded alongside it in order to deploy it effectively.

~~~
Chriky
I am law enforcement - the opposite is true. LEOs are looking for any
opportunity to close off their cases with as little effort as possible.

~~~
alistairSH
If the software says Bob probably did it, Johnny Law will assume Bob did it
and stop looking elsewhere, precisely because Johnny Law wants to close the
case. Throw an overzealous DA in the mix, and it sucks to be Bob.

~~~
Chriky
That's just not how it works. Eventually you have to present actual evidence
in an actual court and it won't be "this software you never heard of said so".

------
bilibala
I recommend reading the Math and Murder section of The Rise of Big Data
Policing for a more researched view of how New Orleans is using Palantir
(starts at 2nd half of page 40).

[https://books.google.com/books?id=-IPOAQAACAAJ&lpg=PA7&pg=PA...](https://books.google.com/books?id=-IPOAQAACAAJ&lpg=PA7&pg=PA40#v=onepage&q&f=false)

------
Spooky23
"A man is known by the company he keeps" is a concept that has been around
for millennia, and is no longer valid today.

The underlying objection to all of these types of systems is that you are
doing something "with a computer". Beat cops in community policing models do
the same thing in an analog world. Good detectives know the territory, know
the players, and know where to look for things.

At what point do we want to stop? I don't know the answer.

~~~
gowld
Why is it no longer valid?

~~~
QuercusMax
Yeah - seems like it's no _less_ valid. And quite possibly more valid.

~~~
Jedi72
The definition of "company one keeps" is open to interpretation. How many
Facebook connections away do you think you are from a drug dealer? Take him
away, boys!

~~~
carlmr
Social media also provides the strength of a connection: how often do you
communicate with the drug dealer, at what times, how often is your GPS
position in his house for five minutes at a time?

Probably everyone who isn't friendless on Facebook has a drug dealer in their
immediate network. But using more data you can filter out the actual druggies
with high confidence.
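
As a sketch of what filtering on connection strength might look like (the weights, threshold, and numbers here are all invented, not from any real system):

```python
# Raw adjacency ("knows a dealer") is noisy; edge weights built from
# message counts and co-location time separate casual acquaintances
# from actual customers.
def edge_weight(messages_per_week, colocation_minutes_per_week):
    # Simple weighted sum; a real system would fit these weights to data.
    return 1.0 * messages_per_week + 0.1 * colocation_minutes_per_week

THRESHOLD = 10.0  # arbitrary cutoff for a "strong" tie

casual = edge_weight(messages_per_week=1, colocation_minutes_per_week=0)
customer = edge_weight(messages_per_week=8, colocation_minutes_per_week=40)

strong_ties = [w > THRESHOLD for w in (casual, customer)]
# casual: weight 1.0 (weak tie); customer: weight 12.0 (strong tie)
```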

~~~
arthur_pryor
> But using more data you can filter out the actual druggies with high
> confidence.

i'm not comfortable with our justice system acting on the "high confidence" of
some proprietary heuristic with a closed source implementation and little
scientific evidence to support its claims, and with little ability for
citizens to vet its workings at a detailed level, to see whether it implements
their values or the values enshrined by the constitution.

to say nothing of the fact that i have zero interest in using the criminal
justice system to look for "druggies" (violent drug dealers, sure, because
they're violent; drug abuse is a health problem that the criminal justice
system is ill-equipped to deal with).

~~~
carlmr
> to say nothing of the fact that i have zero interest in using the criminal
> justice system to look for "druggies" (violent drug dealers, sure, because
> they're violent; drug abuse is a health problem that the criminal justice
> system is ill-equipped to deal with).

Me neither; I was using your example (connection to drug dealers). You could
similarly apply this to connections to murderous gangs and make a similar
argument. Saying drugs are OK just distracts from the point.

------
aslewofmice
private data-driven policing will end up reinforcing any institutional racism
under the guise of math

~~~
aaronbrethorst
99% Invisible had a great episode that touched on this last Fall. Well worth
20 minutes of your time: [https://99percentinvisible.org/episode/the-age-of-
the-algori...](https://99percentinvisible.org/episode/the-age-of-the-
algorithm/)

~~~
flother
And Cathy O’Neil's book Weapons of Math Destruction, mentioned in the episode,
is also well worth reading.

[https://www.amazon.co.uk/dp/0141985410](https://www.amazon.co.uk/dp/0141985410)

------
throw7
too bad carville didn't recommend palantir do some precrime analysis on
politicians. but then that would be biting the hand that feeds you. surely
palantir wouldn't stand for that.

~~~
saas_co_de
Yeah, maybe they should use their systems to analyze government contracting.

------
justicezyx
Turns out China still copies from the USA, even when it's for surveillance.

------
ignoramceisblis
Reminds me of one example of the dangers of "predictive policing", when
controlled by one unchecked group:
[http://opentranscripts.org/transcript/tyranny-algorithms-
pre...](http://opentranscripts.org/transcript/tyranny-algorithms-predictive-
policing-israel/)

And [https://www.theguardian.com/world/2014/sep/12/israeli-
intell...](https://www.theguardian.com/world/2014/sep/12/israeli-intelligence-
unit-testimonies) and [https://www.theguardian.com/world/2014/sep/12/israeli-
intell...](https://www.theguardian.com/world/2014/sep/12/israeli-intelligence-
veterans-letter-netanyahu-military-chiefs).

------
cinquemb
I look forward to when systems like these have to adapt to actors using
cointel tools to seed information into them, with the intent of seeing how
decisions get made and then gaming them for their own ends.

------
youdontknowtho
The New Orleans police department is so insanely corrupt that if I related
some of the stories I've heard from people who live there you would not
believe them.

~~~
fokinsean
Care to share one?

~~~
youdontknowtho
A girl I dated had an NOPD cop who stalked her. He invited her to an "evidence
party". She ended up leaving the city.

------
tyu100
Frankly, cities like New Orleans and Baltimore need systems like Palantir,
drones, license plate readers, etc. They are like something out of Hobbes
right now, with no effective state authority and violence levels unseen
elsewhere in the developed world.

Once basic rule of law is established, programs like those can be scaled
back, but a lot of other countries would have declared a state of emergency
if they were confronted with the levels of violence seen in some of these
American big cities. There's already a big push for body-worn cameras for
police; this is just a natural extension.

~~~
Casseres
> programs like those can be scaled back

"can" does not equal "will"

------
mtgx
Microsoft has been doing some of this work for NYPD for some years, too.

------
omarforgotpwd
Nice, they mentioned the company I founded in college, PredPol. They claim
that our algorithm encouraged over policing of minority neighborhoods, but
there's no such thing as bad press right?

~~~
jadedhacker
If police harass minorities, it generates the most data in the places they
harass them. Then they feed this data into the computer and algorithmically
generate "unbiased" results that prove they should be harassing minorities....
that is unless this tendency was explicitly corrected for somehow. Was it? If
it was, that's its own can of worms, but is at least debatable as to whether
there's an acceptable way to do so.

Importantly, these algorithms should not be secret. These are the kinds of
choices a democratic society should make.
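
A tiny simulation of the loop described above (the rates and starting numbers are made up; the point is only the dynamic):

```python
# Two neighborhoods with identical true crime rates, where patrols are
# dispatched in proportion to *recorded* crime and crime is only recorded
# where patrols go. The initial disparity never corrects itself.
true_rate = [0.1, 0.1]   # identical underlying crime rates
recorded = [5.0, 1.0]    # neighborhood A starts out over-policed
for _ in range(50):
    total = sum(recorded)
    patrols = [r / total for r in recorded]       # dispatch ∝ records
    for i in (0, 1):
        # new records ∝ patrol presence × true crime
        recorded[i] += patrols[i] * true_rate[i] * 100
share_A = recorded[0] / sum(recorded)
# share_A stays at 5/6 ≈ 0.83 forever, even though true rates are equal
```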

~~~
omarforgotpwd
Yes, this is something we researched and thought about. The idea is "garbage
in, garbage out". If the data input into the algorithm reflects a systemic
bias, obviously any predictions are going to be a product of the input to some
extent. Our predictions were based on the city's crime report data, which is
clearly not an accurate picture of crime in the city, but it's what we've got.
Sure, it's entirely possible that if police are dramatically over-policing
certain poor or minority neighborhoods, and generating a lot of crime report
data based on those arrests, the predictions could be affected. But poor and
minority neighborhoods tend to under-report crimes
compared to wealthier areas, so there is a balancing counter-effect to some
extent. Another thing we've seen happen sometimes is systematic under-
reporting or non-reporting of crimes to make the crime stats for the city look
better, as the police department is often judged on these crime stats.

Ultimately we have to work with what we have and the police are responsible
for maintaining high quality data and ethical standards, but I don't lose any
sleep worrying about if my code systematically oppressed minorities for two
reasons:

1\. First of all, most police departments had separate predictions run per
"district" and typically had a fixed number of officers allocated to each
district. The idea that we were directing cops away from rich neighborhoods
and having them go after the poor or minorities assumes that we had far more
power than we actually did. The number of cops assigned to each city district
/ beat is a political or bureaucratic decision and not something our software
is involved in.

2\. Furthermore, our app was not directing the cops' every move, nor was it
trying to. We just told them "when you're not answering a call or doing
anything else, when you have a spare minute, check if you're near a prediction
box and go figure out why the computer flagged it as a high-risk area. Maybe
it's a parking lot and the lights have gone out, encouraging car burglaries".
And even as just a small slice of their time, we really had to push to get
them to use the app, because the way they saw it was "What does this stupid
app know about my city that I, with 20 years of experience, don't know? One of
these prediction boxes is in the middle of a lake".

Based on what I saw from the analytics and usage logs, I think it would be
tough to make a case that anyone was oppressed, because the cops barely even
used the thing! Not that I take these issues lightly, but I always laugh when
people criticize PredPol or someone calls me a murderer on Twitter, because it
vastly overstates our impact. Maybe I should take it as a compliment that
people really think I am powerful enough to direct police to oppress
minorities using only a shitty rails app I wrote in my college dorm room. But
whatever, you can't do or try anything new in this country today without
getting dramatically criticized for it to fuel the internet-rage clickbait
economy, so I guess it's just something all of us have to get used to.

I haven't had a day-to-day role at the company for a few years, which makes it
a lot easier not to worry about these things as well. But ultimately, police
using technology and data effectively can REDUCE bias and injustice AND make
people safer. I think the questions the Verge presents are certainly worth
asking, but based on my experience I do believe that the app I implemented
helped more people than it harmed.

~~~
majos
But you _don't_ have to work with the data you have. Nobody _has_ to create a
software start-up they don't believe in.

If I'm reading your two points correctly, you don't worry about whether or not
you made a bad tool because it wasn't ever clear how much it was used?

~~~
jrs95
I didn't read it that way. To me, it seems like he's saying that the way the
tool is used pretty much eliminates its downsides. It's not like this was
being used to conduct no-knock SWAT raids or something.

