
New Jersey’s experiment to reduce the number of people in jail awaiting trial - sohkamyung
https://www.economist.com/news/united-states/21731631-new-jersey-has-bold-experiment-reduce-number-people-jail-awaiting
======
dzdt
This isn't an "algorithm" the way people here are thinking. It is a risk
formula. The thing it does is standardize which risk factors have been shown
to be important. The inputs are:

    
    
      Age at current arrest 
      Current violent offense  
      Current violent offense & 20 years old or younger 
      Pending charge at the time of the offense 
      Prior misdemeanor conviction 
      Prior felony conviction 
      Prior conviction (misdemeanor or felony) 
      Prior violent conviction 
      Prior failure to appear in the past two years 
      Prior failure to appear older than two years 
      Prior sentence to incarceration
    

Each input is given a simple 0-2 integer importance weighting. Scores are
summed, then rescaled by a translation table.

Outputs are a 1-6 "fail to appear" risk score, a 1-6 "new criminal activity"
risk score, and a yes/no "new violent criminal activity" risk flag.

What to do with the scores (and any other factors to consider) is still left
up to the judge. The advantage is in translating "does this defendant have a
record?" into a numerical risk score in an evidence-based way.
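
The summing-and-rescaling step is simple enough to sketch in code. To be clear, the factor weights and cut-offs below are made up for illustration; the real values (and the full factor list) are in the Arnold Foundation PDF cited in [1].

```python
# Illustrative sketch of the PSA-style scoring described above.
# NOTE: these weights and the raw-score -> 1-6 translation table
# are hypothetical; the real ones are in the linked PDF.

# A subset of the failure-to-appear factors, each weighted 0-2.
FTA_WEIGHTS = {
    "pending_charge": 1,
    "prior_conviction": 1,
    "prior_fta_past_two_years": 2,
    "prior_fta_older_than_two_years": 1,
}

# Translation table: (minimum raw score, scaled 1-6 value).
FTA_SCALE = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]

def fta_score(factors):
    """Sum the weights of the factors present, then rescale to 1-6."""
    raw = sum(weight for name, weight in FTA_WEIGHTS.items()
              if factors.get(name))
    score = 1
    for minimum, scaled in FTA_SCALE:
        if raw >= minimum:
            score = scaled
    return score

print(fta_score({"pending_charge": True,
                 "prior_fta_past_two_years": True}))  # raw 3 -> scaled 4
```

The other 1-6 scale and the yes/no violence flag follow the same pattern with their own weights, per the formula document.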

[1] http://www.arnoldfoundation.org/wp-content/uploads/PSA-Risk-Factors-and-Formula.pdf

~~~
adekok
All of which are strongly correlated with factors open to bias.

http://www.cnn.com/2017/03/07/politics/blacks-wrongful-convictions-study/index.html

https://www.law.umich.edu/newsandinfo/features/Pages/starr_gender_disparities.aspx

While people complain about the racial bias in the courtroom, the _sex_ bias
is substantially larger.

Heck, the Supreme Court of Canada recently gave a woman a _complete pass_ for
trying to have her husband killed. Because after she was charged with
attempted murder, she _conveniently remembered_ that she was the victim of
domestic violence.

Despite her claims being provably false.

For example, claimed incidents occurred when they lived hundreds of miles apart.
There's no record of her ever calling the police, despite her claims of
multiple police visits, etc.

[https://www.youtube.com/watch?v=yq2WWsY8Rmc](https://www.youtube.com/watch?v=yq2WWsY8Rmc)

As was noted in other comments here, such a "data driven" approach just
continues existing prejudices.

~~~
skybrian
That may be so, but none of these links are about bias in setting bail.

~~~
adekok
The comment I responded to discussed how bail was set based on _prior_ events.
My comment showed how those prior events were substantially biased.

Therefore, by a simple and clear chain of logic, setting bail based on prior
events is _also_ biased.

An alternate answer would be: do you really think that there is bias
everywhere ELSE in the system, but not in bail?

Of course not.

And, if you look, you find papers like this:

[https://link.springer.com/article/10.1007/BF02885913](https://link.springer.com/article/10.1007/BF02885913)

 _We found that judges take gender, but not race, into account in determining
the amount of bail for certain types of cases; more specifically, Black
females faced lower bail than Black males in less serious cases. In contrast,
we found that both race and gender affected the likelihood of pretrial
release. White defendants were more likely than black defendants to be
released pending trial and females were more likely than males to be released
prior to trial. In fact, white females, white males, and black females all
were more likely than black males to be released._

~~~
skybrian
Okay, I think I see what you mean now. It's a bit hard to see since you're so
focused on gender bias.

I think you're right that these factors, while answering fairly objective
questions, reinforce bias due to things like prior convictions. Once someone
starts down this path, they get treated worse by the system based on history.
Even though they did their time, they aren't starting fresh.

Still, I think it's an improvement (to a very flawed system) because it's not
adding _new_ bias. It also doesn't seem practical to reexamine previous
convictions to see if they were fair when setting bail.

~~~
adekok
> Okay, I think I see what you mean now. It's a bit hard to see

I must admit to not understanding how it's difficult to see the correlation.
If bail is set on factors X, Y, and Z, AND those factors are shown to be
biased, then by definition, bail is also biased.

> since you're so focused on gender bias.

That's just a weird statement to make. The research shows bias and I quoted
the research... how does that make me "so focused" on gender bias?

> it's not adding new bias.

That is a good point, but continuing _existing_ bias is a serious problem.

~~~
skybrian
The logic behind this is that any bias resulting in unjust convictions will
later also cause bias when setting bail. It seems like making sure unjust
convictions don't happen is probably the more important of the two? And fixing
unjust convictions would also fix the issue with setting bail.

------
asadjb
It's better to move towards a more open and transparent system that's data
driven. However, it isn't clear whether the system used here will be
transparent, only that it can be. The builders of these systems can choose not
to share the algorithms behind them.

A podcast I listened to a day or two ago [1], You Are Not So Smart, discussed
something related to this that I feel is important to point out here as well.

It's about how we transfer our biases into the algorithms and ML solutions
that we build. Given the move towards using algorithms for making decisions
like this, it's something we should definitely consider.

If you have the time, definitely listen to this episode. It's an amazing
podcast, and this one episode really hit home how software can affect
people's lives – and not always for the better – and how we as software
engineers should be more aware of how the solutions we build will be used
down the line.

tl;dr of the episode would be:

ML solutions, and algorithms designed by humans, are designed by looking at
historic data. Historically, a number of races have not been treated fairly
when it comes to the justice system, e.g. black people being treated harshly
for the same crime as non-black people.

When our ML solutions are built on historical data, they learn those biases as
well, which means that the "racism" is built into the algorithm too. Of course
the algorithm doesn't have any concept of racism; it's just another feature it
uses to compute its decision. But it is something we as designers of
algorithms should keep in mind.

[1] https://youarenotsosmart.com/2017/11/20/yanss-115-how-we-transferred-our-biases-into-our-machines-and-what-we-can-do-about-it/#more-5723

~~~
opportune
Another problem in the same vein is that when ML algorithms make use of
Bayesian inference they can bake in correlations (e.g. between race and
credit score) that we would normally purposefully avoid using as a factor,
because while it enhances predictive power, it again codifies our existing
biases, prejudices, and injustices. For example, if you were deploying an ML
model to determine whether someone deserved a loan, features such as ZIP code
or race could encode discrimination into the model.

~~~
maxerickson
Including race as a parameter should reduce the impact of those correlations
on the output of the model (by allowing the model to measure and control for
the bias that exists in the input data).

Incautiously using race just because it reflects those existing biases would
be a problem. This is what lots of humans do, overestimating the information
provided by their own inferences of race. Like internet assholes who blather
about how it is rational to be afraid of black men because of their higher
rates of assault. Never mind that the absolute rate is still so low that there
is ~0 predictive power from the race of a given individual.
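
A toy model can show what "measure and control for" means here. This is a purely synthetic, NumPy-only sketch (all names and numbers are made up): an outcome depends on a feature x with true effect 2.0 plus a group effect of 3.0, and x is correlated with group membership. Omitting the group variable lets x absorb part of the group effect; including it recovers the true coefficient and isolates the group effect where it can be inspected.

```python
# Toy demonstration (synthetic data only) of measuring and controlling
# for a correlated group variable, rather than omitting it.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

g = rng.integers(0, 2, n).astype(float)      # group membership
x = g + rng.normal(0, 1, n)                  # feature correlated with group
y = 2.0 * x + 3.0 * g + rng.normal(0, 1, n)  # true effect of x is 2.0

# Omit the group variable: x absorbs part of g's effect, so its
# estimated coefficient is biased upward (about 2.6 in this setup).
X_naive = np.column_stack([x, np.ones(n)])
coef_naive, _ = np.linalg.lstsq(X_naive, y, rcond=None)[0]

# Include the group variable: its effect is measured separately, and
# the coefficient on x comes back close to the true 2.0.
X_full = np.column_stack([x, g, np.ones(n)])
coef_x, coef_g, _ = np.linalg.lstsq(X_full, y, rcond=None)[0]
```

Once the group effect sits in its own coefficient, it can be examined, or zeroed out at prediction time, which is one way to be group-aware while staying neutral to the group.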

~~~
azernik
That's actually a very interesting idea for a technique. But not one I've ever
heard of actually being used.

~~~
maxerickson
It's not a technique, it's a natural outcome of statistical modeling.

It doesn't get used because people are innumerate and reactionary.

~~~
azernik
The technique I'm referring to is including race as a factor, and then
instructing your model to _disregard_ that factor. Not being race-blind, but
being race-aware while attempting to be neutral to race.

~~~
opportune
Yeah, I think this is an interesting idea, but I don't actually understand how
the great-grandparent comment believes that this could be done. I don't think
it works in a strict Bayesian sense. You would have to go out of your way to
instruct your model to operate correctively.

------
avs733
Some of the most interesting and thorough public work on just how problematic
these algorithms are was done by ProPublica[0]. They have a series of articles
specifically focused on how machine learning/algorithms can either enable or
create racist outcomes[1] that are all worth reading.

[0] https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

[1] https://www.propublica.org/article/breaking-the-black-box-how-machines-learn-to-be-racist

~~~
skybrian
These are interesting links, but off topic because there's no machine learning
involved.

------
orev
The most encouraging thing about this approach is the change to a data-driven
systematic approach from the previous “whatever the judge thinks by looking at
you” approach. It really doesn’t matter if the algorithm is completely
correct, as once you make the conceptual change to using such a system, you
can tune the parameters to fix any problems that arise.

------
yorwba
Unfortunately, no detail on the actual algorithm in question. Is it executed
manually or is it using a statistical model optimized for some tradeoff of
metrics?

~~~
IncRnd
Those aren't the exclusive possibilities. The algorithm for the six-point
scale is detailed here:
http://www.arnoldfoundation.org/wp-content/uploads/PSA-Risk-Factors-and-Formula.pdf

Apparently, it's used by almost 40 different jurisdictions.

~~~
Sniffnoy
So how is this actually used, then? There are two six-point scales and a
flag. How are these integrated into a detain/don't-detain decision?

~~~
IncRnd
I can't answer that from my personal knowledge, but I did find the information
on a different page of the same website:
http://www.arnoldfoundation.org/initiative/criminal-justice/crime-prevention/public-safety-assessment/

In that spot there are some videos that I think will answer your question more
thoroughly than I can.

------
mholmes680
Some anecdotal outcomes [0], [1], [2] of the system below. In general, people
were outraged for a few months and local news seemed to mention it in most
drug-bust stories, but that seems to have died off. My law enforcement friends
were beside themselves at first, but I think both sides are adjusting to the
system.

I think in general I agree with it. Without context of witnessing the
[alleged] crime and without the algorithm in front of me, sometimes I wonder
if certain situations have exposed some holes in the algorithm. In addition to
the three cited below, I remember reading about a guy in Ocean County being
released after they found him with $1M of drugs... maybe his supplier took
care of that problem for the community...

[0] - Burglary: http://pix11.com/2017/02/16/accused-serial-burglar-arrested-and-released-3-times-in-1-week/

[1] - 1st deg murder: http://www.gloucestercitynews.net/clearysnotebook/2017/07/lindenwold-woman-murdered-suspect-arrested.html

[2] - possession and resisting arrest, but look at the 2016/2017 rap sheet: http://www.usbailreform.com/camden-nj-ex-con-shoots-cop-point-blank-released-5-days-earlier-free-go-thanks-nj-bail-reform/

------
tcj_phx
I got arrested 26 months ago. I'd gone back to the hospital with a court order
that said the hospital's behavior was not in compliance with the law, and that
they had to let their patient (my friend) go.

I made the mistake of going without a police escort. The ER staff didn't know
that their employer actually had no legal authority to hold their patient in
their psychiatric ward. I called the police and was patiently waiting for
their help to enforce the court's order, but before the police arrived I
irritated one of the security guards and they all attacked me. I was charged
with 'misdemeanor assault' and was released on my own recognizance after a
night at the city jail.

Two months later I got the police report, and learned what I'd allegedly done.
Nine months later I learned the "old man" security guard who freaked out on me
was a retired police officer. He knew exactly what to say to the responding
officer to cover his ass, and was well-practiced at testifying on the stand.
It's easy to make shit up if you know what you're doing. His co-conspirator
was kind of a doofus - he'd been a hospital security guard for 20+ years.

Two months after my trial, my girlfriend got arrested - her court-ordered
mental health professionals didn't think her substance use was of any concern
to them, and were heroically treating her presenting symptoms without any
concern as to their cause. While I was arrested for assault
(misdemeanor/victim), she was arrested for heroin possession
(felony/victimless). Her bail was $5000. The various bail bondsmen I called
weren't interested, or wanted more collateral than I had - apparently people
arrested for drug charges have a tendency to not show up for their court
dates. She spent 2 weeks in jail while her attorney negotiated with the
justice system on her behalf.

(The system she was arrested into has since made some efforts to keep people
out of the jails, or to reduce bails, but I don't know how these changes
would've affected her experience...)

In a separate anecdote, my ex-wife recently picked a fight with her then-
boyfriend (California). He responded to her provocation by escalating the
fight into a very physical confrontation. Bystanders called the police, he got
arrested... Bond was set for $50,000. He bailed himself out by charging $5,000
to his credit card with a bail bondsman. The charges were dropped a few days
later. The bondsman apparently got to keep all $5,000 (edit: I think the term
'racket' applies).

I don't know that an algorithmic approach to justice will be much help at this
stage, because the system is so corrupt. Any algorithm will have to start with
"Is there an actual victim [0]? No: release without bond. Yes: [...]". I don't
think the system will take well to considering that they've spent decades
making work for themselves...

[0] I learned about 'strict liability' from a HN comment - thanks, whoever
that was.
[https://en.wikipedia.org/wiki/Strict_liability](https://en.wikipedia.org/wiki/Strict_liability)

~~~
Simon_says
You lead an interesting life. Your comment reminded me how curious it is that
some people tend to have regular (yearly-ish?) clashes with police and the
criminal justice system, and others go their entire life without anything more
than a "Good morning" with a cop.

~~~
mschuster91
Well. There are factors that increase your risk of dealing with cops: 1)
being non-white (or not the same skin color/ethnic background as the majority
population); 2) being in any way "non-conforming to the mainstream", e.g.
wearing dreads, "gangsta clothing", looking like a punk, or basically anything
that makes you stand out.

Cops always go for those who look different from the mass when they're bored.

~~~
dsfyu404ed
The longer your history, the more the cops will treat you like a criminal
during even the most trivial interactions. And due to the large amount of
discretion they have, the more scrutiny you get, the more likely you are to
wind up with a lengthier criminal history.

Old men who have an (irrelevant) DUI from the '70s on their history get asked
if they've been drinking when they roll a 4-way stop on their way to church on
Sunday morning.

------
tomohawk
Once they reduce the jail population, what's next? Likely, cuts in the number
of jail cells.

Seems like a good thing, but will this make it less likely for police to
arrest people in the first place, potentially making the streets less safe?

This article talks about the effect in the UK:

http://www.telegraph.co.uk/news/2017/11/24/police-reluctant-make-arrests-due-sharp-fall-number-custody/

~~~
GVIrish
If a series of stupid policy decisions constricts jail cell supply so much
that even dangerous suspects are not detained, then yeah.

But there is a lot of social benefit to not having people sit in jail awaiting
trial if they don't need to. People on limited income can't afford to miss
work or not be around to take care of their kids or other loved ones. People
who turn out to be innocent still end up experiencing difficult life
consequences as a result of unnecessary jail time.

I think the public interest is served by reducing the number of people in jail
in a more fair and transparent manner. It will take sensible policy-making to
make sure that it doesn't result in unintended consequences (as some others
have pointed out) down the road, but it's worth starting down the road in the
first place.

