
California just replaced cash bail with algorithms - antr
https://qz.com/1375820/california-just-replaced-cash-bail-with-algorithms/
======
bjt2n3904
There's this growing trend of... "Well look, this person in a lab coat with a
clipboard did some complicated things with a computer, using all the trendy
words like 'algorithms', 'machine learning', and 'big data'! Their results
must be very accurate and unbiased, and much better than a human looking at
things."

I'm more and more convinced that most of these cases are just stuffing an
Excel spreadsheet into a black box and shaking it. Hearing that we're using
"algorithms" in criminal justice doesn't give me a warm and fuzzy feeling.

~~~
IanDrake
>Their results must be very accurate and unbiased

That is the assumption that scares me. The algorithm _encodes_ the bias and
applies it in every case.

This _feature_ will only save you from a _shifting_ bias.

~~~
rectang
Up until now we were saddled with the extremely biased and unjust bond system.
Calls to make the algorithms transparent are well taken -- but let's not go
back.

~~~
cgriswald
Your thinking is very dangerous. Even if the algorithm is transparent and
fair, that does not necessarily equate to a step forward because of the bias
already present in the system. The algorithm will be[0] _applied_ equally,
sure, but that does not mean outcomes will be equal.

For instance, consider that owning a major asset probably decreases flight
risk. That seems fair. But, this means that poor people are probably more of a
flight risk than a middle- or upper-class homeowner. So if the algorithm
considers this, poor people would be disproportionately affected.

Also consider that having family abroad or being from another country probably
increases flight risk. Hence, immigrants would be disproportionately affected.
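The two factors described above can be sketched as a toy score. The factors and weights here are hypothetical, not taken from any deployed tool; the point is only that a rule applied identically to everyone can still produce unequal outcomes by class and origin:

```python
# Toy illustration (hypothetical factors and weights, not any real tool):
# a "neutral" flight-risk score that correlates with wealth and immigration
# status, so equal application still yields unequal outcomes.

def flight_risk_score(owns_home: bool, has_family_abroad: bool) -> int:
    score = 50
    if owns_home:
        score -= 20   # asset ownership lowers assessed risk
    if has_family_abroad:
        score += 20   # ties abroad raise assessed risk
    return score

# Two defendants facing identical charges:
print(flight_risk_score(owns_home=True, has_family_abroad=False))   # 30
print(flight_risk_score(owns_home=False, has_family_abroad=True))   # 70
```

The rule never mentions income or nationality, yet the renter with relatives overseas scores far worse than the homeowner.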

But now, when there is an unjust outcome, there isn't even an _option_ to
avoid jail time (even if that option is difficult to reach or undesirable);
and the human responsible for the decision to incarcerate can simply wash
their hands of it, since the algorithm gave the suspect a grade of "D". No
need for empathy, sympathy, or guilt.

Then there are the pending discussions about what is 'fair' and the answer to
those questions is politics...

[0] - Edit: _Could_ be

~~~
rectang
Those critiques are all indeed things to watch out for, but progress is
possible -- see New Jersey.

The status quo is a roaring injustice and ongoing disaster, and by insisting
on the perfect over the good you are condemning millions of people to continue
suffering indefinitely. How is your thinking not "very dangerous" to all those
wronged day after day by the current system?

~~~
cgriswald
You've completely misunderstood my point if you think I'm advocating for the
perfect at the cost of the good. I'm advocating caution rather than a blind
faith that _this_ solution is _ipso facto_ forward progress, that just needs
some tweaking.

The danger here is doubling down on the existing injustices, removing the
limited accountability and positive exceptions that exist, and effectively
sweeping conversation under the rug or arguing ideologies about the fairness
of tweaks to the algorithms at the cost of real human lives.

~~~
rectang
You're making a distinction without a difference -- whether it's because of
"caution" or whatever, no progress will be made if you have your way.

According to you the solution lies in "politics". Yet, this law is the product
of California's government, a political body -- is that not legitimate
"politics"?

So what _would_ make for a "political" solution that would qualify for your
stamp of approval? Chasing a consensus with the "tough on crime" crowd who
benefit from felony disenfranchisement disproportionately suppressing minority
votes? No such consensus will ever happen.

~~~
cgriswald
Again, you either misunderstand or misrepresent my points.

> According to you the solution lies in "politics".

My post was _critical_ of the role of politics in determining the fairness of
the algorithm. It removes the human lives factor and provides a platform for
politicians and ideologues ("tough on crime" or otherwise) to either do
nothing, argue about the ideology of fairness, or ignore ground truth about
outcomes and pre-existing bias in the system.

------
will_brown
A little over a year ago I was the victim of some crimes including:

-kidnapping with a deadly weapon

-assault with a deadly weapon/firearm

-carjacking with a deadly weapon

The defendant has bonded out twice in the case. The first time there was only
a charge of grand theft of my car, and bond was revoked after the
investigation and the other more serious charges were filed. Thereafter, the
defendant bonded out again after all the charges were filed, and posted
$300,000+ bond. As the victim (and a lawyer) I knew the defendant would be
granted bond (although I believed the defendant was both a flight risk and
danger to the community and bond should have been denied), so I requested gps
tracking. Less than a week later the defendant was rearrested for violating
the terms of bond...all based on the gps tracking.

I’m not sure I trust an algorithm to determine if someone is a flight risk or
danger to the community. Historically, the cash bond was the carrot and
stick for making sure a released defendant showed up to court. I think
technology, like the gps tracker, can probably replace cash bond in almost all
cases.

~~~
jessaustin
Was the defendant rearrested by police or by bail agents?

~~~
will_brown
I think on the first violation/revocation of bond the defendant was arrested
by police (but the defendant was in another county at the time, so they had to
extradite back to the county). The 2nd violation/revocation of bond, I think
the warrant was executed by police (but in the same county as the case).

------
andrewaylett
Sounds like the sort of thing GDPR's regulation of automated decision-making
is intended to deal with: [https://ico.org.uk/for-organisations/guide-to-the-general-da...](https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/rights-related-to-automated-decision-making-including-profiling/)

Specifically, even if Article 22 allows you to perform automated
decision-making (and decisions on bail that are mandated by law, as in the
case at hand, would be allowed), you must still:

* provide meaningful information about the logic involved in the decision-making process, as well as the significance and the envisaged consequences for the individual;

* use appropriate mathematical or statistical procedures;

and ensure that individuals can:

* obtain human intervention;

* express their point of view; and

* obtain an explanation of the decision and challenge it;

~~~
chatmasta
GDPR does not apply to sovereign nations and states. The European Commission
is not the boss of the world.

That said, it may be a useful framework to adopt for uses such as regulating
pre-trial risk assessment algorithms.

~~~
DerpyBaby123
Not OP, but they never claimed that what California is doing is unlawful
because of GDPR. Their comment is still insightful because it shows an example
of legislation that, if applied, could ensure more transparency and humanity
to automated decision making.

The lack of such legislation becomes more glaring as more and more examples
are demonstrated.

------
stanleydrew
I think there are valid reasons to not trust an algorithm to determine whether
someone is a flight risk or danger to the community.

But there was already an algorithm that came up with the cash bail amount for
a defendant, which was purportedly based on the defendant's flight risk and
likelihood of endangering the community.

I don't pretend to be an expert so I may be wrong but I think the algorithm
boiled down to something like "let the judge decide".
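In rough pseudocode, that old "algorithm" amounts to a lookup table plus judicial discretion. A minimal sketch, with entirely hypothetical charges and dollar amounts (not any real county bail schedule):

```python
# Toy sketch of the pre-reform "algorithm": a county bail schedule maps the
# charge to a baseline dollar amount, which the judge may then scale at will.
# Charges and amounts are hypothetical, not any real schedule.

BAIL_SCHEDULE = {
    "petty_theft": 1_000,
    "grand_theft_auto": 25_000,
    "armed_robbery": 100_000,
}

def bail_amount(charge: str, judge_adjustment: float = 1.0) -> int:
    # The schedule sets a baseline; judicial discretion scales it up or down.
    baseline = BAIL_SCHEDULE.get(charge, 10_000)
    return int(baseline * judge_adjustment)

print(bail_amount("grand_theft_auto"))        # 25000
print(bail_amount("grand_theft_auto", 2.0))   # 50000
```

Notice that nothing in the lookup measures flight risk directly; the "risk assessment" lives entirely in the `judge_adjustment` factor, i.e. "let the judge decide."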

I think transparency around the algorithm is incredibly important, so anything
that explicitly makes the properties of the algorithm part of the debate I
think is good.

------
SargeZT
This worked _really_ well in New Jersey. However, California's implementation
of county-by-county standards as opposed to a statewide system could still
leave a lot of room for error.

~~~
spuz
There is a good Planet Money episode about New Jersey's overhaul of the bail
system:
[https://www.npr.org/sections/money/2018/08/29/643072388/epis...](https://www.npr.org/sections/money/2018/08/29/643072388/episode-783-new-jersey-bails-out)

------
panic
Several of this bill's original co-sponsors ended up opposing it due to the
lack of accountability in the risk assessment system and the fact that it
gives police even more power over the justice process:
[https://siliconvalleydebug.org/stories/silicon-valley-de-bug...](https://siliconvalleydebug.org/stories/silicon-valley-de-bug-s-letter-of-opposition-to-california-s-false-bail-reform-bill-sb10)

------
basch
I post this too often, but it's very applicable here.

[http://www.bbc.co.uk/blogs/adamcurtis/entries/78691781-c9b7-...](http://www.bbc.co.uk/blogs/adamcurtis/entries/78691781-c9b7-30a0-9a0a-3ff76e8bfe58)

------
gpm
While there are some valid concerns about using algorithms I'm quite happy
that they got rid of cash bail. Making presumed innocent people pay to get out
of jail has never made any sense to me.

For serious crimes, the number of people who say "well, I would run to avoid
spending most of the rest of my life in prison, but I won't because then I (or
whoever put the money up for me) won't get my bail back" has to be tiny.

For non-serious crimes, the number of people motivated by bail rather than by
avoiding spending the rest of their life as a fugitive has to be tiny.

~~~
posterboy
Isn't the rationale that somebody needs to vouch for the release and prove
their trust by risking the loss of the money?

------
Deckard256
Our courts here in New Mexico got bamboozled by the use of the "Arnold
Tool," which is basically the same thing. Anyway, yada yada yada, we're now
worst in the nation for auto theft because of it, and we have no recourse
against it unless we want to change our state constitution, because our
Supreme Court mandated it after it was sold as being more fair. All it did was
put repeat offenders back on the street the next day. My car has been broken
into three times since the change. Garbage in, garbage out.

~~~
sambull
Your anecdote about petty theft is so valid

~~~
magduf
Auto theft is _by definition_ not "petty theft".

~~~
sambull
Except they didn't steal his car, they just broke into it.

------
pseudolus
This points to one of the dilemmas that societies will be hard pressed to
confront. Are we capable of accepting decisions underpinned by algorithms even
when the outcome is negative in a few cases? Inevitably an individual released
as a result of one of these algorithms will commit a criminal act and the
argument will arise that had an individual been responsible for making the
evaluation and that had cash bail existed then such crime would not have
occurred. I think people will have significant difficulty in accepting even
the slightest of errors with algorithms (particularly with respect to violent
crime) even if the overall result is lower than if humans had been the sole
deciders. In many ways it's analogous to the ongoing debates about so-called
driverless cars.

------
kbos87
So we are moving toward a world where one person gets out, while the other has
to languish in prison for months or years awaiting trial because an algorithm
said so. And the accused aren’t allowed to know what’s happening behind the
scenes of the algorithm because we need to protect the competitive advantage
of the vendor who sold the software to the state. This is total and complete
insanity and elected officials should pay with their jobs and their
reputations for even suggesting something so unjust.

~~~
rectang
Where is your outrage about how poor people are wronged by the current system?

~~~
kbos87
I am absolutely outraged about the way the current system treats and traps the
poor. But we need to be clear that this isn’t a fix; it’s a further
obfuscation of the problem. I have little doubt that said algorithms will
continue to wrong the poor at similar or greater rates - the only difference
is that it will be significantly harder to detect.

------
DevX101
As governments start to implement more algorithms in public policy decisions,
it becomes increasingly important that ALL algorithms be open sourced for
public review unless there is a national security risk.

Having a summary of how the algorithm works is necessary but not sufficient.

------
CoryG89
Shouldn't the "algorithm" need to be published? Surely someone being held
prisoner by a computer program should at least have the right to examine and
potentially challenge the source code and data being used.

~~~
JoeAltmaier
Actually, they're being held because they were arrested, right?

------
anfilt
One step forward, and well who knows how many back.

------
mrcactu5
Machine learning algorithms are severely racist towards defendants. We know,
for example, that AI algorithms mistake Black people for primates... but they
are no more racist than the criminal justice system at large. And they have a
better chance of getting fixed.

[https://www.nytimes.com/2016/06/26/opinion/sunday/artificial...](https://www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html)

~~~
Anderkent
Bond is severely racist too, if you take those criteria. Black people get
assigned higher bond, and often have a harder time getting it together. As
such there are many people in jail that are in no way a risk but are held
because they can't afford bond.

~~~
rectang
Our awareness that machine learning algorithms have the potential to make
racist decisions gives us something to watch out for as these systems roll
out. Hopefully addressing any flaws will be a matter of ordinary debugging.

Regardless, the cash bail system is terrible and I'm anticipating big
improvements as we move away from it.

