
Algorithmic Justice Could Clear 250,000 Convictions in California - Artificial Lawyer
https://www.artificiallawyer.com/2019/02/26/algorithmic-justice-could-clear-250000-convictions-in-california/
======
jrochkind1
> Why is this a big deal? It matters because algorithmic justice, of a complex
> or simple variety, has come in for a lot of flack in the US, in part because
> of one truly major mess that was caused by the COMPAS system. So, it’s great
> to see algorithmic justice doing good.

These don't seem like the same things at all to me, and lumping them both
under "algorithmic justice" with an agenda to make the one look better
doesn't change that. The thing in this story involves zero machine learning,
zero statistics, zero attempts at statistical prediction of the likelihood
of anything, not even any heuristics. Those are the things that have come in
for a lot of flak. Looking things up in databases and using them to
automatically fill out forms has not, in fact, come in for a "lot of flak."
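What the story describes can be sketched in a few lines: a database lookup
plus a form fill, with no prediction anywhere. The schema, names, and form
wording below are entirely hypothetical, invented for illustration:

```python
import sqlite3
from string import Template

# Hypothetical schema: one row per conviction.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE convictions (name TEXT, offense TEXT, year INTEGER)")
db.execute("INSERT INTO convictions VALUES ('J. Doe', 'marijuana possession', 2009)")

# A plain lookup -- no scoring, no statistics, no heuristics.
rows = db.execute(
    "SELECT name, offense, year FROM convictions WHERE offense = ?",
    ("marijuana possession",),
).fetchall()

# Fill a petition form for each matching record.
FORM = Template("Petition to dismiss: $name, convicted of $offense in $year.")
petitions = [FORM.substitute(name=n, offense=o, year=y) for n, o, y in rows]
print(petitions[0])
```

The whole pipeline is deterministic: the same records always produce the
same petitions, which is exactly why it is unlike a statistical risk model.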

~~~
lallysingh
The public doesn't care about those distinctions. So it's useful to cancel out
a negative gut-reaction to encourage a more nuanced view.

~~~
jrochkind1
You think the public is going to be opposed to _this_ sort of thing, because
of what they've heard about the _actual_ statistical algorithmic things?

I seriously doubt it. I think the author was trying to do exactly the reverse,
get people to _support_ the statistical algorithmic things because of this
thing, which seems obviously a good idea.

It seems like saying (exaggerated, of course) "Yeah, I know you don't like
autonomous military robots killing people, but you like YOUR TOASTER, right?
It also runs on electric power, and turns itself off unattended when your
toast is done. Maybe there's a role for Autonomous Killing Robots after all;
they might even be able to toast your bread with their lasers!"

Most of the public has never even heard the term "algorithmic justice," of
course. You're saying "the public doesn't care about those distinctions,"
but it's the OP author _suggesting_ we include software providing _decision-
making_ based on statistical algorithms, along with simple rule-based
automation, under "algorithmic justice," _rather_ than making a distinction.
The OP author is encouraging us not to make distinctions. I think the public
is smart enough to distinguish between software that is _making decisions_
and software that isn't, and it's the OP that is asking that this
distinction not be made. As the public is indeed just starting to learn how
to think about and categorize these things, I think it's irresponsible to
try to educate them _not_ to make distinctions, and I think it's being done
to serve an agenda.

~~~
dak1
Yes, they really will. If you oppose this initiative, you start running ads:

"Your legislator wants to let a computer overrule judges and juries and put
criminals back on the streets. Call your legislator now to tell them you
oppose HB-555."

There's enough truth there that explaining it away takes far too much
effort, and in politics, if you're explaining, you're losing. If you don't
think this is how this kind of thing works, you haven't been paying
attention.

------
anton_tarasenko
"Algorithmic justice" reminded me of a study where researchers predicted the
risk of a crime better than judges:[1]

> Millions of times each year, judges must decide where defendants will await
> trial—at home or in jail. By law, this decision hinges on the judge’s
> prediction of what the defendant would do if released. This is a promising
> machine learning application because it is a concrete prediction task for
> which there is a large volume of data available. Yet comparing the algorithm
> to the judge proves complicated. First, the data are themselves generated by
> prior judge decisions. We only observe crime outcomes for released
> defendants, not for those judges detained. This makes it hard to evaluate
> counterfactual decision rules based on algorithmic predictions. Second,
> judges may have a broader set of preferences than the single variable that
> the algorithm focuses on; for instance, judges may care about racial
> inequities or about specific crimes (such as violent crimes) rather than
> just overall crime risk. We deal with these problems using different
> econometric strategies, such as quasi-random assignment of cases to judges.
> Even accounting for these concerns, our results suggest potentially large
> welfare gains: a policy simulation shows crime can be reduced by up to 24.8%
> with no change in jailing rates, or jail populations can be reduced by 42.0%
> with no increase in crime rates.

[1]
[https://www.cs.cornell.edu/home/kleinber/w23180.pdf](https://www.cs.cornell.edu/home/kleinber/w23180.pdf)

~~~
gbrown
The authors note that judges may care explicitly about racial bias, but
based on a quick read they're making a really, really big mistake in the
language they're using: they conflate arrests with crime. Arrests and
convictions are merely a measurement mechanism for crime, and that mechanism
is known to have severe biases.

~~~
reader5000
There is no convincing evidence that arrest rates "severely" overestimate
offense rates; if anything it is just as likely arrest rates underestimate
offense rates.

~~~
moosey
While what you say is narrowly true, arrest rates (and the subsequent
conviction rates, etc.) are a poor proxy for criminal activity by race; they
have been shown to be strongly confounded by race, at least in the United
States. There are entire books on the subject. You can't tie racial arrest
rates to the underlying crime rate, because POC get arrested far more often
for the same crimes.

~~~
47gfnm8sgf7m
I'm not sure I understand that correctly. Are you saying that POC are
acquitted or have charges dropped far more often for the same crimes (i.e.
have a far lower conviction rate)?

~~~
ddulaney
In general, POC are more likely to be arrested for committing a crime.

The parent's point isn't about whether they are acquitted, it's that if you
were to commit a crime as a POC, you are more likely to be arrested than if
you had committed that same crime as a non-POC. In both scenarios you
committed a crime, but in one of them the system never has a record of it.
This is why arrest rates and crime rates are different: if a POC is more
likely to get arrested for committing a crime, the arrest rates by race (POCs
get arrested more) will not reflect the crime rates by race (differences are
generally smaller).
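The decoupling described above can be shown with toy arithmetic; the counts
and probabilities below are entirely made up for illustration:

```python
# Two groups commit the same number of crimes, but face different
# (hypothetical) probabilities of being arrested per crime committed.
crimes = {"group_a": 100, "group_b": 100}
p_arrest = {"group_a": 0.6, "group_b": 0.3}

# Expected arrests per group: identical crime rates, 2:1 arrest rates.
arrests = {g: round(crimes[g] * p_arrest[g]) for g in crimes}
print(arrests)  # {'group_a': 60, 'group_b': 30}
```

Anyone measuring only arrests would conclude group_a offends twice as often,
even though the underlying crime counts were equal by construction.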

~~~
reader5000
Most empirical data indicates that white people are arrested and convicted at
a higher rate relative to basal offense rate than black people.

~~~
iterati
Post your sources. I've seen otherwise (especially for low level drug
offenses) but you're the one making the claim.

------
lordnacho
What's remarkable is the article uses the word "algorithm" while stripping it
of its current near-magical connotation. Bravo.

------
mschuster91
A big help would be forbidding employers, landlords, and banks from
accessing or using conviction databases, with limited exceptions for
security-sensitive employment such as banking, security guards,
childcare/education, and the like.

Or, simply, do not make them "public" in the sense of "putting them on the
Internet." By all means, make convictions and court documents public in the
sense that one can go to the local library and do research there in person
(to provide a natural scaling limit). But it should not be acceptable that
"data mining" companies pollute Google name searches with conviction
records, or with court documents as sensitive as one's financial worth
during or after a divorce, and then charge people extortionate fees to
"remove" the records from their site (only for them to reappear on another
site, rinse and repeat).

In Germany, this is the norm. Drug testing by employers and checking
employees' credit scores are also not allowed, with highly limited
exceptions. As a result, while we _do_ have a problem with convicts being
discriminated against after release, it doesn't come close to the level of
the problems the US has.

~~~
Nasrudith
There is a fundamental flaw with that approach: the right to remember and
the right to print/speak. You can keep a newspaper in your house, so why is
it permissible to stop someone from archiving the facts? Even if later
proven false, it is still useful data that, say, the New York Times featured
an angry full-page Trump editorial calling for the death penalty for
someone.

I don't disagree that undue judgement is an obstacle to rehabilitation, but
I have heavy doubts that forced amnesia is a good idea, especially given the
repeat scammers who would exploit it.

~~~
astura
There's an absolutely huge distinction between "try to rewrite history so
[thing] doesn't exist anymore" and "law preventing you from taking [thing]
into account when making hiring/renting/whatever decisions."

It's not impossible: insurance companies have no problem with this. The
insurance industry is highly regulated as to which specific factors it can
and can't consider while underwriting a policy, and the rules vary by
jurisdiction. Insurers simply don't ask about or look up the factors they
aren't allowed to consider, but aggressively ask about or look up the
factors they are.

For example, in Massachusetts insurance companies are forbidden from
considering credit scores when setting premiums and making underwriting
decisions, a practice which is extremely common in the industry where it is
legal (people with high credit scores file fewer insurance claims). That
doesn't mean Massachusetts is trying to rewrite history so that your poor
credit doesn't exist; it still exists, and others are allowed to use it for
other purposes (such as underwriting loans).

Saying "we forbid you to take into consideration conviction history when
hiring" isn't anywhere near "forced amnesia."

~~~
sib
Unfortunately, if "people with high credit scores file fewer insurance
claims," then preventing insurance companies from taking this into account
is a deadweight loss to society.

People who are more likely to file insurance claims _should_ pay more for
insurance (that's the whole point of underwriting); otherwise, people who
are less likely to file claims are being unfairly overcharged.

~~~
mschuster91
Now, if you look at how many people took hits to their credit scores thanks
to the recent government shutdown, or due to massive medical bills, or due
to identity theft and other fraud, things look different.

Credit scores deserve to rot in hell forever. Humanity managed to exist for
thousands of years without this degradation of human worth to arbitrary,
totally opaque numbers.

------
shireboy
Maybe it’s naive, but I’ve often thought the law itself should be spelled
out in code. Obviously some things require human judgement, but take those
as inputs into an unbiased machine that outputs fair sentencing,
procurement, etc. Still let meatbags review it, at least for the time being;
it could be a guidance system for judges and juries at first. It seems like
it could help clear cases more quickly and provide fairness.

~~~
beefield
Yes, it is naive. All code has bugs. Some time ago, some people thought it
would be a good idea to encode contractual agreements in code. It should be
no surprise that someone found a bug in one such contract and got themselves
most of the money that was tied up in the contract framework. If I recall
correctly, we are talking about tens if not hundreds of millions of dollars
of involuntary bug bounty in that particular case (google "DAO hack" if you
have not heard of it).

Further, in my opinion there does not seem to be any solid logic behind what
I and other humans think is right or wrong, which makes it quite difficult
to spell out preferred laws in code. (If someone disagrees, I am happy to
hear one counterexample of a moral axiom that holds _always_, without any
exceptions whatsoever, realistic or unrealistic -- even for one single
person.)

~~~
0xBA5ED
Loopholes are essentially bugs, and they're common. Is there a reason to
believe another way of encoding laws would necessarily be more bug-prone than
it currently is?

~~~
pnw_hazor
One person's loophole is another person's civil right.

------
snidane
Algorithmic justice is one of those problems where techies think that
applying algorithms and techniques du jour will somehow solve the problem.
Unfortunately, law is so complex nowadays that it is riddled with internal
contradictions, and formalizing the law would only reveal them. To change
the system, a massive reduction in the complexity of the law would be
required before any math could be applied to it with some success. But then,
if we simplified the law to make it comprehensible to mortals again, who
would need any algorithms at all?

Relevant quote from Tony Hoare. s/software design/law/

 _There are two ways of constructing a software design: One way is to make it
so simple that there are obviously no deficiencies, and the other way is to
make it so complicated that there are no obvious deficiencies._

~~~
hueving
>But then if we simplified the law to make it comprehensible to mortals again,
who would need any algorithms at all?

Because automating simple systems is what software is great at. Why wouldn't
you?

~~~
dsfyu404ed
Because it is likely to take us further down the path to 1984.

Fucking with people's lives is something that should be done by a human that
can see the bigger picture, not a bunch of code daisy chained together.

~~~
lamy2000
The argument in favor is that judges fuck with people's lives too, and
judges are already unreliable. A simple ML system may make errors, but it
could make fewer errors than a judge, and there is evidence to that effect.

[https://www.nber.org/papers/w23180](https://www.nber.org/papers/w23180)

~~~
maxxxxx
The machine will probably be very predictable, so expect people to start
working around it soon.

------
pnw_hazor
OCR + Data mining with automatic form generation.

It works here because the state is willing to accept the data/results at face
value.

Kind of a stretch to call it algorithmic justice.

~~~
bertil
I would see it as a stretch to call it “machine learning” justice (ignoring
the OCR), but it is automated with a (albeit simple) rule -- and that
process is an algorithm.

As with any machine-learning project, I would recommend ‘crawl before you
run’: let this project encourage the justice system to modernise and put
basic data-handling processes in place. We’ll get to Orwellian pre-cog soon
enough. Right now, let’s get basic things right, like not arresting people
who merely share a name with an offender, or people who have committed acts
that are no longer crimes.

~~~
pnw_hazor
We could call it:

Filtration Justice

or

Greater than Justice

if time_since_last_conviction > n_years AND past_crimes_committed NOT IN
[some list of crimes] { generate form and email to prosecutor. }

The OCR is the most important part.

edit-to-add:

Some misdemeanor convictions are barred from sealing/expungement. It is
jurisdiction-dependent, but they are usually crimes such as DUIs, domestic
violence assaults, sex crimes, and so on.
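The rule sketched in the comment above can be written as a minimal runnable
check. The waiting period, the barred-crime list, and the record fields are
all hypothetical stand-ins, not anything from an actual statute:

```python
from dataclasses import dataclass

# Illustrative list of convictions commonly barred from sealing.
BARRED = {"DUI", "domestic violence assault", "sex crime"}
N_YEARS = 7  # hypothetical waiting period

@dataclass
class Record:
    years_since_last_conviction: int
    past_crimes: list

def eligible(rec: Record) -> bool:
    """A pure rule -- no prediction or scoring involved."""
    return (rec.years_since_last_conviction > N_YEARS
            and not BARRED.intersection(rec.past_crimes))

print(eligible(Record(10, ["marijuana possession"])))  # old, non-barred offense
print(eligible(Record(10, ["DUI"])))                   # barred offense
```

Eligible records would then feed the form-generation step; the interesting
engineering is upstream, in the OCR and record matching, not in this filter.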

------
qwerty456127
Algorithmic justice can only be considered justice if the laws are
scientifically proven to be just.

~~~
jobigoud
Is that different from Human-Judge justice?

~~~
qwerty456127
Human-Judge justice is judgement and human judges are supposed to have moral
authority to judge. Algorithms hardly are.

------
NoblePublius
This is lovely until the same system and precedents are used in reverse (an
algorithmic district attorney) or by banks to automate the foreclosure and
seizure of thousands of homes. The banks have done it before!

------
staticautomatic
I really hope the term "Algorithmic Justice" doesn't catch on. It reminds me
of the old saying that military justice is to justice as military music is to
music.

------
anilakar
Let me get this straight: Are there people who have been wrongly convicted and
are currently locked up only because someone hasn't done their paperwork?

~~~
twic
_So far they have reviewed 43 years of eligible convictions, proactively
dismissing and sealing 3,038 marijuana misdemeanours and reviewing, and
recalling and re-sentencing up to 4,940 other felony marijuana convictions
which were sentenced prior to Proposition 64’s passage in November 2016._

This sounds to me like it's quashing convictions for marijuana-related
activity which is no longer a crime, and where sentences have already been
served. So the convicts are no longer in prison, but they have a criminal
record, which this project will wipe clean.

In a post from last year [1], a lawyer writes:

 _But here in California, there’s hardly anybody in prison for marijuana
anymore because of the reforms in our system that have happened over the last
decade._

[1] [https://melmagazine.com/en-us/story/if-cannabis-becomes-
lega...](https://melmagazine.com/en-us/story/if-cannabis-becomes-legal-what-
happens-to-everyone-in-jail-for-weed-2)

------
tareqak
Here is the APNews article about dropping the pot convictions:
[https://apnews.com/1aeb5fed9e8746d8b120049e908af06c](https://apnews.com/1aeb5fed9e8746d8b120049e908af06c)
. They don't use the phrase "algorithmic justice", just technology.

------
yboris
Even simple linear models can outperform "experts" in numerous fields,
especially in those where the feedback loop (for learning) is very long (as
_is_ the case in the criminal justice system).
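The "simple linear models" in question are typically improper linear models
in the Dawes tradition that Bishop's paper builds on: standardize each
predictor and sum with equal weights, with no fitting at all. A toy sketch,
with made-up numbers:

```python
# Unit-weighted ("improper") linear model: z-score each predictor,
# then sum the z-scores with equal weights -- nothing is estimated.
from statistics import mean, stdev

def unit_weighted_score(rows):
    cols = list(zip(*rows))
    mus = [mean(c) for c in cols]
    sds = [stdev(c) for c in cols]
    return [sum((x - m) / s for x, m, s in zip(row, mus, sds))
            for row in rows]

# Toy data: two hypothetical predictors per case.
rows = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
scores = unit_weighted_score(rows)
print(scores)  # [-2.0, 0.0, 2.0]
```

The surprising empirical claim is that rankings from such unfitted models
often track expert judgement closely, precisely because experts rarely get
the fast, clean feedback needed to learn better weights.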

An amazing paper that touches on this topic: _In Praise of Epistemic
Irresponsibility: How lazy and ignorant can you be?_ by Michael A. Bishop

You can find the PDF easily. Here's a ref:
[https://link.springer.com/article/10.1023/A:1005228312224](https://link.springer.com/article/10.1023/A:1005228312224)

------
worldsayshi
> [https://www.codeforamerica.org/](https://www.codeforamerica.org/)

Does anyone know of good examples of such initiatives in the EU?

~~~
poloniculmov
See [https://codeforall.org/](https://codeforall.org/)

------
Shivetya
What can be used to crawl through defendants' records to exonerate them can
also be used for other purposes, so care will always need to be taken when
expanding processes such as this.

That out of the way, in the long term it would be best to standardize how
this information is stored, categorized, and shared, so that exoneration in
one locality can be more easily implemented in another.

~~~
dsfyu404ed
This technology has long been used to dig through the past of anyone and
everyone that comes in contact with the legal system. When you get pulled over
for a taillight out and the cop runs your driver's license your pot charge
from the 80s comes up.

------
maitland
The Digital Regime of Truth: From the Algorithmic Governmentality to a New
Rule of Law - [https://www.iainmaitland.com/pdf/Rouvroy-
Stiegler.pdf](https://www.iainmaitland.com/pdf/Rouvroy-Stiegler.pdf)

------
Razengan
Lawyers and doctors are probably the best candidates for replacement by AI,
and those professions will lobby the hardest against their replacement, but
ultimately I would rather be governed by transparent, open-source AI than by
humans.

~~~
NateEag
Speaking as one who was raised by a physician, I cower in horror at the
thought of having a treatment plan designed and implemented by a neural
network without human intervention.

We've seen that for a number of difficult problems, deep neural networks,
given sufficient data and expert trainers, do a good job most of the time and
occasionally come to mortifyingly wrong conclusions.

Run a 'sed s/mortifying/mortal/' on that previous sentence and think about it
for a minute.

~~~
Razengan
Humans also make mistakes that are fatal for others, and we have been thinking
about _that_ for millennia.

Humans also subvert things in order to make more money.

Maybe not complete replacement, but a setup where the AI does most of the
thinking, shows "This is what I found, and this is what I used to decide
this," and humans have final approval on whether to go ahead with whatever
it decided.

If someone disagrees, the disagreement would be submitted to the AI, which
would re-evaluate everything while weighing it against the objection.

~~~
NateEag
As an assistant to a heavily-trained, expert human, I do think it would be a
helpful aid in catching human error.

I would make my tiebreaker another human, though.

If the AI system showed the reasoning behind its choice, that would help a lot
with assuaging my fears. I was thinking of the current state of neural
networks, where there's no way to see how it drew its conclusions.

------
KorematsuFred
I think I would prefer a system where algorithms decide whom not to charge;
only those to be charged go to a real court.

Also, all plea bargains must be algorithmic, so that narcissistic DAs can't
browbeat ordinary people into accepting guilt.

------
KorematsuFred
Remember the time Google Image search started mistaking black people for
monkeys? I am worried about algorithmic judges for the same reason: that
minorities might get into trouble here.

------
atemerev
Judging... 97% complete.

------
crimsonalucard
So does this algorithm generate a report that predicts the future? Like a
minority report?

------
sametmax
Algorithmic justice: the one thing that is worse than electronic voting. I
didn't think that was possible.

