
Proprietary algorithm that predicts future criminals okayed by Wisconsin court - nonprofiteer
https://fusion.net/story/330672/algorithms-recidivism-loomis-wisconsin-court/
======
Freak_NL
I can't think of any reason why anyone with a decent grasp of technology, or
even of logic and rational decision making in general, would consider the use
of secret, proprietary algorithms permissible in the due course of justice. So
why is this deemed acceptable here? Is it a lack of understanding of how a
decision-making tool differs from, say, the brand of microphones used in the
courtroom?

Mind, I don't see how anyone can condone privatized penitentiary institutions
either. Perverse incentives and all.

~~~
onetwotree
This stuff has been in use in the probation and parole system for decades now.

Whether it's a "sophisticated" algorithm or a sheet where a parole officer
writes down numbers and adds them up, there's this constant idea that it can
be used to determine how likely an individual is to re-offend.

Rejecting the use of such a system would call into question the methodology
used in deciding whether or not to revoke someone's parole/probation - this is
incredibly significant, because, to take the example of Wisconsin, over half
of the state prison inmates are in on revocation.

I'm just saying this by way of presenting an alternative idea of why the court
might have rejected the appeal. It's also important to note that in Wisconsin
we have a _very_ conservative supreme court.

I find that the more I learn about the criminal justice system, the more
shocked I am that we allow it to continue. It's not a law and order vs. wishy-
washy liberalism thing - our criminal justice system demonstrably increases
recidivism.

~~~
x0x0
The idea of an Excel spreadsheet and points is at least understandable.

A cynic might think that the value of a complicated algorithm is you can, as
Pinboard said in a similar context, money launder your bias.

Hell, I do ML on a daily basis, and for anything but a regression or a
decision tree it's complex or outright impossible to explain why an algorithm
picked what it did for a specific example, let alone to evaluate what an
algorithm is picking up on in general.
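
Just to make that concrete, here's a rough sketch (synthetic data,
scikit-learn, all feature values invented) of why trees are the interpretable
exception -- you can walk the exact decision path for a single example, which
you can't meaningfully do for a black-box model:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))                   # made-up features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # made-up label

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

    # For one example, print every split it passed through.
    sample = X[:1]
    for node in tree.decision_path(sample).indices:
        feat = tree.tree_.feature[node]
        if feat >= 0:  # negative values mark leaf nodes
            thresh = tree.tree_.threshold[node]
            op = "<=" if sample[0, feat] <= thresh else ">"
            print(f"node {node}: feature[{feat}] {op} {thresh:.2f}")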

And that doesn't even get into the mess that is highly correlated variables
(e.g. ethnicity, SES, income, peer income, parental income, education, arrest
rate, housing location), most of which are largely synonyms for each other.
Or the bias in the data themselves -- being poor or a minority increases the
detection rate of criminality.
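
Quick illustration of the proxy problem (all numbers and correlation
strengths invented): drop the sensitive attribute from the features and a
model can still reconstruct it from the correlated ones, so "blinding" the
algorithm removes nothing.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    group = rng.integers(0, 2, n)                  # sensitive attribute
    income = group + rng.normal(0, 0.8, n)         # correlated proxy
    zipcode = 1.2 * group + rng.normal(0, 0.8, n)  # another proxy

    X = np.column_stack([income, zipcode])         # note: `group` excluded
    X_tr, X_te, g_tr, g_te = train_test_split(X, group, random_state=0)

    clf = LogisticRegression().fit(X_tr, g_tr)
    print("group recovered from proxies:", clf.score(X_te, g_te))
    # accuracy lands well above the 0.5 you'd get if the
    # proxies carried no group signal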

~~~
onetwotree
> money launder your bias

I like this phrase. I'm stealing it :-)

~~~
x0x0
I stole it first!

source:
[https://twitter.com/Pinboard/status/744595961217835008](https://twitter.com/Pinboard/status/744595961217835008)

------
ccvannorman
Thank god. It's about time we finally use technology for good, and governments
can finally do their jobs effectively without the guesswork.

------
mixedCase
Of all the weird things in cyberpunk culture, the one I never thought I'd see
in my lifetime is Psycho-Pass. And here we are.

------
cakebrewery
It would be insanely unfair if this algorithm were based on statistics around
economic (or racial) status.

~~~
beached_whale
From what I have read elsewhere, the data sets used for this type of decision
making are biased with respect to race. That is currently an active area of
research.

[https://www.fordfoundation.org/ideas/equals-change-blog/post...](https://www.fordfoundation.org/ideas/equals-change-blog/posts/can-computers-be-racist-big-data-inequality-and-discrimination/)

------
oyebenny
Is this how AI will begin to segregate our futures and our lives?

