
When Government Rules by Software, Citizens Are Left in the Dark - AndrewDucker
https://www.wired.com/story/when-government-rules-by-software-citizens-are-left-in-the-dark
======
clarkevans
A few months back, the Wisconsin Supreme Court ruled against someone who
wanted to examine the algorithms by which he was sentenced.

[https://www.nytimes.com/2017/05/01/us/politics/sent-to-priso...](https://www.nytimes.com/2017/05/01/us/politics/sent-to-prison-by-a-software-programs-secret-algorithms.html)

~~~
rm_-rf_slash
To be fair, the very article details that the conditions of his arrest (evading
police in a car he drove during a shooting) would have resulted in the same
sentence under Wisconsin law. He also had prior sex offense convictions.
This wasn't a guy with a clean record being persecuted for the sake of
corporate profit.

I don't like the idea of being tried and sentenced by an unaccountable
algorithm any more than you do, but when using examples like these, we also
have to accept that if someone's gonna do the crime, they gotta do the time.

~~~
lghh
Regardless of whether the person committed the crime or how clear it was that
they did, being able to review the algorithm should be their right, just as it
would seem to be in the case of a questionable arrest.

~~~
rm_-rf_slash
Agreed. The "what if it's me" response ought to be felt by everybody. And
algorithms used in criminal justice cases should be open and reviewable.

I made my comment above because it's easy to read "6 years...algorithmic
sentencing...unaccountable" and miss the fact that this sentencing was given
to a dangerous man with multiple prior convictions.

~~~
cortesoft
Right, but the fact that he deserved the punishment doesn't change anything; he
still has a right to know the rules by which he was sentenced.

~~~
rm_-rf_slash
Technically, the rules were already there. Absent the algorithm, regular
sentencing guidelines would have given him 6 years either way.

------
Chardok
Ultimately it seems like the software issue is just a symptom of the greater
problem that is our criminal justice system.

I think software could play a helpful role in all aspects of our government,
but having closed source, proprietary algorithms play a hand in someone's life
hardly seems like justice to me.

This needs to be stopped immediately, and some serious debate needs to go into
how software can assist _everyone_ in assessing the facts and making a
judgement, instead of just increasing the speed at which people can be
processed.

------
hprotagonist
_Captain Jean-Luc Picard: I don't know how to communicate this, or even if it
is possible. But the question of justice has concerned me greatly of late. And
I say to any creature who may be listening, there can be no justice so long as
laws are absolute. Even life itself is an exercise in exceptions._

------
matthjensen
I am equally concerned by the application of secret simulation models by
organizations like the Congressional Budget Office for fiscal policy decision
making. Our legislators rely on models that even they do not have access to
when writing spending and tax laws.

~~~
creaghpatr
The CBO estimations compared to the actuals speak for themselves.

I find it hilarious when venture capitalists cite CBO forecasts on Twitter
when, in the real world, bringing that kind of data into a startup pitch would
get you laughed out of the room.

[https://www.forbes.com/forbes/welcome/?toURL=https://www.for...](https://www.forbes.com/forbes/welcome/?toURL=https://www.forbes.com/sites/theapothecary/2017/01/02/learning-from-cbos-history-of-incorrect-obamacare-projections/)

I don't mean to pick on the ACA either; this applies to virtually all CBO
estimations.

~~~
xxpor
Because it turns out that the modeling the CBO has to do is WAY harder than
what your typical startup has to worry about.

------
wtmt
The first part of this comment may be off topic for this specific article. The
Aadhaar (unique ID) program in India is a bungling mess of hardware and
software that has excluded millions of people from their legally entitled
welfare benefits, helped leak more personal and sensitive information than
ever before and is being touted by the government (and its supporters) as
India moving into the "developed world club." The central biometrics database
and the entire system have never been audited in all their years of existence.
No security-related tests have been done. There are no strict
policies for third party agents to follow (just a bunch of loose guidelines,
including nonsensical ones like "Windows XP is a supported OS").

Governments are eager to use technology as if it were a magic wand that could
cure all the ills around, without understanding what the technology is about,
what its limitations are, how it could be misused, the issues that arise on
misuse, etc. Technology is just a tool. It can be lousy or good, and in either
case it can be put to use in different ways that have detrimental effects on
an entire country (or state). Governments need to be competent and capable
agents first and foremost, and that's missing in many places. The gullibility
of some governments and government agencies is taken advantage of by others.

------
mschuster91
Seriously, letting software blackboxes decide upon the freedom of humans? Has
the US gone... totally bonkers now?

What happened to the right to fair process? How can that be ensured AT ALL?

~~~
goialoq
We've had human blackboxes decide upon the freedom of humans for centuries.

~~~
dTal
Humans aren't black boxes. You can press them on their reasoning, and they
often experience public criticism if they refuse to provide it.

------
goialoq
Of course, when government rules by human judgment, citizens are left in the
dark. Software doesn't change anything here.

~~~
cheez
Judges are elected or, at worst, appointed by someone who is elected.

~~~
RugnirViking
This is far from a panacea; see the inability of successive governments in
many democracies to change voting systems or to enact large-scale education
and healthcare reform.

Elected officials, while a good way of ensuring no one truly vile gets into
power, also ensure no one truly inspired gets into power.

Edit: Just to note, I support democracy (it should go without saying), but we
should all see that it isn't infallible.

------
mnm1
This is absolutely insane. Why can't governments develop their own in-house
solutions? Create jobs and have full access. The law should demand that these
algorithms and all information about these systems be revealed publicly or
these systems should not be used. I don't want some random code from some
random for-profit company deciding my fate or anyone's fate. I don't even
understand how this is legal or constitutional. I'd bet serious money all
these systems are in violation of multiple laws, especially discrimination
laws. What a fucked up technology. If this is what people talk about when
saying they are afraid of AI, then yes, I'm absolutely fucking terrified. Fuck
this "AI". It has no place in our society. Fuck these companies that develop
it too, making money off of people's suffering.

------
rrdharan
I'm surprised the article makes no mention of this book which covers the exact
topic extensively, and which has (AFAICT) received a lot of exposure and
publicity.

[https://weaponsofmathdestructionbook.com/](https://weaponsofmathdestructionbook.com/)

------
tomjen3
Misleading headline: the algorithm has been made public and it is included in
the article as a PDF link.

~~~
goialoq
The algorithm, yes, but not the process of building the algorithm:

> Goodman argues the foundation should disclose more information about its
> dataset and how it was analyzed to design PSA, as well as the results of any
> validation tests performed to tune the risk scores it assigns people. That
> information would help governments and citizens understand PSA’s strengths
> and weaknesses, and compare it with competing pretrial risk-assessment
> software.

------
supernumerary
Bernard Stiegler has been writing about this:
[https://iainmait.land/pdf/Rouvroy-Stiegler.pdf](https://iainmait.land/pdf/Rouvroy-Stiegler.pdf)

------
interstitial
They used to scream "child porn" and everyone would bow down. Then it was
TERRORISM. And everyone complained about profiling. This at least had a
smaller footprint. Now they just scream "White Privilege" and no one cares
about profiling. Because evil white men deserve it.

~~~
ionised
Stop with the victim complex.

