
California Bill Would Mandate 'Crime Prediction' Algorithms Instead of Cash Bail - elsewhen
https://www.vice.com/en_us/article/n7wymd/california-bill-would-mandate-crime-prediction-algorithms-instead-of-cash-bail
======
astrophysician
As a data scientist, this is terrifying. A questionnaire asks things like
"have you only held low-wage jobs" to determine whether someone should be
locked up? And the model isn't fully open to the public? How that is
justified is beyond me; this model literally decides whether or not you, as
a citizen, are locked in a box.

If it's not obvious: these methods easily reinforce all sorts of terrible
biases in the data. Even with a perfectly transparent model, it's unclear what
sort of data biases you are reinforcing, because getting unbiased data is
extremely difficult if not impossible.
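To make that concrete, here's a minimal, purely illustrative simulation (all names and rates are made up): two groups have the same true reoffense rate, but one is observed twice as closely, so its reoffenses get recorded more often. Any model fit to those recorded labels just learns the skewed base rates and scores that group as roughly twice as risky.

```python
import random

random.seed(0)

# Hypothetical setup: equal TRUE reoffense rates in both groups,
# but group A is watched twice as closely as group B, so its
# reoffenses are recorded (detected) twice as often.
TRUE_RATE = 0.3
DETECTION = {"A": 0.8, "B": 0.4}  # the biased observation process

def observed_rate(group, n=100_000):
    """Fraction of people in `group` with a *recorded* reoffense."""
    hits = 0
    for _ in range(n):
        reoffends = random.random() < TRUE_RATE
        detected = reoffends and random.random() < DETECTION[group]
        hits += detected
    return hits / n

# A model trained on these labels can do no better than the observed
# base rates, so it rates group A about twice as risky as group B,
# even though their true risk is identical.
rate_a = observed_rate("A")
rate_b = observed_rate("B")
print(f"observed risk A: {rate_a:.2f}, B: {rate_b:.2f}")
```

No amount of model transparency fixes this: the bias lives in how the labels were collected, not in the weights.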

------
lsb
This is a serious problem in many ways.

Ethics can be a distraction from justice.

Ethics locates the problem in the algorithm, whereas justice locates the
problem in the power differential.

People generally trust computers more than random human decisions, and
computers can automate the repercussions of their decisions quickly. If you
work hourly at a government job, and you get your hours yanked when you're
judged a risk, without time for an appeal, things spiral out of your control,
exacerbating oppression.

