
I think things will get even worse now that criminal "scouting" and even vetting are being done via learning models. You may not even find hard filters or conditionals; instead, the errors (or stereotypes?) will be embedded deep inside some neural net. I'm not even sure how one would explain that to a jury.

Via learning models? I wonder how long it will take before such a system just automatically selects any poor black person.

Machine learning doesn't distinguish between correlation and causation. No doubt such systems already help reproduce the inequalities present in the data they've been trained on.

Yeah, that's what I'm thinking. GIGO, basically: the US records it would be trained on suggest that poor, black, and mentally ill people are most likely to be criminals. The software will pick up on that.
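The GIGO point can be made concrete with a toy simulation (entirely hypothetical data and rates, not a real system or dataset): if two groups offend at the same true rate but one is policed more heavily, the recorded labels alone make that group look several times riskier, and any model fit to those labels inherits the bias.

```python
import random

random.seed(0)

def make_record():
    # Protected attribute: group 0 or 1. True offending rate is identical
    # (10%) for both groups, but group 1 is policed more heavily, so its
    # offences are *recorded* 3x as often. The bias lives in the labels.
    group = random.randint(0, 1)
    offended = random.random() < 0.10
    detect_rate = 0.9 if group == 1 else 0.3
    recorded = offended and random.random() < detect_rate
    return group, int(recorded)

data = [make_record() for _ in range(20000)]

def recorded_rate(g):
    # The per-group rate of recorded crime -- all a learning model sees.
    rows = [y for x, y in data if x == g]
    return sum(rows) / len(rows)

r0, r1 = recorded_rate(0), recorded_rate(1)
print(f"group 0 recorded rate: {r0:.3f}")  # ~0.03
print(f"group 1 recorded rate: {r1:.3f}")  # ~0.09
# A model trained on these labels scores group 1 as roughly 3x riskier,
# even though the true offending rate was 0.10 for both groups.
```

The numbers are made up to illustrate the mechanism: the model is doing exactly what it was asked, faithfully predicting the recorded labels, and the disparity comes entirely from who gets caught, not who offends.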
