

Judge, jury and executioner: the unaccountable algorithm - tbrownaw
http://aeon.co/magazine/technology/judge-jury-and-executioner-the-unaccountable-algorithm/

======
tbrownaw
I'm not so concerned about the individual unfairness the article highlights as
about the "what if everybody did this" problem. These algorithms are designed
to automatically pull correlations out of large data sets, so if the designers
aren't incompetent, then given roughly the same data -- which seems a
reasonable assumption -- they'll all find roughly the same correlations.

Which is a problem whether or not it also correlates with a protected class.
You still end up with a group being excluded from society, even if it's a
relatively new group that hasn't needed explicit legal protection before.

This applies even if the criteria _do_ only include actually causally relevant
things, as long as all the algorithms are more-or-less in step; you still get
people being systematically excluded.

Some amount of fuzziness in decision making is useful to society, even if it
doesn't benefit any of the individual decision makers.
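A toy simulation makes the point concrete. The setup below is entirely hypothetical (the applicant scores, lender count, and threshold are all made up for illustration): each of several "lenders" thresholds an applicant's score plus its own independent noise. With zero noise the lenders are in perfect lockstep, so everyone below the threshold is rejected everywhere; with some per-lender fuzziness, far fewer applicants are shut out by all lenders at once.

```python
import random

random.seed(0)

N_APPLICANTS = 1000
N_LENDERS = 10
THRESHOLD = 0.5

# Hypothetical latent "creditworthiness" score per applicant, in [0, 1].
scores = [random.random() for _ in range(N_APPLICANTS)]

def rejected_everywhere(noise):
    """Count applicants rejected by ALL lenders, where each lender
    independently perturbs the score by uniform noise before thresholding."""
    count = 0
    for s in scores:
        accepted_somewhere = any(
            s + random.uniform(-noise, noise) >= THRESHOLD
            for _ in range(N_LENDERS)
        )
        if not accepted_somewhere:
            count += 1
    return count

# Identical models (no fuzziness): everyone below the threshold is
# excluded by every lender simultaneously -- roughly half of applicants.
print(rejected_everywhere(0.0))

# Independent per-lender fuzziness: only applicants far below the
# threshold are rejected everywhere; the borderline cases get in somewhere.
print(rejected_everywhere(0.3))
```

The absolute numbers are arbitrary; the point is the gap between the two counts, which is exactly the societal value of decision-making noise the paragraph above describes.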

Plus there's also the monoculture / local maximum issue.


(Also, I find it a bit curious that the subject line in my RSS feed, the page
title, and the article headline are all different.)

