
Let's not have Black Mirror in real life - thailor3
https://medium.com/swlh/the-not-so-simple-ethics-involved-in-software-engineering-80b24fcc1f0d
======
planetzero
"In the above situation, the data probably had more photos of whites than of
other skin tones, meaning the model learned to pick out details for whites but
was unable to do that for others. Not enough data can be dangerous."

China seems to be doing a pretty good job... and 99% of the population has non-white skin tones.
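
To be fair, the mechanism the quote describes is real; it just cuts whichever way the training data does. A toy sketch with purely synthetic data (nothing to do with any real face system, all numbers made up) shows the pattern:

    # Toy sketch, synthetic data only -- not a real face model. A classifier
    # trained on imbalanced data picks up the patterns of whichever group
    # dominates the training set and does much worse on the rest.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    def make_group(n, signal_dim):
        # Each group's label is driven by a different feature, standing in
        # for "different faces need different learned details".
        X = rng.normal(0.0, 1.0, size=(n, 5))
        y = (X[:, signal_dim] + rng.normal(0.0, 0.5, size=n) > 0).astype(int)
        return X, y

    Xa, ya = make_group(5000, signal_dim=0)  # overrepresented group
    Xb, yb = make_group(100, signal_dim=1)   # underrepresented group
    model = LogisticRegression(max_iter=1000).fit(
        np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

    # Accuracy on fresh samples from each group: high for the group that
    # dominated training, near chance for the other.
    for name, dim in [("overrepresented", 0), ("underrepresented", 1)]:
        Xt, yt = make_group(2000, dim)
        print(name, round(accuracy_score(yt, model.predict(Xt)), 3))

Same model either way; the only thing that changed is which group dominated the training set.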

"This has many implications. One example is in the case of a facial
recognition software looking for a wanted criminal. If the wanted criminal
happens to be African American or Asian, and the person whose face is being
scanned has the same skin tone, the software is more likely to falsely signal
a match when the match is not correct."

Facial recognition is not perfect and was never claimed to be. As long as we
aren't prosecuting people based on facial recognition alone (which isn't
happening anywhere in the US), false positives won't really be an issue.
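
That said, a raw match should only ever be treated as a lead, and a back-of-the-envelope with made-up numbers shows why: scan a large crowd for one person, and even a very accurate matcher returns mostly false alarms.

    # Made-up numbers, purely illustrative of the base-rate effect.
    population = 1_000_000    # faces scanned while hunting one wanted person
    false_match_rate = 0.001  # chance a random face falsely matches
    true_match_rate = 0.99    # chance the actual target is flagged if scanned

    expected_false_alarms = (population - 1) * false_match_rate
    p_flag_is_target = true_match_rate / (true_match_rate + expected_false_alarms)

    print(f"expected false alarms: {expected_false_alarms:.0f}")         # ~1000
    print(f"chance a given flag is the target: {p_flag_is_target:.2%}")  # ~0.10%

Hence the caveat above: useful for narrowing a search, not for proving identity on its own.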

