
Is Machine Learning/AI Sexist? - spikewestern
https://shecancode.io/blog/is-machine-learningai-sexist
======
falkenb0t
The author cites word2vec as being biased and gives the example of it
associating 'father' with 'doctor' but associating 'mother' with 'nurse'.
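For context, that kind of analogy result comes out of plain vector arithmetic over the learned embeddings. Here's a minimal sketch of the vector-offset trick with made-up 2-d toy vectors (real word2vec vectors are learned from text and have hundreds of dimensions; these are hand-crafted purely to illustrate the mechanism):

```python
import numpy as np

# Hand-crafted toy "embeddings" (dimensions: gender, medical), NOT real
# word2vec weights. They are constructed so the biased analogy falls out
# of the same vector-offset arithmetic used in the original word2vec demos.
vecs = {
    "father":   np.array([ 1.0,  0.0]),
    "mother":   np.array([-1.0,  0.0]),
    "doctor":   np.array([ 1.0,  1.0]),
    "nurse":    np.array([-1.0,  1.0]),
    "engineer": np.array([ 0.9, -0.5]),
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    # Solve "a is to b as c is to ?" by finding the word whose vector is
    # closest to v(b) - v(a) + v(c), excluding the query words themselves.
    target = vecs[b] - vecs[a] + vecs[c]
    candidates = [w for w in vecs if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(vecs[w], target))

print(analogy("father", "doctor", "mother"))  # with these toy vectors: nurse
```

The point is that nothing in the code "knows" about gender roles; the answer is whatever word happens to sit nearest the offset vector, and with learned embeddings that position is determined entirely by co-occurrence statistics in the training corpus.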

While this is certainly a bias, I think the insinuation is that this makes a
program inherently sexist or racist.

I don't think anyone would really call the statement "in today's society,
certain professions are male dominated" particularly sexist.

What I'm getting at is that I think there is a distinction to be made here
where bias introduced via training data doesn't necessarily translate to
sexism or racism.

Additionally, the problem doesn't exist so much in the programs as in the
training data. No engineer at Google sat down at their computer and went
"yep, men are doctors and women are nurses. Better program that into
word2vec". That behaviour was learned from the information available.

