Are you scared yet? Meet Norman, the psychopathic AI (bbc.com)
13 points by hdivider on June 3, 2018 | 6 comments



> One study showed that software trained on Google News became sexist as a result of the data it was learning from. When asked to complete the statement, "Man is to computer programmer as woman is to X", the software replied "homemaker".

> Dr Joanna Bryson, from the University of Bath's department of computer science, said that the issue of sexist AI could be down to the fact that a lot of machines are programmed by "white, single guys from California" and can be addressed, at least partially, by diversifying the workforce.

Just curious, but wouldn't this more likely be a result of bias in the training data rather than of the diversity of the programmers?


> When asked to complete the statement, "Man is to computer programmer as woman is to X", the software replied "homemaker".

Seems they're putting a lot of weight on an unqualified question... the AI just fed back the easiest, most popular, stereotypical answer it came up with. A tee-ball response.

It didn't stop and think about who'd be offended first.
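For reference, the study quoted above appears to be the well-known word2vec analogy work (Bolukbasi et al., 2016), where "completing the statement" just means a nearest-neighbor lookup in an embedding space. A minimal sketch with gensim, assuming a local copy of the pretrained GoogleNews vectors (the file name here is a placeholder):

    # Sketch of the analogy completion, assuming the pretrained
    # GoogleNews word2vec vectors (file name is a placeholder).
    from gensim.models import KeyedVectors

    model = KeyedVectors.load_word2vec_format(
        "GoogleNews-vectors-negative300.bin", binary=True
    )

    # "man" is to "computer_programmer" as "woman" is to X:
    # find the nearest neighbor of programmer - man + woman.
    print(model.most_similar(
        positive=["computer_programmer", "woman"],
        negative=["man"],
        topn=1,
    ))
    # The paper reports "homemaker" as the top completion.

Note the model isn't "answering a question" at all: most_similar returns whichever word vector sits closest to that arithmetic result, which is exactly the easiest, most popular response described above.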


Yes, but who picks the training data?


If it was trained on Google News, it makes more sense to me that this is a reflection of our journalists, and possibly of whatever algorithm Google uses to rank popularity; in other words, a reflection of our society.


Yes, but the intention is to demonstrate one of the issues with using ML. Take, for example, translating into a language with grammatical gender. A human translator will read the passage to determine a doctor's gender. A machine translation system only knows that "doctor" is usually translated in the masculine form and applies that to the translation. If the bias in the training data is strong enough, it can override explicit signs in the source text that the doctor is a woman.

If a person reads enough translated content, this just passes on the bias from the training data.

https://mashable.com/2017/11/30/google-translate-sexism/?eur...
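To make the failure mode concrete, here is a toy scoring model (entirely hypothetical words and counts, not any real MT system): the corpus prior on the masculine form is strong enough that an explicit "she" in the source barely moves the score.

    # Toy illustration only: made-up corpus counts, not a real
    # translation system.
    CORPUS_COUNTS = {"el doctor": 950, "la doctora": 50}

    def translate_doctor(sentence):
        # Give the feminine form a small bump when the source contains
        # an explicit cue; a lopsided training set makes a real system
        # weight such cues far too weakly, as modeled here.
        cue_bonus = 100 if " she " in " " + sentence.lower() + " " else 0
        scores = {
            "el doctor": CORPUS_COUNTS["el doctor"],
            "la doctora": CORPUS_COUNTS["la doctora"] + cue_bonus,
        }
        return max(scores, key=scores.get)

    print(translate_doctor("The doctor said she would call back"))
    # -> "el doctor": the 950-vs-150 prior wins despite the pronoun.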


Right. This is simply a matter of what was used as training data.



