More than that, though. Graduates of all-women colleges were also caught. If you're using school as a data point, that's extremely hard to sanitize.



Then what is the purpose of this? At some point you want this thing to "discriminate" (or "select", if that's a better word) among people based on what they have done in life. That is not negative per se.


But you don't want it to select based on gender.


Would it, though? A school name is essentially just that; there is no gender information in it, even with the "women" prefix. If you can discriminate between other schools, you can do the same with these. FWIW, there could be a genuine difference in performance that the ML picks up on.


It would. Just because it's not explicitly looking for a "W" in the gender field doesn't mean it can't determine gender and discriminate based on that. The article and the discussion are all about how these systems, despite never being explicitly told to discriminate based on gender, race, or any number of other attributes, can still latch onto a set of features that are more prevalent among those groups, and discriminate against those people all the same.
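
To make the proxy effect concrete, here's a minimal sketch with hypothetical toy data (scikit-learn assumed): even after the gender column is dropped, the school name alone reconstructs gender almost perfectly, so any downstream hiring model trained on it can discriminate by proxy.

    # Toy illustration (hypothetical data): dropping the gender column
    # does not remove gender information when a correlated feature
    # like school name remains in the inputs.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression

    # "Sanitized" resumes: gender removed, school name kept.
    resumes = [
        {"school": "Wellesley College"},   # historically all-women
        {"school": "Smith College"},       # historically all-women
        {"school": "State Tech"},
        {"school": "State Tech"},
        {"school": "Wellesley College"},
        {"school": "City University"},
    ]
    # Hidden attribute the model was never shown explicitly.
    gender = ["F", "F", "M", "M", "F", "M"]

    # One-hot encode the school names and fit a simple classifier.
    X = DictVectorizer(sparse=False).fit_transform(resumes)
    clf = LogisticRegression().fit(X, gender)

    # The "gender-free" features recover gender with high accuracy,
    # so a hiring model can penalize it all the same.
    print(clf.score(X, gender))  # ~1.0 on this toy data

The point isn't that the model has a gender feature; it's that the information is linearly recoverable from the features it does have.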


>despite not explicitly being told to discriminate based on gender, or race, or any number of factors

Then this is completely useless. You want this "AI" to discriminate based on a number of things. That's the whole point. You want to find people who can work for you. If a specific school or title is a bad indicator (based on who you've hired so far), then it just is one.



