How about this recent opinion piece: "Robots are racist and sexist. Just like the people who created them" [0]. Basically, if you're one of those "white, straight men" building software, you're racist and sexist. I wouldn't call that moderate feminism.

[0] https://www.theguardian.com/commentisfree/2017/apr/20/robots...




Why is that extreme? The author is asking that developers take care in their work and account for unintended consequences of a non-moral entity learning from a world with racism & sexism in it. That seems prudent to me.


> machines can work only from the information given to them, usually by the white, straight men who dominate the fields of technology and robotics.

> one Google image search using technology “trained” to recognise faces based on images of Caucasians included African-American people among its search results for gorillas

> Microsoft created a chatbot, Tay, which could “learn” and develop as it engaged with users on social media. Within hours it had pledged allegiance to Hitler

> Robots are racist and sexist. Just like the people who created them

The author is determined to see sexist intent in everything, and bends the truth to match (e.g. claiming Microsoft's software pledged allegiance to Hitler).

That said, steer clear of the opinion pieces and you can avoid the worst of this junk.


Is it not extreme to claim that all software developers are racist and sexist, and that they are all "white, straight males"?

The other points she makes might indeed be worthy of discussion. However, I think some of what she describes is simply misclassified data: black people get misclassified, but so do white people (it just doesn't make the news). Unless she can show that white people are misclassified less frequently than black people, she has no point.
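
To make that concrete: the claim is checkable with a per-group error-rate audit. A minimal sketch in Python (the group labels, data, and numbers here are hypothetical, purely to illustrate the comparison being asked for):

    from collections import defaultdict

    def misclassification_rates(records):
        """Compute the misclassification rate per group.

        records: iterable of (group, true_label, predicted_label) tuples.
        Returns a dict mapping group -> fraction of misclassified records.
        """
        errors = defaultdict(int)
        totals = defaultdict(int)
        for group, true_label, predicted in records:
            totals[group] += 1
            if predicted != true_label:
                errors[group] += 1
        return {g: errors[g] / totals[g] for g in totals}

    # Hypothetical audit data: the bias claim only holds up
    # if the rates differ meaningfully between groups.
    sample = [
        ("A", "person", "person"),
        ("A", "person", "gorilla"),  # misclassification
        ("B", "person", "person"),
        ("B", "person", "person"),
    ]
    print(misclassification_rates(sample))  # {'A': 0.5, 'B': 0.0}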

The Tay incident was not due to developer bias but to trolls deliberately feeding the AI racist content. Despite what she vaguely claims later in the article, it wasn't encoded in the bot by the developers, not even subconsciously.


> Is it not extreme to claim that all software developers are racist and sexist, and that they are all "white, straight males"?

I never saw that claim in the article. She did claim that white, straight men dominate the fields of technology and robotics, but that doesn't seem controversial to me (though I would be happy to see evidence against it).

I'm not sure that either of your points diminish the argument of the article. I didn't get the impression that the author thought that developers are purposefully creating racist robots. To me, she was saying that those who suffer bigotry the least will also be the least likely to account for it in the systems they design because they see the world as less bigoted than it is and has been. Sure, in hindsight, the two examples you mentioned can be explained as poor sources of information. But if we're going to avoid bigoted tech & robots, we'll need to catch those issues beforehand, and I think her point is that more diversity would lead to better foresight on such things.



