
Service that uses AI to identify gender based on names looks incredibly biased - mschuster91
https://www.theverge.com/2020/7/29/21346310/ai-service-gender-verification-identification-genderify
======
TheAdamAndChe
Someone made a funny, novel toy, and they are seriously talking about the
"harm" it brings?

We seriously need to reevaluate the nature of communication on the internet.
Something about it just facilitates the social contagion of oversensitivity
and hurt feelings.

~~~
Kye
For example: oversensitive people like to dismiss concerns before they've
taken a single second to hear or understand them. Actually _listening_ to
people and engaging with their words, instead of leaning into one's own
biases and hurt feelings about the subject, would improve communication on
the internet.

It wasn't a:

>> _"funny, novel toy"_

This was someone trying to build a business on guessing gender based on
meaningless data. It did a very poor job even before considering the ethical
issues. It couldn't even get it right with the name of the person behind it.

~~~
TheAdamAndChe
> This was someone trying to build a business on guessing gender based on
> meaningless data

This claim is baseless. We don't know their dataset or how it was used to
train the model.

> It did a very poor job

By what metric? I didn't realize anyone had built statistical models to show
that it was inaccurate.

> It couldn't even get it right with the name of the person behind it.

Yeah, because it's machine learning. It's imperfect.

You ran circles around what you wanted to say. You implied that I was being
overly sensitive about how upset people were, but then you went on a tangent.

To address the main point of conflict head-on: making assumptions about
people based on limited data. Some people think it's okay; some people think
it isn't. I think in this situation it is fine! They aren't determining who
gets sent to prison or something; they just made a mathematical model that
makes fuzzy predictions to help with things like advertising.
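To make the "fuzzy predictions from limited data" point concrete, here is a deliberately crude sketch (purely illustrative; nothing is known about Genderify's actual model or training data): a classifier that guesses a label from nothing but the final letter of a name, trained on a tiny hardcoded list.

```python
# Toy sketch, NOT Genderify's method: guess a gender label from a
# name's final letter, using counts from a tiny hardcoded sample.
from collections import Counter, defaultdict

TRAINING = [
    ("anna", "F"), ("maria", "F"), ("julia", "F"), ("sophie", "F"),
    ("james", "M"), ("robert", "M"), ("david", "M"), ("thomas", "M"),
]

def train(pairs):
    # Count how often each final letter co-occurs with each label.
    counts = defaultdict(Counter)
    for name, label in pairs:
        counts[name[-1]][label] += 1
    return counts

def guess(model, name):
    # Return the most common label for the name's final letter, or
    # None if that letter was never seen: a fuzzy prediction with no
    # real claim to accuracy.
    letter_counts = model.get(name.lower()[-1])
    if not letter_counts:
        return None
    return letter_counts.most_common(1)[0][0]

model = train(TRAINING)
print(guess(model, "Laura"))   # 'a' seen only with "F" -> F
print(guess(model, "Alexis"))  # 's' seen only with "M" -> M
print(guess(model, "Kim"))     # 'm' never seen -> None
```

The point of the sketch is that a model like this is just frequency counting over whatever sample it was trained on; it is exactly as biased as that sample, and it fails silently on anything outside it.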

