
The Toronto Declaration: Protecting Rights to Non-Discrimination in AI - apeace
https://www.accessnow.org/the-toronto-declaration-protecting-the-rights-to-equality-and-non-discrimination-in-machine-learning-systems/
======
apeace
The other day out of curiosity I was searching for basic words on ImageNet:
cat, dog, man, woman.

Here is the page for "woman": [http://image-net.org/search?q=woman](http://image-net.org/search?q=woman)

I was surprised to notice a couple of things:

\- There are categories for "mistress", "slut", and "adulteress". The closest analogs for "man" appear to be "playboy", "stud", and "pimp". There is also a "foolish woman", and there doesn't appear to be an analog for "man".

\- Look at the pictures for "mistress". Many, if not most, of them are just women in bathing suits or doing other normal things, like lying in bed with a man. The same is true for "slut". Does this mean AIs trained on ImageNet could look at a woman on a beach and categorize her as a "slut"?

\- There is a category for "honest woman", which appears to be three pictures
of a woman standing next to a man.

I'm not saying anybody is at fault or that the people who put ImageNet
together did this intentionally. After all, it does reflect the (unfortunate)
reality of how our society often perceives images of women.

But before I poked through that dataset, it wouldn't have occurred to me how
important it is that we focus on equality and non-discrimination when it comes
to AI. This declaration seems like an interesting step.

~~~
belorn
It has a similarly horrific page for "Bad person": [http://image-net.org/synset?wnid=n09831962](http://image-net.org/synset?wnid=n09831962)

