
Neural nets more accurate than humans at detecting sexual orientation in images - helloworld
http://psycnet.apa.org/record/2018-03783-002
======
eboyjr
This has been posted multiple times on HN in the past 6 months:

- [https://news.ycombinator.com/item?id=15198997](https://news.ycombinator.com/item?id=15198997)

- [https://news.ycombinator.com/item?id=15197287](https://news.ycombinator.com/item?id=15197287)

The author of this paper has created a document that addresses:

(1) Summary of the findings

(2) You must be wrong – this is pseudoscience! (common criticism of this
paper)

(3) Our response to an irresponsible press release by GLAAD and HRC; or, a
much better response by LGBTQ Nation

You can find that here:
[https://docs.google.com/document/d/11oGZ1Ke3wK9E3BtOFfGfUQuu...](https://docs.google.com/document/d/11oGZ1Ke3wK9E3BtOFfGfUQuuaSMR8AO2WfWH3aVke6U/edit#)

------
erispoe
The fact that the authors of the paper rush to innate explanations (hormone
levels in the womb...) that they have no expertise to comment on, instead of
much more obvious behavioral ones (maybe self-labelled gay and straight people
upload different types of pics or take care of themselves differently, on
average), is a clear violation of Occam's razor.

The "average gay face" is similar to the "average straight face" with glasses
on, a better angle, and a couple of pounds less.

~~~
lxw
Exactly. The authors don't seem to have any basis for their explanations, and
the title should really be "neural nets more accurate... on dating website
images". Still this is significant, because these are the same images a
government or business could potentially access for their own ends.

------
skjerns
This is probably the best article to read after reading another one of those
'detecting sexual orientation from dating profile pictures/prison
inmates/etc.' papers:

[https://medium.com/@blaisea/do-algorithms-reveal-sexual-orientation-or-just-expose-our-stereotypes-d998fafdf477](https://medium.com/@blaisea/do-algorithms-reveal-sexual-orientation-or-just-expose-our-stereotypes-d998fafdf477)

------
randyrand
"Given a single facial image, a classifier could correctly distinguish between
gay and heterosexual men in 81% of cases, and in 71% of cases for women"

I can guess whether someone is gay or straight with 98% accuracy: just always
guess they are straight.

The ratio of gay to straight people in the test data set is perhaps the most
important factor in determining how well the algorithm actually performs.

Are we supposed to assume the computer was shown a 50/50 split of gay and
hetero people? It appears that way. But please, tell us.
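The base-rate point above can be sketched in a few lines (the prevalence figures below are hypothetical assumptions for illustration, not numbers from the paper):

```python
# A classifier that always predicts the majority class scores well on
# skewed data without learning anything; the reported 81%/71% figures
# only mean something relative to the test set's class split.

def trivial_accuracy(prevalence_gay: float) -> float:
    """Accuracy of a classifier that always predicts 'straight'."""
    return 1.0 - prevalence_gay

# At an assumed ~7% prevalence, "always straight" already scores ~93%,
# beating the paper's headline number. On a balanced 50/50 test set,
# the same trivial classifier drops to 50%, and 81% beats chance.
population_like = trivial_accuracy(0.07)  # ~0.93
balanced_split = trivial_accuracy(0.50)   # 0.50
```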

~~~
r_singh
If the test data does not have a 50/50 split (or something close to that
ratio), the headline is a straight-up lie.

Skewed test data is the most common problem with research, reminds me of this
great video by Veritasium (Is most published research wrong?):
[https://www.youtube.com/watch?v=42QuXLucH3Q](https://www.youtube.com/watch?v=42QuXLucH3Q)

------
Animats
The data used is from dating web sites. That's not a good sample source.

~~~
im3w1l
Maybe not, but it's still impressive that they beat humans, no?

~~~
krick
Well, assuming the data isn't good, then no, obviously. What would "beating
humans" even mean in that case? Imagine you have a dataset of dog and squirrel
pictures, with "dog" and "squirrel" (or, actually, "elephant" and "shark")
labels assigned quasi-randomly. You make humans and NNs compete at guessing
which label you assigned. The NN beats the humans. Now, what the fuck does
that result even mean?

~~~
bryanrasmussen
yeah, but I guess the data isn't bad in that way: the sexual orientation of
people on the site is probably labeled accurately (in fact, I would expect
it's more accurate there, since you don't want to lie about what you want on
a dating service). The data is bad in that it is perhaps not a representative
sample of the population as a whole?

~~~
erispoe
What do you mean by "labelled accurately"? The AI only predicts how an
individual self-labelled on a fairly public dating service, not how this
person feels or behaves in any other way. There are so many headless torsos on
Grindr (or any other gay hookup app) because many men don't want to self-label
as gay when they look for sex with other men. These men could very well be
looking for women on Tinder.

~~~
bryanrasmussen
ok, I hadn't considered that.

------
helloworld
Some additional context and opinion on this:
[https://www.washingtonpost.com/opinions/ai-gaydar-could-compromise-lgbtq-peoples-privacy--and-safety/2018/02/19/172156bc-126d-11e8-9065-e55346f6de81_story.html](https://www.washingtonpost.com/opinions/ai-gaydar-could-compromise-lgbtq-peoples-privacy--and-safety/2018/02/19/172156bc-126d-11e8-9065-e55346f6de81_story.html)

------
techrich
They will be using it to detect criminals next.

------
abusoufiyan
But why does it matter? What could it possibly be used for? I can't think of a
use for this which wouldn't be crossing a line / inappropriate / dangerous.

~~~
JorgeGT
Homosexual relations are illegal in ~70 countries, and punishable by death in
~10 of those, such as Iran, Saudi Arabia, Brunei, Yemen, etc. Governments in
these countries could run this software on their databases, rooting out and
prosecuting or even executing those identified. So I would say this kind of
software merits discussion.

~~~
barry-cotter
The governments in these countries are evil, not stupid. The false positive
rate is too high for this to be useful for anything but a really coarse,
"maybe check this guy out" level of screening.

All of the countries you name bar Yemen are easily capable of rooting out and
destroying their gay communities if they really wanted to. They don't. What
they want is to make sure everyone involved is very discreet. Hypocrisy is the
homage vice pays to virtue, etc.
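The false-positive point can be made concrete with Bayes' rule. This is a hedged sketch: the sensitivity/specificity/prevalence numbers below are illustrative assumptions, not figures from the paper.

```python
# When a classifier is run across a whole population where the trait is
# rare, even decent accuracy yields mostly false alarms among those
# flagged. P(actually gay | flagged) follows from Bayes' rule:

def positive_predictive_value(sensitivity: float,
                              specificity: float,
                              prevalence: float) -> float:
    """Fraction of flagged individuals who actually have the trait."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Assuming 81% sensitivity, 81% specificity, and 5% prevalence,
# only ~18% of flagged people would actually be gay; roughly four
# out of five flags would be false positives.
ppv = positive_predictive_value(0.81, 0.81, 0.05)  # ~0.18
```

This is why the same classifier that looks impressive on a balanced test set is only useful for coarse screening at population scale.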

