
Facial recognition service becomes a weapon against Russian porn actresses - georgecmu
http://arstechnica.co.uk/tech-policy/2016/04/facial-recognition-findface-used-against-russian-porn-actresses/
======
devereaux
_”Dvach” (2chan) launched a campaign to deanonymize actresses who appear in
pornography. After identifying these women with FindFace, Dvach users shared
archived copies of their Vkontakte pages and spammed the women’s families and
friends with messages informing them about the discovery. (...) The Internet
users behind the doxing campaign say their motivation is moral outrage,
claiming that women in the sex industry are “corrupt and deceptive.”_

Personally, I find the people behind the doxing campaign much more corrupt and
deceptive.

~~~
PascalsMugger
Absolutely, but this incident is the perfect example of how new technology
will always be used by assholes to do asshole things, along with whatever good
uses it was designed for. We should always ask "what would 4chan do with this
capability?" when inventing new things.

------
jbob2000
Facial recognition has all sorts of problems, but I think it's funny that the
first problem we have with it is related to sex. Like, we have all this great
technology, but at the end of the day, we're still a bunch of hairless monkeys
(or ugly-bags-of-water, if you're into obscure references).

~~~
disremembered
It is funny how many technologies are introduced to the mass market by
relating to sex. As smart as we might be, survival and reproduction are still
core genetic imperatives.

~~~
hartpuff
> It is funny how many technologies are introduced to the mass market by
> relating to sex.

This kind of claim is often made (often by smut mongers), but I don't remember
(though may have forgotten) ever seeing any examples that really back up the
claim. Can you list some?

~~~
tmptmp
> This kind of claim is often made (often by smut mongers), but I don't
> remember (though may have forgotten) ever seeing any examples that really
> back up the claim. Can you list some?

Let me try: not just technologies; even religions have used sex to sell
themselves. Look at the kinds of things that are sold (unsubstantiated
guarantees of carnal pleasures in heaven after death, if you follow the code
of conduct prescribed by God(s) via his appointed messengers) and you will see
that a very large part of this promise is some kind of abundant sex (akin to
unlimited download/upload). See for example this sales pitch. [1]

[1]
[http://www.youtube.com/watch?v=HdzusekB8cg](http://www.youtube.com/watch?v=HdzusekB8cg)

~~~
hartpuff
Thanks for your reply. That's an argument about 'sex sells' though. Like car
exhibitions or Microsoft Xbox conventions using scantily clad women.

What I meant (and I _think_ the parent of my post was talking about, though I
could be wrong) is the old chestnut that the sex/adult industry innovated or
popularized all kinds of technologies.

~~~
disremembered
Sex sells far more than its own depiction.

------
egjerlow
This is the second article in a short while I've seen about this FindFace app:
[https://news.ycombinator.com/item?id=11491264](https://news.ycombinator.com/item?id=11491264)

I am still amazed at how it is possible to match people in the way this app
purports to do. Does anyone know of more accessible (to english speakers) code
that can do roughly the same?

~~~
tyingq
\- This is open source, but I have no idea if it's anywhere near the FindFace
capability: [http://openbiometrics.org/](http://openbiometrics.org/)

\- Not open source, but it has a demo of face searching
([http://www.faceplusplus.com/demo-search/](http://www.faceplusplus.com/demo-search/)),
and an api.

~~~
asteadman
I'm not sure how leading-edge the openbr stuff is; AFAICT (I'm not a domain
expert, or even a novice), it's pretty primitive. If you were starting from
scratch today, I'd say the openface project
([https://cmusatyalab.github.io/openface/](https://cmusatyalab.github.io/openface/))
is a better starting point.

Having a DNN that maps images of faces into a point on a hypersphere seems
eminently practical. I'm a bit surprised that it works, but having something
like that is awesome.
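
The hypersphere idea is easier to see in code. Here's a minimal sketch (not
FindFace's or OpenFace's actual implementation) of how matching works once you
have embeddings: project each embedding onto the unit hypersphere, then compare
Euclidean distances. The random vectors below are hypothetical stand-ins for a
real network's 128-d output.

```python
import numpy as np

def normalize(v):
    # Project an embedding onto the unit hypersphere.
    return v / np.linalg.norm(v)

def face_distance(a, b):
    # Euclidean distance between unit-norm embeddings. OpenFace-style
    # pipelines threshold this distance to decide "same person or not".
    return np.linalg.norm(normalize(a) - normalize(b))

# Hypothetical embeddings standing in for real network output.
rng = np.random.default_rng(0)
anchor = rng.normal(size=128)
same_person = anchor + rng.normal(scale=0.05, size=128)  # small perturbation
different = rng.normal(size=128)                          # unrelated face

print(face_distance(anchor, same_person))  # small
print(face_distance(anchor, different))    # large (near sqrt(2))
```

Searching a database is then just a nearest-neighbor query over these points,
which is why the hypersphere mapping is so practical.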

~~~
tyingq
This post seems to indicate they aren't that far apart:
[http://bamos.github.io/2016/01/19/openface-0.2.0/](http://bamos.github.io/2016/01/19/openface-0.2.0/)

I am, though, in the same boat as you in this domain... not even a novice.

~~~
asteadman
That post is great! Thanks for the link; I hadn't stumbled across it yet.

From what I can tell from that link, though, it is exactly as I suspected:
openface 0.2 significantly outperforms openbr on the LFW test set (whereas
openface 0.1 was largely equivalent). Moreover, I think there is clearly room
for drastic improvement with the openface architecture, either by tweaking the
layout of the NNs or simply by throwing more data at it. It is not clear to me
how openbr could be improved.

------
amelius
I have this theory that the more beautiful people are, the more likely they
are to have a look-alike. So given that these actresses are beautiful, this
algorithm will probably generate a lot of false-positives.

~~~
amelius
The theory is actually quite intuitive, and you can apply it, for example, to
flowers. The chances of finding two flowers that look the same are much higher
if the flowers are beautiful than if they are not.

------
disremembered
Facial recognition is here, alongside pervasive surveillance. Voice and gait
recognition probably aren't too far behind.

> "In theory," Tsvetkov told RuNet Echo, "this service could be used by a
> serial killer or a collector trying to hunt down a debtor."

If anything, this technology will make serial killers easier to catch, right?

~~~
jessaustin
As seen on HN again today [0], everything that law enforcement touches turns
to shit. They will take a _perfect_ "gait recognition" system, and within a
month they'll decide it's too conservative and start dialing back the match
parameters until they get a list of twenty suspects every time they use the
system. Then whoever among that unfortunate list is least able to prove an
alibi (i.e. the poorest person) will go directly to prison.

[0]
[https://news.ycombinator.com/item?id=11578240](https://news.ycombinator.com/item?id=11578240)

