
The guy who made a tool to track women in porn videos is sorry - theBashShell
https://www.technologyreview.com/s/613607/facial-recognition-porn-database-privacy-gdpr-data-collection-policy
======
orthoxerox
Russian imageboards found VK profiles of quite a few porn actresses and
prostitutes using findface.ru. Of course, they immediately started harassing
them and sending messages to everyone in their friends lists.

However, ethics aside, the cat is out of the bag. People can shame the guy
into deleting the data, but computing power will only get cheaper and data
matchers faster. Sooner or later someone will do this again and launch a Tor
site with an up-to-date database. We have to adjust to the idea that if you
have a social media profile, _any_ public images of you will lead people to it
sooner or later. And it will be sooner, not later, with every passing year.

~~~
sametmax
Yes, but that shouldn't prevent society from also making those particular
uses of tech very costly. It should be considered like carrying a concealed
weapon: you need a permit, or you get jail time.

A permit shouldn't be hard to get, as long as you can justify a professional
or research need.

But letting this go means creating a society I don't want to live in. I
already hate, living my anonymous life, when some stranger takes my photo
without asking. It happens more and more.

On the other hand, people filming the police get into more and more trouble.

We got things reversed.

~~~
CydeWeys
How is this possibly enforceable though? The software is already out there and
available. In countries that restrict guns, it's often quite hard to get one
outside the legal channels (if there are any), so enforcement is pretty
effective. But for something you can just download??

~~~
ashelmire
Not just download, but write yourself in a short time. And the code could be
shared, and I’d guess you can build a decent system to do this in a few
hundred lines.

It’s a ridiculous proposition to restrict software access, akin to making it
illegal to write on paper.
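To make the "few hundred lines" claim concrete: once you have face embeddings from any off-the-shelf encoder (dlib, FaceNet, etc.), the matching core really is only a handful of lines. A minimal sketch; the `build_index`/`best_match` names and the 0.6 threshold are illustrative, not from any particular library:

```python
import numpy as np

def build_index(embeddings):
    # Stack known-face embeddings into one matrix and L2-normalize the rows,
    # so that a single dot product gives cosine similarity.
    mat = np.stack(embeddings).astype(np.float64)
    return mat / np.linalg.norm(mat, axis=1, keepdims=True)

def best_match(index, query, threshold=0.6):
    # Normalize the query embedding and find the most similar indexed face.
    q = query / np.linalg.norm(query)
    sims = index @ q
    i = int(np.argmax(sims))
    # Below the threshold, report no match rather than a bad one.
    return (i, float(sims[i])) if sims[i] >= threshold else (None, float(sims[i]))
```

Everything hard (the encoder, the crawling) is already packaged; the rest is bookkeeping like this.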

~~~
jeanmichelx
It's ridiculously easy to write a DoS tool. Let's make DoSing legal then.

~~~
CydeWeys
The two aren't the same thing at all. It's legal to own guns, but illegal to
shoot someone with one. It's legal to write software that opens network
connections, but illegal to use said software against a target on the Internet
that doesn't want said traffic.

Similarly, it's legal to write or download a neural network that is programmed
for facial recognition. Same for a basic crawler bot.

------
samat
The same story about the Findface project was big news in Russia several
years ago. They scraped the whole of vk.com (a Russian Facebook copycat with
almost the entire Russian population on it) and applied some very fine ML to
find a person by his/her face.

2ch (a 4chan ancestor) users identified girls from porn and harassed them,
outing them to their families and friends using this service.

Later the company discontinued this service and now sells the technology to
the Russian government to monitor CCTV footage from the Moscow subway.

VK kind of fixed the data-scraping issue (not sure if they actually changed much).

Usually Russia copies everything from the West; it's interesting to see
Western news follow Russia's lead after a few years. Not saying I am proud of
mother Russia in this case, just an observation.

------
bem94
It's things like this that make me think all engineers need more training on
the social consequences of the tools they build.

This is an extreme case and I hope most engineers would look at this and
understand how awful & damaging it could be; but given our influence on the
world nowadays (as a class, if not always as individuals) we really need to be
much more aware of what we're doing and how it affects people.

~~~
xwdv
The problem is that engineers are not always the ones in charge of the entire
end result. If you want to do something unethical you can just split it up
into harmless individual tasks and give it to a dozen engineers and then put
it all together. It’s obfuscation.

Some of the greatest atrocities we will see in tech (and we _will_ see them)
in the coming century will be the result of combining things created with good
intentions.

Not much we can do I’m afraid except endure.

~~~
whenchamenia
An engineer can quit if the project is unethical. Just 'enduring' is harmful
to yourself, and others. Don't be a coward, at least stand up for your values.

~~~
phkahler
>> An engineer can quit if the project is unethical.

Is creating face recognition software unethical? Your answer is not really
important, just the fact that different people will classify this differently.
I thought it was creepy as f*** when Facebook started wanting to automatically
tag people in photos. But if that's all the tech was for, it may well be
ethical, if creepy to some. And yet, face recognition is really all that was
used in this case - matching up porn images with social media ones.

~~~
close04
A while ago I suggested that engineers could have an ethics code just like
lawyers or doctors, and the opinion wasn't very popular (although no reason
was explicitly stated). I still don't see a reason not to have one. If your
company is selling your code and software to abusive states and it's used to
persecute journalists (for example), continuing to do that work is no
different from any unethical work by a doctor or lawyer. And software
developers have a huge impact on our lives these days with their work. It's
reasonable to hold them to higher standards than in the past.

~~~
UncleEntity
The problem with that is that anyone with a little Python-fu can hack together
something like this without being an "engineer". I could probably do it using
"ethical" FLOSS code, and I'm pretty far from being employed in the industry.
Hell, if I did do something like this and someone complained to my company,
they'd probably promote me, since our IT dept is iffy on the best of days.

~~~
close04
Sure, the “home-made” option is hard to regulate in any field. You also can’t
stop people from using home remedies no matter how unethical they are. So
people can take the code and use it with little recourse against them.

But I was thinking more of FAANG, for example. They could be required to hire
“certified ethical” engineers, at least on sensitive projects at first. This
means that even if the code is out there, no coder would implement it in the
company’s solution without risking losing the “certification” and the job.
Anyone working for Hacking Team (those guys selling exploits to oppressive
governments) should be more or less unhireable, especially if all code in a
company’s portfolio has to be traceable to a specific engineer.

The principle is already in place for other fields. Journalism being the
closest probably. It could be made more effective if needed. It’s obvious that
_something_ is needed in this direction.

------
gmuslera
The bigger problem is not the right matches but the wrong ones, especially if
the faces are very similar. This should affect far more people than just
ex-porn actresses.

~~~
nkurz
While false-positives are definitely a problem, why do you think they are a
bigger problem? Do you feel the participants in pornography are "fair game"
but the others are innocent? In some ways, the current level of inaccuracy
could be considered an advantage, because it makes it more likely that there
will be resistance. A 100% accurate system strikes me as more dangerous, as
it's less likely for politicians or lawmakers to stand up for the victims.

~~~
keiru
Despite agreeing with your latter points, there's a reason why shrapnel damage
is comparably serious.

Although most people who did some porn are most likely not professionals, most
porn (by volume) is done by people who might even consider your finding part
of their CV. So going by sheer numbers alone, there might be many more cases
of false positives. Now, _even if_ it becomes terribly obvious how inaccurate
the system is, I don't think that matters much when it comes to weaponising
it, because people are easily fooled or don't care. As long as you can find a
vid of a girl who looks just like Clara from accounting, you can spread it as
a false rumor. Or not even claim it's her; it's just as embarrassing. And
that's even assuming you could tell them apart, because the seed of doubt
extends the problem to people who didn't even do porn.

For comparison think of deepfakes. Everybody knows it's not actually Emma
Watson, but I bet she isn't particularly happy about it.

------
hurryskurry
If the videos are published, and the social media profiles public, how can it
be illegal to match the two? Presumably, if they show their face they already
accept the risk of being recognized, unless the video was published illegally.

That said, why would someone, once a person was identified, just go about
harassing them and telling all their friends they do porn? Like, who does that?

~~~
jddj
The internet has enabled some people to deploy complex systems into New York
datacenters from a beach in Thailand in a matter of seconds.

Unfortunately, it's also enabled a selection of less socially aware and/or
socially responsible people to bypass the natural limits and filters that
society had in place, to effectively cause suffering on a fairly broad scale.

Harassing somebody on the other side of the world from the safe anonymity of
your bedroom was once logistically very difficult, as was reaching significant
numbers of people with a wildly malicious idea if you couldn't first find
people in your immediate (physical) social circle and community to vouch for
you/it.

Now, we have entire online communities which have embraced this new normal and
provided these folk with the tools and an audience.

------
bitL
Bypassing the stupid issue of harassing actresses, likely by teenagers:
doesn't a husband-to-be have a right to know about the past of his fiancée?
Does the moral right of not being outed outweigh the moral right of not being
deceived in a very important matter? Where is the line?

~~~
jonnycomputer
What right of the husband-to-be are you talking about?

Is failing to tell someone something the same thing as deception?

Is this supposed right symmetric? Does the bride-to-be have the right to know
if the husband participated in a porn video?

~~~
bitL
> Is failing to tell someone something the same thing as deception?

Yes, absolutely. Imagine you are the only person who doesn't know your fiancée
was in porn and you committed to her without knowing it.

> Is this supposed right symmetric? Does the bride-to-be have the right to
> know if the husband participated in a porn video?

Of course. Even though the social stigma for males is lower, it has a
devastating impact on their business relationships.

~~~
jonnycomputer
You still haven't actually answered the question of what "right" there is
supposed to be.

If a person is afraid that their fiancée might have done porn in the past, or
if they care so damn much, they should just ask the person.

~~~
bitL
If people were honest with each other all the time, we wouldn't need laws.
Let's be generous and assume your question is just naive - could you ever
envision a situation where an innocent husband-to-be (e.g. a nice, do-no-harm
person with reasonable means) is taken advantage of by a clever woman (with
what is perceived as a shady past in the given era) for her own material
benefit? Do you think this never happens (hint: literature is full of such
situations, and I'd recommend you investigate common causes of suicide)?
Doesn't the husband-to-be have a justified interest in protecting his life,
reputation, and possessions from the baggage the unvetted fiancée brings? Do
you think the fiancée wouldn't lie as much as possible in order to protect her
planned future?

If you truly are just naive, please think about the points I mentioned; if
not, there is no point discussing this, as your mind is set in stone anyway.

~~~
jonnycomputer
You literally said "everyone". If your best friend knows, and your Mom knows,
and your neighbor knows, and your dog knows, and your boss knows, but you
don't? Then you're either a fool or no one likes you.

But note: the "you" in that sentence isn't you. I'm not saying you are a fool.
Then why did you say I was? "Let's be generous and assume the question is just
naive."

Again, you've never established a right. And there is also this horrible
assumption or belief that doing porn once forever marks or stains you. Well,
certainly some people believe that. But I do not, and I challenge it.

------
jonnycomputer
I'm not sure I understand the GDPR argument here. Is collecting the facial
information from actors and actresses in porn movies illegal? Or was it the
access of their social media accounts? Why is the tallying of what is publicly
consumable a violation of GDPR?

~~~
GuB-42
In France, we have something called "droit à l'image". It means that your
image can't be used without your authorization, with a few exceptions.

If you give authorization, it has to be for a specific purpose, not a
generalized one. So just because you gave authorization to appear in a movie
doesn't mean you gave authorization for other purposes, such as face matching.
Same thing for social media.

There are also copyright issues. In France, you always own the copyright
(droit d'auteur); you may transfer the exploitation rights, but the work is
still yours. It means you still retain some level of control, in particular
regarding decency.

France is just one European country, but this shows that there are limits to
what you can do, even with publicly available data.

~~~
internet_user
What if your image has been cached by a CDN outside of France, and then
downloaded and matched by someone who never interfaced with a French IP? Who
is at fault here?

~~~
GuB-42
IANAL, but I don't think CDNs and IPs matter here. The question would be more
along the lines of "was the picture taken in France" or "was the matching done
in France, or by a company that has a presence in France".

And unless everything happens in France, it would be a huge mess and I don't
think anything could be done. GDPR is an attempt to harmonize the rules of
different countries and make them enforceable internationally.

I don't know how it is done in other European countries, but rules regarding
personal data and authorship tend to be stronger than in the US, and freedom
of speech is more controlled (e.g. hate speech, libel, ...).

------
zamadatix
What are the GDPR repercussions for an open source project that does not make
money, has no ties to an organization, holds only code, and handles no data?

~~~
kyberias
If there is no data, obviously GDPR doesn't apply.

~~~
zamadatix
I'm more focusing on this section of the article, where the legal source says
two interesting things:

"just collecting the data is illegal if the women didn’t consent, according to
Börge Seeger, a data protection expert and partner at German law firm Neuwerk.
These laws apply to any information from EU residents, so they would have held
even if the programmer weren’t living in the EU."

#1: The programmer is implicated (it's possible this is in the context of "the
programmer" also being "a user in Europe").

#2: Even if they weren't living in the EU.

#2 is the particularly interesting one, and it also opens up the general
question of "what happens if <external group> collects data but does not
handle monetary transactions in Europe or deal with money abroad?". GDPR is
very much written around how to financially wrangle companies in; what if it's
not a company?

------
majkinetor
Not sure how this relates to GDPR - the arguments are nonsensical. People tend
to lump everything under GDPR now like it's some form of God's commandment. If
I simply sit in a cafe continuously watching someone near me, I am collecting
data, and nobody can do anything about it.

When you leave your home, your privacy is gone.

When you make anything public, your privacy is gone. I'm not sure anything is
ethically wrong with the software's existence; the problem seems to be data
usage, just as if somebody found out his neighbor's wife is in porn and
announced it publicly on Twitter with a few pictures as proof - and people do
that.

It's naive to think that this thing can or will go away; if anything, because
of this news alone, who knows how many more random programmers are retrying
this (or even making business plans) at this very moment.

~~~
Aengeuad
This has everything to do with the GDPR, because the GDPR outlines the
conditions you need to meet to process data. Simply put, actually gathering
the data and using this tool is not legal within the EU.

>If I simply sit in the cafe watching someone continuously near me, I am
collecting data, and nobody can do anything about it.

But the tool is doing more than that: it's also using data gathered from
crawling social media profiles and then matching people, which you cannot do
without specific consent under the GDPR.

>When you leave your home, your privacy is gone.

>When you make anything public your privacy is gone.

The GDPR provides controls on privacy. It does not say you are guaranteed
privacy in every situation, but that if somebody misuses your data in a way
that violates your privacy, you have rights to correct this misuse, e.g., the
right to opt out and the right to erasure of data, among others.

~~~
majkinetor
> the right to opt out and the right to erasure of data among others.

The algorithm doesn't have to collect any data on you besides your image,
which is not considered private data given that you need to carry one. It can
take a single image and compare it to the porn collection without storing any
results anywhere, just returning 'XXX was a porn star in YYY'.

The tool could also have an opt-out mechanism, which would have to be pretty
complex, since putting ZZZ on a name blacklist isn't an appropriate solution
given that many people share that name.

> But the tool is doing more than that as it's also using data gathered from
> crawling social media profiles and then matching people, you cannot do this
> without specific consent under

Implementation detail. This can be totally avoided. The tool can accept a
picture and return the result.
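The stateless "picture in, answer out" variant described here could be sketched as follows. The corpus entries, the `XXX`/`YYY` placeholders, and the 0.9 threshold are illustrative, and the query embedding is assumed to come from some face encoder; whether statelessness alone would make the processing lawful is exactly what the replies dispute.

```python
from dataclasses import dataclass
from typing import Sequence, Tuple

@dataclass(frozen=True)
class CorpusEntry:
    stage_name: str               # performer's alias, not a legal name
    title: str                    # video the embedding was taken from
    embedding: Tuple[float, ...]  # precomputed face embedding

def cosine(a, b):
    # Plain cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

def lookup(query_embedding, corpus: Sequence[CorpusEntry],
           threshold: float = 0.9) -> str:
    # Stateless query: compare one embedding against a fixed corpus and
    # return a formatted answer; nothing about the query is stored.
    best = max(corpus, key=lambda e: cosine(query_embedding, e.embedding))
    if cosine(query_embedding, best.embedding) < threshold:
        return "no match"
    return f"{best.stage_name} was a porn star in {best.title}"
```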

~~~
Aengeuad
>The algorithm doesn't have to collect any data on you besides your image,
which is not considered private data given that you need to carry one.

The GDPR doesn't care about 'private' data (for the most part) as it only
cares about personal data, and faces absolutely are considered personal data
under article 4 point 14. Faces are even in a special category of data which
you are prohibited from processing at all unless some conditions are met,
although article 9 point 2e allows processing such data if they have been
'manifestly made public'. The GDPR doesn't define what 'manifestly made
public' means but the Scottish Parliament[0] suggests that it could mean
images purposely uploaded to social media and made public, but note that just
because you are not prohibited from processing such images does not mean you
have a lawful basis for processing that data in the first place.

>It can take a single image and compare it to the porn collection without
storing any results anywhere, just returning 'XXX was a porn star in YYY'.

>Implementation detail. This can be totally avoided. The tool can accept a
picture and return the result.

This is an implementation detail that I had not considered. The original tool
and article talk about crawling both porn sites and social media, and both are
fraught with legal issues: not every video or image on porn sites is going to
be legal in the first place (revenge porn, stolen images), never mind the
copyright issues involved, and crawling social media for this information is
simply not acceptable under the GDPR.

>The tool could also have an opt-out mechanism

Such a tool would almost certainly need to be opt-in for the data subject;
article 6 makes this clear: you need specific consent from the data subject or
some other conditions which aren't applicable here. The best-case scenario, if
you want to operate such a tool in Europe, is that you argue (in court, mind)
that you have a legitimate interest in running such a tool which does not
override the fundamental rights and freedoms of the data subject - which does
not seem like it would go in your favour at all.

[0] Page 6:
[https://www.parliament.scot/S5ChamberOffice/2018_06_01_Motio...](https://www.parliament.scot/S5ChamberOffice/2018_06_01_Motions_Guidance_FINAL.pdf)

~~~
majkinetor
Great comments.

Consider, however, the variant I proposed (picture in, result out). Would that
incur a penalty under the GDPR if the photo is already public and not used for
personal identification?

Perhaps it could be said that any algorithm that makes a person identifiable
is problematic. However, a random photo of you doesn't identify you, AFAIK,
and porn star names are also made up, so you are connecting two
non-identifying things, so the end result must also be non-identifying?

~~~
Aengeuad
>Consider, however, the variant I proposed (picture in, result out). Would
that incur a penalty under the GDPR if the photo is already public and not
used for personal identification?

>However, a random photo of you doesn't identify you, AFAIK, and porn star
names are also made up, so you are connecting two non-identifying things, so
the end result must also be non-identifying?

I'd say this would still be against the GDPR. The porn image (assuming a
professional production and not revenge porn/stolen images) and an image from
social media would both be public images of course, but under the GDPR you
still need a lawful basis for processing data as defined by article 6 [0], one
such basis for this is that the data subject concerned (i.e., the person in
the photo being uploaded) gives specific consent for this purpose. The only
way I can see such a site being legal is if you're providing a service to
allow people to upload their own images to see if porn of them is being
uploaded without consent, however you would need to ensure that the person in
the image is the one consenting to that use as you would be liable if other
people were uploading those images. It may also be legal to run such a tool
for purely personal reasons as the GDPR does not apply to personal activities
[1], but it would be illegal to make this available to other people and you
would still have other legal issues with such a tool (like copyright).

[0] [https://gdpr-info.eu/art-6-gdpr/](https://gdpr-info.eu/art-6-gdpr/)

[1] [https://gdpr-info.eu/art-2-gdpr/](https://gdpr-info.eu/art-2-gdpr/)

------
kabwj
Why should pornography actors not have responsibility for what they do? I
don't agree with the idea that people who 1) put their faces in pornography
and 2) put their faces on social media should not expect the two things to be
correlated. Especially since the technology has already existed for some time
and cannot be stopped. Are we looking for regulation here? Is that exactly
what we want - going back to cryptography export laws?

~~~
Youcandothis
> Why should pornography actors not have responsibilities for what they do?

They should have a right to privacy.

~~~
hurryskurry
How is the right to privacy being violated, though?

Imagine you were a stripper or porn actress; your right to privacy wouldn't
preclude someone recognizing you at the strip club or in a video.

Or suppose you went to a strip club or watched a video and then also happened
to be on Facebook and saw the profile of someone you had seen stripping at the
club or in the video, matching the face and name in your head.

How is the slow case fundamentally different from the case where software is
used and the throughput and speed are much higher?

I mean, would it be illegal for someone to just start going through Facebook
profiles by hand trying to find an actress they just saw?

~~~
jadell
What if we're not talking about strippers or porn actresses? What if we're
talking about some random person who let a partner take photos/videos of them,
in confidence and in private?

~~~
majkinetor
In my opinion, if you let your partner do that, you deserve it. If you do
that, and later act surprised when such a situation happens, you are literally
an idiot who doesn't know what age you live in.

So, understand what your fetish means and how it can backfire. Every age will
have its own problems like this.

