
Porn Sites Still Won’t Take Down Nonconsensual Deepfakes - Malifalitiko
https://www.wired.com/story/porn-sites-still-wont-take-down-non-consensual-deepfakes/
======
dak1
Is there a clear bright line between a cartoon or South Park-style animation,
a hand-drawn photorealistic graphic, and an ML-generated one that should be
legislated differently from existing laws on commercial usage of
somebody's likeness vs. fair use?

If so, what is that bright line, and what unforeseen consequences might
legislating it have?

I recognize the limits on regulating free speech are weaker in the UK, where
this story originated, as opposed to in the US.

~~~
misja111
There doesn't have to be a clear bright line; even when the law seems clear,
the reality is often fuzzy. This is why we have judges: to make a ruling
in the spirit of the law. In this case, if the likeness is close enough, then
it's forbidden.

~~~
nvr219
"I know it when I see it!"
[https://en.wikipedia.org/wiki/Jacobellis_v._Ohio](https://en.wikipedia.org/wiki/Jacobellis_v._Ohio)

------
fxtentacle
What a weird and misleading article. I assume it was written by that "deepfake
detection company Sensity" to drum up business, but it certainly doesn't seem
to be well-researched.

As is, it is already illegal to use Emma Watson's "likeness" without her
approval in pretty much every Western country, including the EU and the Czech
Republic. "Likeness" here means that the fake is good enough for regular
people to recognize the actress.

For example, see Midler v. Ford Motor Co.:
[http://rightofpublicity.com/pdf/cases/midler.pdf](http://rightofpublicity.com/pdf/cases/midler.pdf)

The article continues: “Until there is a strong reason for [porn websites] to
try to take them down and to filter them, I strongly believe nothing is going
to happen,” Patrini says.

Obviously. That "strong reason" is called a lawsuit, and one is pretty easy to
organize. You might even sue for punitive damages and recoup your legal costs,
meaning a celebrity successfully suing could cost the site as much as $350,000
per video. I'd say that is a very strong reason to take things down, if - and
apparently only if - the person requesting the takedown follows legal
procedures.

~~~
yasp
Would this also apply to real porn actresses who happen to look like Emma
Watson?

~~~
fxtentacle
[https://www.casebriefs.com/blog/law/intellectual-property-la...](https://www.casebriefs.com/blog/law/intellectual-property-law/intellectual-property-keyed-to-merges/state-intellectual-property-law-and-federal-preemption/midler-v-ford-motor-co/)

"Under California law, intentional imitation of a celebrity’s distinctive and
widely known voice for commercial purposes constitutes tortious
misappropriation."

Intentional is the key word here. So I presume that if you look like Emma
Watson and go with your real name, you're fine. If you pretend to be Emma
Watson, then it becomes infringing.

~~~
yasp
So then ostensibly a deepfake that makes clear it's not actually Emma Watson
(not pretending), and therefore only shares a likeness, would be legal?

~~~
fxtentacle
Depends on your intention. If you produce the video with the intent to profit
off her fame - and that will certainly be assumed if you use her name in the
tags or keywords - then it's clearly illegal.

But if the video says "this is a deepfake" and is not showing up in search
results for her name, then it should be fine.

It's just that in the latter case, probably nobody is going to watch it, so
the discussion is moot anyway.

~~~
chr1
In the latter case users could write the name in comments.

------
barry-cotter
The kinds of people who work for Pornhub aren’t going to have moral qualms
about this[1] and there’s no way you can ban deep fakes as such under the
First Amendment. Either there will be regulation of porn on grounds of
obscenity or the current free for all will continue but there’s no way deep
fakes will be banned and the rest of porn left alone.

[1] [https://fightthenewdrug.org/pornhub-refused-to-remove-videos...](https://fightthenewdrug.org/pornhub-refused-to-remove-videos-of-this-minors-sexual-assault/)

[https://www.bbc.com/news/amp/stories-51391981](https://www.bbc.com/news/amp/stories-51391981)

~~~
fxtentacle
Your source [1] doesn't exactly appear neutral.

I got a popup that said "PORN KILLS LOVE" TM copyright by "Fight the New Drug,
Inc" and then I thought "ooh this is propaganda" and didn't actually read the
article.

~~~
agentdrtran
That source does suck, but this has happened to multiple women.

------
adamnemecek
The article mentions a Czech holding company owning some of these sites. The
amount of porn, both straight and gay, produced in Czech Republic is
astounding. You can't live in Prague without being harassed by the industry. I
once received a cease-and-desist letter after physically pushing away some
porn producers who were harassing people on the busiest street in Prague.

[https://www.reddit.com/r/europe/comments/arhzub/pornstars_pe...](https://www.reddit.com/r/europe/comments/arhzub/pornstars_per_capita_in_europe/)

~~~
asimpletune
Why is there so much porn in Czech Republic?

~~~
severine
[https://duckduckgo.com/?t=ffab&q=+Why+is+there+so+much+porn+...](https://duckduckgo.com/?t=ffab&q=+Why+is+there+so+much+porn+in+Czech+Republic%3F+&ia=web)

 _edit: not trying to be snarky, there are some good links above the fold._

~~~
asimpletune
Wow, that is amazing, thanks

------
CyberRabbi
As a regular porn watcher myself and in many cases a defender of free speech
(except in the case of hate speech coming from fascists) I still have to admit
that the destructive capability of deep fakes of public figures is deeply
concerning. Female politicians like AOC should not have to deal with malicious
actors spreading pornographic deep fakes. Some may remember the countless lewd
doctored photos of Hillary Clinton that spread in conservative circles. It
only stands to reason that this helped the current president win, and in
general makes it harder for all female politicians to win. This doesn’t seem
to affect male politicians nearly to the same degree.

Simultaneously these porn companies are making exorbitant amounts of money
from obviously illegal content. Something needs to be done but at the same
time I worry this could be a political opportunity for a conservative to push
forward an internet censorship bill.

~~~
throwaways885
While this is abhorrent, the likely end result is that people will stop
trusting video they see online. That, combined with sites like
thispersondoesnotexist.com, might actually increase people's personal privacy
online. We might end up in a state where employers stop researching candidates
through their apparent Facebook profiles.

~~~
ouid
This view seems dangerously, irresponsibly optimistic.

~~~
asdfasgasdgasdg
Really? To me, it seems not optimistic but inevitable. I don't think anyone
believes Emma Watson has actually done a porn, despite her appearing in the
deepfake mentioned in the article. Being deepfaked is unlikely to affect
anyone's real-world prospects in any way. That being said I can understand
that it would still be hurtful and I'm open to having the content more tightly
regulated.

~~~
ouid
I just don't understand how anyone can live in the same world I do and have
anything nice to say about the average person's credulity.

~~~
asdfasgasdgasdg
Are you saying that you think folks believe Emma Watson has done a porn, or
something else? I mean, the claim I'm making is empirical. If you went out and
asked people, you could know the answer.

~~~
ouid
You have made two claims directly, and asserted their equivalence as a third
claim.

Most people have not seen the Emma Watson porn, and those who have can
probably read the word "deepfake" in the title. My guess is that between 5%
and 20% of the people who have seen it, and found it by searching for "Emma
Watson" rather than "Emma Watson deepfake", believe that they have seen the
actual Emma Watson in porn.

But ultimately, you said that you didn't believe it was irresponsible to
assert that deepfakes will make society more skeptical of video. That is your
claim. I posit that deepfakes can easily be used to produce videos that are
closer to what their audiences expect than reality is, and as a result those
audiences will likely be _less_ suspicious of what they see.

------
ur-whale
I wonder what the law says when the "fake" is produced by a human artist,
typically a hand drawing or a "hand-made" 3D model of an unwilling subject.

~~~
nvr219
It's because the deepfake is literally trying to be a forgery of a real
pornographic act, where a hand drawing is obviously not a forgery of a real
pornographic act.

~~~
convery
Have fun trying to get a legal definition for that. The evil machine just
copies what it has seen before to create a composite, while the human
3D artist just draws on what they know. Not to mention that the rules around
what's considered 'art' are extremely murky; e.g. the band Scorpions featured
a nude minor on one of their album covers, which is fine "because art", while
people have gone to jail for possessing hentai featuring underage fictional
characters.

------
Invictus0
The ability to create a realistic pornographic photo using someone else's
likeness has been around for a long time; why does the medium of video make
things different? We've got to let go of our notion that video is reality and
can't be faked. The time when that was true is definitely gone.

~~~
tambourine_man
There's a difference between being possible and trivial. What once required
professional artists can now be done by kids with lots of free time. That
changes the dynamic and new problems arise.

~~~
echelon
This is just the progression of technology.

Photoshop made a lot of stuff accessible that previously wasn't.

In a few decades, we'll merely tell the computers what we want to see and
they'll make it for us.

In fifty, they'll infer what we want and dynamically adjust.

In a hundred years, all reality will be constructed.

The trend is bigger and more important than the fear.

~~~
tambourine_man
Sure, and Photoshop doesn’t allow you to scan money, for instance. It’s still
possible to counterfeit, just a bit harder.

We can choose to embrace this change or take measures to make it less trivial.

------
commandlinefan
So it seems to me that there are two types of people: those who don't care if
they're seen naked and those who do. For those who don't care, this wouldn't
be a problem, and for those who do, it actually works in their favor if it
becomes widespread: if their personal pictures are leaked, they have perfect
plausible deniability.

~~~
not2b
Many who are willing to appear naked expect to be paid for it, just as an
athlete who lets his image be used in an ad isn't giving everyone else
permission to photoshop his image into their ads for free.

~~~
falcolas
> Many who are willing to appear naked expect to be paid for it

Reddit would demonstrate otherwise. As would Burning Man, nude beaches, and
any number of other locations where nudity is allowed/expected.

