
Is It Legal to Swap Someone’s Face into Porn Without Consent? - IntronExon
https://www.theverge.com/2018/1/30/16945494/deepfakes-porn-face-swap-legal
======
gamblor956
Professor Goldman may be an expert on IP law, but his understanding of
invasion of privacy law leaves a bit to be desired...Invasion of privacy
doesn't require an actual invasion into a person's private life. It includes
publicizing a person in a false light and appropriating their name, both of
which would apply to deep fakes.

How then do paparazzi get away with photos of celebrities?

Almost all pictures of celebrities are in _or from_ public areas where there
is no right of privacy. (If a celebrity really didn't want to be photographed
naked, they wouldn't undress in front of a public-facing window. There are
decades of court cases on this point.) The pictures that aren't, i.e., Erin
Andrews, resulted in massive court victories for the victims.

The appropriation cases are much rarer, as it generally applies to
impersonation cases. It's usually in context of name appropriation, as most
image appropriation cases thus far are pursued under different tort theories
usually involving commercial harm. (Though since _Catfish_ , image
appropriation invasion of privacy cases are becoming more common.)

~~~
norikki
>Invasion of privacy doesn't require an actual invasion into a person's
private life. It includes publicizing a person in a false light [...]

Could you explain that a bit more? I would be very shocked if something as
subjective and vague as "publicizing a person in a false light" would be an
invasion of privacy worthy of first amendment exemption.

~~~
gamblor956
It's basically defamation lite, for when the statement isn't actually false
(as in a deep fake), or the falsity is clearly expressed, but publication of
the statement still portrays the person in a false context that causes
quantifiable harm to their reputation.

Generally, any person who pursues a defamation claim over a deepfake would
likely include this claim if they can't meet the burden for defamation.

(Please don't try to pick that apart as I'm only trying to provide a very-
high-level-in-a-nutshell general overview.)

------
harshreality
Like the issue of morality of copyright infringement, this deepfake may well
be immoral/amoral in western (and some other) societies, but that's missing
the point.

This deepfake situation, and a similar situation that will occur by
synthesizing fake speech in people's own voices, is _unstoppable_. All the
complaining in the world won't stop it, because the barrier to entry will soon
be trivial. What is the proposed response? Criminalize running certain kinds
of neural nets? Or social media shaming anyone who publishes any fake (that's
detectable)?

We all just need to get used to the fact that pretty soon audio and video
recordings of people won't be verifiable without careful checking and
comparing, and eventually maybe not even then.

Look at the bright side: someone can have a recording of you saying something
crude and disrespectful and describing sexual assault, and if you're running
for president you won't even have to admit it was you anymore, because soon
anyone will be able to fake that kind of thing. Video's more difficult, but
that too might happen.

~~~
GVIrish
> Look at the bright side: someone can have a recording of you saying
> something crude and disrespectful and describing sexual assault, and if
> you're running for president you won't even have to admit it was you
> anymore, because soon anyone will be able to fake that kind of thing.

Honestly this scenario is so terrifying to me that I almost hate to even
discuss it. What happens when people can make convincing fakes like that of
religious figures and weaponize it to incite violence or install authoritarian
governments?

How much more powerful and devastating would calls for genocide be, if driven
by deepfake videos? Especially in parts of the world where people wouldn't
even be a little bit skeptical of something outrageous that they saw with
their own eyes.

Just look at the current political climate in the US. How much more polarized
and volatile would things be if someone could fake videos showing politicians
abusing children or planning a violent 'government takeover'.

If/when this tech gets good enough, it will absolutely be disruptive to
civilization.

~~~
bsenftner
It will cause everyone to suspect/question anything they have not witnessed
personally. It is a significant industry-threatening issue for journalism,
that's for sure.

~~~
krapp
>It will cause everyone to suspect/question anything they have not witnessed
personally.

No, it really won't, unless it involves them personally. If a celebrity or
politician denies it was them in "that video," most people will believe
whatever conforms to their biases about that person.

------
strait
Soon, enough people on Facebook will have climbed Everest, yachted with
billionaires, and partied with the most fashionable pop stars, that nothing on
the internet will be believable. Elaborate and unique photo/video narratives
will be purchased for having one's face and body/body type convincingly
inserted. Most of the internet media will then quickly collapse, but some of
the fake photos/videos will be pretty funny, and help us laugh away our
bitterness for a while.

~~~
jedberg
Before newspapers, stories traveled from one person to the next. Then
newspapers came and everyone said, "they can print anything they want, why
would people believe anything in them?"

And yet here we are: for the most part, people believe what they see in
reputable newspapers.

With the expansion of fake photos and videos, it will just come back to
trustworthiness.

Do you trust the source that has provided the image?

------
yodon
We don’t need a complex new body of law for this issue. It will be litigated
as a likeness rights case because those laws are simple and clear and there is
plenty of legal precedent to fall back on, including doctored images.

If it’s your face in the image you have likeness rights in the image. If a
photographer takes your photo and uses your image in a way that makes you and
your face significant in the image, they need to have a model assignment
agreement signed by you on file or they are setting themselves up for a
lawsuit and a world of pain in the event the image goes big. Yes, there are
exceptions to the likeness rights laws, but deepfake swaps are unlikely to fit
into those exceptions (and yes, California law in particular makes clear those
rights extend to dead celebrities through their families or the people their
families have transferred the rights to, and it’s very hard to have your image
go big and not have it go to California).

This is one of the things that makes the Flickr Creative Commons flag so
dangerous. The photographer is saying “you can use this commercially” but
99.9% of images on Flickr are snapshots taken by amateurs with no rights
assignments from the subjects. There have been and continue to be lawsuits
where the photographer is found liable for the commercial use of their
snapshots, without their knowledge, because they flagged a photo as CC, someone
used it commercially, and the photo subject got upset and sued.

~~~
dragonwriter
> That’s surprising to me as likeness rights will be how these issues are
> litigated because it’s simple and clear and there is tons of legal
> precedent.

Likeness rights (aka personality rights aka right of publicity) aren't all
that simple or clear, are rapidly evolving with different forces pulling them
in different directions, vary considerably between jurisdictions, etc., and,
because they are rapidly evolving and inconsistent at the state level, many
aspects of them may not be well-tested against overriding Constitutional
provisions (First Amendment concerns.)

~~~
yodon
All that you say is true, but it’s still a much simpler, more battle-tested
legal arena than most of the strategies discussed in that article (and I
suspect that the core set of retained rights is likely to remain enough to
punish deepfakers, regardless of the evolution of the case law over the next
few years)

~~~
dragonwriter
> I suspect that the core set of retained rights is likely to remain enough to
> punish deepfakers

Most personality rights / right of publicity regimes apply to _promotional_
use and a subset of _commercial_ use (California, noted frequently for having
a generous regime for such rights, still explicitly notes that content merely
being monetized with advertising is not necessarily enough to qualify for
protection.)

I don't think that covers most deepfakes.

------
mc32
This is a relevant question from the verge. If current laws are insufficient,
I'm sure we'll see some proposed and passed like those addressing revenge
porn.

My question would be whether there would be a difference between a public
person (aka famous, known) and the run-of-the-mill hoi polloi. Famous
personalities in some circumstances have less recourse in terms of privacy.

~~~
FussyZeus
IMHO we really need to get a handle on something akin to copyright to cover
our own faces and identity in general. At the moment companies (especially
social networks) can do so much stuff with the likenesses of total strangers
with zero oversight.

~~~
PeterisP
How about actors playing the roles of (still living) real people in theatre
and movies? Would they suddenly need permission?

Should the (many!) Elvis impersonators need a licence from his estate?

Impersonating others isn't a new issue, we already have a quite settled body
of law about that and centuries of weird edge cases. I don't see a radical
difference brought in by technology here. IMHO the treatment of making face-
swapped porn videos (or distributing them, which is a quite different case)
should be the same as the currently established treatment of painting (or
distributing) a realistic painting of someone you know in a pornographic pose
- whatever that treatment is.

~~~
gamblor956
_How about actors playing the roles of (still living) real people in theatre
and movies? Would they suddenly need permission?_

Actually, the way the law currently works...they might (to avoid one or more
tort cases for invasion of privacy, defamation, and/or publicity rights).
Studios will rename characters in biopics if they can't secure rights to the
person the character was based on. Like all things in the law, it is a matter
of context. A biopic or other media "based on a true story" may require the
permission of the people portrayed in it. A satire generally would not...The
law is complicated like that.

 _Should the (many!) Elvis impersonators need a license from his estate?_

Elvis is dead...The publicity rights of the dead are generally minimal and the
invasion of privacy rights nonexistent...Also, the Elvis estate is generally
okay with impersonators because for the most part they don't harm (i.e.,
defame) his image. They have gone after impersonators who they believed
misappropriated Elvis' image for defamatory purposes.

 _IMHO the treatment of making face-swapped porn videos (or distributing them,
which is a quite different case) should be the same as the currently
established treatment of painting (or distributing) a realistic painting of
someone you know in a pornographic pose - whatever that treatment is._

Under the current law, that is several torts: Invasion of privacy, publicity
rights, and defamation...

~~~
jasode
_> Studios will rename characters in biopics if they can't secure rights to
the person the character was based on._

What circumstances would require a film production to ask permission of a real
person before an actor could portray him? E.g. The film "The Social Network"
didn't ask Mark Z for permission to portray him in an unflattering way:

[https://movies.stackexchange.com/questions/2108/how-can-a-studio-make-a-biographic-film-like-the-social-network](https://movies.stackexchange.com/questions/2108/how-can-a-studio-make-a-biographic-film-like-the-social-network)

~~~
PeterisP
IMHO the distinction is that in that situation the events must be _true_. The
person can't prevent you from portraying the facts, but if you want to take
some "artistic liberty" about what happened, then you'd better either get
their permission or change their name.

------
jancsika
First off: is there any ethical or moral set of questions which apply to an
image sequence but _not_ to a single image?

~~~
yodon
You asked about ethical/moral issues that arise from image sequences but not
images. I can’t answer that, but I can say there are legal differences: for
example, synchronization rights (for synchronizing audio to video) are a huge
body of law dating back about 100 years. It’s related to, but different from,
the laws covering performance of recorded music in public spaces (which date
back to about the 1930s).

------
bsenftner
I am the author of the global "Automated Actor Replacement In Filmed Media"
patent. I wrote the patent in 2002, which was globally awarded around the
2006-2008 time frame. At that time, I was heavily trying to raise financing
for a Personalized Advertising startup - replace TV commercial actors with
your image and that of family and friends. The whole technology and production
pipeline worked, but I needed financing to make a real company. However, no
matter how I structured my company, there were no financiers willing to invest
without the option, and active desire, to produce porn first. Every single
possible investment group, from individual angels to established top-tier VC
all insisted on a Porn First strategy, even though I insisted this technology
should never be applied to such a purpose - it is a Pandora's Box that would
trigger societal issues I did not want to be the cause of. Eventually, I pivoted
away from the advertising concept, into custom game characters, but that's a
different story. Every single serious investor insisted and ultimately walked
when I refused to use the technology to produce porn. This includes almost
every major film studio and recording label too.

~~~
JBlue42
That seems at odds with the concerns an HNer brought up the other day about
their adult business and moving to SF. Most people warned them that VC is
still reticent about porn. I guess it comes down to whatever they perceive
would make them money.

As to your advertising idea, that's interesting but also brings up issues,
don't you think? Especially now with better facial recognition technology. For
example: walking down a street, a camera scans your face and pulls up a bunch
of data on you (say you've been looking for a kayak lately), and then a nearby
billboard or hologram display shows you images of yourself and your
family kayaking down the Colorado or in an ocean.

What's interesting in movies/tv is that as the technology matures, we may
never see some of our stars actually age or even what they truly look like. A
lot of time and money is already spent on beauty work so it's not too big a
leap to see this being mixed in with that.

~~~
bsenftner
VCs are not reticent about porn, only about failing to make a killing if they
go into porn. If they can capture any serious revenue flows, that is all they
care about. Any discussion otherwise is purely safety speech.

Interesting you should mention facial recognition, because that is what I
turned the technology into - or reverted it back into. My 'digital double'
creation for automated actor replacement involves neural net trained 3D
reconstruction of faces. It originally came from pose correction for facial
recognition. Being an animation/VFX guy, I licensed the 3D reconstruction tech
from an FR company.

After my startup efforts failed, the FR company hired me because I'd done the
largest scale use of their technology at the time. Now I create mesh network
appliances of their tech, combined with my production tech from my feature
animation days, to create apps for dynamic FR security perimeters. The
interesting thing is, the frame processing I do could easily be fed into a
'deep fake' system to produce crazy fake media pretty much automatically. We
track faces frame to frame and project a 3D polygon mesh matching their facial
expression. Need more?
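
For the curious, the "project a 3D polygon mesh onto each frame" step can be
sketched as a plain pinhole-camera projection. Everything below (the focal
length, the image center, the tiny triangle standing in for a face mesh, and
the function names) is illustrative, not the actual pipeline described above:

```python
# Toy sketch: pinhole-camera projection of 3D mesh vertices into 2D
# pixel coordinates, the geometric core of overlaying a tracked mesh
# on a video frame. All numbers and names here are made up.

def project_vertex(v, focal=800.0, cx=640.0, cy=360.0):
    """Project a 3D point (x, y, z) in camera space to pixel coords."""
    x, y, z = v
    if z <= 0:
        raise ValueError("vertex behind camera")
    # Perspective divide: points farther away land closer to the center.
    return (cx + focal * x / z, cy + focal * y / z)

# A tiny stand-in "mesh": three vertices of a triangle two metres away.
mesh = [(-0.1, 0.0, 2.0), (0.1, 0.0, 2.0), (0.0, 0.15, 2.0)]
pixels = [project_vertex(v) for v in mesh]
```

A real tracker would re-estimate the head pose every frame and project the
full expression-matched mesh, but the projection math is the same.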

~~~
JBlue42
Re: VCs - That makes sense. The thread I was referring to is:
[https://news.ycombinator.com/item?id=16252568](https://news.ycombinator.com/item?id=16252568)

As much as I read about new tech stuff, I had no idea how far things were
along until all the deepfake porn headlines hit the news the past few weeks. I
guess amidst all the hand-wringing over Carrie Fisher and Grand Moff Tarkin's
"looks" in Rogue One it never really struck me that that stuff had 'arrived'.

------
voidr
It has been possible to face-swap images in Photoshop for a long time, and
people learned not to trust random nude celebrity images; I believe it's the
same thing now with video.

There are some fun grey areas where you have someone doing porn who actually
looks like a celebrity.

The good news is that you should be able to detect a face-swap video easily,
by comparing the face against public images of the celebrity and comparing
the rest of the video against known porn videos (like YouTube's ContentID).
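
As a toy illustration of that fingerprint-matching idea, here is an "average
hash" over an 8x8 grayscale thumbnail, compared by Hamming distance. Real
ContentID-style systems are far more robust, and the helper names and sample
values are invented:

```python
# Toy perceptual hashing: fingerprint a frame's 8x8 grayscale thumbnail
# and compare fingerprints by Hamming distance. Small re-encoding
# artifacts barely change the hash; a different image changes most bits.

def average_hash(thumb):
    """thumb: 64 grayscale values (an 8x8 thumbnail, already downscaled)."""
    mean = sum(thumb) / len(thumb)
    # One bit per pixel: set if the pixel is at least the mean brightness.
    return sum(1 << i for i, p in enumerate(thumb) if p >= mean)

def hamming(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

known = average_hash([10] * 32 + [200] * 32)      # fingerprint of a known frame
candidate = average_hash([12] * 32 + [198] * 32)  # slightly re-encoded copy
assert hamming(known, candidate) == 0             # tiny changes, same hash
```

In practice one would fingerprint many frames per video and look for runs of
near-zero distances against an index of known footage.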

------
d1zzy
Am I the only one that may see a positive side to all this? For example, what
if I use this technology to swap my face and/or my partner's face into other
people's videos and then we can watch these? I think it might be pretty hot,
much more so than watching some unknown person's face.

------
tw1010
"Consent" is starting to become a really loaded word. It doesn't seem
journalistically honest to start a conversation and use non-neutral language
in the headline to frame the tone of the discussion.

~~~
chc
It seems like the right word in this context. It sounds like you have an issue
with the word for some reason, but I can't work out what the issue is, much
less how it shows a lack of journalistic integrity.

~~~
tw1010
It is definitely not a grammatically incorrect use of the word. I have nothing
against it per se. But the word is almost exclusively associated with negative
emotional content, and that fact primes the reader to feel a certain way about
the issue before they have even read the piece. See Russell conjugation:
[https://www.edge.org/response-detail/27181](https://www.edge.org/response-detail/27181)

~~~
chc
How would you phrase it so that it communicates "without consent" without
sounding like a potentially bad thing?

~~~
tw1010
A more neutral candidate could be just: "Is It Legal to Swap Someone’s Face
into Porn?" (It delivers the same information since it wouldn't be a legal
question if all participants agreed to the act.)

------
solnyshok
How would we go about making proof-of-unaltered video? Shall someone run an
ICO for a start-up thickly dressed in a cloud of tags related to blockchain,
web of trust, verifiable frames... Seriously, can any degree of trust be
achieved short of hashing and signing every single frame by a trusted video
sensor?
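
A minimal sketch of the "hash and sign every frame at the sensor" idea, with
HMAC standing in for a real signature (an actual design would use an
asymmetric scheme like Ed25519 inside secure hardware; the key and names
below are invented):

```python
# Toy provenance scheme: hash each frame, chain the hashes so frames
# can't be reordered or dropped undetected, then authenticate the final
# digest with a key held only by the trusted sensor.
import hashlib
import hmac

SENSOR_KEY = b"secret-baked-into-the-sensor"  # illustrative only

def chain_hashes(frames):
    """Fold every frame into one digest, chaining in the previous digest."""
    digest = b"\x00" * 32
    for frame in frames:
        digest = hashlib.sha256(digest + frame).digest()
    return digest

def sign_video(frames):
    """Authenticate the whole chained digest with the sensor's key."""
    return hmac.new(SENSOR_KEY, chain_hashes(frames), hashlib.sha256).hexdigest()

original = [b"frame-1", b"frame-2", b"frame-3"]
tag = sign_video(original)

# Altering any single frame invalidates the tag:
tampered = [b"frame-1", b"frame-X", b"frame-3"]
assert sign_video(tampered) != tag
```

The hard part is not the crypto but the trust anchor: the key has to live in
tamper-resistant hardware at the sensor, or the signature proves nothing.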

------
bsenftner
Interesting:
[https://github.com/deepfakes/faceswap](https://github.com/deepfakes/faceswap)

------
golergka
What they're doing is processing data that is publicly available online.
Revenge porn law is something else completely - it's about releasing data to
the world.

So, if such a law were implemented, it would be much closer to DRM laws,
which allow data authors and content rights owners to dictate how you can
process their data, even if they give it to you. In one case, it's copying the
data to another format, or may be running a decompiler on it; in another, it's
using it in deep learning. Either way, it's processing.

~~~
gamblor956
If you want to be super technical about it, processing data and releasing
revenge porn are both still a matter of copying bits.

Releasing new bits would be invasion of privacy and defamation.

"Processing" existing bits would be defamation, appropriation, and possibly
invasion of privacy based on the resulting images, though the IOP claims would
likely be add-on claims to a defamation or appropriation action.

Either way, it's a tort.

------
randyrand
I've seen much worse done with Trump's face - which is still clearly protected
by the first amendment.

------
FLUX-YOU
Why wouldn't it be libel?

~~~
golergka
Libel would be stating that this is a real recording. If someone publishes a
deep fake video saying it's a fake video, it's not much different from a
cartoon based on someone, is it?

~~~
gamblor956
Saying the video is a fake is not a defense to defamation. The false claim is
putting the person's image in the deep fake in the first place. It is a well-
settled point of law that a statement _after_ the tortious act doesn't undo
the tort; at best it might mitigate damages if the statement provably reduces
the harm to the victim.

~~~
dragonwriter
Putting the image together is not a public false claim that would trigger tort
liability. Publishing it, obviously, might be, but publishing it as a unit
with an explicit label of falsity of the image does not make the label of
falsity an act _after_ the tort, it makes it an act simultaneously with one
required element of the tort, and one which, at least arguably, eliminates
another required element, that of falsity of the claim, so that the required
combination of elements never exists.

~~~
gamblor956
In the US, it is a well-settled point of law that the defamatory act is the
publication of the false image. The statement that the image is fake is not a
defense unless it accompanies _every publication_ of the image and identifies
the fake aspects of it at every occurrence.

You can certainly argue that the defendant didn't intend to defame someone and
included a statement that the image was fake. But as I said, that statement
would not eliminate your liability for the tort; it would merely mitigate the
damages.

Tort law is complicated and isn't something you can just pick up by browsing
the internet for a few minutes. I've handled these types of cases before. Have
you?

~~~
dragonwriter
> The statement that the image is fake is not a defense unless it accompanies
> every publication

Of course. OTOH, if you build it into the image/video, that's trivial to
assure for each publication you are involved in, and subsequent publication by
some other third party of some subsequent modified image that removed the
statement of falsity is no different than if someone republishes a modified
version for your statement that someone is _not_ a child molester which omits
the “not”.

> Tort law is complicated

(9_9)

> and isn't something you can just pick up by browsing the internet for a few
> minutes.

Yeah, I've never really questioned the decision I made to actually buy and
study the required texts, etc., when I took Torts, rather than just spending a
few minutes studying on the internet, so I really don't need anyone to
validate that for me.

(Now, if we were talking about _false light_ , in the jurisdictions which
recognize it, rather than _defamation_ , this would be a trickier issue, I
think.)

------
tinus_hn
The article states that removing the content would be a violation of the right
to free speech. Actually, though, that right does not apply to obscene imagery.
So there is no problem with blocking or removing the images.

------
nailer
I'm not a huge fan of The Verge and Vox, but this is clearly immoral - I'm not
sure any reasonable argument could be made for involving someone in a sexual
situation without their consent.

I'd expect the law to catch up soon, and that to happen after the issue is
made more common once Reddit's non-consensual deepfakes goes more viral than
it already has.

Edit: replying to the flagged response below, which I disagree with but which
I think deserves a reply:

By immoral, I mean hurting other people without consent. I do not believe any
activity that doesn't hurt other people is immoral, though I understand some
people (orthodox religious people for example) do.

Someone masturbating to an image - a private mental image, or even a private
image produced for their personal use with software - seems fine. But
_publishing_ a sexual image of someone else clearly does involve the person.

~~~
Veen
> involving someone in a sexual situation without their consent

That's rather a broad definition of "involving someone in a sexual situation".
I think it's immoral too, but I don't think it's a sexual consent issue. My
intuition is that it has more to do with inauthenticity and coopting a
person's likeness, their public reputation. You could do the same sort of
thing to have people endorse political candidates or products against their
will.

~~~
nailer
Re-reading your post, I just realised by 'sexual consent' you meant rape. I'm
not comparing this to rape at all; I mean consent as in consent. Same as the
politician didn't consent to being videoed or the model didn't consent to being
used in the documentary.

~~~
Veen
I still have the same disagreement with your point. The person is not being
involved in a sexual situation. A person is not the same thing as a
representation of a person. An image or representation is being involved in a
sexual situation, the person is not.

It's an important distinction to make because the language we use affects how
we think and argue about the wrong being done and what should happen to punish
or prevent it.

I think we agree that it's a shitty thing to do with someone's image, and it
will become more shitty as the technology improves to the point at which it's
difficult to tell the fake from the real.

