
Deepfake revenge porn distribution now a crime in Virginia - smacktoward
https://arstechnica.com/tech-policy/2019/07/deepfake-revenge-porn-distribution-now-a-crime-in-virginia/
======
apo
> The new law amends existing law in the Commonwealth that defines
> distribution of nudes or sexual imagery without the subject's consent — often
> called revenge porn — as a Class 1 misdemeanor. The new bill updated the law
> by adding a category of "falsely created videographic or still image" to the
> text.

This is a strange law:

> Any person who, with the intent to coerce, harass, or intimidate,
> maliciously disseminates or sells any videographic or still image created by
> any means whatsoever that depicts another person who is totally nude, or in
> a state of undress so as to expose the genitals, pubic area, buttocks, or
> female breast, where such person knows or has reason to know that he is not
> licensed or authorized to disseminate or sell such videographic or still
> image is guilty of a Class 1 misdemeanor. For purposes of this subsection,
> "another person" includes a person whose image was used in creating,
> adapting, or modifying a videographic or still image with the intent to
> depict an actual person and who is recognizable as an actual person by the
> person's face, likeness, or other distinguishing characteristic.

[https://law.lis.virginia.gov/vacode/18.2-386.2](https://law.lis.virginia.gov/vacode/18.2-386.2)

A kid makes a pencil drawing of the school principal in a compromising
situation and posts it to Instagram.

Legal?

A very talented artist creates a realistic watercolor of the President in a
compromising situation with Kim Jong-un and posts to Instagram.

Legal?

A kid uses software he downloaded to generate a realistic depiction of the
school principal in a compromising situation and posts it to Instagram.

Legal?

~~~
pfisch
> A very talented artist creates a realistic watercolor of the President in a
> compromising situation with Kim Jong-un and posts to Instagram.

Probably legal. It is satire meant to be political commentary, not distributed
with the intent to "coerce, harass, or intimidate."

As for the other ones idk, probably illegal. The last one almost certainly is,
and honestly it probably should be.

~~~
solotronics
Why? I thought we had freedom of speech in the US. If I want to draw a picture
of you doing something nasty and post it to my own website why should the
state stop me?

~~~
curryst
IANAL, but at least for images that are designed to appear to be, or likely to
be perceived as, a real photograph, it strikes me as comparable to libel or
slander. To me, the distinction is whether a viewer is likely to interpret the
image as satire or general humor, or whether they are likely to believe that it
is genuine photographic evidence of a compromising event. Satire and humor are
protected.

You drawing a picture is unlikely to pass that muster. Deepfakes, on the other
hand, are generally designed to be falsified images.

I think the interpretation of this will lean heavily on being able to prove
intent.

~~~
zzo38computer
As long as it includes a disclaimer that it is a fake (even though it appears
to be a real photograph), I do not complain about it, because I like freedom
of speech.

~~~
jakobegger
Disclaimers are useless because people strip them when they repost the image

~~~
AstralStorm
Make stripping disclaimers illegal? Like stripping copyright claims. Removing
the disclaimers is in most cases breaking copyright anyway. It does not count
as a remix. It does not count as a quotation and definitely not as archival.

That bit is easy to detect most of the time...

~~~
pfisch
Yeah, and all we have to do to stop piracy is make it illegal. Right?

------
sandworm101
The real joke is that "deepfake porn" has been common in Hollywood for
generations. It was just consensual. Google around to see how many shots in
sex and/or nude scenes were actually body doubles. Only lately has that
evolved into digitally putting the star's face on the body double's body.

There are some serious first amendment issues here. Hustler Magazine v.
Falwell, an iconic case that set the tone for modern supreme court arguments,
turned on a verbal 'deepfake', the depiction of a celebrity in a supposed real
sexual situation _without their consent_. That involved a fake interview with
Falwell published in Hustler. I don't see how that is legally distinct from
putting his face on a body double. Both are claiming an untruth, the creation
of a fake sexual history, as a form of legitimate parody and comment on their
public persona.

Also, half of all porn films. Does using a Charlie Sheen lookalike in the XXX
version of _Two and a Half Men_ constitute deepfake? ("Two and a Half Men",
"The Big Bang Theory" ... Chuck Lorre was just begging for porn parodies.)

Download/watch "The People vs. Larry Flynt". I used to use clips from it as
intros to first amendment and copyright law. It is a great movie.

~~~
adamlett
_That involved a fake interview with Falwell published in Hustler. I don't
see how that is legally distinct from putting his face on a body double_

What I understood from recently rewatching The People vs Larry Flynt was that
LF and Hustler magazine were acquitted because nobody would reasonably believe
that Jerry Falwell had actually had sex with his mother. It seems evident that
the material difference between that case and today’s deepfake is how likely
anyone would be to mistake it for the truth.

~~~
tptacek
Yes; this is basic enough to be in the syllabus for the case:

 _Held: In order to protect the free flow of ideas and opinions on matters of
public interest and concern, the First and Fourteenth Amendments prohibit
public figures and public officials from recovering damages for the tort of
intentional infliction of emotional distress by reason of the publication of a
caricature such as the ad parody at issue without showing in addition that the
publication contains a false statement of fact which was made with "actual
malice," i.e., with knowledge that the statement was false or with reckless
disregard as to whether or not it was true. The State's interest in protecting
public figures from emotional distress is not sufficient to deny First
Amendment protection to speech that is patently offensive and is intended to
inflict emotional injury when that speech could not reasonably have been
interpreted as stating actual facts about the public figure involved. Here,
respondent is clearly a "public figure" for First Amendment purposes,_ [[[ and
the lower courts' finding that the ad parody was not reasonably believable
must be accepted. ]]] _"Outrageousness" in the area of political and social
discourse has an inherent subjectiveness about it which would allow a jury to
impose liability on the basis of the jurors' tastes or views, or perhaps on
the basis of their dislike of a particular expression, and cannot,
consistently with the First Amendment, form a basis for the award of damages
for conduct such as that involved here._

Em. mine.

~~~
zaroth
Thank you for posting that. I’m not sure I ever fully grok’d the
“outrageousness” standard.

So if you say something horrible about a public figure, knowing that it is
false, and presenting it as true, if it’s outrageous enough to be obvious
parody then it’s permitted. But if it’s stated as truth and emphatically
presented as truthful and not as parody, then the public figure exception is
lost?

Very hard not to wade into the quagmire of horrible statements people make
about a certain politician which a large portion of the country accept as
truly factual in this context...

------
fouc
Deepfake revenge porn could be counter-intuitively good for privacy. If
people's real porn images get leaked, they can just claim it was deepfaked.

Trying to make a law against this could actually be a massive overreaction.

------
mjevans
Shouldn't this already be covered by defamation and/or laws related to truth
in marketing?

~~~
markdown
> laws related to truth in marketing

I'm a regular joe consumer, and as far as I can tell, there are either no such
laws, or they are never enforced.

------
tick_tock_tick
I don't see how this can possibly be legal under the First Amendment.

~~~
moate
The same reason that harassment and slander are crimes. There are always limits
to what you can "say" when what you're saying isn't just unpopular but
actually damaging.

~~~
pretendscholar
slander is not a crime

~~~
chrshawkes
Libel is illegal and in the same category.

~~~
ThrustVectoring
It's a tort, not a crime. The individual you defamed can sue you in civil
court over the damage you do to their reputation, but the US government can't
prosecute you for it.

------
jsnider3
There's a good chance that this law is unconstitutional, but I doubt anyone
wants to be the first to test their luck in court.

------
heyitsguay
Very disappointed but very unsurprised to see that the initial HN reaction is
"this shouldn't be illegal". Where is your respect for others?

~~~
TallGuyShort
Any new law has the potential to result in no-knock SWAT team raids and online
censorship with minimal judicial oversight when someone violates a
technicality in the law and pisses off the wrong person. And despite all that,
many times said laws don't even result in a decrease in the undesirable
behavior. It's always worth looking at the potential abuses of a law
(especially one that effectively makes an exception to the 1st Amendment)
even though the thing it claims to address is reprehensible.

Yes, I think deepfake revenge porn is terrible, and people who make it are
douchebags. But - we've all seen our politicians support some truly stupid
stuff and make some truly ignorant statements. Do we suddenly just assume that
these same people managed to write this law with everything well-defined, and
with appropriate checks and balances in place, without external critical
thinking applied?

~~~
heyitsguay
Nothing you say here is specific to this law, as opposed to any other law that
gets passed. If you're of the opinion that no law is a good law, I think we'll
have to just disagree. All I'm seeing is justification for inaction on an
issue with pretty clear precedents in established law on revenge porn and
libel/slander.

~~~
TallGuyShort
No I'm saying no law is above critical thinking. There should always be a
devil's advocate because you should never be so convinced you're not the
devil. But especially because I do think the legislative and judicial branches
in the United States have effectively abdicated their responsibilities, and
now we have executive branches that are given massive amounts of power,
sheltered from the consequences of misconduct and given freedom from the
responsibility to actually protect. So in practice yes I am suspicious that
any law, badly implemented, is nothing but a tool for the powerful to wield
more power with more supposed moral authority.

edit: As a specific example, the only other non-dead comments I saw minutes
after you posted this dealt with how this is different from existing parodies
and the use of body doubles in film. Is there a process formalizing consent
from actors and actresses in such cases? There probably should be in the case
of body doubles, but it would be an unreasonable expectation in the case of
parody. These are all questions that need to get answered beforehand. Asking
those questions and observing that they appear to not have been dealt with
doesn't mean you think deepfake revenge porn is A-OK.

------
LAMike
For everyone saying this is a violation of the 1st amendment -

The 1st only extends to what you can say on your _personal property_.

There is a reason you can't yell "FIRE" in a crowded theater: it's because you
are infringing on the property rights of the people who bought a ticket to the
movie, and of the landlord who is expected to keep everyone safe.

You don't have any property rights to any other human other than your own body
by default.

That being said, the future will be filled with AR glasses with the nude
version on default and that's going to be very weird.

~~~
tick_tock_tick
You are legally allowed to yell "FIRE" in a movie theater. The idea that you're
not is a myth from an early 1900s Supreme Court ruling.

[https://scholarship.law.wm.edu/cgi/viewcontent.cgi?article=1...](https://scholarship.law.wm.edu/cgi/viewcontent.cgi?article=1748&context=wmborj)

~~~
chrshawkes
Especially if there is a fire, what're you supposed to yell?

~~~
thrwayxyz
"Endless war is a mistake," which is what people were actually arrested for saying in that case.

