
Reddit bans 'deepfakes' AI porn communities - internetxplorer
https://www.theverge.com/2018/2/7/16982046/reddit-deepfakes-ai-celebrity-face-swap-porn-community-ban
======
excalibur
Fake porn of celebrities has been prevalent on the Internet for a long time. I
don't have precise statistics, but I think it might predate the web. For the
most part it hasn't been particularly controversial. Now that advances in
technology have significantly improved the quality, everybody suddenly has a
problem with it.

Let's back this up a few steps. You're an actor, you appear on television. By
being filmed and accepting the pay offered to you, you're agreeing to allow
these images of you to be disseminated to the general public, for their
enjoyment. But what if somebody finds you attractive, and looks at your
picture whilst... you know. Can you sue? No, they're well within their rights
to do so. What if they cut your face out and place it over a Playboy
centerfold? Same deal. Several technological innovations later, here we are.
Fundamentally, nothing has changed. Fundamentally, people are still 100%
within their rights to combine images legally obtained in this way. And post
them online. This may not be what the Internet was created for, but this was
always what it was used for.

~~~
codingdave
> people are still 100% within their rights to combine images legally obtained
> in this way. And post them online.

Are they? If you distribute movies and media to the public, you are normally
breaking the law. (Hence the infamous FBI warning on movies for a couple of
decades now.) Likewise, you cannot just use people's likeness in marketing and other
now.) Likewise, you cannot just use people's likeness in marketing and other
public uses without their permission. So I'm not at all sure that just because
you have an image, you have the rights to create and distribute derivative
works from it.

~~~
falcolas
> Likewise, you cannot just use people's likeness in marketing and other
> public uses without their permission.

Tell that to Prince and Peter Cushing. ;)

------
barrkel
There's no way of putting this tech back in the box. And it's only going to
get more powerful.

What if someone combines it with some kind of "deep ageing", to create
artificial child porn? Sexual representations of children, even if completely
artificial and involving no victims, are illegal in many jurisdictions, but
not all.

We're only scratching the surface of what semantic editing of video is going
to be capable of. It's a very big barrel of worms.

~~~
ordinaryradical
Absolutely true, and not that you’re arguing for this but:

Just because the technology is out there doesn’t mean we throw our hands in the
air and give up. Yes it’s there, it’s going to be used for most of the
nefarious cases we can imagine, but that doesn’t mean we have to tolerate
someone using our image against our will.

In fact, I would imagine that the data privacy advocates I often see on hn
should see this as a logical extension of the privacy protections they want to
see across the web.

No, Lyft employees should not be able to view our trip history willy-nilly. No,
the NSA should not be able to gobble up all of our Google searches to profile
us. And no, we should not have to suffer being put in porn against our will
just because we are in the public eye.

This stuff should be treated like revenge porn, IMO. Functionally it’s the
same, even if the technical implementation is different.

~~~
flukus
> In fact, I would imagine that the data privacy advocates I often see on hn
> should see this as a logical extension of the privacy protections they want
> to see across the web.

The question is whether or not my likeness is my data, which I don't think has
ever been settled. Anyone can take a photo of me in public and the
photographer owns the copyright. Are security cameras recording me violating
my privacy? On the other hand, football players have to be paid to have their
likeness appear in games.

I think this might be the catalyst to resolve these issues once and for all.

~~~
gamblor956
The photographer owns the copyright in the photo, but not the right to use
your likeness contained in the photo without your permission (likeness rights
are also referred to as model rights).

This means that you can demand money if the photographer sells that photo to
anyone, especially if the sale is for commercial use.

~~~
Digory
TMZ and other paparazzi would not exist if this were the rule. I think some
states recognize a limited right to control aspects of your likeness, but not
nearly as broad as this.

------
bradbeattie
I'm on the fence on this one. If I draw a stick figure and put your name above
it, is that unethical? What if I add a speech bubble saying something risqué?
What if I'm a talented artist and I draw something pretty life-like? I don't
see the lines here being too clear without clamping down on all expression.

~~~
FLUX-YOU
None of these examples are really anywhere close to deepfake porn videos.

- Porn is done under consent, where participants should be reasonably aware
that it will be published

- Porn tarnishes reputations

If you pasted Daisy Ridley's face into a crowd in, say, China, doing everyday
stuff, no one would rightly care, because there's no real potential for harm
unless you are doing it for some ulterior motive.

~~~
bradbeattie
I think how convincing a piece of media may be is at the crux of this issue.
Photoshopped heads of celebrities on porn actors' bodies have been around for
some time now and aren't, to my knowledge, illegal. The key difference here is
that people hold video to be more trustworthy. Given the advancements in CG,
and now more recently with deepfakes, perhaps that's what needs to change:
people should stop trusting video footage.

~~~
gamblor956
_Photoshopped heads of celebrities on porn actors' bodies have been around
for some time now and aren't, to my knowledge, illegal_

They have been, since inception, illegal, in the sense that they represent a
tortious act. They might also be illegal in the criminal sense, depending on
the jurisdiction, even in the US, absent meaningful context bringing the fake
under the protection of the First Amendment as a form of speech.

------
feelandcoffee
What scares me the most isn't the porn, but the fact that this kind of tech is
available to anyone with a GPU and a few hours of learning and training.

The only thing keeping this from being a threat to your regular folk is the
fact that it needs a lot of reference images to build the model. But imagine a
politician or activist; they have a lot of images on the net, so this could
take fake news to another level. Yes, if this happens, the news media and legal
system will probably not take shady videos seriously without verification
(especially now that the algorithm is still in the middle of the uncanny
valley, so for the moment fakes are easy to recognize without needing experts).
But what about your friend, uncle or cousin who shares his "echo chamber"
posts on Facebook?

Or who knows, maybe I shouldn't binge Black Mirror, and this will stay limited
to porn and good uses, like a new era for stunts in movies.

~~~
dkersten
> if this happens the news media ... will probably not take shady videos
> seriously without verification

Because mainstream media is great at fact checking all the bullshit they
report on? Some do, sure, but there is a lot of shit on mainstream news.

------
irrational
So, subreddits that do "safe for work" deepfakes are still around and allowed.
This tells me that the technology will just get better and better, and its use
to create NSFW deepfakes will likewise get better and better (while existing
underground?) until it really is impossible even for an expert to tell that
they are fakes. That is what I assume will happen, anyway.

~~~
earenndil
> until it really is impossible even for an expert to tell that they are fakes

I hope that happens, and as soon as possible. Otherwise, there will be an
'uncanny valley' period of sorts, where people are able to mass-produce this
video, but not everyone is aware that it exists. When it really is flawless,
or close enough to it, societal change will begin. This will also mean that
video evidence will likely no longer hold up in court (which is good, because
it was already possible to fudge it manually with video-editing tools).

------
randyrand
Looks like voat.com/v/deepfakes has already existed for a while in anticipation.

~~~
dna_polymerase
To me it looks like voat is just the ultra-toxic communities from reddit joined
together. I am a fan of real free speech, without banning stuff that could
offend some people, yet voat does not feel like a solution for that (same goes
for Gab; it seems people there are more interested in seeing Twitter fail than
in coming together as a community and creating a viable environment).

That said, I think this will again drive some more people to voat. What it
definitely won't do is stop people from doing deepfakes.

~~~
prepend
I feel similarly. I want free speech, but tempered enough to not bum me out.
Reddit is too constrictive, voat is almost as bad as 4chan.

I long for the olden days when BBSes allowed for lots of diverse stuff without
people being banned for mentioning a leaked Game of Thrones episode, or a CGI
boob.

Right now, I feel similarly toward Reddit as I do toward Google: moving in an
anti-user direction, but with no viable alternative to quit to.

------
petepete
These bannings, and the news coverage of the process, will definitely trigger
the Streisand effect. The community might move on from Reddit, but so many
people have been made aware of the possibilities that it's impossible to keep
a lid on it.

------
fellellor
So glad that they banned this crap. This is actually a fun project to get
started with deep learning, pasting Nicolas Cage into every movie you can
imagine. Unfortunately it's been dominated by all this porn creepiness.
Hopefully the discussion surrounding this will get friendlier from now on.

------
xkcd-sucks
The decision that makes voat.co become successful?

------
downer72
Ban all you want, it’s not illegal, and silly to suggest as much.

Here’s a picture I drew of your mom. Look at those stink lines.

Arrest me.

~~~
Cynddl
It can actually be considered a form of harassment or defamation. While it
might be a gray area in the US, it is likely not going to be the same in
Europe.

------
aznvr
It seems to have gone unnoticed, but for a week or so now GitHub has required
sign-in for the deepfakes repos
([https://github.com/deepfakes](https://github.com/deepfakes)). Note that
these are public repositories.

