
Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence - LearnerHerzog
https://www.theverge.com/2018/1/24/16929148/fake-celebrity-porn-ai-deepfake-face-swapping-artificial-intelligence-reddit
======
procedural_love
> We assume, too, that face swapping is the end game, but it’s clearly just
> the beginning.

Isn't the end game an endless stream of personalized content for everyone?
Wherein the entire corpus of human-created media becomes a training set for
our fantasies.

It is interesting how entertainment is again pushing the boundary of
technology. Soon enough this push to make face editing tools for porn more
accessible to everyone will allow anyone to:

1) Replace their ex-husband's face in their old family videos with their new
husband's face.

2) Create a viral video of Donald Trump murdering someone.

3) Be the star of their favourite movie, porn or otherwise. (What's the effect
this would have on people's memories, when they actively see themselves doing
everything James Bond does, for instance? Shooting people, being generally
powerful, and "getting the girl"?)

~~~
IntronExon
Things are going to get very weird in porn, when you don’t have to convince a
human to actually do it. I have to assume that early adopters will also be
people with predilections which are unserved, or illegal. If people worry
about their kids seeing disturbing porn now, imagine when it’s AI generated,
photorealistic rape, snuff, child porn. Illegal or not, if it’s purely virtual
law enforcement is going to focus on the subset of crimes which involve actual
human victims.

~~~
dragonwriter
> If people worry about their kids seeing disturbing porn now, imagine when
> it’s AI generated, photorealistic rape, snuff, child porn.

There was a time when it was quite easy to find (without even trying for that
specific content) photorealistic rape, snuff, bestiality, and child porn on
the public web, without any AI involved.

> Illegal or not, if it’s purely virtual law enforcement is going to focus on
> the subset of crimes which involve actual human victims.

Actual prosecutions for virtual (generally not photorealistic) child porn in
various jurisdictions demonstrate that this is not a hard and fast rule.

~~~
djsumdog
Animations or fictional depictions of obscene content are not illegal in the
US or Japan. They are illegal in the UK (a man was sentenced over Simpsons
porn) and many other countries.

Now with added realism, these lines could become blurry and we could see some
of these issues brought up again.

~~~
jstarfish
> Animations or fiction of obscene content are not illegal in the US

Citation please. There is nowhere near enough precedent to draw such a
conclusion in the US. The defendants in these cases often end up pleading
guilty.

US v Hanley, US v Red Rose Stories, etc.

~~~
djsumdog
Hmm .. seems things have changed quite a bit since I last read up on this. It
seems to vary by state:

[https://en.wikipedia.org/wiki/Legal_status_of_drawn_pornogra...](https://en.wikipedia.org/wiki/Legal_status_of_drawn_pornography_depicting_minors#United_States)

------
hirundo
Technology is degrading the value of photo and video evidence (and probably
audio too) asymptotically toward that of famously unreliable testimony from
memory. Criminality becomes less risky and/or innocence becomes less
protective. Law becomes less effective. A bad result, to the extent that the
law isn't an ass.

On the plus side artistic tools that help materialize internal life become
more effective. We can interact with our dreams and fantasies more readily, to
potentially therapeutic benefit.

It's hard to say whether this trend holds more danger or promise.

~~~
Chaebixi
I wouldn't be that pessimistic just yet. It's still possible that new advances
in authentication technology might counteract some of these trends.

~~~
unethical_ban
What do you mean "authentication technology"? Tamper detection? The ability to
see that a tool was used? It may slow things down, but this is an arms race.

~~~
BoiledCabbage
> What do you mean "authentication technology"?

Cryptographic signatures. I.e. every frame in a video is signed with a
512-bit key that states authoritatively which camera was the source of the
video and when it was taken. Changing any pixels in the video would invalidate
the signature, and an attacker would be unable to re-sign the result unless
they had physical access to the original camera.

But it'll be at least 3 decades before this technology is commercialized,
people see the demand for it, and the majority of cameras in the world are
replaced by ones that have it. Even if the new tech is on the market in a
decade (simple tech, but no demand / ecosystem yet), 90% of existing installed
cameras won't have the feature, so fake videos will still be created with
them. Only once ~80% of videos are authenticated, and a significant portion of
the remaining 20% are fakes, will people be able to dismiss non-authenticated
video. Up until then it's fakes non-stop.

We've got some ugly decades coming up.
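A minimal sketch of the idea in Python, with a stdlib HMAC standing in for
the asymmetric signature a real camera would hold in secure hardware (all
names and the key handling here are hypothetical):

```python
import hashlib
import hmac
import os

# Hypothetical per-device secret. A real camera would hold an asymmetric
# private key (e.g. Ed25519) so that verifiers only need the public half;
# HMAC stands in here because it is in the standard library.
CAMERA_KEY = os.urandom(32)

def sign_frame(frame_bytes: bytes, timestamp: str, camera_id: str) -> str:
    """Bind pixels, capture time, and device identity into one tag."""
    payload = camera_id.encode() + timestamp.encode() + frame_bytes
    return hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()

def verify_frame(frame_bytes: bytes, timestamp: str,
                 camera_id: str, tag: str) -> bool:
    """Recompute the tag; any changed pixel or metadata fails the check."""
    expected = sign_frame(frame_bytes, timestamp, camera_id)
    return hmac.compare_digest(expected, tag)

frame = b"\x00" * 1024  # stand-in for raw sensor data
tag = sign_frame(frame, "2018-01-24T12:00:00Z", "cam-001")
```

Changing a single byte of the frame, the timestamp, or the claimed camera ID
produces a different tag, which is the property the parent comment relies on.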

~~~
bduerst
What's to keep you from adding the key to a camera after falsely generating
the media? Or using the key of a known camera to generate false footage?

~~~
BoiledCabbage
So spitballing, my assumption is that it'd end up looking something like SSL
certificate chains today. In the situation you mentioned:

1. The attacker wouldn't have access to the original camera's private key.

2. The attacker creates a fake video, creates a private key, and signs the
video with that key.

3. The attacker tries to install the key into a camera.

Step 3 has to be made impossible. Meaning that a camera becomes a trusted
entity and only allowed parties (e.g. the camera manufacturer) have the
authority to insert a private key into a camera. In practice, after
provisioning a camera with a new private key, the key would be signed with the
manufacturer's private key and that signature also stored in the camera. This
shows the manufacturer is vouching for the contents of the video. If someone
tries to change the camera's private key, it would no longer match what the
manufacturer signed.

Which yes then means we need to have authorized camera manufacturers and a
process for certificate revocation and all of that.

The exact opposite of "free and open" for recording devices - and the only
way we avoid being flooded with fake videos that seriously impact society.
We have to want this if we want to still have a concept of video evidence in
either the justice or social spheres.
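A runnable sketch of that chain of trust, again with stdlib HMAC secrets
standing in for the asymmetric key pairs a real PKI would use, and with all
names hypothetical:

```python
import hashlib
import hmac
import os

# Hypothetical keys. In a real deployment these would be asymmetric key
# pairs and the verifier would hold only public keys; HMAC secrets stand
# in so the sketch runs on the standard library alone.
MANUFACTURER_KEY = os.urandom(32)
CAMERA_KEY = os.urandom(32)

def tag(key: bytes, data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

# At the factory: the manufacturer endorses this camera's signing key,
# producing a "certificate" binding the key to a serial number.
camera_cert = tag(MANUFACTURER_KEY, b"serial=cam-001|" + CAMERA_KEY)

# In the field: the camera signs footage with its own key.
video = b"raw video bytes"
video_sig = tag(CAMERA_KEY, video)

def verify_chain(video: bytes, video_sig: bytes,
                 camera_key: bytes, camera_cert: bytes) -> bool:
    """Check both links: manufacturer vouches for the camera key,
    and the camera key vouches for the footage."""
    cert_ok = hmac.compare_digest(
        camera_cert, tag(MANUFACTURER_KEY, b"serial=cam-001|" + camera_key))
    sig_ok = hmac.compare_digest(video_sig, tag(camera_key, video))
    return cert_ok and sig_ok
```

The point of the two-link check is step 3 above: an attacker can mint their
own key and self-sign a fake, but the manufacturer never endorsed that key,
so the chain breaks at the certificate check.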

~~~
Chaebixi
This isn't a completely technical problem, so there's not going to be an
air-tight technical solution.

I think, at this point, anything more than the camera having a secure
device-generated key that it uses to sign its output and produce watermarks is
overthinking it. That will add a nearly insurmountable barrier to many classes
of forgery, namely those where a forger claims to have a photo taken with your
camera.

You also have to remember a significant class of photo-forgery would involve
images that are claimed to originate from a camera the attacker controls. For
instance, forged video of a robbery from a security camera. That could, in
some cases, be defeated by observing that the images have authentication
information _they shouldn't have_ (such as traces of watermarks from other
cameras).

At the end of the day, forgeries will be spotted by observing that they have
subtle errors that don't add up. That's how it's always been done.

------
r3bl
There was a pretty good discussion yesterday in /r/cyberpunk about the possible
consequences of this:
[https://www.reddit.com/r/Cyberpunk/comments/7sexm6/deepfakes...](https://www.reddit.com/r/Cyberpunk/comments/7sexm6/deepfakes_porn_fakery_realistically_pasting/)

> The subreddit /r/Deepfakes became very active very fast and new deepfakes
> are submitted every day with varying degrees of realism. The most scary part
> is that ANYONE can be deepfaked, not just celebrities. Provided you have the
> right hardware (because neural networks demand beefy video cards for
> training) you could train a model of your friend and paste her face onto a
> porn video and boom. All you have to do is download a browser extension that
> downloads all photos from someone's instagram and work from there. Nobody is
> safe from this.

> I think this here is as cyberpunk as it gets. The technology is 4 months old
> and has already yielded extremely realistic results. Think of what we will
> have one year from now. Something like this matches both the high tech and
> the low life aspect of the cyberpunk genre.

EDIT: Pasted the wrong link.

~~~
hackinthebochs
Revenge porn doesn't even need to be real anymore to terrorize someone. A
celeb has plausible deniability when it comes to stuff like this. But an
average person whose career is on the line?

~~~
hndamien
Now they can just say, "oh, I've been deepfaked" and move on. In many ways
this will be psychologically liberating for many victims of revenge porn.

~~~
kevingadd
That assumes a really high level of technology literacy from the people the
fake is being used to deceive. That simply isn't going to be true, probably
ever. It's already easy to convince people of transparent falsehoods;
plausible faked video evidence will make it worse.

It's the "Give me six lines written by the most honest man in the world, and I
will find enough in them to hang him" problem except now it's "give me six
photos of the most honest man and I will convince everyone he loves to abandon
him".

There are notable examples of harassment mobs forming and never dissipating
off of really scant "evidence", things like accusing school shooting parents
of being "crisis actors" or long harassment campaigns based on flimsy claims
that a woman slept around. It's disgusting.

School bullying leading to suicides is already bad enough, what about when
teens are sending forged porn of each other around? Or to classmates' parents
to get them in trouble?

~~~
Godel_unicode
> That assumes a really high level of technology literacy from the people the
> fake is being used to deceive.

Counterpoint: "photoshopped" is a dictionary word.

~~~
hndamien
It only took 8 years for people to become Bitcoin experts.

~~~
Godel_unicode
What year is that going to happen?

~~~
hndamien
It was a somewhat tongue in cheek comment.

------
no1youknowz
When I look at the future, I think back to these videos.

Hells Club:

Part 1:
[https://www.youtube.com/watch?v=QajyNRnyPMs](https://www.youtube.com/watch?v=QajyNRnyPMs)

Part 2:
[https://www.youtube.com/watch?v=wfYlTtA7-ks](https://www.youtube.com/watch?v=wfYlTtA7-ks)

Where I would like this to go is being able to take scenes from different
films and create mashups like this.

Or perhaps getting a whole bunch of extras, narrating lines and acting in
front of basic sets with green screens, then putting the faces of recognizable
actors on them and using something like Lyrebird for the voices - where actors
have sold the rights to their faces, voices, and personalities for cheap.

Now you have a $100m movie for the cost of $100k.

A similar premise of the film: The Congress.

-----

I really think in about 5 years, when the software is there and the dedicated
IaaS to train the models is commonly available, we'll start to see some really
cool stuff.

~~~
davidw
10 years out, who are the 'recognizable actors'? Do they just keep recycling
the same ones?

~~~
gknoy
Let's just say that the decision for who to play James Bond will be even more
interesting.

~~~
danielbln
Indeed:
[https://gfycat.com/ConfusedScratchyCowbird](https://gfycat.com/ConfusedScratchyCowbird)

------
ryanmarsh
So I clicked the link in the article (for science, so you don’t have to) and
I’m blown away. People are doing this on home computer rigs? I thought I was
going to find some really crappy paste jobs but instead I found myself having
to completely second guess what I was seeing. Some of the videos of course
suffer from odd minor defects that give up their authenticity but others were
flat out as real as anything else I’ve ever seen.

Now I’m concerned about the implications of this. We already know any image
can be faked and almost any video but we also laugh at people who say the moon
landings were faked. Given this though how could anyone believe video evidence
of say, the president with Stormy Daniels, which is a matter of unfortunate
import with real consequences?

How hard would it be to fake an international incident from multiple vantage
points?

~~~
Florin_Andrei
I've taken some machine learning classes and I've played with TensorFlow a
bit. None of this was a big surprise to me.

But the overall implications are deeply troubling. I am tempted to say we're
not entering a "post-truth" era, but more like a "post-reason" era. It's
almost like rational thought has painted itself in a corner. These videos are
almost like a mathematical "proof" too complex to be independently verified,
or too complex for a human (and those are happening too).

If reason is hitting a ceiling, I'm not sure what else we could use to steer a
coherent society through whatever murky waters might lie ahead.

~~~
carapace
You go from rule-based thinking to feedback-based interaction. Part of the
picture is described by the old saw about international politics: "No
permanent allies, only permanent interests."

I do believe we are entering what I call a "trans-rational" era. There are
stable strange attractors beyond the Age of Reason. Our technology is forcing
us to confront the questions of who we are and what we want to do with
ourselves.

------
Raphmedia
I've been thinking about this for a long while. I think this is good.

With face recognition, old pictures you might have posted online are very easy
to find. Some ex-boyfriend shared a naked picture of you? You are screwed.

Now, you can simply say that it is a deepfake. Everyone will have naked
pictures of "themselves" online, even if they are fake.

~~~
albertgoeswoof
I agree - taking this even further, why would you consume these pictures in
the first place, if you can just deepfake them.

------
tomaskafka
Non-NSFW sample:
[https://www.reddit.com/r/deepfakes/comments/7sjkw5/ilm_fixed...](https://www.reddit.com/r/deepfakes/comments/7sjkw5/ilm_fixed_that_for_you_edition/)

> Top is original footage from Rogue One with a strange CGI Carrie Fisher.
> Movie budget: $200m

> Bottom is a 20 minute fake that could have been done in essentially the same
> way with a visually similar actress. My budget: $0 and some Fleetwood Mac
> tunes

~~~
hsod
Can you explain what's going on here?

Specifically, I don't understand this sentence:

> Bottom is a 20 minute fake that could have been done in essentially the same
> way with a visually similar actress.

~~~
bryanbuckley
Accuracy depends on some sort of match between the host face and the desired
face. In this example, the host face was the CGI face, which is obviously
already very close to the desired face. If filming from scratch and skipping
CGI, you need to pick a good host face.

------
dictum
First step towards this:
[https://news.ycombinator.com/item?id=6272626#6272744](https://news.ycombinator.com/item?id=6272626#6272744)

------
gitgud
A quote that struck me from previous discussion on the topic:

@ekimekim 44 days ago

"We've already seen this with images and Photoshop. Society and their
heuristics of belief will adjust as these new capabilities become widespread.

What's more troubling is that as media becomes falsifiable, solid evidence
of...well, anything, becomes hard to have.

The ultimate loser there is the truth, sadly."

------
GuiA
The inevitable outcome is that no recorded media will be taken at face value
unless there is immense proof in some way of its veracity.

In the short term, this will probably lead to all kinds of terrible things
(kids getting bullied through computer generated imagery of them, people being
fired for videos of them saying things they never said, jealous spurned lovers
attempting to break apart marriages with fake videos, etc.)

In the long term, it might actually be a good thing - instilling a strong
sense of caution for anything that claims to be recorded from the real world.

~~~
gknoy
How do you hold people accountable for bad behavior (e.g. police behaving
poorly when making an arrest or traffic stop) when the video evidence is
viewed as "easily faked"? I mean, eyewitness testimony has been proven many
times to be faulty, but presumably someone with an axe to grind (and some AWS
credits?) could make fake videos of all sorts of bad things _from multiple
perspectives_.

Scary indeed.

~~~
Angostura
Cameras that digitally sign each frame?

~~~
PeterisP
All that does is link that frame to [a key in] the particular camera. It
doesn't verify that the frame was "real" in some way. It defends whoever
controls [the key in] that camera from _others_ secretly tampering with that
frame afterwards, but it doesn't defend others from tampering by whoever
controls [the key in] that camera.

Central signature verification or blockchain could verify a timestamp - i.e.
prove that the frame was taken (or maliciously created) before a particular
time. That's about it.
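That timestamp guarantee can be sketched as a simple hash chain, where each
entry commits to everything before it. This is a toy version; a real service
would also publish chain heads somewhere hard to rewrite:

```python
import hashlib

# Minimal hash-chain timestamp log: each entry commits to the previous
# one, so a registered frame hash provably existed before every later
# entry. It proves "created before time T", nothing about authenticity.
chain = [b"\x00" * 32]  # genesis entry

def register(frame_bytes: bytes) -> int:
    """Append the frame's hash to the chain; return its position."""
    entry = hashlib.sha256(
        chain[-1] + hashlib.sha256(frame_bytes).digest()).digest()
    chain.append(entry)
    return len(chain) - 1

def verify(frame_bytes: bytes, position: int) -> bool:
    """Check that this exact footage sits at that position in the chain."""
    expected = hashlib.sha256(
        chain[position - 1] + hashlib.sha256(frame_bytes).digest()).digest()
    return chain[position] == expected

pos = register(b"frame 1 pixels")
register(b"frame 2 pixels")
```

As the parent comment notes, passing this check only shows the footage
existed before later entries - a maliciously created fake registered early
passes just as easily as a genuine recording.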

------
wasx
This is... scary. The potential applications of this technology extend far
beyond porn. How long until intelligence agencies are using this sort of
technology to sabotage political opponents?

~~~
XorNot
Low resolution security footage is about 1 year from extinction.

~~~
kevin_thibedeau
Who's putting up the capital to replace millions of low res cameras?

~~~
arca_vorago
Or to build the infrastructure needed to support anything else?

------
braindongle
An interesting angle here is the arms race between manipulation and forensics.
In image forensics, clever people are using clever techniques to keep us
tethered to some notion of authenticity in digital media. Like this guy:
[http://www.cs.dartmouth.edu/farid/downloads/publications/wif...](http://www.cs.dartmouth.edu/farid/downloads/publications/wifs17.pdf)

These emerging video manipulation tools open new frontiers for related
forensics research. In 10 years, when we see a video of someone doing
something horrible, these people are perhaps our only hope of knowing whether
or not what we're seeing ever happened.

------
bob_theslob646
This seems like the beginning of a giant problem. First we had Adobe able to
replicate a user's voice after listening to it for 20 to 40 minutes; now this.

I guess vein scanning is going to happen sooner or later in order for personal
verification.

------
TaylorAlexander
Interesting.

It seems /r/deepfakes (NSFW) is where the content is at (I assume the article
doesn’t link to that, but haven’t checked).

------
herogreen
Maybe one day everyone will be using cameras with a combination of digital
signatures and watermarking technologies.

That said, not being able to definitively classify videos as "faked" or
"original" could help people suffering from revenge porn (or political
manipulation).

~~~
BoiledCabbage
This here seems to be the only possible way forward, other than throwing out
all concept of video evidence.

All video streams (and frames) are signed / watermarked with the source that
recorded them. Video editing now becomes impossible without showing a change
in ownership.

Unfortunately it will be a long number of decades until we can successfully
replace the majority of cameras with these new ones, since society doesn't yet
have reason to understand the need for them.

~~~
mirimir
Strong authentication might protect video in news and court contexts. But I
don't see how it'd help for celebrity porn, revenge porn, and other contexts
where there is no authenticated original. Unless all the originals were
authenticated. But that'd require mandatory authentication on all recording
devices, which is rather frightening from a privacy perspective.

~~~
BoiledCabbage
> But that'd require mandatory authentication on all recording devices, which
> is rather frightening from a privacy perspective.

Yes inherently scary, but is there a solution to still allow a user to
maintain control as to when they give up their privacy?

Ex. a 3rd party escrow service that will authenticate a video belongs to a
user without revealing who the user is? Or some other key sharing scheme that
allows users to decide when they want to take ownership of a whistle-blower
video or some other govt corruption video in a repressive country?

I'm not a public key crypto expert, but I can't imagine this is the first time
user-controlled authentication has been investigated. The requirement would be
that everything is always signed, but (through key chaining, sharing...) a
user gets to decide if they reveal their association to it.

------
dontreact
Between VR and the possibility to train up person-specific porn generators,
porn is only going to become more and more of a super stimulus that is too
hard to resist. Best to try and quit fully now.

------
Koshkin
John C. Wright, in his book _The Golden Age_ , has worked out a pretty
complete picture of the ability to mold one's perception of reality to one's
liking.

------
ProxCoques
It's interesting that the issue of ensuring authenticity might also swing
into censorship: if politicians only issued certificates for images of
themselves doing good things, they could then dismiss images of them doing bad
things as fake because the creator couldn't produce the cert.

~~~
matte_black
You cannot allow the politician to be the same person that issues certs
because then it creates a market for a politician to buy doctored images and
certify them. A neutral unbiased third party must do the certs.

------
Faaak
This technology makes me afraid because it deconstructs many core values I
have.

You can do plenty of things with it. Create fake porn, among other things. But
is it ethical? Is it ethical to create fake porn with a celebrity's face? With
your ex-girlfriend's? What about if you keep it for your own personal use?

But then where are the limits? What about a "teen"? What about fake child
pornography, made from pictures of children you found on the internet, that
would let someone satisfy themselves without causing any harm to others?

I would at first think it's okay if you keep it for private, personal use, but
then that becomes scary, because we've been "trained" to think that child porn
is not okay.

Don't know why this has been downvoted?

~~~
toomanybeersies
If it makes you feel any better (it won't), this kind of thing has been
possible for years.

The only difference is that now instead of using photoshop to make fake
images, we can automate the process, and use it to make video. This isn't a
paradigm shift, it's a change in pace.

~~~
noobermin
A change in pace can create a revolution. That's exactly what the industrial
revolution was.

------
jondubois
This is great for celebrities... From now on, if they get their phones hacked
and get their private videos stolen and shared publicly, they can (plausibly)
claim that it's not real.

------
paul7986
And this is why we need a verified identity system on the Internet.

With such a system in place you will know who created the video, as their
verified identity is attached to it. If there's no verified identity attached,
the video won't hold any weight. Same goes for everything done on the iNet:
you can use it anonymously, where what you say or do doesn't hold much or any
weight, vs. commenting, posting, etc. using your verified ID.

This is just one solution that could help with all the fakery on the iNet and
the mayhem it brings and will continue to bring, but worse.

~~~
Retra
Do you mean digital signatures?

~~~
paul7986
More like a drivers license or govt issued ID card for Internet usage.

You can use the Internet anonymously as you always have, but if you want to
get your point across and ensure the veracity of whatever you're posting,
you'll use your Internet verified ID.

I'm thinking of the case of someone impersonating a public figure: where the
originator of the video is not the public figure, it's immediately labeled
fake news. Also if it's revenge porn and someone uploads it without their
identity attached to it, it's immediately labeled fake.

~~~
Retra
You could always just digitally sign everything you put out. You can do that
today. I don't see how you're proposing anything new or effective.

Besides, what you're proposing is stupid for other reasons. If I have a video
of a police officer murdering someone, you're just going to say it's fake news
unless that officer himself posts it? That's straight bullshit. "Government
Sanctioned Truth" bullshit. The truth about a person or agency is not
determined solely by what they approve of.

~~~
paul7986
A govt-mandated Internet identity system that you invoke when you want to
ensure the veracity of what you filmed and posted. If not invoked, then it's
labeled fake and just created for fun or shits n giggles. It holds zero
weight, and everyone on the Internet knows/understands this system like the
back of their hand.

Again I am thinking of two use cases re: this solution... revenge porn and
celebrity & political figure fake videos.

~~~
Retra
This doesn't solve either of those problems. Nobody is going to ensure revenge
porn veracity. They will be posted online, labeled "fake" and absolutely
nothing will change.

And a political figure can already just deny that the video is real. You're
not preventing it from being posted, so it will be posted, and people aren't
going to believe it? Simply because it wasn't approved by that politician?
Again, it makes no difference. People will believe it's real because _of
course that politician won't approve of an unfavorable message._

Again, things don't become true just because someone approves of them. And
things don't become fake just because nobody involved approves of them. Labels
do not mean anything unless you can guarantee their accuracy. You're really
not thinking this through effectively. And again, digital signatures already
allow all of this to happen.

You're basically asking for an official worldwide propaganda platform, and it
will be no more trustworthy than the existing propaganda platforms.

~~~
paul7986
> This doesn't solve either of those problems. Nobody is going to ensure
> revenge porn veracity. They will be posted online, labeled "fake" and
> absolutely nothing will change.

That's what I am getting at... things change 180 degrees on the Internet,
where no one believes anything that is not posted by someone with their
verified identity. Thus making revenge porn pointless. You even say they will
label it fake, thus no one giving a crap about some fake b.s., and it's
totally and immediately disregarded.

~~~
Retra
>things change 180 degrees on the Internet where no believes anything that is
not posted by someone with their verified identity

No. People will not care what identity you used to post things. Verified or
unverified makes no difference unless the verification is thorough and
performed by a third party. If it is self-done, then it is no different from
how things are currently.

>You even say they will label it fake thus no one giving a crap about some
fake b.s. and it's totally and immediately disregarded.

No. Non sequitur. You label it fake, and people will just disbelieve your
label. They do not necessarily disbelieve the content.

At this point, I don't know what to tell you. Create a startup and put your
money into it. Place a big bet. You can lead a horse to water, but he drowns
himself.

~~~
paul7986
Im talking about an internet that does not exist yet..one where theres a
system in place where every iNet user can spot a fake video vs. a real one in
a heartbeat or fake news vs. real.

Do you not think based off of this Face2Face technology and all the fake news
this evolution needs to happen?

~~~
Retra
You're talking about something so obviously flawed I will not address it
anymore.

------
JetSpiegel
After over 100 years of having a massive professional industry dedicated to
creating fake sequences of images for fun and profit, another decrease in the
capital needed to create films doesn't seem such a revolutionary thing.

Just like people watched that first film of the train moving towards the
camera and freaked out, while today that seems quaint, humans as pattern-
matchers extraordinaires will find a way to discern the fakes.

------
peterjlee
Finding the origin of content is already a challenge these days. A random
idea I have is creating a blockchain where content creators can register their
creations to prove their origin. Also, camera manufacturers could get involved
and build in hardware that signs every picture and video captured by the
camera, which could then be registered to the blockchain.

~~~
rland12
Kodakcoin?

------
sjg007
Well, definitely Black Mirror, but you can now virtually recreate a loved one
who has passed.

------
grinsekatze
...and the “nothing” is claiming all of Phantastica.

------
cobbzilla
"DNA or it didn't happen"

~~~
carapace
You can mail-order custom DNA.

