
AI-Assisted Fake Porn Is Here - gridscomputing
https://motherboard.vice.com/en_us/article/gydydm/gal-gadot-fake-ai-porn
======
iamben
Despite some comments here, I don't think society will adjust to this for a
long time, if ever. We already live in a world where you can create bias with
words on a page or a website, more so if they come via someone with influence.

Imagine an AI generated 5 second clip of Weinstein saying something scandalous
about raping women, or imagine Trump retweeting _anything_. People choose to
believe something that fits with or reinforces how they think - emotion and
reaction is easier than checking if something is true. Even if it's reported
to be a fake after, people would rather question the reporting ("well that's
what he'd want you to believe" / "of course _they 'll_ report it as a fake")
than acknowledge the fake.

I hate to be a harbinger of trouble, but I can see a lot of people having
their lives destroyed with this.

~~~
unholiness
You could have said the same thing about Photoshop. Journalism is as good as
the integrity of the source. In places, you could argue that's systematically
eroding. I don't think you can argue that fake videos are what will bring it
down.

~~~
labster
Journalism is as good as ever. It's easier than ever before to gather facts
and verify sources. The problems faced are external to journalism: a losing
business model, and government saying that journalists aren't trustworthy. But
the integrity of the press is fine. There are fewer yellow outfits than there
were a century ago, even.

~~~
mc32
You can say that but the press (whether because they have no money or they
can't afford to be second with the hot news) is complicit in the perceived
decline and real decline.

It used to be you had to have two independent sources before you took an
individual source's word. Now rumor becomes newsworthy --it's like the
Enquirer won.

They are also owned by big cos (Amazon, Conde Nast, Carlos Slim, Murdoch,
etc.) and have very clear biases. Not that biases didn't exist before (Hearst
papers) but on day-to-day stuff they played it pretty level. Now they go for
the incendiary, whereas before they were focused on big issues --the issues
that mattered to society at large rather than smaller divisive issues. But
divisiveness gets you clicks.

------
rndmize
Contrary to the article's stance that this would be used to create fake
revenge porn or celebrity sex tapes that are taken as real, I'd expect this to
devalue the real thing and give space for people to claim anything with their
face in it is fake.

~~~
ekimekim
We've already seen this with images and Photoshop. Society and its
heuristics of belief will adjust as these new capabilities become widespread.

What's more troubling is that as media becomes falsifiable, solid evidence
of...well, anything, becomes hard to have. The ultimate loser there is the
truth, sadly.

~~~
pjc50
Indeed. The worst use of this will be forged political speeches. What happens
when you can fake any video of anyone saying anything and then feed it to the
social media outrage machine?

~~~
telesilla
As the parent noted, society will adjust expectations about how to determine
facts. When radio first came along, some people were fooled that Orson
Welles's War of the Worlds [1] was actually happening, because it was in the
style of a news broadcast. We learn and adjust how to process data as systems
change around us.

[https://www.smithsonianmag.com/history/infamous-war-worlds-r...](https://www.smithsonianmag.com/history/infamous-war-worlds-radio-broadcast-was-magnificent-fluke-180955180/)

~~~
ACow_Adonis
Yes, but that in no way implies that the method/means by which society adjusts
its behaviours/expectations will be desirable.

One could even hypothesise that social media bubbles are the first step
precisely in that direction.

------
nickparker
This reminds me of a crypto application I've been waiting for someone to build
/ explain why it would suck:

Secure enclave based verifiable recording devices.

So, a camera that uploads a signed hash of every image it takes using mobile
connectivity as soon as it takes it. Or the same for video and audio
recordings.

It's loosely inspired by Town Crier[0]. The notion is that instead of
regarding arbitrary blocks of data as "recordings", we demote those to just
arbitrary blocks of data, and anything you want to claim as verified truth
needs a time-stamped hash proving what device took it, where it was taken,
when, and that it hasn't since been modified.

Disclaimer: I'm the furthest thing from a crypto person, this just seems like
a nice use case to me.

[0]: [http://www.town-crier.org/](http://www.town-crier.org/)
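
The attestation flow described above can be sketched in a few lines. This is a toy illustration only: the device key, the claim's field names, and the use of HMAC are all stand-ins for what a real secure enclave would do with a per-device asymmetric signing key.

```python
import hashlib
import hmac
import json
import time

# Hypothetical symmetric key baked into the device; a real enclave would
# hold an asymmetric private key and publish only the public half.
DEVICE_KEY = b"secret-baked-into-secure-enclave"

def attest_capture(image_bytes, lat, lon):
    """Hash the capture and sign a claim about what/where/when it was taken."""
    claim = json.dumps({
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "lat": lat,
        "lon": lon,
        "captured_at": int(time.time()),
    }, sort_keys=True)
    sig = hmac.new(DEVICE_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return claim, sig

def verify_claim(image_bytes, claim, sig):
    """Check the claim wasn't tampered with and that it matches these bytes."""
    expected = hmac.new(DEVICE_KEY, claim.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # claim was forged or altered
    return json.loads(claim)["sha256"] == hashlib.sha256(image_bytes).hexdigest()
```

A verifier that trusts the device key can then confirm both that the claim is untampered and that the image bytes match the hash it commits to; any edit to the pixels breaks the hash.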

~~~
acdha
At least Nikon has offered a similar idea for a while:

[http://imaging.nikon.com/lineup/software/img_auth/index.htm](http://imaging.nikon.com/lineup/software/img_auth/index.htm)

The catch is that it has been hacked before:

[https://blog.elcomsoft.com/2011/04/nikon-image-authenticatio...](https://blog.elcomsoft.com/2011/04/nikon-image-authentication-system-compromised/)

The Secure Enclave helps a lot but the Nikon hack illustrates a hazard of
needing devices to be updated, especially since there’s an obvious opening for
someone to use a plausible old vulnerable device – and it seems like, say, a
compromising photo of a politician would spread a lot further and faster than
the observation that the older phone has been compromised.

One other challenge is that you might be able to prove that a device took the
photo but not that, say, it was a real scene rather than a high quality print
or that the time stamp wasn’t subject to a manually set clock. Again, I’d
expect countermeasures but worry that nobody would see them in time for the
most damaging cases.

~~~
nickparker
The time stamp is handled on receipt of the hash by third parties in my mind,
not by the originating device. Ideally the position component of the data
would also be verified by 3rd parties, eg cell towers receiving the data add
their own signatures so you know it was at least in the region it claims.

The high quality prints problem is more interesting. I’d hope that sort of
trick would be detectable by inspection, but then we’re back to cat/mouse
games with AI.

If we all switched to crypto verified light field photography, we could at
least force hoaxsters to build dioramas :)
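
The countersigning idea can be sketched similarly (field names are hypothetical, and HMAC again stands in for the asymmetric signature a real relay would hold):

```python
import hashlib
import hmac
import json
import time

# Hypothetical key held by the relay (e.g. a cell tower).
TOWER_KEY = b"relay-signing-key"

def countersign(device_claim, device_sig, tower_id):
    """Wrap the device's claim with the relay's own receipt time and
    identity, then sign the whole bundle, vouching for when and roughly
    where it was received."""
    receipt = json.dumps({
        "claim": device_claim,
        "device_sig": device_sig,
        "received_at": int(time.time()),
        "tower": tower_id,
    }, sort_keys=True)
    sig = hmac.new(TOWER_KEY, receipt.encode(), hashlib.sha256).hexdigest()
    return receipt, sig

def verify_receipt(receipt, sig):
    """Anyone trusting the relay's key can check the bundle is intact."""
    expected = hmac.new(TOWER_KEY, receipt.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Each hop that countersigns adds an independent witness, so a forger would need to compromise the device and every relay along the path.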

~~~
acdha
One risk: all of that could make a forgery more persuasive. Imagine some James
O’Keefe style operative preparing a fake but not uploading it until they drive
by the victim’s house, rent a room in the same hotel or near their office,
etc. How many people would see the extra metadata and decide it proved
accuracy?

------
rflrob
There's also a fascinating RadioLab piece from a few months ago about this
kind of thing:
[http://futureoffakenews.com/videos.html](http://futureoffakenews.com/videos.html)

It's remarkable to me how little the ML researcher they interview in that
episode seems willing to grapple with the consequences of the technology.

~~~
sushisource
I heard that piece, but I read it less as "unwilling to grapple" and more
"resigned to the fact that it is inevitable the technology will be built, so
might as well enjoy myself building it".

It would feel a little grating to me as the interviewee to have the
interviewer keep pressing me on these questions that, while totally relevant,
are obviously designed to make me have to "answer for what I'm doing" in some
Frankenstein-esque sense. Yeah, I get it, it's scary, but this train left the
station a long, long time ago and probably I'm not (even as someone working on
the tech) gonna have a lot to say about the downsides that haven't already
been said.

~~~
anigbrowl
With great power comes great responsibility. If you want to make money or
social capital out of providing some innovation, you have some obligation to
think about the negative externalities that will result and come up with a way
to minimize them.

~~~
tree_of_item
Should the people who discovered a way to start fires have come up with a way
to prevent arson? Could they?

~~~
dogecoinbase
If structures had existed at the time, sure.

~~~
tree_of_item
And how would they do this? Should they have tried to keep fire secret until
this problem was solved?

------
dogma1138
Not exactly a new concept, but it's likely extremely cheap now and will only
get cheaper and better.

With VR porn, the new sex toys for men and women and now this I’m somewhat
happy that this wasn’t around when I was a teenager as I’m really not
confident I would have left the room if that was available.

~~~
ShabbosGoy
Reminds me of “I Dated a Robot” from Futurama.

~~~
dogma1138
More like the rick and morty episode with the sexbot...

"I Dated a Robot" actually has dating in it.... if we had a human or near
human AI datebot that would be quite different since whilst you won't be
participating in the continuation of the human species at least at that time
you would also not be losing any social interaction skills.

This could be nothing, and it could be the same thing as the rage against
Hustler in the 70's and internet porn in the mid 90's, but for someone who
grew up with every nasty thing available, when I tried one of the higher-end
recent VR porn productions on my Oculus, the feeling you get from it is quite
different.

I tried VR porn as a joke 1-2 years ago when the Google VR stuff started, and
back then it was pretty garbage: mostly just barrel projection of a standard
porn scene that wasn't shot with VR in mind. The new stuff is shot with proper
cameras and uses 3D audio, and it's just creepy how convincing it has gotten,
despite the major drawbacks of current VR headsets.

------
Mistri
1967: "I bet in 50 years we'll have flying cars!"

2017: AI-Assisted Fake Porn Is Here And We're All Fucked

------
hallman76
In a similar vein, this video[1] by a visual artist explores the impact that
modern CG is having on cinema and art. It's 14 minutes, but if you're into
movies or modern art you'll dig it.

[1] [https://vimeo.com/237568588](https://vimeo.com/237568588)

------
djroomba
Truth is going to be destroyed in the AI hyper reality.

We are going to need technological solutions to validate against ai forgeries
for our legal system to survive.

~~~
GuB-42
We already have such tools. This is an example:
[http://fotoforensics.com/](http://fotoforensics.com/)

One issue is that the same algorithms that can be used to differentiate
between real and fake can also be used to train AIs to make better fakes. But
here, the fake detector has the advantage in the arms race.

Another approach is to use cryptographic signatures. Some cameras can sign
pictures to ensure they are original.

~~~
breakingcups
I can point the camera at a high-resolution screen or printed out photo
though.

~~~
GuB-42
I don't think it would work. A 4K screen has 8MP; that's less than a cell
phone camera. The artefacts would likely be obvious to those who know where to
look.

Printing may be a better option with regard to resolution, but because the
print color gamut is different, simple analysis can probably give away the
trick easily.

Furthermore, assuming the EXIF information is signed along with the rest of
the picture, you also have to make all the metadata match the picture. A sharp
picture in bright daylight is not consistent with a slow shutter speed, for
instance.
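
That kind of consistency check is mechanical. As a toy example using the standard exposure-value formula (the daylight EV range is the usual "sunny 16" ballpark; the exact thresholds are illustrative, not calibrated):

```python
import math

def exposure_value(aperture_f, shutter_s, iso):
    """EV at ISO 100 from the exposure triangle (standard photographic
    formula: EV = log2(N^2 / t) - log2(ISO / 100))."""
    return math.log2(aperture_f ** 2 / shutter_s) - math.log2(iso / 100)

def plausible_daylight(aperture_f, shutter_s, iso):
    """Sunny outdoor scenes sit around EV 14-16 (the "sunny 16" rule), so
    a bright daylight shot claiming a slow shutter fails this check."""
    return 13 <= exposure_value(aperture_f, shutter_s, iso) <= 17
```

For example, f/16 at 1/100s and ISO 100 is a textbook daylight exposure and passes, while f/2.8 at a full 1-second shutter claims far too much light for a daylight scene and fails.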

------
emmelaich
I see a danger in false propaganda rather than any privacy issue.

Enough people are deceived by photoshopped pictures. Fake video will deceive a
whole lot more.

------
ravenstine
This is the best headline I've ever read.

------
JoeAltmaier
I'm imagining the value for making actual theatre movies will be a bigger
issue. You can have Humphrey Bogart in a science fiction thriller! We'll have
to come up with new rules.

~~~
amelius
Hey, this could actually be awesome. Everybody could make their own preferred
version of the movie.

Movie studios could get away with legal problems just by releasing videos
without the faces, and letting the audience download faces through gray market
channels.

~~~
sudouser
Star Wars XVII, acted by:

Adam Sandler!

~~~
qbrass
It's just Happy Gilmore with lightsabers.

------
wruza
>”It kind of shows how some men basically only see women as objects that they
can manipulate and be forced to do anything they want... It just shows a
complete lack of respect for the porn performers in the movie, and also the
female actresses”

This statement confuses me in many ways. Can anyone explain the exact
logic behind it? Specifically, the "forced to do" and "lack of respect" parts.

How does this, and its implications, seem to you?

------
dillondoyle
I've been thinking a lot about trust, reality, and sourcing. I wonder if we'll
move towards cryptographically signing content and facts so we have a
verification of legitimacy.

Maybe a blockchain single source of truth, using tiered identity verification
where one can accumulate identifiers and use this ID to sign content
(biometrics, other verified IDs vouching for your verification, etc.; higher
score = less risk of fake). Seems like these things all already exist; the big
problems would be putting it together, scale, and easy clarity. For instance,
making it so my mom's browser or email automatically verifies signed content
(and, maybe more important, denotes unverified or verified-false content) and
clearly presents this, like the green lock icon in web browsers.

I wonder if something like this could be extended to verify beyond like an MD5
checksum saying this email or photo is real - i.e. solving problems like 'fake
news.'

~~~
cuckcuckspruce
Can't wait to tell my (future, 10-15 years from now or so) kids stories about
the Internet from when I was a kid and have them ask, "daddy, is it really
true that you could post on the internet without giving a blood sample,
retinal scan, and fingerprints to your ISP and didn't have to tie your real
name and address to all of your comments?".

~~~
dillondoyle
Lol. I refuse to use the face scanner on my new iPhone for similar ideological
reasons. But we also already do this across platforms: TW verified,
banks/Coinbase verification using IDs etc.

~~~
cuckcuckspruce
There's a difference between having some platforms requiring personal
information and having Internet access predicated on everyone knowing your
identity.

For example, I want my bank to know my email address, phone number, and
physical address because I want to receive fraud notifications, to make
telephone banking transactions, and to receive paper statements.

I do not want forums that I shitpost on to have any of that information, nor
do I want it to be written into law that any service I access over the
Internet is required to ask for and log that information.

------
natecavanaugh
I would imagine in ~5 years this will primarily be used in apps for the
purpose of vanity clips (à la Snapchat, MASQRD, etc.) and otherwise trivialized,
as well as probably some very profound and moving applications (think of a
deceased loved one brought to life to deliver heartfelt messages, dissidents
using a dictator's face to deliver messages to his troops, or even uses we
can't begin to fathom until it's so easy that anyone with an iPhone can mask
themselves with someone else's picture).

All of this, IMHO, will shape our general cynicism and distrust of recorded
evidence, which will lead the way to us finding other heuristics to sniff out
bs. I think this is an overall net gain, but I do agree there will be some
people whose lives are destroyed by what this enables, which really sucks, to
put it lightly.

The upside, at least, is that we already have people looking to find ways to
deal with this right now.

------
Scarblac
So, soon men will start editing themselves into videos as the male actor, for
private use.

~~~
PeterisP
Men will start paying for a service that will create custom porn by swapping
in the object of their fantasies. The only difficulty with choosing your hot
neighbor instead of someone popular will be that the service will have to
fetch their face from their facebook profile.

~~~
xster
They can sell a LinkedIn Premium equivalent service.

X people used your face as a swap in for a pornstar. Y$ a month and you can
see which state they're from. Z$ to meet the top 10 reviewed individuals near
your zip code.

~~~
Scarblac
This guy is really into you! Except he adjusted the width of your hips,
decreased your breast size, got rid of the tattoos and went for mixed Nigerian
/ Inuit race, the latest fad. This is only 65% of the typical customization
level!

Send a message? [Y/n/generate] Adjust profile picture to his preferences?
[Y/n/half way there]

------
AdeptusAquinas
While the objectification is bad, I think the issue of determining truth from
lies should be solvable: public key cryptography seems simple enough.

Have every public figure have their own key pair, and each official production
have its own key pair. Then publish a signed hash or similar for each
production the figure chooses to appear in. Any production without such a
signature or with one that can't be verified is probably fake.

Would be a big shift in process, and would only happen if this sort of thing
becomes a serious problem, but once it did such a scheme would stop it in its
tracks.
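
A minimal sketch of the registry side of such a scheme (bare hashes stand in for the signed hashes a real deployment would publish under each figure's key pair; the names are hypothetical):

```python
import hashlib

# Hypothetical public registry: figure -> hashes of productions they have
# endorsed. In a real scheme each entry would be an asymmetric signature
# verifiable against the figure's published public key, not a bare hash.
registry = {}

def endorse(figure, video_bytes):
    """The figure (or their agent) publishes a hash for an official production."""
    registry.setdefault(figure, set()).add(
        hashlib.sha256(video_bytes).hexdigest())

def presumed_authentic(figure, video_bytes):
    """Anything not in the registry is presumed fake, per the scheme above."""
    return hashlib.sha256(video_bytes).hexdigest() in registry.get(figure, set())
```

The burden of proof flips: a video circulating without a matching registry entry carries no presumption of authenticity, which is the whole point of the scheme.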

~~~
azinman2
Just make sure that key rotation works, no one ever loses their credentials,
etc.

Also — do you really need a signed signature to verify a celeb isn’t actually
in a porn? It’s not like they’d sign that anyway...

~~~
AdeptusAquinas
More about the wider implications: faked interviews or public statements etc.

------
clickok
I am disappointed in the level of fear-mongering and pearl-clutching; it
could[0] get so much worse!

First, the person creating the fakes is using an off-the-shelf algorithm[1],
adapted accordingly. A system designed purely for the purpose of swapping
faces (or morphing a particular face into others, one-to-many style) would
obviously yield improvements.

Secondly, it's the work of one person without commercial backing[2]. A studio
that invested in lidar or used multiple cameras would be able to incorporate
volumetric information.

Third, it's limited to individuals where you have a performer that's an
approximate body double for the target in question. Why involve real people at
all? That limits the kinds of scenes you'll be able to shoot, and it's clearly
a production bottleneck besides. Since it's feasible to create realistic
animations of locomotion via deep reinforcement learning[3], applying the same
technique would remove the need for human actors.

With that in mind, we can invent scenarios substantially more terrifying, for
which this is but a teaser trailer.

As computation becomes cheaper and AI knowledge more widespread, the delay
between when something becomes _possible_ and when it becomes _ubiquitous_
grows ever shorter. Once custom pornography is just another something-as-a-
service, the question becomes how do we optimize it? IoT sensors and wearables
can provide the data needed to customize an experience tailored to your
particular tastes on a moment-by-moment basis. Facebook is sinister, but it's
sinister in the sense of a malevolent external force with inscrutable goals;
you can defeat it by not using Facebook. The idea of a machine that can
produce something more compelling than most stimuli available in the external
world is scarier, because all it would be doing is giving you exactly what you
want.

Okay, so the riff on wireheading in the style of _Infinite Jest_ might not be
adequately terrifying. How about something more realistic, like character
assassination campaigns? We've seen how intimations of sexual impropriety can
damage a politician's electoral chances or outright cause them to resign.
Sometimes, the thing that allows them to hold on to office is the fact that
there's no convincing proof beyond circumstantial evidence. However, if you've
already primed the public with rumors, a fake sex tape might be just the
thing to push past the tipping point. Imagine: a future where an individual
could effect a bloodless coup using nothing more than a Twitter botnet to
spread slander and a few GPUs to generate scandalous footage.

No? That's perhaps more in the vein of William Gibson. The last thought that
came to mind was along the lines of "the future is weird and terrible, yet
somehow boring and familiar" _a la_ Neal Stephenson. The redeeming value of
porn is that there's at least some intentionality behind it. Real people had
to be involved, from the actors to the crew to the guy who wrote the script in
traffic on the way to the shoot. The future will be endless porn spam, half-
coherent plots generated by LSTM-char, just glitchy enough to be discernible
as fake, yet seemingly unavoidable due to the sheer quantity made possible by
automating the production pipeline.

\------

0\. And when it comes to AI, "could" should be read as "will" but with an
indeterminate time frame.

1\. Apparently, the face-swapping is accomplished following "Unsupervised
Image-to-Image Translation Networks" (paper:
[https://arxiv.org/abs/1703.00848](https://arxiv.org/abs/1703.00848) video:
[https://www.youtube.com/watch?v=Bwq7BmQ1Vbc](https://www.youtube.com/watch?v=Bwq7BmQ1Vbc)),
with the face detection provided by FasterRCNN.

2\. The reddit account of the artiste in question:
[https://www.reddit.com/user/deepfakes](https://www.reddit.com/user/deepfakes)
(not exactly safe for work, but not particularly unsafe, either).

3\. [http://www.cs.ubc.ca/~van/papers/2017-TOG-deepLoco/](http://www.cs.ubc.ca/~van/papers/2017-TOG-deepLoco/)

~~~
fancyPantsZero
> How about something more realistic, like character assassination campaigns?

if fake porn does become ubiquitous as you argue, wouldn't the impact of such
assassination be greatly diminished?

~~~
acdha
Eventually, but consider how much long-term damage will be done before the
people who were voting when the technology first became commonplace age out of
the electorate. Think about Brexit and the US 2016 elections, both decided
largely by older voters who believed factually untrue but widely repeated
claims – there's no easy way to reverse something like leaving the EU or
failing to deal with climate change, even if a decade later the average voter
is considerably more savvy.

------
notadoc
"Fake news" is about to get a whole lot worse.

------
amelius
I already thought about this possibility a decade ago.

Anyway, the world is going to be a strange place. I'm sure that soon, nobody
will believe anything they didn't see with their own eyes.

~~~
jhayward
It seems likely we are only one or two tech breakthroughs from not even being
able to trust our own eyes. It's going to get strange.

~~~
wruza
Chances are we overestimate the completeness of the information coming from
our own eyes, except in obvious extreme cases. People, actions, and events can
and do look like something other than what they really are.

I mean, with the fall of eye-trust we will hardly lose anything important;
it's only perception bias.

------
Avshalom
For many of you, this will be much less breeding. For me, much, much more.

