
Family fun with deepfakes - robinhouston
http://svencharleer.com/blog/2018/02/02/family-fun-with-deepfakes-or-how-i-got-my-wife-onto-the-tonight-show/
======
digitalsushi
Whatever you think about this, it's here to stay - the code is open source and
cached on a million hard drives already.

If we assume the normal rate of progression, we'll see a year-2023 version for
5 bucks on the App Store: your phone will live-convert the video you're
recording at your drinking party with your friends into an all-celebrity
event. Probably with voice swap. Your drunken roommate's scratchy voice will
be ML'd into Frank Sinatra, autotuned, face-swapped, and uploaded for a few
laughs to Facebook.

It's here to stay whether you like it or not.

~~~
zodPod
This! One of the most ridiculous notions in articles about this when it first
started coming out was "We should ban this technology!", and reddit started
deleting threads that contained links to download the scripts and apps. Do
these people still really not understand that's not how this works? Has the
Streisand effect not been proven enough yet for the general populace?

~~~
slg
By that logic why should we ban any technology? Everyone here is unanimously
against child porn even though we understand that it is already out there and
that there is no way to put a complete stop to it. How are deepfakes that
different?

The truth is this technology is dangerous in the wrong hands. I don't know if
that means there needs to be laws created regarding it, but I know that the
difficulty in stopping it shouldn't be used as an excuse to not even debate
that possibility.

~~~
cortesoft
So we ban child porn to try to curb the production of more of it. Its
production is what is harmful, and we do everything we can to reduce its
production, which involves trying to reduce the demand by making it illegal
and arresting people who obtain it. It has no legitimate use, so there is no
collateral damage when we ban it.

Banning this technology would be more like banning cameras, since they can
also be used to create child porn. However, it would be even LESS effective
than banning cameras, because we are talking about a software algorithm.

Also, you say this technology is 'dangerous in the wrong hands.' Why is that
the case?

I think the only reason is that it has the possibility of deceiving people
into thinking it is real. The real way to reduce this risk is to either spread
the idea that all video is suspect, and you can't assume who you see on video
is who it appears to be, or to perfect technology that can instantly detect
when this system has been used.

If you are able to easily distinguish between true and false video, isn't the
entire danger mitigated?

~~~
slg
>So we ban child porn to try to curb the production of more of it. Its
production is what is harmful,

We also ban computer generated child porn even though there is no victim
involved in the production of that. If you use this deepfake technology and
stick an 18 year old's face on a porn video it is legal but if you do the same
thing with a 17 year old it isn't. The production of those two videos is the
same. The only difference is the potential victim if the video became public.

>The real way to reduce this risk is to either spread the idea that all video
is suspect, and you can't assume who you see on video is who it appears to be,
or to perfect technology that can instantly detect when this system has been
used.

That sounds great in theory, but people don't work like that. We are built to
believe what we see. We all know that almost every magazine cover has been
mercilessly Photoshopped and yet the images on those covers have still been
shown to have a strong influence on people's own body image.

~~~
kuschku
> If you use this deepfake technology and stick an 18 year old's face on a
> porn video it is legal but if you do the same thing with a 17 year old it
> isn't. The production of those two videos is the same. The only difference
> is the potential victim

For reference, artificially created child porn (e.g. in mangas, or created in
other ways) is in a legal grey zone in many areas, including Japan and
Germany. It is absolutely illegal in others, e.g. Sweden.

~~~
jstarfish
> For reference, artificially created child porn (e.g. in mangas, or in other
> ways created) is in a legal grey zone in many areas,

No. The grayness only applies to non-photorealistic works.

Once you start making photorealistic deepfakes or diptychs, where the end
result appears to be photographic, the color becomes much more black and white
the world over.

~~~
kuschku
In some jurisdictions the grayness also applies to the "victimless crime"
part, because no person was harmed during creation of the video.

But yes, it's a much more complex legal case.

------
dubin
To add a data point, a friend and I made one of these over the weekend (with
Trump and Michael Scott: [https://youtu.be/0Rexuh-VY6E](https://youtu.be/0Rexuh-VY6E))

It was shockingly easy to do. We're technical, but neither of us know a lick
about machine learning. It took a couple hours to collect training data (we
turned speeches/interviews from each of them into thousands of photos), and 20
hours to train the model.

In the future you could automate the data collection for a person even more,
to the point where they just need to film a selfie video of themselves for a
couple of minutes in a couple of different lighting conditions, and boom,
after you train their decoder, you could put them on any celebrity.

EDIT: This is what we used
[https://github.com/deepfakes/faceswap](https://github.com/deepfakes/faceswap)
Would be happy to walk anyone through how to do this
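
The data-collection step described above (video in, thousands of face stills
out) mostly comes down to sampling frames. Here is a rough sketch of just the
sampling logic; the `sample_timestamps` helper is hypothetical, not part of
the faceswap repo, and in practice you'd hand the timestamps to ffmpeg or
OpenCV for the actual extraction:

```python
# Sketch of the data-collection step: pick evenly spaced timestamps
# from a clip so that frame extraction (e.g. via ffmpeg) yields a
# fixed number of training stills. Names and parameters here are
# illustrative, not part of the faceswap project itself.

def sample_timestamps(duration_s: float, n_frames: int) -> list[float]:
    """Return n_frames evenly spaced timestamps across the clip."""
    if n_frames <= 0 or duration_s <= 0:
        return []
    step = duration_s / n_frames
    # Sample at the midpoint of each interval to avoid t=0 and t=end.
    return [round(step * (i + 0.5), 3) for i in range(n_frames)]

# A one-hour interview sampled every couple of seconds gives thousands
# of candidate stills, which is roughly the scale described above.
stamps = sample_timestamps(duration_s=3600.0, n_frames=2000)
print(len(stamps), stamps[0])  # 2000 0.9
```

After extraction you'd still run a face detector over the stills to crop and
align the faces, which is the part the faceswap tooling automates.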

~~~
michaelbuckbee
A question for the lawyers: If I build a model of a celebrity, do they have
copyright on that? Could I sell it?

~~~
some-guy
IANAL, but Crispin Glover sued the creators of Back to the Future II for using
a mold of his face as George McFly without permission [1]. While he starred in
the first film, the second film used a mold of his face since he ended up not
joining the project.

[1] [https://www.hollywoodreporter.com/thr-esq/back-future-ii-a-legal-833705](https://www.hollywoodreporter.com/thr-esq/back-future-ii-a-legal-833705)

------
ploggingdev
Discord banned multiple servers that r/deepfakes created and their reasoning
was that it's being used for creating involuntary porn and so is against their
ToS. I'm curious to hear your thoughts on the legal status of deepfakes as
well as the moral and ethical questions behind the usage of the tech.

A thought experiment : I run a chat room site and will be adding support for
user created chat rooms (something like discord for cryptocurrency communities
with real time price widgets etc). Let's say I went ahead and got them to use
my chat room, what would the consequences be? And what do you think about the
moral and ethical questions behind hosting such a community? Am I now enabling
the creation of deepfakes? By choosing to avoid hosting such a controversial
community, am I imposing my moral and ethical reasoning on others (the free
speech argument)? What do you think about the deepfakes project, which is
mostly being used for creating celebrity porn, even though the tech itself is
interesting and has applications (targeted ads, stunt doubles, etc.)?

To be clear, I'm not trying to host a chat room for them, this is just a
thought experiment. I don't like what it's being used for and don't want to
support it in any way.

~~~
maxerickson
Using other people's visage without their permission is gross. It's immoral
and unethical.

The idea that you can have morals and ethics without sometimes acting on them
is also sort of incoherent. If you claim to believe it is immoral to do
something and then do business in spite of that, it directly puts the lie to
your claim. This is separate from whether you believe that there should be
legal consequences or other government action attached to certain acts.

~~~
runeks
> Using other people's visage without their permission is gross. It's immoral
> and unethical.

This seems a bit extreme to me. In my view, the line is crossed when it’s used
for fraud, i.e. misrepresenting the truth (for whatever purpose).

Face-swapping Daniel Craig’s face onto my body, and showing it to my friends,
is neither immoral nor unethical — it’s actually kind of funny.

~~~
maxerickson
I think publicizing a non celebrity is problematic even if there isn't any
misrepresentation.

It can be fine in a news context, but there's not a lot of faked videos that
would be appropriate in that context.

I even thought the faked Fred Astaire ad was in poor taste.

[http://www.criticalcommons.org/Members/kellimarshall/clips/A...](http://www.criticalcommons.org/Members/kellimarshall/clips/Astaire_DirtDevil2.mp4/view)

------
fredley
By the time the next US election comes around it's very likely this
technology, combined with Lyrebird etc. will allow anyone to make [politician]
convincingly say or do anything they like.

I can't make my mind up if this will make the current fake news / filter
bubble situation worse, or if it will make it so bad that people learn to
distrust any kind of digital media entirely.

I'm still not sure, given that the use of these tools for political gain seems
inevitable at this point, how we will actually verify a video is real in the
near future.

~~~
tboyd47
We need to start thinking about what criminal justice will look like when
digital recordings are all considered hearsay. That's bound to happen.

And perhaps it should... if average people have the technology now, how long
has it been in the hands of police departments, newspapers, militaries, and
intelligence agencies?

We tend to focus on fakes with high-profile people like politicians and
celebrities, but it can clearly be done with everyday people as well. The
author was able to produce a fake video segment of a well-lit, animated
conversation involving his wife. She's directly in front of the camera,
speaking and making hand motions for several seconds. How much easier would it
be to insert a face into grainy security camera footage, or a shaky clip of a
large crowd at a political rally?

~~~
aerovistae
> And perhaps it should... if average people have the technology now, how long
> has it been in the hands of police departments, newspapers, militaries, and
> intelligence agencies?

To be honest with you, when I was younger I bought into this.

Nowadays... it kinda seems like, with the open source community, consumers are
getting stuff first. Not always, of course, but often enough.

And police departments are one category that I can be quite certain are
lagging behind, they are _definitely_ not getting stuff before consumers have
it.

~~~
jessaustin
You're not talking about a monolith. Sure, if you've met 20 cops, it's
unlikely that you've met any cops who are technological geniuses. Most LEOs
haven't had a reason compelling enough to figure out how to fake a video. Can
we be sure that _no_ LEOs have? If we can be sure of that, can we also be sure
that no intelligence agencies have? That seems unlikely.

~~~
pixl97
Or to say it another way.

Only a few cops will have the ability to edit videos; the problem is that
almost all the cops in their department will lie for that officer on the stand.

------
taoistextremist
In the realm of innocent fun with this stuff, just how much time would it take
to insert myself into the entirety of a movie? I'm just imagining how fun it
would be to invite friends around to watch something we're all excited about,
having convincingly swapped my face onto the main character's.

~~~
tomaskafka
Or, once the ad industry learns about it, how about replacing all the things
on the table in an Audrey Hepburn movie with BRAND NEW LEMONMELON FLAVOURED
OREOS! :)

Ideally on the fly, with each box sold to the highest bidder, adsense-style.

~~~
pjc50
I've already encountered a minor version of this with the film "Demolition
Man": when I first saw it on UK TV, it mentioned Pizza Hut at one point, but
more recently it was on again and that line was overdubbed _really badly_ to
say "Taco Bell".

It's a weird thing to watch a film turn out differently from how you
remembered it when you're not told it's a different cut. Feels like low-grade
gaslighting.

(Another example is the translation of "天下" (Tianxia) in the film Hero,
which is both critical to the film and translated differently in different
versions.)

~~~
jayrobin
The original version was Taco Bell - Pizza Hut was dubbed over in
international releases (the latter has a much greater international presence).

------
algorithmsRcool
I am reminded of Ghost in the Shell: SAC, where they mention that video and
photographic evidence became inadmissible in court due to how easy it was to
produce fakes.

I think we are perhaps 1-2 years away from this becoming a practical
consideration for the judicial system.

~~~
knodi123
I think the reality is that there's a burgeoning opportunity for
cryptographically secure chain-of-evidence cameras. Sure, no protection is
perfect, but if a manufacturer can reliably state "It will require [X] level
of resources and effort to falsify data in our camera", then a lawyer can
present footage and ask the jury to draw their own conclusion about whether a
criminal's claims of "fake footage!" are reasonable.
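
A minimal sketch of what such a chain-of-evidence scheme could look like (an
HMAC hash chain stands in for the hardware-backed asymmetric signatures a real
camera would need; the key name and functions here are purely illustrative):

```python
import hashlib
import hmac

# Toy chain-of-evidence sketch: each frame's tag commits to the frame
# bytes AND the previous tag, so reordering, dropping, or editing any
# frame invalidates every later tag. A real camera would sign with an
# asymmetric key sealed in a secure element; HMAC with a device secret
# stands in here purely for illustration.

DEVICE_KEY = b"sealed-in-secure-element"  # hypothetical device secret

def chain_tags(frames: list[bytes]) -> list[bytes]:
    tag = b"\x00" * 32  # genesis value
    tags = []
    for frame in frames:
        tag = hmac.new(DEVICE_KEY, tag + hashlib.sha256(frame).digest(),
                       hashlib.sha256).digest()
        tags.append(tag)
    return tags

def verify(frames: list[bytes], tags: list[bytes]) -> bool:
    return hmac.compare_digest(b"".join(chain_tags(frames)), b"".join(tags))

original = [b"frame1", b"frame2", b"frame3"]
tags = chain_tags(original)
print(verify(original, tags))                           # True
print(verify([b"frame1", b"FAKED!", b"frame3"], tags))  # False
```

The scheme is only as strong as the key storage, which is exactly the "[X]
level of resources to falsify" figure a manufacturer would have to stand
behind in court.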

~~~
pixl97
So everybody is going to go out and buy new phones and cameras?

Also, using Meltdown and Spectre as examples, can these companies actually
build cryptographically secure devices? It's easy for a manufacturer to say
"Our device is secure", and it's easy for the judge to say "You go to jail for
life", but it creates a big mess in the legal system when we realize 3 years
later that someone extracted the private key from a device, framed somebody,
and that person had been put in the electric chair between now and then.

~~~
knodi123
> So everybody is going to go out and buy new phones and cameras?

Probably not. That sounds a little ridiculous. But probably so in the case of
security cameras or bodycams or dashcams.

> but it creates a big mess in the legal system when we realize 3 years later
> that someone extracted the private key from a device, framed somebody,

Yeah, I covered that. Did you read my second sentence, or only the first one?

Evidence can _already_ be faked. Eye-witness testimony has been faked for
millennia. Photographic evidence started being faked shortly after the
invention of the camera. What you do is make sure the jury knows how hard it
would be to fake, and let them make an informed decision.

Does that mean it's technically possible to frame someone? Yes! Just like it
is today. Sucks to live in an imperfect world.

> and that person had been put in the electric chair between now and then

That's a valid argument against the death penalty. But that's a different
subject entirely.

------
ghostcluster
Is it just me or are these "deep fakes" just not even very convincing?

I felt the same way about all the Nicolas Cage ones that went viral recently.
It's so obviously a very shallow 'effect' and can't compensate for very basic
differences in bone structure, hair, etc.

~~~
ghostbrainalpha
Are you talking about the pornography being unconvincing or the GIF from this
article?

Fake: [http://svencharleer.com/blog/wp-content/uploads/2018/02/anne-1.gif](http://svencharleer.com/blog/wp-content/uploads/2018/02/anne-1.gif)

Original: [http://svencharleer.com/blog/wp-content/uploads/2018/02/elke-1.gif](http://svencharleer.com/blog/wp-content/uploads/2018/02/elke-1.gif)

I could see the fake being easy to detect at high resolution, but personally I
have a 0% chance of catching this gif as fake.

~~~
AH4oFVbPT4f8
You have the Fake and Original mixed up.

elke-1.gif is the fake; anne-1.gif is the original.

------
ChuckMcM
It makes the use of stunt doubles easier as well since the actor's face can be
mapped to the body double. And it isn't like this point in time wasn't
predicted by just about everyone in CS. Given enough computer power and the
right algorithm pretty much any data is fungible, video or otherwise.

But we've just entered the most dangerous period, when most people still
believe what they see in a video is what was originally recorded. This is when
people will use this technology for deception and get away with it, like they
did with Photoshop before 'shopped!' came into the general population's
consciousness.

~~~
HorizonXP
A cool possible solution would be to have every participant in the video
digitally sign the final file to verify that they're actually in the video,
consensually. That way, if anyone finds a deepfake video, it would be easy to
dispute it, since it doesn't have the person's signature.

Of course, this would be exceedingly difficult in practice, since you'd need
the signing to be opt-out instead of opt-in for it to be truly effective.
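
As a sketch, the consent-signature check might look like this (a real scheme
would use public-key signatures such as Ed25519 so anyone can verify; here a
per-person HMAC key simulates each signature, and all names are hypothetical):

```python
import hashlib
import hmac

# Sketch of the consent-signature idea: a video only counts as
# "verified" if every person appearing in it has signed its hash.
# Per-person HMAC keys stand in for real public-key signatures,
# purely for illustration.

def sign(person_key: bytes, video_hash: bytes) -> bytes:
    return hmac.new(person_key, video_hash, hashlib.sha256).digest()

def all_consented(video_bytes: bytes,
                  participants: dict[str, bytes],
                  signatures: dict[str, bytes]) -> bool:
    """True only if every participant validly signed this exact video."""
    h = hashlib.sha256(video_bytes).digest()
    return all(
        name in signatures
        and hmac.compare_digest(signatures[name], sign(key, h))
        for name, key in participants.items()
    )

keys = {"alice": b"alice-key", "bob": b"bob-key"}
video = b"...video bytes..."
sigs = {n: sign(k, hashlib.sha256(video).digest()) for n, k in keys.items()}

print(all_consented(video, keys, sigs))                   # True
print(all_consented(b"...doctored bytes...", keys, sigs)) # False
```

This illustrates the opt-out problem mentioned above: a deepfake simply ships
with no signatures at all, so the scheme only helps if unsigned footage is
treated as suspect by default.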

------
acmecorps
So, what's the difference between this and photoshopping someone's face to
create fakes? Both are fundamentally the same thing, right? I understand the
technological significance of it, but not the philosophical/legal etc.
differences.

~~~
Myrmornis
It’s a good question. Is the answer that to a lot of people video is their
reality, whereas static images are some crusty technology from the decades
before they were born?

------
bloopernova
For those of us around when the optic nerve implants are available, it's going
to be a wild time.

You'll be able to experience your own subjective reality, completely
differently from anyone else.

Want to replace the grocery store with a Minecraft dungeon? Sure! Want to see
dragons soaring in the sky above? Sure! Want to pretend your self-driving
motorcycle is a speeder bike from Star Wars? Sure!

~~~
blacksite_
Psychedelics _kind of_ help you get there currently. Depends on where you're
looking to go, I suppose.

------
randyrand
In these discussions, it's easy to think about the person harmed by a (known
to be) fake video of them having sex.

I think it's odd that we purposely ignore the joy created for the other side
(thousands or millions of people). On a utilitarian view, we'd weight these
sides at something like a 1000:1 ratio because of this.

I don't think a video that I know and everyone else knows is fake is very
harmful. And it's clear that many people get joy out of it. I think it should
stay (though it's here to stay regardless of the law).

~~~
chug
I'm not quite sure you can just hand wave away the harmfulness. It's very
possible that a lot of people would feel highly violated by it. It basically
takes away your control over your modesty and to an extent even the autonomy
of your body. Yes, the videos are fake, but that doesn't change that the
entire premise is for people to imagine it's real.

I'm much less decided about the legality side of this, but I think that from
the moral side, this is pretty hard to defend. Obviously this is taking things
a bit to the extreme, but where exactly do you draw the line from a
utilitarian perspective? Is a gang rape morally permissible if the
perpetrators enjoy it enough to outweigh the victim's suffering? Why should
people ever be (morally) allowed to derive their pleasure from another's
suffering?

------
Myrmornis
I can’t really see much difference between the original and the faked version.
It would be better if it swapped the hair too. (Yes, I do have problems
recognizing faces in real life.)

~~~
pixl97
Hair will be harder, in the sense that it requires a lot more CPU power.
Making realistic hair in video games has been a challenge.

------
eagsalazar2
I keep wondering about how this approach could be composed, vs. just doing a
single replacement. Example: replace person A's head in this video with person
B's (the current examples), make the composed person more tan, swap the
composed person's hair with person C's hair, add muscles, change the shirt to
this other shirt, etc. It really seems like the same type of operation applied
repeatedly. Could be powerful character-editor-type functionality.

------
trampelpfad999
As we all (at least in Europe) remember, knowing the truth is not so easy:
[https://www.theguardian.com/world/2015/mar/19/i-faked-the-yanis-varoufakis-middle-finger-video-says-german-tv-presenter](https://www.theguardian.com/world/2015/mar/19/i-faked-the-yanis-varoufakis-middle-finger-video-says-german-tv-presenter)
This fake Varoufakis video made pretty huge waves.

------
chrischen
Is there an implementation of a generalized face swap that doesn't require
hundreds of images of both the target and subject?

~~~
bpicolo
Keep in mind that one video counts as hundreds of images

------
jacquesm
Next up, put yourself and/or family members into any movie. That's really only
a matter of time.

~~~
petercooper
When I was a kid in the mid 80s, I got a racing game (Revs) that let you enter
names in for the CPU racers. Looking back it seems ridiculous but I remember
it being such a thrill to pretend I was racing against "people" I knew. Even
_that_ was exciting to me, so what's happening now is absolutely
mind-boggling!

~~~
0xdeadbeefbabe
The names example calls for more imagination though. Yeah it's mind boggling,
but I wonder if the effect will be about the same or that much more.

~~~
petercooper
So maybe the "uncanny valley" could even ruin the effect? Some of the samples
so far are really good though, so maybe we'll skip that stage.

------
peterchon
I see a new business: selling face data.

~~~
jgautsch
I wonder if there are any companies aggregating some sort of book of faces.

~~~
zkomp
eeerm... Yeah, like Facebook (it's the actual name)

(best to just spell out the obvious)

------
arpgy
Within a few years there will be apps that take movies of two different people
and let you just tell them which parts to swap. People will be able to easily
make their own videos like this. Banning the technology is a temporary and
questionable fix at best.

------
vonnik
When I see this I think of the future of human-robot interfaces. A robot with
a screen for a head will be able to express anything we do, and tailor its
responses to the environment. That, in turn, will probably make robots much
more acceptable in society.

~~~
pbhjpbhj
Behold the helper robot whose face was anonymously hacked to be goatse and
whose owner can't afford to get it fixed.

Buying groceries in your local supermarket ...

------
mrfusion
Does this mean we can make terminator movies with Arnold for hundreds of
years?

~~~
pbhjpbhj
I think it means you'll be able to replace all actors in all films with Arnold
if you like.

Films may come with default actors and the viewer will be able to choose
alternate body types or faces.

It would be interesting to see how that affects Hollywood celebrity. Will
celebrities still do the movies themselves, or just use stand-ins and lens
their visage onto the final cut?

At what point will the star of live action movies be a computer generated
skin?

------
peterchon
[https://www.youtube.com/watch?v=ohmajJTcpNk](https://www.youtube.com/watch?v=ohmajJTcpNk)

------
adonnjohn
Is it possible to utilize deepfake tech to crack iPhone Face Id?

~~~
hmate9
The two are unrelated. Face ID looks at a 3D face, not a 2D video.

------
Pica_soO
There will be a Hollywood version of choose-your-own-story books, of course.

