
DeepFaceLab: A tool that utilizes ML to replace faces in videos - wawhal
https://github.com/iperov/DeepFaceLab
======
localhost
I wonder what this implies for the future of media (audio, photos, video) in
general? Do we only consume media that has been signed by the creator and
verified by an authority that we trust, e.g., a "blue checkmark" for media? Do
we know how effective the SSL certificate verification has been in browsers at
influencing consumer behavior?

~~~
RandallBrown
In Neal Stephenson's latest book, that seems to be how he gets around the fake
news problem.

The near-future world signs everything with a personal identifier so they can
prove that what people are seeing is genuine.

I imagine these identifiers could be extended to things like security cameras
too, so that there could be some verification that the video footage hasn't
been doctored.

All of this of course relies on the general public getting the knowledge and
tools to seamlessly do this verification on a daily basis. In the book it's
mostly taken care of using Google Glass-style wearables.

~~~
slg
The primary problem with fake news is not that people are fabricating
evidence. Sure, that does happen on occasion but it is relatively rare and is
usually debunked pretty quickly. The biggest issue is that there are sources
that can't be trusted and consumers don't have the skills or tools to identify
those untrustworthy sources or simply don't care that their sources might be
biased. Fixing the first problem (which is all some type of signing
certificate could hope to do) doesn't accomplish much if we can't also
address the second problem.

~~~
air7
Imo the problem with fake news is that by the time it's debunked (if it ever
is), its effect has already happened and can't be undone.

~~~
slg
That is really just a symptom of the second problem. Fake news can spread
pretty far before it is debunked, but the fake news has to start somewhere.
Maybe people get it from some untrustworthy news site, social media, or
directly from the mouth of a habitually lying politician, but it almost never
originates from a legitimate news source that has the skills, tools, and
motivation to better assess the authenticity of the news. If people put less
faith in those untrustworthy sources and waited for actual journalism to be
done before reacting, fake news would be less of a problem.

~~~
belorn
I can honestly say I do not know of a legitimate news site. For every topic
that I have more than surface knowledge of, the articles are, almost without
exception, biased and limited to a single perspective that aligns with
cultural expectations.

The exceptions are local news about local events that have no political angle,
and occasionally investigative journalism.

The most common method that legitimate news sources use to bias news is by
omission, and second by using a misleading context. Neither is strictly a lie,
but the result is as much fake news as something fabricated.

------
puranjay
Video evidence is going to be inadmissible very soon.

It's ironic that we're essentially being pushed back to a pre-technology, pre-
media time. "If you didn't see it with your own two eyes, you can't believe
it"

~~~
vectorEQ
Videos have been alterable in this way for a long, long time now, just not by
neural networks. So in reality, it changes nothing.

~~~
simias
It does change a whole lot. Up until a few years ago you'd only have to be
suspicious of very important videos coming from people who have access to
convincing video editing. Some video coming from the US, Chinese, or Russian
government trying to disprove war crimes? Yeah, better be careful. Some guy
using dashcam footage to prove the other driver was in the wrong? Not so much.

As the technology improves and becomes accessible soon anybody will be able to
edit videos convincingly without having to invest a massive amount of time or
money. That's going to have a large impact I think.

~~~
koonsolo
Look at the old videos of Bigfoot and UFOs. A lot of them have been debated
for years over whether or not they were fake.

So in that sense it could still have been done pretty cheaply.

~~~
simias
Nobody knows exactly what a UFO or Bigfoot really looks like, so that helps.
The debate is not so much about whether the footage has been doctored but
rather about what it depicts.

A better example would probably be the moon landing footage or 9/11. And then
it's really only questioned by conspiracy theorists, and specifically _because_
a state actor might have wanted to hide the evidence of a conspiracy.

Tomorrow with this new technology I might not convince you that Bigfoot exists
but I might be able to show you extremely convincing footage of Margaret
Thatcher and Mao Zedong frolicking in the Swiss Alps while Tupac is watching.

~~~
koonsolo
It's true that things can be more subtle now, like a politician making a
racist remark or something like that.

------
costcopizza
I think the cons are going to heavily outweigh the pros with this tech.

Fake news on high octane race gas.

~~~
lawlessone
Beyond trivial amusement, what are the pros?

~~~
cactus2093
I don't know that it makes sense to isolate just the narrow concept of deep
fake creation. It seems like fundamentally a lot (most?) of the breakthroughs
that make creation of deep fakes possible are the same ideas that make
possible the current state of the art for classification, decision problems,
advanced NLP, and other things we call ML or AI.

So to do this pro/con analysis you probably need to include the pros of these
related technologies as well, which are certainly more than trivial amusement.

~~~
klyrs
> So to do this pro/con analysis you probably need to include the pros of
> these related technologies as well, which are certainly more than trivial
> amusement.

Like the time my uncle (by marriage) sent a JibJab to the whole family
featuring 4 recently-dead family members... real big pro, I'm sure... /s

------
luiscosio
There is also:

[https://github.com/deepfakes/faceswap](https://github.com/deepfakes/faceswap)

With a big development community and interesting results.

------
s_Hogg
What are the non-malicious uses for something like this? Haven't actually got
around to pondering that yet.

~~~
DMcVeigh
An interesting use could be to anonymise people in public videos. You could
use randomly generated faces (which other emerging tech can produce) and
effectively remove people's recognizable features by "replacing" them.

~~~
chris5745
Reminds me of the anonymizing suits in “A Scanner Darkly”.

~~~
drusepth
Actually, you could probably get exactly that effect if you modify the script
to pull from a pool of source faces per frame instead of a single face.
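
The per-frame pooling idea can be sketched roughly like this (pure Python, with
a hypothetical pool of pre-generated face IDs; the actual DeepFaceLab scripts
would need deeper changes than this):

```python
import random

def assign_faces_per_frame(frame_count, face_pool, seed=None):
    """Draw one source face ID per frame from a pool, so no single
    identity persists across the whole clip."""
    rng = random.Random(seed)
    return [rng.choice(face_pool) for _ in range(frame_count)]

# Hypothetical IDs for pre-generated synthetic faces.
pool = ["face_a", "face_b", "face_c"]
assignments = assign_faces_per_frame(10, pool, seed=42)
```

Swapping in a fresh random face every frame would flicker badly, so in
practice you'd likely hold one random face per shot, or smooth the
assignments temporally.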

------
Oras
Is it the same technology used by Zao? The example in the gallery does not
look as good as the video here
[https://twitter.com/AllanXia/status/1168049059413643265](https://twitter.com/AllanXia/status/1168049059413643265)

~~~
throwawaywego
I think this tech is too complex to run on mobile devices in 8 seconds for
video transfer from one selfie.

What I think Zao does is preprocess the videos (manually or with highly
accurate facepoint detection). They pre-calculate the transforms (standard
face-morphing algorithms with opacity/alpha tweaks) and the shading depending
on the scene lighting. Then they just need a good frontal selfie, or do some
frontalization and keypoint detection, and the rest can be rendered/computed
without many resources, following a pre-defined script.

If more advanced than face morphing, then perhaps something more like:
[https://github.com/facebookresearch/supervision-by-registration/raw/master/cache_data/cache/demo.gif](https://github.com/facebookresearch/supervision-by-registration/raw/master/cache_data/cache/demo.gif)
(pre-fitting a 3D face mask, then texturing it with your selfie)

~~~
JoblessWonder
Yeah, I think the big question is if Zao only allows pre-selected "scenes"
that they have already done the processing on or if they allow you to upload
_any_ video.

From the results, I think you are exactly right in how they are accomplishing
those videos.

------
Animats
Hollywood has been doing this for a while, for stunt performers. Sometimes
well, sometimes badly. We'll be seeing more of that as it works better.

As for the political implications, go watch "Wag the Dog" again.

------
jonnismash
This technology has been posted on HN before and the comments were all too
similar to these. I understand why a lot of people are falling into the
FUD-hole, but we can pretty easily solve the issues mentioned in the comments
by inserting a digital key, similar to an SSL certificate, into an image/video
upon initial upload. Or even upon creation, maybe using some sort of ledger to
verify integrity. I'm not too worried.
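
For illustration only, the sign-then-verify shape could look like the sketch
below. It uses an HMAC from the Python stdlib for brevity, which means the
verifier would need the secret key; a real scheme would use asymmetric
signatures (e.g. Ed25519) so anyone can verify with a public key:

```python
import hashlib
import hmac

def sign_media(data: bytes, key: bytes) -> str:
    """Tag the media's SHA-256 digest with the creator's key."""
    return hmac.new(key, hashlib.sha256(data).digest(), hashlib.sha256).hexdigest()

def verify_media(data: bytes, key: bytes, tag: str) -> bool:
    """Constant-time check; any edit to the bytes invalidates the tag."""
    return hmac.compare_digest(sign_media(data, key), tag)

key = b"creator-secret-key"        # hypothetical key material
video = b"...raw video bytes..."   # stand-in for real file contents
tag = sign_media(video, key)
```

Note this only binds the bytes to the key holder at the moment of signing; it
says nothing about whether the footage was genuine before it was signed.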

~~~
michaelmior
What does that verify? Assuming the key hasn't been compromised, it verifies
that the holder of the key uploaded a file at a particular time. It still
doesn't show that the video itself is genuine. Although it does seem to let
us fall back on the trustworthiness of the source, which is better than
nothing.

------
spaniard_dev
Am I the only one that cries when seeing non PEP8 compliant Python code? Why
do data scientists mistreat Python that way?

~~~
topranks
You are the only one.

------
lorepieri
On a (somewhat) related note, I'm working on face recognition with homomorphic
encryption, therefore without compromising the user's privacy. The bold goal is
the first privacy-preserving video camera. If you find this interesting, I
would love to chat about it.

~~~
noteness
I'd love to hear more about it & see the code.

------
gallerdude
I'm surprised no one's bringing up the precedent of photoshop. For years we've
culturally realized that pictures may not tell the whole story, and this is
the exact same, down to results that can't quite escape the uncanny valley.

------
jonplackett
On a scale of 1 to 10, how much easier does this make Deepfakery?

------
InfinityByTen
Deepfake revenge porn wasn't enough to give this a second thought?

~~~
kevingadd
Researchers and prototyping engineers generally don't seem concerned with
ethics or consequences these days. Hard to be sure whether it's due to a
pressure to publish or interest in chasing fame/revenue but there's lots of
stuff like this being published lately. Even if some researchers opted not to
chase this down there are always more people looking to push the boundaries.

The natural conclusion goes past revenge porn to revenge videos of beheadings,
murders, etc that can be sent to friends or family members who aren't
technically savvy enough to recognize fakes for what they are. Not to mention
the existing phenomenon of blaming random civilians for murders and terror
attacks - this will be far more convincing when it's paired with convincing
fakes. It'll probably claim some lives.

~~~
nefitty
Is innovation supposed to pause for society to catch up every time something
drastic is created or discovered?

~~~
InfinityByTen
Nuclear tech required a rethink. So maybe yes?

~~~
nefitty
Fair!

