
Here Come the Fake Videos, Too - ezhil
https://www.nytimes.com/2018/03/04/technology/fake-videos-deepfakes.html?
======
tbabb
The era of verifiability-- when photographic and video evidence could
plausibly be trusted-- is over.

Photos have been dubious since Photoshop, but faking them convincingly still
required expertise and artistry. Faking moving pictures required considerably
more expertise, lots of time, and usually specialized equipment.

None of that is true anymore. We are back to a pre-industrial age of
discourse, where rumor and hearsay and anecdote dominate again. Buckle up.

~~~
sdrothrock
> The era of verifiability-- when photographic and video evidence could
> plausibly be trusted-- is over.

If ease of producing convincing fakes of X means that the era of X as
convincing proof is over, then why are signatures and written documents still
taken as proof?

I think it just means that things will be scrutinized more carefully from now
on, and that there will need to be a stronger document trail.

> We are back to a pre-industrial age of discourse, where rumor and hearsay
> and anecdote dominate again.

And I definitely don't think it's as grim as this.

~~~
ry_ry
Anecdotally,

Whenever I receive a signed-for delivery, I'm asked to provide my name and
attempt to sign on the proffered touchscreen device - invariably scrawling a
shape that in no way resembles my actual signature.

I recently accepted a new role, and as part of the on-boarding digitally
signed my contract by typing my name into a text box marked "Digital
Signature" - I'm kinda assuming it's just a regular input box with added
legalese.

Signatures are absolutely no longer proof of anything, simply a token gesture
with perhaps a dash of security-by-fud tacked on.

~~~
IntronExon
I think a signature is less about proof of identity than proof of consent.

------
justonepost
I've always felt that deepfakes are a net positive. Think of all the revenge
videos that will be rendered moot. The Jennifer Lawrences of the world can
stop stressing because nobody will take any of it seriously anymore.

It's kind of like those pornographic clones of popular kids cartoons. People
freaked out at first, but really you don't hear much about it these days.

Sort of like an anti-virus inoculation when you think about it.

~~~
toomanybeersies
I think you're looking at it the wrong way. The issue of leaks and revenge
porn isn't the nudity. It's the breach of privacy and trust.

Trying to claim that leaked photos are fake doesn't change the fact that they
aren't. It doesn't make you feel any better knowing that there's intimate
photos of you out there, whether leaked by a jealous ex or stolen by a hacker.

A lot of these people that have had intimate photos or videos of them leaked
have actually posed naked for publications, and they've almost certainly been
naked around strangers as part of their job. It's not the nudity that's the
issue here.

This is especially true of revenge porn. I don't really care if someone sees
pictures of my dick, plenty of people have seen it. I'd be more upset that
someone I trusted broke that trust in just about the worst possible way.

------
runeks
Anyone else rather unimpressed by the realism of these fake videos? I mean,
technologically it’s cool and all, but it seems more like a proof of concept
than anything that can be used to fool humans.

All the videos I’ve watched — out of technical curiosity, naturally — have had
some sort of glitch that made it obvious it was fake. I think this technology
will have serious problems just mapping the facial expressions of one person
onto another, since many people have their own distinct facial expressions.

The linked YouTube channel with the Putin video[1] is a good example: it looks
completely unrealistic because the actor in the source video makes facial
expressions Putin would never make.

In my opinion, it will take decades before this technology
becomes good enough to fool humans, and probably longer before it can fool
humans closely related to the subjects of the fake videos — if this ever
becomes possible at all. The fundamental challenge is mapping the emotions of
one person to another’s, which isn’t easily solvable. Just mapping the facial
features of Putin onto SNL’s Beck Bennett isn’t going to convince anyone
familiar with how Putin looks and acts.

[1]
[https://m.youtube.com/watch?v=hKxFqxCaQcM](https://m.youtube.com/watch?v=hKxFqxCaQcM)

~~~
jcims
Eh - [https://vimeo.com/257360045](https://vimeo.com/257360045)

Definitely not perfect, but this is hobbyist grade work. With the recent work
around parallelized WaveNet synthesizing 10 seconds of audio for every second
of wall clock time, a live fake that fools 50% of regular people is probably a
couple of years away at most. Particularly if you can control the setting to
ensure lighting/angles/etc match up reasonably well.

~~~
dawnerd
Wouldn't have even thought that looked like him, still have trouble seeing it.

~~~
saurik
I am starting to wonder if this concept simply works better for some viewers
than others. I seem to always very deeply notice that the shape of the head is
always sort of wrong... I'm wondering if I use "shape of head and typical
hairstyle" as a major way I recognize people. Maybe other people are really
focussing on facial features like "lips eyes and nose"? Honestly, what I keep
thinking when I see stuff like this is "this just looks like Paul Rudd wearing
a lot of makeup" not "this looks like Jimmy Fallon with a narrower head and a
different hairstyle".

------
iopuy
Since reddit banned deepfakes, including SFW content, does anyone know where
the community is now congregating? The appeal of the technology from online
avatars to cheaper cgi is undeniable. Is this a case of throwing out the baby
with the bathwater?

~~~
Choco31415
There is one mention here:

“Following the ban, many of the users moved to the /deepfakes/ board on
8chan[18] and the /v/DeepFakes community on Voat.[17]” [0]

[0] -
[http://knowyourmeme.com/memes/cultures/deepfakes?full=1](http://knowyourmeme.com/memes/cultures/deepfakes?full=1)

~~~
purple-again
This is so frustrating and depressing. I come from the hobby indie movie
production side of things and something like this would be so amazing. I came
here all excited to see where I can dip into the technology and this is what I
find...two asshole corners of the internet I would never frequent.

Can you imagine Reddit banning Photoshop and all other photo editing
technology because a bunch of guys used it to post boobies on photos of women?

I honestly can't believe this even happened.

~~~
s9w
Chans and voat can be very interesting. Don't label them asshole corners
because the 'good guys' told you so. They are orders of magnitude less
censored than HN/reddit/twitter and offer a far wider range of opinions and
information.

------
skc
I live in and work in a small African country with very corrupt leadership.
This stuff disturbs me ever so deeply. I can easily see a future where
despotic governments use this technology to wipe out their detractors.

------
Choco31415
10+ years ago we started getting the technologies for facial landmark
detection, and now we have face swapping. Currently we have the beginnings
of good full-body pose detection, and I imagine we'll eventually have
full-body swapping (maybe along with clothes).

That raises interesting questions. As legitimate looking sources become harder
to trust, what other ways can we verify them? One idea that was floating
around is key-signing each datafile. That raises the question, though, of how
to manage keys. [Maybe tie each key to a digital ID, similar to Estonian
e-residency?]

At low levels of risk, like a recorded automobile accident, is such scrutiny
useful?

Thoughts everyone?

~~~
andreascmj
Couldn't blockchain be a great tool here? The original hash is uploaded to the
blockchain at the time of the recording, then you can verify that nothing has
changed since then.
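The core of the scheme is just content hashing; the blockchain only supplies a tamper-evident timestamp for the hash. A minimal sketch in Python, with the on-chain publishing step assumed rather than shown (the payloads are made-up placeholders):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256 digest of the raw footage; this is the value that would be
    # published (e.g. anchored on a blockchain) at recording time.
    return hashlib.sha256(data).hexdigest()

original = b"raw camera footage bytes"
published = fingerprint(original)  # imagine this hash is now on-chain

# Later, anyone can check a copy against the published fingerprint.
assert fingerprint(b"raw camera footage bytes") == published
# Any edit, however small, changes the digest.
assert fingerprint(b"raw camera footage bytez") != published
```

This proves the bytes existed by the time the hash was published, but it says nothing about whether those bytes were a genuine recording in the first place.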

~~~
telesilla
There is already tech in filming to know what camera filmed what footage. I
imagine there will be some kind of process happening soon that links footage
through its changes, all the way from capture to broadcast.

------
robgurley
I think the effects of "fake" journalism would be mitigated somewhat if we
didn't have laws against slander, libel, and false advertising.

The media-consuming public in the United States still believes that "if it is
in print, it must be true" - they haven't been inoculated against falsehood
like they would have been otherwise. Presumably, if there were no expectations
of truth in print/media to be enforced by some magical (and actually sort-of
nonexistent) federal authority, media outside the "trusted" sources would be
automatically suspect unless reviewed by some other trusted third party.

------
thesehands
There have been more of these fake video stories recently. Without wanting to
get bogged down in politics, I have wondered if these stories are being ramped
up to provide some plausible defence to possible 'tapes' mentioned in the
Steele dossier? Not necessarily a legal defence, but enough to cast some
doubt on their legitimacy in the media.

------
EGreg
What about just having videos be watermarked and signed by their authors as
well as the equipment used?

Then you could trust their authenticity, no?

~~~
yorwba
You could trust the authenticity of the fact that someone used the author's
key to sign it at some point in time.

You could not trust any claims the author makes about the circumstances under
which the signed object was produced. They could have put their signature on a
deepfake. They could have put their signature on the work of someone else. The
author could have lost their signing key. It could have been created much
earlier than it was signed.

A signature tells you very little about the thing being signed besides the
fact that it was signed.
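The point can be made concrete in a few lines of Python. This uses an HMAC as a simple stand-in for a real asymmetric signature scheme; the key and payloads are made-up placeholders:

```python
import hashlib
import hmac

author_key = b"author-signing-key"  # hypothetical signing key

def sign(data: bytes) -> bytes:
    # HMAC-SHA256 standing in for the author's signature.
    return hmac.new(author_key, data, hashlib.sha256).digest()

def verify(data: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(data), signature)

# Nothing stops the key holder from signing fabricated footage:
deepfake = b"entirely fabricated footage"
signature = sign(deepfake)
assert verify(deepfake, signature)  # the signature checks out anyway
```

Verification only tells you that the key holder signed these exact bytes; it cannot tell you how the bytes came to exist.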

~~~
gpvos
I never thought I would say that you could use a blockchain to solve
something, but here I go: you can solve the timing problem with a blockchain.

Alternatively, you can go old school and publish a fingerprint in the New York
Times or something.

The other problems remain, of course.

------
justifier
Ghost in the Shell addressed this issue in 2005(o)

The Tachikoma units are debating how to stop a nuclear strike and one suggests
that broadcasting a live feed of the nuclear sub would help, but the idea is
reasoned against due to the technological capabilities to fake such a feed:

"Pictures don't prove anything anymore. It would just end up as a source of
amusement for the uninvolved masses, an image from an unknown source that
showed up at an all too convenient time."

Beyond this inflection point, one must now trust both the content and the
source.

(o) [https://youtu.be/yAoj3AskFMI](https://youtu.be/yAoj3AskFMI)

------
zabana
Fake videos are nothing new. For those interested, you can look into how the
BBC was (and still is) using footage from the first war in Afghanistan to
illustrate their coverage of the second one. Or how CNN used footage of an
Indian porn movie to accuse Pakistani soldiers of rape. I'm sure there are
many more examples of such misuse of videos...

~~~
rasz
Doesn't even have to be real video; game engines pass for the real thing:

[https://www.pcgamer.com/itv-documentary-that-featured-arma-2...](https://www.pcgamer.com/itv-documentary-that-featured-arma-2-footage-declared-materially-misleading-by-regulator/)

[http://www.eurogamer.net/articles/2018-02-26-russian-tv-stat...](http://www.eurogamer.net/articles/2018-02-26-russian-tv-station-mistakenly-airs-footage-of-arma-3-during-report-on-war-in-syria)

------
NickGerleman
Radiolab did a story related to this a while ago called "The Future of Fake
News". It's worth a listen.
[https://futureoffakenews.com](https://futureoffakenews.com)

------
foxhedgehog
The Kodak blockchain announcement might seem less ridiculous in light of this
kind of thing.

------
fooker
Why are we not using digital signatures for everything?

------
asdfaf13123
This is deeply frightening. I could easily imagine this being used as a
political tool to incite hate and racism online. I fear it will be used as
"proof" that an event happened.

------
anoplus
We will probably need AI based fake news detection.

~~~
EGreg
That won't work. It's like saying to beat AlphaGo we just need AlphaGo.

It would already be factored in. The arms race would just make the videos MORE
indistinguishable from the real thing.

------
HeyWolfey
Step 1.) Work in Silicon Valley

Step 2.) Desire an advantage over your peers and competition

Step 3.) Notice political hysteria and how it affects the behavior of your
rivals.

Step 4.) Deepfake a rival's face into a Nazi rally and send it to his
professional network.

Step 5.) Rinse and repeat for every rival you encounter. It's not like denying
being a Nazi supporter or crying foul play ever assuages paranoid suspicion.

Congratulations, you can now destroy the career of any worker in Silicon
Valley with impunity.

~~~
sctb
You've been posting a lot of trollish, unsubstantive comments. Could you
please take a look at the guidelines and start commenting civilly and
substantively?

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

