
Ctrl Shift Face: A growing ‘deepfakes’ YouTube channel - svenfaw
https://www.theregister.co.uk/2019/05/28/youtube_deepfakes_channel/
======
_red
I would imagine that in the future video encoders will start to automatically
include a cryptographic signature every X frames, which then players will
become aware of and show an indicator for "original / modified" content.
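A minimal sketch of that scheme in Python, using the stdlib `hmac` module as a stand-in for the asymmetric signature a real encoder would embed in container metadata (the 30-frame interval and the key here are made up for illustration):

```python
import hmac
import hashlib

SIGN_EVERY = 30          # hypothetical: one signature per 30 frames
KEY = b"camera-secret"   # stand-in for a per-device private signing key

def sign_stream(frames):
    """Yield (chunk_of_frames, tag) pairs, one tag per SIGN_EVERY frames."""
    for start in range(0, len(frames), SIGN_EVERY):
        chunk = frames[start:start + SIGN_EVERY]
        digest = hashlib.sha256(b"".join(chunk)).digest()
        tag = hmac.new(KEY, digest, hashlib.sha256).hexdigest()
        yield chunk, tag

def verify_stream(signed_chunks):
    """A player would flip its 'original/modified' indicator on the first bad tag."""
    for chunk, tag in signed_chunks:
        digest = hashlib.sha256(b"".join(chunk)).digest()
        expected = hmac.new(KEY, digest, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(tag, expected):
            return False
    return True

frames = [bytes([i]) * 16 for i in range(90)]  # toy stand-in for raw frames
signed = list(sign_stream(frames))
print(verify_stream(signed))   # True for untouched footage
signed[1] = ([b"tampered"] + signed[1][0][1:], signed[1][1])
print(verify_stream(signed))   # False once any frame is altered
```

The per-chunk granularity is what lets a player localize tampering to a span of frames instead of rejecting the whole file.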

~~~
buildzr
It'd be too easy to fake, just replace the sensor in an approved camera with
your own data feed.

And that's assuming it wouldn't be somewhat trivial to dump keys from one or
more cameras on the market to create an easy tool like resignvideo.com...
which is a massive assumption.

As the movie industry has already seen, cryptography doesn't help you when you
push out the keys to everyone. It'd be about as effective as DRM is at
stopping video piracy.

~~~
gambler
You're not thinking this through.

The problem isn't that creators can make fake videos (which would be analogous
to DRM). Let them. There is no harm in it. The problem is that users have no
way of verifying true videos. This _can_ be solved with cryptography, because
both parties involved (creators and viewers) would want this to work.

In other words, DRM is about forcing _everyone_ to do a thing. Verification is
about allowing _willing participants_ to check a thing. Different scenarios.

Companies could make tamper-proof sensor modules that do the signing right
where it matters. The important thing is to make sure they aren't
astronomically priced and don't compromise privacy.

~~~
jdietrich
I'm sorry, but you're not thinking this through. Being able to prove that a
video file is the unaltered output of a particular camera is in no way a
useful protection against fake videos. Proving that file A is genuine tells us
nothing about whether file B is fake.

Let's say I want to create and circulate a fake sex tape featuring a senior
politician. In a world where signed sensors are common but not mandatory, the
absence of a signature says nothing at all about the integrity of my video -
there are still millions of cameras in the world that produce unsigned files.
In a world where they are mandatory, I can simply fake the video in software
and then re-record it using a high-resolution display and a signed sensor. A
digital video expert might be able to prove that such a re-recording process
took place, but I highly doubt that they'd be able to say so with any degree
of confidence if there's a bunch of sensor noise and a couple of rounds of
low-bitrate transcoding.

Signed sensors won't do anything to undermine the credibility of fake videos,
but they may have the opposite effect if they're hacked - I can "prove" that
my deepfake is genuine, because I've extracted a signing key or injected a
bitstream into a camera module.

[https://www.forbes.com/sites/kalevleetaru/2018/09/09/why-digital-signatures-wont-prevent-deep-fakes-but-will-help-repressive-governments/](https://www.forbes.com/sites/kalevleetaru/2018/09/09/why-digital-signatures-wont-prevent-deep-fakes-but-will-help-repressive-governments/)

~~~
deckar01
You don't need to be able to prove the video is raw footage, you just need to
be able to verify that a publisher you trust says they recorded it. An HMAC
allows you to verify that a message (like a video file) was not modified after
the author signed it.
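Worth noting that an HMAC needs a shared secret key, so for one-publisher-to-many-viewers verification you'd really want a public-key signature; the tamper-detection mechanics are the same, though. A stdlib-only sketch (the key and the bytes are made up):

```python
import hmac
import hashlib

def tag_file(key: bytes, data: bytes) -> str:
    """Publisher computes a tag over the exact bytes they released."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def check_file(key: bytes, data: bytes, tag: str) -> bool:
    """Viewer recomputes the tag; any modified byte changes it completely."""
    return hmac.compare_digest(tag_file(key, data), tag)

key = b"shared-secret"   # made up; a real publisher would hold a private signing key
video = b"\x00\x01raw video bytes"
tag = tag_file(key, video)
print(check_file(key, video, tag))         # True: untouched file verifies
print(check_file(key, video + b"!", tag))  # False: one extra byte breaks it
```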

~~~
jdietrich
Hypothetical: An anonymous source releases CCTV footage that appears to show a
public figure urinating on a homeless person. Do you really believe that we'll
all ignore that footage because the signature doesn't belong to a major news
organisation?

~~~
bilbo0s
Yes. We will.

Because the future us will be conditioned to do so by our cryptographically
integrated devices. There'll be so much fake video out there that we'll get
accustomed to seeing it on our devices with the big huge "fake" sign flashing
on it.

(Or, again, even worse, it will not have the "real" sign flashing on it that
the video from validated publishers and cameras will have.)

We can be conditioned to do this extremely quickly. ActiveX is what made all
the plugin security warnings start popping up everywhere, and after a while
even plugins like Flash and Unity fell to the reality that we'd been
conditioned to click "Do Not Run Plugin" whenever we saw that warning. Are we
better off without Flash and Unity? Probably, but that's beside the point. The
point is that the masses can be conditioned to implement their own control
when you set up a system where only data meeting a certain standard is
accepted. And here it's a hardware standard. Consider very carefully the
position of power over us that such a system would put hardware makers, and
those with power over hardware makers, in.

Now again, I'm not arguing that fakes are not a problem. They are indeed a
problem. Just saying we should go about solving the problem with extreme care
and caution.

~~~
jdietrich
So you're arguing that at some point in the future, nobody will give even the
slightest credence to amateur footage captured on camera phones, small
business CCTV systems and a million other sorts of non-corporately-owned
camera? Because I think that future is worse than the one that's full of fake
video.

~~~
xienze
> So you're arguing that at some point in the future, nobody will give even
> the slightest credence to amateur footage captured on camera phones

I think that's the end goal of all the "deep fake" hysteria we're being primed
with. In the future if some damning video of a politician or other elite
member of society pops up the defense will simply be "well you can't rule out
that it's a fake, look, it's not signed and it's not in our database of
trusted videos!"

~~~
pessimizer
Or the end goal of the "fake news" scare in general (not to say that there
aren't sites that specialize in making up news.) To combat "fake news," you
start whitelisting "true news," and defining it by the influence of the
provider, not the content. In the end, you just end up with state news
sources.

related: It's telling that "factchecking" isn't a dialogue between various
authorities, but instead just a bunch of weird private judicial systems with
no appeals courts, selected and assigned to specific stories by internet
aggregation intermediaries. Eventually, this can't help but shake out as
stories being divided into two categories: _factual_ and _banned from
distribution._

I sadly feel that we're about 10 years away from licensing journalists with a
security clearance-type process that must be sponsored by an approved outlet.
The PropOrNot at the Post shook me a bit.

~~~
fuzz4lyfe
The 1st Amendment won't allow that, but then again people seem to want to
throw out the 2nd without a constitutional convention, so it's not impossible.
I can see the arguments now: "It's the freedom of the press, but are you the
press? Then why do you own a fully automatic printing press?"

------
knowuh
Someone has to put a deepfake Zuckerberg video up on Facebook where he
forecasts a terrible financial quarter.

Maybe that would have some influence on how seriously they take takedown
requests.

~~~
mises
To anyone who reads this: don't actually do that. I wouldn't put it past
Facebook to file a serious lawsuit against you. If it spread far enough, you
could cause serious damage, and the SEC might even get involved.

~~~
cr0sh
If I were going to do this, and I thought it would cause such a storm, I'd
take a raspi wardriving to find an open router somewhere, then upload the
video via Tor to LiveLeak or imgur or something like that, using an old
netbook sourced from Craigslist. Then wipe the devices used and sell 'em back
on Craigslist, or drop 'em off at Goodwill or some recycling company.
Plausible deniability and all that.

~~~
mises
Someone's been watching too much Mr. Robot... For what it's worth, don't
bother with imgur over Tor; they block uploads from those IPs. You can,
however, use the lunapic upload-to-imgur feature. A public library might also
be an option, though I might recommend i2p. Whether it's really "more secure",
I don't know, but it doesn't have the same mainstream knowledge (or use by
journalists), so the three-letter guys would be slower to break it.

------
NullPrefix
Could we get the link changed to the actual YouTube channel?
[https://www.youtube.com/channel/UCKpH0CKltc73e4wh0_pgL3g](https://www.youtube.com/channel/UCKpH0CKltc73e4wh0_pgL3g)

------
moftz
How hard is it for news sites to include links to websites other than their
own? Like, thanks for at least calling out the name of the channel, but maybe
make the name a hyperlink?

~~~
mromanuk
Agreed that they should link to the channel explicitly, but if you click on
one of those videos, it takes you to the channel.

------
ice_nine
Articles like these make me eager for advances made in the 'authenticity'
market. I recently learned of a company called Truepic[0] that specializes in
photo and video verification.

From the New Yorker article I read:

 _" Truepic, a startup in San Diego, aims at producing a new kind of
photograph—a verifiable digital original. Photographs taken with its
smartphone app are uploaded to its servers, where they enter a kind of
cryptographic lockbox. “We make sure the image hasn’t been manipulated in
transit,” Jeffrey McGregor, the company’s C.E.O., explained. “We look at
geolocation data, at the nearby cell towers, at the barometric-pressure sensor
on the phone, and verify that everything matches. We run the photo through a
bunch of computer-vision tests.” If the image passes muster, it’s entered into
the Bitcoin and Ethereum blockchain. From then on, it can be shared on a
special Web page that verifies its authenticity. Today, Truepic’s biggest
clients are insurance companies, which allow policyholders to take verified
photographs of their flooded basements or broken windshields. The software has
also been used by N.G.O.s to document human-rights violations, and by workers
at a construction company in Kazakhstan, who take “verified selfies” as a
means of clocking in and out. “Our goal is to expand into industries where
there’s a ‘trust gap,’ ” McGregor said: property rentals, online dating.
Eventually, he hopes to integrate his software into camera components, so that
“verification can begin the moment photons enter the lens._” [1]

[0] [https://truepic.com/](https://truepic.com/)

[1] [https://www.newyorker.com/magazine/2018/11/12/in-the-age-of-ai-is-seeing-still-believing](https://www.newyorker.com/magazine/2018/11/12/in-the-age-of-ai-is-seeing-still-believing)

~~~
saagarjha
How does Truepic work? What's stopping me from spoofing sensor data on my
client?

~~~
frosted-flakes
Probably nothing. But it solves 99% of the problems it's meant to solve.
Casual insurance fraud, for example.

~~~
munk-a
Preventing casual insurance fraud and guarding against deep fakes are very
different problems, with a very stark contrast in terms of the effort and
technical aptitude available.

Occasionally a security-minded, computer-savvy person might pull one over on
the insurance company - but most clients aren't going to have the expertise or
dedication. By contrast, the pool of people who would try to produce a deep
fake, especially in the political realm, have already decided to invest a good
amount of investigation and effort, and are likely rather technically minded.

So this would stop the script-kiddie equivalent actor in the deep fake realm,
but I am more concerned about dedicated nefarious actors.

------
mromanuk
A few days ago, I was searching YouTube for a video on my TV and found Bill
Hader's impression of Arnold Schwarzenegger, which is impressive BTW, but I
didn't realize what was going on with his face. At first I thought, wow, his
physiognomy is like Arnold's, but something weird was going on. It took
several viewings until I realized that Arnold's face was superimposed on
Bill's face. Spooky.

~~~
tcmb
I had a similar experience with that same video. It was suggested to me in the
YT recommendations (I had watched Bill Hader on Late Show with Stephen Colbert
previously).

I genuinely thought "wow, when he squints his eyes he really looks like
Arnold". The transitions between Bill's own face and Arnold's were totally
seamless to me.

I finally realized what was going on when I saw the channel's name, and it
spooked me that I had fallen for it.

~~~
mromanuk
> I genuinely thought "wow, when he squints his eyes he really looks like
> Arnold". The transitions between Bill's own face and Arnold's were totally
> seamless to me.

my experience, exactly

------
redisman
One thing I don't see talked about much is the Uncanny Valley. No other
technology has made it past it and looking at these they definitely have not
either. Is there any reason to think that "deepfakes" will leapfrog it
somehow?

~~~
robotzero
I can't say that, though... had these videos been sent to me with a mention
that "Arnold lost weight" or "Brad Pitt reshot the scene after shedding some
pounds", I'd probably have been less of a sceptic, and on a quick pass this
"fake news" could have got a pass from me or others. It's an interesting time,
and if you're doing a cursory browse of the day's news bites this may fly
right past...

~~~
redisman
With the Fight Club one, I couldn't tell whether I was looking at Edward
Norton or Brad Pitt the whole time. Definitely in the valley for me, but I
guess I was looking for it.

------
inostia
Generally the best way I've found to detect deep fake video is to listen to
the audio.

When we have deepfake video _and_ deepfake audio we'll be in trouble. At least
for moving images.

~~~
wavefunction
[https://arxiv.org/pdf/1802.06006.pdf](https://arxiv.org/pdf/1802.06006.pdf)

These authors worked/work at Baidu.

[https://www.vice.com/en_us/article/3k7mgn/baidu-deep-voice-software-can-clone-anyones-voice-with-just-37-seconds-of-audio](https://www.vice.com/en_us/article/3k7mgn/baidu-deep-voice-software-can-clone-anyones-voice-with-just-37-seconds-of-audio)

------
gforst
This is a multi-layered problem with no simple answers, but many comments here
touch on solutions - note there will not be one solution. Here are a couple of
concepts I know are being worked on:

\- Secure Data at the Source -

Public Created - I have seen demos of phone apps that put video and pictures
from the phone directly on the blockchain with GPS, timestamps, etc. I have
also heard of dongles being added to phones.

Professionally Created at Device - I have read about hardware getting
processors to secure data directly on the blockchain at the device.

Created at Editing Software - This could be connected to the identity of the
original on the blockchain, and then the edited version has notes with it and
is added to the blockchain as an edited version of the first.

Security cameras for Customs and Border Patrol are getting blockchain
capabilities added, per this quote: "Factom (blockchain) has integrated their
technology into two brands of cameras used by CBP at border locations to
ensure data collected from these cameras is tamper proof" \-
[http://www.activecyber.net/dhs-st-lays-out-a-broad-yet-innovative-portfolio-of-cybersecurity-research-in-this-years-showcase/](http://www.activecyber.net/dhs-st-lays-out-a-broad-yet-innovative-portfolio-of-cybersecurity-research-in-this-years-showcase/)

IoT Devices - This microprocessor can be added to an IoT device (Raspberry Pi
and the like) to secure data at the source - [https://iot-sas.tech/](https://iot-sas.tech/)

Professionals (media) and citizens can then have proof of what they created
and when, and potentially a certification can be added, along with a tool for
others to check that a video is in fact the original.

\-- Identify Fakes -- Two articles from universities working on it:

[https://jsis.washington.edu/news/deep-fakes-fake-news-and-what-comes-next/](https://jsis.washington.edu/news/deep-fakes-fake-news-and-what-comes-next/)

[https://medium.com/jsk-class-of-2019/six-lessons-from-my-deepfake-research-at-stanford-1666594a8e50](https://medium.com/jsk-class-of-2019/six-lessons-from-my-deepfake-research-at-stanford-1666594a8e50)

None of the above completely solves the issue, but by implementing these and
having standards, the public can start to trust that what they are watching is
authentic.

------
gambler
The scare over this is overblown.

1\. If the image quality is good and if you pause the video, the artifacts are
pretty obvious.

2\. Making meme videos is not a problem. You have to think about serious
scenarios and ask yourself whether they were impossible before. In a lot of
cases they were possible without using AI. People have edited interviews and
staged recorded events since forever.

~~~
jerf
Memesters are at the bottom end of the tech hierarchy. They're using stuff
that's cheap or free.

That doesn't mean everybody is.

------
sdfjgoiwerj
I think this is where it will shine: creative people who want to make movies
but don't have enough money to cast more than maybe a handful of people.

------
atonalfreerider
How about training a neural net on fake videos to make it able to detect faked
videos?
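The hard part is hidden in that sentence, but the overall shape is just supervised binary classification. A toy sketch in pure Python, with made-up two-number "features" standing in for real artifact detectors (actual systems train deep networks on face crops, and tend to generalise poorly to generators they haven't seen):

```python
import math
import random

random.seed(0)

# Made-up features standing in for e.g. blending-artifact scores:
# real frames cluster low, fakes cluster high.
real = [(random.gauss(0.2, 0.1), random.gauss(0.3, 0.1)) for _ in range(200)]
fake = [(random.gauss(0.8, 0.1), random.gauss(0.7, 0.1)) for _ in range(200)]
data = [(x, 0) for x in real] + [(x, 1) for x in fake]

w = [0.0, 0.0]
b = 0.0
lr = 0.5

def predict(x):
    """Probability that a frame with features x is fake."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

# Plain logistic-regression training loop (SGD on log loss).
for _ in range(200):
    random.shuffle(data)
    for x, y in data:
        g = predict(x) - y
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(round(accuracy, 2))   # separable toy data, so near 1.0
```

On cleanly separable toy data this works trivially; the reason "easier said than done" is the right reply is that real fakes and real footage don't separate along any features we know how to hand-pick.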

~~~
brootstrap
easier said than done my friend.

~~~
zarmin
whattya mean? you just train it. it's a neural network. /s

------
llamataboot
What are some of the libraries out there to do this stuff with?

~~~
coolspot
[https://github.com/deepfakes/faceswap](https://github.com/deepfakes/faceswap)

------
MaupitiBlue
Not knocking it at all, but much like CRTs used to provide free anti-aliasing,
it seems as though part of the blending is achieved by the compression
artifacts.

------
RappingBoomer
our benevolent media is worried about losing control of the propaganda
machine..

~~~
SketchySeaBeast
I really don't see a reason to celebrate our accelerating slide into
solipsism.

------
xwdv
I find that putting a deeply faked face of an enemy on a video where someone
is brutally killed or executed to be strangely pleasing.

