
A scientist who spots fake videos - rstoj
https://www.nature.com/news/the-scientist-who-spots-fake-videos-1.22784
======
jw1224
This seems like a good time to mention Captain Disillusion.

Filmmaker and video editor Alan Melikdjanian started a YouTube channel
back in 2007. He debunks viral videos and adverts while performing in
character as the rather surreal "Captain Disillusion".

Don't let the bizarre performance scare you off - his production values are
simply outstanding, especially considering he does almost all of it
single-handedly, and has done so with hardly any recognition for the past 10
years. I think his channel only hit 100k subscribers last year, so it looks
like he might finally be getting the attention he deserves.

Here are a few links to some of his best stuff. You'll probably be thinking
"WTF is this?" when you first start watching, but trust me and stick with it:

\- [https://www.youtube.com/watch?v=H_slT5YpBok](https://www.youtube.com/watch?v=H_slT5YpBok)

\- [https://www.youtube.com/watch?v=KbgvSi35n6o](https://www.youtube.com/watch?v=KbgvSi35n6o)

\- [https://www.youtube.com/watch?v=NXCtgI2lkFw](https://www.youtube.com/watch?v=NXCtgI2lkFw)

~~~
coryl
He is financially supported by crowdfunding
([https://www.patreon.com/CaptainDisillusion](https://www.patreon.com/CaptainDisillusion))
and makes videos full time.

~~~
skinnymuch
He has 500K subscribers and is growing rapidly. I assume he makes a decent
chunk of change from YouTube itself too?

------
chriskanan
As he mentions in the article, DARPA currently has a large media forensics
program going on. They have split it into teams making manipulated videos
and images, and other teams trying to detect those manipulations. They held a
workshop at CVPR 2017. I'm on one of the teams making manipulated data.

[https://www.darpa.mil/program/media-forensics](https://www.darpa.mil/program/media-forensics)

~~~
kaffeemitsahne
It's nice that they call it "media forensics", but it would seem that making
manipulated videos is much more useful from a military standpoint.

------
ekidd
I had a lot of great college professors, but Hany Farid (the subject of this
article) was always one of my absolute favorites.

He always placed a strong emphasis on mathematical fundamentals, once joking
in class that we only think AI is hard because we're "bad at math." To prove
it, he showed us endless examples of how the right math makes problems far
easier. For instance, I remember him showing how a correct understanding of
something like the sinc function improves directional derivatives of sampled
data, and then turning that into a better video motion tracker.

He loved working with colleagues in other departments, solving all kinds of
fun problems and publishing joint papers. At one point, I think the campus
magazine was writing articles on him several times a year, because he always
had something cool going on.

------
coldcode
I wonder how the courts and elections will survive if this type of fake ever
exceeds our bandwidth and investment to detect it. Maybe ultimately no one
will believe video any more? Maybe people will ignore fake detection
altogether and go with whatever their biases tell them is true? Either way,
electing or convicting anyone might become too easy to manipulate.

~~~
freeflight
> Maybe ultimately no one will believe video any more?

People believe whatever they want to believe; forensic hair analysis (not DNA)
is quite a controversial practice, yet it's still accepted as evidence by
many courts in the US. Last Week Tonight had a bit about that issue in a
recent episode [0].

It's sad and mind-blowing that people have been sentenced to prison or death
because a so-called "expert witness" mistook a dog's hair for theirs.

[0]
[https://www.youtube.com/watch?v=ScmJvmzDcG0](https://www.youtube.com/watch?v=ScmJvmzDcG0)

------
epmaybe
Note that the article also talks about how to detect standard image
manipulation. This could be useful for me when reading academic papers and
questioning the visual representation of results.

------
olegkikin
We're already at the stage where one can't trust video evidence unless it's
backed by the camera's signature/encryption.

Creating a very realistic fake is now trivial:

[https://www.youtube.com/watch?v=ohmajJTcpNk](https://www.youtube.com/watch?v=ohmajJTcpNk)

[https://www.youtube.com/watch?v=nsuAQcvafCs](https://www.youtube.com/watch?v=nsuAQcvafCs)

Google is very close to synthesizing realistic voices.

It's game over, as far as I can see.

It's only a matter of time before someone creates a fake video of someone
famous saying something outrageous, like Nazi propaganda, and it destroys
that person's career and life.

We really need something like Secure Enclave in every camera.
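To illustrate, here's a minimal sketch of what per-frame signing inside a camera could look like. This is purely illustrative: it uses an HMAC with a symmetric key as a stand-in for the asymmetric signature a real secure element would produce, and the key and function names are made up:

```python
import hashlib
import hmac

# Stand-in for a per-device key provisioned inside a secure element.
# A real design would use an asymmetric keypair (e.g. Ed25519) so that
# verifiers never need to hold the signing key.
DEVICE_KEY = b"example-device-key"

def sign_frame(frame_bytes: bytes, prev_tag: bytes = b"") -> bytes:
    """Tag a frame, chaining it to the previous frame's tag so frames
    can't be reordered or dropped without breaking verification."""
    return hmac.new(DEVICE_KEY, prev_tag + frame_bytes, hashlib.sha256).digest()

def verify_stream(frames, tags) -> bool:
    """Recompute the tag chain and compare it to the recorded tags."""
    prev = b""
    for frame, tag in zip(frames, tags):
        if not hmac.compare_digest(sign_frame(frame, prev), tag):
            return False
        prev = tag
    return True
```

Even with something like this in hardware, the "analog hole" of pointing a camera at a screen remains, of course.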

EDIT:

Another related video:

[https://www.youtube.com/watch?v=hPksv1gJet4](https://www.youtube.com/watch?v=hPksv1gJet4)

~~~
Retr0spectrum
I'm not sure in-camera signatures are the way to go.

In the very simplest case, someone could just point a camera at a very high
quality screen and record that, generating a signed video.

A more complex attack would be to effectively emulate the image sensor and
pipe image data straight into the camera.

If you want to prove that a video was filmed on or before a certain time, one
way would be to hash it and put that hash on a blockchain, but that doesn't
really solve the problem of authenticity.
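The hash-then-publish step is easy to sketch. A minimal version (the actual publishing to a blockchain or timestamping service is omitted; the chunked read is just so large video files don't have to fit in memory):

```python
import hashlib

def video_fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 digest of a video file, computed incrementally."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read the file in 1 MiB chunks and feed each into the hash.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Publishing that digest somewhere immutable proves the file existed by that time, but, as noted, it says nothing about whether the content is authentic.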

~~~
olegkikin
Try it. Record something off a screen (including the audio) and see what
happens. I've yet to see a single example of that which looks like reality.

Emulating a sensor would immediately reduce the number of capable attackers
by orders of magnitude.

I agree, though, that what I'm suggesting is not 100% bulletproof. But it
uses proven technology, and it's relatively simple, assuming hardware
manufacturers are willing to add one small chip to their cameras.

~~~
asciimo
I've seen a number of YouTube and liveleak videos that were obviously made by
recording a security monitor with a phone. Seems to be an acceptable practice
for certain genres, and is usually credible.

~~~
gregmac
Which only makes things worse: if you're trying to create fake security
camera footage, you just need to make it look real enough to play back on a
"security monitor" and record _that_ with a phone camera.

------
namanyayg
FotoForensics [1] is a great tool for checking whether images have been
airbrushed/manipulated. It appears to work by analyzing JPEG compression
levels (error level analysis) and RGB averages.

\-- [1] [http://fotoforensics.com/](http://fotoforensics.com/)

~~~
sillysaurus3
Unfortunately it doesn't really work except in constrained cases. It's easy to
convert an image to PNG and back.

~~~
striking
What? The JPEG compression artifacts, and the ways edits disturb them, are
not undone by a simple format conversion.
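For reference, the JPEG-level check being debated here is usually called error level analysis (ELA). A minimal sketch using Pillow (this bare-bones version just returns the raw difference image; real tools scale and visualize it, and FotoForensics' exact pipeline is an assumption here):

```python
import io

from PIL import Image, ImageChops  # Pillow

def ela(image: Image.Image, quality: int = 90) -> Image.Image:
    """Re-save the image as JPEG and return the per-pixel difference.

    Regions edited after the image's last save tend to recompress with
    a different error level than the rest of the picture, showing up as
    bright areas in the difference image. A PNG round-trip changes the
    container, not the pixels, so residual JPEG artifacts survive it.
    """
    buf = io.BytesIO()
    rgb = image.convert("RGB")
    rgb.save(buf, "JPEG", quality=quality)
    buf.seek(0)
    return ImageChops.difference(rgb, Image.open(buf))
```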

------
tunaoftheland
I recently listened to an episode of Radiolab that includes an interview with
Hany Farid: [http://www.radiolab.org/story/breaking-news/](http://www.radiolab.org/story/breaking-news/)

It's pretty interesting to think about the gap between what is possible now
and how well the general population understands it.

------
beached_whale
It is intriguing to think about this in reverse: normalisation of the error
rates across an image, and the techniques to achieve or detect that. Also,
rebuilding images according to specific camera profiles so that the per-pixel
statistics match.

------
dunk010
Captain D is basically the best thing on the internet.

