
The US military is funding an effort to catch deepfakes and other AI trickery
https://www.technologyreview.com/s/611146/the-us-military-is-funding-an-effort-to-catch-deepfakes-and-other-ai-trickery/
======
AlexandrB
We're rapidly heading into a world where every form of non-fictional media has
to be mediated by a trusted third party because we have no way to distinguish
what's true from what's fabricated with our own senses. This is a complete
reversal of the verifying power that images, audio, and video once provided to
everybody _regardless_ of income or education. Think about some of the potent
imagery that came out of wars like WWII and Vietnam, for example.

We're not too far away from the manufacture of literal "fake news" out of
whole cloth. I'm not sure this bodes well for the idea of an informed
electorate.

~~~
ballenf
We may later view the short period in which A/V recordings were treated as
authoritative over first-hand accounts as a brief "glitch" in the history of
the world.

The ability to fake such things means that we have to return to trusting our
fellow human and making wise choices about who can be trusted.

Fingerprints and DNA samples will remain more difficult to fake, to varying
degrees, but one must assume that those too will fall as paragons of
unassailable guilt (or innocence, but usually guilt).

~~~
sambull
Those should already have fallen. Gross human incompetence/perverse
incentives to use science as a black box for prosecution.

~~~
tonto
You really just don't like forensic science or what

~~~
on_and_off
[https://www.wired.com/story/dna-transfer-framed-murder/](https://www.wired.com/story/dna-transfer-framed-murder/)

DNA is not the end-all of an investigation.

------
michaelbuckbee
Humans have such issues with confirmation bias [1] that I worry it's not going
to matter.

The last election already had algorithmically generated artificial news
designed to sway people (note: I'm talking FB articles with headlines like
"$POLITICIAN just insulted $NICHE_AUDIENCE. STOP THEM." that linked to
articles that were scraped/spun from other sources). They relied upon people
not actually reading and just hitting like/forward/heart whatever and
spreading the top level message.

We are not going to be able to convince ourselves, never mind other people,
that video (which has been the gold standard for "truth" for nearly a century)
isn't actually real.

Case in point:
[https://www.youtube.com/watch?v=cQ54GDm1eL0](https://www.youtube.com/watch?v=cQ54GDm1eL0)
(which was included in the article) has comments from people deeply confused
by it on YT.

1 - [https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds](https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds)

~~~
0xfeba
Well hopefully there will be such an influx of fakes everywhere that people
will at least begin to accept that anything can be faked, and hopefully take
the next step of thinking a little more critically about what they hear next.

~~~
craftyguy
I see it going to the other extreme, where people believe anything can be
faked, and politicians, criminals, etc. start to dismiss evidence of their
wrongdoings as fake, and people believe them.

------
dzink
Any Deepfake detection tool would also be a GAN sparring partner for the
Deepfake makers.

~~~
amelius
Hence they probably shouldn't make it public for it to be effective.

~~~
some_random
But then everyone has to trust that whatever the US Military says about the
video is true, which kinda ruins the point.

------
mjh2539
Why isn't this a solution:

Everyone has some gpg-like setup.

If we see a statement or video or audio by a person accompanied by a public
(graphic or audible) key that checks out...ok, done, verified. If we don't,
then we can justifiably disregard the statement or video or audio.
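The flow being proposed can be sketched roughly like this. To stay runnable with only the standard library, the sketch uses an HMAC with a shared key as a stand-in for real public-key signatures; an actual GPG-like setup would sign with a private key and let anyone verify with the published public key (e.g. Ed25519). All names here (`SPEAKER_KEY`, `sign_media`, `verify_media`) are illustrative, not a real GPG API.

```python
import hashlib
import hmac

# Stand-in for a keypair: a shared secret. In a real deployment the
# speaker signs with a private key and the world verifies with the
# matching public key; hmac is used here only because it is stdlib.
SPEAKER_KEY = b"speaker-secret-key"

def sign_media(media: bytes, key: bytes) -> str:
    """Produce a signature over the raw media bytes."""
    return hmac.new(key, media, hashlib.sha256).hexdigest()

def verify_media(media: bytes, signature: str, key: bytes) -> bool:
    """Check the signature against the media (constant-time compare)."""
    expected = hmac.new(key, media, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

video = b"raw video bytes of the speech"
sig = sign_media(video, SPEAKER_KEY)

# Signed and verified -> endorsed, accept it.
assert verify_media(video, sig, SPEAKER_KEY)

# Tampered or fabricated media fails verification -> disregard it.
assert not verify_media(b"deepfaked video bytes", sig, SPEAKER_KEY)
```

The key limitation (as the reply below notes) is that absence of a signature is ambiguous: it proves nothing about whether the media is fake.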

~~~
stiglitz
Would work in some but not all scenarios. Perhaps the video is legitimate but
embarrassing for the represented party, so the represented party declines to
sign it.

~~~
mjh2539
You're right...so here's the logical space:

A. Signed video/audio/text of/by a person: we have knowledge that this media
bears their signature; they endorse it.

B. Unsigned:
B1. Unsigned because it's not them
B2. Unsigned because they don't want it to be associated with them

I think this is still a better state of affairs because we can verify positive
endorsements as genuine or not.

You're right, we couldn't verify stuff they refused to sign, but!

Imagine the state coerces people to carry small devices with radio
transmitters that constantly transmit a signed key, once per second, ad
infinitum; that way no one can plausibly deny that it was them (unless
someone plants the person's device on an imposter).

~~~
sushid
If I have some embarrassing/damning video I want to post online you can bet
your ass I'm going to scrub off any metadata.

If there could be some repercussions (e.g. police brutality, gang violence,
whistleblowing, etc.) I'm going to go above and beyond to make it not
traceable to me. And you're here asking the government to make it a law to
make all video outputs traceable to the creator?

Perhaps you should apply to the NSA or Palantir.

~~~
stiglitz
I think it’s a thought experiment and not a suggestion. And the transmitter is
for the person being videotaped, not the person creating the video. Though of
course you raise a good point about anonymity of the video creator.

------
satokema
Half the galaxy's engineers are busy trying to jam the AIs while the other
half are busy trying to jam the AI jammers.

~~~
Florin_Andrei
Starts to resemble biological ecosystems.

~~~
TeMPOraL
And here we thought we're better than nature. No, we've just replaced
competition of designs with competition of goals.

------
2bitencryption
Slight tangent, but let me just say that "deepfakes" is the perfect sci-fi
terminology for the technology.

I could imagine an entire series (or part of a series) about "deepfakes".

I hope the name sticks, as the technology inevitably becomes more common.

------
laythea
If such a system to automatically detect this was created, then I would
suppose it would become the story instead.

Whatever method is used would have to come up with a reference photo (or
"model"), i.e. what the photo "should" be. And this tech could then be used
to make more convincing deepfakes and other AI trickery. Hurray!
Progress! :)

------
sandworm101
Why bother creating these fakes? Saying something loud and often enough is all
you need. Throwing money at social media is far more effective than creating
deepfakes. Also, if the US military did spot a deep fake, do they have the
trust for people to believe them? They are picking an uphill fight.

~~~
some_random
>Saying something loud and often enough is all you need.

Depressingly true, but keep in mind that words < pictures < video when it
comes to provoking an emotional reaction.

>Also, if the US military did spot a deep fake, do they have the trust for
people to believe them?

Depends on how their deepfake detection method works. If it's open source, or
at least publicly available and verifiably functional, then who wouldn't
believe them? No one thinks their GPS is lying to them.

------
romaniv
Right now the visuals for these fakes seem to fall apart at higher resolutions
or when there is any significant head rotation. This is actually better than
what we have in the realm of images, where almost anyone can create a
"realistic" Photoshop of the vast majority of scenes.

~~~
bhhaskin
It's only a matter of time though.

~~~
Florin_Andrei
Likely you're right.

OTOH, when time is a factor (as in movies - temporal succession of images),
perhaps simple curve fitting may never be quite perfect? Perhaps you need
anticipation, and counterfactual thinking, cause and effect, and all that?

Also, like in video games, maybe even some understanding of real world physics
may be required for a perfect fake.

~~~
yorwba
Most of early physics was done by curve fitting, and on very small datasets.
Simple Newtonian dynamics shouldn't be much of a hurdle for a neural network.
That's about as much anticipation of cause and effect as you'd need for faking
most kinds of videos.
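The "physics by curve fitting" point can be illustrated with a toy, stdlib-only example (the numbers are made up for illustration): for motion under constant acceleration, the second difference of sampled positions is constant, so "fitting" the curve to recover g is just averaging that quantity.

```python
# Projectile positions sampled at fixed intervals:
# y(t) = v0*t - 0.5*g*t^2, with v0 = 30 m/s and g = 9.8 m/s^2.
dt = 0.1
v0, g = 30.0, 9.8
ys = [v0 * (i * dt) - 0.5 * g * (i * dt) ** 2 for i in range(10)]

# For a quadratic trajectory the second difference is exactly
# y[i+1] - 2*y[i] + y[i-1] = -g * dt^2, so estimating g is a
# one-line "curve fit": average the second differences.
second_diffs = [ys[i + 1] - 2 * ys[i] + ys[i - 1]
                for i in range(1, len(ys) - 1)]
g_est = -sum(second_diffs) / len(second_diffs) / dt ** 2

print(round(g_est, 3))  # recovers g = 9.8 from the samples alone
```

A network that can fit curves can pick up this kind of regularity from data, which is the comment's point about Newtonian dynamics not being much of a hurdle.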

------
smallnamespace
The ability to fake something is equivalent to the ability to rewrite the
past. As our ability to fake things increases, more and more of our knowledge
then comes into doubt.

I think we'll end up in a world where the only way to prove what the past
really was is to hash it and stick it on a blockchain somewhere. One can then
see PoW as a constant tax being paid to maintain a literal link to a past
when deepfakery did not exist.

~~~
londons_explore
Fancy making a bot which crawls the web, makes a massive merkle tree, and then
puts the head of the tree into a bitcoin transaction?

You could then monetise it by charging a nominal fee for the path from any
given bit of data to a timestamped bitcoin transaction.

The disadvantage is the cost to do this involves "get access to all data on
the web", but perhaps you could partner with the web archive or something to
do it?
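The bot being described boils down to standard Merkle tree machinery: hash each crawled document, fold the hashes up to a single root (which is what you'd embed in a Bitcoin transaction), and sell the sibling-hash path that links any one document to that root. A minimal stdlib-only sketch:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold leaf hashes pairwise up to a single 32-byte root."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate last node on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes linking leaves[index] to the root."""
    level = [sha256(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1               # paired node at this level
        proof.append((level[sib], sib < index))
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Recompute the path from a leaf; True iff it reaches the root."""
    h = sha256(leaf)
    for sibling, sibling_is_left in proof:
        h = sha256(sibling + h) if sibling_is_left else sha256(h + sibling)
    return h == root
```

Only the root needs to go on-chain; each proof is O(log n) hashes, which is what makes charging per-path feasible even over a web-scale crawl.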

------
freeflight
Blaming fakes for all kinds of things is just the extension of the "war on
privacy and anonymity".

Why are fakes supposedly such a big problem? Their identity might be fake, but
that doesn't make their ideas any less attackable and that's what it should be
about: Ideas, not identities.

If an idea is good I couldn't care less who had it, I only care about the idea
itself. Yet all the public discourse focuses on "fakes spreading wrong ideas",
even the recent Facebook EU hearing had that as a major topic, with Zuckerberg
constantly reiterating that catching fake profiles, who could influence
elections, is one of Facebook's top priorities (which I don't doubt).

But barely anybody seems to make the effort to think this through to its
conclusion; once you do, you realize that we are heading in exactly the same
direction China is heading: no anonymity, social media becoming the de facto
replacement for government institutions. Is that really the world we want to
live in?

~~~
cortesoft
This isn't about ideas, it is about faking facts. You are telling me you don't
care if someone created a video of you viciously murdering someone, and then
that video was used to convict you?

Sure, ideas can stand on their own, but we have to have facts that we can know
so that we can use those facts to judge the ideas. If we can fake images and
videos, we can create fake facts, and that is certainly dangerous.

------
ryanwaggoner
I don’t understand the hand-wringing about this. We have had this problem for
decades with all forms of media other than video. Text, photos, and audio are
all easily faked and can be done so to a degree indistinguishable by 99.99% of
people. Why weren’t we all terrified that bad actors would create fake images
or audio files that the stupid electorate would find so persuasive that it
would overthrow governments and swing elections? What makes video different?
And in particular, why won’t video quickly join the list of things that people
don’t trust without a credible source vouching for it, like we do everything
else?

------
mathinpens
Anyone know the name of the grant/program/challenge ? I didn't see it in the
article..

~~~
joeyo
MediFor - [https://www.darpa.mil/program/media-forensics](https://www.darpa.mil/program/media-forensics)

Looks like the BAA came out in 2015:
[https://www.fbo.gov/spg/ODA/DARPA/CMO/DARPA-BAA-15-58/listing.html](https://www.fbo.gov/spg/ODA/DARPA/CMO/DARPA-BAA-15-58/listing.html)

------
calebh
Is there a way to cryptographically sign a video to make sure that it's tamper
free? I'm thinking about some sort of encryption that happens directly on the
camera while a video is being recorded.

~~~
danenania
You can use a signature to verify that a video hasn’t been modified, but that
doesn’t really help in itself with fakes since fakes can be signed too. And if
the cameras are doing the signing, you’d just need to extract the private key
from the camera and use it to sign fakes. It would require something like the
secure enclave to protect the private key and, of course, we’d then be
trusting the camera manufacturers.

~~~
JoshTriplett
Not to mention creating a world in which people can't build their own cameras,
or change the software on those cameras...

~~~
ViViDboarder
Sure you can. Just wouldn’t be usable as factual proof of an event.

------
L_226
A potential stopgap: increase the computational power required for fakes by
only releasing 'official' media with multiple viewing angles and device
signatures.

e.g. a press conference containing multiple mobile phone camera videos of the
speaker from different audience perspectives. The models will probably catch
up eventually and allow this to be faked too.

------
mehrdadn
Are there high-resolution deepfakes out there yet, or are they within sight?
Ones that you couldn't catch by eye if you just zoomed in a bit?

------
artemisyna
Alas - the 200 years or so by which humans had visual 'proof' that something
did or did not happen is rapidly about to end. :)

------
majos
Side note: I wish the "killer app" for deepfakes hadn't been porn. The
technology was impressive, but having such a scuzzy popular use case made
sharing it with others kind of fraught.

~~~
bhhaskin
What would the other use be? Manipulation of news and media?

~~~
Maybestring
TV/cartoon mashups, AR 'stickers', plenty of silly entertainment possibilities.

------
crb002
Lol. More like DARPA wants to use deepfakes in psyops without getting caught.

~~~
alexeldeib
It carries just as much risk of being used against them, no? If not more,
from getting fooled by deepfakes into doing something dumb. They want to
weaponize it, sure, but also to protect _themselves_.

