
Generating convincing audio and video of fake events - sohkamyung
https://www.economist.com/news/science-and-technology/21724370-generating-convincing-audio-and-video-fake-events-fake-news-you-aint-seen
======
ideonexus
Hasn't this already happened? Remember the Project Veritas videos of ACORN
employees giving advice to a pimp on how to run his child prostitution ring
that were all over the news in 2009? Multiple investigations found the videos
were so heavily edited as to be fake:

[https://en.wikipedia.org/wiki/ACORN_2009_undercover_videos_c...](https://en.wikipedia.org/wiki/ACORN_2009_undercover_videos_controversy#Investigations_of_ACORN_and_the_videos)

The original videos were everywhere, ACORN was driven to bankruptcy, and the
investigations revealing the videos were faked got hardly any news coverage.
The art project described in this post is quite quaint in comparison to this
very real-life event.

~~~
cazum
To expand on this point further, the phenomenon of synthesizing accounts of a
person saying something has existed for thousands of years.

If I tell you that "Trump said that low income families will receive
unconditional, federally sponsored medical coverage," you will not immediately
take my word for it. You will critically analyze my statement by scrutinizing
who I am to make this claim, and the claim's congruence with what you already
know to be reality. I am not a reputable publication, so you are not likely to
believe me, because the claim is contrary to what is already known.

The same will become true of video. A video of what looks like somebody saying
something will become as useful as hearsay testimony. We will, through
necessity, begin critically scrutinizing the source of footage, not just
taking its content at face value.

In many ways this already exists in the photography world. Convincing photo
manipulation is old technology at this point, and I think we've adequately
adapted to it.

~~~
DougWebb
_In many ways this already exists in the photography world. Convincing photo
manipulation is old technology at this point, and I think we've adequately
adapted to it._

Have we? I'm pretty sure there's still a serious problem with young girls
developing their self-image based on heavily manipulated photos of models in
the magazines they read. That's a relatively benign example, at least as far
as the purpose of the manipulation goes, which can be argued to be 'artistic'. An
intentionally-damaging example of photo manipulation is much of the falsified
propaganda that was distributed last year during the presidential campaign,
which I'm sure included a lot of manipulated photos. That stuff definitely
influenced the election, and our whole political climate. I don't think
people, in general, have adapted to the idea that what they see isn't
necessarily real.

------
some1else
The article's title is a bit off. The audio is not generated, the video is not
convincing, and the original conversation is not fake.

This is an art project which highlights the implications of research like
Face2Face: [https://youtu.be/ohmajJTcpNk](https://youtu.be/ohmajJTcpNk)

~~~
olivermarks
Glad you posted this, I was about to look for this same link to share. Between
this and Adobe's voice emulation software
[https://thenextweb.com/apps/2016/11/04/adobes-upcoming-
audio...](https://thenextweb.com/apps/2016/11/04/adobes-upcoming-audio-tool-
lets-you-synthesize-speech-in-anyones-voice/) it appears possible to create
very realistic 'footage' of constructions based on real people doing and
saying things. The film industry already creates computer-generated characters
of deceased actors (Star Wars, for example). This makes the Economist piece
seem very out of touch with reality, and presumably there are far more
sophisticated technologies we don't know about too....

------
dalbasal
Most prestigious publications have weighed in on this topic, with all its
titles and euphemisms: fake news, alternative facts, conspiracy thinking, echo
chambers..... mostly with nothing to say.

Scientific American recently ran a feature that may as well have been titled:
" _You won't believe the SHOCKING TRUTH about FAKE news!!_"(1) For the first
half of the article, they cited credentialed research institutes, people, and
sub-disciplines like computational sociology, and talked about recent
"advances" made in "studying social phenomena in a quantitative manner."

As far as I can tell, the only content was "confirmation bias exists," with a
non-explicit implication that our understanding of this newly discovered
phenomenon is advancing fast. There was also an anecdote about Texas.

Everyone seems to want to weigh in on this conversation, but no one seems to
have anything substantial to say about it.

So sure, "conspiracy theorists" will be able to challenge video evidence. I
don't know if that changes anything.

(1) actually titled "inside the echo chamber"

------
mooreds
I remember an old sf book (sorry, forget the name) where this kind of "fake"
video is used for nefarious purposes. Imagine a video of a president saying
something inflammatory that they never said.

I talked about that with a friend who does video editing back in the early
2000s, and she said they could do it even then.

So, while GANs may make it far easier to build up fake soundbites, the
capability has been around. We just need restraint in the media and ways for
the common person to verify info. (Given how easily fake news spreads around
FB, the latter is probably more important.)

~~~
pdkl95
> ways for the common person to verify info

If that could be done automatically, we could skip a lot of time consuming,
expensive research. Church and Turing already proved there are no general
solutions to the _Entscheidungsproblem_.

On the other hand, if you were thinking of some sort of service that provides
"authoritative" verification, you've only moved the problem. The service can
be faked (or corrupted) just as easily. Similarly, we already have many
historical examples where restraints on media are (_de facto_) used against
political enemies.

What we need is a way to educate people with the scientific method and just
enough logic to implement it practically. Sagan discussed this problem in "The
Demon-Haunted World: Science as a Candle in the Dark", which I regularly
recommend to anybody that seems to need his "Baloney Detection Kit"[1].

[1] [https://www.brainpickings.org/2014/01/03/baloney-
detection-k...](https://www.brainpickings.org/2014/01/03/baloney-detection-
kit-carl-sagan/)

~~~
ShannonAlther
The issue, unfortunately, is that if I were to generate a video of Mike Pence
doing something X-rated with another man, there's no conceivable way the
scientific method would help. I mean, if he were a closeted homosexual, would
you really immediately discount it as faked just because the evidence was
against it? That happens to conservative American politicians _all the time._

Worse, the current president has set a new bar for saying outlandish things.
Every time he opens his mouth I am literally shocked into silence. The number
of things I genuinely cannot imagine him saying is shrinking rapidly. If you
were to see a video of Trump walking up to a podium like Obama did when he
announced Bin Laden's death and saying very solemnly that bombs were about to
be dropped on Russia, would your first thought be "Preposterous! There's no
way the man would do _that_; it must be a clever hoax"?

Not to mention that if this technology becomes exceptionally user-friendly it
could be used to destroy lives in myriad ways. Where is your science now?

~~~
pdkl95
I would say that such a video is none of my business; just like Clinton's bj,
that's a private topic that should be relevant only to the people involved.

I get your point, that scandals involving outdated systems of morality _are_
still a problem in some areas. Technology marches on, and we must learn - as a
species - to update our social and moral systems _fast_ or we may find out
that intelligence is not[1] an effective survival trait in the long-term.

> The number of things I genuinely cannot imagine him saying is shrinking
> rapidly.

Trump's speech is _very_ predictable in its general theme. The details are
random, but he always says things that are self-serving, in a childish
"obviously I was always right" attitude. If there is a way to add a not-
very-subtle insult, he will add it.

Most important, it's foolish to try to parse his statements for _information_,
because that's not how he uses language. Instead, he uses phatic[2] language
to express emotion, confidence, and social standing.

> [Trump] saying very solemnly that bombs were about to be dropped on Russia

Why would he bomb the people he wants to do business with? The few comments
Trump has made about increasing talks with Russia are among the very few
things I like about him (not that I expect he will actually do anything
whatsoever on that topic). Anything that reduces the risk of _nuclear war_ is
great.

> I am literally shocked into silence.

Fear is the mind-killer. Recently a _lot_ of people - orthogonal to political
ideology - are doing everything in their power to push the politics of fear.
Almost all of it is unsubstantiated hot air... which is what the
Baloney Detection Kit is for. Science isn't perfect, but it's still the best
method we have for filtering out charlatans, propagandists, and simple
mistakes.

[1]
[https://news.ycombinator.com/item?id=8916033](https://news.ycombinator.com/item?id=8916033)

[2]
[https://news.ycombinator.com/item?id=14577909](https://news.ycombinator.com/item?id=14577909)

~~~
ShannonAlther
The point wasn't about outdated morality. It could just as easily be a
doctored video of Theresa May stabbing a man to death in cold blood.

I think you've focused too much on the specific examples I gave. Consider a
faked video of your SO committing adultery, or a faked video of you committing
a crime that's used to convict you, or a video of Merkel saying Putin has
erectile dysfunction or something.

You can't just analyze these videos for inconsistencies; at some point, the
technology becomes more powerful than human ability to discern whether it's
real. Then what? What do we do once video evidence becomes unreliable?

(By "shocked into silence" I mean, "stunned by how
asinine/insane/pointless/cruel that just was".)

------
nradov
The article's recommendation to demand metadata seems pointless. Metadata is
even easier to fake than the content itself.

~~~
avaer
Easier to _generate_, but also easier to disprove. So maybe not easier to
fake. The article briefly touches on this.
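One way forged metadata becomes easy to disprove is if publishers bind it
cryptographically to the footage, so that any edit to either part breaks
verification. A minimal sketch of that idea (the key, field names, and clip
bytes here are all invented for illustration; a real scheme would use
asymmetric signatures from a device or publisher key, not a shared HMAC key):

```python
import hashlib
import hmac
import json

# Stand-in for a publisher's private signing key (illustrative only).
SECRET = b"publisher-signing-key"

def sign(footage: bytes, metadata: dict) -> str:
    """Bind footage and metadata together under one authentication tag."""
    payload = footage + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(footage: bytes, metadata: dict, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(footage, metadata), tag)

clip = b"\x00fake-bytes-standing-in-for-video"
meta = {"time": "2017-07-01T12:00:00Z", "location": "studio 4"}
tag = sign(clip, meta)

assert verify(clip, meta, tag)                     # authentic clip passes
forged = dict(meta, location="somewhere else")
assert not verify(clip, forged, tag)               # tampered metadata fails
```

Generating plausible-looking metadata is trivial; generating metadata that
verifies against a key you don't hold is not, which is the asymmetry the
article gestures at.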

------
iambateman
I wish projects like the Hardy video had a convention to tell the audience
that they're fake.

Sometimes the late-night shows release satire videos that are difficult to
tell apart from real footage. We should have a logo for "btw, this news
footage is a joke."

~~~
phreeza
I disagree, sharpening the skill of figuring out what is fake and what is not
is important. If you don't learn that, it puts you even more at the mercy of
malevolent people putting out fake stuff (which they would then of course not
mark).

~~~
eponeponepon
The issue is that there comes a point where the human skill becomes
insufficient - the technical capacity to fake exceeds the cognitive capacity
to detect it. Sufficiently advanced technology etc.

The solution is not clear.

~~~
mirimir
Right. So authenticity becomes a matter of technical analysis. Dueling
experts. And given alternative versions of events, people will believe what
they like.

~~~
amelius
Or how about hard laws that forbid fabricated news (unless tagged as such)?
I'm not saying it is a solution, but it should be explored.

------
calafrax
Nothing new here. GANs are still not nearly as good at creating fakes as
humans are.

------
Ericson2314
I think they mixed up which network is the adversary?
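For reference, in a GAN the discriminator plays the adversary: it tries to
detect forgeries, while the generator tries to fool it. A toy, gradient-free
sketch of that division of labor (the "real data clusters around 0.5" setup
and all constants are invented for illustration; real GANs train both
networks jointly by gradient descent):

```python
# Toy adversarial setup: the *discriminator* is the adversary/critic that
# scores how "real" a sample looks; the *generator* adjusts its forgery
# until the critic can no longer tell it apart from real data.

def discriminator(sample: float) -> float:
    """Score in [0, 1]: real data in this toy clusters around 0.5."""
    return max(0.0, 1.0 - abs(sample - 0.5) * 2.0)

def generate(steps: int = 100, lr: float = 0.01) -> float:
    """Hill-climb the forgery toward whatever fools the critic most."""
    forgery = 0.0  # generator's starting guess
    for _ in range(steps):
        up = discriminator(forgery + lr)
        down = discriminator(forgery - lr)
        forgery += lr if up >= down else -lr
    return forgery

fake = generate()
# The forgery settles near 0.5, where the critic rates it as real.
print(round(fake, 2), round(discriminator(fake), 2))
```

In a trained GAN the generator is the network you keep (it produces the
fakes); calling the generator "the adversary" is the mix-up the parent
comment is pointing at.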

