
We’re underestimating the mind-warping potential of fake video - theonionspeaks
https://www.vox.com/science-and-health/2018/4/20/17109764/deepfake-ai-false-memory-psychology-mandela-effect
======
ryandvm
I find it amusing that we're worried about the potential of fake video/audio.
Like everything would just be fine if it weren't for that possibility.

Deep fakes haven't really shown up in the wild as a significant threat and we
already have large swaths of society that can't agree on the truth. Certainly
fake media isn't going to help any, but folks, we're ALREADY fucked.

~~~
ZeroBugBounce
So is not agreeing on facts/truth The Great Filter?

~~~
nawgszy
It seems to me it's the mere fact that humanity's rise was borne on the back
of its ruthless drive and selfish greed, and the same things that made us the
best hunters also influence society in myriad ways.

In my eyes, greed, hierarchy, and limited compassion / an inability to share
are the filter, and all society's diseases are the symptoms.

~~~
thaumaturgy
I'm really disappointed when I hear this kind of thing. It's a sentiment that
so many people believe, and want to believe, and choose to believe even in the
face of overwhelming evidence to the contrary.

It's almost like a pernicious, psychological cancer, except it can also spread
virulently to other susceptible hosts.

Humanity's rise is due to lots of factors, _especially_ the countless
individuals who have worked to improve the health, well-being, safety, and
wealth of the people around them. Greed is but one aspect of humanity, and
unbounded greed destroys societies.

Yeah, there are problems in 2018, but to sit here today, in the relative
comfort of modern society, and say "we only have all this because we're
terrible animals" is hugely disrespectful to the work of a lot of people.

~~~
nawgszy
I'm certainly not saying that we have no redeeming factors. As you say, that
would be an absurd mindset in this world we live in.

However, I don't know how you can deny that we are greedy and have a limited
capacity for compassion. Do you deny Dunbar's number? If not the specific
number, the concept?

Do you deny that some men have 10 figures of wealth while others have nothing
to their name?

Do you deny that we have, as a society, enacted a plague upon the planet, and
are causing extinction of other species at a rate only witnessed within other
extinction events?

Just because one has virtue does not mean one doesn't have vice, and certainly
the same applies to the "one" that is human society. While we certainly have
accomplished great feats and have seen great individuals, as a collective, we
do not know how to relinquish the individual pleasures in order to facilitate
comfort and health for all.

As an article I saw here recently put it, "people are not stupid; life is
just hard": your energies are limited, and there are those who take advantage
of the splintered and uncoordinated thoughts of society to gain great wealth
and power, and those who use their great wealth and power to splinter and
confuse the thoughts of society.

That is the cancer, not my thought that these people exist; do not ignore that
their existence is all but guaranteed by the very nature of the creature we
must have been to find ourselves the dominant force of this planet.

------
politician
My personal belief is that we should be generating vast amounts of deep fake
videos and audio clips now that the technology is available. We should
intentionally and hastily move to pollute the information streams around us as
rapidly as possible in order to force an evolution in how we process
information.

To attempt to slow down this arms race, or to ignore its implications just
puts us at a further disadvantage to those that want to capitalize on the
potential it brings them to arbitrage our "default accept" mindset against
their fake agenda.

For example, China's Great Firewall will not protect them from deepfakes;
indeed, it actually makes the delivery of deepfakes into China more effective.

The advantage afforded by weaponizing deepfakes only lasts as long as the
audience defaults to trust without verification.

~~~
yifanl
So a form of targeted accelerationism? That's an interesting thought, but
assuming it works as intended, wouldn't the end result of this be a total
destruction of trust?

A whole society where all non-physical interaction is treated with total
suspicion can't be a pleasant one to live in. Maybe that's the end result
either way though.

~~~
politician
The idea is to step back from the idea that we can trust media sources without
verification. Trust _but verify_.

Small groups and towns, places where people live and have a regular continuous
presence, are more likely to be places of higher trust.

Your typical broadcast media sources or your typical corporate social media
sources, places where anyone with any agenda can buy their way onto your
screen are not places that deserve high trust without verification.

Given what we know now, screens of all kinds should be (and IMO, will become)
known as low trust environments. Hastening that is paramount because in the
intervening time you have a population that's susceptible to all sorts of
manipulation.

I'm suggesting nothing new. Be nice to people, and don't believe everything
you read.

------
Bucephalus355
FWIW, all audio/video communication technologies have been somewhat comically
overestimated in their abilities. Two examples:

\- Henry Ford believed that since “sight” was the universal language, movies
(along with airplanes) would unite all people and places

\- Many people believed television was so captivating when it first started to
be introduced in the late 20s and early 30s that it would replace nearly all
books and teachers in schools

I think people adapt to technologies better than we give them credit for. Of
course in the first few years lots of scams and dangers can arise, but they
are figured out somewhat quickly by the powers that be.

~~~
Miredly
I'm sorry, but your example falls a little flat when you look at how Fox News
became the linchpin of the shit-show the US is sinking into right now.

People adapted to it, sure, but that's the problem.

~~~
Spivak
If a single TV station had the power you describe you would think they might
be a little more successful at using it by now. Not saying they have no
effect, but if the station never existed I don't think our history would be
drastically different.

~~~
gowld
What makes you think they are unsuccessful?

Sure, a TV station can't anoint a President or whatever. The power to destroy
is a weaker power than the power to create, but it is still an immense power.

Rupert Murdoch's lifelong business philosophy is that all information
gatekeepers should be destroyed, so that no one's opinion (or facts) is
elevated over any other's. It's pure libertarian-democracy of ideas that
throws the baby (education and critical thinking) out with the bathwater
(suppression of outsider voices). The modern "Internet of Trolls" is the
culmination of his philosophy.

[https://www.theguardian.com/media/greenslade/2011/jul/18/rebekahwade-newsoftheworld](https://www.theguardian.com/media/greenslade/2011/jul/18/rebekahwade-newsoftheworld)

------
dsfyu404ed
Questioning the authenticity of records (be they second hand accounts of
events, paper documents or 1s and 0s) has been the norm since it became common
for people other than royals and clergy to be able to read and write.
Jefferson and Adams would not find the concept of "fake news" foreign at all.
The sophistication of the authentication mechanism for a document usually has
to do with the use case. Official letterhead was good enough for most
important routine correspondence. Birth certificates get stamped. Money has
fairly sophisticated anti-counterfeiting features.

We'll remember the late 20th and early 21st century as the exception to the
rule because during that time period if there was an audio/video recording of
something you could be reasonably assured that it was legit. That won't be the
case going forward. Audio and video will become a free for all like every
other medium.

------
steego
It wasn't long ago that we didn't even have video, and before that we didn't
even have pictures. You had the option of believing or rejecting one or more
accounts. You also had the option of suspending your belief/disbelief until
you felt like corroborating evidence swayed you one direction or the other.
Personally speaking, I'm often surprised at how certain people are about the
information they ingest.

It reminds me of Daniel Kahneman's book "Thinking, Fast and Slow", where he
points out that people are more likely to read a paper critically if it uses a
bad font, poor layout, and some typos. It's not a ground-breaking conclusion,
but the studies also implied that people tended to read papers less critically
when they looked good. The idea is that the mind has a tendency to substitute
heuristic thinking for critical thinking whenever it can. If a paper
superficially resembles a serious paper from a well-respected journal, there's
a real temptation to ease up on the scrutiny, because many other signals are
suggesting your enhanced scrutiny is a waste of time and energy.

I can't say for sure whether our reliance on video and photographs over the
last 100 years or so has helped or hurt our critical thinking. It's very easy
to argue that the hard work of photographers has been critical to bringing
important issues and events to our attention. On the other hand, you can argue
that an over-reliance on images has kept us from engaging our imagination.

I know in a world of fake news, it would seem like the last thing we need is
to engage people's imagination. When we hear the word imagination, most of us
think about the skill that lets one fabricate novel stories or alternate
realities. We don't think about imagination as a tool that people use to try
to figure out our reality. A vivid imagination not only knows how to fabricate
alternate realities, but it also knows how to incorporate information from the
senses, books, and articles to test and consider different world
views. We need people who can absorb information about how the world works,
and can project possible realities as well as many possible futures.

I know I'm going long on this comment, and I don't want to come off as an
optimist, but maybe we can use the advent of fake video to cultivate a culture
that stops trusting passive sources of information (video) and begins to
engage healthy critical thinking instead.

I could be way off the mark and would love to hear someone tell me what I'm
missing.

------
denimalpaca
A good way to fight this is to cryptographically sign videos/audio. It might
be a bit computationally expensive compared to a text message, but for very
high-profile people, having a copy of the video/audio and its cryptographic
signature will be absolute proof against fakes. We don't even need AI
algorithms to learn how to detect fakes in this case; it just becomes a matter
of comparing hashes.
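A minimal sketch of the hash-comparison step (Python stdlib only; the byte strings stand in for real video files): even a single flipped bit changes the SHA-256 fingerprint, so a published fingerprint lets anyone check that a copy is byte-identical to the original.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 fingerprint of a media file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical: the clip as originally published, and a copy received later.
original = b"\x00\x01frame-data..." * 1000
received = bytearray(original)
received[5000] ^= 0xFF  # a single flipped bit, e.g. from tampering

assert fingerprint(original) == fingerprint(bytes(original))  # same bytes, same hash
assert fingerprint(original) != fingerprint(bytes(received))  # any edit changes it
```

Note this only proves a copy matches whatever was fingerprinted; it says nothing about whether the fingerprinted original was itself authentic.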

~~~
munk-a
When there is a binary available on the internets from various mirrors, with
the primary authority on it distributing a SHA of its contents, then we can
use that SHA to verify the contents of the binary...

The issue here is that there is no trusted authority on the video, the people
distributing the video are the content originators so we can use a signature
to verify the video wasn't corrupted in transmission but not that it wasn't
edited before it was signed.

(this technique would work well if the concern was that videos from some
primary source were being edited by a CDN entity helping to stream that video
to viewers, which is entirely separate from the article)

~~~
sixothree
Wouldn't the authority have to be the device manufacturers?

~~~
munk-a
How would they technically accomplish this? Whenever a camcorder or phone or
webcam was used would the hash be transmitted to the manufacturer and
registered there? Would the device sign the video signature with some secret
key? (similar to that big secret number you're not allowed to show)

Assuming you got it to work sanely, how would you ensure that all
manufacturers properly locked down their devices?

Assuming there's a reasonable trust framework and you're able to enforce it...
How would you allow legitimate video editing without allowing people to edit
video to create misleading images? How would a movie cut between characters
and an exploding model in a legitimate and signed manner without allowing
someone to splice a politician's speech in with terrorist training videos?

There is a lot wrong with this "Tech can fix it" approach.
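For concreteness, the "device signs with a secret key" option might look like this toy sketch (Python stdlib, with an HMAC standing in for what real hardware would do with an asymmetric key in a secure element; the key and clip bytes are made up). It also illustrates the editing objection: any edit, legitimate or not, breaks the tag.

```python
import hashlib
import hmac

# Hypothetical per-device secret. Real hardware would hold an asymmetric
# signing key in a secure element rather than a shared symmetric secret.
DEVICE_KEY = b"burned-in-at-the-factory"

def sign_clip(video: bytes) -> str:
    """Tag the raw footage with a keyed MAC at capture time."""
    return hmac.new(DEVICE_KEY, video, hashlib.sha256).hexdigest()

def verify_clip(video: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_clip(video), tag)

clip = b"raw sensor frames..."
tag = sign_clip(clip)

assert verify_clip(clip, tag)                     # untouched footage verifies
assert not verify_clip(clip + b"new frame", tag)  # any edit breaks the tag
```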

------
creaghpatr
I think the risk is inversely proportional to how famous the person is. A
political figure or celebrity will likely have the resources and publicly
available metadata to show a video is fake beyond a reasonable doubt.

I’d be more concerned about manufactured outrage showing some random
supporter saying or doing something awful to someone on the other team,
inciting the Twitter swarms to descend; by the time they dox the individual,
it’s too late for that person to go back to their normal life.

~~~
temp-dude-87844
This has always been the case for libel and slander and other misinformation.
High-profile people have more resources to fight bad information (devoted
supporters, real life social networks, money, lawyers), and they're more
likely to have detractors who've tested their capabilities already. The risk
to them is great, but they have more tools to recover.

The ones least equipped to defend against misinformation are people who have
suddenly broken into the public sphere but lack the support systems to help
their case. Typical sorts of people in this situation are fresh politicians
(often local), viral 15-minutes-of-fame celebrities, and internet randoms who
are called out online.

------
laurex
What this article is getting at is that memory and "truth" are not really
compatible concepts, which I think most neuroscience supports. Research around
vision generally shows that even what we see, first hand (not in video) is an
assemblage or construction, not a "record." Yet humans in general tend to
believe that if they see something, it's either real or "faked" (like magic)
rather than suspect because of the natural limitations of our brain system
(which are also advantages for survival). We also have an even harder time
distinguishing 'reality' when our emotions are involved. "Fake" video does
present more of an issue, I think, than past "fake media", because we simply
have more emotional response to it than other media.

------
CM30
Honestly, I have to wonder: is this really just a return to the way things
used to be?

Think about it, we haven't had video or photo evidence for all that long
relatively speaking, and it's only over time that our ability to 'prove'
things became more absolute. First it was just speech and choosing who to
believe, then writing came into the picture (along with paintings and drawn
art), then over time voice/sound/radio recordings, photography, videos, etc.
For most of human history, we haven't had great evidence, and in many ways
were in the exact same situation we may end up in after these fake videos hit
the mainstream.

History is a long line of people attempting to 'prove' things, and fakers
gradually getting better at lying in a near endless arms race. We survived
with no evidence or entirely questionable evidence before, and we'll survive
in that situation again in future.

~~~
steego
> We survived with no evidence or entirely questionable evidence before, and
> we'll survive in that situation again in future.

Surviving isn't the issue, quality of life is. You made the point that "fakers
[have] gradually [gotten] better at lying in a near endless arms race."

While I understand that things have a tendency to stay the same, history has
also shown that trends lasting thousands of years can be punctuated in
relatively short periods. Sure, people have lied and faked for thousands of
years, but fake videos just might be the critical inflection point in our
timeline that creates a huge asymmetry of power and influence.

What sets this apart isn't the technical quality of the lie, it's the agents
that are performing the lie: Machines.

Not only are machines capable of fabricating evidence, they are incredibly
effective at measuring its efficacy. Lies can be cheaply produced, tailored
for very specific micro-demographics, and directly sent to their intended
recipients.

The early 20th century showed us how broadcast radio could be exploited for
propaganda warfare. The early 21st century could show us how history rhymes
when well-funded organizations or nation states finance disinformation
campaigns that leverage machine learning.

------
stretchwithme
Another potential use of blockchain technology. When an image is first
captured, the time and a fingerprint generated from it will be recorded in a
public blockchain that can't be tampered with. Then we can disregard images
that don't have valid dates that match when the image was supposed to have
been recorded.
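A toy sketch of that scheme (Python stdlib; the timestamps and fingerprints are made up). Each entry commits to the previous entry's hash, so backdating an earlier record breaks every link after it; a real deployment would additionally rely on the chain being publicly replicated so the tail can't be quietly rewritten.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Canonical hash of one chain entry."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, fingerprint: str, timestamp: float) -> None:
    """Record an image fingerprint, committing to the previous entry."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"ts": timestamp, "fp": fingerprint, "prev": prev})

def chain_valid(chain: list) -> bool:
    """Every entry must still commit to its predecessor."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
append_block(chain, hashlib.sha256(b"clip-1").hexdigest(), 1524200000.0)
append_block(chain, hashlib.sha256(b"clip-2").hexdigest(), 1524200060.0)
assert chain_valid(chain)

chain[0]["ts"] = 1400000000.0   # try to backdate the first clip
assert not chain_valid(chain)   # the next entry's commitment no longer matches
```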

~~~
tCfD
All this will do is encourage the proliferation of innumerable bespoke
blockchains, created by parties interested in stamping legitimacy on
illegitimate content. Each of these blockchains will be presented as
equivalent to "the blockchain" with respect to being trustworthy proofs of
original untampered content timestamps.

------
olefoo
I suspect it will be quite confusing for a while; but once people have been
cheated a few times because of faked video it will become something else. We
will all start picking up on people's kinematics and timing.

There is an opportunity for some type of chain of custody app that guarantees
the integrity of footage from camera to screen...

~~~
Analemma_
> I suspect it will be quite confusing for a while; but once people have been
> cheated a few times because of faked video it will become something else.

You're only half-right here. People will quickly wise up to fake video, but
that will just make everything worse: every video that confirms their
worldview will be seen as totally legit, every video contradicting it will be
"fake", just like what has already happened with text news. Your hypothetical
chain of integrity between camera and screen won't put a dent in that
conviction.

~~~
olefoo
> every video that confirms their worldview will be seen as totally legit,
> every video contradicting it will be "fake"

Yes, this has been true for quite some time already since even innocuous
footage can be made to lie with creative editing.

The chain of custody thing is already a problem and has been since courts
started admitting photographic evidence.

------
KaoruAoiShiho
Everyone's talking about fake news but I'm just excited for the potential of
democratizing movie making.

~~~
creaghpatr
Indie filmmakers could/would pirate actors' likenesses, the same way
electronic musicians currently make bootleg remixes.

~~~
KaoruAoiShiho
It shouldn't be any harder to create unique digital-only actors
[https://news.developer.nvidia.com/generating-photorealistic-fake-celebrities-with-artificial-intelligence/](https://news.developer.nvidia.com/generating-photorealistic-fake-celebrities-with-artificial-intelligence/)
than to pirate likenesses.

~~~
gowld
Your digital star would need to work to achieve fame comparable to a real
star. It's possible, but a difficult competition against the existing human
social systems.

~~~
romwell
Well, Hatsune Miku[1] says hello, as do all four members of Gorillaz[2].

[1][https://en.wikipedia.org/wiki/Hatsune_Miku](https://en.wikipedia.org/wiki/Hatsune_Miku)

[2][https://en.wikipedia.org/wiki/Gorillaz](https://en.wikipedia.org/wiki/Gorillaz)

------
WhompingWindows
I don't think we're underestimating it at all, but we are simply already
overwhelmed by fake text-based news on Twitter and in websites/newspapers, as
well as TV news shows spouting falsehoods. While fake video is surely going to
be coming down the pipeline and will make things worse, the current issues are
already terrible. Just looking at the 2016 US election, for instance, it was
found that multiple fake stories were able to gain a tremendous amount of
traction: whether that's the pope endorsing either candidate, health rumors
about HRC, or the swirling controversies of Russia/emails/hacking.

The solution has to be that individual citizens become educated; I don't see
any silver bullet here.

~~~
smofnoopttzzaaa
Certain types of elites will fight against education for this very reason. It
erodes their control.

------
DyslexicAtheist
maybe it's just _fakenews_ for you, but I've watched the video and decided
that Obama saying: "stay woke b __ches ", is the reality I want to fucking
live in.

~~~
DyslexicAtheist
on a serious note, it raises a lot of questions for forensics & anyone
currently relying on the assumption that CCTV footage is enough proof in
court. Considering how many IP cameras are already on Shodan, this is worrying
for many reasons:
[https://news.ycombinator.com/item?id=17525443](https://news.ycombinator.com/item?id=17525443)

------
extralego
Are there any libraries for this available yet?

What are the solutions we will use to verify authenticity?

Will live speaking events become more valuable as a result?

Many different angles of video will probably not be of much use alone, but
useful for public events. I can imagine events in which the audience is
surrounding the speaker, creating difficult-to-replicate-from-multiple-angles
background noise. But this is just delaying the inevitable.

------
threatofrain
What are people supposed to do about fake evidence when the main problem is
the technological ability to detect? This isn't exactly a legal or social
awareness problem.

If a video is circulating the web and people don't know if it's real, but it
looks real, and experts can't affirm its fakeness, then what are people
supposed to do?

------
octosphere
We’re also underestimating the mind-warping potential of _detecting_ fake
video via carefully trained algorithms.

~~~
politician
Why not both?

Generative Adversarial Networks: [http://blog.aylien.com/introduction-generative-adversarial-networks-code-tensorflow/](http://blog.aylien.com/introduction-generative-adversarial-networks-code-tensorflow/)

~~~
PeterisP
The particular videos are designed by this approach (GANs), and thus the
generator is highly adapted to generate data that _can't_ be distinguished
from a real video with these techniques.

If it's built with state-of-the-art GANs, then you won't be able to reliably
detect it with discriminators from those same GANs. If you manage to build a
_better_ automatic discriminator than the authors did, then they could
directly use it in the adversarial approach to build a better generator, and
your detection approach would again cease functioning.

Remember that in the end there's a fundamental asymmetry: in theory, there
_could_ be a perfect generator (not that we're close to perfect yet) that
generates indistinguishable samples, and thus there cannot be a perfect
discriminator.
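The folding-back step can be shown with a toy example (pure Python; the "detector" and "generator" here are simple stand-ins, not real networks, and all the numbers are made up). A published detector that scores how far a sample's statistics sit from real data becomes, verbatim, a training objective for the generator, and gradient descent drives the detector's score toward zero:

```python
import random
import statistics

random.seed(0)
z = [random.gauss(0.0, 1.0) for _ in range(512)]   # fixed latent noise

def detector_score(samples):
    """Stand-in 'deepfake detector': penalizes samples whose first two
    moments deviate from the real data (here, a standard normal)."""
    return statistics.fmean(samples) ** 2 + (statistics.pstdev(samples) - 1.0) ** 2

def gen_score(mu, sigma):
    """Detector's score on the toy generator's output, mu + sigma * z."""
    return detector_score([mu + sigma * x for x in z])

mu, sigma = 3.0, 0.2            # generator starts out trivially detectable
initial = gen_score(mu, sigma)

# Fold the published detector straight into the generator's objective:
# gradient descent (via finite differences) on the detector's score.
eps, lr = 1e-4, 0.1
for _ in range(300):
    g_mu = (gen_score(mu + eps, sigma) - gen_score(mu - eps, sigma)) / (2 * eps)
    g_sig = (gen_score(mu, sigma + eps) - gen_score(mu, sigma - eps)) / (2 * eps)
    mu, sigma = mu - lr * g_mu, sigma - lr * g_sig

final = gen_score(mu, sigma)
assert final < 1e-6 < initial   # the detector no longer separates fake from real
```

In a real GAN the detector is a network and the gradient comes from backprop, but the asymmetry is the same: publishing a better discriminator hands the forger a better loss function.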

------
jokoon
Who will the public believe if it is a high profile video? This might just end
in court...

This will induce peaks of paranoia into the minds of conspiratorial people.

The bad side effect is that now, you can call any video a deep fake, and
attack the authenticity of any video used against you.

------
api
We are entering a period where nothing not directly sampled by one's five
senses can be trusted. Any electronically recorded evidence can easily be
faked, often using off-the-shelf consumer-grade equipment and software.

~~~
zip1234
Check out the "You Are Not So Smart" podcast. The author goes through all of
the ways that our brain can be tricked/is tricked in daily life.
[https://youarenotsosmart.com/](https://youarenotsosmart.com/)

For example, in one episode
[[https://youarenotsosmart.com/2016/10/09/yanss-085-memory-illusions/](https://youarenotsosmart.com/2016/10/09/yanss-085-memory-illusions/)]:
"Julia Shaw’s research demonstrates the fact that there is no reason to
believe a memory is more accurate just because it is vivid or detailed.
Actually, that’s a potentially dangerous belief.

Shaw used techniques similar to police interrogations, and over the course of
three conversations she and her team were able to convince a group of college
students that those students had committed a felony crime.

In this episode, you’ll hear her explain how easy it is to implant the kind of
false memories that cause people just like you to believe they deserve to go
to jail for crimes that never happened and what she suggests police
departments should do to avoid such distortions of the truth."

TL/DR: What is reality? How can I even trust my senses?

~~~
api
That podcast is great. Should also be required listening for anyone who does
any kind of investment. One reason people keep falling for the same scams over
and over is that they think they're too smart to fall for a scam.

------
cpr
Pretty clear the powers that be are using the mockingbird media to step up
damage control for when certain damning videos of very powerful people start
to drop...

------
flingo
Anyone know of technology to invisibly watermark video/audio?

This would be great for enforcing copyright on sites like youtube.

~~~
criddell
The watermarking scheme would have to survive the compression that YouTube
applies to video.

------
ikeboy
We don't need fake videos when people like James O'Keefe and Sacha Baron
Cohen can achieve the same effect using real footage obtained under false
pretenses.

------
cryoshon
at the moment, there are telltale signs of deepfakes if you know what to look
for. various video editing issues, image glitches (visible in the
obama/ahmadinejad photo in the article), lighting inconsistencies, sound
oddities, etc.

frighteningly, these glitches can be hard enough to detect even when the fakes
are made with an amateur's level of attention to detail. this implies that
there is already a very widespread use of the technology by people with the
capability to edit out these glitches.

with the technology of today, i think that state actors or potentially
corporate actors could very easily generate nearly undetectable deepfakes if
they cared to do so. i expect that they already have, in fact. but probably
not where there would be a massive public eye to scrutinize an improbable or a
ridiculous forgery. that would be too risky.

instead, the state of the art of deepfakes is likely invested in making the
realm of the plausible "into reality" in a targeted way for the sake of
shifting opinion of small populations rather than 100% forming new opinions or
causing an about face of prior opinions.

consider, for instance, two allied militant groups in some warzone that the US
has an interest in.

the default condition for these populations is the fog of war, wherein
information is often hard to come by and rumor runs rampant, especially among
the combatants themselves. the US doesn't want the alliance of the groups to
persist for whatever reason. now, introduce a very convincing deepfake video
of the commander from one militant group plotting to do something that the
other group -- and perhaps some of the original group's members -- would
disagree with.

there might be immediate chaos between the groups, but probably not. the most
likely result will be that some people from each group see the deepfake and
get convinced, then perpetuate the false information to others who haven't
seen it. over time, the relations between these two groups will become
strained, and their alliance may fray or fall apart altogether.

the incentive to be able to cause these kinds of changes is undeniable.
especially when populations are compromised in their ability to fact-check
information by hard problems like insufficient internet access, deepfakes will
be especially effective.

the article worries about the total capitulation of reality at home to that of
the fabricated image. no, that is not the risk at present. we have too many
alternative channels to validate information. in fact, we are likely to
evaluate information without even intending to by discussing information
learned via one channel -- perhaps a deepfake -- with other people who may
have learned about it via a different and contradictory channel. skepticism is
becoming built-in. for people who aren't in the deep-information environment,
however, deepfakes are already powerful enough to tilt reality toward their
makers' preference...

