
‘Deepfakes’ Trigger a Race to Fight Manipulated Photos and Videos - jkuria
https://www.wsj.com/articles/deepfakes-trigger-a-race-to-fight-manipulated-photos-and-videos-11564225200?mod=rsswn
======
cjlars
Seems to me that Photoshop fakes are well handled by a sort of social
whitelist. It doesn't matter what's on the picture if you can't show us who
took it.

The Weekly World News served (serves?) up that sort of nonsense for years, but
for the most part the media didn't go around falling for hoaxes, whatever the
source. Probably a few deepfakes will spread like wildfire on social media and
then the public will start wising up to the technology.

~~~
tasty_freeze
How small of a percentage of people need to fall for it to tip an election?
How many people need to doubt real evidence as being possibly manipulated to
change a jury's conviction?

As it is, there are apparently more than a million people who actually believe
QAnon is real, despite the fact there is nobody willing to put their name to
the source.

~~~
yhamv
Well, QAnon is real, no? It's not a spirit writing those posts.

~~~
rtkwe
Sure, if we reduce 'real' to just mean it wasn't created ex nihilo out of the
aether, then everything is 'real' under that definition... ultimately
everything is still attributable to a human decision to kick off a process.

Is QAnon actually a highly placed government official (or someone with access
to information about high-level functions and plans)? Given that all their
big predictions so far (arrests etc.) have been false, I think we can pretty
easily conclude no.

------
social_quotient
Here is an example of what keeps me up at night:
[https://youtu.be/_mkRAArj-x0](https://youtu.be/_mkRAArj-x0)

Adding or removing cancer.

Image manipulation of medical diagnostics/data. Imagine a hospital hack where
50% of cancer patients' images are left clear and the unsuspecting other 50%
have fake cancer injected in. The ransomware on this stuff is going to be
absolutely insane to resolve and/or insure against.

~~~
not_a_cop75
It's not as if political dissidents haven't been executed perfectly well
without this tool, but what you're stating is that now it's going to get
worse. I'm of the pretty strongly held belief that everyone the top echelons
really want dead is already dead. People who have the money to reliably make
this happen need no low-tech solutions, although possibly you're worried that
now anyone can do it?

It seems that such a move would be used to upset a fight for power within a
government. If you have a government where this is already an issue, then the
problem is more likely the government itself than the action created by the
hack. Furthermore, such a large rise in cancer cases would be heavily
researched, causation would be discovered, and hospital care facilities would
be prompted to start issuing hash codes with images to prevent manipulation,
if that is not already being done.
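
The hash-code idea above can be sketched in a few lines. This is a toy, with a hypothetical facility key standing in for proper key management; a plain hash stored next to the image would not help, since an attacker who can alter the image can re-hash it, so a keyed MAC (or a real signature) is the minimum:

```python
import hashlib
import hmac

# Hypothetical facility key: in practice this would live in an HSM or
# signing service, not alongside the images an attacker could modify.
FACILITY_KEY = b"example-key-not-for-production"

def tag_image(image_bytes: bytes) -> str:
    """Compute a keyed MAC over the raw scan data at acquisition time."""
    return hmac.new(FACILITY_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, stored_tag: str) -> bool:
    """Re-compute the MAC and compare in constant time."""
    return hmac.compare_digest(tag_image(image_bytes), stored_tag)

scan = b"...raw scan pixel data..."
tag = tag_image(scan)
assert verify_image(scan, tag)
assert not verify_image(scan + b"injected lesion", tag)
```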

------
wruza
I actually like this tech. It opens infinite possibilities for using a
real(-istic) image in a made up situation (think games and entertainment). Get
ready for a new wave of “dynamic models” soon, who will not only post images
and videos with new clothes and makeup on their whatever-gram, but also
perform movement scenes that could be used to model them in a virtual way, for
money of course. People will like it: if it is not them making an easy
living, at least they can see a good show that doesn't depend on the
performer's preferences, or make one themselves by motion transfer. It is
also time to make a new fortune out of it.

As for politics and fact-checking, personally I just don’t care. If you do,
then sign your damn video with a public key already and check if a video
matches it. That would require a new sort of signature that is not lost on
resize/rebalance/etc, but nothing unreal. As a bonus, video players could show
the source’s certificate fields right below the video.

~~~
killjoywashere
> sign your damn video with a public key already

Surely the imaging companies are on to this by now, right? They should be
signing the raw data as it comes off the sensor. Tesla does this for the input
and the output, on-chip if I recall.

~~~
Nursie
> Surely the imaging companies are on to this by now, right? They should be
> signing the raw data as it comes off the sensor. Tesla does this for the
> input and the output, on-chip if I recall.

So all post-production work ceases? Or do we have a situation where the
original is published alongside the media-edition where they've applied a few
filters and adjusted the colour space?

I'm not saying it's a bad idea, just interested in how it might work.

~~~
killjoywashere
Right, you string the signatures together: "This image was generated by Tom
Fipples on a Nikon D610, working for the Washington Post, and edited by Jane
Wizzlepans, working for AP, with X copy of Photoshop." Certs include Nikon,
Washington Post, and Adobe. Signatures include Tom and Jane. And the whole
chain has to be in sequence, just like the intermediate certs in all the
other parts of your browser.
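
A rough sketch of the sequencing idea, with hypothetical actors; a real scheme would use actual certificates and asymmetric signatures, but bare hashes are enough to show how each link is bound to everything before it:

```python
import hashlib

def link(prev_digest: str, actor: str, action: str) -> dict:
    """Append one provenance entry, chained to the previous digest."""
    digest = hashlib.sha256(
        f"{prev_digest}|{actor}|{action}".encode()).hexdigest()
    return {"actor": actor, "action": action, "digest": digest}

def verify_chain(chain, root_digest: str) -> bool:
    """Walk the chain from the sensor's root digest; any edit breaks it."""
    prev = root_digest
    for entry in chain:
        expected = hashlib.sha256(
            f"{prev}|{entry['actor']}|{entry['action']}".encode()).hexdigest()
        if expected != entry["digest"]:
            return False
        prev = entry["digest"]
    return True

root = hashlib.sha256(b"raw sensor data").hexdigest()
chain = [link(root, "Nikon D610", "capture")]
chain.append(link(chain[-1]["digest"], "Photoshop", "crop"))
assert verify_chain(chain, root)

chain[0]["action"] = "forged"      # tamper with history
assert not verify_chain(chain, root)
```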

~~~
Nursie
OK, so we're effectively (to my mind) looking at something like git with
signing on every commit, and an image file being its own self-contained repo,
so you can play back through the versions to the 'root' that comes from a
camera.

Presumably this format is going to be of interest to journalists and courts,
at the very least, though courts are likely going to be most interested in
the simplest case - the provable original.

Interesting idea :)

------
nova22033
You don't need deepfakes to fool boomers on facebook. You could create an
image with rolling fields, an American flag in the background, and the caption
"Share if you think veterans shouldn't be euthanized when they catch a
cold"...

Boomers would smash that share button so hard CA would experience another
earthquake..

~~~
keketi
Yea, no kidding. Here's a fine example:
[https://twitter.com/mumonamission5](https://twitter.com/mumonamission5)

Based on just a few tweets she appears to believe ancient aliens existed,
satan worshippers hide their symbols everywhere, said satan worshippers occupy
most positions of power and sacrifice children, "everything is energy", Nikola
Tesla invented free limitless energy, the elites consume adrenochrome
extracted from tortured children, all electricity comes from tall structures
collecting lightning strikes, ancient cultures were able to utilize said
"atmospheric energy" and humans are sprayed with aerosols released from
aeroplanes.

That was just from the tweets of the past week. It's truly mind boggling to
gaze into such a fractal of conspiracy theories.

~~~
rasz
To be fair most of the things you listed can be "learned" on Discovery History
Channel.

------
raxxorrax
Deep fakes can easily be caught if you have the original material to compare
them to. Finding that original automatically shouldn't be that hard if it is
available.
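
Finding the original automatically could amount to a nearest-neighbor lookup over fingerprints of known material. A toy sketch, with hypothetical 8-bit perceptual hashes standing in for real ones:

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def closest_original(suspect_fp: int, index: dict):
    """Return (name, distance) of the best match in the index."""
    return min(((name, hamming(suspect_fp, fp)) for name, fp in index.items()),
               key=lambda pair: pair[1])

# Hypothetical index of fingerprints of known original footage.
index = {"speech_2019.mp4": 0b1011_0110, "interview.mp4": 0b0100_1001}

name, dist = closest_original(0b1011_0111, index)
assert name == "speech_2019.mp4" and dist == 1  # near-duplicate found
```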

In a way I already like deep fakes since they force people not to take
anything they see on the net too seriously.

~~~
pjc50
After a weekend of two mass shootings as a result of Internet radicalization,
it's clear that it's too late to ask people not to take the internet
seriously.

~~~
raxxorrax
True, but I believe the recent surge in shootings is a direct response to
censorship ambitions and the media coverage these offenders expect to receive.

------
mises
To be honest, this is just part of a larger problem. Even if we have "trusted"
media companies, that will just increase centralization in the space by making
it harder for indie outlets (who, in the age of the internet, can break
significant stories well before larger media corps). Much of the material used
by such corps these days comes from videos posted on the internet, anyway.
Grainy smart-phone video posted to the internet has been behind a large number
of stories over the past few years.

Add to this that it doesn't solve the problem of deceptive editing, a la
Covington kid. The real problem here is _still_ that people are by and large
sheep who care not to read past a headline. If this sounds rude, perhaps it
stems from my frustration with the issue. See case number one, mass shootings.
As tragic as it is, you remain more likely to be struck and killed by
lightning than shot in such an event. Most deaths involving guns are suicides.
We still get several days of air-time for each, all the while we have real
problems: homeless populations, poor education, starving children in Africa.
Focus on those first, and go in order of what affects your nation most.

------
EGreg
Think about it for a second. Deepfake generation and detection are just like
a two-player game, and those have been solved - Go, Chess, and so on. MCTS is
a pretty good approach, but there can be others. That stuff iterates until
the point where the detector can no longer be sure whether something is or
isn't a deepfake.

By that point, just like with Chess, humans are way behind. They can't tell
anything for sure.

So no matter WHAT we do, we are going to be in a world where we can't trust
video or audio evidence of anything. And when we can achieve deepfakes in
realtime, you won't be able to trust that the person you're conversing with is
really your friend.

At that point, people will voluntarily create cryptographic timestamps with
trusted equipment, or combinations of several trusted devices (e.g. a phone +
a beacon + a wifi hotspot), and then share that (as a zero-knowledge proof)
with whoever needs to know.
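
The timestamp idea resembles existing trusted-timestamping schemes (RFC 3161 style): hash the media and have an authority sign the hash together with a time. A toy sketch in which an HMAC key stands in for the authority's private key:

```python
import hashlib
import hmac
import json

# Stand-in for the timestamp authority's private key.
TSA_KEY = b"toy-authority-key"

def timestamp_token(media: bytes) -> dict:
    """The authority vouches that this exact content existed at this time."""
    token = {"digest": hashlib.sha256(media).hexdigest(),
             "time": 1564225200}  # fixed epoch time for the sketch
    payload = json.dumps(token, sort_keys=True).encode()
    token["sig"] = hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest()
    return token

def verify_token(media: bytes, token: dict) -> bool:
    """Check both the authority's signature and the media digest."""
    body = {"digest": token["digest"], "time": token["time"]}
    payload = json.dumps(body, sort_keys=True).encode()
    ok_sig = hmac.compare_digest(
        hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest(), token["sig"])
    return ok_sig and token["digest"] == hashlib.sha256(media).hexdigest()

clip = b"raw video bytes"
tok = timestamp_token(clip)
assert verify_token(clip, tok)
assert not verify_token(clip + b"edited", tok)
```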

But if bots ever reach conversational level, and we have realtime deepfakes,
it's game over essentially for trusting any interaction online with anyone.

------
marks_list
When global pedophile networks and human trafficking rings start getting
uncovered, introducing the topic of deep fakes into the public discourse
becomes relevant: it discredits compromising footage of high-profile
individuals who participated in them and who are actual three-letter-agency
assets due to blackmail.

------
program_whiz
This is interesting because it speaks to one of the existentially troubling
aspects of being human. That is the fact that there is no evidence that we can
really trust for any external reason. This goes to the point that Descartes
made many years ago:

1. Scientific evidence -- that's just words some other ape like yourself
wrote down that you now believe.

2. Personal experience -- those are just memories of experiences you think
you had, but you can't really be sure; you just ask your brain and it
responds, and once that stops working you never "had the experience".

3. Photographs and video -- well, those are just captured images, and any
image or data can be changed. There is no "sacred data" that cannot be
manipulated.

4. Trusted institutions -- just more flawed apes that you hope are more
trustworthy than those they are instructing, controlling, and/or in charge
of.

5. Rules/laws -- pieces of writing on paper that we all choose to adhere to.
History shows that people can decide to stop following them at any time; they
have no power in and of themselves. Only people have power, and usually
because they are willing to wield violence (e.g. police, military,
resistance, militia).

In the end all you have is the certainty that you are indeed experiencing
something, but living with the ego-shattering realization that nothing else is
certain or under your control is a very large red pill to swallow.

------
SzamarCsacsi
I'm terrified of deepfakes because they will at some point completely devalue
photographic evidence. On one hand, it will make it easier to manufacture
evidence and frame people. On the other hand, it will make it easier to get
away with stuff despite real photographic evidence by simply blaming it on
deepfakes. This is already happening with the whole fake news phenomenon.
People's perception of truth will be even more skewed.

~~~
Krasnol
> On one hand, it will make it easier to manufacture evidence and frame
> people. On the other hand, it will make it easier to get away with stuff
> despite real photographic evidence by simply blaming it on deepfakes.

So nothing changes and you have to get additional sources. Just like you
should already.

People's perception of truth is already skewed. Imagine presenting a
photoshopped picture to someone 10, 20, or 30 years ago.

There is no way out of this as the tech is out there.

------
sbhn
Fear media triggers a race to fight manipulated photos and videos. “OMG we’re
under ATTACK”

~~~
futureastronaut
I've seen probably hundreds of moral panic stories about DEEP FAKES (which
sound so scary!), and have yet to see a malicious use. It's more of an ML
party trick than anything to fear, but the "I fucking love science" crowd eats
it up.

~~~
xienze
> I've seen probably hundreds of moral panic stories about DEEP FAKES (which
> sound so scary!), and have yet to see a malicious use.

Because this is a long-term operation designed to make you question what your
own eyes see. There will be a day when an extremely damaging video of a
politician/celebrity comes out and our trusted media sources will say "we've
determined this was merely a deep fake" and thereby dismiss it out of hand.

~~~
nomadluap
I have a feeling that the next 15 months will be an interesting time for this.

------
devoply
no they don't. deepfakes still look fake; until that changes, deepfakes are a
tempest in a teapot.

~~~
kevingadd
So are you saying they won't ever look realistic "enough"? Are you saying that
even the average grandma or child has enough media and computer literacy to
recognize a deepfake? Even if the fake is of someone they've never met? Is it
harmless for a child or mother to see a deepfake of a family member doing
something horrible?

Even if it's a tempest in a teapot right now (I think this is a questionable
assertion in 2019) it seems pretty safe to bet that it will be a big problem
within 5 years. Maybe less.

~~~
pilif
Also it gets worse because it also undermines the credibility of factually
correct news. Once a fake isn’t easily distinguishable from reality any more,
even media literate people will start to question or even dismiss any piece of
information the moment it feels at all contrary to their thinking.

~~~
devoply
Very good deep fake, still pretty damn fake to me.

[https://www.youtube.com/watch?v=3dBiNGufIJw](https://www.youtube.com/watch?v=3dBiNGufIJw)

~~~
pixl97
The funny/sad thing here is that you are setting yourself up for failure. In
the future you will end up considering a faked image real, because you will
reason that there is no way you wouldn't recognize a faked image/video.

