
Everybody Can Make DeepFakes Now [video] - sahin-boydas
https://www.youtube.com/watch?v=mUfJOQKdtAk
======
xrd
I see this kind of thing, and I think, how will a whistleblower survive the
slander?

Snowden and Assange both were subject to information warfare (lies is a better
word) that facts don't refute. As they say, "a lie is halfway around the world
before the truth has its pants on."

I bet a large number of people still remember that Clinton was involved in
pornography distributed from a pizza shop. Except it was not true at all. But
the hatred of her lingered.

What's to happen when a normal person attempts to reveal something, and then
video of them surfaces showing them saying all kinds of horrible things? The
scariest thing about this is that you could take a picture of them from their
college years and make them out to be a pedophile, and if it gets repeated
enough, it will become the truth. If I admit that I had a drinking problem
twenty years ago, it will "all add up." That kind of imagery is impossible for
our brains to fight.

~~~
catalogia
I don't really buy it, the general public is more clever than they're often
given credit for. Editing photographs became widespread due to Photoshop and
now people are correctly skeptical of photographs. It's mainstream to wonder
if outlandish or startling photographs have been manipulated.

Sure, some people will believe the bullshit, but I think _"want to believe"_
is a big part of that. The sort of people who think Hillary Clinton eats
babies believe it because they hate her, rather than hating her because they
believe it.

~~~
som33
No, the public is dumb.

For the last 20 years the entire world has allowed PC software to be stolen
and made client-server; now every piece of software is spying on us because of
mass computer illiteracy.

20 years ago, hackers and nerds on Slashdot feared Microsoft's and the
industry's move towards hardware and software DRM, back when the internet was
just reaching the masses.

The last 20 years have been the public buying client-server, DRM-infected
software en masse.

Everything that used to be a local PC app is being pushed client-server. In
such a world no privacy can exist, and the public rewarded Ultima Online,
EverQuest and World of Warcraft.

For those of you who don't know, two or more PCs in a network behave as a
single machine. That means whoever programs the machines in the network
LITERALLY OWNS THE ENTIRE NETWORK; it's the ability to rewrite the laws of
every computing device on the planet, and the speed of light is fast enough to
take over every machine on the planet.

It's no longer merely "programming apps". It's literally stealing software en
masse from the public. To program now is "issuing commands", so software can
be infected with commands that take over your PC and remove your control and
ownership. This is what Windows 10 is about.

Some of you may complain that software was always "licensed", and I'd tell you
that those laws were bribed into being by corporate lobbyists, because
capitalist governments have always been ruled by corporate elites that rubber-
stamp their own laws; if in doubt, see the last 200 years of IP law. The
public interest lost every time. That's a blowout: the public interest never
won once.

[https://en.wikipedia.org/wiki/Copyright_Term_Extension_Act#/...](https://en.wikipedia.org/wiki/Copyright_Term_Extension_Act#/media/File:Tom_Bell's_graph_showing_extension_of_U.S._copyright_term_over_time.svg)

The fact that so many here are gung-ho about "software as a service" shows
they're not grasping the huge political implications. You can't possibly audit
all the software apps on Google Play, mobile, Steam, Uplay, Origin, etc. So
these companies can do whatever they want, and since our minds did not evolve
to respond to these threats, much of the public just shrugs: the human mind
did not evolve in a technocratic society, so it won't perceive or act upon the
danger.

So no: the push towards locked-down filesystems you can't access in Windows
10, encrypted binaries and VMs, and UWP is the big push to put bombs in honest
binary blobs and make software unpreservable.

All these incentives exist because they know the public is dumb. The Fortnites
of the world and TF2's hat economy told Valve and the big tech companies to
keep pushing, because the public will eat endless amounts of shit; most of
them don't care.

Imagine it's the 90's and you're buying your first PC, and Microsoft sells you
Windows, except it requires an internet connection (aka the hardware dongle)
and parts of Windows' files and functions are held hostage on remote servers
in Microsoft's offices. Most hackers of the day would not have bought those
PCs.

That's what "software as a service" is for many of the dolts on Hacker News:
it's hardware-dongle-enabled software by way of the internet, splitting what
used to be local applications into two pieces.

It's the end of local applications on our PCs and mobile devices, and most of
you don't seem to be alarmed. It's the end of privacy, permanently.

Because we didn't get any property rights to own our software outright, with
source code, Steam, World of Warcraft, and all the new PC games like Quake
Champions and Overwatch exist.

The end game from the early 90's was this dystopian state where all PCs on the
planet are turned into dumb clients and the entire society moves to a
mainframe model where no one owns their devices.

Without software ownership our devices are useless, and given the criminal way
in which IP law was written for software, privacy is not coming back.

So to say the public is bright, when it handed everything the nerds feared to
the corporate world on a platter, is bullshit.

The entire Silicon Valley industry is having a party over how dumb the public
is. The "client to cloud" revolution is LITERALLY stealing local apps and
trapping them on company servers so they can finally turn software into
property and kill the "local file" loophole on PCs, to "kill piracy" by way of
software as a service. The public has no idea what NTFS is.

Microsoft is now testing remote control of your file system and PC via Game
Pass, where companies can set permissions on files on your PC remotely.

This is what the nerds feared in the 90's on Slashdot, and it has come true.
The fact that Hacker News is basically pro-"cloud" is disturbing enough on its
own.

Here's the industry celebrating the death of local user control of PC's.

[https://tifca.com/wp-content/uploads/2019/09/ClienttoCloud_V...](https://tifca.com/wp-content/uploads/2019/09/ClienttoCloud_Vision_V2.pdf)

~~~
f1refly
I have nothing to contribute to that, I just want to thank you for the effort
you put into this writeup. I'm not sure it's worth it to keep up the fight,
but I sure as hell will, and I hope others do so as well.

------
bawana
Will this be the 'spam' that destroys our civilized world? When we cannot
believe our own senses, then our only refuge will be a construct that WE made
so we know we can trust it? Will we voluntarily enter the matrix to escape the
chaos that we have corrupted our reality into?

~~~
andarleen
Scary thought - but a practical question is how will deep fakes influence
video as courtroom evidence? Some countries have laws that only approved
recording devices are acceptable for recording such evidence, but is this the
end of phone recorded evidence?

~~~
lucasmullens
Did photo evidence end because of Photoshop? Honestly asking, I don't know.

DeepFakes just seem to be adding video to the list of fakeable things, which
already included photos and audio.

~~~
throwaway17_17
There was a marked decline in attorneys’ ability to utilize private detectives
to spy on opposing parties or to present still shots as alibis, starting (at
least in my area of the US) somewhere around 2005. I’d say it took a while for
the distrust of stills to set in, due to the possibility of generic
‘photoshopping’, but actual and test juries are less likely to definitively
base decisions on stills if the opposing lawyer can in any conceivably
reasonable way claim they’re edited.

As a side note, I am less concerned with the use of deep fakes to frame
(civilly or criminally) a person; it is more the potential for an attorney to
put big holes in a jury’s trust of video evidence by showing a faked image as
counterproof and then flat-out telling jurors that one is fake and therefore
the other side can’t prove its case.

Caveat -> this doesn’t apply to police generated evidence, because a majority
of jurors, to a statistical certainty barring exceptional circumstances,
ALWAYS believe cops.

------
geophertz
In a certain way, I find the fact that everybody can make deepfakes positive
for society.

Because we will finally stop trusting videos and images as reliable proof,
when they clearly aren't all the time.

~~~
krapp
No, we won't.

People already know videos and images can be faked, but they still trust
whatever they _want_ to be true as a post-hoc validation of their existing
prejudice, not as the result of a rational attempt to judge evidence on its
own merits.

Just look on Youtube for various "paranormal" videos, things like "real proof"
of reptilians, ghosts, aliens, etc. Most of those videos are laughably fake
looking, but people who already believe in these things take them as gospel,
anyway.

Politically speaking, look at Pizzagate as another example. Whether or not you
believe the "evidence" tends to depend on what you already believe about
Democrats/progressives/leftists/the Clintons.

------
pikseladam
Think about that. You open a page on the web and you SEE YOUR FACE telling you
that you liked this product and how it changed your life! Scary times ahead.

~~~
anonytrary
No. You will just see a person who does not exist. People that do not exist do
not demand pay.

------
delusional
I wonder when the first ad-network based solely on getting celebrity deepfakes
to endorse products for cheap will pop up.

Is it even illegal? Even if it is, could we do anything if they operated out
of Russia or the Middle East?

~~~
krapp
That's already basically very much a thing[0,1].

[0][https://www.forbes.com/sites/steveolenski/2017/06/12/the-bra...](https://www.forbes.com/sites/steveolenski/2017/06/12/the-branding-of-dead-celebrities/)

[1][https://www.wired.com/story/messy-legal-fight-to-bring-celeb...](https://www.wired.com/story/messy-legal-fight-to-bring-celebrities-back-from-the-dead/)

------
symplee
I wonder if we're getting closer to our brain's "algorithm", which takes
reference images from the real world and generates newly imagined scenes when
we're dreaming.

------
zxcvbn4038
How long until someone uses this to make a new Bogart film? Or a new Home
Alone movie with a young Macaulay Culkin? I see a lot of great possibilities
there.

~~~
ludwigschubert
Have you seen the YouTube channel Ctrl Shift Face?

[https://m.youtube.com/channel/UCKpH0CKltc73e4wh0_pgL3g/video...](https://m.youtube.com/channel/UCKpH0CKltc73e4wh0_pgL3g/videos)

Here’s a pretty well done one in which Bill Hader does impressions, during
which his face is replaced with that of the person he’s doing an impression
of: [https://m.youtube.com/watch?v=kjI-JaRWG7s](https://m.youtube.com/watch?v=kjI-JaRWG7s)

------
danbolt
I’m not ready for advertising on the internet to be mining social media for
producing targeted deepfakes.

------
legionof7
I wonder if it'll ever become a good idea to preemptively deep fake yourself
saying outrageous things and release them.

Saw something like this in a Neal Stephenson book I believe.

~~~
mirimir
Yes, that was in _Fall_.

------
mirimir
The faces are too wooden.

------
leoh
We need software to detect deep fakes.

~~~
kfuwbi2640
I’ve seen a number of attempts to identify deepfakes and other forms of
manipulated images using AI. This seems like a fool’s errand, since it becomes
a never-ending adversarial AI arms race. Instead, here’s a system I haven’t
seen proposed that I think could work well: camera and phone manufacturers
could have their devices cryptographically sign each photo or video taken. And
that’s it. From that starting place, you can build a system on top of it to
verify that the image on the site you’re reading is authentic. What am I
missing that makes this an invalid approach? I do understand that this would
require manufacturers to implement it, but it seems achievable to get them on
board. I even think if you get one company like Apple to do this, it’s enough
traction for the rest of the industry to have to follow suit.
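
A minimal sketch of the sign-at-capture idea, with hypothetical names
throughout. A real deployment would use an asymmetric signature (e.g.
Ed25519) with the private key sealed in the device's secure element, so
anyone holding the manufacturer's public key could verify; an HMAC with a
device key stands in here only to keep the sketch dependency-free.

```python
import hashlib
import hmac
import os

# Stand-in for a per-device signing key. In a real scheme this would be an
# asymmetric private key fused into the device's secure element.
DEVICE_KEY = os.urandom(32)

def sign_capture(image_bytes: bytes) -> bytes:
    """Device firmware signs a digest of the image at capture time."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """A viewer re-derives the digest and checks the signature."""
    digest = hashlib.sha256(image_bytes).digest()
    expected = hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

photo = b"raw sensor data"
sig = sign_capture(photo)
assert verify_capture(photo, sig)             # untouched capture verifies
assert not verify_capture(photo + b"x", sig)  # any edit breaks the signature
```

A site displaying the image could run the verification step and badge it as
camera-signed; a deepfake, never having passed through a real sensor, would
have no valid signature to show.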

~~~
zrm
Such a system would require _all_ devices to be secure against key extraction.
Otherwise the attacker need only choose the most vulnerable device, extract a
signing key from it and sign their deepfakes with it.

It would also allow any device manufacturer to sign anything they like, as
well as anyone who can coerce a device manufacturer to do so.

~~~
kfuwbi2640
Apologies for a late response here (by HN standards, where conversations last
only a number of hours). I agree that an attacker could compromise weaker
devices and sign their deep fakes with them. But then hopefully those keys or
that manufacturer would be blacklisted. In my mind, a company like Huawei
could implement this, but as a consumer of media I wouldn’t necessarily trust
photos from their devices. But photos signed by an iPhone, where Apple has a
better privacy record, I could trust more.

Thanks for replying though, this does help me understand the challenges in a
system like this.

~~~
zrm
It's not really a matter of privacy record. In general manufacturers don't do
it on purpose.

For example, it was discovered that it's possible to extract keys from Intel
SGX enclaves using certain speculative execution vulnerabilities. Intel SGX
predates Spectre; speculative execution wasn't a _category_ of vulnerability
they knew existed when they were designing it.

Vulnerabilities are regularly discovered in almost everything, iPhones
included. Diligent vendors are quick to patch them, but an attacker only needs
to wait until the next vulnerability is discovered and then extract the
signing keys from a device that hasn't been patched yet.

You also have no way of knowing which keys they are -- if a million devices of
a particular model have a known vulnerability then any attacker could extract
the keys from any of them, and even blacklisting all of them (which would tend
to dissatisfy their innocent owners) still wouldn't save you from an attacker
using an unpublished vulnerability against a device you don't even know is
vulnerable.

To put this another way, this is basically the same class of technology that
Hollywood uses for DRM. Now, how many Hollywood movies can you say have not
been pirated by anyone?

------
badrabbit
Crypto to the rescue! I predict deepfakes won't cause as many issues as people
think.

~~~
mschuetz
How would crypto solve deep fakes? You're generating new fake content; there's
nothing to encrypt here.

~~~
badrabbit
You're looking at the problem the wrong direction ;)

~~~
mschuetz
Point me in the right direction.

~~~
parksy
Not that guy, but at a basic level one possible approach is that everyone will
have a digital signature, and everything posted can be verified against that
signature. A faker could create new content, but they won't be able to sign it
correctly.

The tools for digital signing already exist and are used in niche industries,
but they're not really something everyday people are concerned about. It's
more a social awareness problem, and this will come with better tools,
integrated with the apps and channels ordinary people use daily, so that it
happens in the background.

In addition to digital signing, public distributed ledgers can establish
provenance or a chain of ownership for signed content, so digital content that
is altered and redistributed can be sourced back to its origin more
accurately.

Ordinary people with the right tools built into their devices will be able to
see very easily "Is it in the chain?" based on a simple enquiry to the ledger,
and if the answer is no, then it's untrustworthy.

If the answer is yes, then they will ask "Does it trace to the origin?", and
if it does not, or the origin isn't signed by the alleged owner, then it is
untrustworthy.

That's one possible way crypto can help defeat disinformation / fake news
content. The underlying techniques exist but there is a lot of work to do to
bind it to everyday use.
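
The chain-of-provenance idea above can be sketched as a hash-linked ledger
(hypothetical structure; signatures are omitted so only the linking is
shown). Each derived version of a piece of content commits to the hash of the
entry it was made from, so a verifier can walk the links back to a known
origin:

```python
import hashlib
import json

def record(content, parent_hash):
    """Create a ledger entry; parent_hash is None for the signed origin."""
    entry = {
        "content_hash": hashlib.sha256(content).hexdigest(),
        "parent": parent_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

def traces_to_origin(entry, ledger):
    """Walk parent links; content with no path to an origin is untrusted."""
    while entry["parent"] is not None:
        entry = ledger.get(entry["parent"])
        if entry is None:
            return False
    return True

origin = record(b"original video", None)
edit1 = record(b"re-encoded copy", origin["entry_hash"])
ledger = {e["entry_hash"]: e for e in (origin, edit1)}

assert traces_to_origin(edit1, ledger)   # re-encode traces back to origin
orphan = record(b"deepfake", "unknown-parent-hash")
assert not traces_to_origin(orphan, ledger)  # no path to any origin
```

The "Is it in the chain?" query from above is then just a lookup of the
content's hash in the ledger followed by this walk back to the origin entry.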

~~~
jdsalaro
> Ordinary people with the right tools built into their devices will be able
> to see very easily "Is it in the chain?" based on a simple enquiry to the
> ledger, and if the answer is no, then it's untrustworthy.

Not that I disagree, but the first thing that came to mind when I read this
part of your comment was that this approach reinforces the monopoly of
governments and media companies on the flow of information. On the one hand
it's great that we could protect ourselves this way from unverified, "not-in-
the-chain" messages, but on the other hand, being a contrarian, producing and
coming into contact with information outside of the chain (not necessarily
fake videos, but other types of information) is crucial to social evolution
and civil discourse.

The "public" and "distributed" would be major priorities of mine in the
scenario you describe.

