
People are going crazy over deepfakes but we've had Photoshop for a long time and the world hasn't ended. We find ways to verify the content of photos through other means, such as their provenance.
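For what it's worth, a provenance check can be as simple as comparing a file against a digest the original publisher distributes alongside it. A minimal sketch in Python, assuming the publisher makes a SHA-256 digest of the original photo available (the filename and digest below are made up for illustration):

    import hashlib

    def sha256_digest(path: str) -> str:
        # Return the hex SHA-256 digest of a file, read in chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical digest published alongside the original photo.
    PUBLISHED_DIGEST = "0" * 64

    if sha256_digest("photo.jpg") == PUBLISHED_DIGEST:
        print("File matches the published original")
    else:
        print("File differs from the published original")

Real provenance schemes (signed metadata, chain of custody) are more involved, but the idea is the same: the trust comes from outside the pixels.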


i dunno, seems pretty significant to me. It used to be that we only had to be skeptical of blurry newspaper photos, then detailed photos, then videos without people as a giveaway; now highly detailed videos and audio aren't trustable. You don't think it seems significant that it's getting to a point where only "seeing with your own eyes" is believing? All our systems for spreading information beyond our senses start to fail when we can't trust anything outside our perceived experience (or things brought to us by our trusted friends).


Blurring around the face edges and mismatched lighting still give them away. Time will tell if the machines can overcome a trained eye.
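If someone wanted to automate part of that trained eye, one crude signal is local sharpness around the face boundary, since blended regions tend to be smoother than their surroundings. A rough sketch using OpenCV (assumed installed as opencv-python; the filename and region coordinates are made up, and a real tool would locate the face automatically):

    import cv2  # assumes opencv-python is installed

    # Measure sharpness in a hand-picked region straddling the face edge.
    img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
    face_edge = img[100:220, 300:420]
    sharpness = cv2.Laplacian(face_edge, cv2.CV_64F).var()

    # Unusually low variance relative to the rest of the frame can hint
    # at blending blur; it's a heuristic, not proof.
    print(f"Laplacian variance near face edge: {sharpness:.1f}")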


I think that is a bit of selection bias. Outside of what I've seen in papers, Ctrl Click Face [0] has some of the best "in practice" deep fakes I've seen. The video in the main post is meant to be a joke. But these still aren't state of the art, and they really are just "some dude" doing this on his own, not a production studio. What I think is different here is that Photoshop got so good that your average person can create convincingly fake content. At the end of the day, deep fakes are part of Photoshop (I mean, it has neural filters now...). The question is more about ease of use: "requires a team of highly trained special effects artists" vs "some dude in his basement and a shiny computer."

And a big part is that you have prior knowledge. I'll be honest, I didn't realize it was Trump at first. Nor did a friend I sent the video to, who didn't have the prior that all the characters were fake. It took him a good minute. That's a meaningful difference.

[0] https://www.youtube.com/watch?v=H3pV-_iyT4U


Small correction, but the video in the main post was indeed created by a production studio of deep fake artists.

From the NY Times article:

> The “Sassy Justice” creators said they had spent “millions” of dollars to make the video, including the initial investments to produce the halted movie and set up the Deep Voodoo studio, though they declined to specify the exact cost. “It’s probably the single most expensive YouTube video ever made,” Parker said.


Sorry for the confusion, I was saying that Ctrl Click Face is "just some dude." Talented, but as far as I'm aware, just one person.


The perception of things being untrustable (and that it's coming down the pipeline) is what will break larger society. If people are expecting deep fakes to arrive, we'll start seeing the knock-on effects of that erosion in trust waaaaaay before even a fraction of the risks are real. (imho of course :)


Easy to get around this by shooting the deepfakes in high quality, then crushing the final footage into lower quality before sharing it.

That would make those kinds of artifacts much harder to identify.
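As a concrete illustration, a quick re-encode is all it takes. A sketch in Python that shells out to ffmpeg (assumed installed; the filenames and bitrates here are made up):

    import subprocess

    # Downscale to 480p and re-encode at a low bitrate; the compression
    # smears exactly the fine edge and lighting detail a trained eye
    # would otherwise pick up on.
    subprocess.run([
        "ffmpeg", "-i", "deepfake_master.mp4",
        "-vf", "scale=-2:480",              # drop the resolution
        "-c:v", "libx264", "-b:v", "500k",  # aggressive video compression
        "-c:a", "aac", "-b:a", "64k",       # low-quality audio to match
        "crushed_for_sharing.mp4",
    ], check=True)

Low-quality "phone footage" is common enough that the degradation itself wouldn't necessarily raise suspicion.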


Looking at the video here, the two best by far are the Michael Caine and Julie Andrews deep fakes, and in both cases the voice is doing most of the heavy lifting. Deepfake audio is somewhat scarier to me than video in terms of political/legal chaos; it would be much easier to trick someone with a "secret" audio recording than with a video recording if we someday get to the point of near-identical audio mimicry.


Audio deepfakes are getting better but they have a surprisingly long way to go still.

Here's an exploration of a deepfaked Jay-Z reading/rapping the Navy Seal copypasta: https://www.youtube.com/watch?v=UZzYoOdIXoQ


Not sure there’s any deepfakery on the Michael Caine audio. Peter Serafinowicz (the other collaborator and originator of ‘sassy Trump’) is well known for his Michael Caine impersonation.


>Not sure there’s any deepfakery on the Michael Caine audio

Most likely not, and the parent comment seems to agree with you on that. I think they were just trying to point out, in general, that the arrival of commonplace audio deepfakes might be way more disruptive than video deepfakes, despite a lot of people (myself included) having counter-intuitively thought that video deepfakes would be the more disruptive of the two.


Fair point, and I agree audio is potentially more disruptive - especially if it can get to good real-time performance.


except (in my experience as a photographer), most of the general public, or at least a significant chunk of it, not only doesn't understand Photoshop but will actively disbelieve you about the extent, purpose, and outcome of manipulations.

Also, it's one thing to say the world hasn't ended, but that downplays, at a minimum, the widespread effects that the commercial use of Photoshop has had on body and self-image, creating and interacting with arguably culture-bound psychological issues such as anorexia, bulimia, unnecessary surgery, self-harm, suicide, etc. Or, to take examples from non-Anglo cultures: eyelid removal, skin whitening, nose surgery, etc.

it's true that the world hasn't ended, but that's a thought-terminating cliché. There's a lot of evidence that it has created, and is creating, significant harm and significant effects.


It helps that a lot of the photoshops going around the internet are either super sloppy or depict things that are obviously ridiculous. I actually can't think of any time somebody altered a photo in a way that believably changed its meaning and distributed it while trying to convince people it was real. Have I missed something?


Wind the clock back to 2008, when the problem came to the attention of the wider public. There was lots of press about Iranian missile tests, which turned out to be faked.

https://boingboing.net/2008/07/10/iran-you-suck-at-pho.html

(and of course, you certainly have missed the fakes that didn't get noticed!)


There are a lot of points in there to unpack, and it's a different set of beliefs and claims from the idea that it's not doing any harm.

Focusing first on "changing the meaning in a believable way": believability is a subtlety bordering on a truism. If it's believable, we would often deem it to have "not sufficiently changed the meaning". If it "sufficiently changes the meaning", does that make it no longer believable?

Second, it's arguable that the whole point of being a good photographer/photo-editor is to make an altered image look "believable" or "impactful" without triggering the brain's uncanny-valley or rejection response. That's fundamentally why we photo-edit.

For a start, there are actually lots of explicit examples from history:

- Lincoln/Calhoun composite

- Stalin editing opponents during purges

- John Kerry + Jane Fonda in US election

- George Bush holding a book upside down while reading to a schoolchild

- etc., etc.

Those examples are just the obvious things that the average man on the street would accept as photoshopped, and the ones that I know are 'shopped. There's also a whole bunch of photography techniques doing the rounds worldwide: telephoto lenses pointed into crowds to make people look more numerous and closer together (to spark social-distancing outrage); framing and cropping, like that of Trump's inauguration, to make it look more crowded than it really was; and arguably the "Hunter Biden finger-lake tattoo child-molesting" stuff currently doing the rounds on the internet's seedier sites, though I don't know how/whether those are photoshopped. And you get the photographer's dilemma of discussing the lay person's belief that effects done "in camera" are "real" but effects done "out of camera" are "'shopped": a distinction which is often effectively meaningless.

Then we have the general "mass photoshopping" phenomenon, where practically every commercial image one sees has been explicitly edited to create some effect, giving the idea that this subverted reality is just "normal" and not in any way actively subversive: be it skin that doesn't look like actual skin; eye, hair, and teeth standards and colours that are practically biologically impossible; general removal of body hair; slimming, shaving, and exaggerating the respective female body parts; or slimming and expanding male body parts.

And practically everyone's doing it: take Kamala Harris's officially supplied photo: https://upload.wikimedia.org/wikipedia/commons/d/d9/Kamala_H...

You can zoom in on the eyebrow and hairline to see such poor use of the blur tool that they barely even bother hiding it any more.

Now, as per my previous point you can argue "oh yeah, sure all our pictures and images are messed with and touched up, but it's not harmful and that's not really changing the photo!".

To that I repeat my initial observation: no, it does indeed seem to have a harmful effect in terms of body-loathing, a drive toward self-hatred and consumption, and general reality distortion. It can be observed in the way it expresses itself differently in different societies and different ethnicities, via the peculiar culture-bound mental illnesses and aesthetic phenomena that appear as each is exposed to different photo-editing ideals and cultural mores.

And then there's the second point: these deep-fakes will actually be the next level. If we see, and know, that there are problems with our photoshopped media while the common man holds folk beliefs about the objectivity of photographic media, imagine what's possible once that disconnect begins to apply to video + sound as well. It's probably going to get worse. Much, much worse.

And remember: with doubtful photos, we have the belief in the authority of sound + video recordings to fall back on. Once we remove the authority of sound + video, we no longer have any effective authority with which to determine the factual nature of the media we're viewing. The deepfakes aren't generally quite there yet, but in 5-10 years they'll be better. And they only have to reach a baseline where a significant number of people are fooled by them, or at least where they create enough reasonable doubt to let people dismiss or accept the positions they disagree or agree with, before it becomes a real problem.


Something to consider is that we already had video when Photoshop arrived, and video could still command at least somewhat more authenticity than images. Is there another medium able to take that baton of authenticity now? I don't think so.


Verifiable content is definitely not the norm throughout history. I'd love to see an academic analysis of whether misinformation really went down by much (if at all) when photography became a thing, or when camera phones and social media came along.

Yesterday there was a fake Trump tweet with a zillion upvotes on Reddit, and that’s just text. Text is trivial to fake, so maybe the chain of trust we see in text is what we’ll see with everything else moving forward. “An anonymous source within the White House provided this footage,” being countered with, “you can’t trust anonymous sources!” It will come down to who you trust, just like it always has.


Democracy and the existence of social media are not the norm throughout history either. Voters used to depend on the news media, which, whatever its faults, has some serious institutions built in for fact-checking. Today any idiot can post something that goes viral.


Are you blind to how bad people are at verifying even the most basic things?


We as a culture are not able to verify the content of deceptively cut videos.


The problem isn't the existence of the technology, though. The accessibility and convenience of using it are the issues here.


People are making a big deal about these "auto mobiles", but we've had horses for ages.



