The Double Exploitation of Deepfake Porn (thewalrus.ca)
15 points by gk1 12 days ago | 13 comments

The summary of the article: certain people will be embarrassed when someone makes a deepfake of them, and some people will be upset because their copyrighted content is stolen. Thus the double exploitation.

The article treats these problems as if they're new and dangerous, but embarrassing people with their likeness and reproducing others' work have been around for millennia.

The author apparently missed when MEPs were contacted by Russian agents using high-quality deepfakes to pose as Navalny allies: https://www.theguardian.com/world/2021/apr/22/european-mps-t...

I think one of the main issues with deepfakes is that they allow public figures to DISMISS evidence against them as "fake news," since the general public lacks the knowledge to determine whether it's genuine or not.

It seems like the more malicious uses, those intended to hurt a target, should be categorized along the lines of libel and defamation, and those that are effectively IP theft shouldn't be allowed either. These videos effectively publish lies: that some person said or did something they did not. It may be hard to find the culprits in some cases, but there need to be repercussions for fakes that harm individuals. People producing content by themselves or in small organizations won't have the resources to launch lawsuits; only those with enough resources will be able to fight back with lawsuits and DMCA takedowns. Maybe elected representatives should push through legislation drafted to protect those with fewer resources, to at least deter individuals who might target them (I'd be surprised if no one elected in California has thought of doing this yet).

> These videos are effectively publishing lies

As the article mentioned, these videos cannot pass as genuine; they're obviously fakes. Rather than passing off a lie, this is more akin to fiction, where you're required to suspend your disbelief. At least until the technology improves.

Thought Experiment:

What would be the ethical implications of using deepfake tech to anonymize porn? I.e., altering faces, and possibly tattoos and voices, such that the person is not recognizable by friends or family.

Why stop there? It's weird. You'd think deepfake porn would very quickly evolve to eliminate the need for actors in porn entirely. That's what happens with 3D-modeled porn, right?

They do it for a bunch of reasons, mostly, I guess, to distinguish themselves from the competition: outright impossible body geometries (tentacles, various types of combined male/female bodies, robots, cyborgs, ...), otherwise difficult or impossible sex acts (a kinda cool one involved jumping out of a plane and having sex before crashing into the scenery), or things impossible with humans (like sex with various types of partially android robots, where we're not even remotely close to having control algorithms advanced enough for that).

This certainly looks like it has the potential to become more popular than conventional porn. Perhaps exploitation means humans will always be cheaper, but I don't think so.

It is hard to think of this as a 'tech' problem when it is rooted in the objectification of women and of sexuality in general. There may be a back-and-forth of tech fixes and tech exploits, but it will remain a societal issue.

And the men in porn whose only job is to stay hard and pound away aren't objectified?

Is the problem “objectification” or just people being embarrassed about naturally private things becoming public? If someone makes a deep-fake of you taking a shit, is the repercussion of that a “societal issue?”

If someone steals your car, is that a tech issue?

Deepfakes are just super high fidelity fan art.

Perhaps, but this doesn't discount the article's arguments.
