From the main page: “Communication groups [...] • mrdeepfakes - the biggest NSFW English deepfake community”, which strains my assumption of good faith debate regarding their ethics.
If someone makes a deepfake on their own computer, watches it, and doesn't share it with anybody, I don't see how that's markedly different (morally) from just imagining the same thing. Some people have a very strong visual imagination and others don't have it at all, it's only fair if they can use a technological substitute.
Also some entertainment works by artificially instilling a desire which cannot be fulfilled. If people can use deepfakes and masturbation to defuse that desire it might be a moral positive for them.
I think it's a fundamentally different thing, morally speaking.
Imagination is fleeting, ephemeral, jumbled, and usually linked to a specific state of mind that passes once the person has either grown bored of the imagination or achieved whatever satisfaction they wanted from it.
A deep fake image or video is persistent on your device and may leak some day, but worse is that it feels real in a way mere imagination never can, and it's something a person could go back to over and over, feeding into and enhancing their obsession with the non-consenting person.
I think it's a bad thing in ways that imagination and porn are not.
What if I photoshop my crush's face onto a naked body? Wouldn't that be similar to having a personal deepfake?
I agree with the person you're replying to: you can't stop people doing these things in the privacy of their own home, and these deepfakes are functionally similar to other photo / video editing techniques.
_Distributing_ those photoshops / deepfakes is another thing entirely though, and one I am fully against.
My childhood was typical. Summers in Rangoon, luge lessons. In the spring we'd make meat helmets.
Of course it took years for me to perfect my artistic skill in the area of erotic fingernail sculptures and by that time anyone could afford a Photoshop license.
The question wasn't whether people can be stopped from doing it, because for the most part they can't be, but whether it's moral. Or at least different in a moral sense from merely imagining someone else doing those actions.
And yes, photoshopping your crush's face onto a naked body is just as unethical and immoral, even if done in the privacy of your own home, for the same reasons I mentioned above.
Compared to deep fakes, however, it demands much higher effort for lower reward and less realism, which acts as its own inhibitor. Deep fake generators are becoming trivial to use, and not just for one or two pictures but for as many videos as you'd want. That's going to result in a very different driver of obsession.
> photoshopping your crush's face onto a naked body is just as unethical and immoral, even if done in the privacy of your own home, for the same reasons I mentioned above.
The reasons you mentioned above were "it feels more real than imagination" and "a person could go back to it over and over, enhancing their obsession with the non-consenting person". These are not moral bads that require a "no lusting after people, even in your heart" style rule. Being obsessed with people starts in high school and can persist for decades. They never find out, and it never hurts them.
I'm sure many people would like to take away others' ability to picture them naked, but fortunately they didn't have the ability to enforce that by starting a moral crusade against a Github repo. It's one of life's innocent little pleasures.
I never said this was about a "no lusting after people, even in your heart" rule, nor do I believe it should be. If someone lusts after someone else in their head and never does anything to harm the other person there's obviously nothing wrong with that.
It's immoral because it's creating something persistent that could harm someone without their consent, amongst other things.
I also question your assertion that obsessions of this type are always harmless. It's obvious that many are not, escalating into outright stalking, confrontations, or worse, and anything that makes that sort of obsession more likely, or helps to enhance it where it exists, increases the chances of that happening.
Morality evolves with technology. At some point in the future, even killing someone might not be very immoral, if technology reaches the point where we can back people up and restore them from backup.
Maybe, but what’s the relevance? We can’t predict that, nor can we adjust today’s morality to match what technology might enable in future.
Perhaps murder will one day be not very immoral, but that doesn’t mean we should treat it any differently today.
Right now, creating fake naked pictures and videos of real and non-consenting people is immoral. In some jurisdictions it’s illegal. It’s also probably harmful, by taking obsessions further than mere imagination would allow.
Side note about that: people doing amateur porn where their face doesn't show at all still get recognized because of particular marks on their skin. So this might help in that very specific case, but maybe not as much as one would think.
TBH I got curious. Clicked on one celeb, and, oh man, I've seen '90s Photoshop jobs that cut and pasted a face onto a different body better than those "deepfakes"...
“NSFW deepfakes” literally means “porn with real people’s faces inserted.”
Maybe that’s an art form, but it’s also clearly potentially harmful in the same way as revenge porn, teenage nude selfies, and other cases where regular people and porn intersect.
This sounds like something literally 90% of humanity does in their imagination at some point in their lives. It's inevitable that someone is going to do it in a video when the tech becomes good enough.
I'd say it's not the thing itself that's a problem, it's publishing whatever you create as if it's real. There are many things in life that are potentially harmful, but are part of everyday life since nobody actually harms anyone with them.
You do see the difference between a thought, shared with no one and visible to no one, and an image on a computer, which can be uploaded to the internet, right? This isn't about stopping sexual desire, it's about not distributing images of someone's naked body without their permission. That it's not a guaranteed accurate representation is sort of beside the point. It's good enough if you've never seen them naked before. Some people have gone on record saying they don't like it when it happens to them. Why, do you think?
> You do see the difference between a thought, shared with no one and visible to no one, and an image on a computer, which can be uploaded to the internet, right?
Why do you assume it’s shared with no-one? The representations change, but I can guarantee you that Bianna the Beautiful had several suggestive clay figures created of her without her permission.
Not to say that that was necessarily great; Bianna would probably have been upset too. But trying to fight the method of representation instead of the sharing seems like a fool's errand to me.
99% agree. The 1% is for the potential for deepfakes to solve blackmail. If producing deepfakes becomes trivial, any blackmail threat can be dismissed with “Whatever, I’ll just say it’s a deepfake”.
You indeed can. Any of your coworkers can report you to HR right now for allegedly saying to them in private that you find underaged boys attractive. Wouldn't that at least be quite stressful to you?
Personally, I've been on the fence about doing sex work. On the one hand, I'm nervous about people identifying me based on my face or identifying marks, but lately I feel like I shouldn't care, I should be open about it and that way I can't be blackmailed.
Why is it clearly potentially harmful? It is not clear to me at all, as Photoshop already exists.
How is someone harmed by someone else producing video that features people who look like them? The whole "seeing is believing" thing hasn't been true for ages, every movie is half CG these days.
I would perhaps buy the premise if deepfakes were sprung on the scene in the 70s or 80s, but the 90s and 2000s have inundated so many with completely fantastical high res CG imagery that nobody thinks videos are proof of anything anymore.
Do you agree that most people wouldn't like to find a nude video of themselves online?
Now why does it make a difference if the video is real or not? It's still a video of a nude body with your face on it.
I find this whole deepfake-porn trend incredibly disgusting. And it just saddens me that heterosexual males apparently immediately have to exploit such things for their own horniness. I guess this is also the reason why this topic is not being criticized as much: the people who develop and consume these videos can't comprehend how disrespectful it all is towards the people (women) who are being deepfaked, AND especially how disrespectful it is that the results are shared online, in most cases publicly.
This recently happened to a large number of women who are Twitch streamers, and they are all very upset. You are free to seek out and read their explanations as to why; I'll include a few tweets.
Because it's insanely creepy for someone to photoshop my face onto a naked body without asking me and then post said picture online?!? Especially if it's done so well that people don't notice that it's not actually me.
If your rebuttal is that "not everyone cares if deepfaked photos of themselves are being published online" then the answer is pretty simple: As long as you don't know if someone minds it, don't fucking do it.
It's really alienating to me when I think about the fact that we're discussing if it's okay to upload deepfaked nudes of someone without their consent.
The only reason I can come up with is that some people have probably consumed lots of deepfakes and now don't want to admit that it's maybe a bit creepy and wrong.
You don't need consent from anyone depicted in visual art to make visual art. It literally has nothing to do with them.
I see no moral, ethical, or consent issue with deepfakes whatsoever. People are allowed to make whatever sort of CGI they can imagine, and, most critically: they should be.
> You don't need consent from anyone depicted in visual art to make visual art
We're talking about (mr)deepfakes. Which is visual art, yes. Nude visual art.
That's kinda the point. The nudity is what makes it immediately not okay.
If you just deepfake someone's face onto Jason Statham's body, they will probably enjoy it and laugh about it.
If you deepfake their face onto someone who's getting roughly f*cked in the ass, they might not like it that much.
But then again, it's not just nudity that's wrong. Deepfaking someone's face into a video to hurt them is wrong as well.
I guess what I'm trying to say is that I really think that AI Ethics should be pushed harder and more seriously.
If someone deepfakes their favorite actress's head into a porn video, I still think that it's wrong. What I would appreciate is if I could at least agree with this person that what they're doing is not 100% okay.
> I think that most people would not care if a video is posted online of them if the video is in fact fake.
I think most people would care a lot. After all, it doesn't matter if it's fake or not; it matters if anyone you care about thinks it's real. And some almost certainly will.
Have you asked the people this happened to or are you just making assumptions? It’s important to critically examine our own biases in things like this, particularly if we have no idea why something evokes a strong emotional reaction.
We've not had the test case, but in places where nonconsensual or "revenge" porn is criminalized, deepfakes may well turn out to be illegal. They're certainly distressing for the victim.
This needs an asterisk (and I'll include one of my own: IANAL). In the United States, impersonation is regulated by the states and not consistent across them, but in general this is only true when using impersonation to commit fraud, defamation, or another act that itself is likely a crime already. Many cases would be considered parody or satire, even if no one finds them funny.
This may be the case in some places, and fraud is illegal in general, but should it be illegal to make parodies of politicians using paid actors? I think most people would say no.
Then why should it be illegal to do the same with an AI?