
From the main page: “Communication groups [...] • mrdeepfakes - the biggest NSFW English deepfake community”, which strains my assumption of good faith debate regarding their ethics.


If someone makes a deepfake on their own computer, watches it, and doesn't share it with anybody, I don't see how that's markedly different (morally) from just imagining the same thing. Some people have a very strong visual imagination and others don't have it at all, it's only fair if they can use a technological substitute.

Also some entertainment works by artificially instilling a desire which cannot be fulfilled. If people can use deepfakes and masturbation to defuse that desire it might be a moral positive for them.


I think it's a fundamentally different thing, morally speaking.

Imagination is fleeting, ephemeral, jumbled, and usually linked to a specific state of mind that passes once the person has either grown bored of it or achieved whatever satisfaction they wanted from it.

A deepfake image or video is persistent on your device and may leak some day, but worse, it feels real in a way mere imagination never can, and it's something a person could go back to over and over, feeding into and enhancing their obsession with the non-consenting person.

I think it's a bad thing in ways that imagination and porn are not.


What if I photoshop my crush's face onto a naked body? Wouldn't that be similar to having a personal deepfake?

I agree with the person you're replying to: you can't stop people doing these things in the privacy of their own home, and these deepfakes are functionally similar to other photo / video editing techniques.

_Distributing_ those photoshops / deepfakes is another thing entirely though, and one I am fully against.


> What if I photoshop my crush's face onto a naked body? Wouldn't that be similar to having a personal deepfake?

Yes, and it's already widely considered to be really fucking weird behavior


My childhood was typical. Summers in Rangoon, luge lessons. In the spring we'd make meat helmets.

Of course it took years for me to perfect my artistic skill in the area of erotic fingernail sculptures and by that time anyone could afford a Photoshop license.


Fully agree with ya there, but is it _immoral_? I would say no


The question wasn't whether people can be stopped from doing it, because for the most part they can't be, but whether it's moral. Or at least different in a moral sense from merely imagining someone else doing those actions.

And yes, photoshopping your crush's face onto a naked body is just as unethical and immoral, even if done in the privacy of your own home, for the same reasons I mentioned above.

Compared to deepfakes, however, photoshopping takes much more effort, yields less reward, and looks less real, which acts as its own inhibitor. Deepfake generators are becoming trivial to use, and not just for one or two pictures but for as many videos as you'd want. That's going to result in a very different driver of obsession.


> photoshopping your crush's face onto a naked body is just as unethical and immoral, even if done in the privacy of your own home, for the same reasons I mentioned above.

The reasons you mentioned above were "it feels more real than imagination" and "a person could go back to it over and over, enhancing their obsession with the non-consenting person". These are not moral bads that require a "no lusting after people, even in your heart" style rule. Being obsessed with people starts in high school and can persist for decades. They never find out, and it never hurts them.

I'm sure many people would like to take away others' ability to picture them naked, but fortunately they don't have the ability to enforce that by starting a moral crusade against a GitHub repo. It's one of life's innocent little pleasures.


I never said this was about a "no lusting after people, even in your heart" rule, nor do I believe it should be. If someone lusts after someone else in their head and never does anything to harm the other person there's obviously nothing wrong with that.

It's immoral because it's creating something persistent that could harm someone without their consent, amongst other things.

I also question your assertion that obsessions of this type are always harmless. It's obvious that many are not, escalating into outright stalking, confrontations, or worse, and anything that makes that sort of obsession more likely, or helps to enhance it where it exists, increases the chances of that happening.


Morality evolves with technology. At some point in the future, even killing someone might not be very immoral, if technology reaches the point where we can back people up and restore them from backup.


Maybe, but what’s the relevance? We can’t predict that, nor can we adjust today’s morality to match what technology might enable in future.

Perhaps murder will one day be not very immoral, but that doesn’t mean we should treat it any differently today.

Right now, creating fake naked pictures and videos of real and non-consenting people is immoral. In some jurisdictions it’s illegal. It’s also probably harmful, by taking obsessions further than mere imagination would allow.


mrdeepfakes.com is a site where deepfakes are being shared publicly, so your comment doesn't have much to do with the comment you actually replied to.


https://en.wikipedia.org/wiki/Aphantasia is a real thing and affects about 4% of people.


> strains my assumption of good faith debate regarding their ethics.

Let's flip the argument upside down. Let's imagine you want to produce porn content, but don't want to be recognized. This technology allows that.


Side note about that: people doing amateur porn where their face doesn't show at all still get recognized because of particular marks on their skin. So this might help that very specific case, but maybe not as much as one would think.


It's trivial to use Stable Diffusion img2img with low noise to change the marks on your skin.
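To sketch what that might look like in practice (a hypothetical example, assuming the Hugging Face diffusers library; the model id, file names, prompt, and parameter values are illustrative, not taken from the repo under discussion):

    # Hypothetical sketch: img2img with low strength repaints fine detail
    # (e.g. skin marks) while preserving the rest of the image.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    init_image = Image.open("frame.png").convert("RGB")  # placeholder input

    # "Low noise" here is the strength parameter: a small value adds only a
    # little noise before denoising, so changes stay subtle and localized.
    out = pipe(
        prompt="photo, natural skin texture",
        image=init_image,
        strength=0.25,
        guidance_scale=7.5,
    ).images[0]
    out.save("frame_edited.png")

The tradeoff is that a higher strength changes more of the image, so how aggressively marks get repainted has to be balanced against keeping the frame recognizable.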


I've already seen porn where someone's face is blurred.

And whose face are you going to put in that porn with this?


The readme for the repo has examples of AI-generated faces. Think This Person Does Not Exist, but better.


Emma Watson, probably


TBH I got curious. Clicked on one celeb, and oh man, I've seen 90s Photoshop jobs that cut and pasted a face onto a different body better than those "deepfakes"...


Nothing about producing or consuming deepfakes suggests bad faith.

You might not like the art, but deepfakes are not harmful.


“NSFW deepfakes” literally means “porn with real people’s faces inserted.”

Maybe that’s an art form, but it’s also clearly potentially harmful in the same way as revenge porn, teenage nude selfies, and other cases where regular people and porn intersect.


This sounds like something literally 90% of humanity does in their imagination at some point in their lives. It's inevitable that someone is going to do it in a video when the tech becomes good enough.

I'd say it's not the thing itself that's a problem, it's publishing whatever you create as if it's real. There are many things in life that are potentially harmful, but are part of everyday life since nobody actually harms anyone with them.


You do see the difference between a thought, shared with no one and visible to no one, and an image on a computer, which can be uploaded to the internet, right? This isn't about stopping sexual desire, it's about not distributing images of someone's naked body without their permission. That it's not a guaranteed accurate representation is sort of beside the point. It's good enough if you've never seen them naked before. Some people have gone on record saying they don't like it when it happens to them. Why, do you think?


> You do see the difference between a thought, shared with no one and visible to no one, and an image on a computer, which can be uploaded to the internet, right?

Why do you assume it’s shared with no-one? The representations change, but I can guarantee you that Bianna the Beautiful had several suggestive clay figures created of her without her permission.

Not to say that that was necessarily great; Bianna would probably have been upset too. But trying to fight the method of representation instead of the sharing seems like a fool's errand to me.


99% agree. The 1% is for the potential for deepfakes to solve blackmail. If producing deepfakes becomes trivial, any blackmail threat can be dismissed with “Whatever, I’ll just say it’s a deepfake”.


This is not how humans work. False statements about someone damage that person, even once everybody knows they're false.


But with this argument you could blackmail someone whether or not you have some evidence.


You indeed can. Any of your coworkers can report you to HR right now for allegedly saying to them in private that you find underaged boys attractive. Wouldn't that at least be quite stressful to you?


Personally, I've been on the fence about doing sex work. On the one hand, I'm nervous about people identifying me based on my face or identifying marks, but lately I feel like I shouldn't care, I should be open about it and that way I can't be blackmailed.


Why is it clearly potentially harmful? It is not clear to me at all, as Photoshop already exists.

How is someone harmed by someone else producing video that features people who look like them? The whole "seeing is believing" thing hasn't been true for ages, every movie is half CG these days.

I would perhaps buy the premise if deepfakes had sprung onto the scene in the 70s or 80s, but the 90s and 2000s inundated so many people with completely fantastical high-res CG imagery that nobody thinks videos are proof of anything anymore.


Do you agree that most people wouldn't like to find a nude video of themselves online?

Now why does it make a difference if the video is real or not? It's still a video of a nude body with your face on it.

I find this whole deepfake-porn trend incredibly disgusting. And it just saddens me that heterosexual males apparently immediately have to exploit such things for their own horniness. I guess this is also the reason why this topic is not being criticized as much: the people who develop and consume these videos can't comprehend how disrespectful it all is towards the people (women) who are being deepfaked, AND especially the results being shared online, in most cases publicly.


I think that most people would not care if a video of them is posted online if the video is in fact fake.

Would you care if someone posted a nude image of someone with your face photoshopped on it? If so, why?


This recently happened to a large number of women who are Twitch streamers, and they are all very upset. You are free to seek out and read their explanations as to why; I'll include a few tweets.

https://twitter.com/qtcinderella/status/1620264657926885380?...

https://twitter.com/mayahiga/status/1620586546083803136?t=c7...


> Would you care

Yes.

> why?

Because it's insanely creepy for someone to photoshop my face onto a naked body without asking me and then post said picture online?!? Especially if it's done so well that people don't notice that it's not actually me.

If your rebuttal is that "not everyone cares if deepfaked photos of themselves are being published online" then the answer is pretty simple: As long as you don't know if someone minds it, don't fucking do it.

It's really alienating to me when I think about the fact that we're discussing if it's okay to upload deepfaked nudes of someone without their consent.

The only reason I can come up with is that some people have probably consumed lots of deepfakes and now don't want to admit that it's maybe a bit creepy and wrong.


I just don't understand why anyone would care?

You don't need consent from anyone depicted in visual art to make visual art. It literally has nothing to do with them.

I see no moral, ethical, or consent issue with deepfakes whatsoever. People are allowed to make whatever sort of CGI they can imagine, and, most critically: they should be.


> You don't need consent from anyone depicted in visual art to make visual art. It literally has nothing to do with them.

Yes you do. https://en.wikipedia.org/wiki/Personality_rights


> You don't need consent from anyone depicted in visual art to make visual art

We're talking about (mr)deepfakes. Which is visual art, yes. Nude visual art.

That's kinda the point. The nudity is what makes it immediately not okay.

If you just deepfake someone's face onto Jason Statham's body, they will probably enjoy it and laugh about it.

If you deepfake their face onto someone who's getting roughly f*cked in the ass, they might not like it that much.

But then again, it's not just nudity that's wrong. Deepfaking someone's face into a video to hurt them is wrong as well.

I guess what I'm trying to say is that I really think that AI Ethics should be pushed harder and more seriously.

If someone deepfakes their favorite actress's head into a porn video, I still think that it's wrong. What I would appreciate is being able to at least agree with this person that what they're doing is not 100% okay.


> I think that most people would not care if a video is posted online of them if the video is in fact fake.

I think most people would care a lot. After all, it doesn't matter if it's fake or not; it matters if anyone you care about thinks it's real. And some almost certainly will.


Have you asked the people this happened to or are you just making assumptions? It’s important to critically examine our own biases in things like this, particularly if we have no idea why something evokes a strong emotional reaction.


Wouldn't it take dramatically more effort to photoshop every frame of a video convincingly than to create a deepfake of it?


We've not had the test case, but in places where nonconsensual or "revenge" porn is criminalized, deepfakes may well turn out to be illegal. They're certainly distressing for the victim.


It's already illegal for a person to pretend to be another real person without their consent.


This needs an asterisk (and I'll include one of my own: IANAL). In the United States, impersonation is regulated by the states and not consistent across them, but in general this is only true when using impersonation to commit fraud, defamation, or another act that itself is likely a crime already. Many cases would be considered parody or satire, even if no one finds them funny.


This may be the case in some places, and fraud is illegal in general, but should it be illegal to make parodies of politicians using paid actors? I think most people would say no.

Then why should it be illegal to do the same with an AI?


This is a shallow take, no matter which way you look at it. "Guns don't kill people."


Nearly everything remotely related to deepfake porn not only "suggests" bad faith, but positively compels the judgment of extreme levels of bad faith.



