Hacker News
Show HN: Pornpen.ai – AI-Generated Porn (pornpen.ai)
711 points by dreampen on Aug 23, 2022 | 415 comments
Hey HN, I've been working on https://pornpen.ai, a site for generating adult images. Please only visit the site if you are 18+ and willing to look at NSFW images.

This site is an experiment using newer text-to-image models. I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated. New tags will be added once the prompt-engineering algorithm is fine-tuned further. If the servers are overloaded, take a look at the feed and search pages to look through past results.

For comments/suggestions/feedback please visit https://reddit.com/r/pornpen

Enjoy!




I had some fun with StyleGAN a couple years ago. I should probably have written this up, or something. Some others I showed it to thought I should. But I never took it too seriously. I was just bored over a couple weekends when I had access to some beefy GPUs. And y'know, it's quasi-porno.

I opted for dudes not gals. Partly personal preference, partly because I happened to get my hands on an enormous dataset of hot shirtless guys taking selfies.

The initial results were rather poor: https://i.imgur.com/npzDcCL.jpg

That's around when I realized how important the training data really is. There was too much variation in pose. So I trained a model that detected the face, nipples, and belly button, and used those keypoints to align the images vertically. Then I briefly skimmed over the dataset manually and deleted anything that varied too much from the median, in a rather arbitrary way.
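The alignment step described above can be sketched as a least-squares similarity fit from detected keypoints to a canonical layout. A toy reconstruction; the canonical coordinates and function names here are mine, not the original author's:

```python
import numpy as np

# Hypothetical canonical keypoint layout in normalized image coordinates:
# face, left nipple, right nipple, belly button. Illustrative values only.
CANONICAL = np.array([
    [0.50, 0.20],
    [0.35, 0.50],
    [0.65, 0.50],
    [0.50, 0.80],
])

def similarity_transform(src, dst):
    """Least-squares scale/rotation/translation (Procrustes) mapping src -> dst."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean
    # Kabsch: optimal rotation from the SVD of the cross-covariance matrix.
    u, s, vt = np.linalg.svd(src_c.T @ dst_c)
    rot = (u @ vt).T
    # Optimal isotropic scale for that rotation.
    scale = s.sum() / (src_c ** 2).sum()
    t = dst_mean - scale * rot @ src_mean
    return scale, rot, t

def align(points):
    """Map detected keypoints onto the canonical layout."""
    scale, rot, t = similarity_transform(points, CANONICAL)
    return scale * points @ rot.T + t
```

Applying the recovered scale/rotation/translation to the actual pixels would then be a standard affine warp (e.g. OpenCV's warpAffine).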

Results were better: https://i.imgur.com/bS7ERC6.jpg

So I just let it churn away for a week, heating up the basement. Some final results with narrow truncation and style transfer: https://i.imgur.com/rhnWnpK.jpg

One thing I find intriguing is how bland those men are, really. Oh, beautiful, but too classically beautiful. In a sense, StyleGAN is a compression algorithm, and highly truncated outputs are something like a median of the inputs. So the above reflect a sort of "average" idealized image that contains the traits most commonly found in the media I trained it with.
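That "compression toward a median" effect maps directly onto StyleGAN's truncation trick: each sampled latent is interpolated toward the average latent, and the closer psi gets to zero, the closer every output gets to the dataset's "average" face. A minimal sketch (variable names illustrative):

```python
import numpy as np

def truncate(w, w_avg, psi):
    """StyleGAN-style truncation: pull a sampled latent toward the mean latent.
    psi=1 returns w unchanged; psi=0 collapses every sample to w_avg."""
    return w_avg + psi * (w - w_avg)
```

With psi around 0.5 or lower, diversity drops sharply and you get exactly the "classically beautiful but bland" average described above.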


In your example link here:

https://i.imgur.com/bS7ERC6.jpg

you show one fellow with a smartphone stabbed through the chest, plus a headphone cord fused into his forearm veins.

When you "let it churn away for a week," which of the following is more likely:

1. Smartphone Stab Wound Guys decrease in likelihood

2. More realistic-looking smartphone stabs

In either case it doesn't exactly inspire confidence in self-driving cars.


In a future Tesla shareholders meeting:

“Version 10.2 can detect children crossing the road with 97% accuracy, and collides with them dead on at the same rate. Those well publicised close calls are now a thing of the past!”


To be fair, if a kid crosses a road dangerously in front of human drivers, there are a lot of factors at play that impede the accuracy of those drivers. No system is perfect and we shouldn’t expect that.


Absolutely, but we cannot assume that machines have the same “right to make a mistake” as humans. If a machine kills a person, somebody is responsible and will get sued.


The good news: we kill fewer people.

The bad news: when we do, it's unfathomable how anything but a computer could do this


Sounds like it might be time for another instalment in the franchise https://en.wikipedia.org/wiki/Tetsuo:_The_Iron_Man


Superb


This would make for a hilarious blog post


The community can never have too many blog posts / ML experiment writeups, so do write a blog post on your experiences; it would definitely be interesting.


> partly because I happened to get my hands on an enormous dataset of hot shirtless guys taking selfies.

I feel like this is burying the lede. I'm very curious where you sourced this training data.


Wabi-sabi is a thing. A friend noted years ago how the weird teeth or hair colour kids are super self-conscious about often turns out to be their most beautiful trait later.


Those first images remind me of the "brainlet" memes you often see in high-quality shitposts (something I appreciate more than the average HN reader I think).




Look at those reflections



They’re even more wonderful to stare at when there’s a pair of ’em.

Boobies are beautiful as well: https://en.wikipedia.org/wiki/Booby


I suppose that could be considered "contamination" in the training data.


Inevitable successor to this: a TikTok-style adaptive porn feed that learns exactly what you like and starts generating porn customized to your kinks and preferences.
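For what it's worth, the "learns exactly what you like" part wouldn't require anything exotic; even a toy epsilon-greedy bandit over content tags converges on a user's revealed preferences. A hypothetical sketch, not any real product's code:

```python
import random
from collections import defaultdict

class TagBandit:
    """Toy epsilon-greedy recommender: serve the tag with the best observed
    like-rate, but explore a random tag a small fraction of the time."""

    def __init__(self, tags, epsilon=0.1, seed=0):
        self.tags = list(tags)
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.shown = defaultdict(int)
        self.liked = defaultdict(int)

    def pick(self):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.tags)  # explore
        # Exploit: like-rate with a +1/+2 prior so unseen tags still get tried.
        return max(self.tags,
                   key=lambda t: (self.liked[t] + 1) / (self.shown[t] + 2))

    def feedback(self, tag, liked):
        self.shown[tag] += 1
        self.liked[tag] += int(liked)
```

A production feed would use far richer signals (dwell time, embeddings, collaborative filtering), but the feedback loop is structurally the same.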


Porn addiction is already a serious problem for many people. Combine an enhanced version of this model with TikTok style delivery and I’m very fearful of the end result. It’ll be the equivalent of crack in terms of the dopamine rush.


We may experience the first ever dopamine burnout. How much can the brain handle?

Remember the film "Brainstorm" from the 1980's? Where the guy almost dies because he blasts a real-life full brain capture of sex into his own brain in a loop for days?

We'll probably learn if there is such a thing as too much dopamine.


> Remember the film "Brainstorm" from the 1980's? Where the guy almost dies because he blasts a real-life full brain capture of sex into his own brain in a loop for days?

No, I had to go look for it, and it was on youtube (NSFW), but is this the scene [0] you're talking about? This was likely what inspired the braindance features in Cyberpunk 2077; it's pretty much the 1980s iteration (with a massive tape on what looks like a futuristic VHS cassette, and a phone for some reason).

Well, it wasn't exactly just dopamine, but there was a woman who pleasured herself to a severe degree [1], with 30-minute orgasms that impacted her entire life; it's become a meme on JRE [2].

Sidenote: this is probably the only film where Chris Walken wasn't like he is portrayed now; even Kings of NY had that vibe about him, but here he is kind of... normal?

0: https://youtu.be/cOGAEAJ4xJE?t=2969

1: https://www.zmescience.com/other/feature-post/brain-pain-cen...

2: https://www.youtube.com/watch?v=Oq6KMk8tkNE


Reminds me of an Arthur C Clarke story, Patent Pending: https://en.wikipedia.org/wiki/Patent_Pending_(short_story)


I cringe at the thought of watching it again and dismantling it from the very high place it currently stands on my all-time favorite movies list. Mostly just for the concept, I can't even remember the plot.


Unlike other films from my childhood, it holds up very well, mostly due to the actors. The very, very last moment (what "really" happens at death), however, is awful.


Sorry to say, TikTok for P*rn has already been made; it works on iOS and Android. It probably already has millions of users, with a percentage of them uploading original content. Not giving the name or link here for obvious reasons.


It’s Time to Build


"If you build it, they will come" takes a new meaning


As does "normalization of deviance".


This could get very ugly if the algorithm begins to generate images of underage persons in response to the preferences of those twitter users who have 'MAP' in their biography, an acronym which stands for 'minor-attracted person.'


That's why we shouldn't be giving those people nice things.


MAP huh? Exactly a TIL I really hope comes in handy for Jeopardy one day.


clap clap clap


That would give the models on TikTok cause to unionize, because it's taking their jobs away.


The really smart models will put together an AI to generate images of them using only their own images as training data, and then they can sit back and profit.

Eventually we'll be watching generated personalities on twitch playing video games, and then we'll know the end is nigh. There is no infinity, eventually the snake chokes on its own tail...


>Eventually we'll be watching generated personalities on twitch playing video games

that won't happen because the entire value proposition is in the personal connection to the streamer. (the modern derogatory term is 'parasocial relationship').

just like nobody watches chess computers or starcraft bots play, nobody is going to watch bots play games. in fact, to stay on the topic of the thread, people are making millions on onlyfans because they realized chatting with their viewers is much more valuable than generic pornography.


> that won't happen because the entire value proposition is in the personal connection to the streamer. (the modern derogatory term is 'parasocial relationship').

That "personal connection" is about as deep as the one I have to my favourite bartender. So I think the negative connotation of the term is quite justified.

I'm pretty sure, once we can add the ability to read and write external state to GPT3 and make the state personalized by user, we can emulate that as well :)


> the entire value proposition is in the personal connection to the streamer

Right - that’s where the money is made. The people that pay them have to be found in the broader market, though, and that’s where this sort of thing would shine. A streamer could pump out a stupid amount of content with this, always fresh and new, with little or no ongoing time commitment. That in turn would free up their time to provide personalized content to paying customers.


No joke Instagram influencers already do this to a degree.

I knew a guy who came across a post by an influencer at the cafe he worked at. She never came in. A different girl came in and had breakfast taking photos of the food and scenery.

It wasn’t a paid post. She just needed content so she outsourced it to someone who generated it for her.


There may be a difference between what we currently know as "bots" and when someone finally links generated images/video, and content through text and speech together into something that does a good approximation of a person.

The fact that it might occasionally do or say crazy things as it's not a real person might even be part of the draw. It's not like real people don't sometimes adopt controversial opinions purely for the attention it brings.


We've already got vtubers playing robot characters, it feels like it's only a matter of time before someone attempts to make a 'true' AI vtuber. Though I suspect we're still a while away from AI being able to complete arbitrary video games.


Those generated images, fixed permanently at the stage of early adulthood, could still be produced when the model is well past retirement age.


Yep, that was one of the aspects that came to mind. I wonder if anyone has trained up a bunch of Marilyn Monroe images to generate new content of her?

What's the legal precedent for those images? Does it depend on the source set of data? If that's public domain, does that mean new images are owned by the producer (who is likely to be someone else entirely)? Is it more like someone who draws a likeness of someone else? (I'm not sure about the copyright of that either, but it's probably well defined by this point.)


There has been a series of pop concerts of late featuring holograms, sometimes of deceased members; notably, the group TLC toured with a hologram of the late Lisa 'Left Eye' Lopes, who had died in a car crash a decade before. Some kind of legal precedent must have been set in order to permit that to happen. In this case, it looks like whoever holds the rights to the original performances from which the holograms were generated is the owner.

It would be a different matter, as you outlined above, if one created a digital facsimile of a person and used it to perform in ways that the original person did not. Does a person own their own persona?


There's a business model in there, though: the algorithms need well-tagged poses and images, and one of the biggest issues with sex work is that it is used to attack people in other areas of their lives (even in a perfect society for this, the desire not to have it directly associated with your identity would probably still exist).

If you can retain paying customers, and then pay performers for a supply of source material (which won't, unaltered, ever end up in customer hands) then there's a new business model somewhere in there.


And then sells your preferences to the highest bidder. A bit like the Westworld HBO series.


For the general public you mean. I'm sure Bezos already has a feed of "urinate in pants".

Why the downvote? At $15/h, his $162.7 billion could buy 1.24 million years' worth of bathroom breaks. You want to argue other motivations are at play here?



I'm having some trouble finding it through search, but if I recall, this project was submitted to HN at some point.


Was it this one? https://news.ycombinator.com/item?id=9847283

If so, it existed for a while but shut down years ago.


Yes, I think this is it. It wasn't generated porn, but it would home in on what you liked, similar to TikTok.


hmm yes that will definitely not totally fuck up the dopamine systems of young men and make an already badly addicted generation hopelessly dependent on this shit


We lost that battle a decade ago


Porn "addiction" and all that hand wringing is just FUD from the moral panic types and the skeezy media "doctors" that prey on them. In fact non-substance "addictions" aren't recognized at all. "Gambling disorder" is presently the only condition in the subsection of "Non-substance-related disorders" in DSM 5.

Addictions mediated by chemicals that directly increase perceptions of incentive salience through mesolimbic dopaminergic activation are real addictions.

Perceptions of sensory stimuli that come in through the normal paths do not directly hijack any part of the brain like addictive chemicals do. Downstream of the stimuli there may be reward learning, but it's through normal channels, not directly turning up incentive salience. You actually have to like/enjoy something first before reward learning happens.


Is there a DSM-5 condition for outsourcing all your rational facilities to the DSM-5?


I cited the current status quo and then I explained the scientific reasoning behind it. You are implying the DSM 5 created my beliefs, but I only cited it here because if I just stated the science alone I'd be criticized for that instead. I've spent decades reading journal articles on this topic and there's every reason to believe that substance-mediated addiction is a very different thing than non-substance-mediated "addiction".

The crackpots and abusive groups out there holding "porn addiction" and related treatment camps and services are just about as legit as the "pray the gay away" camps. They're completely unethical and unsupported but moral panic keeps them flush with cash.


Is it possible your issue is with the overloading of the word "addiction"? If we called it a compulsion instead would you be ok with that?

'cos I'm just thinking that Skinner boxes look pretty damn addictive from the outside, regardless of the precise mechanism of action.


Yes, that's the primary issue I'm addressing. But even beyond that, there have been many real studies (not anecdotes) showing that masturbation and porn use have no negative effect on sex with partners. In fact most of the ones I've read show the reverse correlation if anything: people who masturbate more are better able to enjoy sex with partners. ref: https://www.nature.com/articles/s41443-022-00596-y

Additionally, scientists have studied if it is possible to become "addicted" to sex and other normal sexual stimuli. It doesn't happen. At least not like with, say, methamphetamine. Imagine how many people would be "addicted" to sex if it were. ref: https://akjournals.com/view/journals/2006/11/2/article-p222....

The real dangers here are cults like "no fap" (ie, NCOSE, FightTheNewDrug, Reboot) and other unhealthy moral panic groups promising to kill each other and repeatedly threatening to kill adult actors, media producers, and scientific researchers.

People with the strongest opposition to scientific consensus have the lowest levels of objective knowledge--yet they also have the most confidence. If someone claims to "treat" sex addiction or porn addiction, they are part of the problem providing "non-evidence-based treatment". ref: https://www.sciencedirect.com/science/article/pii/S027273582...

https://link.springer.com/article/10.1007/s10508-020-01884-8 "Claiming Public Health Crisis to Regulate Sexual Outlets: A Critique of the State of Utah’s Declaration on Pornography"

https://ajph.aphapublications.org/doi/full/10.2105/AJPH.2019... "Should Public Health Professionals Consider Pornography a Public Health Crisis?"


Small point: you don't have any evidence that moral panic groups are unhealthy. If you're going to wield that "objective knowledge" sword, you'll need to die by it, too.


You're right. It probably doesn't change them.

There's a good chance there's just a selection bias for the type of person that joins a nofap cult (religious or otherwise) being intrinsically violent (unhealthy to others). Now they just have a cult to encourage them and tell them which researchers to send the death threats to.


> scientists have studied if it is possible to become "addicted" to sex and other normal sexual stimuli. It doesn't happen.

Dude, what planet are you living on? Sex addicts (people who compulsively seek sex to a harmful degree, and wish they could stop) are extremely common, as are e.g. food addicts.

Porn is also hardly a "normal" stimulus (i.e. comparable to stimuli that would have been found in the evolutionary environment). Cf https://en.m.wikipedia.org/wiki/Supernormal_stimulus

Honestly, it seems like you are saying things that are very obviously untrue because you've read some (probably methodologically and epistemically poor) papers. You should significantly reduce your (unwarranted) confidence or you are likely to get burned.


Please don't cross into personal attack. You can make your substantive points without that.

https://news.ycombinator.com/newsguidelines.html


He studies forms of addiction. He's just relaying things he has studied.


I don't think that observation is relevant to any of my objections.


My issue is with the use of the term “addiction”, yes. It’s multiple orders of magnitude smaller in effect size on dopamine receptor down regulation compared to Opioid Use Disorder, for example, and I think conflating them can be a bit harmful.

That said, this battle was lost long long ago, so it’s somewhat of a moot point.


addiction refers to a pattern of behavior not just its scale. like, you can be addicted to eating even though it has a much smaller effect than smack and people still are and end up 600 pounds. i also think addiction around porn is not just addiction it's a fucky form of erotic plasticity that gets people hooked on weird stuff that messes with their ability to have normal, healthy relationships.

like, i watch a shitton of peers have problems with this and then y'all tell me "oh at least it's not opioids". bruh. yes it's not opioids it's a different kind of problem. then there's the people who are out here like "oh you're just a puritan" refusing to admit that there's literally anything potentially dangerous about porn at all.


> oh at least it's not opioids

That's not what I'm saying at all. I'm saying that we shouldn't conflate the two via language, because as you also admit, they're not even in the same planet with regards to their danger and effect on the human organism. They don't operate the same way at all. I said nothing and passed no judgement on the dangers or lack thereof of porn.


The point you should be catching, I believe, is that such things exist in reality independent of their existing in the DSM-5


> I cited the current status quo

You certainly did.

> and then I explained the scientific reasoning behind it.

That's charitable.

> there's every reason to believe that substance mediated addition is a very different thing than non-substance mediated "addiction".

And? They are both addictions.


Carry on doing your thing then.


They’re working on changing that, too.


That's a long description of habit formation.


/snark Welcome to the metaverse


Better than Instagram, at least.


no, it's gonna have people combining dark pattern skinner box shit from insta and tiktok with the addictive nature of porn. and it's gonna be way worse. yall wouldn't believe me if i told you how common straight up porn addiction is in young kids, it's fucking bad already.

edit: @stickfigure can y'all please quit trying to tar me with this brush, it's bullshit. i'm not arguing about masturbation, i'm saying excessive consumption of porn when people have limitless access to weird ass shit that doesn't reflect the real world is causing problems. don't try to make me some puritan or whatever because literally nothing i said suggests that.


If people wanted to work on this, I genuinely believe that could be huge.


I would happily pay $100 a month for that.


Eventually this and similar technology will be able to generate unlimited porn of all types. Including CP. Should or even could it be made illegal?

To me that kinda feels like if it was made illegal then you are basically acting as the thought police.

Edit: are people just uncomfortable talking about this or do they think downvoting is going to somehow prevent it?


The question of whether machine-generated CP is ethical is a confusing one to me, since the crux of the problem with CP is that it violates the rights of a child. If you get rid of that fact, CP becomes a 'victimless crime', in a sense, and it becomes hard to pinpoint what the actual perpetration is. (Ignoring the fact that a CP model has to be trained on real images, which destroys this argument outright, for now).

Nobody wants to be a pedophile, but if you are unfortunately hardwired that way the impulses are impossible to ignore—similar to any other sexual impulse that 'healthy' adults have.

I think the goal is to create the least amount of harm for children, and it's theoretically interesting to consider the possibility that machine-generated CP could be used to ultimately end the real-life CP black market.

I'm sure this won't ever happen because nobody wants to touch this subject with a 10-foot pole; and our species tends to favor solutions that are idealistic rather than rational in many ways. But the European approach to criminal justice is something like: "humans are not perfect, and we can't fix them, so let's try to engineer solutions which make living together more pleasant". Nobody likes an icky solution like this, but the alternative is a lot of children suffering (although I'm not saying those two options are perfectly substitutable).

On the other hand, it could be a completely stupid idea outright. Just an interesting thing to think about.


>(Ignoring the fact that a CP model has to be trained on real images).

that's not how this tech works.

Did you think actual guinea pigs had to be trained to do household chores for these pictures to be generated? (dailywrong.com/new-course-teaches-guinea-pigs-household-chores/)

I had a lot of fun creating pictures of llamas in space. I'm pretty sure no llamas have ever walked on the moon.


I think it is how the tech works, though. The model knows what a guinea pig looks like, and it knows what household chores look like, so it can concatenate the two. I don't think the model knows what children look like unclothed, and I don't think it can posit that by looking at nude human adults. But I could be very incorrect there, and these techniques will certainly evolve over time.


I think generating CP with AI would be 10000x easier than most of what I have personally made with dall-e.

Creating a llama wearing a space suit and walking on the moon is a lot harder than filling in the very limited amount of skin hidden by a swimming suit.


The question is: does it reduce demand in the real world, or does it serve as a demand multiplier in the real physical world?


I think these are the right questions to ask, and these types of things would need to be studied in order to pose a potential solution. But I can't imagine any modern day institution beginning to pose these questions without facing public backlash. The sensationalized headlines that would spin out of this would probably stop any research in this area in its tracks.

In that way our species is very primitive. We're trigger happy in our outrage and often unable to focus on a larger shared goal, much to our own demise.


We can study near-like issues to get a hint of the psychological effects.

Let people into bestiality play in their VR worlds and see whether it suffocates demand or increases it, and whether we see spillover into the real world. (Find a jurisdiction where the above is tolerated.)

Maybe Thailand can produce some data given their apparent tolerance for certain behaviors.


I think the research shows that availability of normal porn reduces rape. (Don't have a link at hand, sorry.) So I would guess the effect would probably be the same here.


It's the age-old dilemma: is it a safety valve or a gateway to worse things?


But what exactly are those worse things?


It's not an entirely new issue; see the discussions around "loli erotic japanese manga" etc. Though I agree with you, nobody is touching this tech anytime soon (at least publicly), even though you can easily imagine safe applications, like MRI studies with AI-generated images to map exact brain patterns for potential treatments and so on.


I don’t think you can ignore the fact that the model would have to be trained on child sexual abuse material, though. That’s pretty important.


I currently view that as a technical limitation that will likely be lifted as the technology progresses. Our species has been hard at work in the field of AI image generation for only a decade, and we have thousands of years to go. So I would posit that this barrier is probably a short-term one.


I agree. But what do we do about the models that are trained on CSAM prior to this inevitable breakthrough? Because you and I both know it’s likely to happen.


What about writing a blog post about raping and killing people? We have to draw the line somewhere, and for CP the line is at the first step.


Unfortunately, I don't think folks watching CP would stop at that. Porn is an addiction (or compulsion?) and I don't see how CP makes it any better for folks with pedophilic tendencies.

To think that porn reduces sexual tendencies may be true (in the short to mid term), but eventually, as the addiction intensifies, those hooked have the urge to seek more novel, more aggressive versions of it (in reel or for real).

https://en.wikipedia.org/wiki/Effects_of_pornography#Sexual_...


Otoh, I have read articles worrying that young men aren't going out looking to meet women like they used to, because they are satisfying their desires through porn.


Not at all contradictory. Opioid addicts need higher and higher doses for their fix. In the worst case they too drop out of society, committing crimes such as theft and prostitution to service their addiction.


I agree that people addicted to porn want more porn. The question is whether they also want more real-life sex, or less real-life sex. It seems weird to have simultaneously a moral panic about both options.


Oh no the engine of society is losing fuel to burn through, how awful. How will the necrarchy ever afford their servants if the subjects have dodged the exploitation of their reproductive urge.


If you fear your reproductive urge is being exploited, I think eradicating the exploiters is a better course of action than forgoing having children. There are better reasons not to have children than using it as a way to harm your overlords.


Add reefer into the mix, and you've got a recipe for madness.


I don't know if you're being sarcastic here or not, but I've mixed porn with stimulants and the result is definitely not pretty. I've gone through 16-hour masturbation sessions and have done things I would have never imagined. The thoughts still haunt me. Porn addiction is not a joke.


It is a terrible idea to compare pedophilia to drug use.

One might be a victimless crime.

The other never can be.


perhaps it was just in response to the second part

> To think that porn reduces sexual tendencies may be true (in the short to mid term), but eventually, as the addiction intensifies, those hooked have the urge to seek more novel, more aggressive versions of it (in reel or for real).

and if so, it's a joke about the reefer madness movie and moral panic


Is that any different from the argument about violent videogames back in the day? Or Dungeons & Dragons before that?


I wish people would explain why they're downvoting these ethics-related comments so aggressively, because I'm interested in these conversations and not sure why other users find them irrelevant.


The guidelines specifically ask you not to post like this: https://news.ycombinator.com/newsguidelines.html. It's completely off-topic and, what's worse, repetitive.

It's frustrating not to be able to know why someone downvoted a comment, but it's the nature of the beast. Sometimes people think we should require explanations, but you can't change the way people are by demanding they jump through a hoop.

If you see unfairly downvoted comments, give them a corrective upvote (https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...). Other users will hopefully do that too.

A lot of the time, that happens and then posts like this end up as uncollected garbage in the thread, complaining about something that doesn't exist anymore.


I've personally noticed that the corrective upvotes start right after someone posts a "why downvote" rant like the GP did.


Yes, I'm sure that's a thing too.


I think it demonstrates why it's so easy for technology to run ahead of ethics. No one wants to hear a public safety announcement when they get the keys to a new car.

OTOH, I think the risk of CSAM from this kind of thing is overblown, because either (1) it wouldn't actually involve real children in any way, or (2) the model would have to be trained on actual CSAM, which there are already laws against.

If no humans are harmed by it at any stage, then how's it any different from the CGI murders all over movies and TV? In the absence of harm, the only reason to consider it a crime is that it's disgusting. But we don't prosecute people for having sick thoughts, just for acting on them. If you take the child and the abuse out of the equation, the only argument left is that seeing CSAM (artificial or not) is a "gateway drug" to committing child abuse. If that's the case then why shouldn't we prosecute people for watching Dexter, on the grounds that they're on their way to becoming serial killers?


> If no humans are harmed by it at any stage, then how's it any different from the CGI murders all over movies and TV? In the absence of harm, the only reason to consider it a crime is that it's disgusting. But we don't prosecute people for having sick thoughts, just for acting on them.

You say that, but created from scratch drawn material is illegal in some countries. What you've stated is a nice ideal but often doesn't match reality, and I'm on the fence as to whether that's necessarily a bad thing in this case (I'm not sure we really know enough about sexual drive and how it interacts with the subconscious to know whether it's actually benign or has a societal cost).


> You say that, but created from scratch drawn material is illegal in some countries.

Homosexuality is illegal in some countries.


My point was the last sentence of what I quoted doesn't quite match reality. We do prosecute people for sick thoughts in some of these cases, regardless of whether we should.


Well... just because some countries prosecute drawings (or homosexuality... or homosexual drawings) doesn't mean there's any logic behind it. There can be enormous, totally illogical public will whipped up in favor of banning all sorts of things people consider abnormal.

We can definitely have that discussion, as to what thoughts or creativity cross the boundary to criminal behavior, because likely at least some of them toe the line. But let's at least acknowledge the gradient of criminality between crimes with victims and those without, and not muddy the waters by pretending that every victimless crime is a stand-in or call to arms for a violent one. That way lies totalitarianism.

Let's call it what it is when we want to jail people for having sick ideas, or for creating disgusting art: it's persecuting bad aesthetic choices. It's Thoughtcrime. Either this actually harms humans (crime) or not. Either it leads to a crime (prosecutable) or doesn't. The rest is in the realm of policing people's minds, or the future.

And while some countries do that, they tend to be places people who have unique ideas to contribute don't want to live, because that type of control is never limited to one particular type of image or another, but mainly serves as an excuse for imprisoning political enemies. Those countries end up poorly because they shoot themselves in the foot by driving away anyone who might paint edgy art or say things like the earth revolves around the sun, or that God doesn't exist, for example.

I'm not saying a society can't make the choice to prosecute thoughtcrime, I'm just saying it's a bad idea. And it's especially bad to mix it up with prosecuting real crime, because criminalizing thought is the onramp to the corruption that lets neighbors turn on each other, everyone bribe the police to arrest their petty enemies, and brings societies down. CSAM is an actual evil, one of the most evil things on earth. Using that fact to prosecute people for drawings or offensive speech or thought (since there is no limit to what can be found offensive, even this conversation is criminal somewhere!) is really cynical and purely for power, if you think about it.


My main point was to note that, regardless of reason, we do seem to punish thoughts in some cases, so the last part of the statement I quoted doesn't quite match reality.

The second part of my comment was more about whether exposure to material such as this (material depicting children sexually) affects the desire for it, or the behavior of those who already desire it and/or have urges along those lines. I don't know the answer to that, and so am unwilling to make a blanket statement that it should or should not be allowed. There are plenty of things we restrict as a society through our government because we think they're bad for the whole. Sometimes there's little real reason for that, sometimes there is, and thus I'm on the fence until I know more.

My layman's understanding is that some aspects of sexuality are not well defined until puberty and what you're exposed to at that time, and then those are fairly fixed for life. If that's true, there are aspects to consider beyond an individual's desire, and a nuanced informed decision is worthwhile. It's possible the right answer here is not just "freedom of speech and expression" nor "protect the children", but a middle path with the most benefits and least drawbacks.


I'd bet that downvoting is a kneejerk response to the claim "porn is an addiction".


Anything can be an addiction. Hell, I got addicted to weed. For years it was fine, but then I hit some rough patches when the pandemic started and was blazed all day, every day. It was a temporary escape where I could still do my job, but beyond that I had zero motivation for months besides getting more weed.

Took months to kick the habit, completely.


Is it knee-jerk?

Alcoholism is pretty serious, but you wouldn't say "beer is an addiction". Same with video game addiction and many other things.

I find it a pretty prudish generalization: it's a catchphrase for people who get offended by sex.


Developers like pretending technology is always good and/or inevitable; I think this sort of question is uncomfortable for many folks.


We detached this subthread from https://news.ycombinator.com/item?id=32573719.


The GP comment made largely unsupported fact claims, rather than ethical ones.


Ethics? Please. It's moral panic. The putatively ethical folks having a "harmless" academic discussion about which nice things everybody is or isn't allowed to have, according to them.


Now it's clear to me you're not being sarcastic. It's not about morals; I'm all for you having access to as much porn as you want, as long as it's legal. Whether that's healthy for your mind is an entirely different story.


I don't trust you concerned "we live in a society" types any further than I can throw yous.

(This is tongue in cheek. I'm not upset.)


Many countries already ban painted or rendered CP.

The US did that for a few years as well, until the Supreme Court struck it down: https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalit...

So as far as the output goes, there's already some kind of precedent in most countries. Or are you asking about the model that performs the generation?


What? That's about things that _aren't pornography_.

The US convicted someone for written obscenity as recently as 2013: https://www.techdirt.com/2013/05/20/court-finds-fantasy-stor...


The Miller test applies to all pornography, not just CP. If you look around, you can find federal cases about "transportation of obscene matters" for adult stuff, e.g.:

https://www.networkworld.com/article/2276921/porn-producer-s...


I’ll bite. The issue is it will take all of 5 seconds for an enterprising CP producer/admin/consumer to mix AI content into their stash and then claim they’re being persecuted when the feds come knocking.

It’s like lolicon, sure, you can sit there and pretend it’s a 2000 year old demon warrior… but she is still dressed, posed and drawn to mimic a small child.

Don’t get me wrong, it’s a conundrum, that’s for sure. I hate the idea of the thought police. I love free speech. But in the end, I believe any material, whether it is “art” or computer generated, should be barred if it contains images that can be construed as a child in a sexual act (ancient art gets a pass?). Lolicon does not stop pedophiles from being pedophiles. Nor would this. It would simply act as a gateway and a scapegoat imho.


There's prolly gonna be a big controversy when pedos start using it that way and proof is posted on mainstream websites. I've just discovered lexica.art and I love all these picture-generator AI tools, but stumbling upon some pictures of a young Emma Watson topless didn't seem okay, and reading the prompt for these images showed that the user didn't go for tags like "young" or "topless"; it's just Stable Diffusion randomly selecting these features, maybe because there's weird content about Emma Watson online on shady sites, maybe something else.

I'm not sure I want all of them to close access to these tools, stop providing open-source capabilities, or block specific keywords like Dall-E does, but it's the easy solution for providers. This is why we can't have nice things: there are so many interesting, funny, and beautiful things that aren't illegal or even controversial but that are blocked as side effects of Dall-E's restrictions.


This would be extremely controversial, but I remember The Economist mentioned porn for pedophiles as part of the solution (the premise being that it's a genetic condition that criminalization doesn't work very well against). AI generation would remove some of the moral issues.


That will definitely happen; I suspect the criminal offense will be adjusted toward making distribution illegal, but possession might be tolerated (in some societies) if an artificial origin can credibly be established.

It's very hard to predict how this will shake out. I personally doubt that looking at extreme/contraband porn is a good strategy for coping with highly illegal desires but obsessive behaviors aren't something that's easy to model reliably. Considering that weaponizing false claims of predatory behavior as a coercive social/political tactic is a big thing at the moment, you probably shouldn't hold your breath hoping for any sort of consensus to emerge in the short term.


Watching CP tends to reinforce paedophilic tendencies. Many folks seem to "common sense assume" that it's a safe outlet and it would be nice if that were true, but it's not.


The problem is that you can find common-sense arguments both ways. It's just civilizational inadequacy that we aren't capable of working toward objective arguments here, because it doesn't seem like a particularly difficult problem. But yet again, politics is the mind-killer.


I hear this repeated verbatim a lot as if it were objectively true, along with the idea that watching porn in general is psychologically unhealthy, but I have yet to see any legitimate peer reviewed studies to corroborate either of these positions.


Unintentional The Exorcist Rule 34?

NSFW, obviously: https://pornpen.ai/view/27ylon4lGNpkfA4iJKJB


Made me think of this (also NSFW, mild though): https://www.oglaf.com/chauncey/


Man, I remember when I first discovered Oglaf. What a weeeeeird comic, loved it.


> also NSFW, mild though

It either is or isn't.


Safety is not described by just a boolean value.


Take a look at this one (also NSFW) https://pornpen.ai/view/CBHr5AlP8AFAZ56Jkp7i


Made me laugh. Showed my wife and she said, stone faced, "isn't this what you would want?"


What's going to be really wild is how the availability of convincing technology like this will seep into expectations that women think men have, and how they modify their own bodies to adapt. This has already happened, even with today's limited production capacity using human models/actors.

What happens when literally any crazy fetish can be explored infinitely?

Based on the images linked here (server has melted down), this stuff is not very convincing - yet. But when it gets there and literally any image can be called up, with endless variation, on demand, that's when things will get strange.

> I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated.

Eventually, this technology will be available to anyone with a few bucks to spend. The software won't be run on a server but on local machines. Those restrictions will not last. It will no doubt cause a crisis of sorts as the lawyers and politicians try to make antiquated concepts of decency and harm apply to hyper-realistic material generated algorithmically and without the direct involvement of any person.


I don't know. I mean, it's been really fashionable to say stuff like this for at least a couple of decades, but in the real world I see things playing out in pretty much the opposite direction. Guys are still "yey, boobies!" no matter the shape and size, most of my male friends would like more serious and traditional relationships, and the only type of women that get rejected are the grossly overweight.

The only thing porn is desensitizing us to is... milder kinds of porn, really. Relationships get to live in a completely different mental box. I mean, think about the world around you - do you know of any real-life couples that split up because they had regular, problem-free sex... but it wasn't kinky enough? If anything, I've only ever heard women complain about bland sex. Men are still happy to have it, check that box, and judge relationship satisfaction on the rest of the metrics.


It's almost like many adult males are capable of distinguishing between the pretty picture created for entertainment purposes, and the reality of human relationships. Who could have thought?

A lot of people also behave like pictures of naked women (or the availability of actual naked women, for viewing and more, provided appropriate payment is made) are a new thing that human males only got access to with the invention of the internet. It's not that new; it's just another facet of the same old thing, and the same would happen: some small number of people would develop an addiction, a few more would be regular consumers, a real lot of people would be occasional consumers, and otherwise not a lot changes.


People who see those mental boxes as sharing the same space also struggle to understand kinky asexual people. It seems like my ace friends commission more porn than allo[0] friends.

[0] https://lgbtqia.fandom.com/wiki/Allosexual


>What's going to be really wild is how the availability of convincing technology like this will seep into expectations that women think men have, and how they modify their own bodies to adapt.

US waist sizes tell us women think men find walruses attractive: https://www.cdc.gov/nchs/products/databriefs/db360.htm

The idea that even a minority of women do _anything_ to attract men is quite wrong when you look at actual behavior rather than daytime television. The converse is also true. I find it astonishing that a country which is 85%+ fat has somehow convinced itself that it has a problem of unrealistic body expectations. We're closer to the world represented in Wall-E than that of Playboy.


> The idea that even a minority of women do _anything_ to attract men is quite wrong when you look at actual behavior rather than daytime television. The converse is also true. I find it astonishing that a country which is 85%+ fat has somehow convinced itself that it has a problem of unrealistic body expectations. We're closer to the world represented in Wall-E than that of Playboy.

Best post thus far, well done.


Is it fair to include people in relationships in these statistics then? It’s not uncommon for people to “let themselves go” once they are in a relationship.


I always thought that if something's bothering you, it's much easier to change yourself in order not to be bothered anymore than it is to change the world so that the thing bothering you doesn't exist.

It's easier to be careful with where and when I go somewhere than it is to create a world where I won't get mugged/assaulted. It's easier to be content with the life I live than it is to become rich. And so on...


To be clear, I'm not making any statements about the creator; I imagine this is a reflection of the dataset. But the fact that there are so many categories for east Asians (Chinese, Japanese, Korean, and I'd hazard that the "Asian" tag would mostly generate east Asians) says volumes about the sexualizing gaze cast on Asian women. While there's certainly an ethical argument over such an algorithm (both for and against) with regard to sex workers and exploitation, it seems to me that the granularity of choice, and which choices are available to the consumer, also may require additional consideration, lest it reinforce negative stereotypes, introduce new ones, or allow harmful attitudes to be stimulated.

I'm not trying to be sanctimonious or overly moralizing, but half the tags I would associate with ethnicity are east Asian. And while I lean towards treating it as a mere curiosity, I think it's worth discussing as having unintended consequences and interesting implications at the very least.


Blonde, brunette, ginger, and white all refer to Europeans, who represent fewer people than the Chinese tag alone, so it doesn't seem like it fetishizes Asians more than whites.

The strange part is why we don't have equally rich categories for Africans. Native Americans not having their own categories makes sense since there are so few of them left today, so Africans are the only large group left out.


I thought about that while writing my comment, but while hair color correlates pretty well with ethnicity, the level of granularity given to Asians still seems somewhat unique. There is, for instance, no distinction between French and Italian, nor English and Scottish, categories. Blonde and ginger do have fairly ethnic implications, but not many people would answer with their hair color if asked where their family originates. That, and the fact that these hair colors are not restricted to white people, due to multi-ethnicity being a thing as well as commercial hair dye, made me drop them from my "half the ethnic labels" comment (though I will concede that the image in most people's heads when hearing any of those terms would be a white person).

This is also coming from a perspective where I'm assuming that the audience for this algorithm (outside of academic consideration) would be westerners where white people tend to be the majority. In that case fetishization generally would only apply to racial minorities, due to the fact that the term seems to imply a sexually gratifying quality to the "uncommon". I add that, speaking as an Asian American, that it's not necessarily all that easy for most people to tell Chinese from Japanese from Koreans, whereas it's basically tautologically given that a blonde can be differentiated from a brunette since the terms are more strictly aesthetic rather than geographic.

On your point about Africans, I would extend it to both Latina and the fact that "Indian" seems to be a catch-all for any South Asians. Middle Easterners are also not represented (though this doesn't strike me as a thing where racial equality is necessarily desirable). However, this is where I think the data set is mostly responsible.


I think they took the tags from some uploader website. CJK ethnicities can be inherently identified by the type of source (smuggled amateur films, commercially made, etc.) as well as by style, when appearing on those sites.


That's my guess too, but my argument is less that this arises from a specific place of malice or intention, and more that the phenomenon comes from an uncritical approach to handling the dataset. There is, for instance, no reason not to group all the CJK data as "Asian" for the sake of similarity with the rest of the ethnicities. But in uncritically reflecting the dataset, it introduces downstream effects that could be amplified in later work.


> but the fact that there are so many categories for east Asian... says volumes about the sexualizing gaze cast on Asian women

I think you're reading too much intent into 4 labels. From an American perspective, Japanese was the original--commercial, censored--"Asian" porn, "Chinese" is typically amateur uncensored, etc. I... won't elaborate further here, but there are stylistic differences that have nothing to do with race.


The tags should reflect that if it's really about the niche and not the ethnicity.


You raise some very interesting points. For me this also demonstrates how deeply rooted historical bias and prejudice bleed through to this technology and could become a perpetuating factor. These models will reflect societal values, for better or worse.

But what will the next model, and successive models trained on data from AI-generated images, be like? It seems you could get an amplified or skewed model that is pretty far from reality. At least currently they are all trained on "real" images (granted, some are heavily edited or photoshopped). I can imagine that in the (not too distant) future most images will be generated and not "real".


Yeah, I wanted to stress that the categorizations seem more to me to be symptomatic than a reflection of OP's decision making. I loathe the label "problematic", but it seemed worth noting a potentially unintended consequence that could introduce some harms.

That said, it might be too early to speculate on models trained on the outputs of other models. This algorithm seems to benefit greatly from the post-DALLE 2 art-gen renaissance, but still has a number of oddities and uncanny outputs (noted in the rest of this thread) that I imagine are nearly unavoidable given how familiar the human form is. But certainly, if such decisions end up amplified in later work, or through widespread usage, then the consequences are more significant than, say, simply reflecting uncritically a skewed data set.


This is one of the most interesting comments here. I notice a lot of biases in image-generation AI; it's so subtle it's almost at an ideological level. The style of art it creates, the color and dress of the people it generates - there are little things that make you wonder "why this instead of something else?"


Keep in mind this AI is biased towards white men's interests which means it heavily fetishizes and degrades Asian women.

Even searching for other races shows predominantly Asian women in the results; it's quite disturbing.


That didn't take long. :p Stable Diffusion was released just yesterday.

How did you design the backend? What kind of server are you using?


You can design the backend with the big ass and small ass tags.


This is the greatest comment on HN I have ever witnessed.


People doing porn can now truly claim they are doing "science" by providing valuable, insightful data samples :)

It shouldn't be surprising to see quick future advances in this domain, as most porn material is freely available, easily scrapable, and carries an extremely low risk of copyright litigation.

IMHO, the real game-changer, as with normal art, will be some sort of reliable assistant tool to quickly generate various components. Finishing touches can then easily be done in Photoshop or similar software. Somewhere along the way we will probably also see results in 3D art and animation.


> and extremely low risk of copyright litigation

but for a different reason than now!

Sometimes porn fails a copyright infringement claim because it can be argued that the particular work doesn't satisfy Article 1, Section 8 of the US Constitution backing the copyright concept: "To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries;"

It's still unsettled case law because only troll law firms are challenging it on behalf of porn copyright holders, and they never appeal.



You have something against the ladies of Eroticon 6?


Nude bodies and fused body parts… these things are very disturbing to my mind as an adult. It scares me to think children could run into this kind of content online, and what it would do to their psyche. I have to admit that some of them made me laugh, but still, I fail to understand why in the 21st century we need to waste energy on these types of AI applications. It stops being interesting after the initial wow reaction. I hope this won't spawn a subculture and will instead be relegated to the curiosities of technology.


I can see strange subreddits dedicated to girls with their heads backwards or with three arms. If people can get turned on by the cartoons they grew up watching, I think that shows the brain is incredibly malleable when it comes to sexual desires


https://neuralblender.com uses the same algorithm but with an open prompt. No login required.

NSFW Example: https://s3.amazonaws.com/ai.protogenes/art/d917ae82-2236-11e...


They could make use of a post-generation face restoration model like Tencent's GFPGAN: https://replicate.com/tencentarc/gfpgan


also seems to be very fast (I waited like 5 seconds)


One positive outcome of this sort of thing is that revenge porn will become less effective as women will be able to plausibly assert that this tech was used to generate the images.


Yes, I think with all the existing deepfake tech we are already there.


If you're open to constructive criticism: why is there no category for pussy? The model also seems biased towards massive breasts as opposed to a more natural size and shape.


It's overfit to the training data.


Because Stable Diffusion has difficulty generating pussies.


Well I guess we’ve ticked the box on rule 34 for ML gents


> Well I guess we’ve ticked the box on rule 34 for ML gents

It was only a matter of time, but yeah, I thought the same. The internet is pretty predictable: when deepfake images first appeared, I figured it wouldn't be long before they were used to turn the Fappening images into videos, and that ML designed specifically for this would soon follow.

Porn has always driven much of tech, so it's not surprising either.


It is true, as they say, that the drug economy and the porn industry are at the forefront of adopting new tech.

I couldn't have imagined even in my dreams that this would be one of the use cases.

There's definitely a huge threat to the porn industry and its workers; at the same time, people may misuse this to damage the reputation of a real woman.

And as they say, if it takes minutes to generate an image, how long before it becomes a one- or two-minute video?


ML is good at blending averages together, and here we see the flaw of averages - no real person has "average" proportions. Reminds me of this article about the US Air Force https://www.thestar.com/news/insight/2016/01/16/when-us-air-...
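The linked article describes how an Air Force researcher measured thousands of pilots on ten body dimensions and found that essentially none were "average" on all of them. That result is easy to reproduce with a toy simulation; the population size, band width, and independence assumption below are illustrative choices, not the study's actual parameters:

```python
import random

random.seed(0)

N_PEOPLE = 4000   # simulated population (illustrative)
N_DIMS = 10       # body measurements per person, as in the USAF study

# Each measurement drawn independently from a standard normal distribution.
people = [[random.gauss(0, 1) for _ in range(N_DIMS)] for _ in range(N_PEOPLE)]

def near_average(person):
    # |x| < 0.39 sigma covers roughly the middle 30% of a normal
    # distribution, a generous definition of "average" per dimension.
    return all(abs(x) < 0.39 for x in person)

count = sum(near_average(p) for p in people)
print(f"{count} of {N_PEOPLE} people are 'average' on all {N_DIMS} dimensions")
# Roughly 0.3 ** 10 of the population qualifies, i.e. essentially nobody.
```

Real body measurements are correlated, which raises the count somewhat, but the qualitative point stands: a model that blends toward the average produces bodies that almost no actual person has.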


At least someone is doing something useful with ML.


What are you doing, step-AI?


"I'm gonna decompile your kernal!"


Brilliant.


Dear God, some of these things are horrifying, almost lovecraftian!


this one's gonna give me nightmares.

https://pornpen.ai/view/5fkq15n2JOmv3LoOkoWb

For those who don't want to click: two belly buttons, two nipples on one breast, dead black eyes, an older head on a younger body.


Imagine making a video game where a poorly trained 3D-model-generating AI creates the things that chase you. A whole new level of Phasmophobia-style game!


Girl with three hands looks really nice. https://pornpen.ai/view/uYWZG2t7YbYcum8dGo4z


These two might get along well: https://pornpen.ai/view/7Ls0XZkZ1k5buFv1jHrL



Any plans to generate images of men?



That doesn't make sense in early iterations, given how demand for this stuff skews.


The demand for females must be at least 100:1, even higher if you consider people who are willing to actually pay for nudes; it's only not more skewed because of gay guys.


FWIW roughly 5% of men are attracted to men, so that'd be 19:1. Still, yes, heavily skewed.


I wonder how many women are attracted to, or at least tolerant of, women or depictions of women. That's rarely discussed and often assumed nonexistent, but I think it exists in substantial volume.


> roughly 5% of men are attracted to men

Citation necessary.



Different statistic altogether, but ~17% of Gen Z self-reports as lgbt in Gallup surveys: https://www.washingtonpost.com/dc-md-va/2021/02/24/gen-z-lgb...


This is most likely an increase in "B" more than "LGT", though I expect all to increase given the reduction in stigma.

The vast majority of new cases of "B" are likely to be cultural, i.e. white women in high school/university: declared, but never expressed.


As a gen Z, I can tell you there's a considerable fraction of mostly girls who identify as bi or NB just because it's "trendy". TBH it's a problem, and it lowkey trivializes people who actually are those things. I think this is what happens when you emphasize an "LGBT community": young people seeking community will join for no other reason.


Perhaps of the menacing werewolf variety.


I tried it on neuralblender - did not quite work out: https://s3.amazonaws.com/ai.protogenes/art/bbba4208-2378-11e...


how about women with male parts?


Not sure what happened here (NSFW, (horror?) warning): https://s3.amazonaws.com/ai.protogenes/art/367f0f8c-2379-11e...


So you can't specify custom text, but you left the option to specify 3 breasts?

https://cdn.pornpen.ai/146136F0946B4772.jpeg


It also has some weird understanding of arm anatomy...

https://pornpen.ai/view/BRKQgdJG6vabTZk8eXDl



... can't ... unsee ...




Every person's fantasy.


Wow I'll never look at Machamp the same way again


I am DYING at these


It is the future. Have you seen Total Recall? If not, watch it; you'll understand.


same thinking, good call, upvoted


Sorry, sometimes it produces really strange results :P


This is just a quirk. Custom text could generate kiddy porn.


Wouldn’t that require feeding it that type of data?


The algorithms could relate 'barely 18' to a certain kind of body. And 'barely 18' is similar to 'teenager' which is similar to 'child'. Don't underestimate neural networks. Nobody knows what goes on inside it. https://xkcd.com/1838/


And child is similar to infant, and infant is similar to fetus, and fetus is similar to sperm, and sperm is similar to cell. Finally a ML way for simulating biology.



Three really is a crowd, isn't it?


People need to seriously grapple with the copyright implications. I don't know if (and don't imagine) porn producers are okay with their content, which is supposed to lead to paid memberships (OnlyFans and the like), being used like this.


Did you look more closely at the results of this generator? The generated images are really not suitable for the same purposes as the content you can find behind those paid memberships.


Some people sell nudes, so yes...


When I said "more closely", I meant it. Click on "search" and then click on individual thumbnails to see full images. I have yet to see one that doesn't have something misshapen in it. Eyes that remind me of Corinthian from Neil Gaiman's "Sandman", a surfeit of arms or breasts, buttocks in place of stomach, breasts that hang like slightly melted plastic, and so on, and so forth.


It's manna from heaven for the small number of people who are erotically stimulated by the uncanny valley ఠ_ఠ

Seriously though, it's very good for a tiny homebrew project that hasn't been running long or with huge resources. Just think about where this is gonna be in 3 years. Photorealistic interactive videogames are probably not far off. I read a while back that Second Life is now a hub for virtual sex experiences so I assume this sort of tech will converge, and do so rapidly.

