Hacker News

In the future, I hope we can use existing AI to create fake child porn images so these brave officers wouldn’t have to feed these motherfucking-sickshitbag-fuckers real child porn (although the police stated the pictures they used already exist elsewhere).

Bravo to all the people cracking down on human trafficking and protecting children from harm.




I hope they got signed releases from the grown-up victims whose non-consensual images were being posted. If not, then they are increasing the harm. If an adult who valued privacy had a sex video stolen and then distributed by the police, they would be livid. For it to happen to someone who was abused as a child is unconscionable. From that perspective, artificial might be preferable.


Out of interest, in what way is harm increased? Does the same harm occur in cases in which the image is only being viewed?


This is like some sort of awful Black Mirror episode. What if we could generate infinite "new" content without harming another human to do so?


I say do it. I'm already ok with Loli Hentai existing, although I find it revolting to look at. If artificial child porn indistinguishable from the real thing becomes available, it will devalue its production. Given the risks, it seems likely that far fewer children will be raped.


In Australia it would still be illegal, here's the link for NSW[1]. Maybe it should be reformed to match your idea. But there's a reason why the law is written to make things that are "implied to be" child abuse material to still be child abuse material.

[1]: http://www.austlii.edu.au/cgi-bin/viewdoc/au/legis/nsw/conso...


But what if it's artificial child porn that looks like your child? Do you really want to enable that?

Currently, in the USA it is clearly illegal to produce porn that resembles a real child, even if no child was exploited to produce it -- despite the fact that purely fictional (e.g., drawn) erotica featuring underage characters is still only of dubious legality. There are justifiable reasons for this.


It looks like it would be legal if it is distinguishable as being purely fictional, where what a reasonable person would believe is taken into account.

For those who do not want this type of search query to be in their history, have a link:

https://www.justice.gov/criminal-ceos/citizens-guide-us-fede...

If my reading is correct, images that are fictional and appear fictional to a reasonable person are not illegal - an inference I draw from a number of law courses, not something stated directly in the link. Drawings should be fine. CGI, maybe, depending on how easily it is distinguished as being fictional.


The PROTECT Act specifically had provisions against even drawings of any kind, however those parts to my knowledge were ruled unconstitutional. On the other hand, there are various state laws against the material; I don't have a better source than Wikipedia at the moment, but:

>Currently, such depictions are in a legal grey area due to parts of the PROTECT Act being ruled unconstitutional on a federal level; however, laws regulating lolicon and shotacon differs between states; several states have laws that explicitly prohibit cartoon pornography and similar depictions (such as video games in the state of New Jersey), while others usually have only vague laws on such content; in some states, such as California, such depictions specifically do not fall under state child pornography laws,[70] while the state of Utah explicitly bans it.[71]

From: https://en.wikipedia.org/wiki/Legal_status_of_drawn_pornogra...


I'm not qualified to speculate how well those laws will stand up to a challenge as a 1st Amendment issue. The SCOTUS tends to take a strongly protective view of the 1st.

My inexpert opinion is that those laws would be ruled unconstitutional, but that is absolutely not an expert opinion. While I have taken numerous legal courses and have a decent familiarity with the law, I am not a lawyer.

I will also add that not only am I not a lawyer, I'm not your lawyer and the above isn't legal advice. Consult a qualified legal representative, in the appropriate jurisdiction, if you need to.

That said, some areas have some pretty tough obscenity laws that are still on the books. When challenged as a free speech issue, they tend to fall - even though the law may still remain on the books and people still get prosecuted.

Florida has at least one such law, and they have lost in front of the SCOTUS but still use the law. More recently, it was used to charge the folks who make the overtly degrading porn - the actors do things like write on performers with lipstick, spit on them, etc. They used the law to charge them, even though people have previously appealed that law to the higher courts and won.

I'm not actually sure how they get away with that?


>But what if it's artificial child porn that looks like your child?

I don't understand the problem. Let's assume that no real images are used as input to the artificial child image creation process. So what if it looks like your child? That is mere coincidence, and doesn't seem to me to be reason why it should be illegal to possess or produce.

>There are justifiable reasons for this.

Can you please elaborate on what these are, or where I can read about what they are? So far, the reasons I have been given and those that I have seen used for example in England when laws were being created (the CAJA 2009 law) have been poor reasons and unjustifiable restriction of freedom of expression in my view.


Then we use that to prevent children from being raped at no cost to civil liberties, or risk for the innocent.

And if anyone has a moral problem with that, because it feels wrong, well fuck them.


That would actually still be illegal in Australia (at least, in all the states I've checked). For NSW in particular, the Crimes Act 1900 s91FB[1] states that material that "appears or are implied to be" child abuse material is legally defined to be child abuse material. The criterion is further expanded by saying that whether something qualifies is judged by "the standards of morality, decency and propriety generally accepted by reasonable adults" (among other restrictions). Though, as this article proves, Australian police are particularly well-known for being allowed to break laws for the purposes of an investigation.

Given that loli isn't legal in Australia, I doubt that photorealistic artificial child pornography would fare better.

[1]: http://www.austlii.edu.au/cgi-bin/viewdoc/au/legis/nsw/conso...


This idea raises an interesting question - what if all child porn in the future were artificially generated? Would it be illegal to consume or share that material?


Probably. Child porn is illegal today because consumption incentivizes production of even more material. If you could somehow guarantee that production would only come from artificial sources then there would be no reason for it to be illegal, but making that guarantee in the real world is unrealistic. Once you create demand for artificial content then someone is going to try and slip the real thing in, bringing us back to square one. Artificial depictions of child porn are already illegal in many jurisdictions for that reason.


"Once you create demand for artificial content then someone is going to try and slip the real thing in, bringing us back to square one."

That is a slippery slope fallacy. Would you apply the same logic to legal pornography and say, "Well, the actors just look too young, someone is going to try and slip in underage actors"? Or to non-visual descriptions, e.g. Nabokov? At what point do you think the line should not be expanded?

"Artificial depictions of child porn are already illegal in many jurisdictions for that reason."

I think the reason is a lot less rational. In the 80s and 90s we had a widespread moral panic about child molesters that resulted in thousands of innocent people being thrown in jail. We still see remnants of that panic. I think we have shifted from a rational motivation for protecting children to a moral motivation to jail pedophiles. People do not want pedophiles in their communities, and if a pedophile is able to avoid breaking the law, people demand a broadening of the law. The idea that a pedophile could avoid prosecution by satisfying himself with cartoons instead of recordings of child abuse led to the law being broadened to ensure that the pedophile is punished.


While consumption may incentivize production, reducing supply does not affect demand.

Why would you assume that people slipping the "real thing" would be a problem? We watch violent movies for entertainment, and some will enjoy /r/WatchPeopleDie, but the latter will always be niche, and intentional production of such clips is still illegal. I do not see the benefit of outlawing artificial depictions.


> Child porn is illegal today because consumption incentivizes production of even more material.

Does it? Does that happen with other kinds of not-paid-for porn? More importantly, shoving the entire thing underground has got to make it harder to find the abusers creating these images. Which effect has a bigger impact?


> If you could somehow guarantee that production would only come from artificial sources then there would be no reason for it to be illegal

Yet the majority of anime porn is illegal in the countries of the Commonwealth, even though it is not even close to looking realistic.


I think I read in the 90s that made-up child porn (such as drawings from imagination) had recently become illegal in the U.S. It doesn't seem like that could've survived a first-amendment challenge, but IANAL and I don't know what's the status now.


It would in the United States; the law here criminalizes possession of cartoon depictions of child sex abuse:

https://www.law.cornell.edu/uscode/text/18/1466A

There have been prosecutions in the past:

https://www.wired.com/2010/02/obscene-us-manga-collector-jai...

Yes, it is controversial, as the original reasoning for banning child abuse imagery was to protect the children who are harmed by its production. This is the result of a shift in thinking: while the original idea was that we should protect children, today it is a matter of morality and pedophiles are considered morally deficient. As a result, even if a pedophile has never harmed a child, the common notion is that they are a danger to the community and that we can never be too careful.


https://www.justice.gov/criminal-ceos/citizens-guide-us-fede...:

"Visual depictions include photographs, videos, digital or computer generated images indistinguishable from an actual minor, and images created, adapted, or modified, but appear to depict an identifiable, actual minor. "


I am glad there are laws like that. Unfortunately, the hardware and software keeps improving. That means that someday a person sitting at home will be able to produce all the realistic child porn they want, and the police will have no way to detect it.


Even more interesting: would it be moral? It seems to me it would be great if not only would the police not have to feed them, but they wouldn't even have to rape children to feed themselves.


But what if looking at AI-generated child porn inspires a pedophile to do something similar with a real child?


[flagged]


Please tell me you're joking, snakeanus. There is no way you could seriously be trying to make out paedos as the victims here?

> I fail to see how arresting people for a thoughtcrime is heroic.

Thoughtcrime? No, viewing or possessing CP is an actual crime.

> How can someone who has never raped a child but just views CP on an online forum be a "motherfucking-sickshitbag-fucker"? Who was the victim of this person?

Quite obviously, the child/children in the pictures they're viewing.

> Why do you feel the need to insult people who haven't done anything wrong?

They have done something that isn't just illegal, but unquestionably morally reprehensible. How could that ever be considered 'not wrong'?

> How do you exactly "protect children from harms" by catching people that simply view CP?

What are you asking here? How do you protect children from paedophiles by arresting paedophiles?


[flagged]


Ok this is ridiculous, only on HN could we see child pornography being defended as a victimless crime.

First of all, no child should be exploited in that way. Period. Not up for debate. Whether or not they feel victimised at the time by what has happened to them is irrelevant. So by viewing or obtaining CP, one is supporting and proliferating that material.

> I will have to question that.

Question all you want; I don't think you'll find any other sane, rational person who thinks watching videos of children being raped is totally fine.

> There is no victim nor is anybody harmed by possession nor viewing. Thus it is not wrong.

Here it is again 'CP isn't wrong'. Yes it is, I don't see how you can think it's not.

> Just like not every straight/homosexual person is a rapist, not every paedophile is a rapist.

Really? Every paedophile is a paedophile. It's not a kink, it's not a fetish, it's not a sexuality, it's a mental health problem and one that can potentially have horrific outcomes if the resultant behaviours are welcomed and encouraged.

I'm not responding any more after this, because frankly if you think being a paedophile is totally fine then you need to get help.


We've asked you before not to make things up about HN to score points in an argument. This one was absurd, and a nasty thing to say about a community you belong to.

You've broken the HN guidelines here by resorting to personal attack. You've been breaking them elsewhere too. Someone else being wrong doesn't give you license to break the rules and make this place even worse, so please read https://news.ycombinator.com/newsguidelines.html and clean up your act.


Apologies, dang.

This particular issue is just a bit of a sore spot for me because I have a friend and a family member who both suffered sexual abuse as young children. Doesn't excuse my comments though so again, sorry about that - I'll make more of an effort to be less abrasive and confrontational in future.


> We've asked you before not to make things up about HN to score points in an argument. This one was absurd, and a nasty thing to say about a community you belong to.

Hang on - people routinely defend creation, possession, and distribution of images of child sexual abuse on HN, especially composite images. I see it here far more often than on other forums.


> only on HN could we see child pornography being defended as a victimless crime.

No, not really. I have seen it being defended in multiple places over the years. Mostly places where there are many STEM people who can't bear the thought of certain numbers being illegal.

> First of all, no child should be exploited in that way. Period. Not up for debate.

Exploited in what way? Remember, even a 17-year-old taking a picture to show to his/her gf/bf is considered CP in every country that I know of. Who is exploited in this case?

> Whether or not they feel victimised at the time by what has happened to them is irrelevant.

If they are a young child then I will agree. If they are a teen, then no. I fail to see how this is relevant to the discussion, however. We are not talking about the production of CP; we are talking about the consumption and distribution of it.

> So by viewing or obtaining CP, one is supporting and proliferating that material.

How did you deduce that from the previous sentence exactly? In any case, I fail to see how "by viewing or obtaining CP, one is supporting and proliferating that material".

> Here it is again 'CP isn't wrong'. Yes it is, I don't see how you can think it's not.

Viewing or distributing CP isn't wrong because nobody is harmed. I don't see how you can think that it is wrong.

> Every paedophile is a paedophile

Sure? Just like every straight person is straight. That being said, not every paedophile is a rapist, just as not every straight person is.

> it's a mental health problem

They said the same about homosexuality a while ago. Look what happened to Turing, for example: a great mind driven to his death because he did things that, even though they harmed nobody, offended some people in power who considered them gross.

> because frankly if you think being a paedophile is totally fine then you need to get help.

If you think that we should arrest people who have done no harm to another human being, directly or indirectly, then you are a monster and you should get help.


>Here it is again 'CP isn't wrong'. Yes it is, I don't see how you can think it's not.

You haven't defined any criteria for 'wrong'; the person you are replying to is taking issue with the fact that the harm principle (a commonly accepted ideal of what is 'wrong' in individualist society) does not apply to any given instance of viewing or possessing child pornography. You have to define your criteria for 'wrong' rather than simply assert it. Only by defining criteria can you be refuted in any meaningful sense, rather than a rally of "it's wrong" and "no it isn't wrong".

So if you say that it is wrong, please back up your meta-ethical position.

>It's not a kink, it's not a fetish, it's not a sexuality

Pedophilia is a fetish; it may be other things too, but it is a fetish for children. This is even in the 'plain' sense of the word fetish, which is to concentrate on a particular aspect above all else. The pedophile, rather than fetishising, say, "normal" features like breasts, fetishises extreme youth.

>being a paedophile is totally fine then you need to get help

Nowhere did the poster say this.


We've banned both your accounts in this thread. That kind of sockpuppetry is an obvious abuse.


Agreed with the crackdown, but not with the "brave" methods. Someone is getting paid to not only look at but distribute this all day, every day. We need to take measures to demolish government institutions that post and distribute this content under the guise of "law enforcement." I read far more articles about police-run sites and forums being shut down than I read about wild sites being taken down.

This is no better than the "brave" arms smuggling operations of the ATF.



