I've spent more than two decades now browsing the web, including my share of porn, but have yet to come across an unambiguous child porn or child abuse image. I don't doubt that many of you have. But I do wonder if it's a significant enough problem to justify the level of surveillance that would be required to wipe it out.
"Thousands of gigabytes" is consistent with a single cheap hard drive. If that's the threshold for alarm then the alarm is permanent.
For a while you could sit on 4chan.org/b/ hitting refresh every few seconds, and within a fairly short window you would inevitably see someone sharing abuse images. Typically it was just someone being a dickhead who thought it was amusing to post it, but sometimes you'd see someone post a whole album, and I assume they were using 4chan as a (temporary) distribution point.
It's worth noting that the mods on 4chan have always been shit-hot at removing this content, but /b/ used to get just _so_ damn busy that it would take a while for it to be spotted. Like really busy. Multiple-pages-of-new-posts-every-second busy.
The point is that they're pretty easily findable if you're looking for them.
And stolen or extorted images of late teens, in particular, are all over "amateur" porn sites even if you're not looking for them.
> "Thousands of gigabytes" is consistent with a single cheap hard drive. If that's the threshold for alarm then the alarm is permanent.
The concern ("alarm" if you will) is not based on the count in gigabytes (which is of course trivial), but in the harm caused to real, actual people. As explained in the article.
Was it explained? I saw plenty of discussion of the harm done to children by the people who abuse them, and the lasting emotional problems carried into adulthood. Then there were a few words from one of those victims, who said that it is upsetting to know that images of his abuse are still circulated and consumed. It seems like a stretch to say that the existence of those images causes harm; at most it would seem that the possibility that such images exist complicates things for victims, but that the actual harm was and has always been caused by the actual abuse that occurred.
The fact is that nobody can ever know with certainty that child abuse imagery has been thoroughly deleted from every storage medium on the planet; there are just too many computers in the world, too many ways for the imagery to be stored, transmitted, and shared. Victims will always have the possibility that someone out there is viewing such images hanging over their heads, no matter how many forums are taken down and no matter how many file sharing services remove files. It is one of the darker realities of the Internet age.
> It seems like a stretch to say that the existence of those images causes harm;
Not a stretch at all; I'd recommend you do some reading into the stories of the many, many people whose lives have been permanently upended (or worse - there have been many suicides) after finding leaked images of their abuse (or simply revenge-dumped images of consensual relations) posted on easily searchable websites.
As to the rest of what you're saying -- "There are too many computers, we'll never really be able to completely stop this" -- look, it's a matter of degree and ease of access. Just because we can't bring the effective visibility of these archives down to zero doesn't mean we can't reduce it very substantially, by 90 percent or more from the current level.
My point is that victims will never actually have the peace of mind to know that imagery of their abuse is inaccessible. Once the pictures/videos have been taken the harm has already occurred and the victims will live out their lives knowing that "somewhere" "someone" is consuming those images -- even if they never see evidence of it. The harm is caused by the possibility of such sharing.
Either there is less of it, or, more likely, there is just so much other stuff now,
and the pedos have evolved to better hide it.
There is a lot of really sick shit out there.
In the simpler times of a much smaller web,
pedos were not very cautious and you could come across it
whilst just looking for regular pr0n.
From what I read you can still find some with a tiny bit of work,
but most of it exists in closed forums and other methods of private
sharing.
A somewhat similar quandary relates to a Norwegian black metal band
(some say they started the whole thing, others not).
They were a troubled bunch of men.
There is a movie about them (Lords of Chaos).
Anyways skipping over most of it, the singer in Mayhem, the first one,
went by the nick Dead; his real name was Per Yngve Ohlin (from Sweden).
He blew his brains out quite literally with a shotgun. (1991)
When band member Euronymous (Øystein Aarseth) discovered Dead, quite dead and with a big mess, he ran (or drove) to a nearby gas station and bought some disposable cameras.
He then documented the scene with a series of pictures.
(Allegedly he also collected skull fragments)
These pictures are gruesome.
The family of Per Yngve Ohlin (Dead) has attempted and struggled to have
the images of their dead family member removed from the net.
They have been unable to do so.
It always pops back up.
(One of the photos was used for a bootleg (I think) release as the album cover)
This is such a small, limited event, but getting rid of them is near impossible.
The scope of getting rid of all horrible photos on the net is vast and, I think, nearly impossible today.
I imagine the laws and infrastructure changes required to remove offensive / illegal / violent / gore material from the Internet would be so pervasive that they would destroy what we have today.
We used to run a dating site where people could exchange pics. We had to manually check every submission. Loads of cp, we had a direct link with the police to report them.
I think nowadays you could find an explicit image scanner and flag a near-100% match automatically, right? I wonder if today you still have to rely on reporting.
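For what it's worth, the "near-100% match" idea is usually done with perceptual hashing rather than exact file hashes, so that re-encoded or slightly edited copies still match a known-bad fingerprint. Here's a toy sketch of an "average hash" (the images here are just hypothetical grayscale pixel grids; a real scanner would decode actual image files and use a much more robust fingerprint, e.g. PhotoDNA):

```python
# Toy perceptual ("average") hash: downscale, then 1 bit per cell.
# Near-duplicate images land within a small Hamming distance of each
# other; unrelated images land far away. Purely illustrative.

def average_hash(pixels, size=8):
    """Downscale a grayscale grid to size x size by block averaging,
    then emit 1 bit per cell: 1 if the cell is brighter than the mean."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return sum(1 << i for i, v in enumerate(cells) if v > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A 16x16 "image": bright top half, dark bottom half.
img = [[200] * 16 for _ in range(8)] + [[20] * 16 for _ in range(8)]
# A slightly brightened copy still matches closely.
near = [[min(255, p + 10) for p in row] for row in img]
# An unrelated image (bright left half) hashes far away.
other = [[200] * 8 + [20] * 8 for _ in range(16)]

assert hamming(average_hash(img), average_hash(near)) <= 4
assert hamming(average_hash(img), average_hash(other)) > 16
```

A service would compare each upload's hash against a database of fingerprints and escalate anything within a small distance threshold for human review, since perceptual hashes do produce occasional false positives.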
In the early 2000s I was checking out various Gnutella clients as possible Napster replacements, and when I wanted to watch a movie, I would download a bunch of files named <movie title>.mp4 or <movie title>.mpeg. I say "a bunch" because only a few would be a watchable version of the movie. A lot of them, especially for new, popular movies, were actually zip files full of porn. Usually these downloads functioned as ads for (presumably legitimate) porn sites, with an included text file telling you where to go to get more of the same, but a few had no attribution and contained things I would rather not have seen. Each time it happened I got a little bit more scared until I finally took a sledgehammer to the hard drive and never installed Gnutella or LimeWire again.
Reading the article, I guess it's still common for such content to be made available online with no security other than a bland and misleading filename. Seems like low-hanging fruit for harm reduction, if it can be done in a way that respects civil liberties. When you say something as absolute as "wiping it out" of course that suggests draconian, intrusive measures, but it would be reasonable for a service provider like Dropbox to be able to detect unencrypted child porn shared openly to the public via their service. Even if the only result is to force predators to use encryption and basic access control, that could have a disproportionate effect on how widely individual images are distributed.
How about ambiguous cp? Seems like Twitter, Instagram, YouTube and other social media sites are filled with people sharing this stuff. And if you report it, often nothing gets done.
I'm guessing what they mean by that is certain kinds of hentai I shall not name here, as well as underage models.
Those are things I inadvertently ran into over a decade ago, but these days that virtually never happens, especially in terms of the latter.
And no, I didn't search for those things, so don't make jokes.
The characterization that mainstream social media platforms are "filled" with such material seems disingenuous, unless I've been living on a completely different planet. I'm sure it exists but I believe that you'd have to actively seek it out.
I'm talking about the underage models. It's a bit different from kids self posting content, that then attracts creeps vs the underage models that have an entire organization/staff of adults that are taking photos and selling them.
I'm not on social media very much at all, but- my partner used to spend quite a bit of effort reporting Instagram accounts advertising this stuff. Reports were routinely rejected, and she eventually burned out on it. Meanwhile, misinformation is very actively and preemptively policed.
Yes, you'd probably have to seek it out. But, what was troubling is how it's just routinely ignored, despite reporting.
At least one so-called "grey area" involves teenagers wearing revealing outfits or posing in potentially sexual ways. As far as I understand it most police forces do not waste their time with such imagery, even though it may technically be illegal, because there is just so much unambiguous child abuse imagery to deal with -- teenage girls taking pictures of themselves is a lot less alarming than a 40 year old filming himself abusing a 3 year old.
Since age is continuous (at the scales that matter here), as are "situations" and people's wellbeing, it does seem quite blurry to me. A twelve-year-old being forced to dance, for example, would fit both categories you mention.
I've trawled around the dark web (.onion) before for quite a while and have never landed on any CP. I did see a link once to something that looked like it could have been CP but of course didn't follow it. Point is that even in onion-land it seems that one can't easily find CP unless one is actively attempting to search it out. It's not something you are likely to be ambushed by.
I explored on Tor about 15 years ago, by just finding indexers and whatnot from the regular net. I couldn't avoid links to CP everywhere. For every "innocent" link I saw, there must have been multiple links claiming to be CP stashes. I'd say the only thing comparable was the number of "hitman; pay half first, half after" links.
My experience was the same and I quickly gave up on Tor for that reason. I occasionally toyed with it as a clearnet proxy but hidden services made me really pessimistic about its mission.
Fortunately it does seem like a lot of that has been expunged. I recently revisited hidden services, as well as the Hidden Wiki (which I'm almost certain had CP links a long time ago but my memory is cloudy), and it seems like that filth isn't tolerated anymore on most indexes I found.
I think it's safe to assume that they were fed honeypots.
And you know something, I don't know if I'm even opposed to that.
The idea that the government must have a ton of CP stashed somewhere for such purposes feels about as disgusting as disgusting gets, but I'm not sure how many of those monsters you can catch without placing bait.
1. Honeypots are controversial and practically invite an entrapment defense. Better to have police infiltrate forums, identify people, and give prosecutors unambiguous evidence.
2. Calling people who view these images "monsters" contributes to a stigma that makes people hesitant to seek help before their urges lead them to abuse children.
2.5. If they are "monsters" and someone you know and love is not a "monster," what do you do when you discover that your loved one is one of those people? We go around asking why people look the other way; the way we reflexively dehumanize pedophiles is one of the reasons. We need to recognize pedophiles as human beings who have serious and potentially dangerous mental health problems, and instead of calling them "monsters" we need to reorient ourselves to say "let's get this person help before they do something monstrous." Obviously a person who is actively abusing children needs to be arrested, but let's not forget that we are talking about people who actively consume imagery of children being abused, many of whom are not necessarily abusing children (yet).
I have no idea why you’d be opposed to the idea, yet I’m fairly certain they are not honeypots. I’m not even sure if that would be legal. Besides, if these sites had the ability to identify their users, we’d hear about the prosecutions. Or, if they were run by actual criminals, we’d hear of shakedowns of users of these sites?
> Or, if they were run by actual criminals, we’d hear of shakedowns of users of these sites?
Why? If someone is using those sites, they are extremely unlikely to tell anyone they're being shaken down because doing so requires admitting that they have used those sites. I would imagine that that group of people is a lot more likely to stay quiet and pay up than many other people.
That aside, I'm not entirely sure how people using those Tor sites would even be shaken down given Tor's anonymity. Unless, of course, they paid with a CC and not a form of digital currency.
We do hear about prosecutions, and the sites are run by criminals (as operating such sites is a criminal act). The police are better resourced and better able to identify users than almost all the forum operators, and in any case I doubt the majority of forum operators are interested in shaking down their users.
No idea, I kept away from them as best I could, but I did see comments (on the indexers that, I guess, were there for "rating" the links) lauding most of them as such.
Honestly, it reminded me a lot of the early-90s www - that is, anything and everything just couldn't be trusted unless you knew the source outside of the web, too (like you knew the authors personally, or you had confirmation it was a university site, etc.), and that put me off quite a lot. I really didn't fancy the idea of even accidentally opening the wrong link and seeing these images, so I basically stopped using Tor altogether and haven't been back since.
I could not tell what was real there. I'd find a wiki that had links, but it was always a "is this a honeypot or is it real" kind of thing. I don't know what I was looking for -- some hidden secret, something cool.
I know a child who was repeatedly raped on video, and the perp kept on boasting to the girl about how he was trading the videos for stuff.
He was eventually charged, and it was really really tough making a case against him. He argued that the stuff the girl said was implanted by the therapists, that her testimony was tainted, that he would counter-sue, etc. (He got one year!! which was more than I expected, and will soon be out to terrorize her further.)
I have no doubt that there are many government services that are busy curating all this type of stuff [and I don't doubt they are mostly staffed by sickos, but that is irrelevant for this post].
I dreamed that there might be some way I could upload an image of a victim and be told whether there is such a video in the DB - and then, with 100 proofs of identity, only she or her guardian would be able to access it. That would at the very least make it easier to convict such perpetrators, and in cases where there are multiple kids in the videos or the videos are part of a group, it would help find the other victims and perhaps stop a lot of wickedness.
I reached out to one British agency, and they told me that they have a collection of videos, that its existence is murky legally, and that they cannot help me search within it.
But maybe someone somewhere knows how this could be done?
It’s not really a tech problem? They obviously have people that are allowed to look at the stuff, so they could look for evidence themselves. The total number of victims should also be quite manageable, especially if you know the timeframe, location, camera etc to narrow it down? From what the yandex image search can do, I believe a simple photo of the alleged scene of the crime might be enough.
The problem, above, is that it's the wrong person "investigating" this. You won't get much support even trying to find your stolen bike. A vigilante CP investigator sets off every alarm bell there is.
> The problem, above, is that it’s the wrong person “investigating” this.
I don't follow. If there was a web interface, and I could upload an image of myself, a copy of my passport, my phone statement (that proves my phone number is linked to me) and estimated time slot and I dunno what else. And I get back a link to my phone to download the video in which _I am featured_.
Why is that an issue? Why can't that be setup on a official government website?
Extending that to a case where the legal guardian is requesting on behalf of the child doesn't add anything novel, and doesn't imply vigilantism.
I really like the new language of calling it child abuse imagery. It's not porn, it's abuse. We as technologists need to do a better job of building the moderation tools that help platforms act on abuse reports. At YouTube a staggering quantity of abusive videos were uploaded. The content moderation team had mandatory therapy, were provided with powerful tooling to glance at thumbnails without watching full videos, and still people burnt out so fast that 3-month contracts were the norm. There was a dedicated FBI liaison whose job it was to collect evidence and report it. (Funny note: it was technically illegal for him to possess the evidence while collecting it to report it. This became interesting due to mandatory record retention during an unrelated case.)
Better tooling allows platforms to quickly pull and report abusive content. We should all take this need quite seriously. I’d love to see some open source tooling around content moderation geared at stamping out abuse.
Is there a free API for small SaaS products to use to check user uploaded images for CP and other violent imagery? Seems like such a tool should be a funded public service. I think AWS has something but it's likely paid and I don't use them. I'm pretty sure these tools are only available to companies that have the budget for it. Imagine open source projects like Wordpress and Mastodon having the option to toggle on such CP scanning of user uploaded content that would use the API?
PhotoDNA, but it requires a suite of lawyers, since the material it uses for fingerprinting is high-res enough to be identifiable again, or to create false positives.
The article describes schemes where perpetrators share images via file hosting services by encrypting and sharing symmetric keys out of band. There is no way you can possibly identify the file content if they're doing that. The only way is for law enforcement to discover the nature of the photos by other means and then tell you which files are actually CSAM, which works perfectly well until somebody they haven't caught yet who still has local copies uploads again from a different Tor entrance using a different encryption key.
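To make the point concrete, here's a minimal sketch of why the host is blind in that scheme: the server only ever sees ciphertext, and every out-of-band key produces a different ciphertext, hence a different hash that can never appear on any blocklist. (The XOR keystream below is a stand-in for a real cipher, purely for illustration.)

```python
# Illustration: client-side encryption with out-of-band keys defeats
# server-side hash matching. Same plaintext, two keys -> two ciphertexts,
# neither matching the known-bad fingerprint list.
import hashlib

def keystream_encrypt(data: bytes, key: bytes) -> bytes:
    # Derive a pseudo-random stream from the key and XOR it in.
    # XOR is its own inverse, so the same call also decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

plaintext = b"the same underlying file" * 100
blocklist = {hashlib.sha256(plaintext).hexdigest()}  # known-bad fingerprints

upload_a = keystream_encrypt(plaintext, key=b"key-shared-out-of-band-1")
upload_b = keystream_encrypt(plaintext, key=b"key-shared-out-of-band-2")

# Neither upload matches the blocklist; they don't even match each other.
assert hashlib.sha256(upload_a).hexdigest() not in blocklist
assert hashlib.sha256(upload_b).hexdigest() not in blocklist
assert upload_a != upload_b
# A recipient holding the key recovers the original bit-for-bit.
assert keystream_encrypt(upload_a, key=b"key-shared-out-of-band-1") == plaintext
```

Even perceptual hashes are useless here, since good ciphertext is indistinguishable from random noise; hence the point above that only out-of-band discovery by law enforcement identifies the files.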
Hosting encrypted zips en masse would also seem to be slightly shady? Sure, there are legitimate uses for that. But if it’s 90% of what your customers share on your platform, you might want to consider that it’s also legal not to host that type of file.
Most of those file sharing platforms get a lot of business from copyright infringers. Keeping files password-protected allows the platform to respond to takedown requests but deal with a minimal number of takedowns. The platform doesn't need to tell its users to do things that way; it's just a natural outcome of incentives. Uploaders don't want their uploads removed. I remember downloading encrypted .rar files to get expensive textbooks even back in the early days of RapidShare. The sites are by their nature pretty sketchy but usually for less heinous reasons than those under discussion here.
As horrifying as it sounds, I wonder if we are not better off having child pornography around. I expect it would result in a diminution of child abuse. (That said, I don't think there has ever been a study on the subject.)
It would need to be AI-generated deepfake imagery of course (I can't see any justification for using real imagery of minors). From a harm minimization standpoint, it's worth investigating at least (and I do believe I've heard the proposal somewhere, already).
However, it would require that we, as a society, do something we don't yet seem ready to do: accept those folks with truly unsuppressable desires of this sort as people who deserve to be treated humanely and with compassion, if at all possible.
I don't know... The article itself mentions something related:
> "These forums can contribute to the lowering of inhibitions. [...] When these people interact in a community of others like them, a distortion of perceptions may result".
So maybe having people share those pictures in an uncontrolled community environment does even more damage than the mere sharing and re-sharing of the images itself. And it definitely matches the experiences we have with other echo chambers on the internet.
None of those studies demonstrate a causal link between the rise of internet pornography and the statistical drop in sexual assaults - simply that there appears to be a correlation.
The abstract misrepresented the findings. The study didn't examine exposure to pornography. It examined exposure to what they called extreme pornography. Simulated rape for example.
The conclusion asserted consumption of extreme pornography leads to sexual aggression. The methods didn't support causation from what I saw though.
There is no such thing as child pornography, only child sexual abuse images. Hammer this into your head, children cannot consent, and pornography is a legitimate form of adult entertainment.
If only I could be sure that all illegal images were also abusive images. Two seventeen year olds having consensual sex is not abusive and is legal in Washington state. Two seventeen year olds sharing sexually explicit images of themselves with each other ("sexting") is not abusive either but is not legal anywhere in the United States. Over-zealous officials sometimes threaten teens with crimes for making images of themselves. As long as the law criminalizes consensual images produced by randy teens in addition to abuse of children, this ambiguity will persist.
There are plenty of forms of pornography that do not require a literal subject to pose for picture or video.
While I would be strongly against allowing distribution of real people, I find it much harder to argue against material that is generated. (whether that's art/CGI/cartoons/thispersonisnotreal.com/etc)
Do the topless pictures of Samantha Fox on page three of the British tabloid The Sun from the '80s count as child pornography now? She was sweet sixteen then.
You can tell the argument against cryptography from "but child porn!" is in bad faith because child abuse of this sort is systematically under-investigated, under-prosecuted, and under-sentenced. If they really cared it would be a priority in terms of both investigation and sentencing, but far more is spent chasing drugs and sentences for small-mid size drug offenses are often much longer.
Get caught with a big stash of meth? Go away for 10+ years. Get caught filming child abuse? Get a few years and some psych counseling, if that.
At least in the US, the psych system for such offenders is anything but "cushy"... Also inmates tend to absolutely _hate_ child offenders, so good luck in there.
Could you please stop posting unsubstantive and/or flamebait comments to HN? You've been doing it a lot, unfortunately, and we're trying for the opposite here.