Great work done by this company, but I wonder why social networks like Instagram aren't doing this themselves to protect children? If a paid service has caught so many already, you can imagine the scale of the problem... Hope I'm wrong and they are doing it.
The last time I tried Instagram I was followed by 5-6 bots within a few minutes of creating the account. The problem seems similar to Twitter's, in that there are more people working on spamming the network with bots than there are employees blocking them, so the companies focus primarily on automated blocking rather than on catching other types of abuse.
There are obvious privacy implications with social network owners running fake accounts to entrap their users without consent.
This company is training machine learning on grooming language and they could sell that data to companies like Facebook to automatically monitor child accounts-- with their parents' consent. That's a much better solution.
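A toy sketch of what such monitoring could look like, stripped to its simplest form. The phrase list, weights, and threshold here are invented placeholders (a real system would use a trained classifier over far richer features, which is presumably what this company is building):

```python
# Hypothetical risk phrases and weights -- illustrative only,
# not drawn from any real grooming-detection model.
RISK_PHRASES = {
    "don't tell your parents": 3,
    "our secret": 3,
    "how old are you": 1,
    "send a pic": 2,
}

def risk_score(message: str) -> int:
    """Sum the weights of every risk phrase found in the message."""
    text = message.lower()
    return sum(w for phrase, w in RISK_PHRASES.items() if phrase in text)

def should_alert(message: str, threshold: int = 3) -> bool:
    """Flag the message for human review if its score meets the threshold."""
    return risk_score(message) >= threshold
```

The key design point is that the output is an alert to a parent or reviewer, not an automated ban, which is what makes the consent-based model workable.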
The dark secret is that no social media company would clamp down on such a lucrative source of user engagement. A lot of the early adoption of social media platforms is by young people and... well, you see?
Just because pedophiles are common on social media doesn't mean they're particularly lucrative as a demographic. It's not as if they can legally form pro-pedo groups, chat and share child porn, and build a social graph around their interests that would be of any value to advertisers.
Your comment triggered the thought that blackmail is INCREDIBLY lucrative.
So much so that I'm surprised it hasn't already happened. If your favourite rogue state infiltrated FB/IG enough to do what Bark is doing at a system level... that's a scary amount of leverage.
And it's difficult to see how our society could protect against it.
Leverage against some random dirtbag who lives in a trailer? The bigwigs are using services like Epstein's, not Instagram spam. And they successfully shut down that potential threat when he was compromised.
There's a lot of spectrum between a trailer and Epstein. And you don't necessarily need people in Epstein's social circle for such a scheme to pay off. Industrial espionage and sabotage don't require bigwigs: worker bees who happen to have access to things as part of their daily jobs can do a lot.
One thing I felt was missing from this article was concrete suggestions for parents. What do you do if you have an 11-year-old daughter? Maybe 11 is too young for an Instagram account, but are you really going to prevent her from having a social media account at 15? How do you prepare her for this kind of vileness, so she's not burdened with secrecy and self-blame?
You start when they are 4 or 5 years old. No, I'm being serious here: that's the age at which most kids are using tablets and phones regularly. That's the age you need to start having conversations about the dangers of the Internet and how your kids should protect themselves. And just like teaching any other skill, you start with the absolute basics and build stepping stones.
I would assume that, at the very least, every large social media platform that allows image uploads is checking them against a law enforcement database of CP hashes. If not, they should be.
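A minimal sketch of what that check might look like, assuming a blocklist of known-bad SHA-256 digests. (Real systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, but the lookup logic has the same shape.)

```python
import hashlib

# Hypothetical blocklist of known-bad image hashes (hex digests).
# In practice this would come from a law-enforcement/NCMEC feed.
# The entry below is just the SHA-256 of the empty byte string,
# included so the example is self-contained.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the uploaded image matches a known-bad hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

The exact-hash version is trivially defeated by changing a single pixel, which is why the perceptual-hash variants matter in practice.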
There is a major conflation of wildly different issues in this thread. Downloading child pornography is not cybersex, and neither one is kidnapping/molestation.
I suspect the conflation occurs because most people don't want to bear thinking about specifics, and stop at "abuse". This is understandable, but distinctions are important when analyzing any problem. For example, the idea that grabbing the perps from the story means there are fewer kidnappers is highly wishful thinking.
Details are necessary to develop appropriate solutions. The behavior in the article is something that legal enforcement likely cannot curb, like "speeding" 10mph over. The perps are certainly problematic and guilty of something, but their quantity/fan-in is too great at Internet scale. The only solution I can see working is curated whitelist-only environments, the same way you drop kids off at a purpose-tailored daycare rather than a downtown alley or a prison.
Details are also important for making sure that the "kid-safe" solutions are appropriately targeted so they don't end up leaking to wider society. Anonymity in general is important for a whole host of marginalized peoples, and there are many interests that wish to erode it for their own nefarious ends.
> There is a major conflation of wildly different issues in this thread. Downloading child pornography is not cybersex, and neither one is kidnapping/molestation.
Is there evidence that no correlation exists between downloading child porn and soliciting minors for sex online? It seems oddly pedantic to insist that only a lack of critical thinking could lead one to assume that behaviors on the same platform, satisfying the same form of sexual urge, might be related just because they're not literally the same.
>For example, the idea that grabbing the perps from the story means there are fewer kidnappers is highly wishful thinking.
I mean... there are n fewer for n arrests. The set of all child predators may be undefined, but it isn't infinite. Are you arguing that law enforcement shouldn't bother attempting to investigate or arrest criminals because crime still persists?
>The only solution I can see working is curated whitelist-only environments, the same way you drop kids off at a purpose-tailored daycare rather than a downtown alley or a prison.
Why can't multiple solutions work? Why should the only acceptable solution be society retreating behind walled gardens and simply accepting that pedophiles will (even, for the sake of maximizing freedom of speech, should) be left alone to freely operate in any public space?
Although I do agree that children probably shouldn't be on these networks, and that better curation would definitely be a good idea (particularly where PMs are concerned), I also believe a public platform has every right to moderate activity and police itself.
> Anonymity in general is important for a whole host of marginalized peoples, and there are many interests that wish to erode it for their own nefarious ends.
And speaking of blatant conflation, it seems like all such interests are assumed to be nefarious in these discussions, and everything is a slippery slope towards the camps.
In general, you're using loaded terms that continue the conflation - eg "predator" and "pedophile". Someone looking at child pornography is not a "predator". And someone who is generally attracted to women but tries to chat sexually with 11 year olds because they make for easy targets isn't necessarily a "pedophile".
The point isn't pedantry or to defend any of these people, but rather to avoid succumbing to too-easy explanations. For instance, your mentioning of "image hashes" in response to the topic of "protecting children". Instagram certainly loves that narrative, but it doesn't actually address the topic at hand.
> Why should the only acceptable solution be society retreating behind walled gardens
For the same reason that dropping your kids off in the middle of a city doesn't make for daycare. Greater society inherently involves being robust to normalized ever-present abuse (eg advertising, for one), which requires adult maturity.
All the dialogs in the article are creepy as fuck, but half of them were ultimately just conversation and will likely be ignored by law enforcement as inactionable. If you want to prevent those conversations, the only way is to drastically reduce the scope, eg a heavily-curated playground.
The dialogues in the article included people sending dick pics to a person they thought was an 11-year-old child. I’m sure that the lack of a ‘real’ victim is a problem in some countries, but this kind of offending is regularly prosecuted in Australia.
I wasn't putting that behavior in the category of "inactionable"! I had also thought there were more discrete conversations, but rereading, it's really excerpts all from (presumably) the worst one. But the article does provide its own accounting:
> text-chatted with 17 men ... and seen the genitalia of 11 of those
So even though two thirds are doing something that could be straightforwardly prosecuted, it seems like one third are still abusive in some way that isn't necessarily easy to write into law, especially one that won't be worked around.
FWIW "regularly prosecuted" doesn't necessarily mean that the chance of an individual prosecution is high!
Absolutely. The people who get caught doing this stuff generally haven’t done much to conceal their identity and get caught with evidence of other unreported offences on their computers. It’s horrifying to think about what the sophisticated ones might be getting away with.
I overlooked the word ‘half’ in your original comment and brought up the dick pics because they make the crime so easy to prove. I’d say all of the conversations were illegal grooming, but it’s harder to prove that the person had the necessary intent when the conversation isn’t explicitly sexual.
My startup aims to help people discover unique and meaningful events and, most importantly, to find a group of people with similar interests to go with!
I don't see why the fuss about Craigslist not providing an API:
1. Anyone is free to come up with a better product that serves Craigslist users' needs. You can argue that Craigslist is the dominant player and it's hard to steal users from them, but many niche-focused startups have proved otherwise (or else they simply don't have compelling enough products);
2. If and when Craigslist fails its users functionally and aesthetically, it's hard to imagine no other players will step up;
3. Even if Craigslist provides an API now but doesn't improve its user experience, do we just want it to serve as a database for all other services that piggyback on Craigslist?
So my message to Craigslist: "Just think hard about how to serve your users better. You're not an asshole for restricting data access to your site, but it is irresponsible to waste other engineers' lives on trying to defeat you when you could easily make yourself better for your users!"