I didn't read through much of this so I may be misguided here, but how is this any different than trying to get cellphones banned because underage children may send pictures of themselves?
If these people aren't actually doing anything but talking in these groups, there's no issue here. It's distasteful and FB SHOULD probably boot them off the platform, but this feels like it's a far cry from the Pornhub brouhaha, where they were hosting illegal porn and not taking it down.
In my brief time at Facebook, I was asked to pick a team after boot camp. One of the options was their "Security tools" team, which was tasked with interfacing with law enforcement. They were really good guys, though they clearly carried some heavy burdens.
I also went through boot camp with a guy who'd been a customer support rep, learned to code and use the internal tools, and was being promoted to engineer to improve them. At some point, he introduced me to his boss's boss, a director equivalent who was responsible for both teams. They creeped me out, but gave me some advice: the security tools team was not the one to join.
Later (after I left Facebook), I ran into that same boss who'd warned me off, at a well-known kink event. They were talking with an acquaintance of mine who was unfortunately too close to ignore, and very drunk. Very drunk, and talking about their ageplay fetish in a way that clearly wasn't just humor or roleplay.
I'm not surprised that Facebook has a child predator problem.
This echoes their previous neglect, which let the ads platform be abused by state actors to stoke genocide. They said they only had time to focus on a shortlist of countries (basically an economic ranking). Evidently we can't trust these orgs to be responsible, or to address harmful neglect, without public outcry and action to drive accountability or real change.
Given it's easy for Facebook's recommendation algorithms to dig these groups up, why is it ostensibly hard for Facebook to identify and shut them down?
This is ultimately the question. What's the theory? That they're making too much money targeting ads to child predators? That they don't want to piss off that constituency? That they're too cheap to hire more moderators? It's so bad that it's strange to me.
“Dig groups up” as if groups weren’t being removed because they were somehow lost in the database even though the recommendation algorithm can find them.
Who's talking about ability? It's obviously about business priorities and incentives. Product requirements sink or swim based on operational processes and executive or managerial direction, and businesses keep resources constrained at any scale in the name of efficient growth.