Of course the real answer is that this sort of site is a bad idea and shouldn't exist. Wide-open signup plus public visibility of content, or easy sharing of it with strangers: bad combo, and I don't care how much money it's making them (and other, similar sites).
Oh, dear god, no - have you ever used Reddit? This is what happens when you outsource moderation to the sorts of users who enjoy moderating other people.
Two completely different things. Subreddits would be the equivalent of Facebook Pages, and Facebook already has moderation tools for those who run a Page.
Hello user, you have been randomly selected to help moderate.
Does this depict pedophilia? Y/N
[ Image here ]
This image was flagged by other users as depicting horrible gore and death. Do you agree? Y/N
This post may contain hate speech, do you agree? Y/N
[ Embedded Nazi Screed post ]
Thank you for making Facebook a safer place for everybody!
"Facebook made me look at child porn" is probably a headline Facebook would prefer not to have going around.
Would that be expensive at the scale required by Facebook? Very.
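To put "very" in perspective, here is a back-of-envelope sketch of what human review might cost at Facebook scale. Every input (posts per day, the fraction flagged, throughput per moderator, fully-loaded cost) is an illustrative assumption for the sake of the arithmetic, not a reported figure:

```python
# Rough estimate of human moderation cost at Facebook scale.
# All inputs are illustrative assumptions, not reported figures.

POSTS_PER_DAY = 1_000_000_000         # assumed order of magnitude
FLAGGED_FRACTION = 0.001              # assume 0.1% of posts need human review
REVIEWS_PER_MODERATOR_PER_DAY = 100   # the ~100/day figure used elsewhere in the thread
COST_PER_MODERATOR_PER_YEAR = 30_000  # assumed fully-loaded cost, USD

flagged_per_day = int(POSTS_PER_DAY * FLAGGED_FRACTION)
moderators_needed = flagged_per_day // REVIEWS_PER_MODERATOR_PER_DAY
annual_cost = moderators_needed * COST_PER_MODERATOR_PER_YEAR

print(f"{flagged_per_day:,} flagged posts/day -> "
      f"{moderators_needed:,} moderators -> ${annual_cost:,}/yr")
```

Even with these conservative guesses, you land on tens of thousands of full-time moderators and hundreds of millions of dollars a year, before any of the support costs (psychologists, training, turnover) discussed below.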
Personally I'm fond of zucc-bux
Overall, not possible. On Slashdot everything is basically public; on Facebook it's largely private/restricted, and I'm sure from past discussions that the worst stuff is in private groups or clusters of friends. They could do much more aggressive banning of users (and detecting of new accounts designed to circumvent bans) for sharing such content or belonging to groups focused on it, but that might hurt their most important metrics.
So yeah, it was tongue in cheek, with a side of reductio.
I'm not saying if Sally starts a forum and it gets hacked and someone posts illegal stuff she should be fined or charged with anything. I'm not saying she should even be in trouble if she gives one person who turns out to be a jerkass posting rights and that person posts illegal stuff. But if she opens up public signups, gets a billion or two users, then says "well I guess I'll just have to pay a bunch of people money to look at all this gore porn that's getting posted" the rest of us should go "uh, no, you should instead just stop".
[EDIT] and given that second sentence, no, I'd get nowhere in American politics.
That being said - those cases are really rare and in terms of harm, the naked calculating exploitation that corporations flirt with is WAY worse imo than the harm a too-big-for-its-britches union causes.
It would be like expecting doctors to clean bedpans. It is a necessary task, and those who do it should receive appropriate respect, even if that's just "Thanks, glad I don't have to do it." But anyone off the street willing to do the dirty job could do it without a doctor's training, which took them over half a decade just to reach /entry level/.
Plus it's incredibly inefficient: you're effectively paying, say, $100/hr for janitorial work when they could be saving lives instead.
Now, asking them to do it in a necessary and justifiable situation (say, a posting in an aid camp cut off by weather, or in space, where sending any extra bodies is expensive) is one thing, and then they would be in the wrong for refusing out of pride.
Absent that necessity, it shows a poor sense of their actual value, at least until medical degrees and skills become just as common as the skills to clean bedpans.
If I can adapt (and perhaps torture) your analogy, I'd say it's like Facebook currently has doctors who save lives and command high salaries, and janitors who change bedpans but don't have access to hot water and latex gloves. So inevitably, the janitors catch a disease (hospitals are filthy, after all! This is foreseeable!) and are no longer able to work, at which point they are replaced.
Given that the hospital can afford to pay the doctors, we might ask if they could splash out for latex gloves and soap for the janitors, too.
Content moderation is not a job for everyone.
People are essentially asking if FB can increase salary/perks by some amount because internet commenters don't feel good about the current situation.
Yeah, this is basically what I'm saying. :) Labour codes are developed over time. In the beginning, it's the wild west. Want to be a janitor and not use any personal protective equipment? Go right ahead! After workers keep getting sick, the government begins to legislate requirements around health and safety.
Awareness of mental illness is a relatively new thing. We'll probably see some developments in this area if the big tech companies continue to outsource moderation at scale. https://www.theguardian.com/news/2017/may/25/facebook-modera... is a nice article that describes accommodations that other companies provide to moderators. They include, as an example, monthly psychologist visits which continue after the person stops working for the organization. They also include training for the person's family and social support groups.
Not wanting drudge work is common to nearly all people who have options. Why should devs get all the hate?
(stolen from Scott Adams.)
I actually wonder now how much of their workforce would fit in that demographic.
Just because some people become desensitized doesn't mean the obscene content won't damage the mental health of everyone else exposed to it.
Granted, not everyone does that; someone nibbling at the frontend or sitting deep in some distributed DB generally doesn't care much about data quality.
It’s management 101
Not to mention that, in data driven orgs, the launch/no launch debates are focused on those metrics. So if you want to call the shots you kind of need an in-depth understanding.
Just the economic incentive to cut moderators loose and use algorithms would be enough to do so, if it were possible. After all, you can buy a decent amount of compute power for $30k+/yr, and that will certainly deliver more than the ~100 QPD (queries per day) from a content moderator.
PS. I'm sure that most of the work is being done by algorithms anyhow.
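The compute-versus-moderator comparison above can be sketched with a few lines of arithmetic. The throughput and cost figures here are assumptions (the ~100 reviews/day comes from the comment; the classifier throughput is a guess), so treat the output as an order-of-magnitude illustration, not real data:

```python
# Back-of-envelope: cost per review, human vs. machine.
# All numbers are illustrative assumptions, not Facebook figures.

MODERATOR_COST_PER_YEAR = 30_000   # assumed fully-loaded cost, USD
REVIEWS_PER_DAY_HUMAN = 100        # the ~100 QPD figure from the comment
WORK_DAYS_PER_YEAR = 250

COMPUTE_COST_PER_YEAR = 30_000     # same budget spent on compute instead
REVIEWS_PER_SECOND_MACHINE = 10    # assumed classifier throughput
SECONDS_PER_YEAR = 365 * 24 * 3600

human_reviews = REVIEWS_PER_DAY_HUMAN * WORK_DAYS_PER_YEAR
machine_reviews = REVIEWS_PER_SECOND_MACHINE * SECONDS_PER_YEAR

print(f"Human:   {human_reviews:,} reviews/yr "
      f"(${MODERATOR_COST_PER_YEAR / human_reviews:.2f} each)")
print(f"Machine: {machine_reviews:,} reviews/yr "
      f"(${COMPUTE_COST_PER_YEAR / machine_reviews:.6f} each)")
```

Under these assumptions the machine does four orders of magnitude more reviews per dollar, which is exactly why the economic incentive to automate is so strong wherever automation is actually possible.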
Don't know about the banning situation, that'd require insider Facebook knowledge. I'm sure they ban some people.
[Clarity: Am not a Facebook employee.]