
Maybe Facebook should make all of their own employees do 15 minutes of moderation per day - just to share the pain out a bit....



Make the users do a little every so often. Like Slashdot's "hey, go moderate some posts" thing, but required before they can keep using the platform. That'd be fun.

Of course, the real answer is that this sort of site is a bad idea and shouldn't exist. Wide-open signup plus public visibility of content, or easy sharing of it with strangers: bad combo, and I don't care how much money it's making them (and other, similar sites).


> Make the users do a little every so often.

Oh, dear god, no - have you ever used Reddit? This is what happens when you outsource moderation to the sorts of users who enjoy moderating other people.


Have you ever used Facebook? It's much worse, and this is specifically in reference to media that are against Facebook's guidelines (e.g. nudity, gore, etc.), not all content as a whole. Users doing some moderating in this context would simply mean presenting flagged media randomly to users and asking if it breaks the guidelines.

Two completely different things. Subreddits would be the equivalent of Facebook Pages, and those already have moderation tools for whoever runs the page.


Reddit is a bit different from what was suggested here, since moderator is not a rotating position there, and that encourages power trips.


Stack Overflow has the users do the moderation, and it works well. Maybe not perfectly, but it's not bad.


There's not really a shortcut around having employees do it, for a few reasons. Facebook has a significant amount of private content which users will not tolerate being shared. Standards vary widely (imagine a vegan flagging every picture with visible leather, or every post about eating meat, as obscene: if you're requiring moderation duty, do you just disable their account once you notice?). And, above all, Facebook is the target of coordinated group activity, so errors won't be uncorrelated: get a cycle of outrage politics going and you'll have thousands of people around the world, with no easily observed connection to each other, flagging things as either objectionable or safe.


And yet Facebook lawyers recently argued to a judge that there is no expectation of privacy on Facebook.


I'm not defending them, but there is a distinction between Facebook using your private data for mining and Facebook displaying it to random strangers. Any sort of user-driven moderation policy would need a way to handle that, or people would stop using the service.


Isn't everything you post on FB available for anyone to browse?


You could have a system where multiple people flagging certain content would be a signal to give it manual review.
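
A minimal sketch of that signal (Python; the threshold and all names here are made up for illustration):

    FLAG_THRESHOLD = 5  # arbitrary example value

    flags = {}          # content_id -> set of user_ids who flagged it
    review_queue = []   # content_ids awaiting manual review

    def flag_content(content_id, user_id):
        flaggers = flags.setdefault(content_id, set())
        flaggers.add(user_id)  # a set, so repeat flags from one user don't add up
        if len(flaggers) >= FLAG_THRESHOLD and content_id not in review_queue:
            review_queue.append(content_id)

Counting distinct users rather than raw flags makes the signal a bit harder for one person to game, though coordinated groups of flaggers can still defeat it, as pointed out elsewhere in the thread.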


I've always suspected that a system which could work, albeit with flaws, would be to ratchet up censorship for any individual user who flags content: the more they flag, the less they see. The only 'problem' is that this would leave users fearing that others, who have not flagged the same content as objectionable, are viewing that content without being actively prevented. They'd be protected themselves, but it would be easy to convince them (if any convincing is even necessary) that 'almost everyone else' is consuming the raw obscene feed and being turned into monsters by it.
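
A minimal sketch of that ratchet (Python; the per-flag increment and cap are invented for illustration, and it assumes some upstream classifier already assigns each item a risk score):

    user_flag_counts = {}  # user_id -> number of flags this user has submitted

    def record_flag(user_id):
        # every flag the user submits makes their own filter stricter
        user_flag_counts[user_id] = user_flag_counts.get(user_id, 0) + 1

    def is_hidden(user_id, content_risk_score):
        # content_risk_score in [0, 1], from whatever classifier runs upstream
        sensitivity = min(user_flag_counts.get(user_id, 0) * 0.05, 0.9)
        return content_risk_score > (1.0 - sensitivity)

Each flag raises only that user's personal sensitivity, so heavy flaggers see a progressively tamer feed while everyone else's feed is untouched, which is exactly the dynamic (and the fear) described above.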


This already exists on Facebook.


Just what I want my grandmother to see when she logs into Facebook once a week to check on the grandkids.

------------------------------------------------------------

Hello user, you have been randomly selected to help moderate.

Does this depict pedophilia? Y/N

[ Image here ]

This image was flagged by other users as depicting horrible gore and death, do you agree Y/N?

[ Image here ]

This post may contain hate speech, do you agree? Y/N

[ Embedded Nazi Screed post ]

Thank you for making Facebook a safer place for everybody!

------------------------------------------------------------


Make it captcha test: Select all the pictures of animal cruelty. /s


Wow, captcha from hell.


> Make the users do a little every so often.

"Facebook made me look at child porn" is probably a headline Facebook would prefer not to have going around.


Personally, I wouldn't go that far - just that anyone doing this kind of job would need a lot of training, active support and monitoring, and probably periodic breaks to do something else and decompress. Probably long-term health care coverage as well.

Would that be expensive at the scale required by Facebook? Very.


Only if they dole out libra-bux


>libra-bux

Personally I'm fond of zucc-bux


Seen on here the other day (can't remember the poster to credit them): "Douche Mark". A play on Deutsche Mark, for those who are either not European or not old enough to remember that currency, as seemed to be the case with some respondents to the original post.


> Slashdot's "hey, go moderate some posts" thing, but required before they can keep using the platform. That'd be fun.

Overall, not possible. On Slashdot everything is basically public; on Facebook it's largely private/restricted, and I'm sure from past discussions that the worst stuff is in private groups or clusters of friends. They could do much more aggressive banning of users (and detection of new accounts designed to circumvent bans) for sharing such content or being in groups focused on such content, but that might hurt their most important metrics.


I agree with your point about the site being a bad idea, so maybe your first point was tongue-in-cheek. But displaying reported child pornography to people outside of your controlled environment? Not a great idea...


Just pricing in an externality. If that's a disgusting and shocking way to deal with this, then it doesn't follow to me that paying a little money to folks to do it makes it anywhere near OK.

So yeah, it was tongue in cheek, with a side of reductio.


No, the concern with distributing likely child pornography and snuff films to users is that the whole point of moderation is to STOP distributing that. You want to have it moderated in as controlled an environment as possible. I'm all for not subjecting more people to it than necessary, but it should at least be people who explicitly consent and are trained and provided with the necessary resources.


Don't have to subject anyone to it. Don't have Facebook.


I have never distributed child pornography or snuff films. I'm already not subjecting anyone to it. Communications channels exist. Unless you want to undo that, or enable this behavior, you have to deal with it somehow.


We don't have to have huge, centralized communication sites where anyone can sign up and post stuff, and where the operator is shielded from liability for what's posted so long as they subject a bunch of humans to it in order to filter it out, even when there's tons and tons of it posted constantly. We only have to have that in order to keep supporting the current ad-tech business models, which isn't a given.

I'm not saying if Sally starts a forum and it gets hacked and someone posts illegal stuff she should be fined or charged with anything. I'm not saying she should even be in trouble if she gives one person who turns out to be a jerkass posting rights and that person posts illegal stuff. But if she opens up public signups, gets a billion or two users, then says "well I guess I'll just have to pay a bunch of people money to look at all this gore porn that's getting posted" the rest of us should go "uh, no, you should instead just stop".


That's a very arbitrary line that's really just based on you making a judgment call for things you like or against things you don't like. You'd probably do well in American politics.


I don't like subjecting a bunch of people to horrible images on an industrial scale, no. And I don't think "it's making us money and we can't think of a better way" is an excuse to keep it up.

[EDIT] and given that second sentence, no, I'd get nowhere in American politics.


So your solution here is that instead of all these people voluntarily seeking other jobs, their department shouldn't have a reason to exist, ergo shut down Facebook, and now all these people and more are involuntarily unemployed. Riight.


All the devs would leave


That's probably true - I wonder whether it would be because of the affront of having to do such menial tasks or the horror of realising that operating a global social network requires somebody to trawl through the filth and risk their mental health?


No, it's more like the devs have the necessary marketplace leverage and fallback to either quash that notion or move on to another, less ridiculous job. Don't think for a minute devs wouldn't be treated the same (and in some places ARE) if the corporate C-levels could get away with it without bleeding talent that is expensive to re-acquire. I am not, by any means, a Workers-of-the-World-Unite kind of person, but group negotiation and (gasp) unionization is becoming more and more necessary, if ONLY to get the beancounters to consider labor in their soulless cost calculations when making decisions that lead to the conditions in the article.


Historically, unions are a tried and true way for improving working conditions and average compensation. Of course, that's also the reason why they are so vehemently opposed...


Very true, but in honesty, they can cause almost as much trouble as they solve, or more, when they grow so big and bureaucratic as to no longer represent, or even to actively harm, the workers they purport to support. It's these cases that a lot of union detractors latch onto in their arguments against unionization, with some degree of fairness. I speak as the son, grandson, and great-grandson of people who were VERY active in their respective unions. In particular, my great-grandfather helped to unionize electrical workers in refineries in Texas, and my father was a union steward. They are full of stories of the headaches the union bureaucracy caused.

That being said, those cases are really rare, and in terms of harm, the naked, calculating exploitation that corporations flirt with is WAY worse imo than the harm a too-big-for-its-britches union causes.


It is because it is a waste of their time and a bad approach, and everyone knows it fundamentally on some level, however nice the egalitarian ideal would be in other ways.

It would be like expecting doctors to clean bedpans. Now, it is a necessary task, and those who do it should receive appropriate respect, even if that's just "Thanks, glad I don't have to do it." But anyone off the street willing to do the dirty job could do it without a doctor's training, which took over half a decade just to reach /entry level/. Plus it's incredibly inefficient: you'd effectively be paying, say, $100/hr for janitorial work when the doctor could be saving lives instead.

Now, asking them to do it in a necessary and justifiable situation (say, a posting in an aid camp cut off by weather, or in space, where sending any extra bodies is expensive) is one thing, and they would be in the wrong then for refusing out of pride.

Absent that necessity, it shows a poor sense of their actual value, at least until medical degrees and skills become just as common as the skills to clean bedpans.


Yes, I should perhaps clarify - I mentioned developers to point out that Facebook has the resources to provide excellent working conditions.

If I can adapt (and perhaps torture) your analogy, I'd say it's like Facebook currently has doctors who save lives and command high salaries, and janitors who change bedpans but don't have access to hot water and latex gloves. So inevitably, the janitors catch a disease (hospitals are filthy, after all! This is foreseeable!) and are no longer able to work, at which point they are replaced.

Given that the hospital can afford to pay the doctors, we might ask if they could splash out for latex gloves and soap for the janitors, too.


The difference is that there are plenty of people who are not negatively affected by that "horrible" content. Me, for example: if not for the low salary, I would do it.

Content moderation is not a job for everyone.


Your example seems useless. There would be minimum working-condition requirements enforced by the labor department, so it would be illegal for janitors to be forced to work in filthy conditions without the necessary gear. Similarly, FB would be fulfilling the working conditions set by the labor department for desk workers.

People are essentially asking FB to increase salary/perks by some amount because internet commenters are somehow not feeling good about the current scenario.


> There would be minimum working condition requirement enforced by labor department.

Yeah, this is basically what I'm saying. :) Labour codes are developed over time. In the beginning, it's the wild west. Want to be a janitor and not use any personal protective equipment? Go right ahead! After workers keep getting sick, the government begins to legislate requirements around health and safety.

Awareness of mental illness is a relatively new thing. We'll probably see some developments in this area if the big tech companies continue to outsource moderation at scale. https://www.theguardian.com/news/2017/may/25/facebook-modera... is a nice article that describes accommodations that other companies provide to moderators. They include, as an example, monthly psychologist visits which continue after the person stops working for the organization. They also include training for the person's family and social support groups.


Executives would also not put up with it - obviously, to the point that I'm not sure whether you meant to include them or not when you said "employees".

Not wanting drudge work is common to nearly all people who have options. Why should devs get all the hate?


...but there must be a downside, no?

(stolen from Scott Adams.)


Probably not the ones, if any, who lurk 4chan and other extreme imageboards, having been desensitized to such content, or who probably even get something out of it.

I actually wonder now how much of their workforce would fit in that demographic.


Anyone that's been lurking/posting on the chans for a long time also knows neurotypicals couldn't deal with the content.

Just because some people become desensitized doesn't mean the obscene content won't damage the mental health of the entire set of people.


I was just thinking that. 4chan janitors are volunteers. Of course, the payoff is helping 4chan exist, not helping Facebook make a few billion more, so the incentive is still vastly different. Still, that might be the right recruitment pool.


I don’t think so. They’d just beat off to the orphan vivisection videos all day long.


There is a lot more to that website than /b/. The janitors on /out/ do a very good job of keeping the board clean.


That would be great: then they could just shut the whole thing down and the world would instantly improve by 10%.


Have you ever worked with Mechanical Turk or anything of this sort? You pretty much have to do at least 15 minutes of manual moderation per day. Training "human workers", setting up rating templates, finding human mental "bugs", i.e. things that raters/moderators get consistently wrong - it takes quite a lot of time.

Granted, not everyone does that, someone nibbling at the frontend or sitting deep in some distributed DB generally does not care much about data quality.


Make the executives do this.


It's common practice for them to do so. Not just at Facebook, but at YouTube and other platforms.

It's management 101.


Oversight over key quality metrics (including understanding what exactly raters/moderators do) is a VP or even SVP level job. It's critical if your entire org's goal is to deliver ever improving data quality at reasonable latency and reasonable reliability (example: Google Search).

Not to mention that, in data driven orgs, the launch/no launch debates are focused on those metrics. So if you want to call the shots you kind of need an in-depth understanding.


You and the GP claim this, yet if these execs have truly been exposed to the worst that gets posted to their platforms, why has there been no appreciable change? Unless they're so hardened to it that they can't empathize with the horrors they're asking people to face.


There's no alternative. The only other options (for Facebook and content moderation) are to shut down the shop and return equity to investors, or to do nothing and wait for the platform to blow up. And even then, someone will make Facebook v2.

The economic incentive alone to cut moderators loose and use algorithms would be enough, if it were possible. After all, you can buy a decent amount of compute power for $30k+/yr, and that will certainly deliver more than the 100 or so QPD (queries per day) from a content moderator.

PS. I'm sure that most of the work is being done by algorithms anyhow.
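
Rough back-of-envelope on that comparison (Python; the salary and working-day figures are assumptions layered on the numbers above):

    salary_per_year = 30_000   # USD/yr, the rough figure cited above
    items_per_day = 100        # the ~100 QPD cited above
    working_days = 250         # assumed

    items_per_year = items_per_day * working_days      # 25,000 items
    cost_per_item = salary_per_year / items_per_year   # ~$1.20 per item
    print(f"~${cost_per_item:.2f} per human-reviewed item")

At a dollar-and-change per reviewed item, the incentive to push as much as possible onto algorithms is obvious.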


That's all-or-nothing thinking. The immediate alternative is to improve working conditions for moderators. Or to impose more stringent bans and punishments on those who post that sort of material. Those are just for starters.


Look, I wish everyone had a nice life, I really do, but markets set salaries while governments sometimes intervene (e.g. by setting a minimum wage).

Don't know about the banning situation, that'd require insider Facebook knowledge. I'm sure they ban some people.


The Market Police don’t haul you off if you pay something different than the “market wage.” Markets set a floor on wages by making it so you won’t get enough workers if you pay less. But companies can easily pay more than the floor if they want to. Obviously they don’t want to, because they love money, but it’s ridiculous to frame this as some iron law of markets.


Peace!


This makes complete sense. Why pigeonhole employees into, well, pigeon holes? Technology, Operations, Marketing, and more, each understanding more about the others' roles, challenges, and solutions. A knowledge organisation is based on knowledge. I'd favour a scale that could be chosen: 5-50% outside of one's core role.

[Clarity: Am not a Facebook employee.]


I suspect it would come with a "manager's discretion" clause that would ultimately effectively eliminate it, like 20% time at Google.



