Hacker News

It's hard to separate content moderation from the problem of Evil. Low-entropy evil is easy to automate away; high-entropy, sophisticated evil can convince you it doesn't exist.

This is also the basic problem of growing communities: you want to attract new people while still providing value to your core group, all while managing both attrition and predators. What content moderation problems have proven is that even with absolute, omniscient control of an electronic platform, this is still Hard. It also yields some information about what Evil is, which is that it seems to emerge as a consequence of incentives more than anything else.

In the hundreds of forums I've used over decades, the best ones were moderated by starting with a high'ish bar to entry. You have to be able to signal at least that level of "goodness," and it's on you to meet it, not on the moderators to explain themselves. There is a "be excellent to each other" rule that gives very reasonable blanket-principle powers to moderators, and it's pretty easy to check. It also helped to take a broken-windows approach to penalizing laziness and other stupidity, so that everyone sees examples of the rules being enforced.

Platform moderation is also only hard relative to a standard of purity, and the value of a community rests not on its alignment but on its mix. If you're trying to solve the optimization problem of "No Evil," you aren't indexed on the growth problem of "More Enjoyable." I don't worry too much about it, though, because communities in the former category won't grow and survive long enough to register.




> In the hundreds of forums I've used over decades, the best ones were moderated by starting with a high'ish bar to entry.

I've had the same experience. And at the other end of the spectrum, the reason Facebook, Twitter, etc. have such problems with moderation is that there is no bar to entry--anyone can sign up and post. With what results, we see.


Agreed, with a thought:

I actually think Facebook's business model was premised on the Steve Carell movie "Dinner for Schmucks": a tool originally designed to signal you went to an Ivy League school got that cohort on board, then invited everyone else in as a kind of cruel joke, where the newcomers didn't know they were only the entertainment. Now the party's long over and the only people left are loud drunks yelling at each other and couples acting out some very public domestics.

Imo, there is no solution to those platforms' moderation problems, only management into something even more banal and worthy of disruption.


>There is a "be excellent to each other" rule which gives very reasonable blanket principle powers to moderators, and it's pretty easy to check.

easy to check, but most sites have almost zero accountability for moderators "staying excellent," and most moderators are unpaid, normal human beings with their own personal beliefs. They won't be let go unless they frustrate the site owners.

And to begin with, whether "laziness" and especially "stupidity" should be lumped in with "Evil" is a question for discussion in and of itself. Some people may just want to throw around jokes and memes rather than sit in some pristine forum full of essay-length responses. It's not my personal preference, but I also don't think it's worthy of a blanket internet ban, nor do I necessarily want to say forums for those people are of "lesser quality" (even if you can argue they attract more truly "Evil" people).

I guess it's a tightrope, as moderation always is. I've also seen some forums lean too hard on quality and end up feeling gatekeep-y as a result (and not even for sensitive topics where it may be needed, but on some forums advertised as "casual"). Turn a community into work and people will look elsewhere.



