Hacker News

Ah, ok—I was probably reading your comment through the filter of "comments supporting my view are constantly getting downvoted/flagged/suppressed on this site", which is a more common complaint, and one that was also making appearances in this thread.

Software can indeed affect subthread ranking, and moderators do that too—but certainly not for pro- or anti-diversity (or any other $divisive-topic) reasons. We downweight top subthreads for reasons like these: (1) they're too generic rather than engaging with something specific in the article; (2) they're high-indignation/low-information; (3) they're meta; (4) they're offtopic in a predictable way*. What all 4 of those categories (and there are probably others) have in common is that they tend both to get heavily upvoted and not to be interesting in HN's sense of the word. Once they get heavily upvoted, they tend to hang out at the top of the page, accruing mass and choking out more curious conversation. We try to use moderation and (to some extent) software as countervailing mechanisms to these default tendencies, because we've learned over the years that this is one of the highest-leverage things we can do to make threads more interesting. But this has nothing whatever to do with agreeing or disagreeing with the points being made.

Edit: I'm guessing you were talking about https://news.ycombinator.com/item?id=29785012, which indeed got downweighted in exactly this way, because it falls into categories #1, #3, and #4 above. But not into #2, at least, which is unusual and to your credit.

* Unpredictable whimsical offtopicness, by contrast, is often interesting and we try to give it a pass. Otherwise moderation would get too predictable in its own right. I often have to remind people about that.
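The mechanism described above could be sketched roughly as follows. HN's actual ranking code is not public, so every name, flag, and constant here is an assumption—the only grounded part is the idea that a subthread flagged for one of the four categories gets a score penalty that pushes it down the page.

```python
# Hypothetical sketch of subthread downweighting. HN's real implementation
# is unknown; the Subthread fields, PENALTY constant, and scoring formula
# are all illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Subthread:
    id: str
    upvotes: int
    # Flags corresponding to the four categories described above.
    generic: bool = False               # (1) doesn't engage with the article
    indignation: bool = False           # (2) high-indignation/low-information
    meta: bool = False                  # (3) meta
    predictably_offtopic: bool = False  # (4) offtopic in a predictable way

PENALTY = 0.2  # assumed multiplier applied once per flag

def score(t: Subthread) -> float:
    """Upvote count, downweighted multiplicatively per moderation flag."""
    s = float(t.upvotes)
    for flagged in (t.generic, t.indignation, t.meta, t.predictably_offtopic):
        if flagged:
            s *= PENALTY
    return s

def rank(threads: list[Subthread]) -> list[Subthread]:
    """Order subthreads for display, highest effective score first."""
    return sorted(threads, key=score, reverse=True)
```

Under this toy model, a heavily upvoted subthread carrying two flags (say, generic and meta) can rank below a lightly upvoted but unflagged one—which is the "countervailing mechanism" effect: heavy upvoting alone no longer keeps a subthread pinned to the top.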




Have you considered transparency for when you down-weight a specific post or thread?


I don't see a reason to treat that type of moderation decision separately from others, so your question amounts to whether we'd publish a complete moderation log. We've certainly considered that, but I don't think it would be a good idea—the fear is that it would end up just leading to a lot more procedural questions from a small minority of users, which would take tons of time to answer, without doing anything to benefit the majority of the community.

Instead, the approach we take to transparency is to be willing to answer specific questions when people have them. It seems to work fairly well. Not everyone is satisfied with that, but most users are, and of the ones who aren't, a nontrivial portion wouldn't be satisfied with a complete moderation log either.



