
Oh, I didn't realize (and if you said it, sorry I missed it) that you were talking about comments rather than submissions. We rate-limit accounts when they have a history of posting too many unsubstantive comments too quickly, and/or getting involved in flamewars. I know it's annoying and a crude tool, but it's one of the few ways we have to address that problem in software. If I knew a less rude way to do it, I'd love to replace it. The overwhelming majority of these cases, though, are ones in which accounts really were abusing the site.
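
For illustration only, here is a minimal sketch of what a history-based rate limit of this kind might look like. This is a hypothetical, not HN's actual implementation; the thresholds and names (POST_LIMIT, WINDOW_SECONDS, CommentRateLimiter, and so on) are assumptions.

    # Hypothetical illustration only -- not HN's actual implementation.
    # Throttle an account once its recent comment history crosses a threshold.
    import time
    from collections import deque

    POST_LIMIT = 5          # assumed: max comments allowed per window
    WINDOW_SECONDS = 3600   # assumed: sliding one-hour window
    COOLDOWN_SECONDS = 600  # assumed: minimum wait between comments once limited

    class CommentRateLimiter:
        def __init__(self):
            self.recent_comment_times = {}   # account id -> deque of timestamps
            self.rate_limited = set()        # accounts currently under the penalty

        def allow_comment(self, account_id, now=None):
            now = time.time() if now is None else now
            times = self.recent_comment_times.setdefault(account_id, deque())
            # Drop timestamps that have fallen outside the sliding window.
            while times and now - times[0] > WINDOW_SECONDS:
                times.popleft()
            if account_id in self.rate_limited:
                # Penalized accounts must wait out a cooldown between comments.
                if times and now - times[-1] < COOLDOWN_SECONDS:
                    return False
            elif len(times) >= POST_LIMIT:
                # Too many comments too quickly: apply the penalty.
                self.rate_limited.add(account_id)
                return False
            times.append(now)
            return True

        def remove_penalty(self, account_id):
            # What happens when moderators lift the limit after an email.
            self.rate_limited.discard(account_id)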

We're happy to remove rate-limiting if people email us and give us reason to believe that they won't do those things in the future. Moreover, we take the penalty off accounts when we notice that they've been contributing solidly to HN for a while, as opposed to whatever they did earlier to reduce signal/noise ratio. Not that we catch every case of that.




Late to comment here, but I wanted to throw in something I haven't seen in the article or in the comments I've scanned.

I appreciate the virtue of trying to carve out a space on the internet for a forum that is as polite as a Tibetan monastery. I do.

However, I don't think that is a realistic goal when there is so much fake news and misinformation floating around the world, and there are bad actors looking for every opportunity to spread misinformation into legitimate channels like HN in order to further their particular narratives.

Being patient and polite is one method to deal with misinformation, but a skilled actor is adept at spreading the misinformation while being equally polite and dragging out discussions to the point of attrition.

Unfortunately the Tibetan monastery falls apart when a bad actor like China decides to intentionally take advantage of these polite rules of discourse through subtle manipulation via misinformation, institutionalism, and other means to influence/protect a status quo with false narratives.

It is unfortunate that the HN rules value politeness, tolerance, and patience above eradicating misinformation and ignorance. Bad actors will intentionally take advantage of Tibetan and westernized rules to their own benefit.

We should not restrain ourselves in discourse with one hand tied behind our back when we encounter parties that spread misinformation and perpetuate more ignorance. Identifying, in a firm, direct, and fair manner, individuals who are being less than honest is more constructive than allowing the charade to continue. Sometimes those comments are flagged as inflammatory or as offending the individual spreading the bad information, because their poorly informed ideas are under attack (rightfully so).

We can't protect ignorance. All we can do is act with good intentions in exchanging information. When that exchange is repeatedly abused to ignore facts or spread misinformation, we must act instead of waiting for good intentions to reveal themselves (a bad actor has no intention of changing), while in the meantime hundreds or thousands of people have read and latched onto their misguided theories.

If the individual is being above board, the facts will come to light and the situation is usually self-resolving. If the individual cannot defend their position, that is a good indication that the HN community is perhaps better off without that individual.

You may say that we should strive to create a culture of politeness and respect. I agree in so far as we must then come to terms with the fact that culturally, deception and dishonesty are also taken as being impolite and disrespectful—which presents a bit of a conundrum if we are paying close attention to our virtues.

Random downvotes without comments say you can’t think of a good response or reason to support your opinions.


There are a number of problems with this, but the main one is that people's perceptions of dishonesty, disingenuousness, and manipulation in others are terribly exaggerated. Odds are that the other person simply disagrees with you—and if their view seems obviously outrageous, wrong, or stupid, this is because people are much more divided than we realize. Disingenuousness exists, but the assumption of disingenuousness in others is nearly always wrong and comes out of a failure to understand how different someone else's experience is.

Users are much too quick to reach for explanations like "you must be a foreign agent" when even the public record of the other user's comments—let alone the private data we look at—show that to be trivially unlikely. Foreign agents exist, of course, but foreign agents as an explanatory device for things one finds provocative online is, to a first approximation, a fiction. Same for astroturfing, shills, and the other things users accuse each other of in arguments.

That doesn't mean ignoring the possibility of manipulation—it just means that we should look for evidence. I can tell you that when we look for evidence, we basically never find it. Even if we're being fooled by clever manipulation in some cases, it's painfully clear that in the overwhelming majority, there's no there there. What there is, is people reaching for 'disingenuousness' as a simple explanation for what they find painful and offensive. That assumption blocks any solution. There's no way to resolve pain and offense without recognizing the experiences of the other side.

You mention China. I can tell you that all the flamewars I've seen about China since they started blazing in the last year or so have been examples of what I've said here.


China was chosen as the example “agent” in this case because it is a nation emblematic of manipulation (currency manipulation, IP theft, distortion of free markets, etc.). China was also chosen for the historical imagery: forcing the Dalai Lama to flee while seizing a peaceful, sovereign Tibetan territory in 1959, killing 87,000 directly in the conflict, with another 430,000 dying in the ensuing occupation.

That is to say, China was chosen for reasons beyond the literal, obvious example of state agents. I know from experience that state actors are rare, and rarely detected. My objection is that latching onto Chinese or Russian state agents as the only concerning spreaders of misinformation is not accurate. Somewhat more common are paid or unpaid social media shills and trolls who run many generalized accounts to forcefully influence topics.

But in my experience, misinformation is spread most by those with vested interests, deep misunderstandings of the world, strong attachments to personal biases they merely believe and never bother to confirm or disprove empirically, reductionist oversimplifications and un-nuanced tidbits learned from an introductory course on some topic, and so on.

These individuals overwhelmingly choose to ignore facts presented to them and espouse their incorrect views (perhaps hiding behind politeness, qualifications, or institutional authority). The view that such an individual will eventually, with enough patience, see the truth is rather untenable.

I’m afraid you have latched onto but one example, possibly missing its purpose as imagery in the comment, and neglected to consider the other, fairly common examples of bad information and bad actors that spread damaging tropes (which, while inaccurate, continue to circulate nonetheless).


You've made this so general that it applies to literally everyone in every contentious argument. We all have vested interests, deep misunderstandings, strong attachments, personal biases, oversimplifications, and everything else you mention. So in that sense, yes: there's falsehood and misinformation all over the place. But I don't think seeing others as the problem is going to get us very far; seeing others as the problem largely is the problem.


That really doesn’t get to the heart of the issue, I’m afraid. It seems to offer a form of cover and protection for views that are known to be problematic and that continue to spread. Partisanship and hyper-focus on views tied to your identity are bad; we should not provide cover for them, nor for partisan views unsupported by facts, via polite discourse. We should not allow these poorly supported views to spread. It seems HN has no actual stance or response that directly addresses this issue. Some of HN’s guidelines provide shelter for unsupported views. Choosing to focus the light inwards on yourself doesn’t really solve or address this issue in the modern age of misinformation, and the Tibetan analogy was chosen to illustrate the danger of turning inward too much and ignoring outside threats. Does that make sense? I think I’m being fairly direct and specific here.


Of course people post "unsupported views". We don't ban users for being wrong—who would be left if we did?—and we don't have a truth machine.

When people argue like this, in my experience, what they mostly want is for us to ban the views they disagree with. We can't do that. Running a complex community like HN is nowhere near that simple.


I agree that simply being mistaken is not a good reason to ban someone. But I have seen users flag one another for no reason other than that they dislike having their views called incorrect, or dishonest when the behavior is repeated again and again. That is rather absurd and provides shelter for ignorance. I’m sure those people are very intelligent in their actual area of expertise, but their immaturity shows through in other areas, and it’s rather toxic to witness a mod stepping in to say that their pride has higher priority than letting someone directly confront their ignorance with facts and truth. Yes, this is the internet, and we all have better things we could do with our time than debate with strangers. However, this is also one of the most intellectual and influential havens for discussing tech and nearly anything else found to be interesting.

Modern times have also brought about a host of new issues where technology can be both a benefit and a detriment to society. The rapid spread of misinformation is a major technological and social issue. How are we going to navigate a new era in which things are becoming more complex, conflicts are increasing, and people are becoming more partisan and wrongly reinforced, because technology and modern life make it very easy to filter out the inconvenient facts they would rather not be confronted with?

My point is that we should not be assisting in the enablement of misinformation. Being a hotbed for powerful people and powerful ideas, HN carries a certain amount of responsibility for preventing the spread of misinformation. Rules of discourse that prevent resolution are, I believe, harmful rather than helpful at HN, and they enable the spread of misinformation.

Alternatively, those who don’t like an atmosphere where less-than-well-informed views are actually challenged may very well choose to leave of their own accord, and they will no longer be spreading misinformation here. Bans are probably not needed at all, really; we just shouldn’t be enabling it.


> Rules of discourse that prevent resolution are, I believe, harmful rather than helpful at HN, and they enable the spread of misinformation.

Very well put. This is my biggest concern as well. HN mods prevent resolution by punishing participants in back-and-forth discussions for being part of a “flame war”. It is an incredibly coarse and un-nuanced view of debate.


Most back-and-forth discussion here doesn't get moderated. The ones that do are not the kind that usually end in resolution.


I think each account should show its history of being rate limited, and the mod who initiates it should cite specific comments that were used as evidence of wrongdoing, as well as describe the nature of the wrongdoing.

The mechanism itself isn't necessarily a problem; it's the arbitrariness of it and the lack of accountability. Most people have some degree of accountability in their job. I think HN mods are an exception.

> whatever they did earlier

It would be impossible to audit whether this is being done judiciously or fairly without a page listing all such moderations, their context, etc.
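
Purely as a hypothetical sketch (nothing like this exists on HN today), the record behind such a page might look something like the following; the field names and the render_log helper are invented for illustration.

    # Hypothetical sketch of a public moderation-log entry -- not an existing HN feature.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class ModerationAction:
        account: str              # account the action applies to
        action: str               # e.g. "rate-limit", "retitle", "flag"
        reason: str               # moderator's description of the wrongdoing
        evidence_urls: List[str]  # links to the specific comments cited
        moderator: str            # who initiated the action
        timestamp: datetime = field(default_factory=datetime.utcnow)

    def render_log(actions: List[ModerationAction]) -> str:
        # Render a plain-text page listing every action with its context.
        lines = []
        for a in sorted(actions, key=lambda x: x.timestamp, reverse=True):
            lines.append(f"{a.timestamp:%Y-%m-%d} {a.action} on {a.account} "
                         f"by {a.moderator}: {a.reason}")
            lines.extend(f"  evidence: {url}" for url in a.evidence_urls)
        return "\n".join(lines)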

> whatever they did earlier to reduce signal/noise ratio

I'd argue that moderation itself reduces the s/n ratio. If I notice a pattern where one user continually posts low-quality comments, I'll be inclined to ignore or downvote that user. If the user gets throttled, that removes my ability to notice the pattern.

Similarly, if stories are re-titled (a common abuse of moderation), I may not realize I've already read the discussion or the linked content, and I may read or click it again.

Worse yet, re-titling submissions often removes any clue about what made the submission interesting. Ironically, the moderation practice of titling the HN submission with the article's own title introduces more click-baity titles into HN than submitters' tactics would.

No offense is intended by my feedback. I do think the moderators have a few pretty glaring blind spots and I am hoping that my feedback is well received.


> Most people have some degree of accountability in their job. I think HN mods are an exception.

I might have thought that too before working with a community this large, but the degree to which we're accountable is much more intense than anything I've experienced in a job before. When every misstep is met with instant outrage and hard pushback, you learn to adapt to feedback quickly.

People think we control HN, and to some extent we do, but we are controlled by HN to an even greater extent. This is maybe the most important thing for understanding how HN works. HN consists of a big system (the community) and a little system (the moderators) and the two interact via reciprocal feedback.

There's a third system too (the software), but I left it out for simplicity.



