Tell HN: Uptick in mean-spirited comments
155 points by mjfern on Oct 31, 2010 | 278 comments
I've been visiting Hacker News for just under two years, and recently I've noticed a significant uptick in mean-spirited comments. This echoes a recent comment by PG, "There are more dumb and/or mean comments than there used to be..."

As you are commenting, or up-voting and down-voting comments, please be conscious of how comments might affect others. This is a fantastic community of incredibly bright people. We should be able to engage in thoughtful debate and discussion, while remaining friendly and polite.

Thank you everyone!

Incidentally, if anyone has any suggestions for technical changes/features/tricks that would help fix the problem, I'm all ears. Fending off the decline everyone thinks is inevitable for forums is the main thing I work on. It has been since practically the beginning.

I believe that the less anonymous a forum like this is, the more polite it will be. You have said you want people to speak to each other here as they would in the real world: well, I believe the more strongly you tie their real world identity to their account here, the more polite they would be.

My suggestion, similar to that past experiment and to Twitter: add a "verified user" badge to the user's profile. Becoming verified, however, requires a simple step: adding a link to your Facebook, LinkedIn, or Xing profile. Any already-verified user can then verify you.

Non-verified users then have a very minor voting hit, for example a comment karma cap at 50 or so.

This will encourage people to associate their real life accounts with their hacker news accounts, and will result in conversation that more accurately reflects what one would say in real life.

The fundamental dynamics of the site would not change, since the advantages of being verified are not that major.
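The cap itself would be trivial to implement. A minimal sketch of what I mean, assuming hypothetical `karma` and `verified` fields (the names are illustrative, not HN's actual schema):

```python
# Sketch of the proposed cap for non-verified users. The field names
# and the cap value are assumptions taken from the proposal above.

UNVERIFIED_KARMA_CAP = 50  # the "50 or so" suggested above

def effective_karma(karma: int, verified: bool) -> int:
    """Return the karma that actually counts toward site privileges."""
    if verified:
        return karma
    # Non-verified accounts take a minor hit: karma is capped.
    return min(karma, UNVERIFIED_KARMA_CAP)
```

So a prolific anonymous account still participates fully; it just plateaus until its owner chooses to link a real-world profile.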

I'd like to register my strong disagreement with this notion.

While accountability has many benefits, I believe our world doesn't allow any feasible way to discuss many important topics without anonymity— especially in regard to business or our jobs. Think of the trouble that can come from discussing salaries, and don't even bother going into what sort of trouble opinions on drug policy can bring people.

While real name accountability can bring a lot of gravity to a discussion, I feel our current social order will give certain people unwarranted power under their real name while forcing others to censor themselves entirely.

Don't you get it? Anonymous and non-anonymous comments have equal ranking. The point of my system is just to encourage most people to be non-anonymous. This means that the vast majority of comments will stay civil, and when most comments are civil, the tone will tend to stay that way. If you need to make commentary on drug policy or your salary, then simply make an anonymous account and say all you want to say. There is no penalty to that.

I somewhat agree, but you're neglecting the more interesting point: Anonymous posting makes trolling easier, yes, but it also allows for a highly fascinating form of discussion - one that is merit-blind[1].

Right now, certain posters carry a lot of weight around here, mainly because we know what they have achieved in their careers. Inevitably, there will be lots of up-voting, lots of agreement, lots of influence regardless of what they actually say. Introducing real names would exacerbate this effect. I enjoy the fact that sometimes, you'll find a renowned Silicon Valley person arguing with an enthusiastic college freshman about a technical/ethical/... topic. I suspect that's only possible if they're called "johnny32" and "jimmy21".

Display their real names, and the college kid will think twice before he/she answers. Anonymity encourages trolling, yes. It also liberates discussion.

[1] Of course, I'm talking about "external merit." The karma system is wonderful, maybe even essential. But I don't think that non-HN-related achievements should directly affect how your comment is received. Postings need to be effective on their own, not in virtue of who wrote them. (Obviously, if you're head of engineering at Facebook and talk about scaling, please say so! This does affect how your comment is and should be received. But as a default? No.)

That's of course an interesting point, but I don't think it would significantly change the way discussions are currently held, because most people would not go clicking through every account before answering. See, for example, the cperciva Putnam thread.

And when the credentials of the person are in fact known, it would only raise the bar for the discussion. If, for example, some famous lawyer makes a long comment about the various legal aspects of a particular incident, and most people know it's a famous lawyer, then the replies that disagree will come from people who are qualified to disagree, thus raising the bar. If johnny12 makes the same statement, a lot of people with very little knowledge would argue, and it would lead to a poorer and longer discussion.

Discussion can indeed be liberated by making comments anonymous, but this does not scale. You are going to lose either way: either you lose by making the society increasingly formal, or you lose by the society becoming less formal, which happens automatically as it grows. Working towards making it formal will only counteract the creeping informality, maintaining an equilibrium, in my opinion.

The problem of big names dominating the discussion is a different problem, but it's a better problem than that of nobodies shouting down everyone else.

> Don't you get it?

That was a fine example of what people have been complaining about recently.

(And you're not even anonymous! :-)

Ah, to be fair, neither one of us took the time to fully explain our understanding of each other's points, and I sort of attacked with the disagreement phrasing when I could simply have asked him what he thought of my issues.

I haven't found anonymity to be very well correlated with politeness on tech discussion forums I'm on. Some of the most abrasive ones are filled with people posting under their real names, and folks like rms, Theo De Raadt, Ulrich Drepper, and a bunch of the comp.lang.lisp crowd have no problem flaming you under their real names.

But that's real life. The goal is to make the conversation here equivalent to how people talk in real life, not be more polite than that.

In a way I understand what you're saying, but I feel our environment influences how we interact with each other-- anywhere. For example, rules can be added to give structure to a debate. The whole debate format, while artificial, exists to increase the utility of 'real life' discussion.

It seems to me anything we can do to boost the interesting aspects of this community are fair game, and they augment real life itself.

Seems unnecessary, considering some of the most thoughtful discussions come from throw-away accounts doing Tell/Ask HN. It could also exacerbate other issues: http://en.wikipedia.org/wiki/Argument_from_authority

[i.e. comment upvotes co-vary with the name of the commenter instead of the content communicated]

That's why I believe there should be no real disadvantage to using an anonymous account, and no way of telling in the comments if a comment is anonymous or not. All it does is encourage people to link their real life account in a behind-the-scenes manner, which will raise the general discourse level.

> which will raise the general discourse level

On one hand you want to avoid trolls, but on the other hand you want to avoid a circle-jerk and "my boss can google everything I wrote" timidity and un-hackerish conformism. Tending toward either extreme can disincentivize thoughtful discussion and negatively impact a community. So I would be hesitant to declare that a safe assumption.

That's awfully rich coming from someone posting under a carefully-maintained pseudonym.

I was going to ask for some kind of evidence, but it seems I missed this episode: http://news.ycombinator.com/item?id=1353050

So yeah, on the face of it that's some quite scummy behaviour.

That's quite a negative word about someone else on a thread about negative comments.

It's accurate though. Hopefully more people will take your comments about real world identities in context.

This is the same name as is on my id card, and has been for years now.

Out of friendly curiosity - what led you to change your official name? What are the steps to do that? (I have an awkward last name and am considering it).

It seems like you even changed your first name. How did it feel, how long did it take until your new name felt natural?

Changing your last name is a reasonably straightforward process in most countries, because women usually do it when they get married. The first name is a bit more difficult, but it can be changed if you have a reasonable justification.

As an anonymous user I actually kind of like your idea, but can you clarify whether you propose that being "verified" would mean your identity is available to everyone or just to other verified users?

("I was walking home one night and a guy hammering on a roof called me a paranoid little weirdo. In morse code." - Emo Philips)

It would be available to everyone and google, since the point is that what you say is forever linked to you on the internet.

> and google

> forever linked to you on the internet

I'm happy for you as a respected user of this site to know who I am. Google - forever and ever - not so much.

As far as comment quality goes, you might cut out some of the crap, but you may also get more sterile discussions from good citizens such as myself. My hunch is that some people just have an attitude problem, and will be a detriment to HN with or without their identities on display.

> I'm happy for you as a respected user of this site to know who I am. Google - forever and ever - not so much.

Unfortunately, you can't get the first without the second, as witness the leak of the "journolist" e-mails.

Agreed. If you don't want your presence to be known on the Internet, there's a very simple way to accomplish that.

Don't post, don't publish, don't comment. Simple as that.

Some time ago another person made a comment about pearls and some sort of livestock. Throwing...things...in places?

Yeah, even posting anonymously on the internet your identity could probably still be found, given someone sufficiently motivated to find it.

It was interesting: maybe a year back, someone posted anonymously and we then had a thread of people trying to work out who it was, knowing it was someone who had commented a decent amount before. People were applying algorithms to attempt to work it out; no idea if anyone ever did, and I can't find the thread.

I think there is a distinction between saying stupid things, contributing to sensitive topics, and outright trolling which you're ignoring here. All three of them are things that I wouldn't want to be part of my permanent record, but:

* Saying stupid things (Software X doesn't support Feature Y, which is good if you want to do Activity Z) and then be corrected (because in the meantime, Software X got a Feature Y for the latest version) provides overall value to the discussion because other people are likely to have made the same mistake that I did.

* Contributing to sensitive topics (be it gender equality, the question of "race", or any political question) will always send the goon squads your way whenever you voice a non-majority opinion. Admittedly, HN has a policy that these contentious topics are not great material for posting, but one comes up once in a while, and it's hard to make an informative contribution to such an issue without drawing someone's ill will (which results in downvotes for an anonymous HN poster, but could be genuinely harmful if those trying-to-be-well-informed-but-not-necessarily-popular opinions ended up on your permanent record). As an example, consider the question of whether founders can raise kids at the same time. There are people who do a good enough job at it to step up and say "I can do it just fine," and people who can claim anything because everyone will listen to them anyway; anyone in between runs the danger of drawing ill will from one side or the other for claiming that it's possible, or not so possible.

* Outright trolling is something you would not want in the comments, but there's a very fine line between a pointed remark that someone made, and trolling. And as with blog posts, successful trolls are still successful when they are not anonymous.

I think one of the problems here is that, when the audience grows large enough, a pointed post will draw a larger number of upvotes than a boring-but-informative one. So it may be useful to partially decouple the comment-ranking function of up-/downvotes from the karma calculation, to disincentivize karma whoring.

For example, one could reduce the weight of comment up-/downvotes whenever the total number of (comments+upvotes+downvotes) of an article crosses a certain threshold - say 100. In this way, having a +20 voted comment on an article that barely made it to the frontpage would count more than a +60 voted comment on an incendiary topic that draws a total of several hundred up/downvotes.
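A minimal sketch of that weighting. The exact decay shape is my assumption; the proposal above only fixes the threshold at roughly 100:

```python
# Illustrative sketch of the proposed decoupling. The proportional
# decay past the threshold is an assumption; only the threshold value
# ("say 100") comes from the comment above.

ACTIVITY_THRESHOLD = 100  # comments + upvotes + downvotes on the article

def karma_weight(article_activity: int) -> float:
    """Weight applied to a comment's net votes when computing karma.

    Full weight below the threshold; above it, the weight shrinks in
    proportion to how far past the threshold the article's activity is.
    """
    if article_activity <= ACTIVITY_THRESHOLD:
        return 1.0
    return ACTIVITY_THRESHOLD / article_activity

def karma_from_comment(net_votes: int, article_activity: int) -> float:
    return net_votes * karma_weight(article_activity)

# A +20 comment on a quiet article earns 20 karma, while a +60 comment
# on an incendiary thread with 600 total actions earns only 10.
print(karma_from_comment(20, 100))  # 20.0
print(karma_from_comment(60, 600))  # 10.0
```

Comment ranking within the thread would still use the raw vote totals; only the karma contribution is dampened.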

As an aside, is there any particular reason that the comment header (containing the commenter's username and other details) was designed to be grey and of the same text size as the actual comment body?

I think we can say that standard forums are much more 'scannable' than HN, and I often find myself looking out for posts from users whom I respect on a particular forum. In this case, putting the username before the points of the comment may lead people to view the comment for the commenter first and foremost, as opposed to the number of points the comment has.

I don't know whether or not that is a good thing. For one, it makes negativity very obvious, and it makes recognising a bad commenter pretty easy.

On the other hand, it may give a free-first pass to 'respected' commenters who might make mean comments every once in a while.

On the whole though, I think it may make it easier for your average user to recognise and avoid mean comments when they do see them.

Another possibility may be to shift the position and/or size of the upvote/downvote buttons to make upvoting a more conscious decision. I've got no suggestions about how to do this yet though =(.

One thing I personally like about HN is that, in general, I can read the comment and form an opinion about it before I read who posted it. If the username was 'in your face' it would be harder to not let that colour your reading.

For example, there are several prominent members here who post very reasonable and thoughtful comments that I often agree with, but sometimes disagree with. I think it would be harder to judge their comment on its merits alone if it was very visible that it was <prominent user> who posted it.

The username being unemphasized is quite intentional - the reasoning being: who cares who said something? If it's right, it's right; if it's wrong, it's wrong.

"to become verified requires a simple step: add a link to your facebook, linkedin or xing profile ... Non-verified users then have a very minor voting hit"

Why use some third-party proprietary system? It's almost as easy to make a fake Facebook account as an HN account. As far as "verifying" users goes, just look at their About page and previous comments.

I disagree. I don't use any of the social networking sites. It would be a bit crude to assume that the only way for me to be verified is belonging to a social-networking site.

Please, no social networking verified user.

I use my real name to keep myself accountable. It's not my full name, but it's enough for anybody who is interested to narrow it down in a few minutes.

Downvoted comments are bad for the community. They are literally negative. Remove downvotes. Anything that truly deserves downvotes probably deserves flagging/deading, too.

Downvotes change how you think about reading comments. Instead of "was this helpful?" you're thinking "do I love this or hate this?". It's polarizing, and as the number of users increases, it's easy for a comment to hit -4 in a couple of minutes.

People feel bad when they get downvoted (often because they feel the downvotes are unjust), and some of them turn to trolling or flaming as a defense mechanism. This spreads across threads and infects the whole community. Emotions are contagious.

Also, add the ability to undo your upvote.

I've often thought of it. The reason I had downvotes originally was to deter people from saying things that were mean or stupid. But maybe the right way to deal with that is a separate mechanism from voting, as it is with stories. Maybe the right combo for comments is an uparrow, plus a flag link, and encourage users to flag comments that are mean or stupid.

I'll think about the real solution more after I'm done reading applications, but I'll increase the downvote threshold to 500 now.

We had a saying in physics: "there are no stupid questions, (only stupid people)". We encouraged people to ask stupid questions, because (a) quite often that's the only way you learn, (b) it encourages shy people to speak up, and, (c) sometimes a stupid question isn't stupid at all, and leads to a deeper insight.

I'm a little disappointed that HN isn't like this. Most of the time I have a "stupid question", but I don't comment due to HN's disapproval of stupid comments. I self-censor to the extreme with HN, and I don't think I'm the only one.

Maybe I'm in a minority here, but I'd like to see mean comments flagged/deaded. And stupid comments filtered to the bottom (but not otherwise punished).

"Stupid" comments are one thing, "stupid" questions are another. With "stupid" questions you at least demonstrate a willingness to learn and not just spew political talking points, biases, etc.

I self-censor because of this and think we shouldn't mind. I end up writing comments I don't submit, but I consider the exercise worthwhile because forcing myself to write my thoughts out and then judging whether they're truly helpful/insightful teaches me what knowledge gaps I need to fill eventually and where I'm just falling back on preconceived notions/biases to form my opinions.

So if you think you'll be down-voted, ask yourself why. If you don't know why, then re-formulate your post as a question to understand others' reasoning.

On second thoughts, I have to agree with you re the differences between "stupid" comments and "stupid" questions. Good point.

Can you clarify the kinds of self-censoring you're doing?

Some of it is really a very good thing -- namely, "am I asking a question that would be trivial for me to answer for myself?"

There are plenty of "stupid" questions that are important to ask of yourself (and are part of the learning process for all of us) but need not clutter up a good discussion.

If everyone self-censors by asking "will this increase the value of this discussion for other readers?", I think that's a good thing.

I think you're talking about self-censoring based on "will this comment reveal my ignorance and harm my karma?", which isn't the same thing. A question or two along the lines of "I don't know much about this, but it's interesting and seemingly not well-covered online -- does anyone want to give a quick high-level explanation?" are normally welcomed.

Be polite, be on topic, don't ask something that's trivial to answer for yourself, and consider the value of the question to other readers... and you don't have to be already a wizard on the given topic to discuss it.

I think self censoring is largely a positive - it means people will actually take the time to think through their post, rather than just responding emotively. Whilst I'm sure there are times that valuable comments are lost because people are too cautious, I think the overall effect on signal-to-noise ratio is hugely, hugely beneficial.

But "in physics" sounds like a university class where people are paying money to learn.

Is Hacker News about teaching people things? Or does it have some other goal?

Agreed. Punishing people who make stupid comments seems like it causes resentment and anger and thus more stupid comments.

I wonder if there a parallel with the whole punishment vs rehabilitation aspect of the justice systems. In essence, a justice system is a social tool to cause people to act a certain way. Some techniques are better than others.

I think the problem lies in that the downvote has two distinct purposes:

1) People down vote to express disagreement.

2) People down vote to eliminate value-less comments.

There needs to be a separate function for each of these. Perhaps keep the current down arrow for disagreement, and add some sort of irrelevance flag which wouldn't impact karma but would still make the comment fade or eventually disappear.

I typically approach down or upvotes to my comments as (dis)agreement.

What is important to me is furthering the discussion if someone downvotes me for disagreement.

Requiring a dissenting opinion with a disagreement downvote would not only satisfy my wanting to hear the objections to my comments, but it would also require the downvoter to have a good dissenting opinion (lest his post be flagged as valueless/irrelevant).

Requiring a dissenting reply with a downvote is for my own personal reasons, but I do think there needs to be a distinction between disagreement downvotes and irrelevance/valueless downvotes.

I've had my comments republished in your monthly but I rarely comment on HN anymore due to the negativity. Case in point:


We've gained a few trolls but I don't think we've suffered from evaporative cooling yet so it's not too late. I support removing down voting and switching to flagging specific offenses e.g. spam|offensive|troll to start.

Alternatively [just a crazy thought] how about we open up and show what each user has down voted. Perhaps as a courtesy we could only show down-votes starting Halloween 2010.

It looks like that was a case of piling on. There may be stuff I can do to mitigate that specifically. BTW, it's not our monthly; Hacker Monthly is someone else's project.

> ...but I don't think we've suffered from evaporative cooling yet...

If we assume that a user's total score is a good indicator of their value to the community, then the HN leaderboard would agree with you.

I'm not so sure, though. IIRC, pg has mentioned in the past that he spends less time here now, and I rarely see anything from Reddit's founding user group. Another HN user that I've been following for a while -- DaniFong -- who's doing some pretty interesting stuff and has a tendency to post fairly intelligent comments, has had a really obvious decline in activity here.

Although I've been a part of this problem as often as not, I've also found that the discourse here isn't worth participating in any more.

But, maybe there are alternative explanations for all of that.

I think of downvotes as the gong on the gong show. One can follow all the established rules of politeness and still negatively affect the atmosphere of the site. The downvote is crucial for dealing with bores, stupid people and obnoxious comments which fit into the discussion guidelines, but are still obvious trolls.

I'm relatively new to this community, but I've thought for a while that the problem with any points-based system is its inherently competitive nature. One of the interesting things I've learned in improv classes is the "Yes, and..." structure that performers use to build on each other's performances to create something bigger and better.[1] Structurally this may be as simple as implementing a mandatory "Yes, and..." string at the beginning of each comment. Maybe it is only displayed as the user is writing the comment, but it could be a simple, effective nudge to be more constructive, used in conjunction with the existing points system.


Yes, and what should we do if someone posts something that is not constructive or that cannot otherwise be built upon?

You're right, this is strictly an attempt at prevention, not a cure. But it might be interesting to attack a problem like this on multiple fronts. Right now, however, most of the focus seems to be on the "cure" side of things.

What I'm curious about is what sort of data is available on these negative comments. If we were to identify (and agree on) a large enough sample population (n>30, probably), could we start seeing any trends in terms of downvotes, account age, posting frequency, etc.? Do any of those correlate with a sample population of comments we identify as positive?

Other half-formed ideas I have that may be interesting:

- mandatory cooling-off period (one to two weeks) before account activation

- analyzing downvotes by user and, once a threshold has been reached, putting them in a "time out"

- being able to block certain users from appearing in stories once you've logged in
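The correlation question could be prototyped with nothing but the standard library. A sketch with made-up sample data; every field name and number here is hypothetical:

```python
# Hedged sketch of the proposed analysis: given a labeled sample of
# comments, check whether attributes like account age correlate with
# negativity. Sample data is entirely invented for illustration.

from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no third-party deps."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each tuple: (account_age_days, posts_per_week, is_negative)
sample = [
    (30, 12, 1), (800, 3, 0), (15, 20, 1),
    (400, 5, 0), (60, 9, 1), (1000, 2, 0),
]
ages = [c[0] for c in sample]
negativity = [c[2] for c in sample]

# A negative coefficient here would suggest younger accounts skew meaner
# (in this toy data they do; real data would need n>30 and care).
print(pearson(ages, negativity))
```

Obviously a real study would need the actual vote/account data, which only the site operators have.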

I'm an advocate of mean answers, if they're well-placed. I've seen them work at kuro5hin.org; they're 'area effect' weapons against stupid posts (by example), stronger than just a downvote. The truth is, as a site grows, you really only want a subset of the readership posting, and you want the rest to think twice before doing it (that sounds bad, but if you want read-quality and not write-quantity, you've got to do something to make people pay attention). Now kuro5hin.org is pretty much dead, but I don't think that was the cause.

The trick is that we don't actually want to shame and anger people posting comments that don't fit into the HN ethos; we just want them to improve their comments, or wander off elsewhere without feeling too bad about it.

If someone has good ideas but is used to discussions on wilder forums, we want to educate them in a respectful way. If someone really doesn't belong here, we don't want to get their attention; we want to make HN as boring as possible to them, so they go away instead of posting a self-righteous rebuttal or seeking vengeance in some way.

On the subject of respectful education... it might be helpful to have some built-in explanatory text, shown only to the author next to downvoted posts, with the standard reasons for HN downvoting, and suitable responses for downvotes that seem unjustified.

Here is an idea: Instead of deleting "down vote", instead, add "yay!" and "boo!" links.

A user can see the "yay!" and "boo!" scores of any recent post or comment.

"Yay!" and "boo!" scores do not effect karma or otherwise effect HN privileges.

(The idea is to allow the exchange of "push a button" feedback that, in some sense, doesn't count. Up/down votes can still work as always.)

To be honest, since opinions with which I disagree often seem stupid, I see downvoting when there is disagreement as a natural reaction to a certain extent.

Philosophically, downvoting should impact the karma of the downvoter.

I propose: a "mean/troll" flag button in addition to the regular voting buttons for each comment. In addition, we need a mean/troll loserboard.

Agreed. I think downvoting starts a revenge cycle, and the methods that are in place to prevent revenge only make the revenge less direct.

For instance, you can't downvote replies to your comment, but you can go into their user history and downvote their other comments. This is worse, because their other comments quite literally have nothing to do with what would have been downvoted had that measure not been in place.

If someone views downvotes as punishment, then this engenders retaliation, and retaliation doesn't engender civil discussion.

Actually downvoting a bunch of someone's comments doesn't work. There has long been protection against that.

You can't down vote old comments, so their history is safe.

Downvotes can be used to stifle dissent, but are generally valuable.

I feel like 60-70% of the heavily downvoted comments I see are already trolling or flaming (or in some cases something dumb, like a 3-word reference to an internet meme). With the grayed out text, I think this is a fantastic visual cue about how not to act on HN.

(I could have just written this comment as "/disagree", and then watched it get appropriately downvoted, and learned my lesson)

Edit: I should also add that as a barely-contributing user, I don't have the ability to downvote. But I'd still like other people to be able to do it, to enforce the community's standards.

I'm embarrassed to say that I have accidentally upvoted many times. It always happens on a particularly egregious comment that distracts me while clicking.

We should not remove down votes outright. I've seen a comment get downvoted to -4 because people didn't like the commentary, but then shoot back up to +10 when it was shown that they were correct after all.

…but if we remove downvotes, they will just stay at 1 instead of being downvoted. Later they will still be able to shoot up. They will only get flagged if they are actively against the spirit of the community (spam, abuse, etc.).

The concept of "down" is conflated in people's minds in a way "up" isn't. It is used by people to mean either "I disagree" or "totally inappropriate". The first is not supposed to affect karma but the second is. If you simply replace "down" with "flag" it changes how people think about it.

As for accidental up-voting, moving the links so that there is separation would fix that.

Ah, ok, that makes a lot more sense. I was imagining removing comments that had been downvoted; not removing the link to downvote.

I generally agree with that, but it reminds me of another problem: the karma number is now an ever-growing and meaningless figure. I think it should be harder to improve your karma as you get higher, and easier to lose it, so that it actually has meaning.

ah.. Progressive taxation. You're a commie liberal aren't you. ;-) (don't worry, I am too)

Inflation may be a problem some day, but for now I think it's under control. The leader board is still at a human understandable scale: http://news.ycombinator.com/leaders

I agree with you. The only times I have wanted to downvote were to mark some comment as spam and/or troll. Given that we have a 'flag' link on comments, I don't see a need for downvote, except to undo an upvote.

Since you asked... I've always thought you should be a bit more rigorous about eliminating the sorts of "rah rah!" topics that get people fired up. That's not a technical solution, but it's not really a technical problem either.

What I mean by "rah rah" things includes political articles that make people think "right on!" Paul B's article on Proposition 19 falls into that category for me: I strongly agree with him.

But when I step back and look, I think that I don't want to read everyone's "rah rah" articles because they tend towards flame wars and, if opposing views are trampled to the point where those people go elsewhere, group think. I also think that, since they are topics anyone can have an opinion on, they are much more likely to draw people only marginally interested in the core topics of "hacker news". Those people in turn will submit more articles about politics. At a certain point a few days ago, between Prop 19, John Carmack's libertarianish views, and something else I don't recall, the top three spots on HN were occupied by articles that were essentially political in nature.

I would much prefer that articles like that one, much as I agree with it and think Paul B is an awesome guy and a great contributor here, were ruthlessly pruned from this site.

The proximity of an election is likely a factor in this.

That's a good point; hopefully it will subside afterwards.

There's always another election and another hot-button topic right around the corner.

What you're talking about is the primacy of emotional alignment vs intellectual improvement. That is, identifying with the emotional content (like you said - "RIGHT ON!" or "NO WAY!") vs reading content that causes emotions.

The former is, of course, the Twinkie of human growth. HN is becoming Twinkie-ville. Reddit is Twinkie-ville already, but it's Twinkie-ville with a sparkling sense of humor, which HN lacks (cultural difference).

Granted, we're all emotional creatures first and foremost -- false impressions of reason aside -- but… there's something that's dramatically changed in the last two months. Don't you think?

Even the non-emotional-alignment stuff is more boring. Boring things stay on the front page longer. Articles that would normally have captured the HN spirit languish.

What's going on?

I almost miss the "What makes an entrepreneur?" pr0n.

That's something I wish HN could find a middle ground on. I _like_ Reddit. It's fun, and if you don't like the conversation, you can hide it and move on.

On HN, I can chat with some of the most influential people in technology, and it's great. But the moment anyone cracks a joke? Downvoted, sometimes in to oblivion.

I find it hard to believe such a sharp group of people would be so averse to humor, but the votes seem to indicate that.

The reason seems to be that HNers feel like they have to project an ultra-logical persona at all times. Even Spock-like. How'd it get started? I wish I knew.

It's not just jokes. I've experimented: I can make the most insightful and practical comment in the whole thread, but if I use "cuz" or "gonna," or "man," or more than a single (maybe two) exclamation points in the whole thing, no upvotes for me! But the more pseudo-intellectual trappings I pile on -- for example, eschewing contractions -- the more votes I get.

Mentioning a logical fallacy, even if it is itself fallacious, always works a treat.

It's pretty silly.

I think that's why I've been spending more time back at Reddit than HN lately. They talk about the same stuff with the same depth, but they have more fun doing it.

Even Einstein wasn't afraid to stick his tongue out for the camera.

I've had humorous comments voted up here. IMO, the reason why is that they were actually funny in an original way, and perhaps not so blatantly, "ha ha" obvious to the casual observer as some of what passes for humor on reddit.


For example. I've seen others upvoted too, but it's mostly only for genuinely funny stuff that's not some tired, bedraggled "meme". The other thing is that it's nice not to have too much of it - people "clowning around" just for the sake of attracting votes.

Color me illiterate, but I couldn't find where on Reddit they talk about the same stuff. The startup subreddit looks dead and the programming reddit is quite different. I'd love a link or name!

Here you go: http://www.reddit.com/r/Entrepreneur/

As for other stuff, the stories you see on HN still end up there. Try to submit one and it'll point you to the story if it's been submitted. And if it hasn't been, anyone can submit stories.

Replace the total karma score by the username in the upper right of the page with the average that's already visible on the profile page.

An average is a better indicator to the user of how well they are engaging with the site. It only goes up if the user consistently posts worthwhile comments, which rewards being selective about when you post and thoughtful about what you post. Fewer and better is encouraged. Consistently good is encouraged. Lousy comments (like one-on-one arguments and "Me too!") are discouraged even if nobody votes them down.

This bridges the gap between two existing forms of feedback: total karma (which nobody really cares about, even though it's on every page) and the voting on individual comments (which is the closest thing we have to an indicator of comment quality), and gives the user continuous feedback on how much value they are adding to the site. It makes it much easier to know if your commenting is getting better or getting worse. That is, it is much easier to tell that my average is up or down from the last time I looked than it is to detect the fluctuation in the rate at which my karma accumulates.

Even if lousy comments get voted up--as happens when, for instance, someone first-posts an opinionated quip on a controversial submission and people swarm to vote in agreement--the impact on the user's average karma (which better reflects their value to the community) is much lower than the immediate spike in their total karma (the feedback they currently see). You don't have to stop people from writing such comments--just give them a smaller cookie when they do.

Obviously you can't make anybody alter their behavior over a number, but I think most here would if given a number that they could trust as representative of the shared goal of keeping this place from sucking.

tl;dr average karma is better feedback than total karma
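The arithmetic behind this proposal is easy to sketch. A toy illustration in Python — the data layout (a plain list of per-comment scores) is invented, since HN's real data model isn't public:

```python
def average_karma(comment_scores):
    """Average karma: mean net score per comment, rather than the running total.

    `comment_scores` is a hypothetical list of each comment's net votes;
    this is just an illustration of the metric, not HN's implementation.
    """
    if not comment_scores:
        return 0.0
    return sum(comment_scores) / len(comment_scores)

# A burst of mediocre comments raises total karma but drags the average down:
good = [8, 7, 9]            # few, consistently good comments
noisy = [8, 7, 9, 1, 1, 1]  # same comments plus three "Me too!" posts

print(sum(good), average_karma(good))    # total 24, average 8.0
print(sum(noisy), average_karma(noisy))  # total 27, average 4.5
```

The point of the example: total karma rewards the noisy user (27 > 24), while the average correctly rewards the selective one (8.0 > 4.5).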

Lousy comments (like one-on-one arguments and "Me too!") are discouraged even if nobody votes them down.

Comments which few people will notice are also discouraged. Most of my 1-point comments are the only comment on their respective articles, and many of those are 'Ask HN' posts. I don't want to find myself thinking "I could contribute here, but doing so will drag my average down".

The ideal measure would be number of upvotes of a comment divided by the number of views of the comment. That way you wouldn't get punished for making comments late in a thread or on a less well read post. Unfortunately measuring this statistic would require a lot of work to implement.

A lot of work, and it might even be impossible. If someone loads a page with 200 comments, how many of the comments do you judge that he has read?

That said, upvotes per page-load-including-the-comment is probably still better than the raw number of upvotes.

You could hook some javascript to the page's onscroll event. The javascript would send ajax calls back to the server that record which comments have actually been seen (although there is no way to judge if the comment was actually read). It would be a bunch of work though, because the raw log would be enormous and so you'd have to create a special batch processing job to get the view counts split out for each separate comment.

You could probably reasonably approximate this without too much effort.

The number of votes an article has and the number of comments are both going to be strongly correlated with the number of views per comment. So would the mean and range of the comment scores. You could also take the time since the comment was posted into account.

You could have the visible "raw score" for each comment and an invisible "quality adjusted score" that is used to give each user a quality rating.
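The approximation described above might look something like this — the weights and the heuristic are entirely made up for illustration, since the right values could only come from HN's actual logs:

```python
def estimate_views(story_votes, story_comments, hours_since_posted):
    """Hypothetical heuristic: busier, older stories expose their
    comments to more readers. The coefficients are invented."""
    return story_votes * 2 + story_comments * 3 + int(hours_since_posted * 10)

def quality_adjusted_score(upvotes, estimated_views):
    """Upvotes per estimated view — a rough proxy for comment quality
    that doesn't punish comments on quiet threads as much."""
    return upvotes / max(estimated_views, 1)

views = estimate_views(story_votes=100, story_comments=50, hours_since_posted=6.0)
print(views)                                        # 410
print(round(quality_adjusted_score(41, views), 2))  # 0.1
```

The visible "raw score" would stay as-is; only the hidden quality rating would use the adjusted number.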

It also encourages you to upvote people who are wrong so that more people will see it when you correct them. Similarly, it encourages people to upvote bad stories so that more people will see your comments.

Comments which few people will notice are also discouraged.

I think this is a feature, generally, but I get what you're saying. With any criteria you present for deciding what is worth posting you run the risk of it being the only criteria. Ideally, we'd all only care about our karma insofar as it helps us avoid making lousy posts, and ignore it entirely otherwise, but idealism doesn't scale.

"How will this influence my average?" is definitely an oversimplified heuristic for deciding what is worth contributing, but it's reasonably accurate and I think it's the best one we've got. (Better than total karma at the very least.)

As I said in the above comment, the voting on comments is the closest thing we have to an indication of their quality. It obviously doesn't always reflect reality, but what that points to is that some comments--the ones that we want around here but that would bring your average down--are undervalued. Those comments should result in positive feedback (via average karma or otherwise), but they don't.

That is to say: The site has a reward mechanism (karma). There is desirable behavior that is not being rewarded or is under-rewarded (e.g. replying to less popular Ask HN posts). Is there some way to reward that desirable behavior?

FWIW -- and it appears that pg has just implemented exactly this -- I decided to play a game with my "average" a while back, back when it was a reasonable value.

I found that it didn't improve my interaction with the site, it just changed the way that I "played the game". Not literally playing a game, but there was a clear difference in motivations for commenting.

For example: thread is more than 6 hours old? Don't comment, nobody will ever see it or upvote, so my average will get dragged down. Don't reply to any comment at the bottom of a long thread. Use short, witty comments, because those take the least amount of effort for others to upvote (I'm not good at making comments like that, though). Etc.

Interesting. What about a moving average? One that ranks you based on the last ~30 days of activity?

How about the ability to shut off karma for certain posts? One exception: if the post is down-voted by more than ~10 points, the user gets that subtracted from their score anyway, preventing abuse when posting mean/hateful comments.

what if it were average of: (comment score/parent score) ?

An average is a better indicator to the user of how well they are engaging with the site. It only goes up if the user consistently posts worthwhile comments, which rewards being selective about when you post and thoughtful about what you post. Fewer and better is encouraged. Consistently good is encouraged. Lousy comments (like one-on-one arguments and "Me too!") are discouraged even if nobody votes them down.

I disagree. I like it when people make a post because they have something to say to someone else. I like it when I reply to someone and they reply back and it starts a conversation. Maybe it leads to an interesting train of thought that I might not have otherwise come up with, maybe it does the same for the other person. I wouldn't like to see HN losing that.

And possibly like this post. I don't know how many other people will find it interesting or will agree with me. But, if I cared about average, I might be hesitant to make this post.

Let's hide all karma.

Let's keep the up/down voting arrows, and let the karma operate in the background in the usual way (percolating good comments/submissions to the top, and letting bad comments fall through). But let's just not show karma. Anywhere. Thus keep the quality of submissions/comments the same, and keep the eyeballs focused on what really matters: good articles and good comments.

I think this is the best approach, too, and is one of the tactics I'd decided to use when building a forum one day.

If we assume that any displayed score is going to trigger some amount of gaming motivation in most people, then displaying scores of any kind contributes to a rise in game behavior versus social behavior.

This is really the best solution because it creates disincentive for upvoting "on the bandwagon," as so often occurs.

Extending this idea a bit, add a second statistic which indicates what percentile in the overall HN bell curve you fall into.

For the sake of example, imagine an evenly distributed bell curve. The high end averages 10 karma per post, and the bottom end 0 karma per post. If your average is a 6, you'll be in the 60th percentile. A little "60%" could be displayed next to your username.

This allows you to quickly compare your level of contributions to those of others.
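A minimal sketch of the percentile idea, using strict "below" to match the 6-out-of-10 → 60% example above (percentile definitions vary; the population list is a toy):

```python
def percentile_rank(my_average, all_averages):
    """Percentage of users whose average karma is strictly below mine.

    `all_averages` stands in for a site-wide list of per-user averages;
    a sketch of the idea, not how HN computes anything.
    """
    below = sum(1 for a in all_averages if a < my_average)
    return round(100 * below / len(all_averages))

averages = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]  # toy, evenly spread population
print(percentile_rank(6, averages))  # 60
```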

A lot of people probably like logging in and instantly seeing their 'karma results' from whatever they recently posted... but I guess that's the kind of addictive behaviour we'd probably be better off without (in terms of spending too much time on HN), even if the change causes some irritation.

Looking at the leaderboard, the relationship between high ranking hn members and average comment quality seems fairly weak to me:


This is probably due to the fact that some people spend a lot of time commenting on threads that don't get much viewership. I doubt there's much harm in showing the average as a reminder to try to keep the quality high, but I think there are probably better ways to encourage it.

Ok, let's try that for a while.

Is there any way I can see other people's average karma?

The individual users pages (except my own) still seem to display total karma.

The leaderboard is still ranked by total karma. If we are adopting (temporarily perhaps) average karma as a metric, it would be great to see who ranks highest on it, or even just a given user's average karma, just as we can check a user's total karma today.

I also suspect that the frequency of average karma calculation may need to be increased.

There is now, on the profile page.

Agreed - people whose behaviour might be changed by the move from total to average are people who care about how they look, so if no-one else can see it they won't care, surely.

Showing (total, avg) may be a good compromise. Showing the average is an incentive for / against certain kinds of comments (i.e. against answering questions in old threads), showing both might soften that. Also, I use changes to my karma as a quick proxy for if I have more comments in any threads.

Come to think of it - a "new responses" link on the profile page could be very helpful.

Is there a reason flag isn't shown on comments until the reply page?

I usually end up posting on old threads (since I only check once or twice a day), which means I rarely get more than 1-2 upvotes per comment (if any).

I've got .77, which seems a lot lower than I would expect. I rarely end up in negatives on my comments. But when I do, the negative is a high one.

This definitely discourages posting things that won't be highly upvoted.

The problem I have with this is that absolute number of upvotes is not equal to quality of a comment.

A good comment posted under a less popular news item will get fewer eyeballs and fewer upvotes, and will lower the poster's average.

Similarly, participation in a long thread (even if it's a constructive thread, not a flamewar) or a response far down the page is unlikely to get many upvotes, regardless of quality.

I think the problem comes from the fact that we don't have a tool for expressing that we disagree with the tone of what someone is saying, but we agree with the position of the comment. This means that someone can write a mean spirited comment that happens to support a popular position in this community, and they won't get downvoted. Other people, seeing this, will take it as permission to be rude in their own posts.

I would suggest having a "tone" button that downvotes for tone. The level of fading applied to a comment would be based on whichever of the two scores is lower, tone or voting - so a post with 10 upvotes but -4 tone will be invisible. To avoid abuse of the "tone" feature, it should be possible to recover the user names of those that flag a post for tone.

Particularly for folks who are not malicious, a reply saying that the tone is not normative is often enough to correct them.

I always downvote on harsh tone or meanness directed at others (but not directed at myself). I upvote for valuable and interesting information, and intelligently argued points. I try not to upvote for mere agreement (although sometimes it's hard not to), and I never downvote for disagreement. (There are some exceptions - questions that are phrased as surveys etc). Since my time here on HN, it seems to me that most people follow something like this, much more so than on most of reddit where upvote-if-funny/upvote-to-agree/downvote-to-disagree is the law of the land.

Agreed. There's a perfect example of this in the recent NoSQL debates - a few abusive comments are sitting at +17 because people agree with what's being said, but the tone is very, very nasty towards anyone being disagreed with (which includes myself amongst others).

Try to prevent people from feeding the trolls.

When people reply to a -4 comment it makes it harder to ignore that comment. It effectively pushes it up on the page. So disable replying to -4 comments. It's probably also necessary to hide any existing replies.

Exhibit A: http://news.ycombinator.com/item?id=1852305

How about you add a little note just above the reply button. Something like:

  Does this comment add value/insight/healthy debate to this topic?
Very low tech I know, but sometimes people need a little reminder to think about the tone of their message and how it will be received by the community before they hit that reply button.

Having a reminder whenever you comment could cut down on thoughtless comments. Everyone makes a bad comment every now and then I think, a little nudge in the other direction could be all that's needed in some cases.

Two improvements:

1. Make it a checkbox people have to check. Just to force people to take one minor action, and read it.

2. Rotate the text with a variety of different statements, just to mix it up:

"Are you being nice?" "Does this contribute to the conversation?" "Following the golden rule here?"

and most importantly:

"Will this get upvoted?"


It's a low tech solution which focuses on fostering community values.

In general, text like that is never read on websites. It just adds clutter.

Totally agreed. I made a similar comment above. I think it'll work.

Okay, here's a trick. The goal is to detect people who can't discriminate bad stuff from good, can't restrain themselves from replying to a troll (or can't identify when they are doing so).

The mechanism is, counter-intuitively, to allow people to rate their own comments. However, this rating is not to be directly used to affect how the comment is displayed, since that would obviously be easy to game. Instead, it would be used to identify people who can't restrain themselves from engaging in a low quality discussion; ie, replying to a troll or similar.

Obviously we can't expect trolls to rate their own post as trollish, but we can expect at least some people to have the self-knowledge required to admit when they are posting while annoyed or similar.

So, the system assumes that a comment is only as good quality as its parent, or at least, not much better. Obviously, one can make a good contribution in reply to a bad one, but it's not necessarily a problem to discourage this - since the reader of such a contribution has to read the bad contribution to get to the good one.

Internally, each user has their own scale; using the above assumption, their replies - and the replies to their comments - relate their scale to the scales of other users. A user would then be shown only comments to which they are likely to reply; this means trolls see all the trolls, but quality users see only other quality comments.

There are a couple of potential problems. One is that it would bias people against replying to deeply nested comments; and ultimately, towards making submissions instead of replies. One rather drastic way to deal with this would be to remove the submission mechanism, and only have one conversation tree; the frontpage items would then be the roots of current discussions inferred from the structure of the tree.

The second problem is that while this mechanism would help current users filter out trolls, it would do nothing to help new users find the best discussions. There is a conflict between preventing the system being gamed, and allowing new users to find the best discussions; because the former would prefer the ratings to be secret, but the latter would allow information about them to be public. It is not obvious that there is a workable trade-off.

You have the full HN dataset and voting logs. What if you used it to train a simple classifier that would predict the likely karma of a comment as a function of a number of features, including username, username responded to, bag-of-words vector of the comment, time of day, total number of comments in thread, "controversy" of the comment (as reflected by a mix of up & down votes) and similar factors?

The goal could be something like Gmail's Mail Goggles, but for comments: "it looks like you're writing a flame, are you sure you want to do that?"

Another thought: there's a big difference between a comment at -1 with 10 upvotes and 12 downvotes vs. one that has occasioned 2 downvotes. Might be interesting to see the effect of showing both the number of up & down votes rather than just the sum. The idea is that genuinely trollish comments at -1 will be differentiated from minority points of view.
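To make the classifier idea concrete, here is a toy bag-of-words Naive Bayes in pure Python. The training examples and labels are invented; a real version would be trained on HN's voting logs and would use many more features (author, thread, time of day, up/down split):

```python
from collections import Counter
import math

def train(examples):
    """Count word frequencies per class from (text, label) pairs."""
    counts = {"flame": Counter(), "ok": Counter()}
    totals = Counter()
    for text, label in examples:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def predict(model, text):
    """Naive Bayes with add-one smoothing; returns the higher-scoring label."""
    counts, totals = model
    vocab = set(counts["flame"]) | set(counts["ok"])
    scores = {}
    for label in counts:
        score = 0.0
        for word in text.lower().split():
            # Log-probabilities with Laplace smoothing to avoid zeros.
            score += math.log((counts[label][word] + 1) /
                              (totals[label] + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

model = train([
    ("you are an idiot", "flame"),
    ("what a stupid idea", "flame"),
    ("great point thanks for sharing", "ok"),
    ("interesting idea I agree", "ok"),
])
print(predict(model, "that is a stupid comment"))  # flame
print(predict(model, "thanks interesting point"))  # ok
```

With real training data, the predicted label (or a predicted karma score) could drive the "are you sure?" prompt described above.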

I hope you would reward controversy rather than punish it? I don't come to HN to read happythoughts.

Indeed, the idea is that a "controversial" comment can evoke both disagreement and support, while a trollish comment would just result in downvoting.

OTOH, some particularly clever trolls are of course intended to provoke division of an intellectually unproductive sort. Not sure if machine learning could identify that...though features computed from the commenter's past history could be informative.

I think it's more effective to train classifiers on users' voting logs and comments in order to maintain the quality and thoughtfulness of HN. This allows you to classify all your users into classes that resemble their characters. e.g. a genuinely nice guy won't post mean comments and won't 'punish' a user he disagrees with. A person who downvotes when he disagrees can be classified as such. I'm sure a troll has a specific pattern too.

Then you can enable or disable aspects of the site for each user-category. e.g. disable comments from trolls, disable downvoting for people who downvote when disagree as this stimulates groupthink, etc...

A set of users can serve as a training set for each category. It is possible to let the 'genuine'-category classify comments from users from the other categories in order to enhance the classifiers.

Actually this is an interesting machine learning problem, I still need to pick a project for my course in machine learning so I can research this if I can get my hands on the datasets.

Griefing: If the system does not already, I think it should flag for review users who consistently downvote a particular user.

Cliques: Likewise if the same few people are consistently the first to upvote stories or comments of a particular user.

Multis/Sockpuppets: I think votes for a user by another user from "too close" an IP address should be automatically discounted. I'm not sure how to define "too close" (and I suspect HN already does this).

Anonymous posting: I like the way MetaFilter handles anonymous posting instead of people making throwaway accounts. The downside of this is that 1) Posters have to really trust the site admins; 2) This becomes more work for the admins

Disallow noob posting articles: What % of the [DEAD] articles were posted by users with a very low karma? What % hit the front page?

(Yes, those last two are not directly related to the OP's comment but I feel they're ideas worth throwing out there)

I would be highly surprised if the majority of your suggestions are not already in HN in some form, except for the anonymous posting. As for the trust factor, that applies to MF just the same; unless you go out of your way to hide your traces, you are always going to be identifiable to the site admins. Fortunately.

Disallowing noob posting articles would be a good thing, a very low karma cap on posting articles would help (5 or 10 or so).

That would be just enough to increase the cost of spam to the point where it would no longer be worth it.

The first thing I ever did with my HN account was make a submission that made it to the front page. There will be some false positives from banning noob submissions.

Flagging an entire thread by clicking 'troll' would help. The up/down voting tends to promote trolling, as the first controversial cliche is the top comment, having spawned a deep threaded conversation with lots of emotions and voting. A different symbol is needed. Something that makes people take pause and say, 'Is this a good subtopic of this link, or is it the exact debate anyone would have predicted?'

Put troll threads/comments at the bottom of the page, or hide them. Or use spam ratings to amplify users' votes who up vote non spam comments.

Short of that, add the ability to minimize entire threads.

I would say that many do not know the flag options exists, because you first have to click the comment's link, and then click flag. If more people knew about the flag button, and if more people remembered to use it, then I think these threads would disappear more quickly.

It was already suggested often that people should "pay" karma to post and maybe even to comment.

Users should be encouraged to comment LESS - only when they have something useful to say.

That's what I'm doing anyway, because I really don't care about karma and speak only when I have something worth saying. It would be cool if everyone acted like this, since we have no shortage of users.

You're almost there with your "reply time delay". Just take it all the way...

Limit the number of times one person can post in the same thread to once (or maybe twice), either from the top node or a sub-node. Example:

PersonA says, "blah blah blah..."

PersonB says, "You're an idiot because blah blah blah..."

PersonA is locked out from replying to PersonB.

This will do 7 things:

1. It will eliminate argument strings between 2 people.

2. It will remove the incentive to flame because your flame can't turn into an argument.

3. It will encourage people to think through what they have to say, anticipate counter-arguments, and address them in their original comment (or with an update in the 2 hour edit window).

4. It will foster a greater sense of community by encouraging people to stick up for each other. (PersonA made a good comment, PersonB trolled, PersonA can't reply, so I will.) PersonB will get the message much better when confronted by the community instead of PersonA. (Of course, if no one sticks up for PersonA, then maybe they really were wrong.)

5. It will encourage 2 people to take their legitimate discussion (of little interest to others) off-line.

6. It will reduce the length and indentation of threads making them easier to read.

7. It will encourage everyone to slow down and think a little. Those who don't will stick out (not in a good way). Trolls will get more satisfaction somewhere else.

(This is already working by accident. I have gotten so busy that I only visit hn in short bursts. Often, when someone challenges me, I don't reply because I have left and returned much later. But others have replied for me. The resulting thread is invariably better than if I had replied myself.)
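The rule proposed above is simple enough to sketch. The data layout (a list of authors who have already posted in the thread) is hypothetical, and the `limit` knob covers the "once (or maybe twice)" variants:

```python
def may_reply(thread_authors, author, limit=1):
    """One post per person per thread, per the proposal above.

    `thread_authors` is a hypothetical list of everyone who has already
    posted anywhere in this thread; `limit` is 1 ("once") or 2 ("maybe twice").
    """
    return thread_authors.count(author) < limit

thread = ["PersonA", "PersonB"]               # A posted, B replied
print(may_reply(thread, "PersonA"))           # False - A is locked out
print(may_reply(thread, "PersonC"))           # True - anyone else may step in
print(may_reply(thread, "PersonA", limit=2))  # True under the softer variant
```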

I'd worry that it also eliminates the opportunity to have those tremendous back and forth type discussions where people help each other and make incremental improvements to the idea they're working on. That's a lot more rare than two people griping back and forth - but what price would you pay for the sublime?

You would really lose the 'discussion' aspect of things by turning it into a 'say once' system.

People in complex situations almost never behave exactly as you'd expect them to. An artificial limit to discussion might have terrible unintended consequences. What about the problem of someone asking a question in their comment.

"Hey, wait a second, what did you mean by X?" [I'm sorry you're prevented from replying in this thread]

Good point. (I guess I wouldn't have been able to say that if we adopted my suggestion.) Maybe a one reply limit instead of a zero reply limit.

I know the time delay thing was a good start. I just think it needs to be extended. It seems that threads lose value as they get deeper.

Oh, and I just thought of #8: It encourages you to get back to work since you won't be replying. That's something everyone needs to consider.

This to counter number 1, but I see other issues too:

2. Trolls do not care who replies. This can avoid an argument between the OP and the troll, but does not prevent the latter from posting the snarky comment in the first place. Some people just want to express their frustration by attacking someone; they do not care about replies. Moreover, denying the OP the chance to reply could even incentivize the bad behaviour, giving the flamer a sort of "revenge" weapon. People often find gratification in silencing others.

3. Some discussions give value to the community because they happen in a short lapse of time. If someone needs 2 hours to reply, then there is a chance fewer people will see it.

4. As I said, trolls just want replies, from anyone. Flaming the full community could even be more satisfactory to him.

5. Unfortunately there is often no way of doing this, because some users do not have an email address in their profile.

Perhaps further replies could be allowed only if the comments are upvoted/not downvoted?

I would argue that if there is a way to get a comment on the board at all, those who want to continue arguing will post those comments.

Also, if you can edit your comments, you're opening the door for twisted threads where replies are in one block per person with "@person timestamp," although that's easily fixed by only allowing editing for a short while.

Two sets of scores and arrows: one for agree/disagree, the other for high/low quality. Separating the concepts would emphasize the issues as distinct, and also make them easier to vote on. The high "quality" score would act like the current "flag" button, but with a finer control. Although a much more complex UI model, it's maybe worth trying.

Possible solution 1: Smaller communities; Dunbar's number. I know it's been suggested before, but here it is again... When everybody knows everybody, people are nicer. Just create one unique, isolated instance of the site for every X number of users. Distribute current users randomly (or whatever) among the instances. Assign new users to the least populous instance. If an instance gets over max_number of users, split it in two.

Don't bother trying to do fancy stuff like connecting votes between communities; if something is really good people from each community will submit it separately. Also, it might be nice to be able to move between communities... but that sounds like something for version 2.

You could have these smaller communities run in parallel with the main site if this seems like too big of a shift.
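A minimal sketch of the assignment-and-split mechanic, assuming instances are just lists of usernames (the `max_size` of 150 would be Dunbar's number; everything here is illustrative):

```python
def assign_user(instances, user, max_size=150):
    """Assign a new user to the least populous instance; split any
    instance that grows past `max_size` into two halves.

    `instances` is a hypothetical list of lists of usernames.
    """
    target = min(instances, key=len)  # least populous instance
    target.append(user)
    if len(target) > max_size:
        half = len(target) // 2
        instances.remove(target)
        instances.extend([target[:half], target[half:]])
    return instances

instances = [["a", "b", "c"], ["d"]]
assign_user(instances, "e", max_size=3)
print([len(i) for i in instances])        # [3, 2] - "e" joined the smaller one

instances = [["a", "b", "c"], ["d", "e", "f"]]
assign_user(instances, "g", max_size=3)   # first instance overflows and splits
print(sorted(len(i) for i in instances))  # [2, 2, 3]
```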

Possible solution 2: Not sure if this really addresses the issue squarely -- but what if, when ranking stories and/or comments, you factor in the number of votes in the intersection between the current user and the submitter/commenter. Something like:

def votes(username): ...  # returns ids of the last n items upvoted by username

item_score *= len(set(votes(user_x)) & set(votes(user_y))) / (n / 2)

Just use the original algorithm if the user hasn't yet cast n votes.

One man's garbage is another man's gold, as they say.

You could do something similar by checking downvotes as well. If two people are polar opposites (one person upvotes stuff the other downvotes), then comments by one could be greyed out or hidden for the other.

If there were simply a box I could check to make my votes public, then maybe somebody could write a Chrome or Firefox extension to pull this data and handle the re-ranking.

Neither of these two ideas would be perfect, but I think they would improve the user experience significantly. If nothing else they would be interesting experiments.

Possible solution 1 sounds like a horrible idea. By fragmenting the community it would fragment the knowledge, quite a high price to pay for clean comments.

I don't know, I find that in smaller web communities I tend to participate more, because I know my voice is more likely to be heard. And if everyone knows everyone, then people would be more easily remembered for making good comments. That would give them more of an incentive to be helpful and constructive. So the knowledge loss might not be as great as it would seem.

And communities could still be visible to everyone, even if only members of that community can participate. Then if there's a really good comment in one community, someone from another could just copy and paste it over.

This would kill HN almost instantly. Knowing that there is a very real chance that I'm in a rather boring and mundane segment of the community, solely due to being user #232 instead of user #231, would make me think twice about joining and participating.

Solution 2 is very interesting and a form of instant personalization. "You will like this if this is the sort of thing you like." People who want to seek out disagreement/uncongenial views might be able to turn it off should they so please.

With the right data structure you could do this with a single matrix-vector multiplication per page view.
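For instance, with each user's recent upvotes stored as a 0/1 row over items, the overlap between the current user and everyone else is a single matrix-vector product (a plain-Python illustration of the idea, not HN's implementation):

```python
# Rows are users, columns are items; 1 means "upvoted that item".
V = [[1, 1, 0, 1],
     [1, 0, 0, 1],
     [0, 1, 1, 0]]

me = V[0]  # the current user's vote vector

# One matrix-vector multiply: overlaps[j] = items user j and I both upvoted.
overlaps = [sum(r * m for r, m in zip(row, me)) for row in V]
```

With a sparse representation the same product stays cheap even with many users and items, which is the point of the remark above.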

> Possible solution 1:

I actually built this, but never put it up on a server.

Cool. Might as well put it up somewhere, right? Of course, you'd face the big chicken and egg problem of getting users.

Yeah, I didn't want to have to actually work on it, just trying out an idea. The code itself has since been lost in a backup mishap.

You could try fragmenting HN in the same way that Reddit allows; this makes it possible to turn off the bullshit that invariably rises to the top. I mean, I don't even read reddit main or /r/programming anymore; they're just garbage.

Otherwise... aggressively ban people that don't contribute well enough. It is literally the only non-fragmentary way to fix the problem, because it's not a technical issue. People are just terrible. (I'm of the opinion that fragmenting is simply a less aggressive version of banning: leave the trolls to their shit-heaps, forge new ground.)

I'd say you should be banning at a rate over half of the new users you see. And if you look at my posting history, you'll probably see that I'm not exactly grade A material, so I'd expect to get chopped off eventually, probably sooner rather than later.

In my experience, aggressive moderation often ends up having a worse effect than the symptoms it's trying to treat. Clamping down on a popular (but, in the moderators' opinion, negative) voice usually spawns arguments and discussion that are very rarely productive. Moderating this follow-up discussion leads to a spiral of arguing, resentment, and mistrust between users and moderators. It happens all too easily, and it's not pretty.

What about a combination of the two approaches: fragmentation and access restriction? (Let's face it, banning is just complete access restriction.)

Those users who are noted as being excessively mean spirited should have to learn to play in a sandbox nicely with others before they're let back into more active areas of the site.

The net effect is perhaps less of a knee jerk negative reaction so that users can eventually learn to contribute again in a positive manner and it keeps the more frequented areas clean again.

Have you considered charging for write access? Allow the site to be publicly readable but charge people a yearly fee to submit stories and posts. They would be much less likely to act abusive if they are paying and their access could be revoked. Several membership levels and tiers of forums could be provided, including private and invite only.

No need to make it that complicated — just require a one-time $5 fee to create an account.

Metafilter has done this very successfully for the last 6 years.

I haven't figured out anything specific, but I have a couple of thoughts about the karma system.

It occurs to me that the main failure of karma-type systems in general is valuing quantity over quality. This is perfectly rational for a commercial site that seeks to maximise engagement and page views, but it seems to actively harm both comment quality and the community dynamic.

It seems that the prime concern we have for HN is maintaining a high S/N ratio; why shouldn't the karma system make that desire more explicit? I may be completely wrong here, but I think most of us would like to see a community where people think carefully before they post. If we want to discourage flaming and poor-quality comments, I think we need to think harder about behavioural prods to discourage them.

I propose removing the display of a user's total karma on their user page and replacing it with their average karma. I would like to see voting privileges made contingent on average karma, and I'd be interested to hear possibilities for penalizing users with a high proportion of low- or negative-karma comments.

It is of course dangerous to use heavy-handed technical changes because of the myriad possible unintended consequences, but we all know that internet communities almost inevitably fail. As a community, we should be considering our options now while things are generally OK, perhaps even designing a 'nuclear option' for the possibility that HN experiences some sort of Eternal September scenario.

We also measure average comment score. It's the avg field on the profile page, and it's displayed along with total karma on the leaderboard: http://news.ycombinator.com/leaders

I don't think it's useful to punish users with low or negative karma ... because they can just get another account. Karma will only make you invested once you've accumulated enough.

Maybe a nuclear option would be good. We certainly had success with Operation Erlang I & II. Maybe a nuclear option would consist of locking out access for 24 hrs to all but the old guard and then requiring that for the next week or so that new accts must be active for more than 24 hrs before they can post.


Is it too late for me to chip in?

If a user's comment is downvoted more than n times, delete the comment with an added penalty factor. This would keep the content clean forever.

Despite the downvoted comment being greyed out, it almost stands out more than the rest (and nowadays the chain of explanations that grows under it usually goes off topic).

Could we restrict users to one user per ip? I sometimes suspect people using multiple ids to upvote their comments/posts. There are some posts way out of HN league that get upvotes within minutes from the new page (even the spammer ones).

Would it be hard to have new users moderated? One way is for new users to write a few lines about themselves. A moderator would look for humans (not spammers) and genuineness.

Show a warning or part of content from http://ycombinator.com/newsguidelines.html upon submission of a post or comment - until they attain their downvote karma rights.

One-liners like 'i like it too' should be deleted. Perhaps there should be a minimum word count for a reply, or a warning asking the user to confirm he really wants to post.

Link searchyc.com at the top of the page beside 'submit'; new users don't know of this awesome search site. While there is a search link at the bottom of the page, I stopped using it once I learned it's just googlified. SearchYC works, with lots of search criteria. IMHO, this would reduce the many 'django vs rails'-type questions that have already been discussed in the past.

I'd like to see two things:

1) A metric based on account age, submission karma, and comment karma, with less weight on submission karma, since that is easier to manipulate. Make this metric the barrier to entry for features instead of basing it purely on karma. It makes manipulation harder, and it encourages actual community participation/development.

2) Add another meta-moderation tool similar to the "flag" option. I mostly use flagging for spam or offensive posts. I'd like to see something else (unlocked by #1) that lets you mark something as off topic, non-contributing, or otherwise adding to "the decline". Accounts accumulate these like points, and the more points an account has, the more features get revoked from it as punishment (leading up to a temp ban). Points decay with time and have a hard cap on how many can be accrued in a single day or from a single comment, but a long history of points means more points accrue per flag.
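The decay-with-cap bookkeeping from item #2 might look like this (the half-life and daily cap are invented numbers, purely for illustration):

```python
def decayed_points(events, now, half_life_days=30.0, daily_cap=5):
    """events: list of (timestamp_in_days, points) flag penalties.
    Each day's intake is capped, and a day's points lose half their
    weight every half_life_days."""
    per_day = {}
    for t, p in events:
        day = int(t)
        per_day[day] = min(daily_cap, per_day.get(day, 0) + p)
    return sum(p * 0.5 ** ((now - day) / half_life_days)
               for day, p in per_day.items())
```

A burst of flags in one day saturates at the cap, while a long, steady history of flags keeps contributing until each day's share has decayed away.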

Edit: just to add some thoughts -- the only way to prevent people from being anonymous internet idiots and adding noise to the signal is either with a carrot or a stick. There's not much we can offer as a carrot that other sites can't, or that people can't get through non-participation (just reading the site). That is, unless you want to implement a private karma-gated section of the site (unless you already have one and I'm not in it... :( but then I wouldn't know about that carrot and wouldn't change my actions to reflect wanting to obtain that reward).

The only real alternative is the stick, hence the revoking of features.

Keep track of submission karma and comment karma separately. People should not be able to downvote in comments unless they have earned the required karma threshold from comments.

Example [of what to prevent]: someone submits the latest TechCrunch article first and gets upvotes from everyone else trying to submit it. They can then downvote in comments without first being subjected to a socialization period of earning upvotes for thoughtful comments.

But then what's the incentive for good submissions? If it's gotten to the point where essentially every article from certain sites (eg Techcrunch) gets submitted by karma gold-diggers, it might make sense to just have them auto-submitted so no one gets the karma, but they can still get discussed.

EDIT: Another thought: Only allow submissions from people above a given karma. That way you would always have to earn your stripes through commenting. Downsides: Wouldn't allow sensitive anonymous 'ask HN' posts from throwaway accounts, and we would lose all the good submissions from new accounts.

I think you need to do as reddit did -- hire a full time person as community manager (http://blog.reddit.com/2008/10/welcome-erik.html, http://www.reddit.com/user/hueypriest). You can only take technical tricks so far, community management is an AGI-hard problem.

Educate people on flagging inappropriate comments, like offensive ones, instead of downvoting. Downvoting can be used for disagreement, but nasty comments deserve to be flagged and then eliminated.

Then, put a heavy fine on it: for example, if a user has a comment eliminated due to flagging, he loses a big amount of karma (20%?) and he cannot comment/submit for one day.

The flag button doesn't appear until you get to the reply page - making sure people know they can flag comments would be a good start.

Also, 20% of karma isn't very useful for troll / griefers on throwaway accounts. A fixed value (-100?) might be better.

I'd like to see the karma system dispensed with entirely. I don't think it encourages thoughtful and productive discussions.

The karma system discourages duplication and it is worth its weight in gold just for that. Imagine if the 50 people that agree with a comment felt the need to chime in. Whereas with karma you upvote and are done with it.

To a newbie, karma voting patterns have huge instructional value (at least they did for me)

HN without karma wouldn't be HN.

Why assume that everybody that upvotes now would take the time to "me too"?

If anything what I see again and again is that some comment takes a quick lead in upvotes in a topic and dominates the conversation afterwards, despite being often followed by more interesting comments.

> I'd like to see the karma system dispensed with entirely

Totally agree with that. Comments should be judged on their merits, not the number of points they have gained or lost. Disagreement should be expressed using actual words rather than "thumbs up" or "thumbs down". Removing comment karma would slow people down as they scan comments: rather than just looking for the highest-rated ones, they would have to look at the meaning of all of them. Bitch-slapping commenters with a down-vote doesn't add anything to community or discussion. Clicking the "me too!" upvote button doesn't help that much either.

Keep a flag for "spam", "inappropriate" or "offensive" and leave it at that. People who habitually misuse those buttons should lose the ability to use them at all. People who are constantly (and justifiably) reported using them should have their account limited in some way, maybe limiting the frequency or number of comments they can make.

(I mean comment karma, specifically, not submission karma.)

I want to measure the worth and contribution of a comment; here's how I tried.

    ;; assumes helpers: expt, nest-level, upvotes, downvotes, author
    (defn v-rat
        [uv dv]
        (/ (expt uv 3/2)
           (+ uv dv)))

    (defn nest-weight
        [comment]
        (/ 1 (+ (nest-level comment)
                (/ 3 (nest-level comment)))))

    (+ (/ ak 2)
       (v-rat uv dv)
       (reduce + (map (fn [c] (* (nest-weight c)
                                 (v-rat (upvotes c) (downvotes c))))
                      (remove (fn [c] (= user (author c)))
                              comments))))
Where ak is the average karma of the user (only on comments, if possible), uv and dv are upvotes and downvotes respectively, and comments is a list of comments in reply to the comment we're weighing.


Abstractly, I'm trying to weigh the comment based on both its individual value and how much good discussion it generated. I also take into account, plainly, the average past worth of the user's comments.

This could certainly use some tweaking, but I like the values I've plugged in so far: it promotes discussion as well as thoughtful contribution, and gives a nod to users who have had high-quality comments in the past.

I'd be interested in tweaking it some more if this seems at all a step in the right direction. I think I could use a better heuristic for replies, as well as a more sophisticated v-rat. I also wonder if it would be possible to see which users downvote and upvote, and give those votes more power (technically possible, that is), and also take into account the users who are replying, and to what effect. This would require a lot more testing, and maybe more CPU cycles than PG would like.

I'm not sure taking time into account is a great idea past the first ~10 minutes of submission. For those first ~10 minutes, perhaps weigh the comment solely using the avg karma of the user + some constant. After those first x minutes, rank by the algorithm.

I have a suggestion: after first creating an account, users could be redirected to the 'guidelines' page. That might help frame the tone of subsequent comments.

Can we hide negative comments, ala reddit, slashdot, everywhere else? I hate yielding real estate to trolls.

If I could fix this I'd be launching the next billion-dollar start-up around the technology. Just about every site goes through this type of growing pain: in the beginning it's obscure and spreads by word of mouth, so all of the people who visit the site share a common ethos; then it grows and starts to attract other people who do not share the spirit that started the site; eventually, the "new" users outnumber the old users and they begin to set the standard for the site's personality.

You can try to raise awareness and ask people not to be evil on your site, but that's about as effective as asking the tide not to come in. There are many bitter people in the world, and they outnumber the well-adjusted people who would rather spend their time building up a project than tearing down someone else's. These trolls crave attention (the negative attention they get by aggravating people on-line) so it's really not effective to point out the problem and ask them to stop, or to ask the community to gang up on them; that's what they want.

We need a solution that frustrates these people. I've seen some sites where the comments a troll posts look like they are live on the site to the troll but they are hidden from everyone else. I think this is a very effective technique because it denies the troll the attention he is seeking and it saves the rest of the site from his attacks.

But, perhaps the best response is to remember Margaret Thatcher's advice about people trying to make you angry (i.e. no one can make you angry without your permission). If you see something that's not in the spirit of hacker news, just ignore it. Don't respond; don't show it to other people; just let it die a bitter and lonely death.

Why not do something quite simple: have it set up so that if a user has over a certain karma threshold they can flag a user's post in some manner as being mean-spirited or suchlike. If a certain number of users also flag the comment, then the poster does not get any of the karma points their comment may have earned. Alternately, the comment flagging could be coupled with a Slashdot-style moderation and meta-moderation system.

Already implemented, but "hidden": you must be on the link to that comment to see the flag link - eg. to see it on pg's GP comment: http://news.ycombinator.com/item?id=1852736

Downvotes could be made precious by limiting the number one could make each day, ensuring that people will use them more carefully.

It could then be conceivable to give the downvotes more weight, perhaps putting users in a karma hole, with their posts starting out at 0 points, appearing at the bottom of the comments page, if they have accrued too many downvotes. They would then have to dig themselves out by posting more valuable comments.
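Limiting downvotes per day, as suggested, is a few lines of bookkeeping (the limit of 5 is an arbitrary placeholder, and `day` is just a date bucket):

```python
from collections import defaultdict

DAILY_LIMIT = 5
spent = defaultdict(int)  # (user_id, day) -> downvotes used so far

def try_downvote(user_id, day):
    """Spend one downvote if the user still has budget for that day."""
    if spent[(user_id, day)] >= DAILY_LIMIT:
        return False
    spent[(user_id, day)] += 1
    return True
```

The budget resets naturally when the day bucket changes, with no cron job needed.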

Force users to give a reason when voting.

This would allow two things. First, users can downvote for specific reasons: mean-spirited/dumb/insulting/off-topic. If a comment is technically correct but unnecessarily mean, then I can note its disposition.

Secondly, it would provide feedback to the commenter. If my 'me too!' comments are being downvoted, it'd be great to have some reason why.

Insert a delay of, say, 10 minutes until a new comment is shown.

Of course, the commenter should see it immediately and should be able to correct it (or even delete it) in the meantime. But everyone else should see it only after 10 minutes.

This will slow down A-B-A-B-... types of conversation which are usually only productive when being written thoughtfully and thus slowly anyway.
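The visibility rule itself is tiny; a sketch, with the 10-minute window from above:

```python
DELAY = 600  # seconds; the 10-minute window suggested above

def visible_to(comment, viewer, now):
    """The author sees (and can edit or delete) the comment at once;
    everyone else sees it only after the delay has elapsed."""
    if viewer == comment["author"]:
        return True
    return now - comment["posted_at"] >= DELAY
```

The ranking and reply machinery would simply skip comments for which `visible_to` is false for the current viewer.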

As Shirky says, "handles you can invest in".

One way to invest in handles would be to add more social features like points you can give to awesome comments (you can only give 1 point a day or something), etc. Then again, all these social mechanics are complex and easily gamed.

Another way would be simply to promote handles more: make an easily searched HN users list etc.

Replace comment arrows with a plus and a flag. Plus increases karma and plays into the comment arrangement on the page. Flag is scored separately and silently as a measure of meanness and/or community culture clash. From two dimensions of data you'll have a better capacity to guess whether someone is a bad participant on HN.

I think there just needs to be a better way of communicating the reason for a downvote (or, in fact, asking for a reason for a downvote, like popup and select troll/OT/etc.) so that it costs you a little more in thought/time. It's currently just as easy to downvote as upvote, but the cost of a single downvote to the comment/user is greater than the benefit of a single upvote.

Additionally, I've been thinking for a while that it would be cool if threads were ranked a little differently, and not just on the points/time of the parent. Sometimes there are really, really great comments that are replies to low-ranked ones, and the resulting thread gets kicked to the bottom of the page. Something like Avg points/time or just accounting for the average points of next-level comments when determining rank of a reply at each level.

The marker for high avg. karma that was previously tried for a day. Indicative data: whether mean-spirited comments correlate with low avg. karma.

EDIT The previous experiment and discussion: http://news.ycombinator.com/item?id=467181 (4 Feb 2009)

Use the same system that decays a post's weight on the front page to decay a comment's weight in a thread.

This will balance out the uneven distribution of karma talked about above, as new comments will be seen and given a chance to be voted on even well into a conversation.

Having done this, comments that stop being relevant to the conversation should drop to the bottom and naturally attract less attention. Ideally, this will leave trolling posts ignored. (This can be helped further by having user selected timeout apply. If comment / responses are mapped as a tree, hide all branches that have no new activity for the past x hours, allowing users to change x as they choose. Give visual feedback to a comments age by fading old comments)

This solves the 'income' end of the system, as karma should now be distributed a little more evenly over time.
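For concreteness, the front-page decay is usually described as points divided by a power of age; applied to comments it might look like this (the 1.8 "gravity" is the figure commonly quoted for stories, not anything confirmed for comments):

```python
def decayed_rank(points, age_hours, gravity=1.8):
    """Front-page-style decay: rank shrinks as the comment ages,
    so newer comments get a window of visibility."""
    return (points - 1) / (age_hours + 2) ** gravity
```

Under this shape a day-old comment needs far more votes than a fresh one to hold the same slot, which is what pushes stale threads down.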

Have you ever considered giving people the option to register their abstinence from voting on a comment? Right now we have up and down, but a lot of the time I feel like I wish I had an "abstain" button. This would be reserved for the comments that aren't mean-spirited, but don't add much to the conversation.

Then perhaps a comment with 20 up, 5 down, and 500 neutral would rank lower than a comment with 20 up, 5 down, and 0 neutral. The idea being that, all things being equal for two comments, the one with more indifferent voters is considered less valuable, and is thus ranked lower. Maybe a neutral vote simply works by increasing n when judging the size-weighted score that HN uses to calculate rankings now.
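One hypothetical way to fold neutral votes into the ranking, purely for illustration:

```python
def diluted_score(up, down, neutral):
    """Net votes divided by total engagement, so heavy indifference
    drags the ranking down (an invented scheme, not HN's algorithm)."""
    return (up - down) / (up + down + neutral + 1)
```

With 20 up, 5 down, and 500 neutral the score is 15/526, far below the 15/26 of the same comment with no abstentions, matching the ordering described above.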

I'm sure there are flaws in this line of reasoning... perhaps someone would be so kind as to point them out?

A flaw: I might vote neutral as "I agree with this comment's current point value." In that light, 20 up, 5 down, and 500 neutral is a fairly confidently ranked 16-point comment, but 20 up, 5 down, and 0 neutral is not. However, the less confident one should be "floated" to a small portion of users to get more votes.

> I agree with this comment's current point value.

That would be a really cool feature. I would often use it for comments at 0 - something that's factually wrong and needs to be voted down, but without nasty pile-on downvoting that sometimes happens, that to me looks a bit mean-spirited. If the comment in question went below zero, my 'stay' vote would count as an upvote. If it went higher, as a downvote.

I like this idea as well. It also would be easier to implement than my idea :-]

I think the abstain effect can be emulated by just skipping that comment :-)

Ah, but how do you count that in a ranking algorithm? I'm 90% sure my idea is not the answer, but maybe we can think of something else.

Maybe this is naive, but there's something about a binary voting system that just strikes me as something that doesn't fit into a conversation. I often find myself agreeing with certain parts of a comment and not other parts of it. Maybe if the user could vote up the parts of a comment he likes and vote down the parts that he doesn't. Technically, that would be difficult (especially since on HN a comment is simply a text field).

I don't think it needs to get any more complicated. Paul asked for a "technological" solution, and it's not in the rankings; if the profile pages are made more prominent, then people will invest in their social reputations.

With respect to counting comment views, it's already implicit in the page views.

I think the problem is lack of a clear data element that captures what you need. The current vote up/vote down system blends a lot of factors together, which is valuable for some calculations, but not the one you're looking for here.

I suggest a "flame" flag for every comment, which can roll up into a user's rating as a "flamer." The flag would only be available to HNers with a low flame rating themselves and a significant history of comments.

By default, no one would see comments that have high flame ratings. Users that accumulated flame points would get a time out from commenting, leading to a permanent ban if the behavior doesn't stop.

The site is wonderfully minimal at the moment, but using a UI graphic such as a fireball or flame could draw attention to the importance of civility.

Perhaps a gentle warning built into the voting system will suffice. I like the concept of downvoting to express disagreement (rather than cluttering the thread with replies), but I don't think anyone should downvote a comment into negative figures unless it was actually abusive. That creates bad will, which breeds more bad will, et cetera. So what about a little message that pops up saying

"Are you sure you want to downvote this comment? Doing so will push its karma into negative figures. At HN we don't encourage this type of downvoting except in response to abusive content. "

In my experience some people just don't have a clue how to behave online and it's not even expected of them in most places. HN is different.

The ability to post should be linked to the rate of posting, or maybe the rate of point change in general. This has several purposes:

It cuts down trolls who snipe in lots of different threads at once.

It cuts down on "me too" commenters, and the know-it-all contingent who seemingly do nothing but post on HN, again hitting lots of threads at once.

It encourages people to think about what they're writing, instead of sending out one-sentence snark, or writing a long missive they feel they can "clarify" later in a pointless semantic debate.

This isn't a complete solution... I have no idea how to correct for mean-spiritedness, for example. But some kind of limit that tells people they should stop venting/pontificating and take a walk might help.

When you sign up, a page that explains the community guidelines (before you're moved into the site) might help a little. I for one have actually never seen/found them, and I've seen plenty of comments asking if something was appropriate.

I wonder if it would help at all to add some kind of persistent disincentive for users with low (negative) karma. Maybe the rate at which low-karma users are allowed to post comments could be limited, or maybe low-karma users' posts could start in a slightly shaded state (like downvoted comments).

I guess the success of that strategy would depend heavily on whether a user's karma is a good indicator of the likelihood that they will make a mean-spirited comment in the future. It would also depend on whether the low-karma disincentive is actually undesirable to a potentially mean-spirited poster.

This could also possibly encourage more bad behavior. After all, if the site is going to treat them like a bad actor with no hope of redemption, why even try to avoid trolling?

For every mean-spirited comment, there are probably at least five comments that are just plain wrong about something. Are mean-spirited comments really that big of a problem compared with, say, commenters critiquing the methodology of academic papers they haven't read?

Edit: Jon Stewart is generally quite mean-spirited, in that he goes out of his way to get people fired and treats people he disagrees with like little kids, but if he were an HN contributor would he really make the community worse?

It's a good point. I stopped reading another well-known site years ago in large part because so much there was outright wrong, and I was afraid that I'd picked up and internalized false ideas, and would continue to do so, in unwary moments.

It seems to me that many of the 'bad' comments come from users who have strong feelings on a subject and therefore post many comments on a story, where one might get upvoted early on. I think it could be a good idea to account for the consistency of scores on a user's comments within a story.

Edit: I also think that the effect of the first downvote on a new comment is (or was) too powerful. Especially when the leading comments have many replies.

Add a "time-out" mechanism: If a comment is killed by the editors (and/or by being flagged many times) then for the next X minutes (I'd go with X=60) handle any GETs from that user by redirecting to a page saying:

  The following comment has been judged to be unhelpful
  and as a result you are now on time-out:
  <comment quoted here>
  You may return to Hacker News in Y minutes.

Perhaps clearly stating the policy/approach at the top of the HN page?

Not of the form "You must do this". But rather of the form "The mission of HN is to supply stimulating, insightful, {insert mission here}... Therefore always be polite, thoughtful, engaging and supporting".

That'll be enough I think. It's currently not immediately clear what the mission is unless you explore and delve deeper.

It's also the simplest non-tech thing to try first ;)

pg, (this might be somewhat obvious, but) couldn't you detect and penalize the "punishing" some folks complain about, i.e. someone downvoting a specific person across discussions? I would imagine most people do this by bringing up the target's profile first and then downvoting all his comments. Hard to think of a legitimate use case for this behaviour.

Case in point ...

I've occasionally found a comment that I felt was both mean-spirited and largely content-free. I've down-voted that.

Then I wondered if that person had a history of such comments, so I've brought up their profile, looked through their comments, and attempted to assess each one on its merits. In cases of doubt I've left them alone, but if I've felt that the comment in question was again mean-spirited and content-free, I've down-voted that as well.

I've realised that such actions, while intended well, might be viewed as "punishing" the "contributor" in question. Further, I suspect that PG already detects and inhibits this behavior.

So here we have a case where I believe I'm doing the right thing to help reduce the mean-spiritedness of comments, and I get prevented from doing so.

Limit the amount of downvotes one can perform per day, or per X hours.

Points from comments should have much more weight than karma points from submissions. It's easy to get massive karma if you submit something interesting; on the other hand, you could spend a lot of time writing a comment and not get that many points from it.

Every forum says "don't feed the trolls", but there are always people who do.

If you can identify troll-feeders, you might be able to stop them from feeding the trolls (just a text reminder above a comment field might be enough), hence create less incentive for trolls.

Not an easy problem to solve though, I've tried myself and didn't succeed. May try again.

Only accounts older than 2 (or more) months should be able to comment. Make sure you can't register more than 1 account in x days/weeks. The majority of spammers who'll use proxies to register tons of accounts will probably forget they even had the accounts after 2 months. Problem partially solved.

I just read Paul's essay in Hackers and Painters on his spam filters using Bayesian probability of words in the message. Could something like that work on trolls? The body's corpus would be only replies marked with a certain flag?
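A toy version of that idea: a word-level naive Bayes score trained on comments flagged trollish versus not. This sketches the general technique from scratch, not the filter from the essay; `train`, `troll_score`, and the flag corpus are all invented here.

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, is_troll). Returns per-class word counts
    and per-class word totals."""
    counts = {True: Counter(), False: Counter()}
    totals = {True: 0, False: 0}
    for text, label in docs:
        for w in text.lower().split():
            counts[label][w] += 1
            totals[label] += 1
    return counts, totals

def troll_score(text, counts, totals):
    """Log-odds that the text is trollish, with add-one smoothing;
    positive means more troll-like than not."""
    vocab = set(counts[True]) | set(counts[False])
    score = 0.0
    for w in text.lower().split():
        p_t = (counts[True][w] + 1) / (totals[True] + len(vocab))
        p_n = (counts[False][w] + 1) / (totals[False] + len(vocab))
        score += math.log(p_t / p_n)
    return score
```

The obvious difference from spam filtering is the training signal: instead of a spam folder, you'd need a corpus of flagged comments, which is exactly the "replies marked with a certain flag" question above.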

Please consider running the anonymization experiment.

Mean-spirited comments may well be a response to the rampant favoritism that exists on HN, which can be just as dangerous to keeping discussions balanced and interesting.

Weight votes by some derivative of karma, without showing fractional values.

I suggest hiding the points from submissions, comments and other people's summary page.

Seeing the points doesn't add value and it distorts the opinions and upvotes (people tend to like more what other people recommend).

A picture of a puppy above the comment field.

Alternately, what are voting ratios looking like? Are there some people downvoting more than up? If so, a notice above the comment field letting them know could help.

After thinking about it, how about adding a 'flag' link next to the comment headline? You can see it individually when you reply, but not on the main thread page.

Paul, what about having anonymous comments to avoid groupthink, but keeping those comments linked to the poster's karma?

Don't allow downvotes without a comment. Maybe use a different font color for the usernames of those who downvoted.

what about a "members only" club, where you need an invitation to get inside? It worked for google to get members if this is a problem and it may show the relative quality of the members subtree.

You make me think of a way for every user to make their own members only club, by filtering, and they 100% control who is visible in it; others can view through another's filter:

1. every user can config a "whitelist" of users, to personalize HN to only show comments from those users (and access other users via parent links). This is the "filter", of who they "follow".

2. you can set your whitelist to be the whitelist of any other users. This is "subscribing".

Some few users would end up working to maintain detailed whitelists (like, who they "follow"), and most others would just "subscribe" to them. New users would be left out in the cold til one of these services picked them up (like new bands getting signed to music labels?)
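The two steps above could be sketched as follows (the data shapes and names are hypothetical, just to make the follow/subscribe distinction concrete):

```python
def resolve_whitelist(user, follows, subscribes, _seen=None):
    """Effective whitelist = the user's own follows, plus everyone on the
    whitelists of the curators they subscribe to, resolved recursively.
    The _seen set guards against subscription cycles."""
    _seen = set() if _seen is None else _seen
    if user in _seen:
        return set()
    _seen.add(user)
    visible = set(follows.get(user, ()))
    for curator in subscribes.get(user, ()):
        visible |= resolve_whitelist(curator, follows, subscribes, _seen)
    return visible
```

So a user who follows nobody directly but subscribes to one diligent curator sees exactly that curator's list, which is the "most others would just subscribe" case.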

No way would pg go for this, but it's interesting to me. It's sort of a variation of subreddits; and a bit like the various newswire services (like reuters etc) being picked up by local newspapers...

Actually, neither of the two items was on my mind: you are a member, but you don't own the club or set the rules; you are just allowed to invite someone else in. I'm not sure manual whitelisting scales.

Yeah, it's a different idea, but inspired by yours. Good point about scaling: perhaps your whitelist could include selected whitelists of others; you can aggregate whitelists.

Increased popularity has decreased the signal/noise ratio. Just reverse the popularity :) Or just make somewhere new to migrate to.

FWIW, I discovered that people take disagreements personally and will stalk you across threads if you're not careful with their egos.

For some reason I am incapable of remembering names and can only identify about 10 - 15 people here other than the ones I have met or know from somewhere else. So the whole thing feels to me like detached, anonymous discussion. I was surprised to be 1) remembered, and 2) "punished" in another, unrelated context.

There are also times when I withhold contributing to a thread because of its strong fanboy/flamer possibilities. Certain companies have .. enthusiastic supporters/haters.

I am surprised by people "punishing" others online, but I did realize that people use downvoting to express disagreement (or frustration), while it should be used only when a comment is irrelevant to the topic or frivolous.

I usually upvote the comments I disagree with because a healthy debate always helps me fill gaps in my knowledge. But when passions and emotions rule, "healthy debate" is simply impossible. Some people seem to be too stuck up with their beliefs.

Come on guys, many times it is good to be _wrong_.

I've also noticed that there seems to be a mob downvoting mentality. I seldom downvote already downvoted comments unless they are especially egregious, but recently it seems that you either get no downvotes or a torrent of them. Either that or people are playing sockpuppets.

[Edit: I think one way to mitigate this might be to make a user's votes viewable.]

I wasn't going to reply until I noticed I had a long line of comments with 2s a couple of hours ago, now they're all 1s after this morning's little scuffle: http://news.ycombinator.com/item?id=1851115

You can never prove anything unless you have the logs so there's not much point worrying, just play with and enjoy the furore. The best defence, if there is any, is to have a lot of karma and only engage with these folks if you can bear to lose some of it.

Err, that thread was fine until you linked to his history, Peter.

But yeah, drug related threads, red flag :-) I hope you learned your lesson.

I don't see a problem in looking at people's previous submissions - it can come in handy to get some context (in this case, it did, since his behavior on HN was getting him voted down in the past).

That said, I just went back and re-read your post and, yes, I see your point now and it's not quite how I read it first time round. I think the real problem is people who go back and vote down stuff to "punish" people, not those who introduce context from looking through your history (which I'm quite happy for people to do when replying to me).

OTOH, there are threads that are just way too interesting, but get virtually no contributions.

Here is a thread with one comment and one upvote, both mine, and it just has the potential of being very rewarding, IF the person who started it has any intention to continue:


pg's quoted comment http://news.ycombinator.com/item?id=1833255

I've noticed that if you reply to a mean-spirited comment with a straightforward and dispassionate counter-argument, completely ignoring its rudeness and tone, you get massive upvotes. Transcending meanness also feels really clean; it is neutralized.

By framing the meanness, the site's collegiality is not undermined, but demonstrated.

Great point 8ren, and perhaps this is a solution to the issue. A concern is that mean-spirited comments are not always countered in a thoughtful manner, or in time before the post drops off the front page. That said, perhaps we can draw on your suggestion and, as a community, be more vigilant in quickly responding to mean-spirited comments in a logical and dispassionate manner to counter their effect.

Not trying to be passive aggressive here, at all; I agree with you here. The reason I didn't post anything was that HN has gotten pretty damn meta recently too. I'm not sure posts like this will help anything, much; probably something needs to come from the top, or the community needs to focus on enforcing the behavioral norm by down-voting (even if they normally wouldn't put in the effort) comments that are of an assholish nature.

As an aside: http://twitter.com/#!/ihodes/status/29059931658

It's a weekend. There's always more random stuff on weekends.

I've been on HN for a few years, and I'm usually extremely skeptical of people claiming a decline in the quality of HN, but I agree, and furthermore I have noticed an increase in acceptance of clever but empty Reddit-esque one-liners. I don't mind them, but I think it's unfortunately a very slippery slope...

No social news site is going to be perfect. If it's people that are writing comments, some comments will be good and some will be bad. Read them and decide for yourself. If a comment hurts your feelings, get over it. If you like a comment, upvote it, and help other people find the best comments.

Really, I kind of get tired of reading stuff like this because there is always some subtext to it. "mean-spirited" usually means, "someone disagreed with me". If people disagree with you, you can either get over it or learn to argue better. Passive aggressive, "oh noes everyone is so mean" is not going to change what people think. Good writing will.

I think meanness is something you have to push back against actively.

This site is not in a state of nature. There are all kinds of things I do to keep meanness at bay, and the situation would be a lot worse if I didn't. Preventing the site from declining into a swamp of meanness and stupidity (I believe the two are closely related) is something I think about constantly.

I've had to close a large-ish site of mine due to meanness. Things can get very nasty online, it's hard to really understand until you've tried running one of these.

Sounds like you have a lot more experience than most of us. What's in the intersection of that and your observations of HN?

(Sorry I don't understand the question.)

I was just wondering if there are any interesting things you noticed on your own site that you've also noticed here.

Preventing the site from declining into a swamp of meanness and stupidity (I believe the two are closely related) is something I think about constantly.

Does that imply anything regarding hateful posts in HN, or is it just a 'hey, there's a correlation here' observation?

This post was not a reaction to a discussion that I was engaged in, so in fact there is no subtext and it's not personal. :-)

I think a big problem with a "harsh debate climate" is that the people who can provide real value through experience, knowledge, and insight will stay away, for reasons personal, professional, or simple disinterest. That will in turn attract more people who enjoy a harsher debate, who most (if not all) of the time are less enlightened.

Or as the saying goes "Never argue with an idiot. They will drag you down to their level and beat you with experience."

> I think a big problem with a "harsh debate climate" is that the people who can provide real value with experience, knowledge and insight will stay away.

An overly civil debate climate, viewed as a "circle jerk", might also keep some valuable contributors away. Whether that is a good thing or not is up for debate of some sort.

That saying is only true if there are no other people listening to or reading the argument. If your goal is to convince lurkers, it's actually pretty easy to win a written argument with an idiot.

Which is the only valuable thing to do, and perhaps the only valid reason, if you are going to debate an idiot.

When it comes to things like politics/religion/etc no amount of good writing can convince the other person.

I don’t think that’s true at all.

I would make the argument that it depends more on the rationality of the person receiving the argument than on what the argument is about.

A rational person is capable and willing to change their mind when new evidence is offered.

An irrational person may easily change their mind on some topics with no evidence and never change their mind on other topics (politics/religion/etc) even with the best evidence.

I think the trick is staying calm while you confront somebody about something. If you set them off, they stop hearing what you're saying and the whole "discussion" turns into a spitting match.

Everybody knows that crazy talk is a waste of everyone's time. I get depressed every time I find myself reading the comments on CNN or YouTube. . . .

"Really, I kind of get tired of reading stuff like this because there is always some subtext to it. "mean-spirited" usually means, "someone disagreed with me"."

Unfortunately this is often the case. However, it should still be possible to make some sort of judgment based on the rationality, patience, and communication skills of the people involved in a debate. If one side is constantly irrational and/or impatient, then they are often also described as mean-spirited. This is why comments/complaints like the OP's can be valid and important, though (ir)rationality, (im)patience, and lack of communication skills are my preferred terminology over "mean-spirited" and other similar descriptions.

I upvoted this because I agree with the concept, but I don't believe such things are necessary. General rule of life: assholes will be assholes. This isn't going to make anyone more polite. Most of us are already polite, and quite content to simply downvote people who aren't.

If mean-spirited comments are down-voted, then it's not much of an issue. The problem is that I've started seeing mean-spirited comments receiving up-votes, and rising to the top of discussions. How do we solve this as a community?

Where have you seen that? I have seen most of them getting downvoted and some getting 1 or 2 upvotes.

The recent NoSQL discussion. There's a user called gaius who spent a lot of time abusing anyone who mentioned why they used a nosql database.

http://news.ycombinator.com/item?id=1847978 http://news.ycombinator.com/item?id=1847783 http://news.ycombinator.com/item?id=1847836

Everything is a sarcastic take on the parent, a comparison without any justification, or a straight-out insult, i.e. "it makes me laugh". The parent comments, one of which is mine, are all polite explanations of why we use NoSQL in our projects, yet the paraphrased sarcastic insults sit there upmodded by people who seem to agree with the sentiment but don't care about the tone, civility, or lack of supporting arguments.

I would rather not single out any individuals/comments. In any event, it appears like a growing issue across the community, and is not isolated to just one or two people.

"I thought this question might come up, but I would rather not single out any individuals or comments."

So, you state a claim (and it is an interesting claim!). Other people ask for instances or examples and you decline to point them out. How are we supposed to "deal with it as a community" if we don't know what you are talking about?

I really would like to see an example of a mean spirited comment receiving plenty of upvotes and being on top of a discussion. I suggest (just suggesting, since I don't have any examples of what exactly you mean) that what you consider "mean spirited" may not appear so to others (and hence the upvotes)? Sounds like a simpler explanation (or at least an equally valid one) to me.

What I have seen is that empty/mean comments get downvoted to oblivion very fast. Comments with plenty of upvotes generally have some valid points/insights/value, even if the tone is sharp or sarcastic (neither of which is necessarily "mean").

Posting a mean comment which gets a lot of upvotes and making it to the top of a discussion on HN, without getting flagged etc, would seem to take some real skill in writing.

I would really appreciate any examples.

I actually think it's a good idea not to single out anybody; you don't want to attract the attention of somebody who is already belligerent and vile.

"you don't want to attract the attention of somebody who is already belligerent and vile."

I would rather these folks/comments were flushed out (the OP claims that people who are belligerent and vile are getting a lot of upvotes, which seems strange to me; I don't doubt that mean people post on HN, but if they are getting tonnes of upvotes, rather than being downvoted or flagged, it would be really nice to see examples), but I appreciate that other people have different approaches to confrontation.

The problem is that it then becomes very hard to take a claim like "general uptick in mean-spirited comments" seriously. Asking for evidence of a claim is valid in any discussion.

Heck, if anyone wants to point these out but doesn't want to be seen doing it, they could just create a throwaway account and use that.

I would agree that there is no reason not to single out those who are constantly belligerent and vile. When these individuals are easy to identify, they are often isolated from the group and do not end up causing many problems. However, that is probably not the case that causes the most problems.

The case that I have seen cause the most problems is people who are normally fine but become belligerent and vile around a specific topic or individual. This hypothetical individual is normally a productive part of the community and may not deserve to be singled out. I would not out such a person unless I had good reason to believe that they would be treated rationally and that the community would see it only as an opportunity to help said hypothetical individual become more rational.

I’m sorry but I really think examples are necessary.

Slashdot attempts to fix this with meta moderation. Users are given the chance to judge up-votes essentially. Whether that is actually effective for them or applicable to Hacker News, I could not say.


I think I'm a pretty nice guy, but I've written some mean comments when I strongly disagreed with someone here. I imagine most of the mean comments here come from nice people who would rather not (if they thought about it for a second) respond in a negative way in those types of situations.
