Even if the top submissions and top comments fit squarely with what HN is supposed to be, there are still going to be comments and submissions that don't, and they will continue to be a problem. If a post has 50 comments and 25 are "bad", it's a problem even if the 25 "bad" comments are at the bottom of the page, because people will still reply to them and they will still be a part of the discussion.
"People are voting up comments or submissions that don't fit..." sounds just like "People are hiring people who don't fit..." to me.
Or am I wrong?
What is it supposed to be?
There are too many non-interesting, but pedantically correct comments that derail discussions. The community goal of an HN comment thread should be to unearth the most interesting, useful aspects of the submitted content. Correct but non-interesting, non-contributing comments should be down-voted.
Not so, there are norms: http://ycombinator.com/newsguidelines.html.
If people stop voting up such comments, people will eventually stop making them.
That's like saying, "If we start putting people in jail for using illegal drugs, people will eventually stop using illegal drugs." Unfortunately that's just not how it works - some people comment (or use drugs, etc) because they can't help themselves, rather than for karma/recognition/trolling/attention. Idealism is a good place to start but, long term, you have to recognize it as such.
In the same way, some people will make stupid comments. But there are also a significant number of people who won't. I know I'm far more hesitant to make dumb/funny comments on HN than on other sites because I know that they won't be tolerated, and will be downvoted.
Just because a system isn't perfect, and won't fix the whole problem, doesn't mean it won't have a significant impact on mitigating part of the problem.
It's not my analogy; that's just drug addiction. If you don't believe in the addiction part of drug addiction, then that's where we would differ, I suppose. I think that drug and alcohol addictions are true addictions for some people (as opposed to just bad habits).
I have to confess that I don't understand how the rest of your comments relate to my post.
The discussion was on using this system to deter people from posting "bad" comments, and you compared this to threatening people with jailtime for drugs.
And under the proposed system, those people will never achieve a high karma score and so never have disproportionate influence over the site. That's precisely why it's an improvement.
If you believe this, then what is the purpose of putting illegal drug users in jail?
However I think his analogy is flawed, and that a system that successfully penalizes poor comments and submissions does have a valid societal purpose here, mainly because posting is nowhere near as addictive as most illegal drugs. And for those whom it is - perhaps we can get a treatment program going :)
The moral majority (or a vocal minority with the necessary clout) imposing its ideas? Bureaucracy inventing roles for itself using some early 20th century myths about drugs? A convenient means for controlling ghetto populations (like blacks, latinos, etc)? Sheer stupidity? All of the above?
Only if those comments are made public to everyone. If this system were adopted, it would be sensible to also implement a filter so that low-scoring comments only appear to high-ranking users, who are presumably responsible enough to filter the bad ones out and raise the good ones to visibility. I'd be happy to see fewer comments per post if the quality increased.
What is a "bad" comment anyway? Your idea of a bad one is bound to be different from mine, and pg will no doubt have a third view. Would it be possible to present to each user those comments that that user is likely to want to see, based on that user's voting behaviour?
I'd like to see, as a result of algo changes, less nastiness and better thought-through discussions. Fewer me-too comments, fewer arbitrary no-value comments. I'll happily trust our dictator-for-life on this issue.
Of course there is danger in that as well, but I see so many lament that they wish it was like it was in the "good old days", where good old days equals the approximate time in which they joined. The way to do that is to let them have the member base just as it was back then, while still letting the site naturally evolve for the rest.
Here's a good example:
The solution then is to autohide comments with low scores(configurable on the page) like Slashdot does. Some of the HN comment threads have grown too long these days. It would be nice to skim the comments that the community thinks are the most valuable.
> The important downside of these results is that the people using the system were not aware that points are calculated in a different way.
As I mentioned in that HN-meta-whining thread, you can't engineer away jerks, and social problems need social solutions. People will naturally adapt to any sort of engineered restrictions and find a way around them regardless. Personally I think the only way to maintain a very high standard of quality in a community is strict hands-on moderation and swift ejection of members who don't match the community's goals and vibe.
(In either case I think the thread this post refers to was about the negativity in comments, not the quality of submissions. Though I might be remembering wrong.)
Properly weighting the votes from each user would allow more effective moderation to be crowdsourced from the whole community. Everyone should be a moderator, but only with the degree of authority that he has earned.
That weighting can be done with a technical approach as has been proven by search* companies. How does Google deal with the fact that there is no moderation on the web in general, and a multibillion dollar incentive to manipulate their results? Google results are really clean and usable despite the above limitations. Their approach is purely technical.
This proposal is an effort to enhance the ability of the crowd to moderate HN more effectively by improved weighting.
*Note that Google is almost solely a ranking problem, and the term "search" is a bit of a misnomer. If you could see the bottom-ranked matches for any query I'm sure you would see plenty of unsavory content, but the first results (the ones you actually see) are really clean.
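The weighting idea above could be sketched minimally. This is a hypothetical tally, not HN's actual mechanism; the function name, the default weight, and the vote representation are all my own assumptions:

```python
def weighted_score(votes, user_weights, default_weight=0.1):
    """Tally votes where each user's vote counts by an earned weight
    rather than a flat 1. `votes` is a list of (user, direction) pairs,
    with direction +1 or -1; users with no earned weight count for a
    small default instead of a full point."""
    return sum(direction * user_weights.get(user, default_weight)
               for user, direction in votes)
```

For example, `weighted_score([("alice", 1), ("bob", 1), ("carol", -1)], {"alice": 2.0, "bob": 0.5, "carol": 1.0})` gives 1.5: alice's upvote alone outweighs carol's downvote, which is the whole point of earned authority.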
2. It actually is very cleanly divided: bots & humans. Bad humans have no real power due to point 3.
3. Google has a click traffic amount that is so high that it can only be taken advantage of by bots. Hackernews is tiny in comparison and any algo can be manipulated with just a few bad actors.
Case in point: Digg.
Similarly, a technical solution which gives jerks a smaller influence seems an excellent partial solution. Sure, you might still need policing, but it isn't either/or.
And the majority of people are jerks.
Furthermore, even if the technical solution works to some extent, it doesn't work as well as it would if it were visible. The attitude of relying on monkey-see-monkey-do for establishing your social boundaries is wishful thinking if your approach to unacceptable comments and people is to sweep them under the rug.
That kind of tinkering is bound to create a community where the only people talking are those who most adamantly adhere to the echo chamber. We already know there are certain opinions on the periphery that we just don't discuss on HN, no matter how politely or clearly articulated they are. They go against the groupthink and are a good way to get shadowbanned.
The lack of visible moderation has not served HN well. Community management isn't rocket science, but sometimes it does have to get a bit ugly. The price you pay for letting people disagree is disagreement. You can keep it within bounds, but wall-papering over the bounds every time they are crossed doesn't actually improve anything, it just makes the same wounds happen again and again.
Many communities I've seen have done this in a way that drops-off over time.
Screw up any of your first 3 posts... you're out of here.
Posts 3 to 10... you're going to get a warning, and may still be out of here.
Posts 10 to 25... warnings only; only extreme cases get banned.
Above 25 posts you're largely left to it, though once in a blue moon someone will screw up so spectacularly that they'll get pulled up by the community.
Even then, the "pulled up by the community" is literally that. No moderators or banning. The communities that enforce rules strictly for new members and then relax once you get comfortable within the constraints are generally also the ones in which, if you suddenly made an extremely racist post, the solution is to be ripped to shreds by your peers rather than to censor and hide it... it reinforces the sense that "that behaviour isn't acceptable here".
This is currently the most successful pattern I've tried. It works far better than other systems I've seen.
The question was, "How does that system deal with the old members > new members hierarchy that would develop? I think the problem with giving a member a longer stick the longer they stay/comment is that older members are given greater leniency when they do decide to be jerks, and new members who may very well develop into excellent members are kicked aside for possibly minor errors, thus creating an unfair advantage."
That hierarchy does exist. I'm not seeking to eradicate it.
The older members tend to be the ones that have established the tone and quality. This can be used to determine the rules that are used in the early stages of membership of a site.
But generally that hierarchy refers to inner circles and cliques who are protective of what the site evolves into for them, and this isn't addressed in any way by the system to integrate new members.
Old members will eventually leave, the question is more whether the value of the community is preserved when that happens.
Edit: You could say that only experienced users (members for several years with X many points) may use the jerk flag. Or something like that.
If the majority of a site has $opinion_x, chances are people with $opinion_y will earn jerk points every time they express themselves.
This also reduces the problem of people down-voting posts simply because they disagree with an opinion.
But the downside described in the testing results is not necessarily a negative. Maybe if people knew that their reputation and voting influence were on the line, they would try harder to write more constructive comments. That's all speculation of course, and the only way to find out is to experiment :)
You could ask folks first. Not as reliable, but a lot less expensive.
Here's an unsolicited data point. I've got reasonable karma. I don't care about karma. And I'm pretty sure that some of the most interesting folks don't care either.
That's not quite true. I do care about all the karma whinging. I dislike it.
It seems like I'm always on the tail-end of communities. Is this just simply a case of the grass is greener for you veterans of HN?
Communities seem to have a 2-5 year lead in cycle, followed by 1-3 years of good, healthy community. Then they get mainstream, bloated, and meta.
But they never really die, which is the saddest part. I can go to social networks I signed up for over a decade ago, and people are still stomping there.
And they are angry. Angry at being deserted, or angry because they put all their eggs into one basket, but either way you wind up with little hovels all over the Internet that are remnants of the gold rush for good discussion.
Some day, the Internet as a whole will find its El Dorado. One thing's for sure - it's probably not Facebook.
However, I've been on here for 3+ years, and unfortunately I've seen the community decline. It's only been in the last year that I've consistently seen more than 100 comments on frontpage articles (although this is probably a cognitive bias), which in my opinion is a bad thing. The quantity has certainly increased; technically the quality hasn't diminished, it's just much harder to dig through the crap to find it.
TL;DR - The signal to noise ratio has gotten much worse, but the signal quality seems to be the same.
Some new community is brewing somewhere as we speak, just participate wherever you still enjoy while you still enjoy it. And when you find a new community that you enjoy, simply participate ... in ten years, I guarantee that you'll make a comment exactly like this to some young gun ;)
Sadly HN is feeling more and more like it's in its twilight years, if only because of all these endless posts on how to "fix" it. Particularly since I'm seeing people suggest a $5 signup charge. That killed K5 stone dead.
It may not be everyone's cup of tea but the site has grown leaps and bounds and hasn't yet strayed from its original goal, which is albeit more open-ended than HN. I'm not suggesting users pay $5, but there are more nuanced approaches than trying to squash the problem with the math hammer.
Let's create artificial scarcity: let each user be able to vote only on a small random sample of submissions/comments.
Other ideas: Require approval by 2 blindly selected users before showing, or allow multiple dimensions in voting; but these don't seem to work.
Anyway, this is a deep issue, essentially political. It's also fun to consider the dizzying election procedure that the Venetians used.
Slashdot pioneered that. In case you are not familiar with the scheme, all logged-in users are occasionally asked to vote with their 5 allotted points. Users with high karma are given a chance to vote more often. The points expire in a day or so if they are not used. They also have a meta-moderation system, where you judge how others have used their votes.
They've had the present system for a while (I want to say over 6 years, but I don't frequent the site as much these days). The overall quality of comments there is... OK. Not great, but not as bad as it was for a while around the turn of the millennium.
I remember when real scientists used to visit /. Now, not so much.
I also appreciated the changes they made when comments marked as 'funny' didn't count towards your karma, and made it possible to filter based on that too.
One way to provide context could be to allow people to tag their downvotes with an adjective like "irrelevant", "mean", "notreddit", etc. Then, critically, only the author of the comment would be able to see this additional feedback.
Karma has the most influence on long term behavior - HNers have to learn community mores over time, and one vote one point is well suited for that task.
One problem with community forums is that there is a drift to the 'majority taste'. As the population changes, so does the majority taste. This could put some drag on that drift, but you'd probably need to use historical voting data if it were to pull things back to the HN center. Otherwise it might just hold the community in a place where HN doesn't want to be.
I see the voting system as one of the problems. More accurately, the obvious emotional or fanboi voting trends along some topics. Blind down-voting, as I see it, is a problem. If you down-vote you ought to be required to provide an explanation, otherwise, don't. Topics such as many subjects surrounding Apple draw in people who will mercilessly down-vote well reasoned comments simply because they happen to be anti-Apple. The same applies to the occasional political discussion in which anything that isn't aligned with liberal thinking is castigated.
The biggest problem I see with this is that it does not promote intelligent contrasting posts. Those with different opinions either leave (or stop posting) or fall-in with the crowd and choose to become part of the collective. If one was after collecting points and accolades all you have to do on HN is be all over anything that relates to Apple and talk like a Liberal. That's the formula. Oh, yes, you also have to be 100% pro FOSS.
The fact that it is so easy to define is kind of sad. Then again, maybe that's what the YC guys wanted HN to be. If that's the case, there are no issues. If the idea behind HN was to bring together pro-Apple, pro-FOSS, politically Liberal hackers, then there's nothing to fix.
If, on the other hand, HN is intended to promote diversity of thought along a wide range of subjects of interest to hackers and entrepreneurs, the voting system needs to reflect this.
Here is an idea:
- Require downvoters to provide that explanation.
- Use some simple automated method (e.g. run the text through a language recognizer and see whether it thinks it is English) to prevent simple key-banging as a response.
- By default, do not show the explanations, but do show a 'has downvotes' indicator and a 'show downvotes' control (could be a single UI element).
- When showing downvotes to a user, do not show author names.
- Allow users to flag downvotes as inappropriate.
- Let the powers that be manually check flagged downvotes (hopefully there aren't many), and take appropriate action against the downvoter, the flagger, or both.
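The key-banging check in the list above would ideally be a real language recognizer; as a much cruder sketch, here is a heuristic stand-in. The thresholds (minimum word count, vowel ratio) are my own invention, not anything proven:

```python
def looks_like_keybanging(text, min_words=3, max_vowelless_ratio=0.5):
    """Crude stand-in for a language recognizer: flag an explanation as
    probable key-banging if it is too short, or if more than half of
    its 'words' contain no vowels at all (e.g. "sdfg hjkl qwrt")."""
    words = text.split()
    if len(words) < min_words:
        return True
    vowelless = sum(1 for w in words
                    if not any(c in "aeiouAEIOU" for c in w))
    return vowelless / len(words) > max_vowelless_ratio
```

A real deployment would want an actual language-ID model; this only stops the laziest mashing, which may be enough to make a typed explanation feel like real friction.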
In the post that started this recent wave, someone was, I suppose, genuinely upset about not receiving good comments on a "look at my project" post. However, a user looked through the posts and actually found a lot of praise ... So, I don't know what to think about that.
Oh yeah, I might get downvotes - but - it's still one of the massive comment systems that more or less works. More... or less. :)
I also have to say that I love this site. I feel that the content of the posts has made me a better programmer in general, and I really appreciate everyone's input. Even if the discussion threads spiral out of control quickly.
It would be nice to have expandable comments so that when people begin name calling we have an easy way of filtering them out on screen and just viewing the people who directly address the post.
If pg really wants to preserve the feel of HN, he needs to develop a system that analyzes the behavior of users to determine their acceptance level of other users. This will create a network effect, where the most influential users will drive the site.
He then needs to seed the system with 5-10 users who he considers paragons of HN (I can think of five off the top of my head, but his are almost certainly different), and tweak as necessary.
Only votes by members of my extended circles count towards my "score" for a given item, weighted by distance. If something is upvoted in my perspective, I can see who and why, and break that connection (remove a person from my circles).
Finally, I can publish a group of users I trust and give it a name, which makes it easy for others to follow well-known groups.
I think it would work. A user has to put in a little effort, to find people they think have good judgement. It's very easy to knock out bad judgments by breaking links.
If I were less tired I'd think this through. The downside to all of this is processing time, of course.
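A minimal sketch of that circles idea. Everything here is hypothetical (the names, the 1/(1+distance) weighting, the vote representation); it just shows that the per-user perspective is a simple filtered, distance-weighted sum:

```python
def personalized_score(item_votes, my_circles):
    """Score an item from one user's perspective: only votes from
    people in my extended circles count, weighted down by distance.
    `my_circles` maps user -> distance (0 = direct circle, 1 =
    friend-of-friend, ...); `item_votes` is a list of (user,
    direction) pairs with direction +1 or -1. Breaking a connection
    is just removing the user from `my_circles`."""
    return sum(direction / (1 + my_circles[user])
               for user, direction in item_votes
               if user in my_circles)
```

For example, with alice at distance 0 upvoting and bob at distance 1 downvoting, a stranger's vote simply doesn't exist in my view. The processing-time worry is real, though: this has to be computed per viewer, not once per item.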
Also, know the official etiquette/guidelines:
You also cannot moderate and participate in the same conversation. Posting a comment revokes any moderation in the thread.
You can change your user preferences to augment the moderation choices, such as -5 to funny and +3 to insightful.
CmdrTaco provides history on how the system evolved: http://slashdot.org/moderation.shtml (1999)
Particularly with the nerd community, the ability to reply and pick apart what the parent post said almost ensures that every discussion devolves into inanity.
I try to make it a point, if I'm going to comment, to just leave a top-level comment and not respond to replies. It works well enough, and it might be worth considering as a rule.
It's well known that the number of votes a submission or comment gets is very time-sensitive, and the Wilson bounds are substantially affected by that number. This leads to little-seen content earning its author very little influence, which may be purposeful but probably isn't.
A further little-mentioned complication is the existence of thresholded voting: some people distribute votes freely, while others manage their vote histories by limiting themselves to voting on exceptional content, and yet others use contextual voting, touching only misrated content. Given the dependence on voting, I feel this aspect has not been sufficiently investigated.
You could take this to the level where if Bob and Mary both vote for something, it's a sure hit, but either one on their own means little. Some kind of Bayes implementation.
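For reference, the Wilson bound mentioned above is usually the lower bound of the Wilson score interval on the up-fraction; a sketch (z = 1.96 corresponds to roughly 95% confidence):

```python
import math

def wilson_lower_bound(upvotes, total, z=1.96):
    """Lower bound of the Wilson score interval for the true upvote
    fraction. Small samples get penalized: 9 up of 10 scores lower
    than 90 up of 110, which is exactly the time-sensitivity issue -
    little-seen content can't earn a high bound."""
    if total == 0:
        return 0.0
    phat = upvotes / total
    z2 = z * z
    denom = 1 + z2 / total
    centre = phat + z2 / (2 * total)
    margin = z * math.sqrt((phat * (1 - phat) + z2 / (4 * total)) / total)
    return (centre - margin) / denom
```

This also makes the Bob-and-Mary idea concrete: any pairwise scheme would need its own statistics on top of this, since the Wilson bound only sees counts, not who voted.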
So: We need to be able to measure the quality of HN. This doesn't sound any easier -- maybe it isn't. But the process of figuring out how to measure the quality will take us a long way toward telling us how to maintain and increase it.
Without measurement, we don't really have a basis for agreement on whether a change to the mechanism is good or bad.
vote = min(10, floor(karma/1000))
At the same time noobs don't have voting power until they get enough karma, increasing comment quality.
Trolls who get below -100 will be automatically hellbanned.
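That capped formula, as I read it, in code (the cap of 10 and the 1000-karma step are just the numbers proposed above, not anything official):

```python
def vote_weight(karma, cap=10, karma_per_point=1000):
    """One point of voting weight per 1000 karma, capped at 10.
    Users under 1000 karma get zero weight, which covers the
    'noobs have no voting power' rule; a separate check could
    hellban anyone below -100."""
    return min(cap, karma // karma_per_point)
```

So a fresh account carries no weight, 2500 karma carries 2, and nothing carries more than 10, no matter how old the account.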
The site guidelines have been updated pretty recently; I see one significant change in the guideline text that responds to controversies that came up over the last two months. The Hacker News welcome message gives an overview of the community experiment here, summarizing the site guidelines. The Hacker News FAQ gives some additional details about how Hacker News is administered. The welcome message distills the basic rules into a simple statement: "Essentially there are two rules here: don't post or upvote crap links, and don't be rude or dumb in comment threads."
In recent discussion, several HN participants proposed further revision of the guidelines, and also proposed making the guidelines more visible during the article submission and comment submission process. For example, have the "add comment" form field prominently display a link to the guidelines, perhaps with a snippet of text referring to the guidelines most related to comments, and similarly have the submit form prominently display a link to the site guidelines and a brief description of what kind of submissions are most desired.
Weighted voting fixes based on user behavior signals do seem like a good idea (that is what pg is thinking about), and we might as well discuss how to make those fixes mathematically correct. The technical tweaks can best be reinforced by ongoing efforts at user education, including possible revision of the guidelines, and, in the opinion of several HN users, simply making the guidelines more visible to everyone who submits an article or posts a comment.
Edit to reply to first comment kindly made to my comment:
What kind of comment is most often upvoted, and by whom, is an empirical question. I don't have access to the data to resolve the question of what kind of comment is most readily upvoted. The bestcomments page here on HN shows a current snapshot of what the current community, based on current communication of the guidelines and current technical features of the HN software, has upvoted the most in the most recent several days.
I think since we can't see the vote numbers people are less likely to upvote longer comments now - but I don't imagine them losing out to jokes.
I'd still think that, overall, good comments are going to attract more votes than joke comments, and with less risk of being downvoted for 'making this like reddit'.
Making a dataset available, even to a limited set of folks under NDA, could allow some interesting experimentation.
A complicated algorithm or formula will not fix this.
New members have the ability to upvote. If there are enough of them with interests and standards contrary to the current culture of HN, they will upvote each other and bootstrap their own culture. Eventually they and the people they subsequently attract to HN will overwhelm the original culture.
This problem will not be fixed unless there is a threshold system for upvoting or PG and the Y Combinator mods take a more aggressive approach in removing articles.
Sure it can. If you hide low-scoring comments from low-ranking users, but show them to high-ranking users, the high-ranking users can exert influence on the low-ranking users by upvoting quality comments and downvoting poor ones. It becomes a self-organizing system with a bias towards quality, rather than the open-loop 'majority rules' system currently in place.
Moderators, moderators, moderators, moderators, moderators, moderators, moderators, moderators. Moderators, moderators, moderators, moderators, moderators!