
I think the cause is simply growth.

Absolutely. The more the original community is diluted, the more anonymous people feel, and the less constrained they are by perceived norms (because, effectively, there are none). You can see this in extreme form on forums like Kuro5hin (remember that?), which were abandoned to the trolls and are now overrun.

The problem with most quality-control systems is defining what is good and what is not. That's not a simple problem, and I'm not convinced you could ever get a crowd to decide it satisfactorily. There's a reason mass entertainment panders to the lowest common denominator: a crowd's votes are likely to trend the same way, and any automated system is subject to gaming and to the ignorance of crowds as soon as you let a lot of people have input, however small. Another solution, used by sites like reddit or stackoverflow, is to segment the community into groups small enough to be self-policing.

Another approach I wondered about recently was along the lines of 'A Plan for Spam': it'd be interesting to use the massive comment base of HN to build up a corpus of good and bad comments and apply Bayesian filtering to new comments. You could use this to:

Set an initial point score for comments based on their content

Weight comment votes from users who consistently scored highly

You would have to seed the corpus of course, and let it learn from trusted users' votes, but otherwise this sort of system would in theory continue to work as long as there were enough good comments being posted.
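The filtering idea above can be sketched as a small naive Bayes classifier over comment text, in the spirit of 'A Plan for Spam'. This is a minimal illustration, not anything HN actually runs: the 'good'/'bad' labels, the seeding step, and the score range are all assumptions.

```python
import math
from collections import Counter

class CommentFilter:
    """Naive Bayes over comment words with Laplace smoothing.
    Labels ('good'/'bad') come from a hand-seeded corpus plus
    trusted users' votes, as suggested in the comment above."""

    def __init__(self):
        self.word_counts = {"good": Counter(), "bad": Counter()}
        self.doc_counts = {"good": 0, "bad": 0}

    def train(self, text, label):
        self.doc_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def p_good(self, text):
        """Posterior probability the comment is 'good'. Laplace
        smoothing keeps unseen words from zeroing the product."""
        vocab = set(self.word_counts["good"]) | set(self.word_counts["bad"])
        total_docs = sum(self.doc_counts.values())
        log_odds = 0.0
        for label, sign in (("good", 1), ("bad", -1)):
            log_odds += sign * math.log(self.doc_counts[label] / total_docs)
            n = sum(self.word_counts[label].values())
            for w in text.lower().split():
                log_odds += sign * math.log(
                    (self.word_counts[label][w] + 1) / (n + len(vocab)))
        return 1 / (1 + math.exp(-log_odds))

    def initial_score(self, text, lo=-4, hi=4):
        """Map the posterior onto an initial point score for a new
        comment (the lo/hi range here is arbitrary)."""
        return round(lo + (hi - lo) * self.p_good(text))
```

In use, you'd seed it with a labelled batch of comments and then let trusted users' votes keep feeding train(); the initial_score() output is what a new comment would start at before anyone votes on it.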

I think those are great ideas. One question I would have is how you would prevent karma burning: could enough poor comments from a high-karma person make them forfeit their karma/gravitas, thus strongly discouraging such behavior?
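One hypothetical answer to the karma-burning question is to make vote weight depend on a user's recent track record rather than their lifetime total, so accumulated karma can't subsidize a run of poor comments. Everything here, the window size, the 0.5 threshold, and the floor, is an illustrative assumption, not a proposal from the thread.

```python
def trust_weight(recent_scores, window=20, floor=0.1):
    """Hypothetical vote-weighting rule: a user's vote weight is the
    fraction of their last `window` comments that the filter scored
    as good (probability > 0.5), clamped to a floor so an established
    user can't be zeroed out by a single bad streak.
    All parameter names and thresholds are illustrative."""
    recent = list(recent_scores)[-window:]
    if not recent:
        return 1.0  # no history yet: neutral weight
    good_fraction = sum(1 for s in recent if s > 0.5) / len(recent)
    return max(floor, good_fraction)
```

A sliding window like this means high-karma users who start posting poorly lose influence quickly, which is one way to make karma forfeitable in the sense the question asks about.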
