That would certainly help. But more than anything else, I think what we need (at least in computer science) is a mechanism to penalize people who publish the exact same research at half a dozen conferences.

Running PageRank on authors could work, though, with an unnormalized edge weight of "how many papers does author X have which cite something written by author Y". That way, if author Y publishes the same paper multiple times, the duplicates have no effect even if both copies get cited; and if author X publishes the same paper several times, it downweights all of his outgoing links equally.
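A minimal sketch of that weighting in Python (the toy citation records and all names are hypothetical, and PageRank is hand-rolled via power iteration rather than taken from a library):

  from collections import defaultdict

  # Toy input: for each paper, its author and the authors whose work it cites.
  # (Hypothetical data; in practice this would come from a citation database.)
  papers = [
      {"author": "X", "cites": {"Y", "Z"}},
      {"author": "X", "cites": {"Y"}},   # a second X paper citing Y again
      {"author": "Y", "cites": {"Z"}},
      {"author": "Z", "cites": set()},
  ]

  # Edge weight w[X][Y] = number of X's papers that cite *something* by Y.
  # Duplicate copies of one of Y's papers can't inflate this, because each
  # of X's papers contributes at most 1 per cited author.
  weights = defaultdict(lambda: defaultdict(int))
  authors = set()
  for p in papers:
      authors.add(p["author"])
      authors.update(p["cites"])
      for cited in p["cites"]:
          if cited != p["author"]:       # skip self-citations (a simplifying choice)
              weights[p["author"]][cited] += 1

  # Standard power-iteration PageRank over the weighted author graph.
  # X's rank is split among cited authors in proportion to w[X][.], so a
  # duplicated X paper only shifts his outgoing weight around rather than
  # creating extra rank to hand out.
  def pagerank(authors, weights, damping=0.85, iters=50):
      rank = {a: 1.0 / len(authors) for a in authors}
      for _ in range(iters):
          nxt = {a: (1.0 - damping) / len(authors) for a in authors}
          for x in authors:
              out = weights.get(x, {})
              total = sum(out.values())
              if total == 0:             # dangling author: spread rank evenly
                  for a in authors:
                      nxt[a] += damping * rank[x] / len(authors)
              else:
                  for y, w in out.items():
                      nxt[y] += damping * rank[x] * w / total
          rank = nxt
      return rank

  print(sorted(pagerank(authors, weights).items(), key=lambda kv: -kv[1]))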

Don't search engines already penalize duplicate content? Maybe it would suffice to 'just' put all papers online, convert all references to hyperlinks, and let Google sort out what is important and what is not. Actually, at least in computer science, I don't think stretching one's research across several publications is a widespread practice - at least not at 'top' conferences.
