Hard disagree. We need to incentivize more curation, distillation, summarization, comparing/contrasting, systematizing, categorizing approaches along various axes, meta-analyses, etc., not less.
These are often very valuable, more so than yet another 1% improvement paper that never gets reproduced.
Review papers don't steal citations. They are cited from more distant literature so readers can familiarize themselves with the topic, and those distant papers wouldn't have cited every paper mentioned in the review individually anyway.
But as a broader point, yeah, maybe something like PageRank could be used to "pass on" the citation in a sense.
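To make the "pass on" idea concrete, here's a minimal sketch rather than a worked-out proposal: run PageRank-style power iteration over the citation graph, so credit that arrives at a review paper flows onward to the original works it cites. The toy graph, paper names, damping factor, and iteration count below are all made-up illustrative values.

```python
# Minimal PageRank-style credit passing over a toy citation graph.
# Edge p -> q means "p cites q"; rank flowing into a review paper
# is redistributed to the papers the review cites.

DAMPING = 0.85
ITERATIONS = 50

# paper -> list of papers it cites (hypothetical toy corpus)
cites = {
    "review":   ["orig_a", "orig_b", "orig_c"],  # a survey citing originals
    "new_work": ["review"],                      # later work cites the survey
    "orig_a":   [],
    "orig_b":   [],
    "orig_c":   [],
}

papers = list(cites)
rank = {p: 1.0 / len(papers) for p in papers}

for _ in range(ITERATIONS):
    new_rank = {p: (1.0 - DAMPING) / len(papers) for p in papers}
    for p, outgoing in cites.items():
        if outgoing:
            # pass this paper's rank on to everything it cites
            share = DAMPING * rank[p] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        else:
            # dangling papers (citing nothing) spread their rank uniformly
            for target in papers:
                new_rank[target] += DAMPING * rank[p] / len(papers)
    rank = new_rank

for p, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {r:.3f}")
```

In this toy run, the originals end up with credit even though only the review cites them directly, which is roughly the "passing on" I had in mind.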
> These are often very valuable, more so than yet another 1% improvement paper that never gets reproduced.
It's not an either-or.
And you should always read and accurately describe the contents of what you cite, so of course you wouldn't cite everything in a review paper directly, only what's relevant.
Suppose we counted the two separately: research citations and review citations, the latter covering your great review papers or cases where people just want to reference the lit-review section of your research paper. What would happen to the incentive? I suggest reviews would be incentivized less, because people would rate researchers primarily by their research citations, just as no one cares about your textbook sales, however popular it may be for introductory courses. That would imply review citations weren't "honest" citations but were made to hack the metric.
Well, you guys are pulling me toward an extreme position here. Of course I love good textbooks too, and review journals like Signal Processing Magazine are my favorites. I'm sure many people write them for great reasons too. Let's not forget that academics teach as well, and there are academics and colleges entirely devoted to teaching, not research. People are free to prefer them if they like. However, when it comes to research metrics, the person who did the research deserves the credit for it. Even if we subtract the citations to obvious review papers (which is commonly done), there are still the citations lost by the original researchers.
Reviews also cause other devious problems, like journals hacking their own impact factors. And don't forget the Matthew effect (aka rich get richer), whereby only big-shot researchers get invited to publish their reviews in high-impact journals, warping the network effects in their favor even more.