I think the long-term answer is decentralized publishing. Publish everything you do on a university or private website and let others decide whether it's good when they want to cite it, instead of a peer review that is set in stone. I think people reading papers and deciding whether to cite you are smart enough to figure out if it's good research or not. The peer review process is overrated (and quite often suffers from insider networks). If you decentralize publishing, you can also have other researchers upvote a paper to, in effect, approve of the academic standards in the paper.
I also think the static nature of papers is a problem. I'd much rather cite a specific version of the paper. I'm thinking about git and pull requests along the lines of "want to cite, fixed layout" or "new research disproves this" etc.
Journals provide a filtering intermediary that helps me better use my time. Hopefully I can figure out which editorial teams are going to publish good vs. ok vs. crap research and pick from those, relying on the curatorial capabilities of their staff to provide interesting and useful reading.
Journals should still want to publish great research, and publicly funded research should still be available to the public. Maybe there's indeed a more relaxed middle-ground to "just put it all out there and hope folks find and share it." We all agree the journal industry needs some help, but I don't think society is well-served by completely decimating its economic model.
I don't have time to scavenge through all the tech news every day, but that's partly why I visit HN. Also, I use Google when I'm looking for some specific topic.
In other words, filtering can be done perfectly well by communities and third-party organizations.
It comes with its own trade-offs, particularly the tendency toward "viral" titles and abstracts so that a particular paper would be more likely to be clicked / skimmed / shared. I'm not sure that we want to encourage that as I think it would injure the quality of published works in general.
There are merits to being able to put faces, and reputations, behind the editorial direction of the information stream. Social-network-style consumption would necessarily anonymize or obscure the effect.
As before, there's definitely agreement that more access is better and I'd bet there's a good middle ground available.
For example, a journal I worked with while I was an undergraduate makes its articles open-access nine months after publication. You still get the editorial process and curation, the team has some financial support to evangelize their journal's mission, and most folks who don't need the most immediate access can get it for free. If you're an individual and want the articles immediately, the "subscription" takes the form of a donation of $40 or more each year, which comes with the biannual physical volumes and full digital access as a gift.
They're not the majority, of course, but options do exist that straddle both stakeholders' needs. Flipping the market completely on its head might simply throw out useful qualities that wouldn't transfer over.
> I don't have a ton of time to search each university's publication database or every 2nd tier research team's private home-grown web site
Directory of Open Access Journals - https://doaj.org/
Advance Public Access to Articles (Chorus) - http://www.chorusaccess.org/
So-called "high impact" journals suffer more retractions. http://peterdedmonds.blogspot.dk/2015/06/retractions-and-hig...
Society is most definitely well-served by completely decimating [legacy publishers'] economic model. Legacy journals stand in the way of progress and they are laughably inefficient in what they do (filtering / online storage / layout (really!)).
Related anecdote: I had a conference publish a completely plagiarized article of mine, changing only the names of the authors. The paper still contained recognizable photos of my students performing the experiments. I reported this to the conference and to the IEEE, which owned the copyright to our original paper. No action of any kind was taken. Not even a 'knock it off' email.
I don't have time to read every newspaper. Aggregator sites (like HN) provide me with a handy list of links to articles relevant to a specific interest or theme. Thus, I have not been overwhelmed by celebrity gossip when I go looking for nerd news.
And many of those aggregator sites are community curated, meaning that there is no editor to pay--only the site hosting bills.
But on the other hand, I'm not sure I want to see what karma whoring looks like for academics.
The major difference, however, would be that the research would at least be accessible to those who can't afford to pay the intermediary. That's what seems odious (to me, at least) about the current arrangement: if I want to access most of the research my taxes have funded, I have to pay again. This is on a moral par with being forced to pay for access to laws and court decisions that should be a matter of public record, especially in an era where the marginal cost of distribution is rapidly heading to zero, and I can't fathom an argument against basic, immediate, and perpetually guaranteed access in either case.
Profit margins around 40% if I recall...
 https://discrete-analysis.scholasticahq.com/post/40-welcome-... (ctrl+f $30)
One scientist has an idea; he publishes his hypothesis and intended methodology. Others can jump in and tell him, for example, that the hypothesis has been disproved in a recent paper, or suggest improvements to his methodology. Others can chip in and offer to replicate the experiment to increase the sample size. Mathematicians could observe and correct the statistical analysis before the conclusions are published. I think that would improve the quality of the results.
Of course, the bigger problem is that most papers would have dozens or hundreds of authors, diluting the individual contribution of each one. On the other hand, finished research would already be peer reviewed and corrected.
From my experience a main problem is real estate; you can't do most science experiments without a proper lab. We could probably brainstorm some creative solutions (like operating an "open" contract research organization to be the hands for everyone's ideas), but at the moment this is too big of a barrier.
With real estate comes cost, then funding, then jockeying for funding, and finally journal impact factors as a measurement of worthiness.
It's also why one can't make a science-related "hackathon" without changing the concept.
someone should figure out a way to make that happen
If you have fewer points than other people, you don't get a grant, or you're more likely not to have your contract extended, etc.
The ministry reevaluates the (journal IF <-> points per publication) mapping each year, but it takes a few years for a new journal to earn a significant number of points. Publishing in low-IF journals when your paper could be accepted by a major one is career suicide.
The hidden implication is that the system reinforces the power of the all-powerful top journals.
So, in a lot of ways, the competitive research culture isn't helping the journal inaccessibility/exclusivity problem, since researchers are actively seeking out those journals.
Mathematics seems like the most promising field to start with if you follow a bowling pin type of strategy. There have been some boycotts of Elsevier etc. from math departments and they have somewhat of a "free publishing" tradition at least from my outside point of view. It's also an excellent viral starting point because you can infiltrate other disciplines that tend to cite math papers (physics, machine learning, economics etc.)
I doubt a system like the journals would have developed in a world where internet already existed.
I think a variation of the system used to build the Linux kernel could work for scientific papers as well. In the kernel there is a pyramid of trust: Linus trusts some people directly, who in turn each trust another set of people, and so on downwards. In this fashion code can be signed off and bubble up through the hierarchy.
I suspect in similar fashion a scientific paper could bubble up by being reviewed by ever more respected scientists in a hierarchy.
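A minimal sketch of how such hierarchical sign-off might work (the reviewer names, the numeric levels, and the one-level-up rule are all invented for illustration):

```python
# Hypothetical sketch of a kernel-style trust pyramid for paper review.
# A paper "bubbles up" one level each time a reviewer at the next level signs off.

class Reviewer:
    def __init__(self, name, level):
        self.name = name      # reviewer identity
        self.level = level    # 0 = entry level, higher = more respected

def bubble_up(paper_level, signoffs):
    """Return the highest level a paper reaches, given the reviewers who sign off."""
    for r in sorted(signoffs, key=lambda r: r.level):
        if r.level == paper_level + 1:  # endorsed by someone one level up
            paper_level = r.level
    return paper_level

signoffs = [Reviewer("postdoc", 1), Reviewer("group lead", 2), Reviewer("editor", 3)]
print(bubble_up(0, signoffs))  # climbs level by level to 3
```

In a real system the sign-offs would of course be cryptographic signatures, as in the kernel workflow, not in-memory objects.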
(Probably not a big issue right now, but it could be in the future).
I think we can agree though that knowledge should converge into being easily accessible, all while reducing the collateral effects of doing so (as the article explains: tenure, promotions, prestige…).
It is incredibly obvious that journal content shouldn't cost as much as it does.
- Scholars write the content for free
- Scholars do the peer review for free
- All the legacy publishers do is take the content and paywall PDF files
Can you believe it? Paywalling. PDFs. For billions.
Of course the publishers say they create immense value by typesetting said PDFs, but as technologists, we can clearly see that this is bunk.
There's a comment in this thread that mentions the manual work involved in taking Word files and getting them into PDFs, XML, etc. While that is an issue, which you could consider a technology problem, it definitely doesn't justify the incredible cost of journal content that has been created and peer-reviewed at no cost. Keep in mind that journal prices have risen much faster than the consumer price index since the 80s (1).
The future is very clear: academics do the work as they've always done and share the content with the public at very low cost via the internet.
PS. If you want a peek into how the publishers see the whole Sci-Hub kerfuffle, check out this post from one of their industry blogs - the comment section is a doozy: http://scholarlykitchen.sspnet.org/2016/03/02/sci-hub-and-th...
For example, say you submit a paper to Nature. Odds are it will be rejected, because Nature has a high impact factor. Say your paper is both accepted and passes the review process. Now you have to pay Nature to publish your paper. Depending on the options you ask for (e.g. making it free to download costs extra) you can wind up paying thousands.
No university is going to say to one of their researchers, "Hey, could you maybe not get accepted by Nature quite so much? It's expensive!". Nature knows this.
Also, the format of submission varies by field. While Word files may be standard in some fields, in others LaTeX is expected. Obviously, typesetting a LaTeX document is pretty gosh-darned easy for a script to do.
Depends on the journal and the field, really.
This is not true for the vast majority of journals in the biomedical sciences.
It's just that publicly funded institutions should be forced to make their papers widely available free of charge.
Exactly. This problem needs a big push from outside the system, from government and people, because the publisher-academic careerist marriage is, unfortunately, working well for both.
It makes just as much sense as wearing a fashion accessory. "This person must be important because of his fancy watch"
The fact that they charge the author to be in the journal immediately brings into question whether it is truly the best research.
Papers published there are certainly at a high quality level. In predicate logic:
A - Published in Nature
B - Good research
A -> B is true, which does not imply !A -> !B
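That claim can be checked mechanically with a small truth table (a sketch; A stands for "published in Nature", B for "good research"):

```python
from itertools import product

def implies(p, q):
    # material implication: p -> q is false only when p is true and q is false
    return (not p) or q

# Rows where A -> B holds but !A -> !B fails:
counterexamples = [(a, b) for a, b in product([False, True], repeat=2)
                   if implies(a, b) and not implies(not a, not b)]
print(counterexamples)  # [(False, True)]
```

The single counterexample row, A false and B true, is exactly the good research that never appeared in Nature.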
Amusingly, when I was a grad student, some friends showed me how they could use LaTeX to produce a manuscript that looked like a spitting image of a Physical Review article.
That was in 1992.
Web-based LaTeX editing makes it very easy to collaborate. They also have many standard journal and university templates, and direct submission links to lots of journals, making the whole process of formatting and submitting a paper much simpler.
Yes, this of course raises the question of why some journals charge so much.
Who can blame them in a capitalist society? Anyway, I'm glad to see people respond via places like Arxiv. Sci-Hub is more like civil disobedience. Great that it happened for all the learners out there, but it's going to make some people very angry because they have bet their careers or invested based on the idea that the government protects and enforces copyright agreements.
We want people to invest in research. Academia has been very lucky in this regard, as higher-level institutions have continually grown in America. Should people begin to feel that investing in research has no return, there might not be so many tenureships around in the future. I don't think this is happening; I'm just conjecturing, and suggesting that while a correction is definitely needed, I'm not sure leaping to the "all free" model is going to work for every scientific community. I could be 100% wrong, though. I like the direction in which we are headed. It's more trusting, and I think that means growth for a society.
"A PDF is a weapons-grade tool for piracy"
"This is despite the fact that the publishers’ action directly addressed a very real …"
Journals can run themselves (with proper software) and historical impact factor needs to be abolished.
Wow, I'm amazed that scientists, of all people, would tolerate such a scam. Why don't they come together and, en masse, leave these paywalled journals and switch to free open ones?
In politics, you hear about how uneducated voters are tricked by politicians, but this is far worse. Here you've got some of the most intelligent and creative people on the planet knowingly allowing themselves to be screwed over. Wow.
Researchers are employed to (drumroll, please) research. (Plus teach, and mentor, and advise, and admin, etc.)
Disseminating your research requires writing it up, yes. So, no, they don't technically write up the research for free, but those who benefit (the publishers) are not the same entities who pay the researchers' salaries, so it's as if the publishers are getting something for practically nothing, i.e. "free".
> Yes, it is not like publishing a book but these papers are not really like books anyway.
Nobody is making that comparison.
It's very simple, the internet has shone a strong light on a very profitable business model and people don't like what they see.
Honest to god, and on the grave of my mother, this happened to a friend. She was going over some old Fortran 77 code, filled with GOTO statements. The code, at best, was a rat's nest; only a deep and long fight could get it into your brain. At about 3 am, after a long day in the lab, she finally gets to a line in the code that says GOTO LINE 12345 with the comment 'HAHA MADE YOU LOOK'. Her boss bought her a new computer after she threw that one off the roof.
That experience is universal with Biologist code.
That's not actually the point. The point is reproducibility. If you write your own code and get a different result, why? If the reason is the code then having the other code, no matter how terrible it is, allows you to figure out what the difference is. And then somebody is wrong and you know why and can publish that result.
The end result is that empires are built, reproducibility is (paradoxically) harmed, and we're someday going to end up finding out that some big, high-profile projects were built on pillars of sand.
From what I've seen when supporting EEs? People who are that much smarter than I am don't have this limitation. They can write thousands of lines of spaghetti perl and bash and as long as nobody touches the damn thing, it works fine. God help you, though, if you make a change.
But... there is an established job role for this; I mean, you don't want to make your EEs sysadmin their own LSF cluster, either.
Or need to add functionality. Or, just wait a year.
I think sometimes you just get so caught up in your field of expertise that you miss some of the tools that could drastically help you entirely. You have to wonder how much of an efficiency drain that adds up to over the course of even just individual research projects. And that's before you get to bad code.
I got access to the source code, and the super-complicated algorithm added almost nothing to the results; the glossed-over / hand-waved-past data normalization worked so well that there wasn't a need for any further classification. This paper was pretty well received and cited, despite basically not working.
I think a central hub for public comments, with source, and available PDF would be incredibly useful for CS as a field. For most papers I had access to, to get the source I'd have to go through at least one person. So I tried for quite a while to replicate the result before even trying to get access. If it was available, on say github, I'd have grabbed the source code as a resource to understand the paper, and with something like publicly available commenting, I think it'd have been a non-issue.
The problem is, I can see how such a system would be incredibly constructive to the field and the community, but it could be a liability to the authors, and would make publishers meaningless, so I doubt it happens anytime soon.
To be fair, a lot of people do publish their working code with the paper. It helps people like me understand how to do the work.
Bad code is fine, good code is better, great code is great. Stick to fine and I can do my job and you can write papers knowing that your work will be used by many.
The CRAPL: An academic-strength open source license
A much bigger problem is that grantsmanship strongly incentivizes against verifying your methodology and being diligent in your construction of null hypotheses.
Well, if you're embarrassed by your terrible handwriting, maybe it's time to put some effort into improving it.
Especially because if you are forced to publish your code, it will be better.
Academic code sometimes only has been made to work on one machine (the author's). Someone reproducing the results should still have access to the source, even if they have to hack it to work on their system.
Really, a lot of the comments on this and the downvotes seem to betray a lack of understanding of what CS research is like, which is understandable, I guess. CS research is not like running physics experiments; there are methods, techniques, and algorithms. When there is an experimental result, CS researchers do not treat it as sacrosanct and do have an informed skepticism, but we are generally able to discern the bulk of what is important from the content of the paper itself, in most cases.
Yeah, if you look at the experimental results sections of papers in SOSP, you could come up with all kinds of complaints, probably all day long, but there is more to it and to the community than is being given credit for here.
The alternative has way, way too much potential for rent seeking. You'll get private interests who connive a way to have the government pay to do the hard/uncertain/expensive part while they don't put in their dime until they're already sure it's going to pan out, at which point they swoop in and claim the full IP monopoly to the detriment of the public.
The second is a completely separate problem from the research. If you have a promising drug that people have been using in other countries for a thousand years but has never been FDA approved, it never gets FDA approved because nobody can patent the prior art so nobody will pay for the drug trials even if no research is required whatsoever.
Two possible solutions are to have the government fund the FDA trials, or to give the party who does fund them a temporary monopoly on the drug regardless of patentability. And if two parties offer to fund the trials in exchange for the monopoly, let them bid for the right.
If so, why should nations give away their expensive research that could potentially give them an edge, either militarily or commercially?
Personally I almost never traded technology when I played civilization, let alone give it away!
Yes. The answer is unequivocally yes. Your country will benefit vastly more from the research being freely available than it will be disadvantaged by other people getting a better world on your dime.
And here's the thing. Other countries face the same calculus. Scientific research brings the greatest return on investment of anything ever. It's why we have electricity and computers and satellites and penicillin. At the national level you can pay for it and give it away and still come out ahead regardless of what anybody else does.
Since every country benefits from doing it by more than it costs them, everybody can "cooperate" by doing research and giving it away without having to worry about anybody else defecting -- because defectors can't hurt you, they can only not help you. The sooner everybody realizes that the sooner everybody can choose to cooperate and the better off the whole world will be.
And the returns when every country publishes research for free are... large.
Good point about the current situation.
"We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it. "
- And especially when public policy often demanding massive input from the taxpayer depends on it.
This is one of the easiest ways to manipulate the AI players...
>But that financial model requires authors to pay a processing charge that can run anywhere from $1,500 to $3,000 per article so the publisher can recoup its costs.
Edit: also, I don't expect margins of 30% from companies producing very low value. Do you really think that if they'd lower their margin to 0, with prices of around 30% less, everyone would stop complaining?
1. Their margins are extremely high, 90% range.
2. Open access journals can charge orders of magnitude less
Neither is broadly correct.
Regarding the claim that ~30% margins are excessive: do you think that if their prices went down by 30%, everybody would be happy?
It is insane that they go after people who wrote the paper when they try to give it away on their own website.
A little reading goes a long way: https://en.wikipedia.org/wiki/Elsevier
Claiming that they're just extracting rent ignores the cost, which even at a 40% margin is still 60% of what they charge.
Do you have any information on their legal costs that would reduce the cost above?
Do you think everyone complaining about prices would be happy if they would reduce prices by 35% across the board, thus making no profit? Or would you still want it to be free? If the latter, your complaint is not solely based in Elsevier's "rent extraction", but in something else.
It's hardly surprising that publishers would fight dirty to hang on to a business model where scientists do research that is largely publicly funded, and write manuscripts and prepare figures at no cost to the journal; other scientists perform peer-review for free; and other scientists handle the editorial tasks for free or for token stipends. The result of all this free and far-below-minimum-wage professional work is journal articles in which the publisher, which has done almost nothing, owns the copyright and is able to sell copies back to libraries at monopolistic costs, and to individuals at $30 or more per view.
Given that peers review a paper for free and it costs fractions of a penny to host a pdf, why should we care about the publishers? What value do they ultimately bring to the table?
We should be rethinking how this is done entirely.
This is off the cuff, but...
Perhaps researchers applying for public funding should be required to register as peers. Each researcher submits their publication to a public service like data.gov (maybe research.gov?). Peers are automatically assigned according to area of expertise. The peer-review process proceeds much like it does today.
On completion of review and approval, the paper and any materials needed to reproduce the research is made public and available to anyone.
The cost for all of that would be relatively trivial.
Also, many of those journal fees are paid for with tax dollars to begin with. An expense that would no longer be necessary.
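A sketch of what the automatic assignment step might look like (the peer names, topic lists, and the overlap-based matching rule are all hypothetical):

```python
# Hypothetical sketch: assign registered peers to a submission by shared expertise.
def assign_reviewers(submission_topics, peers, n=2):
    """Rank registered peers by topic overlap with the submission; take the top n."""
    scored = [(len(set(submission_topics) & set(topics)), name)
              for name, topics in peers.items()]
    scored.sort(reverse=True)  # most overlapping expertise first
    return [name for score, name in scored[:n] if score > 0]

peers = {
    "alice": ["genomics", "statistics"],
    "bob":   ["optics", "lasers"],
    "carol": ["statistics", "epidemiology"],
}
print(assign_reviewers(["statistics", "epidemiology"], peers))
```

A real service would also need conflict-of-interest checks and load balancing, but the core matching is not exotic technology.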
It seems much more likely that PLOS simply charges the "$1000 or so" because they can, and it's easy enough to piggyback on the traditional publisher narrative that they are actually providing this amount of added value.
People forget the weird workflow in any journal that does biomedical work (including most of the PLOS family): manuscripts are submitted as Word files, then converted to XML and reformatted by a semi-automated process. There's manual labor involved to format the references and cross-references, mark up tables and figures, and proofread the HTML, PDF, and XML versions. The XML is then submitted to PubMed Central for archival storage.
There's a lot of manual labor involved in the process. You can't just hand-wave it away, say "peer review is provided for free", and assume the rest of the process is free.
Proper formatting makes it possible to automatically extract bibliographic reference data, text-mine the contents for whatever interesting purposes you might think of, display it in apps and new formats, and store it for decades in a stable format.
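As a small illustration of why the structured XML matters, here is a sketch that pulls reference data out of a JATS-like fragment (the fragment is made up and far simpler than real JATS markup):

```python
import xml.etree.ElementTree as ET

# Made-up, minimal JATS-like fragment; real JATS is far more detailed.
xml = """
<ref-list>
  <ref id="r1"><mixed-citation><source>Nature</source><year>2015</year></mixed-citation></ref>
  <ref id="r2"><mixed-citation><source>PLOS ONE</source><year>2014</year></mixed-citation></ref>
</ref-list>
"""

# Because the structure is explicit, extraction is trivial and reliable,
# which is exactly what a PDF-only archive cannot offer.
refs = [(ref.find(".//source").text, ref.find(".//year").text)
        for ref in ET.fromstring(xml).iter("ref")]
print(refs)  # [('Nature', '2015'), ('PLOS ONE', '2014')]
```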
Nice value added by the MSRI style right here.
There's a slow-moving Scholarly Markdown project, but I don't know what its status is. http://scholarlymarkdown.com/
I agree with the sentiment, though. If there were some markup language that supported what academics do with LaTeX and Word, and converted cleanly to JATS XML and to nice-looking PDFs, it would be enormously useful.
Do you have any evidence of such economies of scale in existing publishers? As the article says, their margins are around 30%. That means the maximum they could reduce the price without losing money (with simplifying assumptions) is in the 30% range. If you have a way to be significantly more efficient than the status quo, go ahead, I'm sure everyone will be thrilled.
>As a taxpayer, you already pay state universities to access those papers.
That's a fair point. Is the government paying more in access fees than they're saving by not using open access journals?
Here's an example of how to do it on the cheap:
"Scholastica charges us $10 per submission. We have a grant to cover this and our other costs, which are very low. Our total costs probably average about $30 per accepted article. Therefore, there are no charges for authors, and obviously there are no charges for readers either. A typical article processing charge for one article published in a traditional journal would keep us going for around 50 published articles."
Additionally, these journals really do provide more services than most specialized journals. In particular, the editorial service at Nature or Science is of much higher quality than your typical specialized journal. The editing includes editing the text for language and clarity, typesetting, and commissioning third-party high-level summaries (Nature's "News and Views" section). In contrast, at least in computer science and mathematics, most journals do minimal, if any, editing. The editors of prestigious journals do add value, and this helps to maintain the journals' relatively broad audiences.
And for a lab that's publishing a lot, it can add up. Some labs are putting out 10+ pubs a year. And depending on the field, that $10-20k can be a big deal.
Other open access journals are similar. Publication rates are high, because the vast majority of published research is NOT open access.
They may have pulled it out of thin air (although I've seen similar numbers elsewhere, this doesn't seem to be in dispute).
And furthermore, they're minuscule compared to costs of journal access.
Edit: I'm wrong on the second point. They're actually comparable. See comment below.
Research costs may be correct. Do you know where I can find data on average funding for published studies? Or how many government funded studies were published at what budget?
Let's say that I'm writing a paper. Maybe the underlying research cost $100K. So paying $3K for publication isn't much. And let's say that I need to review 100 papers for background, discussion, etc. At $30 each, that's also $3K. Even less, for subscription-based access.
So maybe it's a wash for grant-funded academics who publish in journals that charge $3K to publish and $30 per copy. But it certainly isn't a wash for academics with marginal funding, not-for-profits, etc, who publish at much lower cost.
They asked if I wanted my dissertation to be available, free of charge, to anyone interested in reading it.
Clicking on "yes, I want to make it available for free" would cost me something like $800.
Clicking on "no, I'll let you charge people to see it" would cost me nothing.
Having just finished, and being in debt to do so, it shouldn't come as a surprise that I wasn't rushing to pay even more. So now, if people want to see my dissertation, they have to pay -- or be part of an institution that pays an annual fee to ProQuest. (BTW, e-mail me if you want a copy.)
My guess is that it's similar with other journals. And while professors have more than PhD students, they have limited enough research funds that they'll hold their nose, save the money, and keep things behind a paywall.
Which is totally outrageous. It's about time this changed, and I'm happy to see what looks like the beginning of the end on this front.
Frankly, every university in the world should have an open archive for theses (and a mutualized one is even better). It's just the continuation of keeping thesis copies at the university library.
To me it doesn't merely seem criminal; it is criminal, for keeping knowledge in the hands of the few and privileged, to the detriment of the majority of humanity, is a crime. And not only does the industry keep this knowledge to itself, it prevents even those who directly fund it (taxpayers in developed countries) from accessing it themselves.
I honestly despise those journals who manipulate the current system for profit, just as I and I believe most anyone, would despise any organization or individual manipulating a system at the expense of the well-being of the many.
And I laud Sci-Hub and Elbakyan for something I consider a service to humanity.
In language understandable to the HN crowd: journals are to scientists as VCs are to (founders + investors).
They also have a somewhat restrictive licensing on their software:
This part is kind of weird, as they have a noncommercial clause on their copyrighted algorithms, which makes them non-free.
On a slightly different note, I was somewhat saddened by how the International Lisp Conference talks were recorded but unavailable because the ACM will not allow their publication.
The finances are on pages 12 and 13. The short version: they're spending money on all the things you'd expect, but they seem to be slightly cash flow positive and have accrued around $100m in net assets.
In reality, very few people pay the individual article prices. The real market is in site licensing, as it were. The individual article prices are probably set to support the sales story on these institutional access licenses, not the other way around.
But what are criteria for student memberships?
And here's an article that, for instance, costs $15 to download: http://dl.acm.org/citation.cfm?id=954342&CFID=591346815&CFTO...
I found the ACM paywall to be one of the obstacles that limits the access of the new wave of code educators to older results in teaching programming: they are not in the academic sector and lack both knowledge of and access to this enormous body of papers.
So, in this case, the paywall keeps a new generation naive, as they independently reinvent research that was conducted decades earlier.
But do realize that there most certainly is value contributed by the journals that make you pay: the editorial work, the typesetting, the selection, the whole system, etc. So they do need a paycheck from somewhere.
So, I think this is actually like the common software/music/movies piracy situation... but a lot better!
My (simplified) feelings about piracy are: do it if you're dirt-poor, don't do it if you can easily afford it. By this account, Kanye was a d-bag for pirating.. whatever software he pirated, because he, of all people, should be able to afford it. A starving artist with lots of debts, in my opinion, can't be accused of doing a great moral wrong when he's stealing a thing which has an effective zero marginal cost (arguably). The biggest problem here is that... there do exist people like Kanye who really should pay for things but end up not doing so.
What's different in this scenario is that... I am almost 100% certain that all institutions that should be paying... will be paying! I mean, can you imagine some Harvard or MIT lab skipping payment on the journals? No, that's just not going to happen. The Harvards and MITs of the world will keep paying... the rest -- the public, or schools in suffering areas and nations, who can't afford it as easily, will be able to get it using scihub. It works out wonderfully in this way.
This means we don't get many fancy videos unless we or our university's PR people make them, but it still seems to work fine for getting science done.
* 112 IT
* 10 customer service
* 3 data analysis
* 28 editorial
* 3 accounting
* 4 general MGMT
* 3 HR
* 35 marketing
* 18 manufacturing
* 22 product MGMT
* 3 program MGMT
I would really like to see an official breakdown of the cost per article and especially the processing cost paid by authors.
The correct question to ask is 'can' all research papers be free - does the world continue to spin, will research still happen, will we still progress, if they are free?
The only reason we even have this debate to begin with is because the producers of this information require scarce/controlled resources in order to survive.
What happened here is that jerks hijacked the academic publishing industry. They turned a system that was largely pro bono publico into an intellectual pyramid scheme. Academia has been slow to respond, mired in the web of prestige and citation. But maybe this is the endgame.
This generally chimes with my feeling. It seems to me that the users of research can derive vastly different value from it and so the flat price model doesn't really work.
Consider some new research on light emitting diodes and the value Samsung might get from that vs. me reading it out of curiosity.
For that reason, to me it makes sense to treat academic research as infrastructure and have free access to all funded via taxation.
Maybe completely free research papers are not the future but there should be a Spotify for research papers that is affordable for everyone. I hope that Elbakyan will reach her goal and ultimately change the whole industry.
Also, research that has been even partially tax-funded and results in a publication must not be usable as a necessary ingredient of a commercial patent.
That is, a patent can include this type of research, but the research cannot be a necessity for the patent to be viable. Or, if particular research is necessary for a given patent to be viable, the patent must grant no-fee, no-commercial-strings-attached use.
This allows a corporation to establish patents as a means to protect itself, while allowing the tax-funded research to be used by others without commercial strings attached.
There must be a lot of interesting meta-analyses that aren't getting done because the necessary data is locked away behind paywalls, and usually not in an easily machine readable format into the bargain.
For me, this is the crux of the problem. People who are in a position to change it should push for it.
I am a PhD who'd love to be working in industry, but I'm shit scared that once I leave the gates of the university I'll simply lose touch with the state of the art because the papers will no longer be accessible.
Just my opinion as a PhD, working in industry, who does not read nearly as much as he used to.
It is in the best interests of humanity to make the knowledge obtained through research available to anyone looking for that knowledge. There is a clear consensus among scientists that the current publishing model is at best inexpedient and at worst hostile to that end.
Most people are asking what good the current publishing model provides, but I think to answer that question we need to ask: "compared to what?" It seems clear to me that the current model is better than having no publishing mechanism at all, but I doubt that anyone seriously thinks that the "none" model is the only alternative.
I think that if we sat down today and thought up a new publishing model from scratch, we would be able to outdo the status quo on just about every "good" people have mentioned here, as well as provide features that the current model is incapable of. I think it is highly likely that we could make a system that ran on donated resources alone.
Some things we might want/have in a "from scratch" model:
1. Direct access to data-as-in-a-database instead of data-as-a-graph-in-a-PDF
2. Blockchain-based reputation system for scientists
3. P2P storage and sharing of scientific data
4. Tiers of scientific information, e.g. an informal forum-of-science, semi-formal wiki-of-science, and formal publications
5. Automated peer review process
6. A better and more consistent authoring tool for scientists
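As an illustration of point 1, here is a minimal, entirely hypothetical sketch (the table, columns, and numbers are invented for the example) of what publishing results as queryable data, rather than as a graph flattened into a PDF, could look like:

```python
import sqlite3

# Hypothetical example: a paper's results distributed as a small queryable
# database rather than as a static figure. All values below are made up.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE measurements (sample TEXT, temperature_c REAL, yield_pct REAL)"
)
conn.executemany(
    "INSERT INTO measurements VALUES (?, ?, ?)",
    [("A", 20.0, 61.5), ("B", 25.0, 64.2), ("C", 30.0, 66.8)],
)

# A reader, or an automated meta-analysis, can re-derive statistics directly
# instead of measuring pixels on a chart.
(avg_yield,) = conn.execute("SELECT AVG(yield_pct) FROM measurements").fetchone()
print(round(avg_yield, 2))  # 64.17
```

Nothing here depends on any particular publishing platform; the point is only that data shipped in a structured form stays machine-readable, which is also what the meta-analysis comment elsewhere in this thread is asking for.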
Conferences are more important to a scientific career than an OA paper. The people who hire you will in all likelihood have access to your paper even if it's behind a paywall.
Depends on your field of research, maybe? Paper publications make or break early biomedical research careers.
Further, the grants that these funds are paid out of often have travel funds built-in, so conferences are still available.
Yes, but that has nothing to do with the OA-status of those publications - if it's a high-impact journal, then it makes the career, if it's not, then not.
I am a professor, and I was at a meeting a couple of weeks ago that discussed open access publishing. There were people from many departments, and there was one professor there who was asking questions that made it clear he didn't know anything about online publishing (he was asking things like "where are the papers stored?" and said something about how "all this online stuff is like Big Brother").
Every researcher I know would publish for free if it wouldn't ruin their career to do otherwise. They want their results disseminated as widely as possible.
Why can't the national/international associations in each field set up simple publishing operations (or pay a university press to do it), with the same people doing peer review, and use their power to designate the prestige of each journal: 'this will be our tier 1 journal, this one a tier 2,' etc.?
Whatever it costs, it would cost far less than the existing setup. I'd think the U. libraries would be happy to share some of their enormous savings to fund the operation.
The real problem is the use of journals as an arbiter of research quality. If it didn't matter to your career whether you published in Nature or PLOS One, then everyone would publish in PLOS One. If you are more likely to get a job or grant by publishing in Nature, then people will do almost anything to get their work into Nature. We are trying to solve the wrong problem by worrying about journals controlling copyright.
But the control of copyright is a real problem, and I think we underestimate how much it hampers science (esp. considering the wonderful things one could do with machine analysis of the texts).
Libraries used to get a physical copy of the papers, which would grant lifetime access to the research.
Today, they pay for subscription, and as soon as they stop paying, they lose access to everything.
I am not sure that the way you put it is right either.
Because "who should be profiting from research papers?" is too generic of a question, and does not appear to necessarily supersede the question 'should tax-funded publication be readable for free?'
If I may rephrase your question to be:
"Quality control of a research paper must necessarily be funded (either by money or some form of barter).
Therefore: question (a), who should fund it; and question (b), who should receive funding to do the quality control?"
Then, obviously, this is an important question. And I do not believe it has been clearly answered, either in policies or on this forum.
My answer to ( a ) would be -- the same entity that funds the research (therefore in this case the tax payers)
My answer to ( b ) would be -- a licensed or otherwise professionally certified group, independently selected (that is not selected by the researcher that authors the publication).