Sweden ends contract with Elsevier, moving for open access for science articles (kb.se)
784 points by rreichman on May 18, 2018 | 116 comments

I am surprised that it took so long after the invention of the Internet. In the pre-Internet era, these journals played a significant role in distributing research papers in physical form, and that made sense: there was no way to copy a 10-page research paper from Germany to China with a few finger taps at low cost. But with the advent and pervasiveness of the Internet, it no longer makes sense to rely on a costly medium built on physical printing and distribution, with centralized organizations milking money out of it.

I remember Timothy Gowers calling for a boycott of Elsevier back in 2012. It's been six years since then and Elsevier is still alive. Influential researchers still submit their work to Elsevier! It took less time (a few weeks?) for everyone to boycott Digg!

It's surprising how the Internet has been used to distribute cat videos, advertisements, and time- and attention-draining content to a sickening degree, yet it is still underutilized for distributing good content like research papers, for which it could be the primary and de facto medium.

That's because, despite being called "publishers", publishing is not the main function academic publishers like Elsevier are being used for. Their main function is identifying and recognising the most valued academics in each field. So obviously academics are going to pay whatever amount of public money is needed and available to get that stamp of approval, because otherwise they'll be out of a job in a very competitive job market.

Edit: The argument in more detail: https://medium.com/flockademic/to-fix-scholarly-publishing-d...

Most of the time the publishers just forward the abstract or sample to peers and then make their decision based on that.

Imagine HN where you are only allowed to publish your own content and random users would be selected to evaluate your work.

Right, it's not the actual evaluation part that's the publisher's value (they do provide value there, but others can do that just as well); it's the brand name that acts as the credential, i.e. the recognition that publishing in a certain journal gets you, that leads to the high prices.

> Imagine HN where you are only allowed to publish your own content and random users would be selected to evaluate your work.

We should try this!

it's called "the internet"...

So like submitting a pull request and having my peers evaluate it?

Sort of, but there's the additional constraint that the repo only accepts a limited number of PRs per month.

I have one small question about the article:

It says "If researchers are evaluated on the quality of their research instead of the journal it was published in, that would remove their need to pay whatever amount of public money a publisher asks of them just to obtain career credentials."

But at least in my field, most high impact-factor journals don't charge any money for publishing there. So the researchers don't really spend any significant money on the publishing part.

I don't know exactly how they make their profit (other than the obvious subscription fees they charge the schools), but I don't think researchers themselves are paying that (unless of course, by "researchers" you mean universities as organizations, but then again "find ways to promote your academic work that are independent of the journal it’s published in" will not be relevant, since you still need to pay to read other people's publications).

The money comes from the subscriptions.

The need to pay "whatever amount of public money a publisher asks" exists because the important researchers choose to publish in those journals instead of other journals where their articles would be freely available.

If researchers are evaluated on the quality of their research instead of the journal it was published in, then one could simply require that they publish somewhere freely available. As it stands, though, they need to publish in exactly those prestigious journals, which happen to be very expensive.

Right, that could have been clearer. It referred to the money one often has to pay to make a work available as open access when it's published in one of the traditional "prestigious" journals, i.e. to take it out of the subscription model.

However, in that case, too, the money is not the researcher's. That said, they are the ones that spend it.

In the case of subscription journals, the argument is similar, except that in this case, it's academic libraries who will have to pay whatever amount of public money a publisher asks of them just to be able to access that research. If the researcher could obtain their credentials elsewhere, they would be able to make their work available somewhere where the libraries and the general public would not have to pay to access it.

This is it. They are not distributors; they are the gateway to professional advancement in science. You will not get much just by posting things to arXiv.

So that implies that a majority of academics, or at least a portion with meaningful power, actually place value on the 'name stamp' that Elsevier and others apply to other researchers' work. They could all just read a PDF from the original researcher's own web page or university's online archive, but they prefer the 'Elsevier-gated' papers instead. Otherwise people publishing via Elsevier wouldn't "get much" either.

The question is whether that pays off in the end. That is, do Elsevier and the others set the bar high enough that it's absolutely not worth browsing and reading papers outside the commercial publications, or is it just laziness and convenience that lead people to place their trust in these academic publishers? Academics should be aware of the money and power game involved in commercial publishing, so they should be in a position to judge whether that trust is warranted.

Well, it's not so much about the readers of the research (in some disciplines especially, most of them access articles through arXiv), but about the academics on tenure-track or grant-application committees. They're not actually going to read the research of the hundreds of applicants they have to judge, but will rather rely on proxy measures such as the journals they have published in.

See e.g. https://theconversation.com/why-i-disagree-with-nobel-laurea...

In some sense gatekeeping is essential: one might want to skim all recently accepted papers from the top venues of a particular domain, but skimming all submitted papers from all venues in the same domain is physically impossible, even if it were a full-time job. You do need the community to read all the proposed papers, throw away the majority of them, and sort the others into lower-ranked publications focused on a narrow niche, which you read only if that niche matters to you.

So publishing on your own site or on arXiv isn't sufficient on its own; some curation and gatekeeping is essential for academic publishing to work. However, this is not a service that Elsevier et al. provide for their payments; this service is provided by uncompensated reviewers from the community.

On the other hand, the ratings are used by the evaluators of science, e.g. funding agencies, which won't (and can't) evaluate results directly but need some trusted proxy to separate random ramblings from acceptable science, and the prestigious journals fulfil that function.

The fact that they put the 'name stamp' is due, essentially, to the referees that allow that paper to be published. Refereeing is also part of the CV of a researcher. So, unless you find another way to do the refereeing process, there is no alternative to the 'name stamp'.

Just uploading to arXiv will not guarantee that your paper has been reviewed.

Reputation is misleading. The Special Theory of Relativity is not good because Einstein wrote it. It stands on its own.

We could even argue that a purely meritocratic system of publishing will accelerate good quality research as opposed to a reputation based system.

Funding and career advancement should be orthogonal to knowledge distribution.

>Reputation is misleading. The Special Theory of Relativity is not good because Einstein wrote it. It stands on its own.

Your Einstein example actually contradicts your point about reputation.

The mistake is using hindsight to judge that paper's quality. Instead, we have to imagine looking at that SR paper as if we lived in 1905 and didn't know if the 26-year-old patent clerk was a crackpot or a genius.

Einstein sent his manuscript to an academic publisher where Max Planck was one of the editors. This is important because MP already had the reputation in physics so his recognition of Einstein's papers' value was taken seriously.[1] The journal itself also had a reputation which was reinforced by the reputation of its editors.

In 1905, Einstein's papers didn't stand on their own. Their dissemination and acceptance were helped along by Max Planck's "stamp of approval".

[1] https://en.wikipedia.org/wiki/Max_Planck#Einstein_and_the_th...

So do we need more people like Max Planck and Einstein, or do we need Elsevier?

Absolutely, but you're describing what the world ideally should be rather than what it actually is, I presume? :)

This is not a technological problem, but a coordination one. Each of the individual researchers, faculty and grant committees are being perfectly rational in supporting status quo, they are just stuck in a suboptimal Nash equilibrium [1].

To quote Yudkowsky for a possible solution to this metaproblem "That’s why we have ... there doesn’t seem to be a word in your language for the timed-collective-action-threshold-conditional-commitment… hold on, this cultural translator isn’t making any sense. “Kickstarter”? You have the key concept, but you use it mainly for making video games?" [2]

[1] https://en.wikipedia.org/wiki/Nash_equilibrium

[2] https://www.lesswrong.com/posts/x5ASTMPKPowLKpLpZ/moloch-s-t...


> An assurance contract, also known as a provision point mechanism, or crowdaction[1], is a game theoretic mechanism and a financial technology that facilitates the voluntary creation of public goods and club goods in the face of collective action problems such as the free rider problem.


> Dominant assurance contracts, created by Alex Tabarrok, involve an extra component, an entrepreneur who profits when the quorum is reached and pays the signors extra if it is not. If the quorum is not formed, the signors do not pay their share and indeed actively profit from having participated since they keep the money the entrepreneur paid them.
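The payoff structure described in that quote can be sketched in a few lines. This is a toy model only; the pledge, bonus, and value figures are made-up assumptions, not numbers from Tabarrok's work:

```python
def payoff(signed, quorum_reached, pledge=100, bonus=5, value=150):
    """Toy payoff for one potential contributor under a dominant assurance contract."""
    if quorum_reached:
        # The good is produced; free riders enjoy it without paying their pledge.
        return (value - pledge) if signed else value
    # The good is not produced; only signers collect the entrepreneur's bonus.
    return bonus if signed else 0

print(payoff(True, True))    # 50: signer nets the good minus their pledge
print(payoff(True, False))   # 5: failed quorum, signer keeps the bonus
print(payoff(False, False))  # 0: non-signer gets nothing either way
```

The point the mechanism makes is visible in the failure branch: whenever the quorum would fail without you, signing strictly beats abstaining (bonus vs. nothing), which is what pushes participation toward the threshold.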

There's actually a publisher in the humanities that operates like this! Knowledge Unlatched funds monographs (i.e. books): academic libraries pay a fee not to access those books, but to have them made available as Open Access. It's been really successful in obtaining funds for open access publishing while circumventing the problem of people not paying because others might: https://en.wikipedia.org/wiki/Knowledge_Unlatched

Interesting. So perhaps a non-profit could start a grass-roots crowdaction campaign for all researchers in a field to switch to open-access overnight, and use US contract law to enforce compliance (in the US) if the threshold is reached?

There could be some penalty that researchers agree to pay if they renege on the agreement, and that penalty could be contractual. The non-profit could collect donations for marketing, and enforce the transition through legal action (if needed). Any proceeds could go to funding the next research field's crowdaction.

Critical mass, momentum, catalysts, etc?

Because as techies we fail to recognize that some things are the way they are not for lack of a better alternative, but because of entrenched interests.

I am still amazed that the record industry managed to shut down what was likely to be the next iteration of the internet: P2P exchanges. Now we have a centralized youtube solving in a bad way what eDonkey+VLC could have solved for two decades already.

> P2P exchanges. Now we have a centralized youtube solving in a bad way what eDonkey+VLC could have solved for two decades already.

Hey, that can still happen..

Here in China, where piracy is rampant, P2P died because of the lack of public IPs.

Was that an accident, or was it on purpose because centralized systems are easier to monitor/censor?

China does not have many public IPs (v4) in the first place, so I doubt it is in any way deliberate. See https://en.wikipedia.org/wiki/List_of_countries_by_IPv4_addr...:

Country | IPv4 addresses per person
US | 4.91196
China | 0.24591

And IP addresses per internet-connected person?

Searching for 'China internet percentage' shows several sites claiming ~50% of the population online, so double China's ratio to ~0.5 addresses per internet user, still roughly 10 times fewer than the US figure.
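Spelling out that back-of-the-envelope step (the 50% online share is an assumption from the search described above, not measured data):

```python
us_per_person = 4.91196   # IPv4 addresses per person, from the table above
cn_per_person = 0.24591
cn_online = 0.50          # assumed share of China's population online

# Adjusting China's figure for internet users only:
cn_per_user = cn_per_person / cn_online       # ~0.49 addresses per Chinese internet user
print(round(us_per_person / cn_per_user))     # roughly 10x fewer than the US per-person figure
```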

I don't know the numbers, but even if the ratio exceeded one, it wouldn't mean every Chinese user would get a public IP. Many IP addresses are still held for sale.

Isn't IPv6 widely deployed in China? Most major BitTorrent clients have partial or complete IPv6 support these days, although IPv6 DHT support is still lacking apart from Transmission and BiglyBT.

But I don't know about the state of the multi-network clients which are popular in China.

> Isn't IPv6 widely deployed in china?

No, IPv6 is currently available in only a few places, such as universities. Where it is available, torrenting is indeed prevalent, and the speed is good (typically saturating our 100 Mbps ethernet link).

Recently there has been a renewed push from the Chinese government to widen the deployment of IPv6, but the effect remains to be seen.

What do you mean by shutdown? I'm using torrents almost every day.

But you are not using P2P to exchange pictures or video with your friends or family. You store them on a big central server that somehow magically gets paid for.

Wikipedia still begs yearly for hosting, whereas if people volunteered their bandwidth and storage there would be no problem hosting it.

I remember eDonkey fondly. The variety of things you can get with torrents pales in comparison to what was open to us: people would just share "their" files, a huge variety of different content. Nowadays you only get a few thousand of the most popular US movies.

Hosting costs are not the main reason for the yearly fundraisers on Wikipedia. From July 2016 to June 2017 the Wikimedia Foundation raised $87.5 million in donations and spent $2.2 million in hosting.


The cost of maintaining my pictures and videos is tiny, as is the cost of maintaining Wikipedia (relative to its user base). A simple VPS for $1/month is enough for me, and with economies of scale, I'm sure it's a few cents per user at most. Wikipedia has no problem getting enough donations to continue its work.

I like the idea of a P2P social network, but honestly I think there's no problem to solve in the first place. Distribution of huge video and music files is a real problem, and torrents solve it; that's why they are popular. It's not easy to host terabytes of video with thousands of downloads, but that's trivially solved with torrents.

This has nothing to do with the physical form of the media. Of course everything is digital and has been for decades.

The root of the problem is that to make an academic career you have to publish in one of those journals. When you apply for a new position, the first thing they look at is how many papers you have published in high-profile journals.

It has everything to do with physical form. The fact that research is carried out via centralized journals is a relic of a practice that is over 300 years old, when printed paper was the only feasible medium to distribute such research.

If scientific research had started in this century, I am pretty sure no researcher would have ever thought of publishing research papers in a centralized journal where the journal gets to make so much money for a contribution that is not commensurate. Instead, they would have chosen an online platform that is funded and managed by the universities and research centers participating in this platform.

What you are saying is equivalent to stating a tautology: Elsevier is still important for researchers' careers because Elsevier is still important for researchers' careers.

The deeper question I ask is: Why is Elsevier or such centralized journals still important for researchers' careers?

Why can't academia move to a model where only the impact of the paper matters, where impact is measured by some open metrics like citations, number of stars, etc.? I am not suggesting these exact metrics; they are only examples. But it should be possible. For example, in the software world, GNU, Linux, Node.js, Git, etc. did not become important by publishing in a centralized journal but by publishing on open repositories, some amount of marketing, and the virtue of being very useful. What prevents existing researchers from moving to such a model? If enough influential people move to this model, Elsevier and others automatically become unimportant.

> What you are saying is equivalent to stating a tautology: Elsevier is still important for researchers' careers because Elsevier is still important for researchers' careers.

It appears you've entirely missed the point. There is no tautology. There is only a vicious circle of being forced to publish in established journals to have a chance of having, or even starting, a career in research, and the fact that most if not all established journals are controlled by companies such as Elsevier. Therefore, if your chances of having a career depend on consuming a service sold by the likes of Elsevier and producing content for Elsevier, it's quite obvious that you have to play along with this scheme if you intend to have a career.

I would be careful about comparing academia with the software world (open or not), since the latter more often than not falls into "applications", which can be relatively easily judged by, as you said, "the virtue of being very useful".

It's often hard to do so with theoretical work in academia for various reasons, whether its sheer technical difficulty or its lack of immediate applications. That's why it needs to be peer-reviewed. And Elsevier provides a "prestige" platform for having respectable in-field people review your research and decide whether it's good or not, given the lack of other metrics.

I fail to see how it's related to the journal's physical form. As long as the importance of research needs to be evaluated by other experts, similar platforms will exist.

Of course, things like arXiv [1] also exist.

[1] https://arxiv.org/

Peer review is necessary and time-consuming, that is true. But I do not believe that requires the need for profit-focused enterprises like Elsevier or the other publishing houses. I don't think any field has come up with a perfect solution yet, but as a computer scientist, things seem less grim than in many other disciplines of science.

I publish most of my research in the proceedings of ACM conferences. The ACM is a professional association, not a for-profit enterprise. Peer review is carried out by unpaid program committees, where I believe the primary motivation to participate is a combination of moral obligation, networking (with the other committee members), and an opportunity to get exposed to cutting-edge research. This part is similar to other fields, I believe. The main difference is that paper authors are expected to provide camera-ready PDFs of their papers, with the ACM doing little apart from providing LaTeX templates, concatenating the PDFs, and perhaps arranging a few print runs (though I think it's been a while since I have seen physical proceedings).

Most ACM conferences (and journals) are not open access, but some are. Generally it takes a subscription to their Digital Library to download the papers. My impression is that this subscription is cheaper than for the large publishing houses, but I don't know. In any case, an author can pick various licensing models for their paper, with one of them permitting the author to provide a "preprint" of the paper on their personal or institutional website. In practice, that means the papers are publicly available with minor effort, only that ACM does not take responsibility for the hosting.

The entire process appears quite lean, and the expenses incurred by ACM are covered through general membership and conference attendance charges. Further, the ACM appears to continually (and rather aggressively) move towards even more open access-ish policies, likely because it is ultimately run by academics.

The main difference from arXiv is that pre-publication peer review does actually take place.

And that's not taking into account that much of academia comprises disciplines where nothing is exactly "useful", or expected to be (e.g. arts and humanities). Metrics of worthiness in these areas are naturally going to be pretty subjective. (Which is not to say they couldn't be crowdsourced.)

> What you are saying is equivalent to stating a tautology: Elsevier is still important for researchers' careers because Elsevier is still important for researchers' careers.

It's not a tautology, it's a vicious cycle: https://medium.com/flockademic/the-vicious-cycle-of-scholarl...

Coordination is hard, otherwise this would already have happened. I recommend "Inadequate equilibria" by Eliezer Yudkowsky https://equilibriabook.com/ if you want a better grasp on how and why such situations arise and cannot be easily optimized.

Uhm, I can barely code my way out of a paper bag but I took a stab at exactly that in https://github.com/ecausarano/heron

At some stage it even worked when used on the same LAN. I guess I'm not one of those heroes that can single-handedly design and build a working program; I wish I could muster some help to make it real...

Well you needed a robust hypermedia platform that permits low friction collaboration. Sharing DVIs via email, FTP, or Gopher doesn't cut it.

You shouldn't conflate the negative nature of for-profit scholarly publishing with the use of paper as a medium.

In my own field, virtually all of our journals are produced by non-profit learned societies instead of companies like Elsevier or Springer. But thank goodness that these journals are still published on paper and collected in my library alongside the production of PDFs. Any work I write requires having several publications open in front of me at any given time, and that is a lot easier with paper journals instead of PDFs. Plus, printed journals in library holdings allow one to spend hours browsing through research without any of the distractions that electronics bring.

I definitely appreciate having PDFs of articles, but paper still has its place.

Behold the invention of the printer! You can turn PDFs into paper in the comfort of your own home!

When an entire department of researchers wants a publication on paper, as well as people in other departments who need to occasionally draw on this field’s research, it makes sense for a single paper copy to be produced and held in a library where everyone can readily consult it without having to incur their own printing costs.

Printing costs are pretty much insignificant for documents as small as the typical publication. If you really like it to be on paper, printing ten pages, reading them and throwing them out is still cheaper than spending an extra ten minutes of your time to walk somewhere and fetch an existing copy. Printing a paper costs much less than a dollar in supplies, walking to a library costs much more than a dollar of paid-for time. In this case, convenience is cost efficient and it does not make sense to rely on centralized paper copies to try and save a few cents of toner.

Seriously, I'd only consider going to the library if I need a whole book (for a single chapter I wouldn't bother, digital or local printing is fine for that) or if the resource is not available in a digital format at all, which is getting exceedingly rare unless I need something really old. Especially if I've decided that I need paper X for whatever reason, then it'd be useful to read it right now at my desk instead of some time later this day after a visit to library.

When an entire department wants a publication on paper, a single copy in a library won't serve that need (since that copy can only be in one place at once). I experienced this in college (before electronic journals were widespread) with professors assigning a reading to an entire class and not providing copies.

The vast majority of scholarship is unlikely to be in any assigned reading list and the journal will probably be there sitting on the shelf when you want to have a look at it. In cases where there is suddenly high demand for it, then it makes sense to make copies of the article for everyone (or let them download a PDF), but I am happy that I don’t have to print myself everything I want to read.

Printing is significantly cheaper than a subscription to an Elsevier journal.

Please read my comment above. In my own field, none of our respected journals are produced by Elsevier (or a similar for-profit publisher). The non-profit learned societies that do publish our journals provide the PDFs free of charge, and for the paper copies they charge libraries only the cost of production – which also makes it possible for even individual scholars to often buy their own bound hardcopies.

> Any journal submission I write requires having several publications open in front of me at any given time, and that is a lot easier with paper journals instead of PDFs.

I strongly disagree. I've written two published articles and two dissertations, and PDF versions have made my life so much easier (Mendeley in particular).

Why "strongly disagree"? If PDFs have helped you, then that speaks for the value that I suggested of having both paper and PDF.

Why did people boycott Digg?

Good, now the Nobel Committee should say it will only consider science published in open source journals for future Nobel prizes...

I think the Nobel prize is not what drives most researchers. Rather, tenure track and grant application committees should only consider open access research. However, obviously their main concern is finding the "best" applicants, not changing the scholarly publishing ecosystem, and doing so would help the former but arguably hinder the latter.

Love this. GenBank had a similar tactic: it got journals where people published sequencing papers to say they won't publish any results that haven't been uploaded to GenBank. That created an anchor resource that led to computational biologists having world-class shared datasets and the field evolving much faster.

Excellent idea. Wish I'd thought of that :)

It's arguable that stuff isn't actually in the public record if access requires payments.

Elsevier is a criminal enterprise, nothing more, nothing less...

Extreme hyperbole like this rarely helps your case. Please try to elevate your discussion. Elsevier is a significant drain on public benefit relative to free/open journals but that doesn't make them criminal.

Research that is funded by the public, for the public, is being put in the hands of for-profit organizations like Elsevier and then held private unless someone pays. It's not criminal, but it is also more than just a drain on public benefit relative to free/open journals.

It's like if we paid taxes to build and maintain roads and bridges, only to have the contractor give the rights to the finished result to a toll-booth company so it could evaluate and rate how good the individual construction workers are and make it easier to promote and hire them in the future. It makes no sense for any practical profession to behave like this, and yet that's how academia has operated for ages. It is time that the public got what it paid for.

We gotta stop enabling them. If we demand that publicly funded universities adopt open access Elsevier will be gone in a few years. And of course I'm not recommending Sci-Hub because it's illegal but worth mentioning it...

If we demand that publicly funded universities adopt open access Elsevier will be gone in a few years.

Elsevier will happily publish your papers as open access (for a small service fee).

Unfortunately, funders don't want to restrict where researchers can publish the work (in order to prevent themselves from being able to silence results they might not like, i.e. hinder "academic freedom"). This means that this demand is implemented by simply paying hugely disproportionate publishing fees ("article processing charges") to the same old journals to make the individual articles they funded available as open access.

How will Elsevier be gone if it controls so much of the content used in citations?

Over time I believe more content will be published in open access journals. In Machine Learning this is already quite common and it's amazing.

I would appreciate an explanation of this assertion. What makes them criminal?

As far as I know, they're not breaking any laws, so they're not criminals.

Basically, over the years, they acquired a bunch of academic publishers that were effectively not-for-profit. And since then, they've been raising prices, harvesting their market position.

Sure, that's not illegal. But it's arguably immoral.

Well, not criminal, nor is Disney, but people who bribe lawmakers to extend copyright, defeating the purpose of copyright, seem criminal to me. Remember, copyright was made to give authors a period of monopoly, after which works enter the public domain. What if Microsoft bribed lawmakers to extend patents to 100 years? Or drug makers?

If research grant funding calls would require the results to be published in open access journals, the field would switch pretty much overnight.

I made this point elsewhere as well, but:

> Unfortunately, funders don't want to restrict where researchers can publish the work (in order to prevent themselves from being able to silence results they might not like, i.e. hinder "academic freedom"). This means that this demand is implemented by simply paying hugely disproportionate publishing fees ("article processing charges") to the same old journals to make the individual articles they funded available as open access.

That's an absurd rationalization, and I don't believe it.

First, it's an absurd rationalization because all the funding agencies need to do is say "starting Jan 2019, all research papers funded by this fund must be open access; each violation results in a $1k reduction in the grant", and any journals that don't yet have the option would provide a ~$1k open access publishing option.

Second, I don't believe the rationalization is what's really going on. I think there just aren't any/enough people at the funding agencies that care.

As for your first point: that is in fact what is happening. It's somewhat of a "loophole" in that it has allowed funders to mandate open access without restricting academic freedom, by simply making money available to publish those single articles as open access. The downside is that it still leads to public funds being wasted: rather than paying money to access research, it is paid to publish it. The costs are then still disproportionate to the actual service provided, and it's still a barrier to inclusivity in academia (e.g. towards researchers in developing countries or in industry). I expanded on this in more detail at https://medium.com/flockademic/to-fix-scholarly-publishing-d...

As for your second point: obviously there are many funding agencies, and surely people don't care at some. I was mostly referring to those who do want to promote the adoption of open access, such as the Gates Foundation or the Wellcome Trust (and many national funding bodies). They have absolutely come out making the above argument. They're really hesitant about this (although without a doubt also due to lobbying by the traditional publishers), which is laudable but also exacerbates the problem.

(Note also that they do have reason to be hesitant: science has seen a lot of pushback in the past, so these lessons have been hard-won and are ingrained in their DNA. Any movement that appears to go in the direction of the times when the catholic church could limit the adoption of scientific theories is met with heavy scepticism.)

Unless you think you are really likely to win one, it's still going to be important for your more immediate-term career to publish in high-impact journals, most of which aren't open access (e.g. Nature/Science).

The more democratic you are as a nation, the more time elected officials spend thinking about ways to improve the lives of ordinary people (not just those in power).

Been considering moving to one of the countries high on the democracy index to work/code and pay taxes. https://en.wikipedia.org/wiki/Democracy_Index

I've got to say, political debates in Scandinavia have nothing to do with what you see in the US. People are debating the real problems (education, housing, healthcare, etc.) instead of focusing on inter-party rivalries.

I watched the debate between the party leaders recently and the level of conversation really struck me. That's not to say that there aren't problems - the Sweden Democrats have basically succeeded in wedging immigration into every single conversation. But there seems to still be an appreciation for serious policy and statesmanship.

Unfortunately I can't agree. Sure, Sweden is still one of the better countries, but things have changed in recent years. In 2006 the right wing parties created a coalition, forcing the left wing parties to do the same. So now we have essentially two parties and a right wing fringe group, which is quite similar to the US.

Many issues therefore remain unresolved. The housing market would be exhibit A. Fortunately, Sweden can "coast" on past achievements for a while, but sooner or later the dysfunction will have lasting consequences.

You see the same thing in Vermont, New Hampshire, Maine, Massachusetts, etc. In the US, on average if you want to see a better form of political discourse, you have to move toward more local politics (whether state, county, city, town).

Sweden = local politics. It's a small nation. 25% of their population lives in just four cities.

Directly comparing the whole of the US to Sweden is absurd. One would expect a dramatic increase in messy national politics if you took 33 Swedens and put them under a Federal authority.

> You see the same thing in Vermont, New Hampshire, Maine, Massachusetts, etc.

For comparison, Sweden is about the size of California, with roughly one fourth of its population.

> In the US, on average if you want to see a better form of political discourse, you have to move toward more local politics (whether state, county, city, town).

In my experience, local politics tends to be... More entertaining. Maybe because it's a lower barrier for entry.

> Sweden = local politics. It's a small nation. 25% of their population lives in just four cities.

The majority of its population lives outside of the largest four cities. There's a lot of difference between those four cities as well.

I could see that; maybe the U.S. should just separate into smaller city-states and let each fend for itself. It'd be much easier for people to get what they want out of government: they could just move to the state that actually represents their interests better. Want universal healthcare or legalized marijuana? Try California. Want to make sure nobody gets handouts at all, including SSI, Medicaid, and Medicare? Move to Louisiana or Texas.

I've often wondered if the U.S. would work better if there were some sort of multi-layered federalism wherein each cultural region (Northeast, Deep South, Cascadia, etc) got their own executive and legislative office above the state level, but not at the federal level. Doing this would probably eliminate a lot of the gridlock in DC since people could work on cultural and social issues at that level rather than across all 50 states (although it would also probably make things a lot worse for minorities in the South...).

See also the Corruption Perceptions Index:


For example, Ireland is currently number 6 on the Democracy Index, but way down at number 19 on the Corruption Perceptions Index. Correspondingly, Singapore is high at number 7 on the Corruption Perceptions Index, but down at 69 on the Democracy Index. I would suggest that you should look at both lists before deciding where to move.

Yes. Just came here to say that the linked-to site appears to be a content aggregator and that the original article is here: https://www.timeshighereducation.com/news/sweden-cancels-els...

Superb news though, couldn't happen to a nicer company.

URL belatedly changed from https://www.snip.today/post/sweden-ends-contract-with-scienc.... I'm sorry we missed this earlier.

All: it's helpful to email hn@ycombinator.com in cases like this because then we see it, and can act, quickly.

> Starting June 30, Swedish researchers will no longer be publishing in Elsevier and will not have access to Elsevier magazines.

Legality aside, given how high the usability of Sci-Hub is these days, I have no doubt in my mind about who got the short end of this stick...

Here is a dumb thought I just had.

What if instead of a giant stack of hard-to-navigate papers...

...what if instead each discipline would aim to publish a book?!

Each chapter would highlight the most important components, and the full version would be... another book?

Tier 2 of the books would simply refer to papers to provide even more additional reading.

The whole thing would keep itself up to date using version control, and the closer to the front page of tier 1, the more extremely critical the review would be. An as-large-as-possible crowdsourced budget should be dedicated to reviewing and rewriting each of the books.

Each would be freely available online, but every self-respecting nerd would want a copy on their bookshelf.

A strict less is more policy would keep the books portable.

Technicality of the tier 1 books should be limited as much as possible in order to fit in a little encyclopedia of terms and methods.

By exposing the most important parts of a field to as large an audience as possible, scientists would finally get the recognition they deserve, which in turn would stimulate the allocation of public funds.

A truly absurd idea, there, I said so myself.

And with Sci-Hub still alive, Sweden etc have more leverage.

I had trouble getting to Sci-Hub and LibGen at my uni (this was about a month ago) and I don't know if it was my uni's firewall or the recent court-ruling smackdowns earlier this month[0] and last November[1]. I just checked from home and I found https://sci-hub.tw/ and http://gen.lib.rus.ec/ work okay for me so ¯\_(ツ)_/¯

[0] https://torrentfreak.com/sci-hub-pirate-bay-for-science-secu...

[1] https://www.theregister.co.uk/2017/11/23/sci_hubs_become_ina...

Sci-Hub's domain names often change due to ISP blacklisting and such. However, Wikipedia has a pretty reliable list of still-working domains/IPs.


I try to keep this list up to date with the latest links: https://citationsy.com/blog/download-research-papers-scienti...

Citationsy also has a similar feature built-in: https://citationsy.com/blog/new-feature-citationsy-archives/

Here's a crowdsourced list (based on Wikidata): https://whereisscihub.now.sh

The article links to this page about how to find articles: http://openaccess.blogg.kb.se/openaccess-definition/hitta-op... (in Swedish)

Neither that page nor any of the linked pages mention sci-hub. I can see why they don't, but I guess all researchers know about it already.

Cool, odd that I as a Swedish researcher haven't heard of this... Maybe they meant Switzerland; people have major difficulties discerning the two.

Tack, ha en bra dag! ("Thanks, have a good day!")

Nope, it's Sweden :) Much is still unclear though, as these negotiations were done on the level of the universities, so even academic librarians don't fully know what this might mean to e.g. their budgets, and whether they can invest the money saved in open research infrastructure.

Knowledge will eventually be communicated by adding new information to a universal network which represents all known truths ... the act of ~publishing~ will be to add new nodes and/or edges ... this network will get launched by seeding it from culling all existing published papers however once it goes live the very idea of publishing research to any journal will become obsolete and counterproductive

Some more details for those of us who would like to know who "Sweden" is in these circumstances. These kinds of decisions are usually not taken at the level of the national government.


Not by the government directly, but by a national library organisation reacting to a government mandate for open access.

Yes, I think that was what it said in the link I provided. :)

Long ago, while working at a science software company, I was casually told by a co-worker that in Japan, science publishing is run by their mafia. The scientists I saw were so busy, so intent, and so disciplined that no management topics were ever discussed. So I went to a US business school library and looked up some management-side statistics about profit per employee, and saw that the science companies of the time typically had less than half the number of employees, with large revenue.

Some number of people are just predatory about profits, whatever their level of intelligence, legality, and social status. Whatever the origins of this science conglomerate, you can bet that over the years a crude extraction of profits via control of players emerged.

As with other online transfers, actually publishing/reading material has become the easy part and a whole pile of issues with security cropped up instead. (You don’t even really need a publisher to help with notoriety, if your stuff already shows up easily on Google.)

I wish that Elsevier were being paid to solve a bunch of hard security problems, but they seem to just be an expensive paywall. For example, do they provide a blockchain or other trusted timestamping solution to make it easy to prove that a publication was "first" (no matter who decides to steal a file and copy/paste their own name as author instead)? I'd really like to see those kinds of things become mainstream defaults for publishing. Right now the main downside to just throwing files on random websites is that they don't typically have those security elements, making it easy to steal and hard to authenticate what you're seeing.
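To illustrate the idea (this is a hypothetical sketch, not anything Elsevier or any publisher actually offers): the core primitive of a trusted-timestamp scheme is just a cryptographic hash of the manuscript. You record only the digest with a timestamping service or public ledger; later, producing the file that hashes to that digest proves it existed unchanged at that time, without ever having revealed its contents.

```python
# Hypothetical sketch of the first step of a trusted-timestamp workflow:
# hash the manuscript bytes with SHA-256. The digest (not the file itself)
# is what you would submit to a timestamping service or public ledger.
import hashlib

def manuscript_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of a manuscript's raw bytes."""
    return hashlib.sha256(data).hexdigest()

digest = manuscript_digest(b"My groundbreaking paper, v1")
print(digest)  # 64 hex characters uniquely identifying this exact byte sequence
```

Anyone holding the original file can recompute the digest and match it against the timestamped record; changing even one byte of the manuscript produces a completely different digest.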

Journals do usually list the submission date, I believe. Otherwise, submitting your work to a preprint server can prove that you were first (assuming you trust the preprint server), although in some fields, journal publication is still the only thing recognised as planting the pole.

You'd only need a blockchain if you don't trust the preprint servers, but I don't think there's reason not to.

I worked in the scientific publishing industry. Not directly for Elsevier, but we had dealings with them too.

Many people here view Elsevier as this evil, nebulous entity. Leaving "evil" aside, like every large business, Elsevier is composed of people, some of them very smart.

Which is to say Elsevier has seen this "open access" movement coming for a better part of a decade now, just like everyone else. As far back as 2011 the industry has been inventing ways to make "open access" as profitable as the current system (ideally, even more). Green open access, gold open access, diamond and hybrid; moving walls, paywalls, article processing charges…

Having seen the sausage made, I guess I'm a little cynical about "open access". I see it devoid of the idealistic "stick it to the man" connotations, and more like another feel-good buzzword scam.

> As far back as 2011 the industry has been inventing ways to make "open access" as profitable as the current system

If that's what they were doing, nothing could better demonstrate that they missed the point entirely and misunderstand the word 'open'.

Many companies fell despite having many smart people working for them.

I would say every semi-big company probably has many smart people working for them. That does not prevent them from doing very stupid/bad/evil things, and sometimes disappearing.

The fact that they anticipated the move does not mean anything regarding their future, just like Kodak with digital photography for instance.

Taken differently, maybe you see a tsunami coming from very far away and are able to even calculate how strong it is and when it will come, but that does not mean you will be able to save your house in the seafront ;)

Sure. My point was not only did Elsevier see the "tsunami" coming, they also prepared for it profusely. Using a combination of legislation, regulation, lobbying and technology.

Obviously any company can fail. I'm just offering a little inside information that pushes back against the notion that Elsevier is a static, dumb, backward-facing entity, taken completely by surprise by this new-fangled thing called "open access". Not the case.

> Which is to say Elsevier has seen this "open access" movement coming for a better part of a decade now

A lot earlier than that, I was discussing the options for online publishing with the then CTO of Elsevier in 1987.

Online publishing is a different concept than open access.

Elsevier has had a very strong online publishing platform for quite a while now (including SRU interfaces for federated article search, which we integrated too). If you were involved in that effort, my belated thanks :-)

It’s irritating when a journal article from 1933 is behind a £42.50 paywall. <idealism>Information, especially science, should be free and open to the public.</idealism>
