
Let's ask Sci-Hub to tell us how much traffic they get from uc*.edu IP addresses



Traffic from the US has always been huge on Sci-Hub anyway, so that's going to be hard...

Put yourself in the shoes of a grad student.

What would you choose (ethics aside): navigate to your library's website and click through a shitty interface 20 times to finally get a paper, and that's if you're lucky, or just paste a DOI and wait 5 seconds...


When I'm on mobile/iPad, my university (Harvard) makes me do two-factor authentication for every paper.


Luckily nearly every university offers its library proxy as a bookmarklet. It's two clicks: one to hit the bookmarklet, another to hit login on my autofilled secure sign-on page.


My university (a top national one) does, and it's still less reliable and user-friendly than Sci-Hub.


Thankfully some universities have a decently working EZproxy config, but still. Then you hit the paywall, or the broken website from the crappy publisher that thinks people want to read PDFs in its "enhanced PDF" viewer and requires two more clicks to get a real PDF, and while that's happening the 5 trackers (not kidding, look at what those pages load!) are loading...


In my experience, at least, those are edge cases from some super dodgy journals. All the major publishers and journals worth reading work fine.


I'm talking about a major publisher here: Wiley, with $1.7bn in revenue...


What's a DOI?


Digital Object Identifier. It's a permanent identifier that resolves to the article's current URL, so links don't rot; really useful for scientific articles.

https://library.uic.edu/help/article/1966/what-is-a-doi-and-...
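
For the curious, resolving a DOI is just an HTTP redirect through doi.org. A minimal sketch in Python (using the requests library; the DOI Handbook's own DOI, 10.1000/182, stands in for a paper's DOI):

    # A DOI is resolved by https://doi.org, which redirects to the current
    # landing page for that object. 10.1000/182 is the DOI of the DOI Handbook
    # itself, used here as a stand-in for a paper's DOI.
    import requests

    doi = "10.1000/182"
    resp = requests.get(f"https://doi.org/{doi}", allow_redirects=True, timeout=10)
    print("DOI", doi, "currently points to:", resp.url)

    # For DOIs registered with agencies like Crossref or DataCite, the same
    # resolver can also return citation metadata via content negotiation.
    meta = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=10,
    )
    if meta.ok and "json" in meta.headers.get("Content-Type", ""):
        print("Title:", meta.json().get("title"))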


Said behavior is not limited to grad students.


You're right; I started with grad students as an example, and after rewriting I lost that framing.


You would have to get a 10-year before/after comparison for it to prove anything. Also, comparing a university against itself is interesting but not especially noteworthy; for bonus points, to make the paper actually useful, you could compare against 10 years of traffic from other universities.

Then again, Science magazine already did a piece on this: https://www.sciencemag.org/news/2016/04/whos-downloading-pir...


You don't really need 10 years. Each researcher at UC might be confounded by their local network, but you could argue that's just a causal variable anyway. As for statistical power, UC has >30 researchers, and each of those researchers would access several papers a week. Take the ratio of Sci-Hub vs. non-Sci-Hub papers accessed per week or month and compare it to the month before UC's contract with Elsevier ran out. Take the mean and do a t-test (rough sketch below). You don't even need a time series!

I suspect plotting Sci-Hub vs. non-Sci-Hub papers accessed per week for several weeks before and after the contract expiry would dispense with the need for any statistical analysis.
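
To make that concrete, here is a rough sketch of the proposed comparison in Python with made-up numbers (the real inputs would be per-researcher Sci-Hub shares of weekly paper accesses; scipy's ttest_ind does the Welch test):

    # Made-up per-researcher fractions of papers fetched via Sci-Hub, sampled
    # for the month before and the month after the Elsevier contract ran out.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    before = rng.normal(loc=0.30, scale=0.10, size=200).clip(0, 1)
    after = rng.normal(loc=0.45, scale=0.12, size=200).clip(0, 1)

    # Welch's two-sample t-test on the mean Sci-Hub share, after vs. before.
    t_stat, p_value = stats.ttest_ind(after, before, equal_var=False)
    print(f"mean before = {before.mean():.2f}, mean after = {after.mean():.2f}")
    print(f"Welch t = {t_stat:.2f}, p = {p_value:.3g}")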


Or maybe there's been little noise about the change because everyone was already using Sci-Hub anyway!

I recently introduced a part-time professor to Sci-Hub when he was having trouble getting his VPN connection to work so he could access a paper in a journal. Open access is just easier.


Does Sci-Hub release a geographic group-by? She should (a sketch of what that breakdown would look like is below). I have a corporate all-access pass to basically everything, and I still use Sci-Hub.

The racist, classist control of information has to stop. It should basically be a reverse paywall.
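
If the raw request logs were ever released again (the Science piece linked upthread worked from exactly this kind of data), the geographic group-by is a one-liner. A sketch in Python with pandas; the column names and rows below are hypothetical:

    # Hypothetical log schema: one row per download request, with a timestamp,
    # the requested DOI, and a coarse geolocation.
    import pandas as pd

    log = pd.DataFrame(
        {
            "timestamp": ["2016-02-01", "2016-02-01", "2016-02-02"],
            "doi": ["10.1000/182", "10.1000/183", "10.1000/182"],
            "country": ["US", "IN", "US"],
        }
    )

    # Download requests per country: the geographic breakdown in question.
    by_country = log.groupby("country").size().sort_values(ascending=False)
    print(by_country)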



