
Frustrated Scholar Creates New Way to Fund and Publish Academic Work - mathattack
http://chronicle.com/blogs/wiredcampus/frustrated-scholar-creates-new-route-for-funding-and-publishing-academic-work/53073?cid=at&utm_source=at&utm_medium=en
======
BruceIV
I love the idea, but bootstrapping is the real tricky bit with a site like
this.

Putting aside the grant funding idea, I think you could really democratize
academic publishing by adding a StackExchange-like rating system to something
like ArXiv, and also building in a conference organization system (this part
is actually key). You'd allow researchers to link off-site (even paywalled)
content to their profiles (email to the address listed in the author list
should be sufficient authentication), and bootstrap the rating system by
granting reputation points in certain subfields to authors with recent
publications in reputable conferences/journals. Reputation would then accrue
when researchers who already have it review your papers positively. The system gets papers
by including a really good conference organization system (in-person
networking is still very worthwhile, even if we solve the journal problems of
distribution and peer review), which requires conference submissions to be put
into the system, along with the reviewers' comments. You'd fund the thing (and
weed out junk submissions) by imposing a relatively nominal fee (say $50) for
lifetime hosting.
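The reputation-bootstrapping idea above could be sketched roughly as follows. This is purely illustrative: the seed amount, transfer rate, and function names are assumptions for the example, not anything Onarbor or ArXiv actually implements.

```python
# Sketch: authors of recent papers in reputable venues are seeded with
# subfield reputation, and a positive review transfers a fraction of the
# reviewer's own reputation to the paper's authors.
from collections import defaultdict

SEED_REPUTATION = 100.0  # hypothetical grant per recent reputable publication
TRANSFER_RATE = 0.05     # hypothetical fraction conferred per positive review

# reputation[researcher][subfield] -> points
reputation = defaultdict(lambda: defaultdict(float))

def seed_from_publication(author, subfield):
    """Bootstrap: grant subfield reputation for an existing publication."""
    reputation[author][subfield] += SEED_REPUTATION

def positive_review(reviewer, authors, subfield):
    """A positive review confers reputation proportional to the reviewer's."""
    grant = TRANSFER_RATE * reputation[reviewer][subfield]
    for author in authors:
        reputation[author][subfield] += grant

seed_from_publication("alice", "pl")     # Alice has a recent reputable paper
positive_review("alice", ["bob"], "pl")  # her positive review boosts Bob
```

Note the chicken-and-egg problem the seeding step is meant to solve: without it, nobody has reputation to transfer in the first place.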

Such a system is technically possible, but it would need to achieve critical
mass to actually replace the existing journal system.

(I should mention I'm coming at this from the perspective of a computer
science researcher - in a lot of the subfields of CS the most expensive parts
of doing research are 1) your time and 2) institutional library access to get
at the papers you need (Google Scholar helps here, but it's far from perfect),
so large grants are less essential than in disciplines that need expensive
laboratory equipment or travel to exotic locations.)

~~~
yungchin
Even if you achieved critical mass, I think there's another problem. A
fundamental difference from, say, StackOverflow is that for a typical
programming question there will be many people who, even if they don't know
the answer, can distinguish a good from a bad answer. Most questions have a
reasonably quickly verifiable answer, or they're about best practices.

Scientific publishing on the other hand, should always be on the cutting edge,
and has endless niches. If you judge that sort of work in StackExchange-style,
in the best case an expert shows up (the guy whom Cell editors would hopefully
have invited for the peer review anyway) and gives you a subtle argument
for/against, and almost nobody can vote it up/down because they lack the
background to grasp the subtle detail. In the worst case, a Nobel Prize winner
shows up and leaves an off-the-cuff heuristic criticism, and everybody votes
it up because, well, hey. And it would look like the crowdsourcing worked...
except that he totally missed the subtler points.

~~~
BruceIV
Fair; StackExchange might not be the best model. I'd say public reviews of
papers, but not crowdsourced ratings of reviews - rank reviews by researcher
reputation instead, weighting that reputation by reputation in a given
subfield, and maybe with a weighting boost if the paper in question cites the
researcher's work.
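The ranking rule proposed above could look something like this minimal sketch. The field names, the boost multiplier, and the reputation structure are all assumptions made for illustration, not a real system's API.

```python
# Sketch: order reviews by the reviewer's reputation in the paper's subfield,
# with a multiplicative boost when the paper cites the reviewer's own work.
from dataclasses import dataclass, field

CITED_BOOST = 1.5  # hypothetical boost when the paper cites the reviewer

@dataclass
class Review:
    reviewer: str
    text: str

@dataclass
class Paper:
    subfield: str
    cited_authors: set = field(default_factory=set)
    reviews: list = field(default_factory=list)

def review_weight(review, paper, reputation):
    """Weight = reviewer's subfield reputation, boosted if cited by the paper."""
    weight = reputation.get(review.reviewer, {}).get(paper.subfield, 0.0)
    if review.reviewer in paper.cited_authors:
        weight *= CITED_BOOST
    return weight

def ranked_reviews(paper, reputation):
    """Most authoritative reviews first."""
    return sorted(paper.reviews,
                  key=lambda r: review_weight(r, paper, reputation),
                  reverse=True)

reputation = {"alice": {"pl": 100.0}, "carol": {"pl": 80.0}}
paper = Paper(subfield="pl", cited_authors={"carol"})
paper.reviews = [Review("alice", "solid work"), Review("carol", "nit: typo")]
# carol's review outranks alice's: 80 * 1.5 = 120 vs. 100
```

The point of the citation boost is exactly the anecdote that follows: the people you cite tend to give better-informed feedback than the general public.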

Anecdotally, I've got a paper draft on ArXiv right now, which I sent to both
the authors of the paper I based it on (from whom I got some really useful
feedback), and to a mailing list full of developers working on similar problems
(who argued with me about typos my abstract didn't have) - this suggests that
the people you cite might have better input into your paper than the general
public.

------
StandardFuture
Article is entirely about this site:
[https://onarbor.com/](https://onarbor.com/)

------
WhoBeI
I have a hard time taking this site seriously. The most popular project in the
chemistry section seems to be a hand-drawn picture of a cat. The entire site
seems very focused on gigantic thumbnails instead of, you know, comments.

I file it under 'internet-based funding agency'.

~~~
untilHellbanned
The site just launched, so most content is still just test content. Please
give it time and/or contribute by bringing in content you like.

If you click the gigantic thumbnails, whatever content is published (videos,
music, ebooks, images, apps) will play. Reviews and comments are two different
things: reviews of works are written by backers, and commenting is restricted
to reviews. Our reviews/comments model was inspired by GitHub issue tracking.
We felt this was a fitting approach because Onarbor allows versioning of
works, so we wanted a way for the various reviews/comments to keep track of
each version.

~~~
WhoBeI
Fair enough. I'll check back in a few months.

~~~
untilHellbanned
Thanks. We'd very much appreciate that!

------
dnautics
Fascinating. Here's a question, outside of the publishing aspect: why would
one go with Onarbor (or even experiment.com) instead of more mainstream
avenues of funding such as Crowdtilt, Indiegogo, etc., which take a smaller
cut?

~~~
001sky
The goal of the site was to combine something like Stack Overflow with
Indiegogo.

------
psychometry
Are academics going to be sufficiently incentivized to do free online peer
review for meaningless internet points à la StackOverflow? I have my doubts.

~~~
jules
Academics already do their peer review for free. They don't even get internet
points.

~~~
stfu
They get points from their peers. That's even more valuable to them.

~~~
jules
That would be the same with this system.

------
chm
I've gone through the (low-quality) pictures offered on the front page and I
still don't really get what the product is.

~~~
timrpeterson
The site just launched and there is only one developer. It will improve.
Onarbor is a publishing and funding platform.

------
RankingMember
"Traditional academic publishers hate him!"

