
Is it not possible to create a separate foundation/organization that manages peer-reviewed, open-access journals for multiple scientific fields? I realize the costs involved in asking academics to review submissions, etc., but couldn't a modest $10-$15 million grant be enough to kickstart an open journal consortium? Even if it required small review fees per submission ($25-$50), I feel that with proper management and digital distribution, a project like this could 'disrupt' conventional journals.



The problem isn't the technical implementation of such a system, it's building reputation and prestige. eLife is an example of a journal attempting to go from zero to elite right out of the gate, and its solution is to throw a ton of money at the problem: their latest financials show it costs them about $14k/article to publish. So either you spend A LOT of money and make the prestige play quickly, or you do it cheaper (PeerJ, PLoS One, etc.) and struggle for many years to build up the prestige.


I believe that the prestige of a journal is mainly defined by the members of its editorial board. So if you can convince members of the editorial board of prestigious journals to resign and join your new open access journal, it can become very prestigious very rapidly.

That is what happened in machine learning in 2001, when forty editors of the journal 'Machine Learning', published by Springer, resigned to join the open access 'Journal of Machine Learning Research', which was created in 2000. Today, JMLR is much more prestigious than Machine Learning (impact factor of 3.42 vs. 1.46 in 2012).

See http://en.wikipedia.org/wiki/Journal_of_Machine_Learning_Res...


I don't understand; can you clarify how a journal/publisher can spend money to get prestige? Spend the money on what, exactly, that results in prestige?


eLife is one of the few journals that pay peer reviewers.

There's a huge divide between different types of peer review right now. There's the typical review you get at a prestigious "traditional" journal, which covers both the scientific method and the "importance" of the work (is it new/original, does it move the field forward, are the results groundbreaking, etc.). Then there's what's becoming the norm for OA megajournals, pioneered by PLoS One, which focuses exclusively on methodology (i.e., is it good science, regardless of "importance"). To put it bluntly, that's an easier and cheaper way to review a paper. I'm not at all arguing it's the wrong way; there's a solid case that pre-publication peer review should be methodology-focused and we should let some form of post-publication review sort out a paper's "importance" later.

Of these two types of peer review, the "traditional" version that assesses importance will likely lead to a higher impact factor for your journal, since you're selecting articles based on their expected impact on the field (theoretically leading to more citations in the long term). Couple that type of peer review with open access (like eLife is doing) and you get a really good formula for producing a high-impact journal after only a year or two. But getting that type of peer review done, particularly if you're in a hurry, as you are when launching a brand-new high-cost journal, is hard and expensive. So eLife has chosen to fund all aspects of the process, including paying the reviewers. That's one way of throwing money at the problem of going from zero to high impact in a short timeframe.


I wonder if you could have peers sign others' research on a GitHub-like platform using PGP. You could then use a graph model to quantitatively determine reputation based on who reviews which article.
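A rough sketch of what that graph model might look like (entirely hypothetical: this just runs networkx's PageRank over an invented endorsement graph, where an edge from reviewer to author means the reviewer signed off on the author's paper):

    # Hypothetical sketch: PGP-signed reviews recorded as directed
    # edges (reviewer -> author), scored with PageRank so that
    # endorsements from reputable reviewers carry more weight.
    import networkx as nx

    g = nx.DiGraph()
    g.add_edges_from([
        ("alice", "bob"),   # alice signed off on bob's paper
        ("carol", "bob"),
        ("bob",   "dave"),
        ("dave",  "alice"),
    ])

    reputation = nx.pagerank(g, alpha=0.85)
    for name, score in sorted(reputation.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {score:.3f}")

Verifying the signatures, weighting edges by review depth, and defending against collusion rings are all open problems this sketch ignores.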


The biggest problem is getting the people doing the hiring to care about your new magic metric. You could devise an amazing algorithm to calculate a researcher's impact, but if nobody on a tenure committee thinks it's a valid way to rank researchers, then it's not going to go anywhere. This is where you hit the most inertia in the industry. The people making hiring decisions look at which journals you publish in, and they care about a journal's impact factor and historical prestige; it's the fast/lazy way to judge a candidate. It's going to be hard for a technical solution to address that sociological problem.


This is actually the idea that started Google, and it's something that Google Scholar does.


> I wonder if you could have peers sign others' research on a GitHub-like platform using PGP.

When the majority of influential scientists don't know and don't care what GitHub or PGP are? I don't think so.


There are no "review fees" (we review papers for free), and most editors (who select the reviewers and decide which papers get published) are not paid. So, basically, the cost is the cost of a website.

At any rate, to 'disrupt' conventional journals, you need to convince the big names to serve as editors/reviewers (which is often hard!) in order to lend your new journal some credibility.


Is this what you mean? http://www.plos.org/

The current system rewards people who publish in the high-impact journals, which means those journals get the "coolest" papers, which in turn keeps their impact high. And the journals at the top right now are generally not open access.



