Monday 20 March 2023

A suggestion for eLife

According to a piece today in Nature, there’s uproar at eLife, where a new publishing model has been introduced by Editor-in-Chief, Michael Eisen. The idea is that authors first submit their paper as a preprint, and editors then decide whether to send it out for review, at a cost of $2000 to the author. Papers that are reviewed are published, together with the reviews and an editorial comment, regardless of any criticisms raised in the reviews. Authors have an opportunity to update the article to take account of reviewer comments if they wish, but once a paper has been reviewed it cannot be rejected.


Of course, this does not really remove the journal’s “quality control” filter – it simply moves the filter to the stage where a decision is made on whether or not to send the paper out for review.

The guidance given to editors in making that judgement is “can you generate high-quality and broadly useful public reviews of this paper?”  Concerns have been expressed that this could disadvantage less well-known authors, if editors preferred to play it safe and only sent papers for review when the authors had a strong track record.  But the main concern is that there will be a drop in the quality of papers in eLife, which will lose its reputation as a high-prestige outlet.


I have a simple suggestion for counteracting this concern: the journal should adopt a different criterion for deciding which papers to review, judging them solely on the basis of the introduction and methods, without any knowledge of the results. Editors could also be kept unaware of the authors’ identities.


If eLife wants to achieve a distinctive reputation for quality, it could do so by only taking forward to review those articles that have identified an interesting question and tackled it with robust methodology. It’s well-known that editors and reviewers tend to be strongly swayed by novel and unexpected results, and will disregard methodological weaknesses if the findings look exciting. If authors had to submit a results-blind version of the manuscript in the first instance, then I predict that the initial triage by editors would look rather different.  The question for the editor would no longer be one about the kind of review the paper would generate, but would focus rather on whether this was a well-conducted study that made the editor curious to know what the results would look like.  The papers that subsequently appeared in eLife would look different to those in its high-profile competitors, such as Nature and Science, but in a good way.  Those ultra-exciting but ultimately implausible papers would get filtered out, leaving behind only those that could survive being triaged solely on rationale and methods.
