Monday, 27 October 2025

Problems with eLife's new article type: Replication studies


I was interested to receive an email from eLife last week, telling me that "As part of our commitment to open science, scientific rigour and transparency we now accept submissions of Replication Studies". Sounds good, I thought, but on reading further I became increasingly dismayed. I think the way this is set up dooms it to failure.

Back in 2023, when eLife created a storm by altering its publishing model, I argued that they would not achieve their aims unless they changed the basis for selecting articles for peer review. Although eLife has abandoned the traditional binary outcome where articles are accepted or rejected, a decision is still delegated to editors: whether the article is selected for peer review. My suggestion was that they should adopt results-blind selection. This avoids the bias that favours articles with positive results. Publication bias leaves well-conducted studies with null results to languish unpublished. This matters because a cumulative science should include negative as well as positive findings. Nissen et al. (2016) showed how publication bias leads to what they termed the "canonization of false facts". It is a universal phenomenon, and I doubt that eLife editors are immune to it.

One method of avoiding publication bias is the Registered Reports format, whereby the introduction, methods and analysis plan are peer reviewed before data are collected, with the article accepted in principle by a journal, provided the researchers did the study as planned, or gave adequate reasons for deviating from protocol. In my experience, biomedical researchers are highly resistant to Registered Reports; they tell me the approach is incompatible with how they work, which involves development of ideas and methods in the course of doing a study 🙄. However, that argument does not hold for replication studies, where the idea is to reproduce the methods of an existing published study. Indeed, eLife took a pioneering stand in hosting the Reproducibility Project on Cancer Biology (Errington et al., 2021a). So it is disappointing that their current proposal is for a much more timid approach.

First, it sounds as if the plan is for individuals to complete a replication study before submitting it to eLife. This means that we'll be up against publication bias all over again - the likelihood of an editor selecting a study for peer review will depend not on the strength and suitability of its methods but on whether the replication was "successful".

Second, the eLife instructions state: "The authors should work closely with the authors of the original study and summarize their interactions with the original authors as part of a cover letter". It seems entirely appropriate to liaise with original authors to ensure that materials and methods are suitable for replication. However, we know from the Cancer Biology Replication project that many authors were unresponsive or even obstructive when asked to give advice on a replication study (Errington et al., 2021b). Many attempts at replication failed because it was impossible to work out exactly what had been done in the original study, or because authors would not share materials. In effect, then, the current eLife approach to replication studies gives original authors the ability to veto any replication attempt.

I can't think who on earth would take the risk of doing a replication study under these conditions. Many scientists already regard replication as an inferior type of research activity, for which replicators get little credit and may even attract abuse. Furthermore, it is difficult to get funds for replication studies, because they are deemed insufficiently novel. Yet previous large-scale replication projects have found that direct replications often fail to find the original effects, or replicate them with a smaller effect size. So you make yourself unpopular by attempting a replication, and then when you come to submit it to eLife you are told it won't be peer reviewed because you obtained a null effect.

I think that we won't get replication studies in biosciences unless they are explicitly incentivised - and judged on their methodological quality rather than their results. Meanwhile, because of the field's obsession with novelty, research progress stalls as people keep trying to build on results without knowing whether they provide a solid foundation.

My prediction: give it a year, and see how many Replication Studies have been accepted for peer review in eLife. I'll be surprised if it's more than zero.

References

Errington, T. M., Mathur, M., Soderberg, C. K., Denis, A., Perfito, N., Iorns, E., & Nosek, B. A. (2021a). Investigating the replicability of preclinical cancer biology. eLife, 10, e71601. https://doi.org/10.7554/eLife.71601

Errington, T. M., Denis, A., Perfito, N., Iorns, E., & Nosek, B. A. (2021b). Challenges for assessing replicability in preclinical cancer biology. eLife, 10, e67995. https://doi.org/10.7554/eLife.67995

Nissen, S. B., Magidson, T., Gross, K., & Bergstrom, C. T. (2016). Publication bias and the canonization of false facts. eLife, 5, e21451. https://doi.org/10.7554/eLife.21451
