Tuesday, 24 September 2024

Using PubPeer to screen editors

2023 was the year when academic publishers started to take seriously the threat that paper mills posed to their business. Their research integrity experts have penned various articles about the scale of the problem and the need to come up with solutions (e.g., here and here). Interested parties have joined forces in an initiative called United2Act. And yet, to outsiders, it looks as though some effective actions are being overlooked. It's hard to tell whether this is the result of timidity, poor understanding, or deliberate foot-dragging by those who have a strong financial conflict of interest.

As I have emphasised before, the gatekeepers to journals are editors. Therefore it is crucial that they are people of the utmost integrity and competence. The growth of mega-journals with hundreds of editors has diluted scrutiny of who gets to be an editor. This has been made worse by the bloating of journals with hundreds of special issues, each handled by "guest editors". We know that paper millers will try to bribe existing editors, place their own operatives as editors or guest editors, use fake reviewers, and stuff articles with irrelevant citations. Stemming this tide of corruption would be one effective way to reduce the contamination of the research literature. Here are two measures I suggest publishers should take if they seriously want to clean up their journals.

1. Three strikes and you are out. Any editor who has accepted three or more paper-milled papers should be debarred from acting as an editor, and all papers that they have been responsible for accepting should be regarded as suspect. This means retrospectively cleaning up the field by scrutinising the suspect papers and retracting any that come from authors associated with paper mills, or that are characterised by features suggestive of paper mills, such as tortured phrases, citation stacking, gobbledegook content, fake reviews from reviewers suggested by authors, invalid author email domains, or co-authors who are known to be part of a paper mill ring. All of these are things that any competent editor should be able to detect. I anticipate this would lead to a large number of retractions, particularly from journals with many Special Issues. As well as these simple indicators, we are told that publishers are working hard to develop AI-based checks. They should use these not only to screen new submissions and to flag published papers for retraction, but also to identify editors who are allowing this to happen on their watch. It also goes without saying that nobody who has co-authored a paper-milled paper should act as an editor.

2. All candidates for roles as Editor or Guest Editor at a journal should be checked against the post-publication peer review website PubPeer, and rejected if this reveals papers that have attracted credible criticisms suggestive of data fabrication or falsification. This is a far from perfect indicator: only a tiny fraction of authors receive PubPeer comments, and some comments concern trivial or innocent aspects of a paper. But, as I shall demonstrate, using such a criterion can reveal cases of editorial misconduct.

I will illustrate how this might work in practice, using the example of the MDPI journal Electronics. This journal came to my attention because it has indicators that all is not well with its Special Issues programme. 

First, in common with nearly all MDPI journals, Electronics has regularly broken the rule that specifies that no more than 25% of articles should be authored by a Guest Editor. As mentioned in a previous post, this is a rule that has come and gone in the MDPI guidelines, but which is clearly stated as a requirement for inclusion in the Directory of Open Access Journals (DOAJ). 13% of Special Issues in Electronics completed in 2023-4 broke this rule**. DOAJ have withdrawn some MDPI journals from their directory for this reason, and I live in hope that they will continue to implement this policy rigorously - which would entail delisting the majority of MDPI journals from their Directory. Otherwise, there is nothing to stop publishers claiming to adhere to rigorous standards while failing to implement them, making listing in DOAJ an irrelevance.
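
For anyone who wants to run this kind of check themselves, the arithmetic is trivial once the data are in hand. Below is a minimal sketch in Python, assuming you have already extracted the Guest Editor names and the author list for each paper in a Special Issue; the names and field layout are hypothetical, real name matching needs more care than shown here, and this is not the code behind the figures reported in this post:

    # Sketch: test a single Special Issue against the DOAJ 25% "endogeny" rule.
    # Assumes guest editor names and per-paper author lists have already been
    # collected; the data below are illustrative only.

    def normalise(name):
        """Crude normalisation so that 'A. N. Editor' and 'A N Editor' can match."""
        return " ".join(name.lower().replace(".", " ").split())

    def endogeny_rate(articles, guest_editors):
        """Fraction of articles with at least one guest editor among the authors."""
        editors = {normalise(g) for g in guest_editors}
        flagged = sum(1 for art in articles
                      if {normalise(a) for a in art["authors"]} & editors)
        return flagged / len(articles) if articles else 0.0

    if __name__ == "__main__":
        guest_editors = ["A. N. Editor", "B. Other"]   # hypothetical names
        articles = [
            {"title": "Paper 1", "authors": ["A. N. Editor", "C. Colleague"]},
            {"title": "Paper 2", "authors": ["D. Someone"]},
            {"title": "Paper 3", "authors": ["B. Other"]},
            {"title": "Paper 4", "authors": ["E. Else"]},
        ]
        rate = endogeny_rate(articles, guest_editors)
        print(f"Guest-editor-authored share: {rate:.0%}")
        if rate > 0.25:
            print("This Special Issue breaks the DOAJ 25% rule")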

Even more intriguingly, for around 11% of the 517 Special Issues of Electronics published in 2023-4, the Guest Editor doesn't seem to have done any editing. We can tell this because Special Issues are supposed to list who acted as Academic Editor for each paper. MDPI journals vary in how rigorously they implement that rule - some journals have no record of who was the Academic Editor. But most do, and in most Special Issues, as you might expect, the Guest Editor is the Academic Editor, except for any papers where there is a conflict of interest (e.g. if authors are Guest Editors or are from the same institution as the Guest Editor). Where the Guest Editor cannot act as Academic Editor, the MDPI guidelines state that this role will be taken by a member of the Editorial Board. But, guess what? Sometimes that doesn't happen. To someone with a suspicious frame of mind and a jaundiced view of how paper mills operate, this is a potential red flag.
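
Checking for this pattern is also straightforward once the Academic Editor listed for each paper has been recorded. Here is a minimal sketch of that check, again with hypothetical data; papers co-authored by a guest editor are excluded, since a conflict of interest legitimately requires a different editor for those:

    # Sketch: did the Guest Editor(s) act as Academic Editor for anything in
    # their own Special Issue? Papers they co-authored are excluded, because
    # a conflict of interest means someone else must handle those.

    def guest_editor_did_no_editing(articles, guest_editors):
        """True if no guest editor handled any paper they were free to handle."""
        editors = {g.lower() for g in guest_editors}
        eligible = [a for a in articles
                    if not ({au.lower() for au in a["authors"]} & editors)]
        handled = [a for a in eligible
                   if a.get("academic_editor", "").lower() in editors]
        return len(eligible) > 0 and len(handled) == 0

    if __name__ == "__main__":
        guest_editors = ["G. Editor"]   # hypothetical
        articles = [
            {"authors": ["G. Editor", "X. Coauthor"], "academic_editor": "A. Board Member"},
            {"authors": ["Y. Author"], "academic_editor": "A. Board Member"},
            {"authors": ["Z. Author"], "academic_editor": "A. Board Member"},
        ]
        if guest_editor_did_no_editing(articles, guest_editors):
            print("Red flag: the Guest Editor never acted as Academic Editor")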

Accordingly, I decided to check PubPeer comments for individuals in three editorial roles at Electronics for the years 2023-4:

  • Those listed as being in a formal editorial role on the journal website. 
  • Those acting as Guest Editors.
  • Those acting as Academic Editors, despite not being in the other two categories.

For Editors, a PubPeer search by name revealed that 213 of 931 had one or more comments. That sounds alarming, but it cannot be taken at face value, because there are many innocent reasons for this result. The main one is namesakes: this is particularly common with Chinese names, which tend to be less distinctive than Western names. It is therefore important to match PubPeer comments on affiliations as well as names (a minimal sketch of this matching step follows the examples below). Using this approach, it was depressingly easy to find instances of Editors who appeared to be associated with paper mills. I will mention just three, to illustrate the kind of evidence that PubPeer provides, but remember, there are many others deserving of scrutiny.

  • As well as being a section board member of Electronics, Danda B Rawat (Department of Electrical Engineering and Computer Science, Howard University, Washington, DC 20059, USA) is Editor-in-Chief of Journal of Cybersecurity and Privacy, and a section board member of two further MDPI journals: Future Internet and Sensors. A PubPeer search reveals him to be co-author of one paper with tortured phrases, and another where equations make no sense. He is listed as Editor of three MDPI Special Issues: "Human Computer Communications and Internet of Things" (Multimodal Technologies and Interaction), "Frontiers in Mobile Multimedia Communications" (Sensors), and "Applied Cryptography" (Journal of Cybersecurity and Privacy).
  • Aniello Castiglione (Department of Management & Innovation Systems, University of Salerno, Italy) is a Section Board Member of three journals: Electronics, Future Internet, and Journal of Cybersecurity and Privacy, and an Editorial Board member of Sustainability. PubPeer reveals that he has co-authored one paper that was recently retracted because of compromised editorial processing, and that his papers are heavily cited in several other articles that appear to be used as vehicles for citation stacking.
  •  Natalia Kryvinska (Department of Information Systems, Faculty of Management, Comenius University in Bratislava, Slovakia) is a Section Board Member of Electronics. She has co-authored several articles with tortured phrases.
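
To make the affiliation-matching step concrete, here is a minimal sketch of how one might filter PubPeer hits for an editor's name. It assumes the search results have already been collected into a simple list of records (by hand, or with the sleuths' tools mentioned at the end of this post); the records shown are invented, and any surviving matches would still need manual checking:

    # Sketch: reduce namesake false positives by requiring both a name match
    # and an affiliation keyword match against previously collected PubPeer
    # records. The records here are invented for illustration.

    def plausible_matches(records, editor_name, affiliation_keywords):
        """Keep records where the editor's name AND an affiliation keyword appear."""
        name = editor_name.lower()
        keywords = [k.lower() for k in affiliation_keywords]
        hits = []
        for rec in records:
            authors = " ".join(rec["authors"]).lower()
            affiliations = " ".join(rec.get("affiliations", [])).lower()
            if name in authors and any(k in affiliations for k in keywords):
                hits.append(rec)
        return hits

    if __name__ == "__main__":
        records = [   # hypothetical PubPeer hits for the name searched
            {"title": "Paper A", "authors": ["Jane Q. Editor"],
             "affiliations": ["Department of Computing, Example University"]},
            {"title": "Paper B", "authors": ["Jane Q. Editor"],
             "affiliations": ["Hospital of Somewhere Else"]},
        ]
        for hit in plausible_matches(records, "Jane Q. Editor", ["Example University"]):
            print("Possible match, needs manual checking:", hit["title"])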

Turning to the 1326 Guest Editors of Special Issues, there were 500 with at least one PubPeer comment, but as before, note that in many cases name disambiguation is difficult, so this will overestimate the problem. Once again, while it may seem invidious to single out specific individuals, it seems important to show the kinds of issues that can be found among those who are put in this important gatekeeping role. 

Finally, let's look at the category of Academic Editors who aren't listed as journal Editors. It's unclear how they are selected and who approves their selection. Again, among those with PubPeer comments, there's a lot to choose from. I'll focus here on three who have been exceptionally busy doing editorial work on several special issues. 

  • Gwanggil Jeon (Incheon National University, Korea) has acted as Academic Editor for 18 Special Issues in Electronics. He is not on the Editorial Board of the journal, but he has been Guest Editor for two special issues in Remote Sensing and one in Sensors. PubPeer comments note recycled figures and irrelevant references in papers that he has co-authored, as well as a problematic Special Issue that he co-edited for Springer Nature, which led to several retractions.
  • Hamid Reza Karimi (Department of Mechanical Engineering, Politecnico di Milano, Milan, Italy) has acted as Academic Editor for 12 Special Issues in Electronics. He was previously Guest Editor for two Special Issues of Electronics, one of Sensors, one of Micromachines, and one of Machines.  In 2022, he was specifically called out by the IEEE for acting "in violation of the IEEE Principles of Ethical Publishing by artificially inflating the number of citations" for several articles. 
  • Finally, Juan M. Corchado (University of Salamanca, Spain) has acted as Academic Editor for 29 Special Issues. He was picked up by my search because he is not currently listed as an Editor for Electronics, but that seems to be a relatively recent change: when searching for information, I found this interview from 2023. Thus his role as Academic Editor seems legitimate. Furthermore, as far as PubPeer is concerned, I found only one old comment, relating to duplicate publication. However, he is notorious for boosting citations to his work by unorthodox means, as described in this article.* I guess we could regard his quiet disappearance from the Editorial Board as a sign that MDPI are genuinely concerned about editors who try to game the system. If so, we can only hope that they employ some experts who can do the kinds of cross-checking that I have described here at scale. If I can find nine dubious editors of one journal in a couple of hours of searching, then surely the publisher, with all its financial resources, could uncover many more if they really tried.

Note that many of the editors featured here have quite substantial portfolios of publications. This makes me dubious about MDPI's latest strategy for improving integrity - to use an AI tool to select potential reviewers "from our internal databases with extensive publication records". That seems like an excellent way to keep paper millers in control of the system. 

Although the analysis presented here just scratches the surface of the problem, it would not have been possible without the help of sleuths who made it straightforward to extract the information I needed from the internet. My particular thanks to Pablo Gómez Barreiro, Huanzi and Sholto David.

I want to finish by thanking the sleuths who attempt to decontaminate the literature by posting comments to PubPeer. Without their efforts it would be much harder to keep track of paper millers. The problem is large and growing. Publishers are going to need to invest seriously in employing those with the expertise to tackle this issue. 

 *As I was finalising this piece, this damning update from El Pais appeared. It seems that many retractions of Corchado papers are imminent.  

 I can't keep up.... here's today's news. 


** P.S. 25th Sept 2024. DOAJ inform me that Electronics was removed from their directory in June of this year. 

*** P.P.S. 26th Sept 2024.  Guillaume Cabanac pointed me to this journal-level report on PubPeer, where he noted a high rate of Electronics papers picked up by the Problematic Paper Screener.

Saturday, 14 September 2024

Prodding the behemoth with a stick

Like many academics, I was interested to see an announcement on social media that a US legal firm had filed a federal antitrust lawsuit against six commercial publishers of academic journals: (1) Elsevier B.V.; (2) Wolters Kluwer N.V.; (3) John Wiley & Sons, Inc.; (4) Sage Publications, Inc.; (5) Taylor and Francis Group, Ltd.; and (6) Springer Nature AG & Co, on the grounds that "In violation of Section 1 of the Sherman Act, the Publisher Defendants conspired to unlawfully appropriate billions of dollars that would have otherwise funded scientific research".   

So far, so good. I've been writing about the avaricious practices of academic publishers for over 12 years, and there are plenty of grounds for a challenge.

However, when I saw the case being put forward, I was puzzled.  From my perspective, the arguments just don't stack up.  In particular, three points are emphasised in the summary (quoted verbatim here from the website): 

  • First, an agreement to fix the price of peer review services at zero that includes an agreement to coerce scholars into providing their labor for nothing by expressly linking their unpaid labor with their ability to get their manuscripts published in the defendants’ preeminent journals.

But it's not true that there is an express link between peer review and publishing papers in the pre-eminent journals.  In fact, many journal editors complain that some of the most prolific authors never do any peer review - gaining an advantage by not adopting the "good citizen" behaviour of a peer reviewer.  I think this point can be rapidly thrown out.

  • Second, the publisher defendants agreed not to compete with each other for manuscripts by requiring scholars to submit their manuscripts to only one journal at a time, which substantially reduces competition by removing incentives to review manuscripts promptly and publish meritorious research quickly. 

This implies that the rationale for not allowing multiple submissions is to reduce competition between publishers.  But if there were no limit on how many journals you could simultaneously submit to, then the number of submissions to each journal would grow massively, increasing the workload for editors and peer reviewers - and much of their time would be wasted. This seems like a rational requirement, not a sinister one.

  • Third, the publisher defendants agreed to prohibit scholars from freely sharing the scientific advancements described in submitted manuscripts while those manuscripts are under peer review, a process that often takes over a year. As the complaint notes, “From the moment scholars submit manuscripts for publication, the Publisher Defendants behave as though the scientific advancements set forth in the manuscripts are their property, to be shared only if the Publisher Defendant grants permission. Moreover, when the Publisher Defendants select manuscripts for publication, the Publisher Defendants will often require scholars to sign away all intellectual property rights, in exchange for nothing. The manuscripts then become the actual property of the Publisher Defendants, and the Publisher Defendants charge the maximum the market will bear for access to that scientific knowledge.” 

Again, I would question the accuracy of this account.  For a start, in most science fields, peer review is a matter of weeks or months, not "over a year".  But also, most journals these days allow authors to post their articles as preprints, prior to, or at the point of submission. In fact, this is encouraged by many institutions, as it means that a Green Open Access version of the publication is available, even if the work is subsequently published in a pay-to-read version.  

In all, I am rather dismayed by this case, especially when there are very good grounds on which academic publishers can be challenged.  For instance:

1. Academic publishers claim to ensure quality control of what gets published, but some of them fail to do the necessary due diligence in selecting editors and reviewers, with the result that the academic literature is flooded with weak and fraudulent material, making it difficult to distinguish valuable from pointless work, and creating an outlet for damaging material, such as pseudoscience. This has become a growing problem with the advent of paper mills.

2. Many publishers are notoriously slow at responding to credible evidence of serious problems in published papers. It can take years to get studies retracted, even when they have important real world consequences.

3. Perhaps the only point in common with the case brought by Lieff Cabraser Heimann & Bernstein concerns the issue of retention of intellectual property rights. It is the case that publishers have traditionally required authors to sign away copyright of their works. In the UK, at least, there has been a movement to fight this requirement, which has had some success, but undoubtedly more could be done.

If I can find time I will add some references to support some of the points above - this is a hasty response to discussion taking place on social media, where many people seem to think it's great that someone is taking on the big publishers. I never thought I would find myself in a position of defending them, but I think if you are going to attack a behemoth, you need to do so with good weapons.  

Postscript

Comments on this post are welcome - there is moderation so they don't appear immediately.

 Nick Wise attempted unsuccessfully to add a comment (sorry, Blogger can be weird), providing this helpful reference on typical duration of peer review.  Very field-dependent and may be a biased sample, I suspect, but it gives us a rough idea.

PPS. 5th October 2024.

Before I wrote this blogpost, I contacted the legal firm involved, Lieff Cabraser Heimann & Bernstein, via their website, to raise the same points. Yesterday I received a reply from them, explaining that "Because you are located abroad, unfortunately you are not a member of this class suit". This suggests they don't read correspondence sent to them. Not impressed.

Wednesday, 4 September 2024

Now you see it, now you don't: the strange world of disappearing Special Issues at MDPI

There is growing awareness that Special Issues have become a menace in the world of academic publishing, because they provide a convenient way for large volumes of low quality work to be published in journals that profit from a healthy article processing charge. There has been a consequent backlash against Special Issues, with various attempts to rein them in. Here I'll describe the backstory and show how such attempts are being subverted. 

Basically, when it became normative for journals to publish open access papers in exchange for an article processing charge, many publishers saw an opportunity to grow their business by expanding the number of articles they published. There was one snag: to maintain quality standards, one requires academic editors to oversee the peer review process and decide what to publish. The solution was to recruit large numbers of temporary "guest editors", each of whom could invite authors to submit to a Special Issue in their area of expertise; this cleverly solved two problems at once: it provided a way to increase the number of submissions to the journal, and it avoided overloading regular academic editors. In addition, if an eminent person could be persuaded to act as guest editor, this would encourage researchers to submit their work to a Special Issue.

Problems soon became apparent, though. Dubious individuals, including those running paper mills, seized the opportunity to volunteer as guest editors, and then proceeded to fill Special Issues with papers that were at best low quality and at worst fraudulent. As described in this blogpost, the publisher Wiley was badly hit by fallout from the Special Issues programme, with its Hindawi brand being 'sunsetted' in 2023. In addition, the Swiss National Science Foundation declared that they would not fund APCs for articles in Special Issues, on the grounds that the increase in the number of Special Issues was associated with shorter processing times and lower rejection rates, suggestive of rushed and superficial peer review. Other commentators noted the reputational risks of overreliance on Special Issues.

Some publishers that had adopted the same strategy for growth looked on nervously, but basically took the line that the golden goose should be tethered rather than killed, introducing various stringent conditions around how Special Issues operated. The publisher MDPI, one of those that had seen massive growth in Special Issues in recent years, issued detailed guidelines.

One of these concerned guest editors publishing in their own special issues. These guidelines have undergone subtle changes over time, as evidenced by these comparisons of different versions (accessed via Wayback Machine):
JUNE 2022: The special issue may publish contributions from the Guest Editor(s), but the number of such contributions should be limited to 20%, to ensure the diversity and inclusiveness of authorship representing the research area of the Special Issue.... Any article submitted by a Guest Editor will be handled by a member of the Editorial Board.

 21 JAN 2023: The special issue may publish contributions from the Guest Editor(s), but the number of such contributions should be limited, to ensure the diversity and inclusiveness of authorship representing the research area of the Special Issue. Any article submitted by a Guest Editor will be handled by a member of the Editorial Board.

2 JAN 2024: The special issue may publish contributions from the Guest Editor(s), but the number of such contributions should be limited to 25%, to ensure the diversity and inclusiveness of authorship representing the research area of the Special Issue. Any article submitted by a Guest Editor will be handled by a member of the Editorial Board.

3 MAY 2024: The special issue may publish contributions from the Guest Editor(s), but the number of such contributions should be limited, to ensure the diversity and inclusiveness of authorship representing the research area of the Special Issue. Any article submitted by a Guest Editor will be handled by a member of the Editorial Board.

The May 2024 version of the guidelines is nonspecific but problematic, because it is out of alignment with the criteria for accreditation by the Directory of Open Access Journals (DOAJ), who state "Papers submitted to a special issue by the guest editor(s) must be handled under an independent review process and make up no more than 25% of the issue's total". Most of MDPI's journals are listed on DOAJ, which is a signal of trustworthiness.

So, how well is MDPI doing in terms of the DOAJ criteria? I was first prompted to ask this question when writing about an article in a Special Issue of Journal of Personalized Medicine that claimed to "reverse autism symptoms". You can read my critique of that article here; one question it raised was how on earth did it ever get published? I noted that the paper was handled by a guest editor, Richard E. Frye, who had coauthored 7 of the 14 articles in the Special Issue. I subsequently found that between 2021 and 2024 he had published 30 articles in Journal of Personalized Medicine, most in three special issues where he was guest editor. I'm pleased to say that DOAJ have now delisted the journal from their Directory. But this raises the question of how well MDPI is regulating their guest editors to prevent them going rogue and using a Special Issue as a repository for papers by themselves and their cronies.

To check up on this, I took a look at Special Issues published in 2023-2024 in 28 other MDPI journals*, focusing particularly on those with implications for public health. What I found was concerning at four levels. 

  1. Every single journal I looked at had Special Issues that broke the DOAJ rule of no more than 25% papers co-authored by guest editors (something DOAJ refer to as "endogeny").  Some of these can be found on PubPeer, flagged with the term "stuffedSI". 
  2. A minority of Special Issues conformed to the description of a "successful Special Issue" envisaged by the MDPI guidelines: "Normally, a successful Special Issue consists of 10 or more papers, in addition to an editorial (optional) written by the Guest Editor(s)." For the journals I looked at around 60% of Special Issues had fewer than 10 articles. 
  3. Quite often, the listed guest editors did not actually do any editing. One can check this by looking at the Academic Editor listed for each article. Here's one example, where a different editor was needed for three of the nine papers to avoid a conflict of interest, because they were co-authored by the guest editors; but the guest editors are not listed as Academic Editor for any of the other six papers in the special issue.
  4. As I did this analysis, I became aware that some articles changed status. For instance, Richard E. Frye, mentioned above, had additional articles in the Journal of Personalized Medicine that were originally part of a Special Issue but are now listed as just belonging to a Section (see https://pubpeer.com/publications/BA21B22CA3FED62B6D3F679978F591#1). This change was not transparent, but was evident when earlier versions of the website were accessed using the Wayback Machine. Some of these are flagged with the term "stealth correction" on PubPeer; a sketch of how such changes can be detected follows this list.
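
Spotting such changes does not require anything sophisticated. The sketch below uses the Internet Archive's CDX API to list archived snapshots of an article page and reports whether each snapshot mentions a Special Issue; the article URL is a placeholder, and the naive string check would need to be replaced by something that reads the actual Special Issue field of the page:

    # Sketch: check archived snapshots of an article page for mentions of a
    # Special Issue, to spot cases where that status was quietly removed.
    # Uses the Internet Archive's CDX API; the article URL is a placeholder.

    import requests

    def snapshots(url, limit=10):
        """Yield (timestamp, archived_url) pairs for a page from the Wayback Machine."""
        cdx = "http://web.archive.org/cdx/search/cdx"
        resp = requests.get(cdx, params={"url": url, "output": "json", "limit": limit},
                            timeout=30)
        rows = resp.json() if resp.text.strip() else []
        for row in rows[1:]:   # first row is the column header
            timestamp = row[1]
            yield timestamp, f"http://web.archive.org/web/{timestamp}/{url}"

    def special_issue_mentions(url):
        """Print, for each snapshot, whether the page text mentions a Special Issue."""
        for timestamp, archived_url in snapshots(url):
            html = requests.get(archived_url, timeout=60).text
            mentioned = "special issue" in html.lower()   # naive check, see caveat above
            print(timestamp, "mentions Special Issue" if mentioned else "no mention")

    if __name__ == "__main__":
        special_issue_mentions("https://www.mdpi.com/xxxx")   # placeholder article URL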

This final observation was particularly worrying, because it indicated that the publisher could change the Special Issue status of articles post-publication. The concern is that lack of oversight of guest editors has created a mechanism whereby authors can propose a Special Issue, get taken on as a guest editor, and then have papers accepted there (either their own, or from friends, which could include papermillers), after which the Special Issue status is removed. In fact, given the growing nervousness around Special Issues, removal of Special Issue status could be an advantage.

When I have discussed these and other issues around MDPI practices with colleagues, credible researchers have told me that there are some excellent journals published by MDPI. It seems unfortunate that, in seeking rapid growth via the mechanism of Special Issues, the publisher has risked its reputation by giving editorial responsibility to numerous guest editors without adequate oversight, and has encouraged quantity over quality. Furthermore, the lack of transparency demonstrated by the publisher covertly removing Special Issue status from articles by guest editors does not appear consistent with their stated commitment to ethical policies.

 *The code for this analysis and a summary chart for the 28 journals can be found on Github.