Sunday, 27 October 2024

I don't care about journal impact factors but I do care about visibility

There's been a fair bit of discussion about Clarivate's decision to pause inclusion of eLife publications on the Science Citation Index (e.g. on Research Professional).  What I find exasperating is that most of the discussion focuses on a single consequence - loss of eLife's impact factor.  For authors, there are graver consequences.   

I've reviewed for eLife but never published there; however, I have published a lot in Wellcome Open Research, another journal that set out to disrupt the traditional publishing model and has some similarities with eLife.  Wellcome Open Research has an unconventional model whereby submitted papers are immediately published, as a kind of preprint, and then updated after peer review.  It is true that some papers don't get sufficient approval to proceed past the peer-review stage, but the distinction between those that do and do not pass peer review is clearly flagged on the article.  In addition to peer review, the journal maintains some quality control by limiting eligibility to researchers funded by Wellcome.  Despite all this, Wellcome Open Research has never been included in the Science Citation Index.  

 

When Wellcome Open Research started up, all Wellcome-funded researchers were encouraged to publish there.  As someone committed to open research, I thought this was a great idea.  There were no publication charges, and everything was open: access to the publication, the data, and the peer review. Peer reviewers even got DOIs for their reviews, some of which are worth citing in their own right.  I was increasingly adopting open practices, and I think some of my best peer-reviewed work is published there. 

 

I was shocked when I discovered that the journal wasn't included in Web of Science. I remember preparing a progress report for Wellcome and using Web of Science to check I hadn't omitted any publications.  I was puzzled that I seemed to have published far less than I remembered. Then it became clear: everything in Wellcome Open Research was missing. 

 

I was on the Advisory Board for Wellcome Open Research at the time, and raised this with them. They were shocked that I was upset.  "We thought you of all people didn't care about impact factors", they said. This, of course, was true. But I did care a lot about my work being visible.  I was also aware that any WoS-based H-index would exclude all the papers listed below: not a big deal for me, but potentially harmful to junior authors.  

 

The reply I got was similar to the argument being made by eLife - well, the articles are indexed in Google Scholar and PubMed.  That was little consolation to me, given that I had relied heavily on Web of Science in my own literature searches, believing that it screened out dodgy journals. (This belief turns out to be false - many journals featured in WoS are very low quality, which just rubs salt into the wound.)  

 

I have some criticisms of eLife's publishing model, but I would like them to succeed. We urgently need alternatives to the traditional journal model operated by the big commercial publishers.  Their response to the open access movement has been to monetise it, with catastrophic consequences for science, as an unlimited supply of shoddy and fake work gets published - often in journals that are indexed in Web of Science.

 

I agree that we need an index of published academic work that has some quality control.  Whether alternatives like OpenAlex will do the job remains to be seen. 

 

Papers that aren't indexed on Web of Science

Bishop, D. V. M., & Bates, T. C. (2020). Heritability of language laterality assessed by functional transcranial Doppler ultrasound: A twin study. Wellcome Open Research, 4, 161. https://doi.org/10.12688/wellcomeopenres.15524.3


Bishop, D. V. M., Brookman-Byrne, A., Gratton, N., Gray, E., Holt, G., Morgan, L., Morris, S., Paine, E., Thornton, H., & Thompson, P. A. (2019). Language phenotypes in children with sex chromosome trisomies. Wellcome Open Research, 3, 143. https://doi.org/10.12688/wellcomeopenres.14904.2


Bishop, D. V. M., Grabitz, C. R., Harte, S. C., Watkins, K. E., Sasaki, M., Gutierrez-Sigut, E., MacSweeney, M., Woodhead, Z. V. J., & Payne, H. (2021). Cerebral lateralisation of first and second languages in bilinguals assessed using functional transcranial Doppler ultrasound. Wellcome Open Research, 1, 15. https://doi.org/10.12688/wellcomeopenres.9869.2


Frizelle, P., Thompson, P. A., Duta, M., & Bishop, D. V. M. (2019). The understanding of complex syntax in children with Down syndrome. Wellcome Open Research, 3, 140. https://doi.org/10.12688/wellcomeopenres.14861.2


Newbury, D. F., Simpson, N. H., Thompson, P. A., & Bishop, D. V. M. (2018). Stage 1 Registered Report: Variation in neurodevelopmental outcomes in children with sex chromosome trisomies: protocol for a test of the double hit hypothesis. Wellcome Open Research, 3, 10. https://doi.org/10.12688/wellcomeopenres.13828.2


Newbury, D. F., Simpson, N. H., Thompson, P. A., & Bishop, D. V. M. (2021). Stage 2 Registered Report: Variation in neurodevelopmental outcomes in children with sex chromosome trisomies: testing the double hit hypothesis. Wellcome Open Research, 3, 85. https://doi.org/10.12688/wellcomeopenres.14677.4


Pritchard, V. E., Malone, S. A., Burgoyne, K., Heron-Delaney, M., Bishop, D. V. M., & Hulme, C. (2019). Stage 2 Registered Report: There is no appreciable relationship between strength of hand preference and language ability in 6- to 7-year-old children. Wellcome Open Research, 4, 81. https://doi.org/10.12688/wellcomeopenres.15254.1


Thompson, P. A., Bishop, D. V. M., Eising, E., Fisher, S. E., & Newbury, D. F. (2020). Generalized Structured Component Analysis in candidate gene association studies: Applications and limitations. Wellcome Open Research, 4, 142. https://doi.org/10.12688/wellcomeopenres.15396.2


Wilson, A. C., & Bishop, D. V. M. (2019). ‘If you catch my drift...’: Ability to infer implied meaning is distinct from vocabulary and grammar skills. Wellcome Open Research, 4, 68. https://doi.org/10.12688/wellcomeopenres.15210.3


Wilson, A. C., King, J., & Bishop, D. V. M. (2019). Autism and social anxiety in children with sex chromosome trisomies: An observational study. Wellcome Open Research, 4, 32. https://doi.org/10.12688/wellcomeopenres.15095.2


Woodhead, Z. V. J., Rutherford, H. A., & Bishop, D. V. M. (2020). Measurement of language laterality using functional transcranial Doppler ultrasound: A comparison of different tasks. Wellcome Open Research, 3, 104. https://doi.org/10.12688/wellcomeopenres.14720.3


Monday, 21 October 2024

What is going on at the Journal of Psycholinguistic Research?

Last week this blog focussed on problems affecting Scientific Reports, a mega-journal published by Springer Nature. This week I look at a journal at the opposite end of the spectrum, the Journal of Psycholinguistic Research (JPR), a small, specialist journal which has published just 2187 papers since it was founded in 1971. This is fewer than Scientific Reports publishes in one year. It was brought to my attention by Anna Abalkina because it shows every sign of having been targeted by one or more Eastern European paper mills.

Now, this was really surprising to me. JPR was founded in 1971 by Robert Rieber, whose obituaries in the New York Times  and the American Psychologist confirm he had a distinguished career (though both misnamed JPR!). The Advisory and Editorial boards of the journal are peppered with names of famous linguists and psychologists, starting with Noam Chomsky. So there is a sense that if this can happen to JPR, no journal is safe.

Coincidentally, last week Anna and I submitted revisions for a commentary on paper mills coauthored with Pawel Matusz. (You can read the preprint here.) Pawel is editor of the journal Mind, Brain & Education (MBE), which experienced an attack by the Tanu.pro paper mill involving papers published in 2022-3. In the commentary, we discussed characteristics of the paper mill's output, which are rather distinctive and quite different from what is seen in basic biomedical or physical sciences. A striking feature is that the IMRaD structure (Introduction, Methods, Results and Discussion) is used, but in a clueless fashion, with these headings inserted into what is otherwise a rambling and discursive piece of text that typically has little or no empirical content. Insofar as any methods are described, they don't occur in the Methods section, and they are too vague for the research to be replicable.

Reading these papers rapidly turns my brain to mush, but in the interest of public service I did wade through five of them and left comments on PubPeer:

Yeleussizkyzy, M., Zhiyenbayeva, N., Ushatikova, I. et al. E-Learning and Flipped Classroom in Inclusive Education: The Case of Students with the Psychopathology of Language and Cognition. J Psycholinguist Res 52, 2721–2742 (2023). https://doi.org/10.1007/s10936-023-10015-y  

Snezhko, Z., Yersultanova, G., Spichak, V. et al. Effects of Bilingualism on Students’ Linguistic Education: Specifics of Teaching Phonetics and Lexicology. J Psycholinguist Res 52, 2693–2720 (2023). https://doi.org/10.1007/s10936-023-10016-x

Nurakenova, A., Nagymzhanova, K. A Study of Psychological Features Related to Creative Thinking Styles of University Students. J Psycholinguist Res 53, 1 (2024). https://doi.org/10.1007/s10936-024-10042-3

Auganbayeva, M., Turguntayeva, G., Anafinova, M. et al. Linguacultural and Cognitive Peculiarities of Linguistic Universals. J Psycholinguist Res 53, 3 (2024). https://doi.org/10.1007/s10936-024-10050-3

Shalkarbek, A., Kalybayeva, K., Shaharman, G. et al. Cognitive Linguistic Analysis of Hyperbole-based Phraseological Expressions in Kazakh and English Languages. J Psycholinguist Res 53, 4 (2024). https://doi.org/10.1007/s10936-024-10052-1

My experience with the current batch of papers suggests that a relatively quick way of screening a submitted paper would be to look at the Methods section. This should describe what was done and how, at a level of detail sufficient for others to replicate the work. Obviously this is not appropriate for theoretical papers, but for those purporting to report empirical work it would work well, at least for the papers I looked at in JPR.
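A crude automated version of this screen is easy to sketch: take the text of a Methods section and check whether it contains any of the concrete markers one would expect in a replicable empirical report (participant numbers, materials, procedure). The keyword list below is purely illustrative - a real screen would need a much richer marker set, and anything flagged would still need human judgement.

```python
import re

# Illustrative markers of concrete methodological detail; this list is
# hypothetical, not a validated instrument.
MARKERS = [r"\bparticipants?\b", r"\bn\s*=\s*\d+", r"\bstimul",
           r"\bprocedure\b", r"\bquestionnaire\b", r"\banalys[ei]s\b"]

def methods_looks_empty(methods_text: str, min_hits: int = 2) -> bool:
    """Flag a Methods section containing fewer than min_hits concrete markers."""
    hits = sum(1 for pat in MARKERS
               if re.search(pat, methods_text, re.IGNORECASE))
    return hits < min_hits

print(methods_looks_empty("Language is a uniquely human faculty."))  # True
print(methods_looks_empty("Participants (n = 40) completed a questionnaire."))  # False
```

The point is not that such a script could replace review, but that even a trivial check of this kind would flag the kind of Methods-free "empirical" papers described above.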

All of these papers have authors from Kazakhstan, sometimes with co-authors from the Russian Federation. This led me to look at the geographic distribution of authors in the journal over time. The top countries represented by JPR authors from 2020 onwards are China (113), United States (68), Iran (52), Germany (28), Saudi Arabia (22) and Kazakhstan (19). However, these composite numbers mask striking trends. All the Kazakhstan-authored papers are from 2023-2024. There's also a notable fall-off in papers by USA-based authors in the same period, with only 11 in total. This is quite remarkable for a journal whose authorship showed a striking USA dominance up until around 2015, as shown in the figure below (screenshot from Dimensions.ai).

 

Figure: Number of papers in JPR from five top countries, 2005-2024. Exported 20 October 2024 from Dimensions® (www.dimensions.ai); criteria: Source Title is Journal of Psycholinguistic Research. © 2024 Digital Science and Research Solutions Inc.
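For anyone who wants to repeat this kind of check on another journal, the tallying step is trivial once you have an export of publication records. The sketch below assumes a hypothetical CSV export with Country and Year columns (the actual format of a Dimensions export may differ) and counts papers per country before and after a cutoff year, which is enough to expose the sort of shift described above.

```python
import csv
import io
from collections import Counter

# Hypothetical sample standing in for a real Dimensions CSV download.
sample = """Country,Year
Kazakhstan,2023
Kazakhstan,2024
United States,2010
United States,2012
China,2021
"""

pre, post = Counter(), Counter()
for row in csv.DictReader(io.StringIO(sample)):
    # Bucket each paper by whether it appeared before or after the cutoff.
    bucket = post if int(row["Year"]) >= 2020 else pre
    bucket[row["Country"]] += 1

print(pre.most_common())   # country counts up to 2019
print(post.most_common())  # country counts from 2020 onwards
```

Comparing the two tallies for a real export would immediately show a country that appears only in the recent period, as Kazakhstan does in JPR.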

Whenever a paper mill infestation is discovered, it raises the question of how it happened. Surely the whole purpose of peer review is to prevent low-quality or fraudulent material entering the literature? In other journals where this has happened, the peer review process was found to have been compromised, with fake peer reviewers being used. Even so, one would have hoped that an editor would scrutinise papers and realise something was amiss. As mentioned in the previous blogpost, it would be much easier to track down how fraudulent papers get into mainstream journals if journals reported information about the editor who handled each paper, and published open peer review.

Whatever the explanation, it is saddening to see a fine journal brought so low. In 2021, at the 50th anniversary of the founding of the journal, the current editor, Rafael Art. Javier, wrote a tribute to his predecessor, Robert Rieber:
"His expectation, as stated in that first issue, was that manuscripts accepted 'must add to knowledge in some way, whether they are in the form of experimental reports, review papers, or theoretical papers...and studies with negative results,' provided that they are of sufficiently high quality to make an original contribution."

Let us hope that the scourge of paper mills can be banished from the journal to allow it to be restored to the status it once had, and for Robert Rieber's words to once more be applicable.

 

Wednesday, 16 October 2024

An open letter regarding Scientific Reports

16th October 2024 

to: Mr Chris Graf
Research Integrity Director, Springer Nature and Chair Elect of the World Conference on Research Integrity Foundation Governing Board.

 

Dear Mr Graf,

We are a group of sleuths and forensic meta-scientists who are concerned that Springer Nature is failing in its duty to protect the scientific literature from fraudulent and low-quality work. We are aware that, as noted in the 2023 Annual Report, you are committed to maintaining research integrity. We agree with the statement: “To solve the world’s biggest challenges, we all need research that’s reliable, trustworthy and can be built on by scientists and innovators. As a leading global research publisher, we have a pivotal role to play.” It is encouraging to hear that the Springer Nature research integrity group doubled in size in 2023. Nevertheless, we have a growing sense that all is not well concerning the mega-journal Scientific Reports.

Some of the work that has been published is so seriously flawed that it is not credible that it underwent any meaningful form of peer review. In other cases, when we have reported flawed papers to the editor or integrity team, the response has been inadequate. A striking example cropped up last week when a “corrected” version of an article was published in Scientific Reports. This article had been flagged up by Guillaume Cabanac as containing numerous “tortured phrases” that are indicative of fraudulent authors attempting to bypass plagiarism checks; the authors were allowed to “correct” the article by merely removing some (not all) of the tortured phrases. This led some of us to look more closely at the article. As is evident from comments on PubPeer, it turned out to be a kind of case study of all the red flags for fraud that we look for. As well as (still uncorrected) tortured phrases, it contained irrelevant content, irrelevant citations, meaningless gibberish, a nonsensical figure, and material recycled from other publications.

This is perhaps the most flagrant example, but we argue that it indicates problems with your editorial processes that are not going to be fixed by AI. The only ways an article like this can have been published are through editorial negligence or outright malpractice. Negligence would require a remarkable degree of professional incompetence from a handling editor. Malpractice would mean a corrupt handling editor who bypasses the peer review process entirely or willingly appoints corrupt peer reviewers to approve the manuscript. We appreciate that some papers that we and others have reported have been retracted, but in other cases blatantly fraudulent papers can take years to be retracted or to receive any appropriate editorial action.

We have some specific suggestions for actions that Springer Nature could take to address these issues.

  1. Employ a task force of people with the necessary expertise to carry out an urgent audit of all editors of Scientific Reports. We have looked at the editors on your website, and it is clear that this is an enormous task, given that there are over 13,000 of them, and they are not listed with disambiguating information such as ORCID iDs. Even so, in a few hours, by cross-checking this list against PubPeer, it was possible to identify the 28 cases listed below, covering a range of disciplines, and all, in our view, with pretty clear-cut evidence of problems. Four are members of the Editorial Board. We stress that this is just the low-hanging fruit, which was fairly easy to detect.
  2. The list of problematic articles appended below, or tabulated on the Problematic Paper Screener, might provide an alternative route to identifying editors who should never have been given a gatekeeping role in academic publishing. As well as checking the papers we list below, we recommend that all other articles accepted by the same editors be scrutinised.
  3. Detection of problematic articles and editors could be helped by requiring open peer review for all journals, and by ensuring that the name and ORCID iD of the handling editor are included in the published metadata for all articles.
We hope these suggestions will be helpful in ensuring that research published in Scientific Reports is reliable and trustworthy.

Yours sincerely

Dorothy Bishop
Guillaume Cabanac
François-Xavier Coudert
René Aquarius
Nick Wise
Lonni Besançon
Simon A.J. Kimber
Anna Abalkina
Rickard Carlsson
Samuel J Westwood
Patricia Murray
Nicholas J. L. Brown
Smut Clyde
Leonid Schneider
Ian Hussey
Tu Duong
Gustav Nilsonne
Jamie Cummins
Alexander Magazinov
Elisabeth Bik
Mu Yang
Corrado Viotti
Sholto David


 

Appendices

1. Some examples of editors with concerning PubPeer entries

Editorial board Ghulam Md Ashraf
Editorial board Eun Bo Shim
Editorial board Ajay Goel
Editorial board Rasoul Kowsar

AGEING Vittorio Calabrese
AGRICULTURE Sudip Mitra
ANALYTICAL CHEMISTRY Syed Ghulam Musharraf
CELL BIOLOGY Gabriella Dobrowolny
CHEMICAL ENGINEERING Enas Taha Sayed
CIVIL ENGINEERING Manoj Khandelwal
CLINICAL ONCOLOGY Marcello Maggiolini
COMPUTATIONAL SCIENCE Praveen Kumar Reddy Maddikunta
DRUG DISCOVERY Salvatore Cuzzocrea
ENDOCRINOLOGY Sihem Boudina
ENVIRONMENTAL ENGINEERING Rama Rao Karri
ENVIRONMENTAL SCIENCE Mayeen Uddin Khandaker
GASTROENTEROLOGY AND HEPATOLOGY Sharon DeMorrow
IMMUNOLOGY Marcin Wysoczynski
INFECTIOUS DISEASES Fatah Kashanchi
MATHEMATICAL PHYSICS Ilyas Khan
MICROBIOLOGY Massimiliano Galdiero
NETWORKS AND COMPLEX SYSTEMS Achyut Shankar
NEUROLOGY Yvan Torrente
RESPIRATORY MEDICINE Soni Savai Pullamsetti
STRUCTURAL AND MOLECULAR BIOLOGY Stefania Galdiero
2. Some examples of problematic articles

https://pubpeer.com/publications/42901FD2901EC917E3EE54B8DBD749#4 (authors claim a correction is underway, but none published for 2 years)
https://pubpeer.com/publications/01FE09F1127DF0598985987677A101 (part of a list of many flagged papers from this author group; corrected rather than retracted)
https://pubpeer.com/publications/69EDBAECD50F31B051ECECCD1DF346 (publisher notified on 31-3-2023 about this paper; no action so far)
https://pubpeer.com/publications/F8A1AD2B165888A06C18B28C860E7B (EiC contacted Nov. 22 with authorship concerns and responded that he would investigate; no action taken so far)
https://pubpeer.com/publications/286F83F9553D29F82CD4281309A1E4 (has had an EoC for authorship irregularities since July 22; no action taken since)
https://pubpeer.com/publications/5BEDDDA9CF92B9CDDD2AB1AA796271 (blatantly nonsensical paper reported to publisher in June 2024; no action as yet)
https://pubpeer.com/publications/37B87CAC48DE4BC98AD40E00330143 (various corrections since 2022, and in Feb 2023 readers were told “conclusions of this article are being considered by the Editors. A further editorial response will follow the resolution of these issues”. 19 months later we are still waiting.)


3. Some examples of journal-level reports posted on PubPeer

Scientific Reports

other Springer Nature journals:

Chemosphere