The journal that allowed a bizarre article linking Covid-19 to 5G cell phone waves to “slip through the net” now blames rigged peer review for the fishy paper.
As we reported last month, the journal initially simply withdrew the article without explanation. But the publisher, Biolife, then provided us with a few less-than-satisfying excuses, such as:
A journal has allowed a geophysicist who cited his own work hundreds of times across 10 papers to retract the articles and republish them with a fraction of the self-citations.
From 2017 to 2019, Yangkang Chen published some of the papers in Geophysical Journal International, an Oxford University Press title, while he was a postdoc at Oak Ridge National Laboratory in the U.S., and some as a faculty member at Zhejiang University in China. In April, the journal issued expressions of concern for the works.
In Retraction Watch world, it’s like finding long-buried and forgotten treasure.
A now-defunct journal retracted nearly four dozen papers in a single sweep, citing questions about the integrity of the peer review process for the articles.
The Open Automation and Control Systems Journal, formerly published by Bentham, released a list of 46 articles, which it published in 2015, by researchers from various institutions in China. Bentham dates the retractions to 2016. We learned about the case from a commenter on our recent post about a mysterious incident of plagiarism.
The research process is rarely straightforward. There are myriad ways in which it can go wrong, from the inception of a hypothesis that goes on to be disproved, to failed experiments and rejected manuscripts, hopefully ending in the “happily ever after” of adding to the scholarly record through publication and worldwide dissemination… before starting all over again. Being able to build on the corpus of existing knowledge is essential for future discoveries and innovation. As Newton wrote back in 1675, “If I have seen further, it is by standing upon the shoulders of giants.”
Sadly, we know that even once published, many scientific results are not easily reproducible, and some are amended or retracted. Fraud and misconduct might be the attention-grabbing explanations for the lack of reproducibility in research, but more often than not, it is honest mistakes, or decisions made with inaccurate or incomplete information, that lead to errata, corrigenda or retractions of articles. Many have argued we need to be more honest about this – and to see retraction as a good thing. Correction of the version of record should be embraced, rather than avoided, and the stigma surrounding retractions should be removed.
Following pushback from members of the taxonomy community, Clarivate Analytics, the company behind the Impact Factor, has reversed its decision to suppress two journals from receiving those scores this year.
As we reported in late June, Clarivate suppressed 33 journals from its Journal Citation Reports, which meant denying them an Impact Factor, for high levels of self-citation that boosted their scores and ranking. Many universities — controversially — rely on Impact Factor to judge the work of their researchers, so the move could have a dramatic effect on journals and the authors whose work appears in them.
A pediatric infectious disease specialist in California “recklessly” fabricated his data in a 2009 published study and four grant submissions, worth millions of dollars, to the National Institutes of Health, according to the U.S. Office of Research Integrity (ORI).
A study that compared drugs used to reverse the effects of muscle relaxants given during surgery has been retracted because the majority of the results had already been published.
The work found that the drug sugammadex worked faster than pyridostigmine in children undergoing surgery, and doesn’t appear to have anything wrong with it. But a study by the same authors, with the same title (barring a single uncapitalized letter), had already been published in the journal Anesthesia and Pain Medicine on July 31, 2019.
A study on a wireless communication algorithm was retracted for being an exact duplicate of a paper submitted to a separate journal last year — but the authors were different and it’s unclear how they got hold of it.
At least two more journals are fighting decisions by Clarivate — the company behind the Impact Factor — to suppress them from the 2019 list of journals assigned a metric that many rightly or wrongly consider career-making.
In a letter to the editorial board of Body Image, an Elsevier journal that was one of 33 suppressed by Clarivate for excessive self-citation, editor in chief Tracy Tylka and nine journal colleagues write: