A journal has allowed a geophysicist who cited his own work hundreds of times across 10 papers to retract the articles and republish them with a fraction of the self-citations.
From 2017 to 2019, Yangkang Chen published some of the papers in Geophysical Journal International, an Oxford University Press title, while he was a postdoc at Oak Ridge National Laboratory in the U.S., and some as a faculty member at Zhejiang University in China. In April, the journal issued expressions of concern for the works.
A surgery journal has retracted seven papers by a group in South Korea after an institutional investigation found evidence of “intentional, repetitive, and serious misconduct” in the work.
The articles, by a team at Ewha Womans University and Seoul National University College of Medicine, appeared in the Journal of Hand Surgery (European Volume) between 2016 and 2019.
According to the journal, in June of 2019, a reviewer raised questions about the sample size in a manuscript by the authors (most of whom remain the same across the papers) — triggering an inquiry by the editors that led to Ewha’s investigation.
That investigation found that the first author on all seven articles, Young Hak Roh, an orthopedic surgeon at Ewha, had committed sweeping violations:
A group of researchers in China has lost a 2018 paper after whistleblowers informed the journal that the authors had misreported their data.
The paper, “Long‐term outcomes of 530 esophageal squamous cell carcinoma patients with minimally invasive Ivor Lewis esophagectomy,” appeared in the Journal of Surgical Oncology, a Wiley publication. It has been cited five times, according to Clarivate Analytics’ Web of Science. The researchers were affiliated with Zhejiang University, in Hangzhou.
In Retraction Watch world, it’s like finding long-buried and forgotten treasure.
A now-defunct journal retracted nearly four dozen papers in a single sweep, citing questions about the integrity of the peer review process for the articles.
The Open Automation and Control Systems Journal, formerly published by Bentham, released a list of 46 articles, all published in 2015 by researchers from various institutions in China. Bentham dates the retractions to 2016. We learned about the case from a commenter on our recent post about a mysterious incident of plagiarism.
The journal that recently ran a controversial essay on poverty and race has flagged it with an editor’s note letting readers know about an investigation into the work.
As we reported last week, Society, a Springer Nature title, published a paper by Lawrence Mead, of New York University, who argued that poor Blacks and Hispanics lack certain cultural traits that help European whites succeed in the face of economic adversity:
An infectious diseases researcher found by a federal U.S. watchdog to have “recklessly” faked data in grants worth millions left his job as the investigation was coming to a close, Retraction Watch has learned.
The research process is rarely straightforward. There are myriad ways in which it can go wrong, from the inception of a hypothesis that goes on to be disproved, to failed experiments and rejected manuscripts, hopefully ending in the "happily ever after" of adding to the scholarly record through publication and worldwide dissemination… before starting all over again. Being able to build on the corpus of existing knowledge is essential for future discoveries and innovation. As Newton wrote back in 1675, "If I have seen further, it is by standing upon the shoulders of giants."
Sadly, we know that even once published, many scientific results are not easily reproducible, and some are amended or retracted. Fraud and misconduct may be the attention-grabbing explanations for the lack of reproducibility in research, but more often than not, it is honest mistakes, or decisions made with inaccurate or incomplete information, that lead to errata, corrigenda, or retractions of articles. Many have argued we need to be more honest about this, and to see retraction as a good thing. Correction of the version of record should be embraced rather than avoided, and the stigma surrounding retractions should be removed.
Following pushback from members of the taxonomy community, Clarivate Analytics, the company behind the Impact Factor, has reversed its decision to suppress two journals from receiving those scores this year.
As we reported in late June, Clarivate suppressed 33 journals from its Journal Citation Reports, which meant denying them an Impact Factor, for high levels of self-citation that boosted their scores and ranking. Many universities — controversially — rely on Impact Factor to judge the work of their researchers, so the move could have a dramatic effect on journals and the authors whose work appears in them.
The response to our request for comment from editor in chief Pio Conti reads a bit like a Mad Libs of excuses we hear from publishers when something goes wrong. Read carefully for: