Didier Raoult, whose claims that hydroxychloroquine can treat COVID-19 have been widely disputed, has had a 2018 paper corrected for what his team says was unintentional duplication of a figure.
A group of data sleuths is calling for the retraction of seven articles by an exercise physiologist in Brazil whose data they believe to be “highly unlikely” to have occurred experimentally.
In a preprint posted to the server SportRxiv, the group — led by Andrew Vigotsky, a biomedical engineer at Northwestern University — details their concerns about the work of Matheus Barbalho, a PhD student at the Centro de Ciências Biológicas e da Saúde, part of the Universidade da Amazônia, in Belém. Barbalho’s mentor is Paulo Gentil.
In addition to the preprint, titled “Improbable data patterns in the work of Barbalho et al,” Greg Nuckols, one of the coauthors, has posted a lengthy “explainer” about the analysis.
A study comparing drugs used to reverse the effects of muscle relaxants given during surgery has been retracted because most of its results had already been published.
The work found that the drug sugammadex acted faster than pyridostigmine in children undergoing surgery, and nothing appears to be wrong with the findings themselves. But a study by the same authors, with the same title barring a single uncapitalized letter, had already been published in the journal Anesthesia and Pain Medicine on July 31, 2019.
An anesthesia journal has retracted a 2020 paper by a group from China, Turkey and the United States after a post-publication review discovered issues with the analysis.
According to the notice in the European Journal of Anaesthesiology:
An Elsevier journal plans to issue a retraction notice this week about a widely criticized 2012 paper claiming to find links between skin color, aggression, and sexuality.
The paper was the subject of a highly critical Medium post in November 2019, and of a petition with more than 1,000 signatures sent to Elsevier earlier this month.
The four-page retraction notice, provided to Retraction Watch by Elsevier, begins with a description of the history, policies and procedures at the journal, then launches into a litany of issues with the paper:
A study whose title suggested an “effective” way to give birth during the coronavirus pandemic has been temporarily retracted because the publisher says the word “effective” was included in the title by accident.
The method (pictured above) involved an enclosed, transparent chamber walling off the mother’s upper half from the rest of the world. It wasn’t very well received, according to an Essential Baby article that cited Twitter users referring to the “delivery table shield” as a “labor cage” and “greenhouse.”
A Japanese anesthesiologist who just notched his sixth retraction apologized for his misconduct and said his institution is now investigating his entire body of work.
Hironobu Ueshima, of Showa University in Tokyo, who has roughly 170 publications, told Retraction Watch by email:
Earlier this month, we reported on the retraction of two papers by a Japanese anesthesiologist for unreliable data. At the time, we noted that the case of Hironobu Ueshima bore watching, given that his publication count runs to about 170.
Those two retractions followed an earlier one from the journal Medicine in April. Now, another anesthesia journal has retracted three more papers by Ueshima, citing misconduct, bringing his total to six.
The articles appeared in Regional Anesthesia & Pain Medicine in 2016 and 2019. According to the notice:
Often, when confronted with allegations of errors in papers they have published, journal editors encourage researchers to submit letters to the editor. Based on what we hear from such letter writers, however, the journals don’t make publication an easy process. Here’s one such story from a group at Indiana University: Luis M. Mestre, Stephanie L. Dickinson, Lilian Golzarri-Arroyo, and David B. Allison.
In late 2018, in the course of reviewing papers on obesity, one of us (DA) noticed a November 2018 article in the BMC journal Biomedical Engineering Online titled “Randomized controlled trial testing weight loss and abdominal obesity outcomes of moxibustion.” The objective of the study was to determine the effect of moxibustion therapy on weight loss, waist circumference, and waist-to-hip ratio in young Asian females living in Taiwan.
Some of the tabulated data in the paper seemed odd, so DA sent it to members of our research team asking for their input. The team agreed, finding irregularities in the data that seemed inconsistent with a randomized experimental design. After that, the task of carefully and thoroughly checking the published summary statistics and text in the paper was delegated to another of us (LM), and all of his work was rechecked by professional statisticians and the research team.
The apparent inconsistencies and anomalies identified in the paper (i.e., large baseline differences, variance heterogeneity, and a lack of detail in the explanation of the study design) led to concerns about the extent to which the study report represented an accurate description of a properly conducted randomized controlled trial (RCT) and, therefore, whether the conclusions were reliable. Given the importance of reliable conclusions in the scientific literature on obesity treatment, and of the integrity of the scientific literature more broadly, we decided to write a letter to the editor of the journal seeking either clarification or correction.
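For readers curious about how such checks can be run from published summary statistics alone, here is a minimal sketch in Python (using SciPy) of the kind of screening one might do for baseline imbalance and variance heterogeneity in a two-arm trial. The numbers below are invented for illustration; this is not the Indiana group's actual code, method, or data.

```python
# Illustrative screening of reported summary statistics from a two-arm RCT.
# All values are hypothetical and NOT taken from the paper discussed above.
from scipy import stats

# Hypothetical reported baseline values: mean, SD, and n per group
treatment = {"mean": 68.2, "sd": 4.1, "n": 30}
control   = {"mean": 74.9, "sd": 9.3, "n": 30}

# Welch's t-test computed directly from the summary statistics:
# under proper randomization, a very small baseline p-value is surprising.
t_stat, p_val = stats.ttest_ind_from_stats(
    treatment["mean"], treatment["sd"], treatment["n"],
    control["mean"], control["sd"], control["n"],
    equal_var=False,
)
print(f"Baseline difference: t = {t_stat:.2f}, p = {p_val:.4f}")

# Variance ratio (F) test as a rough screen for variance heterogeneity.
f_ratio = (control["sd"] ** 2) / (treatment["sd"] ** 2)
df1, df2 = control["n"] - 1, treatment["n"] - 1
p_var = 2 * min(stats.f.sf(f_ratio, df1, df2), stats.f.cdf(f_ratio, df1, df2))
print(f"Variance ratio: F = {f_ratio:.2f}, p = {p_var:.4f}")
```

A single flag of this kind proves nothing on its own; it is simply the sort of screen that prompts the careful rechecking described above.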
Yesterday, dozens of scientists petitioned the Proceedings of the National Academy of Sciences to “retract a paper on the effectiveness of masks, saying the study has ‘egregious errors’ and contains numerous ‘verifiably false’ statements,” as The New York Times reported. One of those scientists was James Heathers, whose name will likely be familiar to Retraction Watch readers because of his work as a scientific sleuth. We asked him to share his thoughts on why he signed the letter.
The above took a few weeks. To many people, it must have looked like silence. But the entire time, out of the public eye, a furious and detailed global discussion between scientists, statisticians, and epidemiologists about the accuracy of this paper was boiling. Frankly, almost no one believed this paper. The data were too regular. The access that was reported to hospital databases was too unusual. The amount of work done didn’t fit the parameters reported.
It was … “off.” It was obvious. Something was amiss.