An Elsevier journal plans to issue a retraction notice this week about a widely criticized 2012 paper claiming to find links between skin color, aggression, and sexuality.
The paper was the subject of a highly critical Medium post in November 2019, and of a petition with more than 1,000 signatures sent to Elsevier earlier this month.
The four-page retraction notice, provided to Retraction Watch by Elsevier, begins with a description of the history, policies and procedures at the journal, then launches into a litany of issues with the paper:
A study whose title suggested an “effective” way to give birth during the coronavirus pandemic has been temporarily retracted because the publisher says the word “effective” was included in the title by accident.
The method (pictured above) involved an enclosed, transparent chamber walling off the mother’s upper half from the rest of the world. It wasn’t very well received, according to an Essential Baby article that cited Twitter users referring to the “delivery table shield” as a “labor cage” and “greenhouse.”
A Japanese anesthesiologist who just notched his sixth retraction apologized for his misconduct and said his institution is now investigating his entire body of work.
Hironobu Ueshima, of Showa University in Tokyo, who has roughly 170 publications, told Retraction Watch by email:
Earlier this month, we reported on the retraction of two papers by a Japanese anesthesiologist for unreliable data. At the time, we noted that the case of Hironobu Ueshima bore watching, given that his publication total runs to about 170.
The two retractions earlier this month came after an earlier one from the journal Medicine in April. Now, another anesthesia journal has retracted three more papers by Ueshima, citing misconduct, for a total of six.
The articles appeared in Regional Anesthesia & Pain Medicine in 2016 and 2019. According to the notice:
Often, when confronted with allegations of errors in papers they have published, journal editors encourage researchers to submit letters to the editor. Based on what we hear from such letter writers, however, the journals don’t make publication an easy process. Here’s one such story from a group at Indiana University: Luis M. Mestre, Stephanie L. Dickinson, Lilian Golzarri-Arroyo, and David B. Allison.
In late 2018, in the course of reviewing papers on obesity, one of us (DA) noticed a November 2018 article in the BMC journal Biomedical Engineering Online titled “Randomized controlled trial testing weight loss and abdominal obesity outcomes of moxibustion.” The objective of the study was to determine the effect of moxibustion therapy on weight loss, waist circumference, and waist-to-hip ratio in young Asian females living in Taiwan.
Some of the tabulated data in the paper seemed odd, so DA sent it to members of our research team, asking for their input. The research team agreed, finding some irregularities in the data that seemed inconsistent with a randomized experimental design. After that, the task of carefully and thoroughly checking the published summary statistics and text in the paper was delegated to another of us (LM), and all of his work was rechecked by professional statisticians and the research team.
The apparent inconsistencies and anomalies identified in the paper (i.e., large baseline differences, variance heterogeneity, and lack of details in the explanation of the study design) led to concerns about the extent to which the study report represented an accurate description of a properly conducted randomized controlled trial (RCT) and, therefore, whether the conclusions were reliable. Given the importance of reliable conclusions in the scientific literature on obesity treatment, as well as simply the integrity of the scientific literature overall, we decided to write a letter to the editor of the journal seeking either clarification or correction.
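For readers curious what a baseline-difference check of this kind can look like, here is a minimal sketch, not the team's actual analysis: reported group means, standard deviations, and sample sizes can be plugged into a two-sample t-test to gauge whether a baseline difference is larger than randomization would plausibly produce. The numbers below are invented for illustration and are not taken from the paper discussed above.

```python
# Illustrative sketch only: screening reported baseline summary statistics
# for differences that would be surprising under randomization.
# All figures are hypothetical, NOT from the moxibustion paper.
from scipy.stats import ttest_ind_from_stats

# Hypothetical reported baseline waist circumference (cm): mean, SD, n per arm
treatment = dict(mean=82.0, std=4.5, nobs=30)
control = dict(mean=78.0, std=4.8, nobs=30)

t_stat, p_value = ttest_ind_from_stats(
    mean1=treatment["mean"], std1=treatment["std"], nobs1=treatment["nobs"],
    mean2=control["mean"], std2=control["std"], nobs2=control["nobs"],
    equal_var=True,
)

# Under proper randomization, very small baseline p-values across several
# variables would be unexpected and worth querying with the authors.
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```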
Yesterday, dozens of scientists petitioned the Proceedings of the National Academy of Sciences to “retract a paper on the effectiveness of masks, saying the study has ‘egregious errors’ and contains numerous ‘verifiably false’ statements,” as The New York Times reported. One of those scientists was James Heathers, whose name will likely be familiar to Retraction Watch readers because of his work as a scientific sleuth. We asked him to share his thoughts on why he signed the letter.
The above took a few weeks. To many people, it must have looked like silence. But the entire time, out of the public eye, a furious and detailed global discussion between scientists, statisticians, and epidemiologists about the accuracy of this paper was boiling. Frankly, almost no one believed this paper. The data were too regular. The reported access to hospital databases was too unusual. The amount of work done didn’t fit the parameters reported.
It was … “off.” It was obvious. Something was amiss.
A review of scores of studies on antidepressants has been retracted because it used an incorrect analysis.
The original paper, published in JAMA Psychiatry on February 19, 2020, looked at individual differences in patients taking antidepressants and concluded that there were significant differences beyond the placebo effect or the data’s statistical noise. The paper earned some attention, including a story on MedPage Today.
However, the analysis didn’t hold up to scrutiny. The retraction notice reads:
A Springer Nature journal has issued an editor’s note — which seems an awful lot like an Expression of Concern — for a widely circulated but quickly contested paper about how the novel coronavirus might infect white blood cells, akin to HIV.
However, readers could be forgiven for missing that fact. Indeed, the journal itself appears to be struggling to deal with the article — which one of the corresponding authors told us he asked to withdraw weeks ago.
A journal has retracted two case reports by a prolific Japanese anesthesiologist who appears to be embroiled in a misconduct investigation.
The two case studies, in JA Clinical Reports, were written by Hironobu Ueshima and Hiroshi Otake, of Showa University Hospital in Tokyo. Ueshima has roughly 170 publications to his name, according to Google Scholar, so we’ll be closely watching for developments in this case.
A series of back-and-forth publications about a 12-year-old study of nursing education ended with some unusual editorial decisions.
Darrell Spurlock, a professor of nursing at Widener University and director of the university’s Leadership Center for Nursing Education Research, co-authored a study of the Health Education Systems, Inc. (HESI) nursing test in 2008. He and his colleague found that the test was a poor predictor of failure on the National Council Licensure Examination (NCLEX-RN).
More than a decade later, a critique of the paper, by Dreher et al., appeared out of the blue, published last year in Nursing Forum, a Wiley journal. Spurlock takes issue with the way his research was portrayed in the critique, which paints a more positive picture of the HESI test.