Archive for the ‘methodological problems’ Category
When an ecologist realized he’d made a fatal error in a 2009 paper, he did the right thing: He immediately contacted the journal (Evolutionary Ecology Research) to ask for a retraction. But he didn’t stop there: He wrote a detailed blog post outlining how he learned — in October 2016, after a colleague couldn’t recreate his results — that he had misused a statistical tool (in the R programming language), which ended up negating his findings entirely. We spoke to Daniel Bolnick at the University of Texas at Austin (and an early career scientist at the Howard Hughes Medical Institute) about what went wrong with his paper “Diet similarity declines with morphological distance between conspecific individuals,” and why he chose to be so forthright about it.
Retraction Watch: You raise a good point in your explanation of what went wrong with the statistical analysis: Eyeballing the data, they didn’t look significant. But when you plugged in the numbers (incorrectly, it turns out), they were significant — albeit weakly. So you reported the result. Did this teach you the importance of trusting your gut, and the so-called “eye test,” when looking at data?
Three psychiatric studies of children contained a myriad of problems that may have put participants at greater risk than was disclosed by consent forms, according to a 2014 letter sent to hundreds of the participants and their families.
Through a public records request, we’ve obtained a copy of the letter — which lists a host of problems in the studies, ranging from enrolling ineligible patients and failing to inform families of the risks associated with the studies to skipping tests intended to minimize the risks associated with lithium.
In 2013, Mani Pavuluri told the University of Illinois at Chicago that one of her study participants had been hospitalized — an event which prompted the university to halt three of her studies, launch a misconduct probe, and send letters to approximately 350 families of children participating in the research, notifying them of what happened.
The letter concludes:
For starters, the first author — Maria Riccardi of the National Research Council of Italy-Institute for Agricultural and Forest Systems in the Mediterranean (CNR-ISAFOM) in Ercolano, Naples, Italy — apparently submitted the paper without consulting the study’s four other listed co-authors. What’s more, according to the retraction notice in Scientia Horticulturae, the paper’s description of the experiment “does not reflect the real conditions under which the data was collected,” rendering the findings invalid.
Two papers evaluating glucose meters — used by diabetics to monitor blood sugar levels — suggested that a couple of the devices don’t work as well as they should. Perhaps unsurprisingly, the companies that sell those meters objected to how the studies were conducted. By all accounts, the companies appear to be justified in their complaints.
In both cases, researchers used blood drawn from veins to test the meters. But manufacturers of the WaveSense JAZZ and GlucoRx glucose meters said their devices are designed to work with fresh blood from a finger-prick. Both papers have now been retracted.
The retraction notice for “Technical and clinical accuracy of five blood glucose meters: clinical impact assessment using error grid analysis and insulin sliding scales,” published in 2015 in the Journal of Clinical Pathology, hints at the issue:
Researchers have retracted and replaced a 2014 paper in JAMA Internal Medicine after realizing a number of errors had affected the findings.
The authors note the mistakes do not have a significant impact on the overall proportion of heart patients who participated in cardiac rehab. However, a number of findings were affected, such as differences in cardiac rehab participation by race, and how overall participation has changed over the years.
Therefore, JAMA Internal Medicine has published a lengthy notice of retraction and replacement, which explains the errors made in the original paper, and has replaced the original article with a corrected version of the study.
A few months ago, an author alerted us to two retractions — including one in PNAS — after realizing his team had been using plants affected by inadvertent genotyping errors for an entire year. He initially told us these were the only two papers affected, but more recently reached out to say he had to pull a follow-up article, as well.
Recently, Steven C. Huber contacted us about the newest retraction, noting he was submitting a notice to the editor of Plant Signaling and Behavior:
In November 2015, we reported on a retraction for Mani Pavuluri in the Journal of Psychiatry & Neuroscience following a probe at the University of Illinois at Chicago, her institution, which concluded that there was a “preponderance of evidence” that Pavuluri had committed misconduct.
After an “unanticipated event” took place during a study, three studies by Pavuluri were halted and a letter was sent out to 350 research subjects, informing them of errors in the work. At the time, a University of Illinois spokesperson noted that Pavuluri — who, according to her LinkedIn page, is a Distinguished Fellow of the American Academy of Child and Adolescent Psychiatry — had also been asked to retract two 2013 studies in the Journal of Affective Disorders. Those papers have now been retracted, with notices stating that Pavuluri “intentionally and knowingly” misrepresented children’s medication history.
Oh, well — “love hormone” doesn’t reduce psychiatric symptoms, say researchers in request to retract
It turns out that snorting the so-called “love hormone” may not help reduce psychiatric symptoms such as depression and anxiety.
At least, that’s the conclusion now reached by the authors of a 2015 meta-analysis, which initially found that intranasal doses of oxytocin could reduce psychiatric symptoms. After a pair of graduate students pointed out flaws in the paper, the authors realized they’d made some significant errors — and that oxytocin shows no more benefit than a placebo.
Sarah Darby, last author of the now-retracted paper from the University of Oxford, UK, told Retraction Watch that the mistake was made by a doctoral student. When the error was realized, Darby said, she contacted the Journal of Clinical Oncology (JCO), explained the issue, and asked whether they would prefer a retraction or a correction. JCO wanted a retraction, and she complied.
The journal allowed the authors to publish a correspondence article outlining their new results.
Interestingly, two authors of the newly retracted papers — Yu-Tao Xiang from the University of Macau in China and Gabor Ungvari from the University of Western Australia — also recently co-authored another paper on an entirely different topic that has received a lengthy correction. That paper — on the use of organs from executed prisoners in China — raised controversy for allegedly reporting a “sanitized” account of the practice. The correction notice, in the Journal of Medical Ethics, was accompanied by a critics’ rebuttal to the paper.
According to Xiang, the newly retracted papers in The Journal of ECT — which examined the efficacy of ECT in treating schizophrenia — were pulled due to “genuine errors” resulting from differences in language. All the authors agree with the retraction, Xiang noted.
Xiang told us: