Archive for the ‘methodological problems’ Category
Sometime in the middle of 2015, Jennifer Byrne, professor of molecular oncology at the University of Sydney, began her journey from cancer researcher to scientific literature sleuth, seeking out potentially problematic papers.
The first step came when she noticed several papers containing a mistake in a DNA construct that, she believed, meant the papers were not actually testing the gene in question, which has been linked to multiple cancer types. She began writing to the journal editors and researchers, with mixed success. But less than two years later, two of the five papers she flagged have already been retracted.
When asked why she spent time away from bench research to examine this issue, Byrne told us: Read the rest of this entry »
Gearóid Ó Faoleán, ethics and integrity manager at Frontiers, which publishes Frontiers in Plant Science, told us:
In accordance with our complaints protocol, the Field Chief Editor led the investigation that resulted in the decision to retract the paper.
One of the retracted papers, in the Journal of Neurosurgery (JNS), had multiple problems relating to authorship, data analyses, and patient enrollment that were “too extensive to revise,” according to the lengthy retraction notice. The notice is signed by first author Hua Liu of Nanjing Medical University in China.
Liu is also the first author of another recently retracted paper in Frontiers in Neuroscience, pulled for incorrectly categorizing patients.
An ecologist in Australia realized a database he was using to spot trends in extinction patterns was problematic, affecting two papers. One journal issued an expression of concern, which has since turned into a retraction. So far, the other journal has left the paper untouched.
The now-retracted paper concluded that medium-sized mammal species on islands tend to go extinct more often than large or small species. But a little over a year ago, Biology Letters flagged the paper with an expression of concern (EOC), noting “concerns regarding the validity of some of the data and methods used in the analysis.”
Zhao Kai, the study’s first author from the Qilu Hospital of Shandong University and Zibo Central Hospital (both in China), took full responsibility for the error.
After the Journal of Medical Entomology (JME) published the study — about the identification of genes that enable an insect to detect odors — an outside researcher wrote a letter to the journal highlighting flaws in the paper. The journal then asked the authors to respond, and enlisted two additional peer reviewers to look into the study, the outside comment, and the authors’ response. They concluded the paper should be retracted.
William Reisen — the journal’s editor-in-chief from the University of California, Davis — said the journal believes the errors were unintentional and there was no fraud on the authors’ part. He added: Read the rest of this entry »
When an ecologist realized he’d made a fatal error in a 2009 paper, he did the right thing: He immediately contacted the journal (Evolutionary Ecology Research) to ask for a retraction. But he didn’t stop there: He wrote a detailed blog post outlining how he learned, in October 2016, after a colleague couldn’t recreate his data, that he had misused a statistical tool in R, which negated his findings entirely. We spoke to Daniel Bolnick of the University of Texas at Austin (and an early career scientist at the Howard Hughes Medical Institute) about what went wrong with his paper “Diet similarity declines with morphological distance between conspecific individuals,” and why he chose to be so forthright about it.
Retraction Watch: You raise a good point in your explanation of what went wrong with the statistical analysis: Eyeballing the data, they didn’t look significant. But when you plugged in the numbers (it turns out, incorrectly), they were significant – albeit weakly. So you reported the result. Did this teach you the importance of trusting your gut, and the so-called “eye-test” when looking at data? Read the rest of this entry »
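For readers who want to see how such a slip can play out, here is a minimal, hypothetical sketch. It is written in Python rather than the R that Bolnick actually used, and it is not his code or his specific mistake; it simply illustrates one common class of error, treating non-independent pairwise comparisons as if they were independent observations, which can make a pattern look nominally “significant” even when the raw data are unconvincing by eye.

# Hypothetical sketch, not Bolnick's actual analysis or his specific mistake.
# Shows how treating all pairwise comparisons between individuals as
# independent samples can overstate the evidence for a relationship.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30
morphology = rng.normal(size=n)   # one trait measurement per individual
diet = rng.normal(size=n)         # an unrelated diet score: no true effect

# Per-individual test: 30 independent data points.
r_ind, p_ind = stats.pearsonr(morphology, diet)
print(f"per-individual: r = {r_ind:.3f}, p = {p_ind:.3f}")

# The slip: expand to all 435 pairwise distances and run an ordinary
# correlation test on them. The pairs are not independent, so the nominal
# p-value overstates the evidence; a permutation-based approach such as a
# Mantel test is the standard remedy for distance-matrix data.
i, j = np.triu_indices(n, k=1)
morph_dist = np.abs(morphology[i] - morphology[j])
diet_dist = np.abs(diet[i] - diet[j])
r_pair, p_pair = stats.pearsonr(morph_dist, diet_dist)
print(f"all-pairs:      r = {r_pair:.3f}, p = {p_pair:.3f}")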
Three psychiatric studies of children contained a myriad of problems that may have put participants at greater risk than was disclosed by consent forms, according to a 2014 letter sent to hundreds of the participants and their families.
Through a public records request, we’ve obtained a copy of the letter, which lists a host of problems in the studies, including enrolling ineligible patients, failing to inform families of the risks associated with the studies, and skipping tests intended to minimize the risks of lithium.
In 2013, Mani Pavuluri told the University of Illinois at Chicago that one of her study participants had been hospitalized — an event which prompted the university to halt three of her studies, launch a misconduct probe, and send letters to approximately 350 families of children participating in the research, notifying them of what happened.
The letter concludes:
For starters, the first author, Maria Riccardi of the National Research Council of Italy’s Institute for Agricultural and Forest Systems in the Mediterranean (CNR-ISAFOM) in Ercolano, Naples, apparently submitted the paper without consulting the study’s four other listed co-authors. What’s more, according to the retraction notice in Scientia Horticulturae, the paper’s description of the experiment “does not reflect the real conditions under which the data was collected,” rendering the findings invalid.
Two papers evaluating glucose meters — used by diabetics to monitor blood sugar levels — suggested that a couple of the devices don’t work as well as they should. Perhaps unsurprisingly, the companies that sell those meters objected to how the studies were conducted. By all accounts, the companies appear to be justified in their complaints.
In both cases, researchers used blood drawn from veins to test the meters. But manufacturers of the WaveSense JAZZ and GlucoRx glucose meters said their devices are designed to work with fresh blood from a finger-prick. Both papers have now been retracted.
The retraction notice for “Technical and clinical accuracy of five blood glucose meters: clinical impact assessment using error grid analysis and insulin sliding scales,” published in 2015 in the Journal of Clinical Pathology, hints at the issue: