Archive for the ‘doing the right thing’ Category
Several years ago, a group of four chemists believed they had stumbled upon evidence that contradicted a fairly well-established model in fluid dynamics.
Between 2013 and 2015, the researchers published a series of four papers detailing their results — two in ACS Macro Letters and two in Macromolecules. Timothy P. Lodge, the journals’ editor and a distinguished professor at the University of Minnesota in Minneapolis, explained that the results were “somewhat controversial,” because they appeared to contradict the generally accepted model for how some polymer fluids move.
Indeed, the papers sparked debate between the authors and other experts who questioned the new data, arguing that they didn’t upend the previous model.
Then, in 2015, the authors realized their critics might be correct.
Are there two types of retractions? One that results from a form of misconduct, such as plagiarism or manipulating figures, and another that results from “honest errors,” or genuine mistakes the authors have owned up to? More and more research suggests that the community views each type very differently, and doesn’t shun researchers who make mistakes and try to correct the record. In yet another piece of evidence, Daniele Fanelli and his colleagues recently published the results of their interviews with 14 scientists who retracted papers for honest errors between 2010 and 2015. Although much of what the scientists said affirmed what Fanelli – based at METRICS (the Meta-Research Innovation Center) at Stanford University – has long argued about retractions due to honest error, some of their answers surprised him.
Retraction Watch: We’ve seen the community reward scientists who retract papers for honest error, including a 2013 paper that showed no citation penalty for researchers who self-retract. Yet the interviewees said they were surprised to realize there weren’t any negative consequences to their self-retractions (some even got kudos for doing it). Why do you think people don’t realize how the community will view honest error?
Only days after his paper was published online, a neuroscientist posted a comment on PubMed alerting readers to several duplication errors.
Despite the issues, which the researcher says were introduced into the final manuscript after peer review, he reassured readers that the errors do not affect the paper’s final conclusions.
On February 9, ten days after the article came online, corresponding author Garret Stuber at the University of North Carolina at Chapel Hill wrote a detailed comment on PubMed Commons, explaining that the “research community” had brought four figure-related errors to his attention. After investigating the concerns, Stuber discovered that the problems emerged after the peer-review process, “while revising the manuscript to comply with Nature Neuroscience’s final formatting guidelines.” In his note, he outlined the specific duplication issues that arose, which he says he plans to detail to the journal in a formal corrigendum letter.
Although it’s the right thing to do, it’s never easy to admit error — particularly when you’re an extremely high-profile scientist whose work is being dissected publicly. So while it’s not a retraction, we thought this was worth noting: A Nobel Prize-winning researcher has admitted on a blog that he relied on weak studies in a chapter of his bestselling book.
The blog — by Ulrich Schimmack, Moritz Heene, and Kamini Kesavan — critiqued the citations included in a book by Daniel Kahneman, a psychologist whose research has illuminated our understanding of how humans form judgments and make decisions and earned him half of the 2002 Nobel Prize in Economics.
Four of the newly corrected papers have a common last and corresponding author: Luke O’Neill of Trinity College Dublin in the Republic of Ireland. O’Neill is also a co-author of the remaining paper that was fixed. O’Neill told us the mistakes were a “bit sloppy,” noting that he takes responsibility for the errors in the four papers on which he is last author.
O’Neill forwarded Retraction Watch a comment he received from Kaoru Sakabe — data integrity manager at the American Society for Biochemistry and Molecular Biology (which publishes The Journal of Biological Chemistry (JBC)).
2013 probably felt like it was going to be a great year for stem cell biologist Douglas Melton at Harvard. He had published a buzz-worthy paper in Cell about a new way to potentially boost insulin in diabetics, attracting significant media attention, and eventually gathering nearly 200 citations.
But 2016 is closing on a less positive note for Melton — today, he and his colleagues are retracting the paper, after multiple labs (including his own) couldn’t reproduce the findings.
Although the lab has itself already published two articles casting doubt on the original findings, Melton told Retraction Watch he chose to retract the paper to ensure there was no confusion about the original paper’s validity.
When an ecologist realized he’d made a fatal error in a 2009 paper, he did the right thing: He immediately contacted the journal (Evolutionary Ecology Research) to ask for a retraction. But he didn’t stop there: He wrote a detailed blog post outlining how he learned — in October 2016, after a colleague couldn’t recreate his data — that he had misused a statistical tool (in R programming), which ended up negating his findings entirely. We spoke to Daniel Bolnick at the University of Texas at Austin (and an early career scientist at the Howard Hughes Medical Institute) about what went wrong with his paper “Diet similarity declines with morphological distance between conspecific individuals,” and why he chose to be so forthright about it.
Retraction Watch: You raise a good point in your explanation of what went wrong with the statistical analysis: Eyeballing the data, they didn’t look significant. But when you plugged in the numbers (it turns out, incorrectly), they were significant – albeit weakly. So you reported the result. Did this teach you the importance of trusting your gut, and the so-called “eye-test” when looking at data?
Do pro-nuclear energy countries act more slowly to curb the effects of climate change? That’s what a paper published in July in the journal Climate Policy claimed. But the hotly debated study was retracted last week after the authors realized it contained serious errors.
A few months ago, an author alerted us to two retractions — including one in PNAS — after realizing his team had been using plants affected by inadvertent genotyping errors for an entire year. He initially told us these were the only two papers affected, but more recently reached out to say he had to pull a follow-up article, as well.
Recently, Steven C. Huber contacted us about the newest retraction, noting he was submitting a notice to the editor of Plant Signaling &amp; Behavior:
Oh, well — “love hormone” doesn’t reduce psychiatric symptoms, say researchers in request to retract
It turns out, snorting the so-called “love hormone” may not help reduce psychiatric symptoms such as depression and anxiety.
At least, that’s the conclusion now reached by the authors of a 2015 meta-analysis, which initially found intranasal doses of oxytocin could reduce psychiatric symptoms. After a pair of graduate students pointed out flaws in the paper, the authors realized they’d made some significant errors, and that oxytocin shows no more benefit than placebo.