The authors of a 2015 study have retracted it after discovering that several Western blots in their paper “do not represent the experiments that were reported.”
They couldn’t check some of the original blots, because — according to the retraction notice in the American Journal of Physiology – Renal Physiology — they could not be located. The ones that could be found, however, are “inconsistent with what is presented in the figures.”
Daniel Bolnick is photographed at HHMI’s Janelia Farms campus on Wednesday, Oct. 9, 2013 in Ashburn, Va. (Kevin Wolf/AP Images for HHMI)
When an ecologist realized he’d made a fatal error in a 2009 paper, he did the right thing: He immediately contacted the journal (Evolutionary Ecology Research) to ask for a retraction. But he didn’t stop there: He wrote a detailed blog post outlining how he learned, in October 2016 after a colleague couldn’t reproduce his results, that he had misused a statistical tool in R, which ended up negating his findings entirely. We spoke to Daniel Bolnick at the University of Texas at Austin (and an early career scientist at the Howard Hughes Medical Institute) about what went wrong with his paper “Diet similarity declines with morphological distance between conspecific individuals,” and why he chose to be so forthright about it.
Retraction Watch: You raise a good point in your explanation of what went wrong with the statistical analysis: Eyeballing the data, they didn’t look significant. But when you plugged in the numbers (it turns out, incorrectly), they were significant, albeit weakly. So you reported the result. Did this teach you the importance of trusting your gut, and the so-called “eye test,” when looking at data?
If you think something is amiss with your data, running an experiment again to figure out what’s going on is a good move. But it’s not always possible.
A team of researchers in Seoul recently found themselves in a bind when they needed to check their work, but were out of a key substance: breast milk.
A physics journal has retracted a 2016 study after learning that the author published it without the knowledge or permission of the funder, which had a confidentiality agreement in place for the work.
According to the retraction notice in Applied Physics Letters, the paper also lifted content from other researchers without due credit. Given the “legal issue” associated with the breach of confidentiality, the journal has decided to remove the paper entirely.
When it comes to detecting image manipulation, the more tools you have at your disposal, the better. In a recent issue of Science and Engineering Ethics, Lars Koppers at TU Dortmund University in Germany and his colleagues present a new way to scan images. Specifically, they created open-source software that compares pixels within or between images, looking for similarities that can signal that portions of an image have been duplicated or deleted. Koppers spoke with us about the program, described in “Towards a Systematic Screening Tool for Quality Assurance and Semiautomatic Fraud Detection for Images in the Life Sciences,” and how others can use it to sleuth out fraud.
Retraction Watch: Can you briefly describe how your screening system works?
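The paper itself is the place to go for the authors’ actual method, but the general idea they describe — comparing pixel blocks within or between images and flagging near-identical regions — can be sketched in a few lines. The Python below is a hypothetical illustration only, not the software from Koppers and colleagues; the function name, the block size, and the mean-squared-difference criterion are our own assumptions.

```python
# Minimal sketch of block-wise pixel comparison for spotting duplicated
# image regions. Illustrative only; NOT the tool described by Koppers et al.
import numpy as np

def find_duplicate_blocks(img_a, img_b, block=32, tol=1e-6):
    """Compare non-overlapping block-sized patches of two grayscale images
    (2-D numpy arrays) and return pairs of positions whose pixel values are
    nearly identical, which may indicate a copied region."""
    matches = []
    ah, aw = img_a.shape
    bh, bw = img_b.shape
    for ay in range(0, ah - block + 1, block):
        for ax in range(0, aw - block + 1, block):
            patch = img_a[ay:ay + block, ax:ax + block].astype(float)
            for by in range(0, bh - block + 1, block):
                for bx in range(0, bw - block + 1, block):
                    other = img_b[by:by + block, bx:bx + block].astype(float)
                    # A mean squared difference near zero means the two
                    # blocks are (almost) pixel-for-pixel identical.
                    if np.mean((patch - other) ** 2) < tol:
                        matches.append(((ay, ax), (by, bx)))
    return matches

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.integers(0, 256, (128, 128))
    b = rng.integers(0, 256, (128, 128))
    b[32:64, 32:64] = a[0:32, 0:32]   # plant a copied block for the demo
    print(find_duplicate_blocks(a, b))  # reports ((0, 0), (32, 32))
```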
On December 31, 2014, a pioneer in the study of inflammatory bowel disease passed away. An obituary published shortly thereafter in the Journal of Digestive Diseases is typical enough: It describes his achievements, his importance to his patients, and his battle with pancreatic cancer.
This is the first time we’ve seen an obituary pulled from a journal. Unfortunately, this was not a case of a premature obituary (which happens more often than you’d think); the researcher did actually die, but it appears the journal published the obituary in the wrong place.
The move comes after a group of researchers alleged that the paper was missing data and that the authors followed a problematic methodology. In September, however, the co-authors’ institution, Uppsala University in Sweden, concluded there wasn’t enough evidence to launch a misconduct investigation.
After the article was published in 2015, the Center for Science in the Public Interest (CSPI) organized a letter signed by more than 100 researchers, urging the publication to retract the article. Today, the journal said it found “no grounds” to do so.
However, in a press release accompanying the announcement of the correction, the BMJ notes that some aspects of the CSPI’s criticisms were merited.
The University of Tokyo is investigating a 2011 stem cell paper in Cell Cycle, recently retracted over irregularities in four figures.
The university has confirmed there is an investigation but would not specify which paper it concerns; the paper’s corresponding author, however, confirmed to us that his article is the focus.