The authors of a 2015 study have retracted it after discovering that several Western blots in their paper “do not represent the experiments that were reported.”
They couldn’t check some of the original blots because, according to the retraction notice in the American Journal of Physiology – Renal Physiology, those blots could not be located. The blots that could be found, however, are “inconsistent with what is presented in the figures.”
When an ecologist realized he’d made a fatal error in a 2009 paper, he did the right thing: He immediately contacted the journal (Evolutionary Ecology Research) to ask for a retraction. But he didn’t stop there: He wrote a detailed blog post outlining how he learned, in October 2016, after a colleague couldn’t reproduce his results, that he had misused a statistical tool (in R programming), which ended up negating his findings entirely. We spoke to Daniel Bolnick at the University of Texas at Austin (and an early career scientist at the Howard Hughes Medical Institute) about what went wrong with his paper “Diet similarity declines with morphological distance between conspecific individuals,” and why he chose to be so forthright about it.
Retraction Watch: You raise a good point in your explanation of what went wrong with the statistical analysis: Eyeballing the data, the results didn’t look significant. But when you plugged in the numbers (incorrectly, it turns out), they were significant, albeit weakly. So you reported the result. Did this teach you the importance of trusting your gut, and the so-called “eye-test,” when looking at data?
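Bolnick’s account describes a coding mistake, not a flaw in the statistics themselves. As a purely hypothetical sketch of how such a slip can happen in R (the variable names and the particular bug below are our invention for illustration, not his actual analysis), consider what happens when two paired vectors are sorted independently before a correlation test:

```r
# Hypothetical sketch only -- not Bolnick's actual code or data.
# Shows how a small data-handling slip in R can turn a weak,
# non-significant relationship into a strongly "significant" one.
set.seed(42)

diet_dist  <- runif(50)                       # simulated pairwise diet distances
morph_dist <- diet_dist + rnorm(50, sd = 2)   # weak true relationship, heavy noise

# Correct analysis: observations stay paired; the association is weak.
cor.test(diet_dist, morph_dist)

# Buggy analysis: each vector is sorted on its own, so row i of one
# no longer corresponds to row i of the other. Two independently
# sorted vectors are both monotone increasing, which manufactures a
# spurious, highly "significant" correlation.
cor.test(sort(diet_dist), sort(morph_dist))
```

The moral of the sketch is Bolnick’s own: when a computed p-value contradicts what the scatterplot shows, the code, not the eyeball, is the first thing to check.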
After the article was published in 2015, the Center for Science in the Public Interest (CSPI) organized a letter, signed by more than 100 researchers, urging the BMJ to retract the article. Today, the journal said it found “no grounds” to do so.
However, in a press release accompanying the announcement of the correction, the BMJ notes that some aspects of the CSPI’s criticisms were merited.
PubPeer has suffered a setback in an ongoing lawsuit filed by a scientist who alleges the site’s anonymous commenters cost him a job.
This week, judges on the Michigan Court of Appeals denied a request by the American Civil Liberties Union, which is representing PubPeer, to include an investigative report as evidence in the case. The report, by Wayne State University, found that the plaintiff, Fazlul Sarkar, had committed widespread misconduct and should retract scores of papers.
That’s the sound of learning that a third scientist you worked with committed misconduct.
In the last two years, we reported on two retractions for neuroscientist Stanley Rapoport, the result of misconduct by two different first authors. We’ve since discovered more retractions resulting from those cases — and a new retraction stemming from the actions of yet another co-author.
Although the latest retraction notice doesn’t reveal the reason for retraction, both the journal editor and Rapoport — based at the National Institute on Aging (NIA), part of the National Institutes of Health (NIH) — confirmed to us that it is the result of misconduct by the last author, Jagadeesh Rao. According to Rapoport, a “number of retractions [for] Rao are still in the works.”
We asked Rapoport for his reaction to multiple cases of misconduct by his colleagues, including the two first authors we’ve already reported on, Fei Gao and Mireille Basselin:
Too many of us have sat through too many bad presentations. And no one wants to give one, either.
Someone who’s seen his fair share of bad talks is David Sholl, of the Georgia Institute of Technology. In this helpful video, “The Secrets of Memorably Bad Presentations,” he presents some tongue-in-cheek advice on how to torture your audience.
(If it’s not immediately obvious: Sholl wants you to do the exact opposite of what he suggests below.)
Last year, a cancer researcher wrote to the Journal of Biological Chemistry, asking to correct one of his papers. The journal responded by requesting the raw data used to prepare his figures. Then, in a follow-up request, it asked for raw data behind the figures in 20 additional published articles.
And when all was said and done six months later, Jin Cheng ended up with far more than just a single correction: Last month, the journal issued withdrawals for 19 of his papers — including the paper he originally asked to correct — along with one correction.
We’ve pieced together some clues about what happened after reviewing correspondence between representatives of JBC and Moffitt Cancer Center, where Cheng conducted his research. A spokesperson for Moffitt confirmed that the retractions did not stem from an institutional investigation, but that the institution is now conducting one.
That’s not how retractions typically happen: Journals often lack the resources to conduct investigations themselves, so institutions usually take the lead in double-checking papers and, if necessary, contacting the journal to initiate a retraction. Here, the opposite seems to have taken place.
A study linking vaccines to autism and other neurological problems has been removed by a Frontiers journal after drawing heavy criticism since its acceptance last week.
The abstract, published online in Frontiers in Public Health after being accepted November 21, reported findings from anonymous online questionnaires completed by 415 mothers of home-schooled children aged 6 to 12. Nearly 40 percent of the children had not been vaccinated, and those who had were three times more likely to be diagnosed with neurodevelopmental disorders such as autism, the study found.