A few months ago, Dirk Werling discovered he had made a horrible mistake: He had inadvertently plagiarized in his recent review.
On January 20, Werling said he came across a 2016 paper while working on a grant and realized he had published some of its text in his 2018 review in Research in Veterinary Science. Werling — based at the Royal Veterinary College at the University of London — told Retraction Watch:
A surgery journal retracted a 2014 paper last month after discovering that the study had “no scientific validity.”
Mario Schietroma and his coauthors, based at the University of L’Aquila in Italy, reported that giving patients high concentrations of oxygen during and after colorectal surgery significantly reduced their risk of infection. Although the authors reported significant p-values, the retraction notice states that, “upon recalculation, no p-values were close to significant.” The University of L’Aquila told Retraction Watch it is investigating, but did not provide details.
After being “blindsided” a few months ago when she was told one of her 2005 papers was going to be retracted, a researcher scrambled to get information about why. And when she didn’t like the answers, she took to PubPeer.
Eight days ago, Shalon (Babbitt) Ledbetter, the first author of the 2005 paper published in Cell, posted a comment on the site announcing the paper was going to be retracted after the last author’s institution, Saint Louis University (SLU), determined that some figures had been manipulated by the last author, Dorota Skowyra. A letter dated September 2, 2015, sent by SLU to Cell, describes the results of the investigation — namely, that the manipulations were “cosmetic” and had no effect on the data or the conclusions. More than two years later, Ledbetter learned the journal was planning to retract the paper, and that an initial draft of the notice wouldn’t identify who was responsible; she has since been pulled into a confusing web of blame-shifting and conflicting information that has been, in her words, “heartbreaking.”
Originally published June 17, 2016, the paper was retracted Jan. 15. Led by corresponding author Xavier Altafaj, of the University of Barcelona (UB) and Bellvitge Biomedical Research Institute (IDIBELL), researchers described using an amino acid, D-serine, to treat a child with a rare genetic disorder that affects neurons.
According to the notice, the researchers did use D-serine in lab work that served as proof-of-concept; however, when it came time to try it in the patient, as a result of a “communication error”:
When Saidur Rahman learned last month that his 2010 review paper about nanoparticles in refrigeration systems had been retracted, he was concerned—no one at the journal had told him it was going to be pulled.
Rahman, a professor of engineering at Sunway University in Selangor, Malaysia, had recently corrected his 2010 review in Renewable and Sustainable Energy Reviews — specifically, in January, the journal published a two-page correction rewriting the parts of the paper that “appear close to some materials we had included in some of our other review research.” But Rahman was not anticipating a retraction.
When Nicholas Peppas, chair of engineering at the University of Texas at Austin, discovered one of his papers had been plagiarized, he decided to “go public!”
One of the most common reasons for retractions is image manipulation. When searching for evidence of it, researchers often rely on what their eyes tell them. But what if screening tools could help? Last week, researchers described a new automated tool to screen images for duplication (reported by Nature News); with help from publishing giant Elsevier, another group at Harvard Medical School is developing a different approach. We spoke with creators Mary Walsh, Chief Scientific Investigator in the Office for Professional Standards and Integrity, and Daniel Wainstock, Associate Director of Research Integrity, about how the tool works, and why — unlike the other recently described automated tool — they want to make theirs freely available.
Retraction Watch: What prompted you to develop this tool?
When Alexander Harms arrived at the University of Copenhagen in August 2016, as a postdoc planning to study a type of antibiotic resistance in bacteria, he carried with him a warning from another lab that had recruited him:
People said, “If you go there, you have to deal with these weird articles that nobody believes.”
The papers in question had been published in the Proceedings of the National Academy of Sciences in 2011 and Cell in 2013. Led by Kenn Gerdes, Harms’s new lab director, the work laid out a complex chain of events explaining how an E. coli bacterium can go into a dormant state, called persistence, that allows it to survive while the rest of its colony is wiped out.
Despite some experts’ skepticism, each paper had been cited hundreds of times. And Harms told us:
I personally did believe in the published work. There had been papers from others that kind of attacked [the Gerdes lab’s theory], but that was not high-quality work.