The FASEB Journal has retracted a 2012 paper by a group from the University of Alabama at Birmingham (UAB) looking at the role of a tumor-suppressing micro-RNA in pulmonary fibrosis. The retraction suggests the provenance of the data is in question, and we've learned some details about what went wrong.
A paper that shares a first author with a paper retracted in December has been corrected.
Late last year, we reported on a retraction in Antioxidants & Redox Signaling (ARS) by Indika Edirisinghe, who was at the University of Rochester when the original paper was published, and colleagues. On January 17, the Journal of Agricultural and Food Chemistry published a correction to “Effect of Black Currant Anthocyanins on the Activation of Endothelial Nitric Oxide Synthase (eNOS) in Vitro in Human Endothelial Cells,” on which Edirisinghe is also first author.
The U.S. Office of Research Integrity has sanctioned Bryan William Doreian, a former postdoc in dermatology at Case Western Reserve University in Cleveland, for falsifying data in his dissertation and in a 2009 paper in Molecular Biology of the Cell (which supplied the journal's cover image).
ORI says Doreian’s bad NIH-funded data also appeared in a manuscript submitted to, but never published in, Nature Medicine.
A team of Swiss microbiologists has retracted its 2012 paper in PLoS One on the genetics of the TB mycobacterium after learning that the fusion protein the researchers thought they'd used in their study was in fact a different molecule.
Last year, an audit by the U.S. Government Accountability Office found “a potential for unnecessary duplication” among the billions of dollars in research grants funded by national agencies. Some researchers, it seemed, could be winning more than one grant to do the same research.
Prompted by that report, Virginia Tech's Skip Garner and his colleagues used eTBLAST, a text-similarity tool Garner invented, to review more than 630,000 grant applications submitted to the NIH, NSF, Department of Defense, Department of Energy, and Susan G. Komen for the Cure, “the largest charitable funder of breast cancer research in the United States.” The approach is not unlike the one publishers use to screen for potentially duplicated articles.
In a Comment published today in Nature, they report that they found 1,300 applications above a “similarity score” cutoff of 0.8 for the federal agencies and 0.65 for Komen documents, “with 1 indicating identical text in two same-length documents, and more than 1 representing identical text in one piece that is longer than the other.”
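As a rough illustration of how such a cutoff gets applied, here is a minimal sketch in Python. It is not eTBLAST, and the scoring formula (shared words divided by the shorter document's word count, which tops out at 1 rather than exceeding it as the quoted score can) is an assumption made purely for demonstration; only the cutoff values come from the Comment, and the application IDs and texts are invented.

```python
# Toy illustration (not eTBLAST): flag grant-application pairs whose text
# similarity meets an agency-specific cutoff, as described in the Comment.
# The scoring formula below is an assumption for demonstration only.

from itertools import combinations

CUTOFFS = {"federal": 0.8, "komen": 0.65}  # cutoffs quoted in the Comment


def similarity(text_a: str, text_b: str) -> float:
    """Crude stand-in score: shared words / word count of the shorter text."""
    words_a, words_b = text_a.lower().split(), text_b.lower().split()
    shared = len(set(words_a) & set(words_b))
    return shared / min(len(words_a), len(words_b))


def flag_duplicates(applications: dict[str, str], cutoff: float) -> list[tuple[str, str, float]]:
    """Return every pair of applications scoring at or above the cutoff."""
    flagged = []
    for (id_a, text_a), (id_b, text_b) in combinations(applications.items(), 2):
        score = similarity(text_a, text_b)
        if score >= cutoff:
            flagged.append((id_a, id_b, score))
    return flagged


if __name__ == "__main__":
    apps = {
        "NIH-001": "we will study the role of microRNA in pulmonary fibrosis",
        "DOD-042": "we will study the role of microRNA in pulmonary fibrosis progression",
        "NSF-007": "a survey of tuberculosis genetics in mycobacterium strains",
    }
    print(flag_duplicates(apps, CUTOFFS["federal"]))
```

On the toy data this flags only the near-verbatim NIH/DOD pair, the same kind of cross-agency overlap Garner's team was hunting for.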
A group of surgeons in Cairo, Egypt, has retracted its 2012 paper in the International Journal of Women's Health for plagiarism, although that's not quite what the authors are calling it.
A new study in Clinical Chemistry paints an alarming picture of how seldom scientists deposit the data they're supposed to. Perhaps not surprisingly, papers whose authors did submit such data scored higher on a quality scale than those whose authors didn't.
Ken Witwer, a pathobiologist at Johns Hopkins, was concerned that many studies involving microarray-based microRNA (miRNA) profiling weren't complying with the Minimum Information About a Microarray Experiment (MIAME) standards that journals supposedly require. So he looked at 127 such papers published between July 2011 and April 2012 in journals including PLOS ONE, the Journal of Biological Chemistry, Blood, and Clinical Chemistry, assigning each one a quality score and checking whether the authors had followed the guidelines.