The U.S. Office of Research Integrity has sanctioned Bryan William Doreian, a former postdoc in dermatology at Case Western Reserve University in Cleveland, for falsifying data in his dissertation and a 2009 paper in Molecular Biology of the Cell (which provided a cover image for the journal).
ORI says Doreian’s bad NIH-funded data also appeared in a manuscript submitted to, but never published in, Nature Medicine.
A team of Swiss microbiologists have retracted their 2012 paper in PLoS One on the genetics of the TB mycobacterium after learning that the fusion protein they thought they’d used in their study was in fact a different molecule.
Last year, an audit by the U.S. Government Accountability Office found “a potential for unnecessary duplication” among the billions of dollars in research grants funded by national agencies. Some researchers, it seemed, could be winning more than one grant to do the same research.
Prompted by that report, Virginia Tech’s Skip Garner and his colleagues used eTBLAST, a text-similarity tool Garner invented, to review more than 630,000 grant applications submitted to the NIH, NSF, Department of Defense, Department of Energy, and Susan G. Komen for the Cure, “the largest charitable funder of breast cancer research in the United States.” The approach was not unlike those used by publishers to identify potentially duplicated articles.
In a Comment published today in Nature, they report that they found 1,300 applications above a “similarity score” cutoff of 0.8 for federal agencies, and 0.65 for Komen documents — “with 1 indicating identical text in two same-length documents, and more than 1 representing identical text in one piece that is longer than the other.”
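To make the quoted score semantics concrete, here is a minimal toy sketch of a similarity score with those properties: two identical same-length documents score 1, and a longer document that repeats the other’s text can score above 1. This is not eTBLAST’s actual algorithm, just an illustrative stand-in built on word n-gram overlap.

```python
from collections import Counter

def ngrams(text, n=3):
    """Word n-grams of a document, lowercased."""
    words = text.lower().split()
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

def similarity_score(doc, reference, n=3):
    """Count occurrences in `doc` of the reference's word n-grams,
    normalized by the reference's own n-gram count.

    Identical same-length documents score 1.0; a `doc` that contains
    the reference's text more than once scores above 1.0."""
    ref_grams = Counter(ngrams(reference, n))
    doc_grams = Counter(ngrams(doc, n))
    if not ref_grams:
        return 0.0
    matched = sum(doc_grams[g] for g in ref_grams)
    return matched / sum(ref_grams.values())

aims = "specific aims of this proposal are to test the hypothesis"
print(similarity_score(aims, aims))                # identical documents -> 1.0
print(similarity_score(aims + " " + aims, aims))   # text repeated in a longer piece -> 2.0
```

The hypothetical grant-application snippet and the n-gram size are placeholders; a production tool like eTBLAST works on full documents with far more sophisticated matching.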
A group of surgeons in Cairo, Egypt, have retracted their 2012 paper in the International Journal of Women’s Health for plagiarism, although that’s not quite what they’re calling it.
A new study in Clinical Chemistry paints an alarming picture of how often scientists fail to deposit data that they’re supposed to — and, perhaps not surprisingly, papers whose authors did submit such data scored higher on a quality scale than those whose authors didn’t.
Ken Witwer, a pathobiologist at Johns Hopkins, was concerned that many microarray-based microRNA (miRNA) studies weren’t complying with the Minimum Information About a Microarray Experiment (MIAME) standards supposedly required by journals. So he looked at 127 such papers published between July 2011 and April 2012 in journals including PLOS ONE, the Journal of Biological Chemistry, Blood, and Clinical Chemistry, assigning each one a quality score and checking whether the authors had followed the guidelines.
We get accused of grabbing at cheap puns around here, but the headline above is meant to be taken straight up.
Three journals in the food sciences are retracting a trio of papers published last year on bacterial contamination in pork products because the articles used the same data sets — a classic (Platonic?) case of “salami slicing.”
The Journal of Food Protection, which published one of the articles, “Performance of three culture media commonly used for detecting Listeria monocytogenes,” has the following retraction notice:
A University of Wisconsin neuroscience researcher falsified “Western blot images as well as quantitative and statistical data” in two NIH-supported papers and three unfunded grant applications, the U.S. Office of Research Integrity (ORI) has found.