A researcher who admitted in 2012 to “intentional and systematic manipulation” of data and had two papers retracted has been banned from funding by the German Research Foundation (DFG).
This one seems like an honest mistake: a paper on dietary supplements during pregnancy has been retracted based on an error in data recording.
In the BMC Pregnancy and Childbirth paper, “Folic acid supplementation, dietary folate intake during pregnancy and risk for spontaneous preterm delivery: a prospective observational cohort study,” women for whom the researchers had no data on folic acid supplementation were classified as taking no supplements. Despite the error, the authors say the overall conclusion stands: taking folic acid supplements didn’t protect women from preterm deliveries.
After earning an erratum shortly after publication in 2009, a paper in Applied Physics Letters has now been retracted for the “regrettable mistake” of duplicating an earlier paper by the researchers.
After typing up 96 citations, researchers from the National Institute for Digestive Diseases, I.R.C.C.S. “S. de Bellis,” in Bari, Italy, apparently ran out of steam for the last five, earning themselves a retraction for plagiarism in a literature review of the effects of probiotics on intestinal cancer.
A scientific publishing gadfly who was banned earlier this year from an Elsevier journal for “personal attacks and threats” has had a paper rejected by a Springer journal after he called for the editor’s resignation because of alleged incompetence.
As detailed in a comment left at Retraction Watch, Jaime A. Teixeira da Silva submitted a manuscript titled “One Conjunction, a World of Ethical Difference: How Elsevier, the ICMJE and Neurology Define Authorship” to Science and Engineering Ethics on November 11, 2012. As of last week, despite a number of messages sent to editors of the journal, he had not had a decision on the manuscript.
In the meantime, the plasma scientists withdrew their paper from consideration and submitted it to IEEE Transactions on Plasma Science, where it was published in February 2013. Unfortunately, in the four-year delay between the conference and the Institute of Physics publication, the withdrawal request got lost.
From Larry Summers to James Watson, certain scientists have a long and questionable tradition of using “data” to make claims about intelligence and aptitude.
So it’s no surprise that, when well-known computer scientist Richard Bornat claimed his PhD student had created a test to separate people who would succeed at programming from those who wouldn’t, people happily embraced it. After all, it’s much easier to say there’s a large population that will just never get it than to re-examine your teaching methods.
The paper, called “The camel has two humps,” suggested that, instead of a bell curve, programming success rates look more like a two-humped ungulate: the kids who get it, and the kids who never will.