Have you seen our “unhelpful retraction notices” category, a motley collection of vague, misleading, and even information-free entries? We’d like to make it obsolete, and we need our readers’ help.
Here’s what we mean: Next month, Ivan will be traveling to Rio to take part in the World Conference on Research Integrity. One of his presentations will propose a set of guidelines for retraction notices and their dissemination, which we hope will inform publishing practices and severely limit the number of entries in our “unhelpful retraction notices” category. In September, for example, we announced that our guidelines would be linked from PRE-val, which “verifies for the end user that content has gone through the peer review process and provides information that is vital to assessing the quality of that process.”
Here’s a draft of our proposed guidelines, which include many of the items recommended by the Committee on Publication Ethics and the International Committee of Medical Journal Editors:
Last week, we announced a new partnership with PRE (Peer Review Evaluation) “to improve access to information about retraction policies.” The first step, we and PRE said, was that Retraction Watch would create guidelines for retraction notices, to which PRE’s flagship product, PRE-val, would link.
Well, it turns out that great minds think alike — or along similar lines, anyway. Today we learned that next week, the Committee on Publication Ethics (COPE) will discuss a standard retraction form proposed by friend of Retraction Watch Hervé Maisonneuve, who has published several papers on retractions. According to a write-up:
In addition, some of the work was apparently covered by a copyright agreement.
Both papers were co-authored by the same three people. The idea theft came to light after one of the co-authors received a complaint from her former supervisor, prompting her to contact the publisher to resolve the issue.
Two blog posts are shining additional light on a recent retraction that included some unanswered questions — namely, the identity of the researcher who admitted to manipulating the results.
To recap: Psychological Science recently announced it was retracting a paper about the relationship between the words you use and your mood after a graduate student tampered with the results. But the sole author — William Hart, an assistant professor at the University of Alabama — was not responsible.
The post raised some questions — for instance, who was the graduate student, and if his or her work was so integral to the paper, why was he or she not listed as an author? Hart declined to identify the student, but two new blog posts — including one by a collaborator of Hart’s at the University of Alabama — are providing more details.
When an ecologist realized he’d made a fatal error in a 2009 paper, he did the right thing: He immediately contacted the journal (Evolutionary Ecology Research) to ask for a retraction. But he didn’t stop there: He wrote a detailed blog post outlining how he learned — in October 2016, after a colleague couldn’t recreate his results — that he had misused a statistical tool in the R programming language, which ended up negating his findings entirely. We spoke to Daniel Bolnick at the University of Texas at Austin (and an early career scientist at the Howard Hughes Medical Institute) about what went wrong with his paper “Diet similarity declines with morphological distance between conspecific individuals,” and why he chose to be so forthright about it.
Retraction Watch: You raise a good point in your explanation of what went wrong with the statistical analysis: Eyeballing the data, they didn’t look significant. But when you plugged in the numbers (it turns out, incorrectly), they were significant – albeit weakly. So you reported the result. Did this teach you the importance of trusting your gut, and the so-called “eye test,” when looking at data?
Earlier this year, a nutrition journal retracted an article about the potential dangers of eating food containing genetically modified organisms (GMOs), noting the paper contained a duplicated image.
At the time, news outlets in Italy were reporting accusations that the last author, Federico Infascelli, an animal nutrition researcher at the University of Naples, had falsified some of his research.
Food and Nutrition Sciences has now updated its initial notice, saying the paper was pulled for data fabrication. In addition, Infascelli is no longer listed on its editorial board – he appears on an archived version of the editorial board page from March 2016, but not on the current list of members.
The Open Automation and Control Systems Journal has published five items this calendar year — and all of those are retraction notices.
That much we’re sure about. Now to what we’re not clear on. This story is one of a growing number of cases we’ve seen in which so-called “predatory” publishers are starting to retract papers, perhaps because they hope the practice suggests they are rigorous. Four of the papers were pulled for “compromised” peer review, some instances of which the journal attributes to the actions of an “external agent.” A co-author of one of these manuscripts, however, says the paper was pulled for using material from another researcher’s paper without acknowledgment, even though the journal’s notice cites problems with peer review.
The remaining paper has been pulled for plagiarizing from another published paper.
A journal has retracted a paper about 3D imaging after concluding the authors used equations from another researcher without attribution — and has helpfully included a detailed editorial explaining exactly what happened.
It’s rare for us to see a journal be so transparent in explaining what went wrong with one of its papers, so we’re thanking Stuart Granshaw, from Denbighshire in Wales, UK, the editor of The Photogrammetric Record, for “doing the right thing.”
We always like to get a historical perspective on how scientists have tried to correct the record, such as this attempt in 1756 to retract a published opinion about some of the work of Benjamin Franklin. Although that 18th century note used the word “retract,” it wasn’t a retraction as we see it today, in which an entire piece of writing is pulled from the record. Modern-day retractions are a relatively recent phenomenon that only took off within the last few decades, according to science historian Alex Csiszar at Harvard University. He spoke to us about the history of retractions – and why an organization like Retraction Watch couldn’t have existed 100 years ago.
Retraction Watch: First of all, let’s start with something you found that appears to break our previous record for the earliest retraction – a “retractation” by William Molyneux of some assertions about the properties of a stone, published in 1684. Could this be the earliest English-language retraction?
We have a new record for the longest time from publication to retraction: 80 years. It’s for a case report about a 24-year-old man who died after coughing up more than four cups of what apparently looked — and smelled — like pee.
According to the case report titled “Een geval van uroptoë” published in 1923, an autopsy revealed that the man had a kidney that was strangely located in his chest cavity. A case of pneumonia caused the kidney to leak urine into the space around his lungs, leading to the perplexing cough.
If that sounds too crazy to be true, you’re right: This man never existed. The case was retracted in 2003. (Yes, we are a little late to this one — it recently popped up in one of our Google alerts.)
A write-up by the editors of the Nederlands Tijdschrift voor Geneeskunde — that translates to “Dutch Journal of Medicine” — explains that the strange case was a fake (on the fifth page of this PDF, in English):