Archive for the ‘UK retractions’ Category
A pharmacology journal has retracted a 2011 paper after concluding images in three figures had been manipulated.
According to the British Journal of Pharmacology, four of the five authors claim they played no role in the manipulation. The remaining author, first author Ian Morecroft, a research associate at the University of Glasgow, has not commented.
Here’s more from the notice, which says an investigation at the University of Glasgow is ongoing:
A researcher who resigned from the University of Dundee in Scotland after it concluded he was guilty of misconduct has issued his first retraction.
According to an internal email to staff forwarded to us last year, the university concluded that Robert Ryan had misrepresented clinical data and images in 12 different publications. The first retraction, published by Molecular Microbiology, cites image duplications in multiple figures.
Here’s the full notice:
From time to time, academics will devise a “sting” operation designed to expose journals’ weaknesses. We’ve seen scientists submit a duplicated paper, a deeply flawed weight loss paper designed to generate splashy headlines (it worked), and an entirely fake paper – where even the author calls it a “pile of dung.” So it wasn’t a huge surprise when Katarzyna Pisanski at the University of Sussex and her colleagues found that so-called “predatory” journals – which are allegedly willing to publish subpar papers as long as the authors pay fees – often accepted a fake scientist onto their editorial boards. In a new Nature Comment, Pisanski and her team (Piotr Sorokowski, Emek Kulczycki and Agnieszka Sorokowska) describe creating a profile of a fake scientist named Anna O. Szust (Oszust means “a fraud” in Polish). Despite the fact that Szust had never published a single scholarly article and had no experience as a reviewer or editor, approximately one-third of the predatory journals approached accepted Szust’s application to become an editor. We spoke with Pisanski about the project.
Retraction Watch: What made you conceive of this project, and what did you hope to accomplish?
When zoologists at the University of Oxford published findings in Science last year suggesting ducklings can learn to identify shapes and colors without training (unlike other animals), the news media was entranced.
However, critics of the study have published a pair of papers questioning the findings, saying the data likely stem from chance alone. Still, the critics told us they don’t believe the findings should be retracted.
If a duckling is shown an image, can it pick out another from a set that has the same shape or color? Antone Martinho III and Alex Kacelnik say yes. In one experiment, 32 out of 47 ducklings preferred pairs of shapes they were originally shown. In the second experiment, 45 out of 66 ducklings preferred the original color. The findings caught the attention of many media outlets, including the New York Times, The Atlantic, and BuzzFeed.
Martinho told us:
Most researchers by now recognize there’s a reproducibility crisis facing science. But what to do about it? Today in Nature, Jeffrey S. Mogil at McGill University and Malcolm R. Macleod at the University of Edinburgh propose a new approach: Restructure the reporting of preclinical research to include an extra “confirmatory study” performed by an independent lab, which verifies the findings before they are published. We spoke with them about how this could work.
Retraction Watch: You’re proposing to restructure animal studies of new therapies or ways to prevent disease. Can you explain what this new type of study should look like, and how researchers will execute it?
Four of the newly corrected papers have a common last and corresponding author: Luke O’Neill of Trinity College Dublin in the Republic of Ireland. O’Neill is also a co-author of the remaining paper that was fixed. O’Neill told us the mistakes were a “bit sloppy,” noting that he takes responsibility for the errors in the four papers on which he is last author.
O’Neill forwarded Retraction Watch a comment he received from Kaoru Sakabe — data integrity manager at the American Society for Biochemistry and Molecular Biology (which publishes The Journal of Biological Chemistry (JBC)) — that reads:
As if peer reviewers weren’t overburdened enough, imagine if journals also asked them to independently replicate the experiments they were reviewing. True, replication is a big problem — and always has been. At the November 2016 SpotOn conference in London, historian Noah Moxham of the University of St Andrews in Scotland mentioned that, in the past, some peer reviewers did replicate experiments. We asked him to expand on the phenomenon here.
Retraction Watch: During what periods in history did peer reviewers repeat experiments? And how common was the practice?
Science has retracted a high-profile immunology paper after a probe concluded the corresponding author had committed misconduct.
The paper — which initially caught media attention for suggesting a protein could help boost the immune system’s ability to fight off tumors — has been under a cloud of suspicion since last year, when the journal tagged it with an expression of concern, citing a university investigation.
That investigation — at Imperial College London — concluded that the paper contained problematic figures that were the result of research misconduct. All were prepared by last and corresponding author Philip Ashton-Rickardt, who took full responsibility. Although the paper was published as recently as 2015, some of the original blots and accompanying details have gone missing.
Do pro-nuclear energy countries act more slowly to curb the effects of climate change? That’s what a paper published in July in the journal Climate Policy claimed. But the hotly debated study was retracted last week after the authors realized it contained serious errors.
Two papers evaluating glucose meters — used by diabetics to monitor blood sugar levels — suggested that a couple of the devices don’t work as well as they should. Perhaps unsurprisingly, the companies that sell those meters objected to how the studies were conducted. And by all accounts, their complaints appear to be justified.
In both cases, researchers used blood drawn from veins to test the meters. But manufacturers of the WaveSense JAZZ and GlucoRx glucose meters said their devices are designed to work with fresh blood from a finger-prick. Both papers have now been retracted.
The retraction notice for “Technical and clinical accuracy of five blood glucose meters: clinical impact assessment using error grid analysis and insulin sliding scales,” published in 2015 in the Journal of Clinical Pathology, hints at the issue: