Retraction Watch

Tracking retractions as a window into the scientific process

Archive for the ‘not reproducible’ Category

Researchers disagree over how to explain doubts over physics findings

without comments

After an international group of physicists agreed that the findings of their 2015 paper were in doubt, they simply couldn’t agree on how to explain what went wrong. Apparently tired of waiting, the journal retracted the paper anyway.

The resulting notice doesn’t say much, for obvious reasons. It seems some additional information came to light that caused the researchers to question the results and model. Although the five authors agreed a retraction was the right call, they could not agree on the language in the notice.

Here’s the retraction notice for “Atomistic simulation of damage accumulation and amorphization in Ge,” published online February 2015 in the Journal of Applied Physics (JAP) and retracted two years later in January 2017: Read the rest of this entry »

Authors pull virus replication paper after they cannot replicate results

without comments

Researchers in China have retracted a 2016 paper exploring the replication behaviors of a retrovirus, after discovering that the key results could not be reproduced — possibly because their cell cultures had been contaminated.

The authors also cite a disagreement with a colleague, who they say contributed to the work but does not want to be listed as an author.

Here’s the retraction notice for “Nuclear import of prototype foamy virus transactivator Bel1 is mediated by KPNA1, KPNA6 and KPNA7,” published in the International Journal of Molecular Medicine: Read the rest of this entry »

Why traditional statistics are often “counterproductive to research in the human sciences”

with 9 comments

Andrew Gelman

Doing research is hard. Getting statistically significant results is hard. Making sure the results you obtain reflect reality is even harder. In this week’s Science, Eric Loken at the University of Connecticut and Andrew Gelman at Columbia University debunk some common myths about the use of statistics in research — and argue that, in many cases, traditional statistical methods do more harm than good in human sciences research.

Retraction Watch: Your article focuses on the “noise” that’s present in research studies. What is “noise” and how is it created during an experiment?

Read the rest of this entry »

Written by Alison McCook

February 9th, 2017 at 2:00 pm

At last, cancer reproducibility project releases some results — and they’re mixed

without comments

Nearly five years ago, researchers suggested that the vast majority of preclinical cancer research wouldn’t hold up to follow-up experiments, delaying much-needed treatments for patients. In a series of articles publishing tomorrow morning, eLife has released the results of the first five attempts to replicate experiments in cancer biology — and the results are decidedly mixed.

As our co-founders Adam Marcus and Ivan Oransky write in STAT, the overall take-home message was that two studies generated findings similar to the original, one did not replicate the original, and two others were inconclusive.

They quote Brian Nosek, a psychologist at the University of Virginia in Charlottesville who runs the Center for Open Science and has been leading the replication effort in his own field:

Read the rest of this entry »

Written by Alison McCook

January 18th, 2017 at 1:53 pm

Lack of reproducibility triggers retractions of Nature Materials articles

without comments

The authors of a highly cited 2015 paper in Nature Materials have retracted it, after being unable to reproduce some of the key findings.

The move prompted the journal to also retract an associated News & Views article.

Here’s the retraction notice for “Fast and long-range triplet exciton diffusion in metal–organic frameworks for photon upconversion at ultralow excitation power:” Read the rest of this entry »

Error-laden database kills paper on extinction patterns

without comments

An ecologist in Australia realized a database he was using to spot trends in extinction patterns was problematic, affecting two papers. One journal issued an expression of concern, which has since turned into a retraction. So far, the other journal has left the paper untouched.

The now-retracted paper concluded that medium-sized species on islands tend to go extinct more often than large or small mammalian species. But a little over a year ago, Biology Letters flagged the paper with an expression of concern (EOC), noting “concerns regarding the validity of some of the data and methods used in the analysis.”

Now, last author Marcel Cardillo at Australian National University has come to a new conclusion about extinction patterns. A retraction notice that has replaced the EOC explains:

Read the rest of this entry »

Yes, “power pose” study is flawed, but shouldn’t be retracted, says one author

with 7 comments

After the first author of a debated study about the benefits of positioning your body in an assertive way — the so-called “power pose” — posted her concerns about the research, she has told us she does not believe the paper should be retracted.

As reported by New York magazine, late last night, the first author of a 2010 paper in Psychological Science posted a statement saying she no longer believes the effects of the “power pose” are real.

We contacted Dana Carney, now based at the University of California, Berkeley, to ask if she thought the next step would be to retract the paper. She told us: Read the rest of this entry »

Written by Alison McCook

September 26th, 2016 at 1:36 pm

Author asks to retract nearly 20-year-old paper over figure questions, lack of data

with 23 comments

The last author of a 1999 paper has asked the journal to retract it less than one month after a user raised questions about images on PubPeer.

Yesterday, last author Jim Woodgett posted a note on the site saying the author who generated the figures in question could not find the original data, and since he agreed the images appeared “suspicious,” he had contacted the journal to retract the paper.

Here’s the note from Woodgett, based at the Lunenfeld-Tanenbaum Research Institute at Mount Sinai Hospital in Toronto: Read the rest of this entry »

Authors retract 2016 cancer study when data don’t align with figures

with 2 comments

Researchers have retracted a 2016 cancer study, citing discrepancies between the data and images presented in the paper.

Although the retraction notice itself contains relatively little information, we’ve obtained a letter from the last author — Jun-Li Luo of The Scripps Research Institute in Jupiter, Florida — to the editor-in-chief of Cell Death and Differentiation that says a bit more. 

According to the letter, after receiving an anonymous email raising concerns, Luo conducted an investigation, contacting the co-authors who contributed each of the figures in question. Although Luo writes that he has no reason to suspect fraud, the researchers were not able to provide some of the original data.

PubPeer commenters have questioned figures 1, 3, 4, 5, 6 and 7 in the study, “IKKα-mediated biogenesis of miR-196a through interaction with Drosha regulates the sensitivity of cancer cells to radiotherapy.”

In the letter, Luo tells Gerry Melino, co-editor-in-chief of the journal from the University of Leicester, UK, that figures 3D and 3E were provided by the study’s first author, Xing Fang, adding: Read the rest of this entry »

How can we improve preclinical research? Advice from a diabetes researcher

without comments

Daniel Drucker

By all accounts, science is facing a crisis: Too many preclinical studies aren’t reproducible, leading to wasted time and effort by researchers around the world. Today in Cell Metabolism, Daniel Drucker at the Lunenfeld-Tanenbaum Research Institute of Mount Sinai Hospital in Toronto details numerous ways to make this early research more robust. His most important advice: more transparent reporting of all results (not just the positive findings), along with quantifying, reporting, tracking, and rewarding reproducibility — for scientists, journals, and universities and research institutes alike.

Retraction Watch: Which of your recommendations will researchers most object to, and why? Read the rest of this entry »

Written by Alison McCook

September 13th, 2016 at 12:05 pm