“The data have spoken:” Controversial NgAgo gene editing study retracted

The author of a 2016 paper describing a potentially invaluable lab tool has retracted it, following heavy criticism from outside groups that could not reproduce the findings.

The paper had already been tagged with an Expression of Concern by the journal, Nature Biotechnology, which included data from multiple groups casting doubt on the original findings. Although the authors, led by Chunyu Han at Hebei University of Science and Technology in China, produced data to support their original findings, the journal has concluded — following “feedback from expert reviewers” — that the additional data “are insufficient to counter the substantial body of evidence that contradicts their initial findings,” according to an editorial released today:

Continue reading “The data have spoken:” Controversial NgAgo gene editing study retracted

“We do not want to create false hope”: Authors retract Cell paper they can’t replicate

A few years ago, researchers in Sweden had something to celebrate: They thought they had discovered a chink in the armor of the most common type of malignant brain cancer.

In a 2014 Cell paper, the team — led by Patrik Ernfors at the Karolinska Institutet — reported that they had identified a small molecule that could target and kill glioblastoma cells and prolong survival in mice with the disease. (Glioblastoma is the cancer U.S. Senator John McCain was recently diagnosed with.)

Satish Srinivas Kitambi, the paper’s first author, who is also based at the Karolinska Institutet, said the results got the team “really excited:”

Continue reading “We do not want to create false hope”: Authors retract Cell paper they can’t replicate

Can we do math unconsciously? Replicators of a prominent 2012 study have some doubts

In 2012, news media were abuzz with a new finding from PNAS: Authors based in Israel had found evidence that our brains can unconsciously process more than we thought — including basic math and reading. In other words, the authors claimed people could read and do math without even knowing what they were doing.

With such a major development in the field of consciousness research, other groups quickly got to work trying to replicate the findings. Those efforts have taken some twists and turns — including a recent retraction of a replication paper that was, itself, not reproducible (which is not something we see every day). But overall, five years after the initial, remarkable result, the replication efforts are calling it into question.

According to Pieter Moors at KU Leuven, a researcher in this field:

Continue reading Can we do math unconsciously? Replicators of a prominent 2012 study have some doubts

Dear journals: Clean up your act. Regards, Concerned Biostatistician


Recently, a biostatistician sent an open letter to the editors of 10 major science journals, urging them to pay more attention to common statistical problems in papers. Specifically, Romain-Daniel Gosselin, founder and CEO of Biotelligences, which trains researchers in biostatistics, counted how many of 10 recent papers in each journal omitted two basic details: the sample sizes used in experiments and the tests used in the statistical analyses. (Short answer: too many.) Below, we have reproduced his letter.

Dear Editors and Colleagues,

I write this letter as a biologist and instructor of biostatistics, concerned about the disregard for statistical reporting that is threatening scientific reproducibility. I hereby urge you to spearhead the strict application of existing guidelines on statistical reporting.

Continue reading Dear journals: Clean up your act. Regards, Concerned Biostatistician

Need to find a replication partner, or collaborator? There’s an online platform for that


Do researchers need a new “Craigslist?” We were recently alerted to a new online platform called StudySwap by one of its creators, who said it was partially inspired by one of our posts. The platform creates an “online marketplace” that researchers have previously called for, connecting scientists with willing partners — such as a team looking for someone to replicate its results, and vice versa. As co-creators Christopher Chartier at Ashland University and Randy McCarthy at Northern Illinois University tell us, having a place where researchers can find each other more efficiently “is in everyone’s best interest.”

Retraction Watch: What inspired you to create StudySwap?

Continue reading Need to find a replication partner, or collaborator? There’s an online platform for that

“Failure is an essential part of science:” A Q&A with the author of a new book on reproducibility

Reproducibility is everywhere these days, from the pages of scientific journals to the halls of the National Academy of Sciences. Today it also lands in bookstores across the U.S., with the publication of Rigor Mortis (Basic Books) by longtime NPR correspondent Richard Harris. (Full disclosure: I blurbed the book, writing that “Harris deftly weaves gripping tales of sleuthing with possible paths out of what some call a crisis.”) Harris answered some questions about the book, and the larger issues, for us.

Retraction Watch (RW): Rigor Mortis begins with the story of the 2012 Nature paper by C. Glenn Begley and Lee Ellis that is now famous for sounding the alarm about reproducibility in basic cancer research. But as you document, this is not a problem that began in 2012. When did scientists first start realizing there was a problem?

Continue reading “Failure is an essential part of science:” A Q&A with the author of a new book on reproducibility

What leads to bias in the scientific literature? New study tries to answer

By now, most of our readers are aware that some fields of science have a reproducibility problem. Part of the problem, some argue, is the publishing community’s bias toward dramatic findings — namely, studies that show something has an effect on something else are more likely to be published than studies that don’t.

Many have argued that scientists publish such data because that’s what is rewarded — by journals and, indirectly, by funders and employers, who judge a scientist based on his or her publication record. But a new meta-analysis in PNAS is saying it’s a bit more complicated than that.

In a paper released today, researchers led by Daniele Fanelli and John Ioannidis — both at Stanford University — suggest that the so-called “pressure to publish” does not appear to bias studies toward larger effect sizes. Instead, they argue that other factors are bigger sources of bias: the use of small sample sizes (which can yield skewed samples that show stronger effects), and the relegation of studies with smaller effects to the “gray literature,” such as conference proceedings, PhD theses, and other less publicized formats.
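The small-sample mechanism described above can be sketched with a toy simulation: run many simulated two-group studies of a fixed true effect, keep only those whose observed effect clears a crude significance bar (a stand-in for what gets published), and compare the average published effect at small versus large sample sizes. This is purely illustrative — not the analysis from the PNAS paper — and the true effect, sample sizes, and cutoff below are all invented for the sketch.

```python
# Toy simulation (illustrative only): small samples plus selective
# publication inflate the effect sizes that appear in the literature.
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2   # true standardized mean difference (Cohen's d), assumed
N_STUDIES = 2000    # number of simulated studies per scenario

def simulate_study(n_per_group):
    """Run one two-group study; return the observed effect size d."""
    treat = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(n_per_group)]
    ctrl = [random.gauss(0.0, 1.0) for _ in range(n_per_group)]
    pooled_sd = statistics.pstdev(treat + ctrl)
    return (statistics.mean(treat) - statistics.mean(ctrl)) / pooled_sd

def mean_published_effect(n_per_group, threshold):
    """Average observed effect among studies whose d clears a crude
    'significance' threshold -- a stand-in for selective publication."""
    observed = [simulate_study(n_per_group) for _ in range(N_STUDIES)]
    published = [d for d in observed if d > threshold]
    return statistics.mean(published)

# A rough two-sided alpha=.05 cutoff for d scales like 2*sqrt(2/n).
small = mean_published_effect(n_per_group=20, threshold=2 * (2 / 20) ** 0.5)
large = mean_published_effect(n_per_group=200, threshold=2 * (2 / 200) ** 0.5)

print(f"true effect:                    {TRUE_EFFECT:.2f}")
print(f"published effects, n=20/group:  {small:.2f}")
print(f"published effects, n=200/group: {large:.2f}")
```

Under these made-up parameters, the small-sample studies that happen to clear the bar overstate the true effect far more than the large-sample ones do — which is the skew the authors describe.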

However, Ferric Fang of the University of Washington — who did not participate in the study — approached the findings with some caution:

Continue reading What leads to bias in the scientific literature? New study tries to answer

Physics paper’s results off by factor of 100

Researchers from China have retracted a physics paper after realizing an error led them to report results that were nearly 100 times too large.

What’s more, the authors omitted key findings that would enable others to reproduce their experiments.

According to the notice, when calculating a property of electrons called mobility, the authors used a value that “was approximately 100 times too small,” which led to results that were “100 times too large.” The notice also details several gaps in the presentation of the experimental results, which preclude others from duplicating the experiments.

Here’s the retraction notice for “Bulk- and layer-heterojunction phototransistors based on poly[2-methoxy-5-(2′-ethylhexyloxy-p-phenylenevinylene)] and PbS quantum dot hybrids:”

Continue reading Physics paper’s results off by factor of 100

A new reproducibility fix? Get your work checked before it’s published


Most researchers by now recognize there’s a reproducibility crisis facing science. But what to do about it? Today in Nature, Jeffrey S. Mogil at McGill University and Malcolm R. Macleod at the University of Edinburgh propose a new approach: Restructure the reporting of preclinical research to include an extra “confirmatory study” performed by an independent lab, which verifies the findings before they are published. We spoke with them about how this could work.

Retraction Watch: You’re proposing to restructure animal studies of new therapies or ways to prevent disease. Can you explain what this new type of study should look like, and how researchers will execute it?

Continue reading A new reproducibility fix? Get your work checked before it’s published

Researchers disagree over how to explain doubts over physics findings

After an international group of physicists agreed that the findings of their 2015 paper were in doubt, they simply couldn’t agree on how to explain what went wrong. Apparently tired of waiting, the journal retracted the paper anyway.

The resulting notice doesn’t say much, for obvious reasons. Apparently, additional information came to light that caused the researchers to question their results and model. Although all five authors agreed that retraction was the right call, they could not agree on the language of the notice.

Here’s the retraction notice for “Atomistic simulation of damage accumulation and amorphization in Ge,” published online in February 2015 in the Journal of Applied Physics (JAP) and retracted nearly two years later, in January 2017:

Continue reading Researchers disagree over how to explain doubts over physics findings