What if we tried to replicate papers before they’re published?

Martin Schweinsberg
Eric Uhlmann

We all know replicability is a problem – consistently, many papers in various fields fail to replicate when put to the test. But instead of testing findings after they’ve gone through the rigorous and laborious process of publication, why not verify them beforehand, so that only replicable findings make their way into the literature? That is the principle behind a recent initiative called The Pipeline Project (covered in The Atlantic today), in which 25 labs checked 10 unpublished studies from the lab of one researcher in social psychology. We spoke with that researcher, Eric Uhlmann (also last author on the paper), and first author Martin Schweinsberg, both based at INSEAD.

Retraction Watch: What made you decide to embark upon this project?

Chemist fighting to keep PhD asks University of Texas to pay $95k in legal fees

University of Texas

After the University of Texas postponed a hearing to determine whether it should revoke a chemist’s PhD, her lawyer has filed a motion to stop the proceedings, and requested the school pay her $95,099 in lawyer fees and expenses.

This is the second time UT has threatened to revoke Suvi Orr‘s PhD, following a 2012 retraction for a paper that made up part of her dissertation, which the school alleged contained falsified data. UT revoked her degree in 2014, only to reinstate it after she sued. The school is now trying to revoke it again, but the scheduled hearing on March 4 was postponed. Last week, her lawyer filed a motion for final summary judgment requesting that UT stop the proceedings and repay $95,099 in lawyer fees and expenses. The new motion makes a few requests:

Sperm paper impaired by “corporate company” analysis

Without a certain protein, mouse sperm have motility disorders. That’s the conclusion of a paper that has itself been stopped — by errors in the data analysis, carried out by a third-party company.

The retraction note pins the analysis, which led to faulty data, on a “corporate company.” Aside from the companies that sell the kits used for substrates, assays, and detection, there’s only one company mentioned in the paper:

Generation of the mouse model was performed by the Cyagen Company (Guangzhou, China)

However, a representative of Cyagen says it does not offer the type of analysis described by the retraction note.

Here’s the full retraction note for the 2015 paper in Biology of Reproduction (which is paywalled — tsk, tsk):

Authors retract striking circadian clock finding after failing to replicate

The authors of a paper showing a “striking and unanticipated” relationship between light and temperature in regulating circadian rhythms are retracting it after the results couldn’t be replicated.

After being contacted by another group who couldn’t reproduce the data, the authors tried themselves, and failed as well. They “have absolutely no explanation for the discrepancies with the original results,” according to the note in PLOS Biology.

It’s an unfortunate turn of events.

Let’s not mischaracterize replication studies: authors

Brian Nosek

Scientists have been abuzz over a report in last week’s Science questioning the results of a recent landmark effort to replicate 100 published studies in top psychology journals. The critique of this effort – which suggested the authors couldn’t replicate most of the research because they didn’t adhere closely enough to the original studies – was debated in many outlets, including Nature, The New York Times, and Wired. Below, two of the authors of the original reproducibility project — Brian Nosek and Elizabeth Gilbert – use the example of one replicated study to show why it is important to describe accurately the nature of a study in order to assess whether the differences from the original should be considered consequential. In fact, they argue, one of the purposes of replication is to help assess whether differences presumed to be irrelevant are actually irrelevant, all of which brings us closer to the truth.

We’re using a common statistical test all wrong. Statisticians want to fix that.

After reading too many papers that either are not reproducible or contain statistical errors (or both), the American Statistical Association (ASA) has been roused to action. Today the group released six principles for the use and interpretation of p-values. P-values are used to search for differences between groups or treatments, to evaluate relationships between variables of interest, and for many other purposes. But the ASA says they are widely misused. Here are the six principles from the ASA statement:
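One of the ASA’s points is that a p-value does not measure the size of an effect or the importance of a result. A minimal sketch illustrates why (the group means, standard deviation, and sample size below are hypothetical numbers chosen for illustration, and the z-test is a textbook approximation, not the ASA’s own example):

```python
import math

def two_sample_z_pvalue(mean1, mean2, sd, n):
    """Two-sided p-value for a two-sample z-test with equal-sized
    groups and known common SD (textbook approximation)."""
    z = abs(mean1 - mean2) / math.sqrt(2 * sd**2 / n)
    # Two-sided tail probability of the standard normal, via erfc
    return math.erfc(z / math.sqrt(2))

# Hypothetical data: a difference of 0.02 SD between groups --
# practically negligible -- but a huge sample of one million each.
p = two_sample_z_pvalue(100.00, 100.02, sd=1.0, n=1_000_000)
effect_size = 0.02  # Cohen's d: trivially small in practical terms

print(p < 0.001)    # True: "statistically significant"...
print(effect_size)  # ...yet the effect itself is negligible
```

With enough data, even a trivial difference produces a vanishingly small p-value, which is why significance alone says nothing about whether a finding matters.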

More than half of top-tier economics papers are replicable, study finds

Approximately six out of 10 economics studies published in the field’s most reputable journals, the American Economic Review and the Quarterly Journal of Economics, are replicable, according to a study published today in Science.

The authors attempted to replicate the results of 18 papers published between 2011 and 2014 and found that 11 (approximately 61%) lived up to their claims. But the study found the replicated effect to be on average only 66% of that reported in the earlier studies, which suggests that authors of the original papers may have exaggerated the trends they reported.
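The headline figures reduce to simple arithmetic, sketched below (the 1.0 “original effect” is a hypothetical normalization for illustration, not a number from the study):

```python
# Replication rate: 11 of the 18 repeated papers held up.
replicated, total = 11, 18
replication_rate = replicated / total
print(f"{replication_rate:.0%}")  # → 61%

# Replicated effects averaged only 66% of the originals,
# so a hypothetical original effect of 1.0 shrinks to 0.66.
attenuation = 0.66
print(1.0 * attenuation)  # → 0.66
```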

Colin Camerer, a behavioral economist at the California Institute of Technology in Pasadena, who co-authored the study, “Evaluating replicability of laboratory experiments in economics,” told us:

High-profile critic slams Nature letters about dinosaur growth following corrections

Authors of a pair of letters in Nature that concluded dinosaurs reached their full size surprisingly quickly are standing by their conclusions, despite challenges from a high-profile critic.

In the letters, researchers led by first author Gregory M. Erickson, a paleobiologist at The Florida State University, concluded that massive dinos grew fast — for example, a 5.5-ton T. rex could reach skeletal maturity in just two decades. However, when Nathan Myhrvold tried to reanalyze the data, he couldn’t replicate the results. The authors have issued corrections to address the small mistakes unearthed by Myhrvold’s analysis, but argue he couldn’t replicate their results because they hadn’t fully explained their methodology.

After Myhrvold attempted to replicate the findings of maximum size and growth rate for several papers, he found issues in many, including the two Nature letters, according to a press release on Myhrvold’s website:

STAP stem cell researcher Obokata loses another paper

Nature Protocols

The first author of two high-profile Nature retractions about a technique to easily create stem cells has lost another paper in Nature Protocols.

Haruko Obokata, once “a lab director’s dream,” according to The New Yorker, also had her PhD revoked from Waseda University last fall.

After learning of concerns that two figures are “very similar” and “some of the error bars look unevenly positioned,” the rest of the authors were unable to locate the raw data, according to the note. The journal could not reach Obokata for comment before publishing the retraction.

“Reproducible subcutaneous transplantation of cell sheets into recipient mice” has been cited 21 times, according to Thomson Reuters Web of Science. It was published in June 2011, soon after Obokata earned her PhD.

Here’s the note:


Why publishing negative findings is hard

Jean-Luc Margot

When a researcher encountered two papers that suggested moonlight has biological effects — on both plants and humans — he took a second look at the data, and came to different conclusions. That was the easy part — getting the word out about his negative findings, however, was much more difficult.

When Jean-Luc Margot, a professor in the departments of Earth, Planetary & Space Sciences and Physics & Astronomy at the University of California, Los Angeles, tried to submit his reanalysis to the journals that published the original papers, both rejected it; after multiple attempts, his work ended up in different publications.

Disagreements are common but crucial in science; as they say, friction makes fire. Journals are inherently uninterested in negative findings — but should it take more than a year, in one instance, to publish an alternative interpretation of somewhat speculative findings that, at first glance, seem difficult to believe? Especially when they contain such obvious methodological issues as presenting only a handful of data points linking biological activity to the full moon, or ignoring significant confounders?

Margot did not expect to have such a difficult experience with the journals — including Biology Letters, which published the study suggesting that a plant relied on the full moon to survive: