Three years ago, the American Statistical Association (ASA) expressed hope that the world would move to a “post-p-value era.” The statement in which they made that recommendation has been cited more than 1,700 times, and apparently, the organization has decided that era’s time has come. (At least one journal had already banned p values by 2016.) In an editorial in a special issue of The American Statistician out today, “Statistical Inference in the 21st Century: A World Beyond P<0.05,” the executive director of the ASA, Ron Wasserstein, along with two co-authors, recommends that when it comes to the term “statistically significant,” “don’t say it and don’t use it.” (More than 800 researchers signed onto a piece published in Nature yesterday calling for the same thing.) We asked Wasserstein’s co-author, Nicole Lazar of the University of Georgia, to answer a few questions about the move. Here are her responses, prepared in collaboration with Wasserstein and the editorial’s third co-author, Allen Schirm.
What Caught Our Attention: A tree of life paper has been axed — and based on the information in the retraction notice, we’re wondering how it ever passed peer review.
Specifically, the notice states that a review of the paper found “concerns regarding the study design, methodology, and interpretation of the data.” Overall, the research “contradict(s) a large body of existing literature and do(es) not provide a sufficient level of evidence to support the claims made in the paper.” Um, so what did it get right?
Originally published June 17, 2016, the paper was retracted Jan. 15. Led by corresponding author Xavier Altafaj, of the University of Barcelona (UB) and Bellvitge Biomedical Research Institute (IDIBELL), researchers described using an amino acid, D-serine, to treat a child with a rare genetic disorder that affects neurons.
According to the notice, the researchers did use D-serine in lab work used as proof-of-concept; however, when it came time to try it in the patient, as a result of a “communication error”:
When Alexander Harms arrived at the University of Copenhagen in August 2016, as a postdoc planning to study a type of antibiotic resistance in bacteria, he carried with him a warning from another lab that had tried to recruit him:
People said, “If you go there, you have to deal with these weird articles that nobody believes.”
The papers in question had been published in the Proceedings of the National Academy of Sciences in 2011 and Cell in 2013. Led by Kenn Gerdes, Harms’s new lab director, the work laid out a complex chain of events that mapped out how an E. coli bacterium can go into a dormant state, called persistence, that allows it to survive while the rest of its colony is wiped out.
Despite some experts’ skepticism, each paper had been cited hundreds of times. And Harms told us:
I personally did believe in the published work. There had been papers from others that kind of attacked [the Gerdes lab’s theory], but that was not high-quality work.
What Caught Our Attention: Soon after the paper appeared, the journal was alerted to the fact that its findings were at odds with others in the field. When the editor approached the authors, everything fell apart: The authors couldn’t repeat the experiments, and “were also unsure of the molecular probes that were used in the study.” While it isn’t unusual to have doubts about data, since research is a process of experimentation, it is odd not to know how your own experiment was conducted. The paper was retracted less than two months after it was published. The manuscript was accepted two months after it was submitted in early May, theoretically giving reviewers enough time to catch these issues (along with the authors’ failure to cite relevant papers).
What Caught Our Attention: A big peer review (and perhaps academic mentorship) fail. Blood samples need anticoagulants to prevent clotting before analysis, but these researchers chose an anticoagulant that also prevented the microbes they were studying from colonizing. When they counted colonies to measure how well or poorly Tuberculin mycobacteria were growing in cultures, the suppressed growth led them to believe that certain blood components were fighting the microbes. The authors (and reviewers) should have known this.