Gastroenterology has retracted a 2012 article on GI cancers associated with AIDS after the authors, from the National Cancer Institute, acknowledged that a “programming” error led them to overestimate the incidence of the tumors.
Eleven scientists are asking a journal to consider retracting an asbestos paper with industry ties for including “seriously misleading information,” “several wrong statements,” and thrice citing a journal that doesn’t appear to exist.
Editors of the journal, Epidemiology Biostatistics and Public Health, however, say they will not retract the article, based on the advice of two external reviewers.
An unusual article that considered the concept of change from a systems perspective — including change in medicine, economics, and decision-making, for instance — has, well, changed from “published” to “retracted.”
After commenters on PubPeer called the 2014 paper “gibberish” and even suggested it might be computer-generated, Frontiers in Computational Neuroscience retracted it, noting that it “does not meet the standards of editorial and scientific soundness” for the journal, according to the retraction notice. The paper’s editor and author maintain there was nothing wrong with the science in the paper.
A father and son who co-authored a 2015 paper are fighting over whether the laser therapy it describes could be harmful to patients, prompting the journal to retract the article.
The small study suggested that the therapy could safely treat patients with glaucoma. But Tomislav Ivandic — the father — alleges that errors in how the study was reported could lead to harmful doses of laser light for patients receiving the therapy. His son and co-author, Boris Ivandic, maintains that the article is accurate.
After reading too many papers that either are not reproducible or contain statistical errors (or both), the American Statistical Association (ASA) has been roused to action. Today the group released six principles for the use and interpretation of p-values. P-values are used to search for differences between groups or treatments, to evaluate relationships between variables of interest, and for many other purposes. But the ASA says they are widely misused.
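To make those uses concrete, here is a minimal, hypothetical sketch of the kind of two-group comparison where a p-value typically shows up, and where misreadings creep in. The data, group names, and choice of scipy.stats.ttest_ind are illustrative assumptions, not anything from the ASA statement itself.

```python
# Hypothetical example: a two-group comparison of the sort where p-values
# are commonly reported (and commonly misread). All data here are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=100.0, scale=15.0, size=30)    # simulated control group
treatment = rng.normal(loc=108.0, scale=15.0, size=30)  # simulated treated group

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# The p-value is the probability of observing a difference at least this
# extreme if the two groups truly did not differ. It is not the probability
# that the null hypothesis is true, and it says nothing about the size or
# practical importance of the effect; conflating these is the kind of misuse
# the ASA statement addresses.
```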
Authors are retracting a 2014 paper about how liquid-crystalline materials self-organize at low temperatures after realizing they had measured the temperatures incorrectly.
Ben Goldacre has been a busy man. In the last six weeks, the author and medical doctor’s Compare Project has evaluated 67 clinical trials published in the top five medical journals, looking for any “switched outcomes,” meaning the authors didn’t report something they said they would, or included additional outcomes in the published paper, with no explanation for the change. The vast majority – 58 – included such discrepancies. Goldacre talked to us about how journals – New England Journal of Medicine (NEJM), JAMA, The Lancet, BMJ, and Annals of Internal Medicine — have responded to this feedback.
In the corrected report, the agency estimates the health risks of the laminate flooring — from irritation of the eyes, nose and throat — to be three-fold higher than it suggested in the original report, published February 10.
When a researcher encountered two papers that suggested moonlight has biological effects — on both plants and humans — he took a second look at the data, and came to different conclusions. That was the easy part — getting the word out about his negative findings, however, was much more difficult.
When Jean-Luc Margot, a professor in the departments of Earth, Planetary & Space Sciences and Physics & Astronomy at the University of California, Los Angeles, tried to submit his reanalysis to the journals that published the original papers, both rejected it; after multiple attempts, his work ended up in different publications.
Disagreements are common in science, and crucial; as they say, friction makes fire. Journals are inherently uninterested in negative findings — but should it take more than a year, in one instance, to publish an alternative interpretation of somewhat speculative findings that, at first glance, seem difficult to believe? Especially when they contain such obvious methodological issues as presenting only a handful of data points linking biological activity to the full moon, or ignoring significant confounders?
Margot did not expect to have such a difficult experience with the journals — including Biology Letters, which published the study suggesting that a plant relied on the full moon to survive.