Archive for the ‘pnas’ Category
Recently, a biostatistician sent an open letter to the editors of 10 major science journals, urging them to pay more attention to common statistical problems in papers. Specifically, Romain-Daniel Gosselin, Founder and CEO of Biotelligences, which trains researchers in biostatistics, counted how many of 10 recent papers in each of the 10 journals failed to report the sample size used in experiments or the statistical tests used in the analyses. (Short answer: Too many.) Below, we have reproduced his letter.
Dear Editors and Colleagues,
I write this letter as a biologist and instructor of biostatistics, concerned about the disregard for statistical reporting that is threatening scientific reproducibility. I hereby urge you to spearhead the strict application of existing guidelines on statistical reporting.
After being accused of falsifying three figures in a submitted manuscript, Mauvais-Jarvis sued his accusers and officials at his former employer — Northwestern University — for defamation and conspiracy in 2011.
In 2014, a judge dismissed the suit. We wish we could tell you more details about it—such as what the university’s misconduct investigation found, or how the lawsuit was concluded—but they remain shrouded in mystery. What we know is based on court records from the lawsuit, which we recently obtained through an unrelated public records request. Even without all the details, it’s a long, sordid tale, involving a lot of finger-pointing and allegations of misconduct.
In 2008, a former research technician in the lab of Mauvais-Jarvis, then an associate professor of medicine at Northwestern University, raised concerns of fabrication in two figures in a paper on the regulation of insulin synthesis that had been submitted to the Journal of Biological Chemistry. An inquiry committee at the university unanimously concluded that research misconduct charges against Mauvais-Jarvis were not credible.
But then a third figure in the manuscript was found to be “inaccurate,” and the university initiated a second inquiry. That’s when Mauvais-Jarvis — whose papers have been cited more than 2,000 times, according to Clarivate Analytics’ Web of Science, formerly part of Thomson Reuters — initiated a lawsuit.
This weekend, Carlo Croce had some reprieve from the misconduct accusations that have followed him for years (recently described in a lengthy article in the New York Times) and that have prompted his university to re-open an investigation. On Sunday, he received a prestigious award from the American Association for Cancer Research, honoring his work.
But the moment may have been short-lived. Today, Croce received two expressions of concern (EOCs) from PNAS for two well-cited papers published over a decade ago, on which Croce — chair of the Department of Cancer Biology and Genetics at The Ohio State University (OSU) — is last author. The two EOCs cite concerns over duplicated bands. What’s more, another journal recently decided to retract one of his papers, citing figures that didn’t represent the results of the experiments.
PNAS chose to issue EOCs, rather than retractions or corrections, because the authors didn’t agree that the bands were duplicated, according to executive editor Diane Sullenberger. She explained how the journal learned of the issues with the two papers.
By now, most of our readers are aware that some fields of science have a reproducibility problem. Part of the problem, some argue, is the publishing community’s bias toward dramatic findings — namely, studies that show something has an effect on something else are more likely to be published than studies that don’t.
Many have argued that scientists publish such data because that’s what is rewarded — by journals and, indirectly, by funders and employers, who judge a scientist based on his or her publication record. But a new meta-analysis in PNAS suggests the picture is a bit more complicated than that.
In a paper released today, researchers led by Daniele Fanelli and John Ioannidis — both at Stanford University — suggest that the so-called “pressure to publish” does not appear to bias studies toward larger effect sizes. Instead, the researchers argue that other factors were a bigger source of bias, namely the use of small sample sizes (which can yield skewed samples that show inflated effects) and the relegation of studies with smaller effects to the “gray literature,” such as conference proceedings, PhD theses, and other less publicized formats.
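The small-sample point can be illustrated with a quick simulation. Below is a minimal sketch (not the paper's own analysis, and with hypothetical parameters): many studies estimate the same modest true effect, but among the studies that clear a significance threshold, the small-sample ones report much larger effect estimates, because only lucky, overshooting samples can reach significance when the standard error is large.

```python
import random
import statistics

random.seed(42)
TRUE_EFFECT = 0.2  # hypothetical true standardized effect size


def significant_estimates(n, trials=20000):
    """Simulate one-sample studies of size n and return only the effect
    estimates that reach 'significance' (observed mean beyond 1.96
    standard errors above zero)."""
    se = 1 / n ** 0.5  # standard error of the mean, assuming unit variance
    estimates = [random.gauss(TRUE_EFFECT, se) for _ in range(trials)]
    return [d for d in estimates if d > 1.96 * se]


small = significant_estimates(n=10)    # small studies
large = significant_estimates(n=200)   # large studies

# Published (significant) small studies overstate the effect;
# large studies land close to the true value of 0.2.
print(round(statistics.mean(small), 2))
print(round(statistics.mean(large), 2))
```

The mechanism is simple truncation: conditioning on significance selects the upper tail of the sampling distribution, and that tail sits far above the true effect when n is small.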
However, Ferric Fang of the University of Washington — who did not participate in the study — approached the findings with some caution:
The chair of a biology department who has faced years of misconduct accusations has taken another hit—a lengthy correction due to text “overlap” between one of his PNAS papers and six other articles.
According to the correction, a reader contacted the journal to notify the editors that text and sentences in multiple sections of the 2015 paper — on which Carlo Croce is last author — were lifted from other sources without quotation marks.
This is the second correction for Croce in PNAS regarding overlap issues in just the last few weeks—the first was published on March 7 (see here). In both instances, PNAS did not call the textual similarities plagiarism, but the notice details multiple instances of overlap.
Croce, the chair of the department of cancer biology and genetics at The Ohio State University (OSU), is no stranger to controversy.
Last week, a study brought into question years of research conducted using the neuroimaging technique functional magnetic resonance imaging (fMRI). The new paper, published in PNAS, particularly raised eyebrows for suggesting that the rates of false positives in studies using fMRI could be up to 70%, which may affect many of the approximately 40,000 studies in the academic literature that have so far used the technique. We spoke to Anders Eklund, of Linköping University in Sweden, who was the first author of the study.
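For readers unfamiliar with why false-positive rates in neuroimaging can climb so high, here is a much simpler illustration than the paper's actual finding (which concerned flawed statistical assumptions in cluster-level correction methods, not merely uncorrected testing): when a study runs many independent tests on pure noise, the chance that at least one crosses the nominal 5% threshold grows rapidly with the number of tests.

```python
import random

random.seed(0)


def any_false_positive(n_tests, alpha=0.05):
    """One simulated 'study' testing n_tests independent null effects at
    threshold alpha; True if at least one crosses it (a family-wise
    false positive)."""
    return any(random.random() < alpha for _ in range(n_tests))


# Family-wise false-positive rate vs. number of tests per study.
for n_tests in (1, 20, 100):
    rate = sum(any_false_positive(n_tests) for _ in range(5000)) / 5000
    print(n_tests, round(rate, 2))
```

With one test the rate stays near 5%, but with 100 uncorrected tests nearly every null study produces a "finding" — which is why fMRI analyses, with tens of thousands of voxels, depend so heavily on correction methods being valid.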
Researchers are retracting two papers about molecular signalling in plants — including one from the Proceedings of the National Academy of Sciences (PNAS) — after discovering some inadvertent genotyping errors. But that was only after they used the problematic plants for an entire year without realizing they’d made a mistake.
In a pair of refreshingly transparent and detailed notices, the authors explain that the transgenic plants used in the papers contained genotyping errors, which invalidated their findings. According to the notices, first author Man-Ho Oh generated the problematic transgenic plants, while corresponding author Steven C. Huber, based at the University of Illinois, Urbana-Champaign (UIUC), took responsibility for the lapse in oversight.
Huber told us that there were only two papers that used the transgenic plants in question, so no other retractions will be forthcoming.
A PNAS paper that caught the media’s attention for suggesting that adding silk could stabilize vaccines and antibiotics has been pulled after the authors realized there were significant errors in the data analysis.
According to the notice, the authors agreed to retract the 2012 paper; however, the corresponding author told us the authors did not believe a retraction was necessary because, in his view, the conclusions remained valid.
The paper presented a solution to the long-standing problem that sensitive biological compounds such as vaccines and antibiotics begin to lose their effectiveness outside the recommended temperature range, and naturally biodegrade over time. The degradation process cannot be reversed, and may even speed up during transport or storage under less ideal temperatures.
The study, published today in the Proceedings of the National Academy of Sciences, used data from the psychology replication project, which found that only 39 out of 100 experiments lived up to their original claims. The authors conclude that more “contextually sensitive” papers — those whose background factors are more likely to affect their replicability — are slightly less likely to be reproduced successfully.
They summarize their results in the paper:
Former University of Tokyo endocrinologist Shigeaki Kato recently earned another retraction, for a paper in Archives of Biochemistry and Biophysics that contained image manipulation. As we’ve noted before, Kato resigned from the university in 2012 as it investigated his work for misconduct; in 2013 a Japanese newspaper reported that the investigation had found 43 papers from his lab contained “likely altered or forged materials.”
In addition to the new retraction, we’ve dug up four others for Kato from the past few years, plus one correction. Two of the retraction notices mention an investigation at the University of Tokyo.
First, the retraction note for “Multiple co-activator complexes support ligand-induced transactivation function of VDR,” published in December: