After being accused of falsifying three figures in a submitted manuscript, Mauvais-Jarvis sued his accusers and officials at his former employer — Northwestern University — for defamation and conspiracy in 2011.
In 2014, a judge dismissed the suit. We wish we could tell you more about it—such as what the university’s misconduct investigation found, or how the lawsuit was ultimately resolved—but those details remain shrouded in mystery. What we know is based on court records from the lawsuit, which we recently obtained through an unrelated public records request. Even without all the details, it’s a long, sordid tale, involving a lot of finger-pointing and allegations of misconduct.
In 2008, a former research technician in the lab of Mauvais-Jarvis, then an associate professor of medicine at Northwestern University, raised concerns of fabrication in two figures in a paper on the regulation of insulin synthesis that had been submitted to the Journal of Biological Chemistry. An inquiry committee at the university unanimously concluded that research misconduct charges against Mauvais-Jarvis were not credible.
Last July, Joseph Hilgard, a postdoctoral fellow at the Annenberg Public Policy Center at the University of Pennsylvania, saw an article in Gifted Child Quarterly that made him do a double take. Hilgard, who is studying the effects of violent media on aggressive behavior, said the results of the 2016 paper “caused me some alarm.”
The research—led by corresponding author Brad J. Bushman, a professor of communication and psychology at The Ohio State University (OSU)—showed that gifted and non-gifted children’s verbal skills dropped substantially after watching 12 minutes of a violent cartoon. The violent program had a greater impact on the gifted children, temporarily eliminating the pre-video verbal edge they displayed over their non-gifted peers.
To Hilgard, the results suggested that violent media can actually impair learning and performance. But the effect size was huge — so big, Hilgard thought it had to be a mistake. This, plus other questions, prompted Hilgard to contact the authors and the journal. Unfortunately, once he got a look at the data — collected by a co-author in Turkey who became unreachable after the recent coup attempt — the questions didn’t go away. So the journal decided to retract the paper.
Bushman’s body of work has continually supported the idea that violent media increases aggressive behavior, including a controversial 2012 study “Boom, Headshot!” that was retracted earlier this year.
In an unusual turn of events, a nutrition paper has come back to life a year after being pulled from its original publication.
After the paper was retracted from the journal Obesity, the authors revised it and republished it in another journal, Pediatric Obesity. Both journals are published by Wiley. The second version of the paper doesn’t mention the previous retraction. Indeed, the journal editor told us he didn’t know the paper had been retracted. Still, he stood by his decision to publish it.
The authors told us the paper was retracted after editors at Obesity raised concerns over the authors’ methodology. The authors revised the paper, adding some analysis and explanation of their methodological approach, and said the new version was accepted by peer reviewers before being published in Pediatric Obesity.
However, an outside expert who reviewed both papers for us said he thinks the authors didn’t change enough. According to Patrick McKnight, head of the Measurement, Research methodology, Evaluation, and Statistics group at George Mason University and a Statistical Advisory Board member of STATS.org:
When zoologists at the University of Oxford published findings in Science last year suggesting ducklings can learn to identify shapes and colors without training (unlike other animals), the news media was entranced.
However, critics of the study have published a pair of papers questioning the findings, saying the data likely stem from chance alone. Still, the critics told us they don’t believe the paper should be retracted.
If a duckling is shown an image, can it pick out another from a set that has the same shape or color? Antone Martinho III and Alex Kacelnik say yes. In one experiment, 32 out of 47 ducklings preferred pairs of shapes they were originally shown. In the second experiment, 45 out of 66 ducklings preferred the original color. The findings caught the attention of many media outlets, including the New York Times, The Atlantic, and BuzzFeed.
A department chair at a Swedish university has asked to retract a 2010 study in Diabetes after none of the authors could explain image-related ambiguities.
The matter attracted particular attention because the paper’s first author, Pontus Almer Boström, had been found guilty of scientific misconduct by the University of Gothenburg in 2012, after department chair Jan Borén noted some irregularities in data calculated by Boström. At that point, the research group combed the data to identify further issues arising from Boström’s work, and didn’t find any.
But last summer, when a user on PubPeer raised questions about some of the images in the 2010 paper, the matter was brought back into focus. According to Borén, they found “no evidence of scientific misconduct in this study.” But Boström had left the university in 2009 and could not be reached, the corresponding author had passed away, and the remaining co-authors hadn’t stayed in the field. So Borén decided it would be best to retract the study to avoid any “lingering questions.”
Although it’s the right thing to do, it’s never easy to admit error — particularly when you’re an extremely high-profile scientist whose work is being dissected publicly. So while it’s not a retraction, we thought this was worth noting: A Nobel Prize-winning researcher has admitted on a blog that he relied on weak studies in a chapter of his bestselling book.
It’s been a busy few months for Brian Wansink, a prominent food researcher at Cornell University. A blog post he wrote in November prompted a huge backlash from readers who accused him of using problematic research methods to produce questionable data, and a group of researchers suggested four of his papers contained 150 inconsistencies. The scientist has since announced he’s asked a non-author to reanalyze the data — a researcher in his own lab. Meanwhile, criticisms continue to mount. We spoke with Wansink about the backlash, and how he hopes to answer his critics’ questions.
Retraction Watch: Why not engage someone outside your lab to revalidate the analysis of the four papers under question?
To Brian Wansink of Cornell University, a blog post he wrote in November 2016 was meant as a lesson in productivity: A graduate student who was willing to embrace every research opportunity submitted five papers within six months of arriving in his lab, while a postdoc who declined two chances to analyze a data set left after one year with a small fraction of the grad student’s publications.
But two months and nearly 50 comments on the post later, Wansink — known for so much high-profile nutrition research he’s been dubbed the “Sherlock Holmes of food” — has announced he’s now reanalyzing the data in the papers, and will correct any issues that arise. In the meantime, he had to decline requests to share his raw data, citing its proprietary nature.
After a years-long dispute over a 2012 paper that suggested there might be some effects of first-person shooter video games on players, the journal Communication Research has retracted the paper.
The stated reason in the notice: Some outside researchers spotted irregularities in the data and contacted the corresponding author’s institution, Ohio State University, in 2015. Since the original data were missing, the journal retracted the paper, with the corresponding author’s okay.
A journal has retracted a surgery study by researchers at Brown University after noticing it included data that were not intended for research purposes. (Incidentally, the data were collected by the publisher of the journal.)
Ingrid Philbert, managing editor of the Journal of Graduate Medical Education — which published the paper — told Retraction Watch that senior staff at the publisher alerted the journal that they suspected the authors had used data from a confidential source:
This is a fairly new set of case log data, and as the collector [of] the data, the [Accreditation Council for Graduate Medical Education (ACGME)] gets to determine the use and it has decreed that this data be used solely for accreditation decisions.
Philbert said the journal asked the authors where they got the data: