Authors to correct influential Imperial College COVID-19 report after learning it cited a withdrawn preprint

A March paper by researchers at Imperial College London that, in the words of the Washington Post, “helped upend U.S. and U.K. coronavirus strategies,” cited a preprint that had been withdrawn.

Retraction Watch became aware of the issue after being contacted by a PubPeer commenter who had noted the withdrawal earlier this month. Following questions from Retraction Watch this weekend, the authors said they plan to submit a correction.

In March, the New York Times wrote:


Elsevier investigating hydroxychloroquine-COVID-19 paper

Elsevier has weighed in on the handling of a controversial paper about the utility of hydroxychloroquine to treat COVID-19 infection, defending the rigor of the peer review process for the article in the face of concerns that the authors included the top editor of the journal that published the work.

On April 3, as we reported, the International Society of Antimicrobial Chemotherapy issued an expression of concern (without quite calling it that) about the paper, which had appeared in March in the International Journal of Antimicrobial Agents, which the ISAC publishes along with Elsevier. According to the society, the article, by the controversial French scientist Didier Raoult, of the University of Marseille, and colleagues:


Hydroxychloroquine-COVID-19 study did not meet publishing society’s “expected standard”

The paper that appears to have triggered the Trump administration’s obsession with hydroxychloroquine as a treatment for infection with the novel coronavirus has received a statement of concern from the society that publishes the journal in which the work appeared. 

The April 3, 2020, notice, from the International Journal of Antimicrobial Agents, states that the March 20 article, “Hydroxychloroquine and azithromycin as a treatment of Covid-19: results of an open-label non-randomized clinical trial”:


Former star cancer researcher who sued his university for discrimination notches eighth retraction

Jasti Rao, who once earned $700,000 a year at the University of Illinois College of Medicine at Peoria and was named the first “Peorian of the Year” before a misconduct investigation put an end to his time there, has now lost eight papers.

Rao’s case is among the more colorful that we’ve covered. A highly regarded cancer specialist, Rao was caught up in a morass of misdeeds, including not only plagiarism and manipulation of data but gambling and behavior tantamount to extortion of his employees. As we reported in 2018:


“I was shocked. I felt physically ill.” And still, she corrected the record.


Two years ago, Julia Strand, an assistant professor of psychology at Carleton College, published a paper in Psychonomic Bulletin & Review about how people strain to listen in crowded spaces (think: when they’re doing the opposite of social distancing).

The article, titled “Talking points: A modulating circle reduces listening effort without improving speech recognition,” was a young scientist’s fantasy — splashy, fascinating findings in a well-known journal — and, according to Strand, it gave her fledgling career a jolt. 

The data were “gorgeous,” she said, initially replicable and well-received: 


A ‘Cat Tale’: A story of how flawed science formed the basis of policy

On the surface, it would seem like a good thing when science undergirds policy decisions. But what if that science is deeply flawed? Craig Pittman, an award-winning journalist at the Tampa Bay Times and author of four books, writes that his new book Cat Tale: The Wild, Weird Battle to Save the Florida Panther is “a tale of raw courage, of scientific skulduggery and political shenanigans, of big-money interests versus what’s right for everyone.” In this excerpt, Pittman explains what happened — and what didn’t — after a group of scientists known as the Science Review Team (SRT) found serious problems in research used to support regulatory policies involving panthers.

In 2003, the SRT released a report containing its verdict. As you might guess, it ripped apart Maehr’s work, piece by piece, and yes, they called him out by name. They didn’t label him a fraud, but they made it clear that Dr. Panther had done some pretty shady things.

Because they were scientists, they didn’t scream out their find­ings in impassioned prose. They were cool and calm—but there was no mistaking what they were saying.


‘Harming the scientific process:’ An attempt to correct the sports science literature, part 3


Why is it so difficult to correct the scientific record in sports science? In the first installment in this series of guest posts, Matthew Tenan, a data scientist with a PhD in neuroscience, began the story of how he and some colleagues came to scrutinize a paper. In the second, he explained what happened next. In today’s final installment, he reflects on the editors’ response and what he thinks it means for his field.

In refusing to retract the Dankel and Loenneke manuscript, which we showed to be mathematically flawed, the editors referred to “feedback from someone with greater expertise” and included the following:


‘A flawed decision:’ What happened when sports scientists tried to correct the scientific record, part 2


Why is it so difficult to correct the scientific record in sports science? In the first installment in this series of guest posts, Matthew Tenan, a data scientist with a PhD in neuroscience, began the story of how he and some colleagues came to scrutinize a paper. In this post, he explains what happened next.

The journal Sports Medicine is widely considered one of the top journals – if not the top journal – in the fields of sport science, exercise science and physical education. This journal is managed by two professional editors who do not hold PhDs in the journal’s subject area but are generally versed in the topic and have the goal of managing a successful journal for SpringerNature.

The manuscript by Dankel and Loenneke was reviewed by three reviewers. I know this because I was one of them and, as noted in the first post in this series, I strongly advised against its publication. Greg Atkinson, a practicing scientist in the area of health sciences, has publicly stated, in a private Facebook group, that he was one of the reviewers who recommended the paper be published. I, Atkinson, and the senior author on the manuscript, Loenneke, all sit on the editorial board of the journal Sports Medicine. And while the paper published in the journal by Dankel and Loenneke proposes a novel statistical method, neither of the two authors of the manuscript, nor I, nor Atkinson, holds a PhD in statistics. The published paper does not cite a single statistics journal in the course of reporting its “novel method.”

What could go wrong, right?


Why — even after reforms for an episode involving bad statistics — is it so difficult to correct the sports medicine literature? Part 1


Two years ago, following heated debate, a sports science journal banned a statistical method from its pages, and a different journal — which had published a defense of that method earlier — decided to boost its statistical chops. But as Matthew Tenan, a data scientist with a PhD in neuroscience, relates in this three-part series, that doesn’t seem to have made it any easier to correct the scientific record. Here’s part one.

In July 2019, my colleague Andrew Vigotsky contacted me. He was curious, he said, whether a paper published in Sports Medicine had undergone statistical review because he was concerned about some of its claims. The link he sent me was to “A Method to Stop Analyzing Random Error and Start Analyzing Differential Responders to Exercise,” a paper published on June 28, 2019, by Scott Dankel and Jeremy Loenneke.

As it happened, I knew that paper, and I had also expressed concerns about it – when I reviewed it before publication as one of the members of the journal’s editorial board. Indeed, I was brought on to the editorial board of Sports Medicine because the journal had recently received a lot of bad press for publishing a paper about another “novel statistical method” with significant issues, and because I had been a vocal critic of the sports medicine and sport science fields’ tendency to develop their own statistical methods, which are neither used outside the field nor validated by the wider statistics community.


Journal founded by Hans Eysenck issues expressions of concern for his papers, despite calls by university to retract


Bucking the advice of university investigators, a journal founded by Hans Eysenck has issued expressions of concern — not retractions — for three articles by the deceased psychologist whose work has been dogged by controversy since the 1980s. 

The move comes barely a week after other journals opted to retract 13 papers by Eysenck, who died in 1997. Those retractions were prompted by the findings of a 2019 investigation by King’s College London, where Eysenck worked until 1983. That inquiry concluded that: 
