JAMA takes all calls for retraction seriously — even from PETA

A leading medical journal is taking a second look at a recent high-profile paper about elephants’ lower risk of cancer, after receiving a call for retraction from a somewhat unusual corner: the animal rights group PETA.

This isn’t the first time the activist group has called for a retraction — last year, it nudged a journal to pull a paper that had been flagged for fraud by the U.S. Office of Research Integrity. Its latest target: a 2015 paper in JAMA, which PETA claims contains inaccurate information.

What’s more, the organization argues, Ringling Bros. and Barnum & Bailey Circus — which partly funded the research — is using the findings as “justification for the continued use of abusive training techniques with elephants.” Yesterday, PETA sent a letter to the journal asking it to either retract the paper or issue an expression of concern, claiming:

Fraudster’s colleague faked data, too

A week after announcing that a researcher formerly at the University of Chicago had faked the results of more than 70 experiments, the U.S. Office of Research Integrity announced yesterday that one of his colleagues also falsified data.

According to the ORI, Karen D’Souza …

What does “reproducibility” mean? New paper seeks to standardize the lexicon

What is the difference between “reproducible” and “replicable”? And how does each relate to results that are “generalizable” and “robust”?

Researchers are using these terms interchangeably, creating confusion over what exactly is needed to confirm a scientific result, argues a new paper published today in Science Translational Medicine.

Here’s how the US National Science Foundation (NSF) defines “reproducibility,” according to the authors:

Neuro journal pulls article for data theft, prompts misconduct probe


Neuroscientists have retracted a research letter less than two months after it appeared, admitting they appeared to pass off others’ data as their own.

Two of the researchers are listed as affiliated with the University of California, San Francisco (UCSF), and the incident has led to a misconduct investigation at the institution, a UCSF spokesperson told us.

The article, “DNAJC6 variants in Parkinson’s disease and amyotrophic lateral sclerosis,” appeared in April. It was quickly followed by this notice, dated in May:

Economists go wild over overlooked citations in preprint on prenatal stress

Citation omissions in an economics preprint have set off a wave of recrimination and speculation on a widely read economics discussion board.

Commenters accuse the authors of purposely omitting citations that would have undermined the paper’s claims to novelty and its contributions to the field, amid considerable acrimony and personal attacks. Economists Petra Persson at Stanford and Maya Rossin-Slater at the University of California, Santa Barbara told us they hadn’t been familiar with the omitted papers when they first posted their preprint, but said their work remains distinct from these previous studies. Nevertheless, the two quickly updated the preprint of their paper – accepted by the top-tier economics journal American Economic Review – to include additional citations. An editor at the journal said it’s not unusual for authors to request such changes before publication, and dismissed the accusations made on the discussion board, calling the site “not a legitimate source of information.”

The study, “Family Ruptures, Stress, and the Mental Health of the Next Generation,” used data from Swedish national databases to compare mental health outcomes of people born to women who lost a relative while pregnant and women who lost a relative in the first year after giving birth.

How should journals update papers when new findings come out?


When authors get new data that revise a previous report, what should they do?

In the case of a 2015 lung cancer drug study in the New England Journal of Medicine (NEJM), the journal published a letter to the editor with the updated findings.

Shortly after the paper was published, a pharmaceutical company released new data showing the drug wasn’t quite as effective as it had seemed. Once the authors included the new data in their analysis, they adjusted their original response rate of 59% — hailed as one of a few “encouraging results” in an NEJM editorial at the time of publication — to 45%, as they write in the letter. One of the authors told us they published the 2015 paper using less “mature” data because the drug’s benefits appeared so promising, raising questions about when to publish “exciting but still evolving data.”

It’s not a correction, as the original paper has not been changed; it doesn’t even contain a flag that it’s been updated. But among the online letters about the paper is one from the authors, “Update to Rociletinib Data with the RECIST Confirmed Response Rate,” which provides the new data and backstory:


Duplicated data gets corrected — not retracted — by psych journal

Psychological Science is correcting a paper for reusing data. The editor told us the paper is a “piecemeal publication,” not a duplicate, and is distinct enough from the previous article that it is not “grounds for retraction.”

The authors tracked the health and mood of 65 patients over nine weeks. In one paper, they concluded that measures of physical well-being and psychosocial well-being positively predict one another; in the other (the now-corrected paper), they concluded that health and mood (along with positive emotions) influence each other in a self-sustaining dynamic.

As a press release for the now-corrected paper put it:

Heart researcher faked 70+ experiments, 100+ images

A former researcher at the University of Michigan and the University of Chicago faked dozens of experiments and images over the course of six years, according to a new finding from the Office of Research Integrity (ORI).

Ricky Malhotra, who studied heart cells, admitted to committing misconduct at both institutions, the ORI said in its report of the case. The fakery involved three National Institutes of Health (NIH) grant applications, one NIH progress report, one paper, seven presentations, and one image file. “Despite an investigation at the University of Michigan, where Malhotra was from 2005-2006, he continued this falsification at [University of Chicago], after the [University of Michigan] research misconduct investigation was completed,” according to the ORI. The agency found that he …

Context matters when replicating experiments, argues study

Background factors such as culture, location, population, or time of day affect the success rates of replication experiments, a new study suggests.

The study, published today in the Proceedings of the National Academy of Sciences, used data from the psychology replication project, which found that only 39 out of 100 experiments lived up to their original claims. The authors conclude that more “contextually sensitive” papers — those whose results are more likely to be affected by background factors — are slightly less likely to be reproduced successfully.

They summarize their results in the paper:


Editors say they missed “fairly obvious clues” of third party tampering, publish fake peer reviews


The editors of a journal that recently retracted a paper after the peer-review process was “compromised” have published the fake reviews, along with additional details about the case.

In the editorial, titled “Organised crime against the academic peer review system,” Adam Cohen and the other editors of the British Journal of Clinical Pharmacology say they missed “several fairly obvious clues that should have set alarm bells ringing.” For instance, the glowing reviews from supposed high-profile researchers at Ivy League institutions were returned within a few days, were riddled with grammar problems, and came from reviewers with no previous publications.

The case is one of many we’ve seen recently in which papers are pulled due to the actions of a third party.

The paper was submitted on August 5, 2015. From the beginning, the timing was suspect, note Cohen — the director of the Centre for Human Drug Research in the Netherlands — and his colleagues: