Retraction Watch readers may be familiar with the name Piero Anversa. Until several years ago, Anversa, a scientist at Harvard Medical School and the Brigham and Women’s Hospital, was a powerful figure in cardiac stem cell research.
In 2015, a group of researchers based in Spain decided to write a review article on high blood pressure. But when they looked over eight articles co-authored by the same person, they noticed some undeniable similarities.
Over the last few years, Giuseppe Derosa, based at the University of Pavia in Italy, has racked up 10 retractions after journals determined he’d published the same material multiple times. But there’s much more to this story: The researchers in Spain (led by Luis Carlos Saiz of the Navarre Regional Health Service in Pamplona) kept digging into his publication record, and have since identified dozens of additional potential duplicates. Although the researchers alerted journals to these potentially problematic papers in 2015, most have not taken action; recently, two journals published by Taylor & Francis flagged 12 of Derosa’s articles, three of them among those Saiz and colleagues had flagged in 2015.
Now, Saiz is telling his story — and why duplication of medical research matters:
What Caught Our Attention: We’ve written about the controversy surrounding a commonly used tool for measuring whether patients are sticking to their drug regimens, known as the Morisky Medication Adherence Scale (MMAS-8). It can cost thousands of dollars to use — and using it without payment or permission earns researchers a call from a collector, who has used legal threats to compel multiple teams to withdraw their papers (a phenomenon we wrote about in Science). The tool’s creator argues that it is copyrighted, and that demanding fees ensures researchers use it properly, which avoids putting patients at risk. We’ve found a notice (paywalled, tsk-tsk) revealing that another group of authors used the tool without permission and, according to the notice, “incorrectly.”
An anatomy journal has banned a researcher from submitting papers for three years after determining one of his recently published papers suffered from “serious ethical” issues.
According to Jae Seung Kang, associate editor at the journal Anatomy and Cell Biology (ACB), the paper’s sole author—Jae Chul Lee—falsified both his affiliation and approval for conducting animal experiments in the paper, published online in March.
When several recent submissions raised a red flag, a pediatrics journal decided to investigate. The journal, Pediatrics in Review, discovered “citation and attribution errors” in three case studies, which the journal has now retracted.
Luann Zanzola, the journal’s managing editor, explained that the editors caught the errors when they scanned the three papers—one published in 2014 and two in 2015—using the plagiarism-detection software iThenticate. Zanzola told us that the three case studies “were flagged for high iThenticate scores,” and when the authors could not adequately explain the amount of text overlap, the editors retracted the papers.
In June, Gene Emery, a journalist for Reuters Health, was assigned to write a story about an upcoming paper in the Journal of the American College of Cardiology, set to come off embargo and be released to the public in a few days. Pretty quickly, he noticed something seemed off.
Emery saw that the data presented in the tables of the paper — about awareness of the problem of heart disease among women and their doctors — didn’t seem to match the authors’ conclusions. For instance, on a scale of 1 to 5 rating preparedness to assess female patients’ risk (with 5 being the most prepared), 64% of doctors answered 4 or 5; yet the paper said “only a minority” of doctors felt well-prepared (findings echoed in an accompanying press release). On Monday, June 19, four days before the paper was set to publish, Emery told the corresponding author — C. Noel Bairey Merz, Medical Director of the Women’s Heart Center at Cedars-Sinai in Los Angeles — about the discrepancy; she told him to rely on the data in the table.
But the more Emery and his editors looked, the more problems they found with the paper. They alerted the journal hours before it was set to publish, hoping that was enough to halt the process. It wasn’t.