The last author of the article, published in the Journal of Bioethical Inquiry, told us that an “innocent mistake” and difficulty navigating a website led the authors to incorrectly claim that nine journals had not made their contents available through the World Health Organization’s Health InterNetwork Research Initiative (HINARI) database, which gives bioethicists in low- and middle-income countries access to research articles either free of charge or at reduced cost. The authors argued that the mistake didn’t affect the paper’s conclusions, but the journal disagreed and opted to pull the paper entirely.
After searching through the database, first author Subrata Chattopadhyay mistakenly determined that the journals had not made their contents available through HINARI, when in fact they were listed but on a different part of the website.
Last week, the International Committee of Medical Journal Editors proposed requiring authors to share deidentified patient data underlying the published results of clinical trials within six months of publication. The proposal has earned much support but also some concerns – for example, that other scientists might poach the findings, acting as what the New England Journal of Medicine dubbed “research parasites.” Elizabeth Wager, a member of the board of directors of our parent organization, disagrees with that concern but raises another issue – namely, the unintended consequences of data sharing on other, more effective initiatives to make reporting more transparent.
The recent proposal from the ICMJE may appear, at first glance, a positive step towards better clinical trial reporting. However, I’m concerned that this new requirement might undermine other more effective initiatives to increase the efficiency of research, such as the publication of protocols and full study reports. Here’s why.
All actions have costs, risks, and benefits: Making partial data sharing a condition of publication is no exception. The costs are hard to quantify but undoubtedly not trivial. Putting clinical data into a usable format and making it meaningful to other researchers requires considerable time and effort by knowledgeable people. To this must be added the costs of establishing and maintaining suitable repositories and of checking compliance.
The 2013 paper — now retracted by the American Journal of Infection Control — suggested a particular kind of connector between the catheter and the patient could reduce some of the notoriously deadly bloodstream infections associated with the procedure, according to a press release that publicized the work. But last year, the journal issued an expression of concern for the paper, noting there were questions about the data. The retraction note reveals that an investigation at Georgia Regents University — now known as Augusta University — began by looking into undisclosed conflicts of interest in the paper, and ultimately concluded the science was flawed.
After a group of researchers noticed an error that affected the analysis of a survey of psychologists working with medical teams to help pediatric patients, they didn’t just issue a retraction — they published a commentary explaining what exactly went wrong.
The error was discovered by a research assistant who was assembling a scientific poster, and noticed the data didn’t align with what was reported in the journal. The error, the authors note, was:
an honest one, a mistake of not reverse coding a portion of the data that none of the authors caught over several months of editing and conference calls. Unfortunately, this error led to misrepresentation and misinterpretation of a subset of the data, impacting the results and discussion.
Needless to say, these authors — who use their “lessons learned” to help other researchers avoid similar missteps — earn a spot in our “doing the right thing” category. The retraction and commentary both appear in Clinical Practice in Pediatric Psychology.
Errors in the interpretation of some of the data — the result of “procedural flaws” — are to blame for the retraction of a paper on a way to help skin grow back after injury.
The paper explores a method involving nanofibers. According to the abstract:
In this study, tilapia skin collagen sponge and electrospun nanofibers were developed for wound dressing…the collagen nanofibers stimulated the skin regeneration rapidly and effectively in vivo.
The paper was published January 19, 2015 in ACS Applied Materials and Interfaces, then retracted seven months later, in August. It has not been cited, according to Thomson Scientific’s Web of Knowledge.
When our co-founders launched the site in 2010, they wondered whether there would be enough retractions to write about on a regular basis. Five-plus years and three full-time staffers later, we simply don’t have the time to cover everything that comes across our desk.
In 2012, we covered a group of duplication retractions in a single post, simply because duplications happen so frequently (sadly) and often don’t tell an interesting story. So in the interest of bookkeeping, we’re picking up the practice again.
Here are five unrelated retractions for your perusal: all addressing duplications, in which the same – or mostly the same – authors published the same – or mostly the same – information in two different – or sometimes the same – journals.
Researchers are correcting a widely covered study that suggested chronic use of pot might not put users at risk of problems later in life.
It turns out that initial, unexpected finding — covered by Newsweek, The Washington Post, Quartz, and (of course) The Stoner’s Cookbook (now known as HERB) — wasn’t quite right, and a reanalysis found users had a small uptick in risk for psychosis. The authors have issued a lengthy correction in Psychology of Addictive Behaviors that includes some supplemental analysis, too.
Not surprisingly, the study’s findings engendered some controversy, which prompted the authors to reanalyze their data, collected from 408 males with varying levels of marijuana use, who were followed from their teens into their 30s.
Now, an American Psychological Association press release that accompanied the initial findings in August contains an editors note explaining why those aren’t quite correct:
We recently obtained court documents showing that, in September, a judge dismissed a lawsuit filed by cancer researcher Fazlul Sarkar against the University of Mississippi after it rescinded a job offer after reviewing concerns raised about his research on PubPeer.
Sarkar’s connection to PubPeer will be familiar to many readers — he has also taken the site to court to force them to reveal the identity of the anonymous commenters who have questioned his findings. He has accused the commenters of defamation, arguing they cost him the job offer. Today, the American Civil Liberties Union filed a brief on behalf of PubPeer’s appeal of the court’s most recent ruling, that the site must disclose the identity of an anonymous commenter. At the same time, some heavy hitters in science – Bruce Alberts and Harold Varmus — and technology — Google and Twitter — filed briefs in support of the appeal.
The lawsuit against Ole Miss has brought to light the reasoning behind the school’s decision to rescind their offer to Sarkar — and the key role played by the concerns raised on PubPeer.
In a letter dated June 19, 2014 to Sarkar from Larry Walker, the director of the National Center for Natural Products Research at the University of Mississippi, Walker chides Sarkar for not revealing the extent of the ongoing questions over his research during the interview process:
PLOS One is retracting a paper for overlapping with a Wikipedia page. And for containing material lifted from other sources. And for “language errors.” And for insufficient evidence that authors found the pathogens floating around in hospital air that they claimed to find.
The instances of plagiarism are a “huge problem,” each “enough for retraction on its own,” Jonathan Eisen, a microbiologist at the University of California, Davis, told us. Eisen, who posted several comments to the paper after its publication in October, added that the paper was “simply not technically sound.”