Archive for the ‘data issues’ Category
A journal has retracted a surgery study by researchers at Brown University after noticing it included data that were not intended for research purposes. (Incidentally, the data were collected by the publisher of the journal.)
Ingrid Philibert, managing editor of the Journal of Graduate Medical Education — which published the paper — told Retraction Watch that senior staff at the publisher alerted the journal that they suspected the authors had used data from a confidential source:
This is a fairly new set of case log data, and as the collector [of] the data, the [Accreditation Council for Graduate Medical Education (ACGME)] gets to determine the use and it has decreed that this data be used solely for accreditation decisions.
Philibert said the journal asked the authors where they got the data:
Following a journal probe and questions on PubPeer about their work, authors in Spain have issued four corrections, citing missing raw data for experiments conducted more than 10 years ago.
All papers include the same last two authors, Mireia Duñach at the Autonomous University of Barcelona, and Antonio García de Herreros at the Institut Hospital del Mar d’Investigacions Mèdiques.
Three of the corrections were issued by the Journal of Biological Chemistry, from which the authors retracted three papers earlier this year after a journal investigation concluded they contained reused images designed to represent different experiments.
Duñach told us she initiated the latest corrections herself:
An ecologist in Australia realized a database he was using to spot trends in extinction patterns was problematic, affecting two papers. One journal issued an expression of concern, which has since turned into a retraction. So far, the other journal has left the paper untouched.
The now-retracted paper concluded that medium-sized species on islands tend to go extinct more often than large or small mammalian species. But a little over a year ago, Biology Letters flagged the paper with an expression of concern (EOC), noting “concerns regarding the validity of some of the data and methods used in the analysis.”
A journal posted an abstract online suggesting a link between vaccines and autism. After a firestorm of criticism, it removed the abstract, saying it was going to be re-reviewed. Now, the journal has decided to formally reject it.
As we reported last month, Frontiers in Public Health removed the abstract after it sparked criticism on social media. After doing so, the journal released a public statement claiming that the paper was “provisionally accepted but not published,” noting that the journal had reverted it to peer review to ensure it was re-reviewed.
Now, Gearóid Ó Faoleán, ethics and integrity manager at Frontiers (the journal’s publisher), told Retraction Watch that after consultation with an external expert, the journal has rejected the paper.
If you think something is amiss with your data, running an experiment again to figure out what’s going on is a good move. But it’s not always possible.
A team of researchers in Seoul recently found themselves in a bind when they needed to check their work, but were out of a key substance: breast milk.
The shortage led them to retract their 2016 paper on a micronutrient found in breast milk that helps protect infants’ retinas. “Association between lutein intake and lutein concentrations in human milk samples from lactating mothers in South Korea” was published online last spring in the European Journal of Clinical Nutrition.
Here’s the retraction notice:
The move comes after a group of researchers alleged the paper was missing data and the authors had followed a problematic methodology. In September, however, the co-authors’ institution, Uppsala University in Sweden, concluded there wasn’t enough evidence to launch a misconduct investigation.
The University of Tokyo is investigating a 2011 stem cell paper in Cell Cycle, recently retracted over irregularities in four figures.
The university has confirmed there is an investigation, but would not specify which paper it concerned; the corresponding author on the paper, however, confirmed to us that it is the focus of the investigation.
For starters, the first author — Maria Riccardi of the National Research Council of Italy-Institute for Agricultural and Forest Systems in the Mediterranean (CNR-ISAFOM) in Ercolano, Naples, Italy — apparently submitted the paper without consulting the study’s four other listed co-authors. What’s more, according to the retraction notice in Scientia Horticulturae, the paper’s description of the experiment “does not reflect the real conditions under which the data was collected,” rendering the findings invalid.
Two papers evaluating glucose meters — used by diabetics to monitor blood sugar levels — suggested that a couple of the devices don’t work as well as they should. Perhaps unsurprisingly, the companies that sell those meters objected to how the studies were conducted. By all accounts, the companies appear to be justified in their complaints.
In both cases, researchers used blood drawn from veins to test the meters. But manufacturers of the WaveSense JAZZ and GlucoRx glucose meters said their devices are designed to work with fresh blood from a finger-prick. Both papers have now been retracted.
The retraction notice for “Technical and clinical accuracy of five blood glucose meters: clinical impact assessment using error grid analysis and insulin sliding scales,” published in 2015 in the Journal of Clinical Pathology, hints at the issue:
A new analysis of more than 30 clinical trials co-authored by a bone researcher based in Japan is casting doubt on the legitimacy of the findings.
Yoshihiro Sato, based at Mitate Hospital, has already retracted 12 papers, for reasons ranging from data problems, to including co-authors without their consent, to self-plagiarism. Most of these retracted papers are included in the analysis in the journal Neurology, which concluded that Sato’s 33 randomized clinical trials exhibited patterns that suggest systematic problems with the results.
Other researchers have used similar approaches to analyze a researcher’s body of work — notably, John Carlisle applied statistical tools to uncover problems in the research of notorious fraudster Yoshitaka Fujii, and Uri Simonsohn sniffed out problems with the work of social psychologist Dirk Smeesters.