Retraction Watch

Tracking retractions as a window into the scientific process

“Think of the unthinkable:” JAMA retraction prompts author to urge others to share data


A few months ago, a researcher told Evelien Oostdijk there might be a problem with a 2014 JAMA study she had co-authored.

The study had compared two methods of preventing infection in the intensive care unit (ICU). But a separate analysis had produced different results.

Oostdijk, from the University Medical Center Utrecht in The Netherlands, immediately got to work to try to figure out what was going on. And she soon discovered the problem: The coding for the two interventions had been reversed at one of the 16 ICUs. This switch had “a major impact on the study outcome,” last author Marc Bonten, also from the University Medical Center Utrecht, wrote in a blog post about the experience yesterday, because it occurred at “one of the largest participating ICUs.”

When Oostdijk and a researcher not involved in the study analyzed the data again, they discovered a notable difference between the revised and original findings: The new analysis revealed that one of the interventions had a small but significant survival benefit over the other.

Oostdijk and Bonten, who supervised the re-analysis, notified their colleagues of the revised study outcomes and contacted the journal requesting a retraction and replacement, which was published yesterday in JAMA.

According to the notice of retraction and replacement:

The corrections for these errors indicate that the previously reported absence of statistically significant differences in secondary outcomes has been changed and the article now concludes: “Unit-wide application of [selective digestive decontamination (SDD)] and [selective oropharyngeal decontamination (SOD)] was associated with low levels of antibiotic resistance. Compared with SOD, SDD was associated with lower mortality, reduced length of stay, lower rates of ICU-acquired bacteremia and candidemia, and lower prevalence of rectal carriage of antibiotic resistant gram-negative bacteria, but a more pronounced gradual increase in aminoglycoside-resistant gram-negative bacteria.”

Overall, the primary outcome of the original study appears to be unchanged by the error. Both the revised and original study found:

SDD was associated with lower rectal carriage of antibiotic-resistant gram-negative bacteria and ICU-acquired bacteremia but a more pronounced gradual increase in aminoglycoside-resistant gram-negative bacteria.

“Effects of decontamination of the oropharynx and intestinal tract on antibiotic resistance in ICUs: a randomized clinical trial” has been cited 39 times, according to Clarivate Analytics’ Web of Science, formerly part of Thomson Reuters.

We’ve been seeing more journals opt to “retract and replace” in lieu of issuing a retraction or erratum. JAMA, in particular, is considering more “retract and replace” decisions for papers. Last year, Annette Flanagin, the executive managing editor for The JAMA Network, told us this option is appealing because it provides a mechanism:

… to address honest pervasive error (ie, unintentional human or programmatic errors that result in the need to correct numerous data and text in the abstract, text, tables and figures, such as a coding error) without the current stigma that is associated with retraction.

Flanagin told us why the journal opted to retract and replace this latest paper:

The authors informed us of errors due to misclassification at 1 of the 16 intensive care units (ICUs) included in the trial that was discovered after publication during a subsequent individual patient-level meta-analysis. The misclassification and errors were inadvertent, but they affected important secondary outcomes, and the underlying science was still valid.

She added that she believed the revised paper would also have been accepted by the journal if it had been the original submission.

Bonten concurred that he thought the revised version of the paper would have been accepted by JAMA, noting “the new outcome probably would have been more appealing” to the journal.

Bonten explains in his blog post how the error occurred:

Each month the ICU had delivered an Excel file containing admission and discharge dates, with hours and minutes in the same cell. To harmonize these data with those from other ICUs, the hours and minutes needed to be removed, requiring several copy-and-paste procedures. And that is where the human error occurred.
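For illustration, the manual copy-and-paste step described above could have been replaced by a scripted transformation, which applies the same rule to every cell and leaves no room for a slip. A minimal sketch in Python, with hypothetical column names and dates (the actual file layout is not public):

```python
from datetime import datetime

# Hypothetical rows from a monthly ICU export: dates carry hours and minutes
rows = [
    {"admission": "2012-01-03 14:25", "discharge": "2012-01-09 08:00"},
    {"admission": "2012-01-05 09:10", "discharge": "2012-01-07 16:45"},
]

# Strip the time component in one scripted pass instead of per-cell copy/paste
for row in rows:
    for col in ("admission", "discharge"):
        row[col] = datetime.strptime(row[col], "%Y-%m-%d %H:%M").date().isoformat()

print(rows[0]["admission"])  # 2012-01-03
```

A script like this also serves as a record of exactly what was done to the data, which a sequence of manual edits in a spreadsheet does not.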

More specifically:

…that the coding of the intervention (0 or 1) was reversed for one of the largest participating ICUs…

The small survival benefit associated with SDD that came to light in the revised study may help clarify the controversy surrounding which intervention is more effective. Bonten told us:

The debate between SDD and SOD has been ongoing for years, at least in the Netherlands. This correction provides SDD with a survival advantage that will certainly be used in the debate. We will present more data for this soon in the individual patient data meta-analysis.

Bonten explains in his blog post:

The good news is that there are now 2 large cluster-randomized cross-over studies in which SDD does better than something else (better than nothing in one and better than SOD in the other study). That reduces the likelihood that the benefits were due to differences in patient characteristics, which cannot be fully excluded in such trials. The results of the Individual Patient Data meta-analysis will appear soon.

In the blog, Bonten notes three key take-aways from this experience:

  1. Think of the unthinkable when it comes to checking the quality of your data.
  2. If you think that all possible has been checked, check again.
  3. Always let others use your original data for new (or just the same) analyses.
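The first two take-aways can be made concrete as an automated sanity check. One hedged sketch, with entirely hypothetical ICU names and assignments: compare the 0/1 intervention coding in the analysis file against an independently maintained roster of which unit ran which arm, the kind of cross-check that would have flagged the reversed coding before publication.

```python
# Hypothetical trusted roster: which intervention arm each ICU actually ran
roster = {"ICU_A": 1, "ICU_B": 0}

# Hypothetical records as coded in the analysis file
records = [
    {"icu": "ICU_A", "intervention": 1},
    {"icu": "ICU_B", "intervention": 1},  # mismatch: coding reversed for ICU_B
]

# Flag every ICU whose coded intervention disagrees with the roster
mismatches = [r["icu"] for r in records if roster[r["icu"]] != r["intervention"]]
print(mismatches)  # ['ICU_B']
```

The value of such a check is that the roster comes from a source independent of the harmonization pipeline, so an error introduced during data cleaning cannot silently propagate into both copies.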


Written by Victoria Stern

April 19th, 2017 at 12:45 pm

Comments
  • Rob Siebers April 19, 2017 at 3:01 pm

    Double entering of data might have detected the problem earlier

  • cghoogstraten April 19, 2017 at 4:35 pm

    4. Don’t use Excel for data analysis. Use software specifically designed for scientific data handling.

    • Cheryl April 20, 2017 at 2:57 am

Can’t say this loud enough. Data are handled best with database managers.

  • TL April 20, 2017 at 4:37 am

    This paper and its correction should probably be checked by a professional statistician. They report the length of the ICU stay after both interventions as having a median of 6 days, but SOD has an interquartile range of 4-10 days and SDD has an IQR of 4-11 days. This they massage into an odds ratio of 1.056 (95% CI 1.014-1.100), which turns into “reduced length of stay for SDD” in the conclusions? Oy vey!
