‘Immortal time bias’ fells JAMA journal asthma paper

via Lévesque et al, BMJ

One of the many fun things about reporting on retractions is that we get to expand our statistical knowledge. To wit, follow along as we explore the concept of immortal time bias.

A JAMA journal has retracted and replaced a paper by authors at the University of Massachusetts after another researcher identified a critical statistical error in their study. 

The paper, “Association of Antibiotic Treatment With Outcomes in Patients Hospitalized for an Asthma Exacerbation Treated With Systemic Corticosteroids,” was written by a group led by Mihaela Stefan, the associate director of the Institute for Healthcare Delivery and Population Science at UMass, and appeared in JAMA Internal Medicine in 2019. 

The study purported to find that:*

Antibiotic therapy may be associated with a longer hospital length of stay, higher hospital cost, and similar risk of treatment failure. These results highlight the need to reduce inappropriate antibiotic prescribing among patients hospitalized for asthma.

But after publication, Thomas Newman, a pediatrician and epidemiologist at the University of California, San Francisco, noticed something concerning in the analysis. Newman saw that the authors appeared to have fallen prey to "immortal time bias," a flaw in which, by design, the exposed group's follow-up includes a stretch of time during which the outcome cannot occur, skewing comparisons with the unexposed group.

Newman said he detected the problem quickly, because he has been teaching students about immortal time bias for several years:   

I remember that the first few times I read about it, it didn't really stick. In fact, I noticed susceptibility to it in a paper of which I am a coauthor while it was still in press, although in that paper it didn't actually make any difference. It's one of those things that once you really get it, you see it easily, but until then it's easy to miss.

Newman added that: 

I noticed the problem around December 2019, because I used the paper for a final exam problem for a Clinical Epidemiology course I co-teach, in which we cover Immortal Time Bias. I first went to the journal, but it was already past the normal deadline for a letter to the editor, so it took a while for them to figure out how to handle it.

The solution was to retract the paper and to publish Newman’s discovery as a letter to the editor, which appeared this week. In it, Newman wrote:

In their Original Investigation published in the March 2019 issue of JAMA Internal Medicine, Stefan et al [1] reported a 1-day higher median length of stay among patients admitted for acute asthma who were treated with antibiotics for at least 2 days starting in the first 2 days of their hospitalization, compared with propensity score–matched controls not so treated. The authors should clarify how the timing of antibiotic exposure and length of stay were determined because it seems like the study might be susceptible to immortal time bias [2], in which there is a period in the exposed group during which they are not at risk of the outcome (which in this case is discharge from the hospital).
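Newman's point can be made concrete with a toy simulation. This is our own illustration, not the authors' data or code, and all the numbers in it are made up: we simulate a null world in which antibiotics have no effect on length of stay, then show that a treatment definition requiring antibiotics to start in the first 2 days *and* continue for at least 2 days mechanically excludes short stays from the "treated" group, while defining exposure on day 1 alone does not.

```python
import random

random.seed(42)

N = 200_000
mean = lambda xs: sum(xs) / len(xs)

# Null world: antibiotics have NO effect on length of stay (LOS).
# Each patient has a random LOS; half are assigned antibiotics with an
# intended start day drawn independently of LOS (a patient discharged
# before the intended start day simply never receives them).
patients = []
for _ in range(N):
    los = 1 + int(random.expovariate(1 / 2.0))      # LOS in days, >= 1
    start = None
    if random.random() < 0.5:
        intended = random.choice([1, 2, 3])          # intended start day
        if intended <= los:
            start = intended
    patients.append((los, start))

# Biased design (mirrors the retracted analysis): "treated" = antibiotics
# started in the first 2 days AND given for at least 2 days, so the stay
# must cover days start..start+1 -- one-day stays can never qualify.
def biased_treated(los, start):
    return start is not None and start <= 2 and los >= start + 1

tr = [los for los, s in patients if biased_treated(los, s)]
ct = [los for los, s in patients if not biased_treated(los, s)]
print(f"biased design: treated {mean(tr):.2f} vs control {mean(ct):.2f} days")

# Corrected design (mirrors the replacement analysis): "treated" =
# antibiotics started on day 1, with no minimum-duration requirement.
tr2 = [los for los, s in patients if s == 1]
ct2 = [los for los, s in patients if s != 1]
print(f"fixed design:  treated {mean(tr2):.2f} vs control {mean(ct2):.2f} days")
```

Under the biased definition the "treated" group shows a substantially longer average stay even though treatment does nothing; under the day-1 definition the two groups come out essentially equal. The same mechanism is why fixing the bias shrank the length-of-stay difference in the replacement paper.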

Here’s the retraction notice, in which the authors acknowledge Newman’s point and write that fixing the bias problem changed their findings:

On behalf of our coauthors, we thank Dr Newman [1] for raising concern that our study reported in the article, “Association of Antibiotic Treatment With Outcomes in Patients Hospitalized for an Asthma Exacerbation Treated With Systemic Corticosteroids,” published in the March 2019 issue of JAMA Internal Medicine [2], was susceptible to immortal time bias because of how we had determined the timing of antibiotic exposure and length of stay in our original analysis.

As originally reported [2], we had conducted a retrospective cohort study of data collected from 542 US acute care hospitals that participate in the Premier Inpatient Database (an inpatient administrative database developed for measuring health care quality and use). We identified 19 811 patients hospitalized for an asthma exacerbation treated with corticosteroids and included an analysis of 8788 (44.4%) patients who received antibiotics during the first 2 days of hospitalization and were prescribed antibiotics for a minimum of 2 days. The primary outcome was hospital length of stay measured in days. Secondary outcomes included a composite measure of treatment failure (initiation of invasive or noninvasive mechanical ventilation, transfer to the intensive care unit after hospital day 2, in-hospital mortality, or readmission for asthma exacerbation within 30 days of discharge), hospital cost, and antibiotic-associated diarrhea. In our original propensity score–matched analysis that compared patients who were not started on antibiotic therapy during the 2 days or who were started on antibiotic therapy after day 2 with those who were started on antibiotic therapy during that time frame and treated for at least 2 days, we found that treated patients had a significantly longer hospital stay, similar rate of treatment failure, higher hospitalization cost, and nonsignificant increased risk of antibiotic-related diarrhea.

However, when we repeated the analysis to address concerns about immortal time bias with the treatment cohort defined as patients who started antibiotics during the first day of hospitalization compared with a group of patients who were not treated or who started treatment after day 1 of hospitalization and eliminated the requirement that antibiotics be prescribed for at least 2 days, our findings changed. The number of eligible patients included increased to 21 628; the number of hospitals decreased to 540; and the number of patients who received antibiotics decreased to 8927 (41.3%). The corrected findings of the propensity score–matched analysis now show that receipt of antibiotics on day 1 was associated with a marginally longer but not clinically meaningful length of hospital stay (unadjusted mean [SD], 2.81 [2.27] vs 2.57 [2.45] days; difference, 0.11; 95% CI, 0.03 to 0.19; length-of-stay ratio, 1.06; 95% CI, 1.04 to 1.09) and higher cost of hospitalization (median [IQR] cost, $4320 [$2754-$6716] vs $3861 [$2479-$6236]; mean [SD], $5662 [$5855] vs $5302 [$6959]; difference, $360; 95% CI, $155 to $566; odds ratio [OR], 1.10; 95% CI, 1.08 to 1.12). The risk for antibiotic-related diarrhea was higher in the antibiotic-treated patients (adjusted OR, 1.34; 95% CI, 1.05 to 2.17), but the association became nonsignificant in the propensity score–matched cohort (adjusted OR, 1.19; 95% CI, 0.90 to 1.57). In addition, in the corrected analyses, we found a lower risk of treatment failure in the antibiotic-treated patients (7.1% vs 8.2%; difference, −1.08%; 95% CI, −1.93% to −0.24%; OR, 0.86; 95% CI, 0.77 to 0.97).

Stefan told us: 

I knew very well about immortal time bias but somehow it slipped through my and my collaborators’ minds.

What we learned is what Dr. Newman said very well – you need to follow the target trial approach and be humble when interpreting the results of observational studies.

If you’ve stuck with us this far, we have one more footnote for this tale: In retracting and replacing the paper, the journal used the same DOI (digital object identifier) rather than minting a new one for the new version. The abstract page also still carries the date of the original paper. That isn’t considered best practice, and it makes the work of indexers (we suppose we’re one of them now) more difficult. Because so much reference management is keyed to DOIs, new versions of papers should get their own: with a reused DOI, readers can’t tell, without scrolling and reading the fine print (if it’s available), whether they have the updated copy or the retracted one. Authors could very easily end up citing a retracted paper without knowing it, a problem we and others have documented for decades.

That concludes our lesson.

*Update, 2300 UTC, 1/29/21: When originally posted, the quote from the conclusion of the paper that followed “purported to find” was from the new version, rather than the retracted version. It has been replaced.

Like Retraction Watch? You can make a tax-deductible contribution to support our work, follow us on Twitter, like us on Facebook, add us to your RSS reader, or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at team@retractionwatch.com.
