
Retraction Watch

Tracking retractions as a window into the scientific process

Not for the faint of heart: Cardiologists retract syncope paper after realizing data columns weren’t aligned right

with 13 comments

Improperly aligned columns have cost researchers at the Mayo Clinic a paper in the Journal of the American College of Cardiology.

The paper originally concluded that fainting spells (syncope) signal a poor prognosis for patients with high blood pressure in the arteries of their lungs, an observation that turned out to be incorrect.

The problem? The group merged two electronic databases but did not align the columns properly, an error found only after first author Rachel Le revisited the dataset to extract more data for a follow-up study.
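For readers who handle similar datasets, here is a minimal sketch of one safer approach: joining the two sources on a shared patient identifier rather than pasting columns side by side and trusting the row order. The file and column names (pah_registry.csv, syncope_source.csv, patient_id, syncope) are hypothetical, not taken from the Mayo study.

```python
# Minimal sketch (hypothetical file and column names): merge a syncope
# flag onto a patient registry by a shared key instead of relying on
# row order, so each value stays attached to the correct subject.
import pandas as pd

registry = pd.read_csv("pah_registry.csv")    # one row per patient
syncope = pd.read_csv("syncope_source.csv")   # patient_id plus syncope flag

merged = registry.merge(
    syncope[["patient_id", "syncope"]],
    on="patient_id",
    how="left",
    validate="one_to_one",  # raises an error if either file has duplicate IDs
)

# Sanity check: flag any registry patients left without a syncope value.
missing = merged["syncope"].isna().sum()
if missing:
    print(f"Warning: {missing} patients have no syncope record")
```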

The retraction notice, published on May 22 (the version on ScienceDirect is freely accessible, while the one on the JACC site is behind a paywall):

Syncope in Adults With Pulmonary Arterial Hypertension. J Am Coll Cardiol 59 (2011) 863–7.

Rachel J. Le, MD, Eric R. Fenstad, MD, Hilal Maradit-Kremers, MD, Robert B. McCully, MD, Robert P. Frantz, MD, Michael D. McGoon, MD, Garvan C. Kane, MD, PhD.

Department of Medicine, Mayo Clinic, Rochester, Minnesota; and the Department of Health Sciences Research, Mayo Clinic, Rochester, Minnesota.

Available online August 8, 2011.

Reason: This article has been retracted at the request of the authors, because of a data entry error, which is fundamental to the study findings. As background, this was a clinical study where a specific variable was tested in a large database. The process involved merging a variable (presence or absence of syncope) from one electronic source with an alternate electronic database of patients with pulmonary arterial hypertension and assessing associations and outcomes. In proceeding to design a follow-up study to this work, Dr. Le went back to the original source file to abstract new data. In doing this she identified a ‘cut-and paste’ error in which the column of syncope data was transferred incorrectly where syncope/no syncope variables were assigned to wrong subjects. This led to a critical error that then got carried forward and a fundamental misclassification of syncope in the final study group. This error fundamentally affects the results, which now do not fully support the conclusions.

The paper has been cited once, according to Thomson Scientific’s Web of Knowledge.

We’ve tried to contact Le and Kane for more information, and will update with anything we find out.

The Journal of the American College of Cardiology, like many journals, struggles with limited resources that allow errors to pass through, according to an April 17 editorial from Anthony N. DeMaria, the journal’s editor-in-chief. In “Scientific Misconduct, Retractions, and Errata”, DeMaria wrote:

If we can believe that over 10% of investigators are aware of scientific misconduct, either we as editors have been extraordinarily discerning of such transgressions during the review process, or we have occasionally been duped. This perhaps would not be surprising given the limited arsenal available to us to identify misconduct.

Kane’s group voluntarily came forward to admit their errors, and there was no misconduct reported.

The retraction also calls into question an abstract published in 2010 in Chest that reached a conclusion similar to that of the JACC paper:

In a broad spectrum of clinical PAH patients, syncope is infrequent, associated with markers of right heart dysfunction and is strongly and independently predictive of poor outcome.

Cardiology Today featured the research on April 18, 2011, in its coverage of the International Society for Heart & Lung Transplantation 31st annual meeting and scientific sessions. The article quoted Le:

Presyncope/syncope is associated with markers of increased disease severity in newly diagnosed PAH patients. However, it was not predictive of unadjusted survival.

James Young, a Cardiology Today section editor, wrote a brief editorial to accompany the online article:

To me, the interesting aspect of this data was validation of something that has been repeatedly mentioned by astute clinicians of yesteryear: the relationship of presyncope and syncope to severity of PAH. It was an elegant analysis of an important registry and raises the question of a pathophysiologic link of syncope/presyncope to worsening PAH and not just something that is a consequence of PAH.


Written by trevorlstokes

May 29, 2012 at 9:30 am

13 Responses


  1. Essentially the same issue that destroyed Potti. Clinicians and data are a very dangerous mix. The very people who most dislike statistical analysis do it and publish. Would they get a biostatistician to do surgery? Seems unlikely. A cardiologist doing data analysis? Exactly the same thing.

    Paul A. Thompson

    May 29, 2012 at 9:41 am

    • The retraction is now open access.

      Glenn Collins

      May 29, 2012 at 11:41 am

    • Not so. Potti was destroyed by a repeated series of efforts to deceive, which included inflating statistical significance. The retractions followed in-depth analysis by others. In this case it sounds like a genuine error of data entry with no intent to deceive, and the problem was self-reported by the original authors. Nothing really links the 2 cases other than that the perps were both MDs.

      vhedwig

      May 29, 2012 at 1:08 pm

      • From my reading of the retraction they did own up as soon as they spotted the mistake.
        A good thing.

        David Hardman

        May 29, 2012 at 2:45 pm

      • Did you read the Baggerly and Coombes paper?

        Paul A. Thompson

        June 6, 2012 at 12:20 pm

    • This is about as far from Potti as you can get… Potti – repeated fraud; this – one-off honest mistake?

      Neuroskeptic

      June 3, 2012 at 5:13 am

      • Did you actually read the Baggerly and Coombes paper? They did not start off trying to deceive people. They started off trying to address an issue. They furnished their data to B & C until issues arose.

        Paul A. Thompson

        June 6, 2012 at 12:18 pm

    • Most of those who disagree with me have obviously NOT read the Baggerly and Coombes paper. They should do so immediately to cease the writing of clearly incorrect statements.

      Annals of Applied Statistics: “Deriving Chemosensitivity from Cell Lines: Forensic Bioinformatics and Reproducible Research in High-Throughput Biology,” by Keith A. Baggerly and Kevin R. Coombes, U.T. M.D. Anderson Cancer Center.

      I have a copy without final page numbers, but I quote the discussion.

      From B & C,

      “7. Discussion.
      7.1. On the nature of common errors. In all of the case studies examined above, forensic reconstruction identifies errors that are hidden by poor documentation. Unfortunately, these case studies are illustrative, not exhaustive; further problems similar to the ones detailed above are described in the supplementary reports. The case studies also share other commonalities. In particular, they illustrate that the most common problems are simple: e.g., confounding in the experimental design (all TET before all FEC), mixing up the gene labels (off-by-one errors), mixing up the group labels (sensitive/resistant); most of these mixups involve simple switches or offsets. These mistakes are easy to make, particularly if working with Excel or if working with 0/1 labels instead of names (as with binreg). We have encountered these and like problems before.”

      It is clear that Potti began with data handling errors. They then attempted to cover up.

      Paul A. Thompson

      June 6, 2012 at 12:25 pm

  2. Maradit-Kremers is a clinical epidemiologist. She doesn’t do surgery. My point is, this wasn’t a bunch of clinical cardiologists who decided to play with a big spreadsheet and a statistics program. This WAS a team with biostatistics experience. Just a mistake I suspect, but not a mistake due to lack of research qualifications.

    MEM

    May 29, 2012 at 5:27 pm

  3. “The Journal of the American College of Cardiology, like many journals, struggles with limited resources that allow errors to pass through, according to an April 17 editorial from Anthony N. DeMaria, the journal’s editor-in-chief. In “Scientific Misconduct, Retractions, and Errata”,…”

    There’s nothing in this editorial that suggests that JACC is struggling with “limited resources”, unless you understand “resources” to mean the “limited arsenal available to us to identify misconduct”. As far as tangible resources are concerned, my impression from working with scientific society-sponsored cardiology journals in Spain and elsewhere is that the sponsoring societies are very well endowed in terms of funding.

    So perhaps what some of the better-funded medical journals do (or fail to do) about peer review and misconduct depends more on their policies and priorities than on potential limitations in available knowledge and skills. It’s hard for editors to deal with retractions, but not all journals can claim “limited resources” as a reason for inaction or procrastination.

    Karen Shashok

    May 30, 2012 at 3:46 am

    • “Limited resources” is a very useful excuse. It has general utility, and can garner some sympathy points too.
      “Blame the government” might trip off the tongue next. Perhaps it might be better if journals with “limited resources” closed, or pooled their resources and merged with other journals, and stopped polluting the scientific literature. I know that cardiology has many aspects to it, but there are so many journals. Eventually the information has to be useful and fit into a medical doctor’s brain (contrary to many reports, they do have brains).

      Fernando Pessoa

      May 30, 2012 at 4:06 am

  4. Cut-and-paste errors and partial sorts are common mistakes in a spreadsheet, and are terribly difficult to spot afterwards. As a Biostatistician, I try to caution researchers about this, because I have discovered similar errors on several occasions. It would be quite possible for a subtle error of this sort to slip past me unnoticed, because I generally have no familiarity with or access to the original data that would allow me to spot discrepancies.

    Use a database if you can – REDCap is becoming a very good choice for this.
    Avoid all spreadsheet cut/paste operations if at all possible.
    When sorting a spreadsheet, verify that ALL columns are selected for sorting (Excel automatic expansion can fail if there are empty columns between data columns).
    Consult a statistician about how to set up a good database in the first place.

    Dan Eastwood

    May 30, 2012 at 3:38 pm

  5. Kudos to the authors for noticing this and owning up to it. It happened to me once, although luckily I detected it before publication; it’s a terrible feeling as the realization dawns.

    Excel is especially dangerous because it allows you to select only some columns and then reorder or shift them, which can lead to misaligned rows; this is something that SPSS, for example, just doesn’t allow you to do (with good reason)!

    Neuroskeptic

    June 3, 2012 at 5:11 am

