IRB mishap costs MD Anderson team a paper on prostate cancer

A group of researchers from MD Anderson Cancer Center in Houston has lost a 2013 paper in BJU International for running afoul of their institution’s ethics review board, and of military reviewers as well.

The paper, “Many young men with prostate-specific antigen (PSA) screen-detected prostate cancers may be candidates for active surveillance,” looked at prostate cancer screening in men 55 and under — considered young for the older-man’s disease. According to the abstract:

Little is known as to the potential for over-treatment of young men diagnosed with prostate cancer. We show that for men aged ≤55 years with PSA screen-detected disease, 45% of the tumours are classified as very low risk and 85% of these have favourable pathology, yet most are actively treated. These findings raise the spectre of over-treatment for a group of men likely to be affected by treatment side-effects.


Objective:

To identify a population of young men (aged <55 years at diagnosis) with very-low-risk prostate cancer (stage cT1c, with prostate-specific antigen [PSA] density of <0.15 ng/mL/g, Gleason score ≤6, and ≤2 positive biopsy cores with <50% tumour involvement) that may be candidates for active surveillance (AS).

Patients and Methods:

We queried a Department of Defense tumour registry and hard-copy records for servicemen diagnosed with prostate cancer from 1987 to 2010. Statistical analyses were undertaken using Fisher’s exact and chi-square testing.

Results:

From 1987-1991 and 2007-2010, PSA screen-detected tumours diagnosed in men aged ≤55 years rose >30-fold. Data for a subset of men (174) with PSA screen-detected cancer were evaluable for disease risk assessment. Of the 174 men with screen-detected disease, 81 (47%) had very-low-risk disease. Of that group, 96% (78/81) selected treatment and, of 57 men undergoing radical prostatectomy (RP), the tumours of 49 (86%) carried favourable pathology (organ confined, <10% gland involvement, Gleason ≤6).

Conclusion:

Nearly half of young men with PSA screen-detected prostate cancer are AS candidates but the overwhelming majority seek treatment. Considering that many tumours show favourable pathology at RP, there is a possibility that these patients may benefit from AS management.

The paper has been cited once, according to Thomson Scientific’s Web of Knowledge. Here’s the retraction notice:

Retraction: ‘Many young men with prostate-specific antigen (PSA) screen-detected prostate cancers may be candidates for active surveillance’ by Jeri Kim, James Ebertowski, Matthew Janiga, Jorge Arzola, Gayle Gillespie, Michael Fountain, Douglas Soderdahl, Edith Canby-Hagino, Sally Elsamanoudi, Jennifer Gurski, John W. Davis, Patricia A. Parker and Douglas D. Boyd.

The above article from BJU International, published in Volume 111 Issue 6, pages 934-940, has been retracted by agreement between the authors, the journal Editor-in-Chief, Professor Prokar Dasgupta, and John Wiley & Sons Ltd. The retraction has been agreed due to the exact protocol used not having the approval of The University of Texas MD Anderson Cancer Center’s Institutional Review Board.

Boyd, who said he’s still chagrined a year after the controversy began, told us that the issue involved a phenomenon called “protocol deviation.” His group had multiple trials with the U.S. military, including, in this case, the Air Force, and had received separate IRB approvals. But Boyd said he mistakenly combined data from two different studies into the BJUI paper:

My mistake was crossing the objectives of one with the objectives of another.

That irked Air Force officials, whose complaints, Boyd admits, he ignored — to his ultimate disadvantage:

I didn’t handle it very well. I was so pissed off with the whole project at the time that I wasn’t responding to the USAF IRB, or I was responding in an untimely fashion.

In the process of securing USAF/DoD IRB approval (prior to engaging in the research) I had complained to them (and their superiors, including a politician) as to the slowness of the DoD approval process. I suspect that this further encouraged the USAF/DoD IRB folks to come down hard on me.

The Air Force IRB went to the MD Anderson IRB, and the two panels agreed that the article should be retracted. Boyd, however, did not:

Even our clinical office didn’t really think it was worth a retraction.

Still, Boyd lost that battle — as well as the ability to work with human subjects (he has declined to reapply for permission, he said).

I’m still not a happy camper, but time heals wounds.

8 thoughts on “IRB mishap costs MD Anderson team a paper on prostate cancer”

  1. I’m surprised nobody objected to binning data across two different IRBs. Why combine them in the first place? Was the comparison critical to driving the conclusions that generated the paper?


  2. I thought the IRB’s job was to protect research subjects. If data set 1 is collected ethically/safely according to IRB1, and data set 2 is collected ethically/safely according to IRB2, then how is it ethically problematic to combine the two data sets?

    1. If I understand it, they gained access to a _database_ of medical records but carried out a study that differed significantly from what they were authorized to do. From a strict medical ethics point of view this retraction is probably correct.

      An organization such as the DoD which has a large (gigantic) collection of medical records has a heavy obligation to protect these records from disclosure and/or misuse. I think it is right that access is only granted after the highest stringency of review and approval.

    2. Indeed an IRB’s job is to protect research subjects. That involves, in part, making sure that the subjects know exactly what their data or tissue is going to be used for. There is a common misconception among researchers that once they have data they can do what they want with it. The researchers here breached their obligations to their subjects, whether negligently or deliberately isn’t entirely clear. Either way, the IRB was correct to require a retraction.

  3. This is overkill. Doesn’t a retraction suggest that the science is in error, that the data are unreliable or untrustworthy? Here, a correction or “mea culpa” addendum would clarify what the ethics error was (which really is unclear from this), and would both deliver a smack-down and provide an educational opportunity for the investigators, without impugning the scientific findings. As Art Caplan argued years ago, publishing but acknowledging the ethical problems behind the science is probably better — more respectful of the contribution of the subjects, more helpful to future scientists — than the attempt to wipe the record clear (which is never successful, regardless).

    Imagine the silly situation in which the investigators now apply for IRB approval to do the study, redo it, and then try to republish it… a waste all around.

    1. “Imagine the silly situation in which the investigators now apply for IRB approval to do the study, redo it, and then try to republish it… a waste all around”
      Or they might not get permission to do it! This reasoning would allow unethically performed studies, once performed, to be published as well. I therefore think that this is not a line that can be crossed: no approval is no publication is no overkill. Otherwise, the whole idea of ethical approval quickly becomes a farce.

    2. I think the issue is whether the errors were purely administrative or procedural, as opposed to something that violates the spirit of IRB approval. In this case it seems to be the second sort.

  4. Sounds like a scientist who did not want to play by the rules. I stand by the DoD in this instance. You can’t make up study objectives and aims as you go along. An IRB is very study title, study aims, and study objective specific. That is the whole point. Tantrums won’t buy you love in the game of protecting the rights of study subjects and the ethical integrity of doing research correctly and TRANSPARENTLY.
