That’s what Nadia Elia, Liz Wager, and Martin Tramer reported here Sunday in an abstract at the Seventh International Congress on Peer Review and Biomedical Publication. Elia and Tramer are editors at the European Journal of Anaesthesiology, while Wager is former chair of the Committee on Publication Ethics (COPE).
As of January 2013, nine of the papers hadn’t been retracted, Tramer said, while only five — all in one journal — had completely followed COPE guidelines, with adequate retraction notices that were freely available and PDFs properly marked “Retracted.” From the abstract (see page 18):
A total of 79 (90%) retraction notices were published, 76 (86%) were freely accessible, but only 15 (17%) were complete. Seventy-three (83%) full-text articles were marked as retracted, of which 14 (15.9%) had an opaque watermark hiding parts of the original content, and 11 (12.5%) had all original content deleted.
The findings were slightly mystifying, given that these were papers the editors themselves had agreed to retract. When the researchers contacted the 18 editors to ask why they hadn’t followed COPE guidelines, one cited “personal health problems.” Three referred the team to their publishers, and one “challenged the principle that data in retracted articles should be preserved, as these data were false and therefore valueless.”
Perhaps most worrying were the two publishers who said they hadn’t retracted six of the papers because of legal threats from Boldt’s co-authors. Jigisha Patel, of BioMed Central, reacted this way from the audience:
Whenever an editor suggests a retraction, a lawyer’s letter arrives. This may explain the findings.
We asked Anesthesia & Analgesia editor in chief Steven Shafer, who led the charge to retract Boldt’s papers, for comment:
I have heard that some of the German anesthesia journals did not retract papers because of fear of litigation. I don’t think that is valid, but I’m neither German nor an attorney. I think that the BJA, the Canadian Journal, and A&A retracted everything.
Shafer said he looked forward to seeing the details of Tramer’s team’s analysis, and that he thought his journal’s 23 Boldt retractions complied with COPE guidelines:
As I understand the report, in at least 18 cases (23 – 5 done correctly) we screwed up somehow. I’d like to know what I missed, particularly since I seem to be doing this a lot!
An unrelated paper came out earlier this month suggesting a method for revealing problematic data, using a retracted Boldt paper as a test subject:
To illustrate our four-step procedure, we have applied it to analyze the data reported by Boldt et al. in an article that has been recently retracted for fraud [10,11]. It should be pointed out that the Editor-in-Chief decided to retract the article after several letters were sent to the journal pointing out the surprisingly small variability in cytokine measurements reported in the paper. Using this article as an example helped us to illustrate that, even though each step of our approach might be well known by statistically trained reviewers or readers, strictly applying this four-step procedure could have provided some important warning signals when appraising Boldt’s manuscript.
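The quoted paper doesn’t spell out its four steps here, but the red flag it mentions — implausibly small variability in noisy biological measurements — can be illustrated with a toy screen. The sketch below is not the authors’ procedure; the cutoff, function name, and sample rows are all hypothetical, chosen only to show how a reviewer might flag reported group summaries whose coefficient of variation (SD divided by mean) looks too tight for a cytokine assay.

```python
# Toy screen for implausibly small variability in reported summaries.
# NOT the four-step procedure from the paper; a hypothetical sketch.

def flag_low_variability(groups, min_cv=0.10):
    """groups: list of (label, reported_mean, reported_sd) tuples.

    Returns the labels whose coefficient of variation (sd / mean)
    falls below min_cv, a made-up cutoff for illustration.
    """
    flagged = []
    for label, m, sd in groups:
        if m > 0 and sd / m < min_cv:
            flagged.append(label)
    return flagged

# Hypothetical summary rows, invented for this example only.
rows = [
    ("IL-6, group A", 120.0, 4.0),   # CV ~ 0.03: suspiciously tight
    ("IL-6, group B", 115.0, 38.0),  # CV ~ 0.33: plausible scatter
]
print(flag_low_variability(rows))  # -> ['IL-6, group A']
```

A real screen would go further — for instance, comparing reported standard deviations across groups against the scatter expected by chance — but even this crude check captures the kind of warning signal the letter-writers noticed.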
Tramer’s presentation on failings at anesthesiology journals prompted Richard Smith, former editor of the BMJ, to ask, “Isn’t it time to end ‘hobby editors’?” Not surprisingly, that earned an outcry from the some 500 editors, publishers, and researchers gathered for the conference, which we’re attending through tomorrow. Nope, said Tramer, who is one such hobby editor. “But help us!”