The Year of the Retraction: A look back at 2011

Don Quixote and Sancho Panza, Madrid. Photo by Zaqarbal via Wikimedia

If Retraction Watch were actually a business, as opposed — for the moment, anyway — to a labor of love for two guys with day jobs, 2011 would have been a very good year for business.

It was a year that, once the dust settles, will probably have seen close to 400 retractions, including a number of high-profile ones. Those high numbers caught the attention of a lot of major media outlets, from Nature to NPR to the Wall Street Journal. Science publications, including LiveScience and The Scientist, have done their own end-of-year retraction lists.

It was also a good year for us at Retraction Watch. Many news outlets featured us in their coverage, either picking up stories we’d broken or asking us for comment on big-picture issues. Three national NPR programs — Science Friday, On the Media, and All Things Considered — had us on air. We launched a column in LabTimes, and Nature asked us to write a year-end commentary. We even earned a Wikipedia entry.

All of that has contributed to the fact that sometime today, we’ll surpass 1.5 million pageviews. We’ve tapped into a passionate and helpful community of readers, without whom much of Retraction Watch wouldn’t be possible. You send us great tips, add valuable commentary, keep us honest by correcting our errors, and encourage us at every step.

So: Thank you.

Now, which journals had the most retractions?

By our unofficial count, the winner would appear to be the Journal of Biological Chemistry, with 15. Those tended to come in groups, with four from Silvia Bulfone-Paus’s lab, four from the late Maria Diversé-Pierluissi’s group, three from Jesse Roman’s lab, and two from Zhiguo Wang’s. One of the other two was from Harvard cancer researcher — and New Yorker staff writer — Jerome Groopman’s lab. There are four more on the way from Ashok Kumar’s University of Ottawa lab, although it looks as though those will actually appear in 2012.

We covered seven in Blood, eight in the Proceedings of the National Academy of Sciences, and eight in the Journal of Immunology.

Science had five, including two very high-profile ones, on longevity genes and on chronic fatigue syndrome and XMRV, compared with just two last year.

Last year, Nature did some soul-searching when they got to November and had published four retractions. This year, they published just one.

Rounding out the “glamour journals,” Cell had just one, compared to four last year.

Personal best?

Of course, the person with the most retractions was Joachim Boldt, with 89. Naoki Mori was a distant second, with 32, although a few of those ran last year. Claudio Airoldi’s group retracted 11. And most of the journals listed above were touched by other big names in Retractionville, including Bulfone-Paus, who has now retracted 13, and Anil Potti, who has now retracted 7.

Jatinder Ahluwalia was forced to retract only one paper this year, after retracting one last year. But the revelations that followed cost him a job at the University of East London, and may lead Imperial to strip him of his PhD. His case is a reminder of why it pays to poke at what lies beneath retractions.

Here were our top five posts by traffic:

  1. The retraction of an editorial about why semen is a good Valentine’s Day gift, by eminent surgeon Lazar Greenfield
  2. The retraction of a bizarre paper claiming that science and spirituality both came from space
  3. Our coverage of the Claudio Airoldi case
  4. Our now-infamous “none of your damn business” post, covered by Ben Goldacre
  5. Our scoop on two retractions by Zhiguo Wang, who later resigned from the Montreal Heart Institute

Looking forward to 2012

Our wish list is much the same as it was for 2011: better explanations of why particular papers are being retracted, and better publicity for retractions. We’ll add one item to that list: Journals, please stop letting researchers make claims in retraction notices and corrections, unless you peer-review them. Why should we trust the word of researchers who’ve demonstrated they make errors, intentional or not?

And we’ll be keeping an eye on what may be an emerging trend: the mega-correction. We’ve seen errata notices that correct so many different errors that it’s hard to believe the paper shouldn’t have been retracted. It’s unclear what this means yet, but watch this space for coverage of more examples.

We also may be coming to your town. See our list of upcoming appearances, which will be regularly updated. Get in touch if you’d like to host us; we love engaging with readers in person. And don’t forget the Retraction Watch Store.

Happy New Year.

32 thoughts on “The Year of the Retraction: A look back at 2011”

  1. I enjoy reading your posts and trying to get a pulse on how big a problem retractions are and how their ripples may reach some of the publications we use to inform our work. If possible, next year it would be great to have some kind of breakdown of the percentage of retracted articles that contained bad data/conclusions (fraudulent data, for example). Plagiarism is obviously bad, as is duplicating figures in multiple publications, but they don’t necessarily mean the data/interpretations one may have gleaned from the paper are inaccurate, which to me is the #1 concern.

    1. Thanks!

      You might be interested in using our category drop-downs in the right-hand column of all our pages. There, you can find posts indexed by reason for retraction, among other criteria. For example, here are all of the posts about plagiarism:

      And here are all of the posts about faked data:

  2. When you are calculating your retraction numbers per journal, had you thought of normalising per number of papers published? Of course not all retractions published in a given year relate to papers published in that year, but I seem to remember that JBC publishes a great deal of papers, so perhaps proportionally not out of line for retractions? I don’t know, but it might be an interesting statistic.

  3. I would like to buy your t-shirts, but they need to say something more than just “retraction watch”. Many physicians (at least in my part of the world) have no idea what retractions are all about.
    Some think “academic publishing” in medicine is all about being a good writer (literary).
    The study design, data and statistical analysis are just incidental and a bit of a nuisance.
    The number of times I have been patted on the back and asked to “write something for publication”. Hmm………
    Talking about improving the quality of research for the sake of patient care sometimes/usually draws puzzled stares; some just don’t see the connection

  4. “…for the moment, anyway”

    Would you care to explain the above comment? I think a lot of the credibility here derives from this NOT being a business. But now I am starting to wonder. Once you start turning this into a business (dunno, perhaps by associating with one of the scam “plagiarism detection” companies out there who sell software to spot plagiarism with one hand and software to help students not get detected with the other), there are conflict of interest issues that you can never really shake as there is always the lingering suspicion that what you say has a hidden motive (=$). But I don’t know you personally, perhaps that was the ultimate goal to begin with.
    In the meanwhile, Happy New Year! 🙂

    1. Thanks for the question. We take conflict of interest issues very seriously. We decline honoraria and travel expenses for speaking engagements at groups or companies that we might cover — for example, Nature, which asked us to write an end-of-year commentary. We turned off a “donate” button on the site when we realized it was creating conflicts of interest, as we note in our FAQ: The only revenue we have right now is what we are paid from a column in LabTimes, and pennies — literally — from syndication through a company called Newstex.

      We take these issues seriously because we believe that credibility with our readers and sources is paramount. And we are certainly happy with the success of Retraction Watch so far as a labor of love. But it takes considerable time and effort, and growing it in the ways we (and our readers, based on many great suggestions) would like will require more resources. One of the ways to develop those resources is to have revenue. We would only entertain such opportunities, however, if we could minimize conflicts of interest and be completely transparent about them. That’s where credibility comes from, in our minds; having revenue itself does not mean a lack of credibility.

      Sponsorship by a plagiarism detection software company is not one of the models we think is consistent with our conflict of interest standards, but it is hardly the only model for creating a business that would allow us to grow. Nor is sponsorship by for-profit companies the only way to go. There are foundations, for example, that are interested in accountability and transparency. Since we are too, we will always disclose those relationships.

  5. Thanks for another great year.

    Just an idea for next year, could we have some kind of a monthly or fortnightly list of all the retractions that didn’t warrant a post of their own? I don’t think there is such a thing at present and it would be helpful for people wanting to do stats on the % of retractions due to misconduct or whatever.

  6. Thanks so much for providing this valuable service. Of interest, a PubMed search for “Retracted Publication” or “Retraction of Publication” for 2011 yields only 35 papers. So if you and your colleagues weren’t on the job, we’d see only ~1 in 12 of these papers!

    1. “Retraction notice” gives 63 hits. Still, the problem is exactly that it is not clear how to retrieve all the retractions, which may well be in the PubMed database. There should be a standard tag!

      1. I once asked PubMed help about this. They suggested “Retraction of Publication[Publication Type]” as the MeSH term to use.
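For anyone who wants to script that search, here is a minimal sketch that builds an esearch query against NCBI’s public E-utilities endpoint. The term string is the one PubMed’s help desk suggested above; the endpoint and parameter names come from NCBI’s E-utilities documentation, and the helper function name is just for illustration:

```python
from urllib.parse import urlencode

# NCBI E-utilities "esearch" endpoint (documented public API).
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def retraction_query_url(year):
    """Build an esearch URL for retraction notices from one year,
    using the publication-type term PubMed's help desk suggested."""
    params = {
        "db": "pubmed",
        "term": '"Retraction of Publication"[Publication Type] '
                f"AND {year}[PDAT]",
        "retmax": 0,  # ask only for the count, not the full ID list
    }
    return ESEARCH + "?" + urlencode(params)

print(retraction_query_url(2011))
```

Fetching the actual count would mean requesting that URL (for example with urllib.request) and reading the `<Count>` element of the XML response.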

  7. In these dark days, when scientific fraud seems to be everywhere, even in the most prestigious and respected journals, your blog gives me hope. I am a scientist, and I feel that fraud has now reached an endemic point in science (…unfortunately, the 400 retracted articles that you were able to trace in 2011 seem to me only the tip of a huge iceberg…), and that it is imperative to break the system of silence that seems to surround scientific publications, for more transparency and honesty and for the good of science.

    You are doing an extremely important job. I wish long life to your blog.

    1. “even in the most prestigious and respected journals”. More often than not, you get into these fad-worshipping, popularity-contest journals because of your cozy relationship with the editors/reviewers. Therefore, it would not be surprising if this was true. In my opinion, most solid work is published in more specialized, mid-level journals that are less interested in the media splash factor and more immune to the sociology of science.

  8. Dear Lilly,

    “The system of silence” is difficult to penetrate, but perhaps some of the message is starting to seep through.

    I was surprised that many editors didn’t know the ICMJE guidelines for authorship

    or the COPE guidelines

    When I mentioned 1789 to French editors, they didn’t know about its significance.
    Many had not heard of the “Arab Spring” so expecting them to know about the “Science Spring” is too much.

    The Odessa file gets it about right.

    Very soon you come up against the “system of silence”.

  9. Your blog is a very interesting read, indeed.
    May I ask, why the photo? It is interesting that Cervantes himself suffered from plagiarism. He responded by creating the second half of his Quixote and a cornerstone of literature. He went far beyond the first part by weaving the real case of plagiarism he suffered into the plot of the novel he was creating.
    Thanks in part to your task we should also learn from these retractions to go beyond and produce better science. Best hopes for 2012.

    1. We were thinking more of Don Quixote and Sancho Panza, on a quixotic mission. I’d forgotten that Quixote was himself a victim of plagiarism — adds a nice double meaning for us to say we’re representing authors who are plagiarized!

  10. Was reading through some of the links posted in the article – particularly the one in Nature – and wondered: is it possible to know how many papers don’t make publication, or to judge anything about the validity of a paper, or indeed of the publication process itself, from looking at the number of retractions vs. the number of publications in isolation?
    It would be good to think that journals that do retract – either of their own volition or because of an author’s insistence – learn from the experience. Good also to hear from the journal that they have put plans in place or tightened procedures to lessen the chance of it happening again – but the individual reasons for retraction often make me think that the tightest of procedures wouldn’t result in fewer retractions per se.
    Thinking of the recent retraction by Science of the Lombardi et al. paper – I would like to know just how unusual such a retraction is, i.e. one where the editors took the decision as opposed to the authors themselves.
    I guess you could categorise each retraction based on rationale – assuming of course the rationale is given or even agreed upon by the authors for example. In the case of Lo et al. the statement of belief in the ‘integrity’ of their paper rather threw additional confusion into the mix.
    All in all a rather difficult thing to quantify, I would imagine. Then again, I was under the (mis)apprehension that retractions only occurred because of allegations of fraud and not necessarily for more legitimate reasons such as non-reproducibility or contamination. After all, not every published paper receives the degree of attention that Lombardi and Lo have, and if indeed they had not – then presumably the papers would not have been retracted?

    1. Thanks for the comment. You may be interested in our category drop-down menu, in the right-hand column of every page.

      For example, here are all of our posts about papers retracted for image manipulation:

      And here are those about papers retracted because their findings couldn’t be reproduced:

      Perhaps we should consider adding categories that differentiate author retractions from editor/publisher retractions.

      It’s worth noting just how difficult it is to pin down precise numbers, because of the opacity of so many retraction notices:

  11. Thanks Ivan and Adam for this valuable service. Hopefully this will become a business someday. Scientists are human beings. Sometimes they are wrong. What could be more scientific than a journal that follows our “self correcting” mechanisms?

    1. In fact there is a journal called Research Policy.

      Maybe a couple of periodicals specializing in studying the phenomenon of scientific fraud could very much revolutionize science and even attract a good deal of general attention. I hope this comes about soon.

  12. If you have not already, and I’ve missed it: would it be possible for you to post a list of retractions in 2011 by category of reason for the retraction (plagiarism, non-reproducibility, etc.)? Thanks. Ron Roizen

  13. Another question from me, if I may: Did you unearth any retracted meta-analyses in 2011, or ever for that matter? Thanks, Ron Roizen

  14. Ivan:

    You left out an important category: prominent people taken down by misbehavior. For 2011, this includes
    1) the dean of the school of social sciences in the Netherlands
    2) zu Guttenberg, the German defense minister
    3) Gaddafi’s son

    There are plenty more of course, but I am just looking for really high-profile people.
