How many retractions were there in 2012? And, some shattered records

We’ve learned a lot about retractions in 2012, from the fact that most retractions are due to misconduct to the effects they can have on funding. We’ve seen eyebrow-raising reasons for retractions, from a hack of Elsevier’s peer review system, to a researcher peer reviewing his own papers, to massive fraud in psychology, to a math paper retracted because some of it made “no sense mathematically.”

So as the year winds to a close, we wanted to take a look at retractions by the numbers.

1. How many retractions in 2012? Counting retractions for the year is a bit tricky, and not just because the year isn’t quite over yet. Some journals don’t show up in databases for months, meaning 2012 retractions will appear in 2013. We asked our friends at Thomson Scientific for their numbers, and they found 322 retractions so far, out of about 1.6 million items indexed. But they point out that 2011 saw about 2 million indexed items — with 391 retractions, probably not a complete count — which suggests there’s a lot left for 2012.

So what does that all mean? We’ve said that 2011 was a record year, with about 400 retractions, and it looks as though 2012 will end up with about the same number (see the quick rate calculation after point 3 below).

2. The unofficial retraction record holder. This one we can report with a bit more certainty. Yoshitaka Fujii, an anesthesiologist formerly of Toho University in Japan, fabricated results in at least 172 studies, according to investigations. Those retractions haven’t all made their way into the scientific record yet, but even if only most of them do, he’ll take the record from Joachim Boldt. Boldt, Retraction Watch readers may recall, is also an anesthesiologist, and has some 88 retractions to his, um, credit. Adam discussed whether anesthesiology might have a problem in this post.

3. Longest time from publication to retraction. The record, set just this month, is 27 years, for a mundane reason:

misunderstanding of the respective publishing and copyright policies of the journals and the implications of publishing in a conference proceedings.

That was in the Journal of Steroid Biochemistry and Molecular Biology for a December 1985 paper, “Increasing the response rate to cytotoxic chemotherapy by endocrine means,” beating out the 25 years it took to retract a 1980 paper in the Biochemical Journal.
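For anyone who wants to check the arithmetic on those counts, here is a rough back-of-envelope sketch, written in Python purely for illustration, using the Thomson figures quoted in point 1. The 2012 count is still incomplete, so its rate is best read as a floor rather than a final number.

```python
# Rough retraction rates from the Thomson figures quoted above.
# The 2012 numbers are partial-year counts, so treat that rate as a lower bound.
counts = {
    2011: {"retractions": 391, "indexed_items": 2_000_000},
    2012: {"retractions": 322, "indexed_items": 1_600_000},  # counted so far
}

for year, c in sorted(counts.items()):
    rate = c["retractions"] / c["indexed_items"]
    print(f"{year}: {c['retractions']} retractions out of ~{c['indexed_items']:,} "
          f"indexed items, or about {rate:.3%}")
```

Either way, retractions amount to something on the order of 0.02% of what gets indexed in a year, a point one of the commenters below also makes.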

We’ve broken a few records of our own, including having our first 300,000-pageview month in July (and a few since then). Our most-viewed post this year — although not of all time — was about Dipak Das, the resveratrol researcher who has now retracted 19 papers.

As always, however, numbers don’t tell the whole story. Thanks to some high-profile cases, scientific misconduct — and how to prevent it — has been an international topic of conversation. There have been stories in Le Monde, the New York Times, the Seoul Daily, and many others. Just this morning, there was a New Yorker blog post on the subject.

None of this would have been possible without the support, tips, constructive criticism, and encouragement of our tens of thousands of monthly readers. So: Thank you, and all best for the holidays and in 2013.

8 thoughts on “How many retractions were there in 2012? And, some shattered records”

  1. So for 2011 the retractions are about 400 from 2 million indexed items, or about 0.02% if my maths does not fail me.

    While not detracting from the seriousness of the 400 cases, or by any means suggesting that the 400 represents all cases (after all, it’s only those discovered or admitted), it does place the problem in some degree of context.

    1. You need to bear in mind that there is an enormous bar that needs to be surmounted in order to be pinged for fraud. A large percentage of the life sciences cases involved Western blots and Photoshop. You have to be an imbecile to use Photoshop in a publication: a) because it’s readily detectable, and b) because there is usually a very simple alternative that would allow you to put forward the same falsified data without using Photoshop. In virtually every case, had the scientists involved put in the minimal effort needed to avoid image manipulation, their papers would not have been retracted, however suspicious people may have been of their results.

      So the 400 figure refers to the number of lazy imbeciles, not really an indication of the level of misconduct. I don’t mean to imply that this 400 is the tip of the iceberg – we can really have no idea. I’m just saying this number can’t be used to provide context, since it is really a measure of imbecility, not misconduct. Of course, it is great to think there are only 400 cases of imbecility out there.

      In this regard I think a special mention should be made of Shane Mayack and Amy Wagers for publishing a paper with an image taken from a methods website.

      1. The Shane Mayack case is indeed an interesting one: the ORI reported investigation findings showing that figures taken from others’ papers were copied, re-labeled and falsely reported in both a Nature and a Blood paper. Findings of misconduct were made against Dr Mayack. In her defense, Dr Mayack said (in a post to this site) that it was just a case of poor record keeping, and her PI, Dr Wagers, should share some of the blame both for not detecting the fabrications, and for retracting the papers too quickly. Having read the report on the Federal Register https://www.federalregister.gov/articles/2012/08/28/2012-21236/findings-of-research-misconduct#p-11, and after looking up the relevant publications, it seems to me that Dr Mayack is to blame, and Dr Wagers should be congratulated for taking such prompt action.

      2. Ehh… Harvard gave her tenure and in the same year booted one of their most prestigious profs in medical sciences out of the university. Had Prof Wagers really done something wrong, she would never have been given a promotion. Harvard will do whatever is needed to protect their rep, but they don’t like keeping bad apples in the bunch.

      3. So, John, maybe you could direct us to the report of the Harvard investigative committee, which, in the interests of transparency, they have released with certain sensitive passages blacked out.

        Thanking you in advance. It should be a fascinating read.

      4. “which in the interests of transparency”

        I believe that no major university has any sort of interest in transparency. The reputation of the university trumps all.

  2. Well, I looked up the Federal Register and I couldn’t see any mention of Professor Wagers aside from her name as a co-author, and doubtless in the spirit of Christmas Michael Briggs has purposefully misrepresented what Shane Mayack said (what she said was not explicit, but asked how it was possible that all the blame in these cases fell on the most junior person).

    However, Professor Wagers should be congratulated for not going to the lengths that Professor Segal went to in order to point the finger.

  3. Yes, that does seem like a very small number in relative terms, but we should also consider that a very small percentage of the millions of publications are actually being read. There are so many junk journals out there, publishing work that no one is reading or cares enough to try to reproduce, that a lot of bad science is getting indexed. Also, considering how reluctant editors are to retract a paper, it takes a serious amount of effort by the scientific community to get them to act.

    There is still one particular Nature paper that I am waiting to see retracted. Everyone under the sun knows it is bad, yet no action from Nature. Another case of the post doc getting the blame and the PI going off to a dream job. Meanwhile, no one can repeat the data, including the PI.

    Sometimes you have to wonder why anyone is honest.
