There are a number of fields that seem to punch above their weight on Retraction Watch: Anesthesiology, home to the world record holder (and runner-up), and psychology, home to Diederik Stapel and others. But the red-hot field of stem cell research is another that makes frequent appearances, last year’s STAP controversy being particularly prominent.
There’s an interesting (but unfortunately paywalled) recent paper in Science and Engineering Ethics, “The Acid Test for Biological Science: STAP Cells, Trust, and Replication,” by Cheryl Lancaster, a small part of which tries to answer the question of whether stem cell research really sees more retractions, and more fraud, than other fields.
Lancaster applies the same methods Fang, Steen, and Casadevall used to broadly measure the causes of retractions in all life science and biomedicine to the specific field of stem cell research:
Using a very basic version of the methods in Fang, Steen, and Casadevall, PubMed was searched for all articles containing the term “stem cell” between 1973 (when Fang, Steen, and Casadevall report the first retraction) and May 2012 (when Fang, Steen, and Casadevall carried out their search); this resulted in 102,089 hits. When an additional search for retractions was added, this resulted in 62 hits; 0.06 % of papers referring to stem cells were therefore retracted between 1973 and mid-2012.
Fang et al. found that roughly 4 out of 10 biomedical retractions stemmed from fraud; Lancaster found the same proportion specifically in stem cell research.
In the academic press then, stem cell fraud does not appear to occur with more frequency than in general bioscience.
The 0.06% figure is of course quite small, but it’s larger than the overall retraction rate — about 0.02%. But Lancaster’s numbers don’t, in fact, tell us whether fraud occurs at the same rate in stem cell research as in bioscience overall, since the fraud percentage only looks at retracted papers, rather than overall papers. In other words, the data tell us how often retractions in stem cell research occur because of fraud, but they don’t tell us how frequent such fraud is.
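To make that distinction concrete, here is a minimal back-of-the-envelope sketch in Python (our illustration, not a calculation from Lancaster’s paper), using the figures quoted above:

```python
# Rough illustration using the figures quoted above (102,089 "stem cell" hits,
# 62 retractions, roughly 4 in 10 retractions attributed to fraud). This is
# our own arithmetic, not code from the paper.

stem_cell_papers = 102_089          # PubMed hits for "stem cell", 1973 to mid-2012
stem_cell_retractions = 62          # retracted papers among those hits
fraud_share_of_retractions = 0.4    # ~4 in 10 retractions stem from fraud

retraction_rate = stem_cell_retractions / stem_cell_papers
fraud_driven_retraction_rate = retraction_rate * fraud_share_of_retractions

print(f"Retraction rate: {retraction_rate:.4%}")                      # ~0.06% of all papers
print(f"Fraud-driven retractions: {fraud_driven_retraction_rate:.4%}")  # share of ALL papers
# The second figure counts only fraud that was detected and punished with a
# retraction; fraud that was never caught or never corrected is invisible here.
```

In other words, the 40% figure tells us about the composition of retractions, not about the prevalence of fraud among the roughly 100,000 stem cell papers as a whole.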
Since the author cited Ferric Fang, we asked him for his take.
Firstly, he noted that many papers — especially in a growing field such as stem cell research — have been published since 2012, so limiting the analysis to before that date would affect the data. He was kind enough to run the numbers himself, including more recent papers.
I have performed a quick search in PubMed and find that the first retracted article with the term ‘stem cell*’ was published in 1989 (PMID 2474759). I therefore limited my analysis to articles published between 1989 and the present. I found 14,761,843 journal articles, with 209,038 containing the search term ‘stem cell*’. A total of 3,244 articles (0.022%) published since 1989 have been retracted, 110 of which contain the search term ‘stem cell*’. This represents 0.053% of all articles containing the search term ‘stem cell*’ that were published since 1989, which is a 2.4-fold excess over what would have been predicted if the stem cell field were representative of journal articles as a whole. In other words, an average rate of retraction would have predicted 46 retracted articles containing the search term ‘stem cell*’, but the actual observed number is 110. This supports the assertion that retractions are more frequent in the stem cell field than in some other branches of research, although the proportion of retracted articles is still very low (about 1 out of every 1900 articles).
In reviewing the reasons for retraction, I had some difficulty finding the retraction notices for a few of the articles, but I can assign at least 56 to fraud/suspected fraud (51%). (By ‘fraud’, I mean data falsification or fabrication). This is slightly in excess of the 43.4% attributed to fraud/suspected fraud in our 2012 study.
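For those who want to retrace the arithmetic, here is a short sketch (our own reproduction of the numbers Fang reports, not his actual PubMed search) that recovers the 2.4-fold and roughly 1-in-1,900 figures:

```python
# Reproducing the expected-vs-observed comparison from the figures Fang quotes.

total_articles = 14_761_843      # journal articles published since 1989
total_retracted = 3_244          # retracted articles since 1989
stem_cell_articles = 209_038     # articles matching "stem cell*"
stem_cell_retracted = 110        # retracted articles matching "stem cell*"

overall_rate = total_retracted / total_articles             # ~0.022%
stem_cell_rate = stem_cell_retracted / stem_cell_articles   # ~0.053%
expected = overall_rate * stem_cell_articles                 # ~46 retractions expected
excess = stem_cell_retracted / expected                      # ~2.4-fold

print(f"Overall retraction rate: {overall_rate:.3%}")
print(f"Stem cell retraction rate: {stem_cell_rate:.3%}")
print(f"Expected: {expected:.0f} retractions, observed: {stem_cell_retracted}")
print(f"Excess: {excess:.1f}x, i.e. about 1 in {stem_cell_articles // stem_cell_retracted} articles")
```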
So while the data are nuanced, there is some evidence that yes, retractions are more common in stem cell research, and they may be more commonly due to fraud. Whether fraud itself is more common, however, is unclear.
Does Psychology really have all that many retractions? Can you back up that statement with numbers?
Thanks for the question. If you’re referring to the statement that psychology “seem[s] to punch above their weight on Retraction Watch,” our archive, including 54 from Stapel, is what gives that impression: http://retractionwatch.com/category/psychology/ But that’s the point: We don’t say that psychology actually has more retractions than other fields, just that it might seem so given how many retractions we cover from the field.
Rolf Degen has an interesting post on retractions in psychology, based partly on a paper that we hope to cover shortly, that provides numbers: https://plus.google.com/101046916407340625977/posts/FdGxGCnykeC
This is a good advance by Dr. Lancaster, and she is applauded for completing this study. Indeed, anyone who follows PubPeer closely will see quite a few papers from this field being questioned [1]. We definitely need more case studies like this to raise greater awareness. I believe that one of the common problems across all of these disciplines is a failed, or at least highly permeable, traditional peer review [2]. So much so that Springer, the publisher of the journal that published Dr. Lancaster’s study, JSEE, sent a “chain” email just today, requesting feedback on its online submission systems and peer review, stating:
“Dear Author, Dear Researcher:
We know that you spend a lot of time with the online submission of your manuscripts and many of you also serve as reviewers for the research community. Springer is very much interested in making these tasks as easy as possible and in providing efficient tools to support you.
We would like to ask you to share your experiences with online manuscript submission and peer review systems and let us know how these tools could be improved. We would very much appreciate you taking a few minutes time to complete a brief online questionnaire.
As a thank you for your participation we invite you to participate in a prize drawing to win an iPad Mini 3 or one of 3 Springer books worth up to $200 each.
Please click here to start the survey.
If you have any questions about the survey, please contact Astrid Pfenning.
Best regards,
Harald Wirsching
Director Market Intelligence & Web Analytics
Springer”
It is unfortunate, however, that there always has to be an insignificant dangling carrot each time.
[1] https://pubpeer.com/topics/1/2B2B490DD36C55707411830470926D#fb22619
[2] Teixeira da Silva, J.A., Dobránszki, J. (2015) Problems with traditional science publishing and finding a wider niche for post-publication peer review. Accountability in Research: Policies and Quality Assurance 22(1): 22-40. DOI: 10.1080/08989621.2014.899909
I think that in general retractions represent the tip of the iceberg. Some very sharp British researchers examined the cardiac stem cell literature and came up with some staggering findings in a BMJ paper. Here’s my story on their paper: http://www.forbes.com/sites/larryhusten/2013/07/02/paper-raises-hundreds-of-questions-about-the-integrity-of-stem-cell-research-group/
Note that despite the huge number of errors in many of these papers, only a few of the most visible ones have been retracted.
Sorry– I linked to the wrong story in the comment above. Here is the correct link: http://www.forbes.com/sites/larryhusten/2014/04/28/stem-cell-therapy-to-fix-the-heart-a-house-of-cards-about-to-fall/
Larry, your stories are very pertinent and revealing. In your opinion, could you summarize here why you think so little erroneous literature has been retracted from this sector of science? In my field of study, plant science, I am facing fierce resistance or even total rejection of post-publication peer review, so to try to understand why such an editorial firewall against correcting the literature might exist, we (i.e., the infinitesimally tiny fraction of plant scientists who serve as activists) are forced, very unfortunately, to turn to other fields of study to understand the rationale, the reasons, and the difficulties. So, any clues that you could provide would be welcomed.
For example, I have noticed, in several management (psychology?) papers by Fred O. Walumbwa discussed at PubPeer, that there appears to be a serious lack of accountability, especially among editorial boards, although this varies from board to board. Or the so-called ivory tower of untouchables.
https://pubpeer.com/search?q=+FRED+O.+WALUMBWA
https://pubpeer.com/search?q=walumbwa&sessionid=5F94A31194F20429ACBA
e.g. https://pubpeer.com/publications/1E79BA4AA94EB722491B14AE871B0F (notice clear criticisms of the editorial board)
In contrast, notice how many papers by Walumbwa have been retracted from The Leadership Quarterly.
Larry, in your opinion, how valid would such a broad characterization be, i.e., the theory of the power of the untouchable ivory tower? Also, in your opinion, what role do you believe the publisher, who stands in the background, plays?
We can’t know precisely why some fields are more prone to misconduct. My own best guess is that a highly visible and highly hyped field attracts a lot of people who are in it for something more than science. Economic forces play a larger role.
Another key element is that the overwhelming hype lends a nearly religious aura to the field. Awkward questions are put aside in the desire to take part in progress and join the inevitable winning side. The dazzle of imminent success makes us blind to all the flaws.
Just a start. A very important topic.
Thank you, Larry, for taking the time to respond. Indeed, most of plant science, including horticultural science, has a rather unflashy lining to it, rather bland compared to cancer research, for example, and much less funded. That is why high-profile retractions really stand out: http://retractionwatch.com/category/plant-biology/
Thank goodness I am just a middle-class plant scientist, a “Joe the Plumber” with a more earthy vision of the elite in plant science who are clearly out of touch with the real revolution taking place in science. It is these individuals, not all of them, but many of them, who live in their cozy cocoons, but who will sooner or later be “touched” by the issues. My experience in the past few years is that the peer pool is so restricted that at some point, everything and everyone becomes linked. Editor X in Publisher Y’s Journal Z knows scientist A who is Editor B in Publisher C’s journal D. When a retraction or scandal occurs to ABCD or to XYZ, then the whole house of cards implodes.
I am not sure there is a way to estimate all the funding flowing into a single field of the life sciences and correlate it with cases of misconduct and retractions. Stem cells and cancer research are such excessively money-bloated fields, as is probably everything else with clinical relevance or industrial interest, and less so ecology or animal behaviour. One might find out that, surprise surprise, where money is, crime thrives.
Way too many confounds to easily characterize one field as having more fraud. It could have to do with the types of evidence normally faked, for instance. In some fields it seems to be images that are “manipulated” or just duplicated. These images have to be included in manuscripts, and some folks have figured out how to detect this fakery. But if the faking involves a statistical analysis, even if the data are subject to inspection, you could pretty easily build up a dataset that says whatever you want, and that has just enough of a haze of random variation that it would be quite hard to detect. And if no one is allowed to see the data, or demand documentation linked to the supposed data collection, well you just have to be clumsy or lazy to get caught. It may also have to do with the size of research teams in different fields. Large research team size probably increases the chances of being caught. Some fields feature larger research teams. I’m sure that other factors will come to mind.