Harvard and the Brigham recommend 31 retractions for cardiac stem cell work

Piero Anversa

Retraction Watch readers may be familiar with the name Piero Anversa. Until several years ago, Anversa, a scientist at Harvard Medical School and the Brigham and Women’s Hospital, was a powerful figure in cardiac stem cell research.

“For ten years, he ran everything,” says Jeffery Molkentin, a researcher at Cincinnati Children’s whose lab was among the first to question the basis of Anversa’s results in a 2014 paper in Nature.

That quote is from our story today in STAT reporting that Harvard and the Brigham are recommending that 31 papers from Anversa’s former lab be retracted. Read all about it here.


15 thoughts on “Harvard and the Brigham recommend 31 retractions for cardiac stem cell work”

  1. Following the links to earlier stories on Anversa’s lab, the similarities to Theranos are remarkable: both organizations were reportedly run on secrecy, intimidation, and the siloing of information. The question is why? They must have known their stories would unravel eventually. Not to excuse her, but Holmes was very, very young when Theranos went off the rails. I can’t fathom this.

    1. It is unlikely that the Anversa lab was the only one where the combination of purely radial information flow (no sharing among grad students), strong expectation of particular results, and bullying of grad students to produce those results has resulted in a string of irreproducible publications. You could even say that artificial, non-lab-related results are an inevitable outcome of the way science is currently structured and funded.

      I just wonder how many PIs are aware that their lab’s publications are largely faked, and are battening down the hatches, worried that their lab could be the source of the next cascade of retractions; and how many manage to delude themselves.

      1. One grad student out of many falsifies a data set and gets away with it – it can happen; 2 papers are based on falsified data – now it looks like the PI is just plain careless; 31 papers? That takes real effort.

        Apparently Anversa had a multi-million-dollar deal with a company fall apart when the investigation became known. It seems he also failed to disclose conflicts of interest (COI). Failure to disclose COI is easier to detect than falsified data, and should be seen as a giant neon red flag.

  2. Science is still effectively an honor system due to the difficulty of thorough peer review on every single paper published. Shame on these fraudsters for abusing the public trust.

    1. I truly believe that academic departments/laboratories need to take a page from industry, especially given their government funding: conduct regular financial and quality audits.

      Audits of lab records would catch some (not all) of these lapses. For example, if the original data behind a publication are not available, corrective actions could be put in place to ensure that data are retained going forward. If the department/lab has no standard equipment calibration/maintenance schedule, it would need to implement a calibration and preventive maintenance program. If the department/lab has no standard protocols for common assays (I’m not really talking about new ones under development), audits would ensure that the protocols are fully documented and that everyone running them is trained on them.

      There’s really no excuse for the types of institutional laziness I saw when I was a grad student, and I’ll remain wary of academic research until academia adopts best practices.

      1. I agree with you in principle, formergradstudent, and I imagine a lot of staff-level researchers feel the same way. I’m pushing up against this at my own place, however, and the institutional perspective seems to be a hesitance to implement things like GLP compliance and audits because they come off as heavy-handed, stifling investigator autonomy and flexibility. The real problem arises when recruiting faculty: administrative burden is an important factor for faculty choosing between appointments, and the appearance of being micro-manage-y and audit-happy tends to turn people off and may make the difference between that in-demand junior faculty member taking the offer from you or from someone else. So most places will try every other possible softer approach before simply laying down the law and holding people stringently to best practices, even though that’s clearly what’s necessary.

        You’re correct that it’s academia that needs to adopt best practices as a whole, because based on my experience, I don’t think we can count on individual institutions to be leaders in this area when there’s little benefit (and probably a lot of risk) in being “first.”

  3. I think one reason there have been only a few retractions to date is that there were not many problematic images in the publications. It was quite difficult to spot things on the page that did not make sense.
