Retraction Watch

Tracking retractions as a window into the scientific process

Search Results

Scientific misconduct and sexual harassment: Similar problems with similar solutions?


Michael Chwe

Today colleges and universities face a crisis of accountability in two domains: scientific misconduct and sexual harassment or assault. Scientific misconduct and sexual harassment/assault are obviously different, but the ways they are reported, handled, and play out have many similarities. Michael Chwe at the University of California, Los Angeles, has been thinking about this for a while. Just last summer his own department was rocked by the high-profile retraction of a Science paper about gay canvassers, and two graduate students in the UCLA history department sued the university for failing to investigate sexual harassment complaints. Chwe suggests that, if scientific misconduct and sexual assault are similar problems, they might have similar solutions.

Scientific misconduct and sexual assault have more in common than you might think.

Written by Alison McCook

April 6th, 2016 at 2:00 pm

Who has the most retractions? Introducing the Retraction Watch leaderboard


Ever since we broke the news about the issues with the now-retracted Science paper about changing people’s minds on gay marriage, we’ve been the subject of a lot of press coverage, which has in turn led a number of people to ask us: Who has the most retractions?

Well, we’ve tried to answer that in our new Retraction Watch leaderboard.

Here is the current list (see our methodology notes for more detailed information):

Written by Alison McCook

June 16th, 2015 at 2:00 pm

The Retraction Watch Leaderboard


Who has the most retractions? Here’s our unofficial list (see notes on methodology), which we’ll update as more information comes to light:

  1. Yoshitaka Fujii (total retractions: 183) See also: Final report of investigating committee, our reporting, additional coverage
  2. Joachim Boldt (96) See also: Editors-in-chief statement, our coverage
  3. Diederik Stapel (58) See also: our coverage
  4. Adrian Maxim (48) See also: our coverage
  5. Chen-Yuan (Peter) Chen (43) See also: SAGE, our coverage
  6. Hua Zhong (41) See also: journal notice
  7. Shigeaki Kato (39) See also: our coverage
  8. James Hunton (36) See also: our coverage
  9. Hyung-In Moon (35) See also: our coverage
  10. Naoki Mori (32) See also: our coverage
  11. Jan Hendrik Schön (31) See also: our coverage
  12. Tao Liu (29) See also: our coverage
  13. Cheng-Wu Chen (28) See also: our coverage
  14. Yoshihiro Sato (25) See also: our coverage
  15. Scott Reuben (24) See also: our coverage
  16. Jun Iwamoto (23) See also: our coverage
  17. Gilson Khang (22) See also: our coverage
  18. Noel Chia (21) See also: our coverage
  19. Friedhelm Herrmann (21) See also: our coverage
  20. Dipak Das (20) See also: our coverage
  21. Khalid Zaman (20) See also: our coverage
  22. Jin Cheng (19) See also: our coverage
  23. Stanley Rapoport (19) See also: our coverage
  24. Fazlul Sarkar (19) See also: our coverage
  25. Bharat Aggarwal (18) See also: our coverage
  26. John Darsee (17) See also: our coverage
  27. Wataru Matsuyama (17) See also: our coverage
  28. Erin Potts-Kant (17) See also: our coverage
  29. Robert Slutsky (17) See also: our coverage
  30. Ulrich Lichtenthaler (16) See also: our coverage

We note that all but one of the top 30 are men, which agrees with the general findings of a 2013 paper suggesting that men are more likely to commit fraud.


Many accounts of the John Darsee story cite 80-plus retractions, which would place him third on the list, but Web of Science only lists 17, three of which are categorized as corrections. That’s not the only discrepancy. For example, Fujii has 138 retractions listed in Web of Science, compared to 183 as recommended by a university committee, while Reuben has 25, compared to the 22 named in this paper. We know that not everything ends up in Web of Science — Chen, for example, isn’t there at all — so we’ve used our judgment based on covering these cases to arrive at the highest numbers we could verify.

Shigeaki Kato is likely to end up with 43 retractions, based on the results of a university investigation.

All of this is a good reminder why the database we’re building with the generous support of the MacArthur Foundation and Arnold Foundation will be useful.

Like Retraction Watch? Consider supporting our growth. You can also follow us on Twitter, like us on Facebook, add us to your RSS reader, and sign up on our homepage for an email every time there’s a new post.

Written by Ivan Oransky

June 16th, 2015 at 11:09 am


Is it time for a retraction penalty?


The title of this post is the headline of our most recent column in LabTimes, which begins:

As we write this in mid-August, Nature has already retracted seven papers in 2014. That’s not yet a record – for that, you’d have to go back to 2003’s ten retractions, in the midst of the Jan Hendrik Schön fiasco – but if you add up all of the citations to those seven papers, the figure is in excess of 500.

That’s an average of more than 70 citations per paper. What effect would removing those citations from calculations of Nature’s impact factor – currently 42 – have?

Science would lose 197 citations based on this year’s two retractions. And Cell would lose 315 citations to two now-retracted papers.

In other words, what if journals were penalised for retractions, putting their money where their mouth is when they talk about how good their peer review is? Clearly, if a paper is retracted, no matter what excuses journals make, peer review didn’t work as well as it could have.
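The arithmetic behind this thought experiment can be sketched in a few lines. An impact factor is, roughly, citations divided by citable items over a two-year window, so subtracting citations to retracted papers lowers it proportionally. The item count below is purely illustrative, not Nature's actual figure:

```python
def adjusted_impact_factor(impact_factor, citable_items, retracted_citations):
    """Recompute an impact factor after subtracting citations to retracted papers.

    impact_factor is approximately total_citations / citable_items, so we
    recover the citation total, subtract the retracted papers' citations,
    and divide again by the same item count.
    """
    total_citations = impact_factor * citable_items
    return (total_citations - retracted_citations) / citable_items

# Illustrative only: a journal with an impact factor of 42, an assumed
# ~1,700 citable items in the two-year window, losing the 500-plus
# citations mentioned above.
print(adjusted_impact_factor(42, 1700, 500))  # roughly 41.7, a drop of about 0.3
```

On these assumed numbers the penalty looks small for a journal of Nature's size; the smaller the citable-item count, the larger the hit from the same retracted citations.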

We explore what this might mean for top journals. But there are some nuances here; we wouldn't want to further discourage retractions of papers that deserved it.

Written by Ivan Oransky

September 18th, 2014 at 12:10 pm

Posted in RW announcements

Nature comes clean about retractions and why they’re on the rise



This week’s Nature includes a refreshing and soul-searching editorial about retractions. Excerpt (we added links and corrected a misspelling and wrong country in the editorial after a reader noted the errors below):

This year, Nature has published four retractions, an unusually large number. In 2009 we published one. Throughout the past decade, we have averaged about two per year, compared with about one per year in the 1990s, excluding the pulse of retractions of papers co-authored by [German physicist Jan Hendrik Schön].

Given that Nature publishes about 800 papers a year, the total is not particularly alarming, especially because only some of the retractions are due to proven misconduct. A few of the Nature research journals have also had to retract papers in recent years, but the combined data do no more than hint at a trend. A broader survey revealed even smaller proportions: in 2009, Times Higher Education commissioned a survey by Thomson Reuters that counted 95 retractions among 1.4 million papers published in 2008. But the same survey showed that, since 1990 — during which time the number of published papers doubled — the proportion of retractions increased tenfold.
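The excerpt's figures can be sanity-checked with a few lines of arithmetic. Note that the 1990 values below are back-calculated from the stated ratios, not actual survey data:

```python
# Rough check on the Thomson Reuters survey figures quoted above.
retractions_2008 = 95
papers_2008 = 1_400_000

proportion_2008 = retractions_2008 / papers_2008
print(f"{proportion_2008:.6%}")  # ~0.0068% of 2008 papers retracted

# If the retraction proportion rose tenfold since 1990 while publication
# output doubled, the absolute number of retractions rose roughly
# twentyfold (10x proportion applied to 2x as many papers).
proportion_1990 = proportion_2008 / 10
papers_1990 = papers_2008 / 2
retractions_1990 = proportion_1990 * papers_1990
print(retractions_2008 / retractions_1990)  # about 20
```

In other words, the tenfold rise in the retraction *rate* understates the growth in retraction *counts*, because the literature itself grew over the same period.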


Written by Ivan Oransky

November 4th, 2010 at 10:28 am

Posted in nature retractions