Many publishers have been duped by fake peer reviews, which have brought down more than 600 papers to date. But some continue to get fooled.
Recently, SAGE retracted 10 papers published as part of two special collections in Advances in Mechanical Engineering after discovering the peer review process that had been managed by the guest editors “did not meet the journal’s usual rigorous standards.” After a new set of reviewers looked over the collections, they determined 10 papers included “technical errors,” and the content “did not meet the journal’s required standard of scientific validity.”
Yeah, we’re not exactly sure what happened here, either. SAGE gave us a little extra clarity — but not much.
In March 2017, Christopher Blanford received an email from an editor at the Journal of Crystal Growth. Blanford had been named as a suggested reviewer for a manuscript, and the editor, Arnab Bhattacharya, wanted to verify that the Gmail account the authors provided was legitimate.
It was not.
Blanford—a senior lecturer in biomaterials at the University of Manchester, UK—thought it was an “amusing coincidence” that he was chosen as a fake reviewer, given that he has written about malpractice in academic publishing. He confirmed the Gmail account was not his, and the other two suggested reviewers told Bhattacharya, a professor at the Tata Institute of Fundamental Research in Mumbai, India, the same thing.
The papers, published between 2015 and 2017, are from researchers based at the Council of Scientific & Industrial Research (CSIR)–National Institute for Interdisciplinary Science and Technology (NIIST) in Thiruvananthapuram, India. S. Nishanth Kumar is the only author common to all four papers and a corresponding author on two of them; Dileep Kumar, a scientist at CSIR, is a corresponding author on three of the papers.
Fake peer reviews are a problem in academic publishing. A big problem. Many publishers are taking proactive steps to limit the effects, but massive purges of papers tainted by problematic reviews continue to occur; to date, more than 500 papers have been retracted for this reason. In an effort to help, Clarivate Analytics is unveiling a new tool as part of the December 2017 release of ScholarOne Manuscripts, its peer review and submission software. We spoke to Chris Heid, Head of Product for ScholarOne, about the new pilot program to detect unusual submission and peer review activity that may warrant further investigation by the journal.
Retraction Watch: Fake peer reviews are a major problem in publishing, but many publishers are hyper-aware of it and even making changes to their processes, such as not allowing authors to recommend reviewers. Why do you think the industry needs a tool to help detect fake reviews?
SAGE recently retracted three 2015 papers from one of its journals after the publisher found the articles were accepted with faked peer reviews. The retraction notices call out the authors responsible for submitting the reviews.
This trio of retractions is the second batch of papers withdrawn by Technology in Cancer Research & Treatment over faked reviews in the past eight months. In 2016, the journal began investigating concerns from an anonymous tipster about faked reviewer reports and subsequently retracted three papers in December over “manipulation of the peer-review process” (1, 2, 3).
Starting July 19, anything published by Tumor Biology will not be indexed in Web of Science, part of Clarivate Analytics (formerly part of Thomson Reuters). Clarivate told us the decision was based on the fake reviews that took down more than 100 papers earlier this year. The problematic papers were released while the journal was published by Springer, not its current publisher, SAGE.