Can you spot a fake? New tool aims to help journals identify fake reviews

Chris Heid

Fake peer reviews are a problem in academic publishing. A big problem. Many publishers are taking proactive steps to limit the effects, but massive purges of papers tainted by problematic reviews continue to occur; to date, more than 500 papers have been retracted for this reason. In an effort to help, Clarivate Analytics is unveiling a new tool as part of the December 2017 release of ScholarOne Manuscripts, its peer review and submission software. We spoke to Chris Heid, Head of Product for ScholarOne, about the new pilot program to detect unusual submission and peer review activity that may warrant further investigation by the journal.

Retraction Watch: Fake peer reviews are a major problem in publishing, but many publishers are hyper-aware of it and even making changes to their processes, such as not allowing authors to recommend reviewers. Why do you think the industry needs a tool to help detect fake reviews?

Chris Heid: Although the evidence is clear that allowing authors to suggest reviewers increases the chances of peer review fraud, there are still significant numbers of journals that use this as one of many methods to find qualified reviewers. We estimate that about half of the journals using ScholarOne Manuscripts continue to allow authors to add recommended reviewers during submission despite the risk.

The reason that journals don’t completely lock down these suggestions from authors, or limit profiles to verified institutional addresses, is that journals continue to struggle to find peer reviewers. According to our analysis of five years of peer review trends on ScholarOne journals, the average number of invitations sent to reviewers for research articles has almost doubled in the last five years.

Instead of trying to eliminate all risk and slow the peer review process even further, journal publishers take a calculated risk and rely on human intervention to mitigate it. That intervention adds time to the overall process and staffing costs for extra background checking, so peer review ends up slower and more expensive for every article.

This tool’s goal is to protect a journal’s reputation by simplifying the management of a process that relies on hundreds or even thousands of individual stakeholders. Even though the vast majority of peer reviews are legitimate, the reputational risks are very real for publishers. Why continue to work based solely on trust and human effort when technology can automate this for us?

Clarivate Analytics is leading the charge on multiple fronts to provide the tools and information needed to combat fraud and improve the peer review process from end to end.

For example, by the end of the year, journals can use Publons Reviewer Locator/Connect (final name undecided) — the most comprehensive and precise reviewer search tool — to help identify the right reviewers, assess their competency, history and availability, contact them and invite them to review.

Recognition through Publons helps motivate reviewers to do a thoughtful and efficient job. The fraud prevention tool follows the submission of the review report to flag potential fraud.

RW: Can you say briefly how the tool works? What it looks for, etc? Anyone can spot a reviewer that’s not using an institutional email address, so what other qualities help signify a review is fake?

CH: The presence of a non-institutional email, or the absence of a Publons reviewer profile with a verified review history, is not a foolproof indicator of peer review fraud. The fraud prevention tool evaluates 30+ factors based on web traffic, profile information, submission stats and other server data, compiled by our proprietary algorithm, to find fake profiles, impersonators and other unusual activity. This happens multiple times throughout the submission and review process.

By themselves, these factors may not trigger an alert, but combined with other actions, they can increase the risk level of a submission. From there, it is up to the journal editor and/or publisher to determine the next steps. In the long run, this tool will help to reduce the number of retractions by highlighting issues during the submission process, instead of after publication.
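The actual ScholarOne algorithm and its 30+ factors are proprietary and not public, but the idea of individually weak signals that combine to cross an alert threshold can be sketched in a few lines of Python. Every factor name, weight, and the threshold below is hypothetical, chosen only to illustrate the "combined factors raise the risk level" behavior described above:

```python
# Hypothetical multi-factor risk scoring, loosely modeled on the behavior
# described in the interview. Factor names, weights, and the threshold are
# invented for illustration; the real ScholarOne algorithm is proprietary.

WEIGHTS = {
    "non_institutional_email": 0.15,      # weak signal on its own
    "no_publons_profile": 0.10,           # weak signal on its own
    "review_returned_within_hours": 0.35, # suspiciously fast turnaround
    "reviewer_ip_matches_author": 0.40,   # strong signal
    "profile_created_at_submission": 0.25,
}

ALERT_THRESHOLD = 0.5  # combined score at which a journal is alerted


def risk_score(flags):
    """Sum the weights of the factors present, capped at 1.0."""
    score = sum(WEIGHTS[f] for f in flags if f in WEIGHTS)
    return min(score, 1.0)


def should_alert(flags):
    """True when the combined factors cross the alert threshold."""
    return risk_score(flags) >= ALERT_THRESHOLD
```

Under this sketch, a lone non-institutional email (score 0.15) never triggers an alert, but the same email combined with a matching author IP and an hours-long turnaround does, mirroring the point that no single factor is decisive.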

RW: How can journals and publishers get access to the tool? Will there be a fee?

CH: Because the integrity of published research is at risk due to peer review fraud, Clarivate is offering this as a core, free feature in the next ScholarOne release (December 2017). Journals may request that the tool be activated in the interface at any time. The tool can also be configured to control report access levels by role for each individual journal.

RW: Have you tested the tool’s effectiveness? Do you have any data on its rate of success, as well as false negatives or positives?

CH: The tool relies on alerts based on the right combination of factors and leaves the decision to the journal editor or publisher. This is similar to alerts a bank may issue about potential fraud. For example, if you receive an alert about unusual charges on your account, it could be legitimate if you’re on vacation or it could indicate actual credit card theft.

Clarivate has actively worked on this capability for the past year, continuing to balance and refine the approach with feedback from publishers who manage this risk every day. Refinements based on that feedback included adjustments to the tool’s sensitivity and user interface.

Early testers indicated that a number of alerts resulted in direct action, including the rejection of a paper that was already accepted but unpublished, and a re-review of another paper by an editor and external reviewer. Once the feature is live in December, we expect additional refinement through feedback tools.


6 thoughts on “Can you spot a fake? New tool aims to help journals identify fake reviews”

  1. Good to hear that. It would really be beneficial if journals started to properly evaluate potential (and suggested) reviewers, since I have seen many cases where strange things appeared in published papers (even in high-profile journals) that clearly point to a fatally failed review process, quite probably some kind of tit-for-tat game.

  2. Why not simply pay peer reviewers to do the job? That way there will be a contractual relationship between the publisher and peer reviewer . . . and the right to sue in case of breach of contract.

    The quality of reviews will also likely improve. As the saying goes, there is no such thing as a free lunch. You can count on a freebie review to produce a cursory assessment. A thorough assessment of a manuscript comes with a large opportunity cost for a reviewer with expertise and reputation.

    Why volunteer time and effort when proliferating journals publish so much crap? Folks who have reputation and expertise have better things to do than lending support to a faltering enterprise. Traditional appeals to professional responsibility seem less persuasive in face of mounting evidence of low publication standards and shoddy work.

  3. This reminds me of Springer’s coming up with software to detect gibberish papers written by software. Can’t expect any humans to take time to read this stuff.

    As a subject matter AE, I may struggle to find appropriate reviewers, but I don’t know any editors who would just accept unknown suggested reviewers. So ScholarOne is coming up with a complicated 30 point algorithm to automatically flag suggested reviewers without online histories. So then, sketchy authors have to go back to the time honoured approach of suggesting allies. 

    ScholarOne is clunky and quite complicated as is. ScholarOne’s reviewer database structure invites multiple entries for the same person, whenever their email changes. Being able to handle two emails, an institutional and a permanent personal one, would help. Seems like allowing journals an option to require ORCIDs for reviewers would be a simpler solution. Sure, ORCID IDs can be faked, but it would point to an empty profile. Not so easy to populate a fake profile with fake papers. So ScholarOne – I’m more than a little skeptical of your strategy. Seems like an effort to set up a system where managing editors can run the whole show, without having to actively engage those tedious, volunteer subject matter editors who sometimes take a long time and make questionable calls. There’s a business model there. There was a chap in Denver who used to tally up journals with this business model….

  4. 1. For an editor, it is not easy to find competent reviewers. It is impossible to know all competent reviewers in all fields.
    2. Most journals publish papers covering a wide range of disciplines. Because the editor (or subject editor) may know only his own limited field, it is impossible for the editor to pick competent reviewers for every manuscript.
    3. It takes a few weeks to do a good review. Competent reviewers are mostly overloaded with teaching, research, writing, family life, etc., because competent reviewers take every job seriously.
    4. There are not many competent reviewers. Some scientists accept review invitations just to add credit to their resumes.
    5. There are too many journals (and even more predatory journals, although predatory journals do not need reviewers at all).
    6. As an author, I prefer not to suggest reviewers for my manuscripts. But most journals force authors to do that.
    7. There are too many scientists. Not all of them are good scientists, and not all of them can be good reviewers or editors.
    8. According to the CBE Style Manual (5th edition), “The editor is also obliged to readers to publish only valid and important papers.” But I know it is a mission impossible.
    9. The problems of fake reviewers and incompetent reviewers are both extremely important. I hope the new tool can solve at least part of the “fake reviewer” problem.

  5. The biggest problem with reviewers is the lack of them. We need to think creatively about how to maximize recognition for reviewers, and I don’t think remuneration is the answer. Things are not helped by the proliferation of new and poor journals, which cause confusion when people are seeking a journal in which to publish a paper. These journals are attractively presented and offer everything except an impact factor and good citations. In pure frustration at the number of journals inviting me to publish and/or join editorial boards, I reviewed some of them. I have to admit they were very attractively packaged, but I found it very difficult to locate any information about where they are cited. I contacted them to clarify issues around citation and impact factor status; it was no surprise that I did not receive a reply. I think the established journals need to step up their efforts to combat the flood of poor journals set up as business opportunities.

  6. Currently, so many manuscripts are submitted to journals, many of them useless and unreal research results. “Publish or perish” and the major role that publication profiles play in academic promotions encourage people to throw some material together and submit it to a journal. In the past, researchers treated the results obtained from calibrating and setting up an instrument, or from examining a technique, as trash, but now many researchers fabricate several papers from those unreliable results. Another issue is the huge number of journals, which makes peer review and editorial tasks difficult and unreliable. Scholarly publishing has converted from a nearly pure and honest activity into a profitable business, and as long as this academic activity remains unregulated, more greedy business people will come into the publishing sector. With fewer submissions and higher-quality, original articles, it would not be so hard to find competent peer reviewers. So there must be clear publication regulations for both publishers and authors. I seriously doubt the quality of a reviewer with hundreds of reviews in a short period; such reviewers are given awards and endorsed regardless of the quality of their reviews.
