The problem of fake peer reviews never seems to end — although the research community has known about it since 2014, publishers are still discovering new cases. In April, one journal alone retracted 107 papers after discovering the review process had been compromised. By tracking individual reviewers’ contributions, Publons — recently purchased by Clarivate Analytics — thinks it can help curtail the problem of faked reviews. Co-founder and CEO Andrew Preston spoke to us about how that might work — and how the site has responded to recent criticism over access to review data.
Retraction Watch: What is Publons doing to help combat the problem of faked reviews?
Andrew Preston: We see it as our job to build tools that help editors more efficiently find, screen, contact, and motivate potential reviewers. This is a large problem, but by working across the publisher ecosystem (we now have partnerships with 8 of the top 10 publishers), we should be able to make the system more efficient for everyone involved.
Fake reviews occur when the author or a third party subverts the peer review process. In many of these cases the editor believes they are communicating with a reviewer when in fact the email address they’re using is controlled by someone else. This problem usually occurs when the editor uses an email address suggested by the author, but can also happen when searching for email addresses online.
On Publons, reviewers are required to verify both their email address and their reviews. What we have learned from journal editors is that by connecting those two things, editors can have confidence that the person on the other end of the message is the same person whose verified review record they are viewing on Publons.
RW: We recently covered a sweep of retractions from one journal (107!) published by Springer, which has long been aware of the problem of faked reviews. This suggested there may be a new mechanism by which people are subverting the peer review system. There are also discussions about the potential role that third-party companies, which can submit papers on authors’ behalf, may play in manipulating the review process. How can Publons address these problems?
AP: If we want to stop faked reviews we need to provide editors with quick and effective tools that help them find motivated, trustworthy reviewers. I’ve covered the tools we offer but it is worth noting that faked reviews are just one example of a number of issues that are slowing down and disrupting the academic publication process. It’s increasingly difficult for editors to find qualified and motivated reviewers for each of the many manuscripts they receive.
One solution here is to expand the pool of available reviewers. This requires courses to train new reviewers, and making that training available to researchers who would not usually be asked to review, whether because of their geographic location or because the editor can’t otherwise establish their qualifications. We tackle this head on in our Publons Academy: a free, practical peer review training course designed to teach early career researchers the core competencies of peer review. Trainees work directly with their supervisors to practice writing real reviews, and upon graduation, we close the loop by connecting them with editors in their field.
RW: How might Publons’s purchase by Clarivate Analytics help further reduce the problem of fake reviews?
AP: A clear prerequisite for us in the deal with Clarivate Analytics was that it allowed us to remain who we were and to address the problems we’ve been working on at a larger scale. To give you a specific example, Clarivate Analytics is home to Web of Science, the world’s preeminent citation database. By incorporating citation and author data from Web of Science into the tools we offer to editors, we will be able to provide best-in-class conflict of interest reports and suggest a wider pool of potential reviewers.
More generally, one of the key challenges in building solutions to the problems facing peer review is that while everyone agrees they are critical, it’s very hard to bring everyone together to solve them. Clarivate Analytics is a completely neutral player in the research ecosystem — they’re not a publisher, funder, or research institution — but they have extensive relationships with all of the key stakeholders.
We believe that the scale of Clarivate Analytics will help us to coordinate publishers, funders, and institutions to first of all raise awareness of the issues and then build market-leading solutions. A joint effort will improve the situation for everyone.
RW: It seems some Publons users are taking issue with how their review data are being used — and with the fact that they can’t access those data. Do you have a response to that?
AP: We take data privacy and handling really seriously. This is particularly important when dealing with peer review: a process subject to a range of policies where anonymity and privacy are paramount. So we make sure to treat data both in accordance with data privacy best-practice and in compliance with individual journal review policies. This two-pronged approach has helped put our data protection credentials at the core of the Publons platform.
In the one case that ended up on Twitter, scientist Laurent Gatto asked for a download of his raw peer review data. This is not a request we receive often so we don’t have an automated process for it; when we couldn’t instantly provide the data, he took to Twitter, asking to delete his account. I’ve since instructed the team to find a way to do this, and Laurent now has his data. We’re looking at ways to accommodate these requests more efficiently in the future, but it’s more challenging than you would think. It’s not like we have a folder on our desktop for every reviewer with everything packaged and ready to go. The data are spread across many (>10) database models and we have attached various forms of private verification data from editors and publishers that we simply cannot share.
I do want to be clear that the response we’ve received from this deal has been overwhelmingly positive. Aside from a few misconceptions circulating on social media, almost every researcher, publisher, institution, and funder we’ve spoken with was at first surprised — no-one expected Clarivate Analytics to take the lead in peer review — and then incredibly excited at the potential.