Up to one in seven submissions to hundreds of Wiley journals flagged by new paper mill tool

Wiley, whose Hindawi subsidiary has attracted thousands of paper mill papers that later needed to be retracted, has seen widespread paper mill activity among hundreds of its journals, it announced yesterday.

More than 270 of its titles rejected between 600 and 1,000 papers per month before peer review after implementing a pilot of what the publisher calls its Papermill Detection service. The service flagged 10% to 13% of the 10,000 manuscripts submitted to those journals each month, Wiley told Retraction Watch.

Wiley said the service comprises “six distinct tools”: checking for similarities with known paper mill papers, searching for “tortured phrases” and other problematic passages, flagging “irregular publishing patterns by paper authors,” verifying researcher identity, detecting hallmarks of generative AI, and analyzing the relevance of a given manuscript to the journal.
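Wiley has not published technical details of how these six checks are combined. Purely as an illustration of the general idea, the sketch below strings two of the simpler signals, tortured-phrase matching and a crude scope check, into a screening pass that flags a manuscript for editorial attention rather than rejecting it outright. The phrase list, the keyword heuristic, and all function names are assumptions made for the example, not a description of Wiley's tooling.

```python
# Illustrative sketch only: a toy screening pass in the spirit of the checks
# described above; it does not reflect Wiley's actual implementation.
from dataclasses import dataclass, field
from typing import Dict, List, Set

# Hypothetical list of "tortured phrases" (awkward paraphrases that often
# betray automated rewriting), e.g. "creepy-crawly kingdom" for "ant colony".
TORTURED_PHRASES: List[str] = [
    "creepy-crawly kingdom",      # "ant colony"
    "counterfeit consciousness",  # "artificial intelligence"
    "bosom peril",                # "breast cancer"
]

@dataclass
class Submission:
    title: str
    abstract: str
    body: str
    authors: List[str] = field(default_factory=list)

def flag_tortured_phrases(sub: Submission) -> List[str]:
    """Return any known tortured phrases found in the manuscript text."""
    text = f"{sub.title} {sub.abstract} {sub.body}".lower()
    return [p for p in TORTURED_PHRASES if p in text]

def flag_scope_mismatch(sub: Submission, journal_keywords: Set[str]) -> bool:
    """Crude relevance check: does the title/abstract share any journal keyword?"""
    words = set(f"{sub.title} {sub.abstract}".lower().split())
    return not (words & journal_keywords)

def screen(sub: Submission, journal_keywords: Set[str]) -> Dict[str, object]:
    """Combine signals; a hit routes the paper to an editor, not to auto-rejection."""
    report: Dict[str, object] = {
        "tortured_phrases": flag_tortured_phrases(sub),
        "scope_mismatch": flag_scope_mismatch(sub, journal_keywords),
    }
    report["flagged"] = bool(report["tortured_phrases"]) or bool(report["scope_mismatch"])
    return report

if __name__ == "__main__":
    sub = Submission(
        title="Optimization via the creepy-crawly kingdom algorithm",
        abstract="We study swarm behaviour for routing problems.",
        body="...",
    )
    print(screen(sub, journal_keywords={"oncology", "immunology"}))
```

A production system would of course rely on the publisher's own classifiers, identity checks, and shared databases of known paper mill output rather than hard-coded lists like these.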

Wiley will now “advance this new service into the next phase of testing in partnership with Sage and IEEE,” a spokesperson said.

“This service is complementary to the STM Integrity Hub, which has been established to provide a shared infrastructure both for screening and information sharing across publishers,” the spokesperson told Retraction Watch. The service does not make use of another product, the Papermill Alarm from Clear Skies, which is incorporated into the Integrity Hub, the spokesperson added.

Asked what Wiley would tell authors of rejected papers, or whether they would alert any other publishers, the spokesperson said:

Wiley’s Papermill Detection service is meant to supplement human integrity checks with AI-powered tools. This means that papers will not automatically be rejected if they are flagged in the system – rather, they will be flagged to an editor for closer consideration before proceeding in the publishing workflow.

Research integrity is an industry-wide challenge, and we are committed to transparency and sharing what we learn about papermills with our peers and the wider industry. We will continue to do so as we learn more through the continued testing and piloting of this service.  

We also asked if Wiley has considered steps to reduce the incentives authors have to use paper mills, rather than just working to detect them:

Yes, this is a problem we must address across the entire scholarly communications ecosystem. Wiley agrees with the findings of the 2022 joint report between COPE and STM which calls for direct engagement with funders, universities and hospitals to create new incentives. The United2Act initiative, which Wiley endorses and contributes to, has been organized to bring those stakeholders together.  One of their five working groups is focused directly on this important dialog between the stakeholders in the global academic reward systems.

Wiley will stop using the Hindawi brand, it said late last year, after it paused publication of lucrative special issues because they were overrun by paper mills. That move cost the company, which publishes about 1,600 journals, millions of dollars. CEO Brian Napack stepped down in October 2023 amid the bad news.


16 thoughts on “Up to one in seven submissions to hundreds of Wiley journals flagged by new paper mill tool”

  1. CEO Brian Napack stepped down in October 2023 amid the bad news.

    After the acquisition of Arrakis Press by Harkonnen Enterprises and the subsequent drive to recoup the costs by maximising spice production, CEO Glossu Rabban stepped down in December 10193 AG amid the bad news.

  2. While I applaud the attempt to filter bad papers out earlier in the process, we are seeing papers where the generative AI said “I can’t do that” and the paper was published. We are seeing papers where “ant colony” is substituted with “creepy-crawly kingdom” and the paper was published. We are seeing papers with images of mice with four testicles, and the paper was published….

    What this says to me is, these journals have no meaningful peer review. I’d fix that rather than applying filters upstream. Otherwise you’ll end up with higher-grade sewage, but it will still be sewage.

    1. The fact that you reached a single conclusion when throughput is always the result of plural causation is a comment on your appreciation for science, or lack thereof, Mary.

      1. The fact that you are resorting to ad hominems instead of directly tackling their argument (that there is a real lack of peer review at many or all of these journals) would lead readers to assume you don’t have any real arguments against it and are falling back on rhetorical tactics instead, Maud’i.

  3. Well, the existence of some examples does not mean that the whole peer review system is necessarily terrible. The evidence you mention is not sufficient; we need better data.

    But I agree that the peer review process at Hindawi is terrible. Most papers never get a chance to be peer reviewed, because no one agrees to review for Hindawi journals. So most papers get withdrawn after months of waiting for someone to review them.

    If they are reviewed, it is usually by inexperienced junior researchers and students, who just write a couple of lines as their so-called review. So most of the time, reviews, if there are any, are very superficial and worthless. Sometimes they are even technically incorrect.

    In recent months, Hindawi has taken some measures to improve this; now staff check the length of reviews, and if they are too short, they ask the editor to invite more reviewers.

    This is better than before, but not by much. Now, at best, we will have two superficial reviews instead of only one, at the cost of a lot more waiting.

    1. I was giving illustrative examples for flavor, not trying to make a statistical case. But I think that when 1/7 of articles can be caught by automated pre-screening, and it’s necessary to pre-screen because *otherwise these could get published*, it’s pretty clear that peer review is failing. The problems caught by the Problematic Paper Screener are not subtle: any human reading the paper should catch them. It’s hard not to suspect that no one reads those papers.
      I think peer reviewers probably have to be paid. I spend a minimum of 2-3 hours on a review and it’s time away from my research. It’s hard to justify.

      1. I agree on monetary payment or other forms of compensation for reviewers’ time and quality work.

        Though payment should be per 500 words of review: for example, X dollars for every 500 words. Otherwise, if there is a fixed review fee regardless of the length of the review, anybody would rush to write a couple of lines as their so-called review, just to collect the whole fee.

        Also, quotations and excerpts from the manuscript should be discarded before counting the words. Otherwise, many people will pad their reviews with lots of excerpts from the manuscript.

        Besides, universities already take the peer-review record on one’s CV into account. So even without monetary gain, for most people who teach at a university, peer reviewing still translates into academic promotion and increased income, as well as more success and prestige among peers.
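        Taken literally, the scheme proposed above is just a word count with quoted material stripped out before counting. A minimal sketch of that arithmetic, with the 500-word unit and a placeholder per-unit rate, might look like this; the rule that quoted excerpts are the passages wrapped in double quotes is an assumption made purely for illustration:

        ```python
        import re

        def review_fee(review_text: str, rate_per_unit: float, unit_words: int = 500) -> float:
            """Illustrative only: pay a fixed rate for every full 500 words of review,
            counting words only after discarding quoted excerpts from the manuscript."""
            # Assumption: quoted excerpts are the passages wrapped in double quotes.
            stripped = re.sub(r'"[^"]*"', " ", review_text)
            word_count = len(stripped.split())
            return (word_count // unit_words) * rate_per_unit

        # Example: a 1,230-word review containing 200 quoted words counts as 1,030
        # words, i.e. two full 500-word units, so review_fee(text, 50) returns 100.0.
        ```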

  4. Hindawi is effectively fighting back against paper mills and misconduct, at the expense of time and speed:

    Recently, Hindawi staff at its ETHICS department evaluate each submitted manuscript with utmost OCD, at the expense of months and months of time, BEFORE handing the manuscript to an editor. They run a very in-depth analysis, using a very long checklist, examining even the most trivial things:

    1. The title should not be similar to previous studies. Otherwise, they ask the authors to change it.

    2. They check to see if the paper or any version of it has been previously submitted to another Hindawi journal and rejected.

    If the article was previously rejected by another Hindawi journal, the extent of changes must be considerable; otherwise, they reject it.

    3. If the article was previously rejected by another Hindawi journal, the authors must revise it properly and also send a Rebuttal Letter along with the submission, addressing the reviewers’ concerns.

    4. The number of authors. If there are too many authors (8 or more), they will suspect paper mill activity and ask the authors to explain explicitly why there are so many. In many cases, despite the authors’ explanation, they deem the paper not ethical enough and reject it. I like this very much. All journals should be sensitive to the number of authors.

    5. They carefully read the author contribution and ethics declarations reported in the manuscript, and in the case of even the smallest omissions, they ask the authors to fix them.

    6. They ask the authors to confirm that the order and list of authors is final and no author will be added or removed in the middle of submission.

    7. They will check ALL the references, ONE by ONE. Meticulously. Really good (but slow).

    8. If there are more than 3 references cited at once somewhere in the text, they ask the authors to avoid BULK CITATION and reduce the number of references, or explain each reference separately.

    9. They check the relevance of EACH reference to the sentence that cites it. Once they asked me “what is the relevance of reference X to the sentence citing it?” and I had to explain it to them! Very good.

    10. They check text similarity using iThenticate. Though this is not new; they have always done this.

    11. They check EACH of the authors very carefully. They search the internet for EACH of the authors one by one and find their email addresses on their previously published articles. They try their best to verify each author, rather than simply trusting the authors.

    12. They ask each author to reply from his or her own email address to confirm that he or she is really an author. Until all authors verify this, Hindawi will not move to the next step.

    13. Sometimes, they don’t believe the authors. For example, a couple of times, our first author was a medical resident with a personal email address. They DIDN’T believe us. They asked us to do one of the following: Either 1. The resident had to provide an academic email address. OR 2. The department head had to send a signed letter on official university letterhead, verifying that this person was indeed a resident in her department, and she had to do this from her academic email address. OR 3. The same had to be done by the dean of the university, again on official university letterhead and from the dean’s academic email address. ALL OF THIS JUST TO VERIFY ONE AUTHOR. Isn’t that something?

    14. They meticulously check the integrity of the editor and reviewers, as well as any conflicts of interest.

    They do lots of other things on their checklist too, but I guess you got the point.

    These SECURITY measures are beyond awesome. But they have made the process very very very slow.

    P.S. I have NO affiliation whatsoever with Hindawi or any of its journals. I am just an author and reviewer at Hindawi who is in awe of Hindawi’s recent progress in catching up.

      1. What is the relevance of Poe’s law?

        I know Hindawi is supposed to be no more; but as an active author and reviewer at journals published by both Hindawi and Wiley, I know Hindawi still exists completely independently from Wiley (for now).

        When you want to submit a paper to a Hindawi journal, everything still differs from Wiley. From the journal’s website (which is still branded Hindawi) to the editorial manager site for manuscript submission, the initial screening, the editorial assessments, the peer review system, and the ethics assessment, all of it is 100% separate and independent from Wiley. It takes time for the merger to complete.

        See? Hindawi’s there: https://www.hindawi.com/
        Want to submit a paper? You must submit through Hindawi: https://www.hindawi.com/journals/mi/
        Nothing there is Wiley, YET.

        But then again, what was the relevance of Poe’s law? Be clear please.

        1. The distinction between attempts at satire and what people believe (or, at least, say) can be subtle. My assessment of your comment was that it was probably intended as satire, but apparently that was not the case.

          1. Thanks for clarifying. Yes, I was giving my real opinion of Hindawi trying to catch up and become a responsible publisher. But it has really become very, very slow at screening. It takes 1 to 4 months (usually around 2 months) and many rounds of revisions just for initial screening!

            Then it takes months just to find an interested editor who agrees to handle the paper; most of them refuse. Then many more months to find peer reviewers who agree to review AND who actually send in their reviews.

            Most of the time, no one shows up, and the paper is force-withdrawn by Hindawi after, say, 7 or 8 months of fruitless waiting. Hence all those low acceptance rates.

  5. The most pertinent question is where the Science Citation Index (now Web of Science) journal quality control went. Some 20 to 25 years ago or earlier, journals were accepted into that system only after a long process of analysis, and could easily disappear from the index. The move of publications from paper to the Internet made it possible for many of these journals to lose quality. The Science Citation Index should maintain high quality and novelty for, say, 90% to 95% of articles per issue. With eroded standards, the citation metrics are losing their value too. Bringing back SCI-level quality control could limit the number of serious journals to, say, 10% of the current volume and clearly improve standards.
