How a widely used ranking system ended up with three fake journals in its top 10 philosophy list

Tomasz Żuradzki

Recently our philosophy faculty at Jagiellonian University in Kraków, like many institutions around the world, introduced a ranking of journals based on Elsevier’s Scopus database to evaluate the research output of its employees for awards and promotions. This database is also used by our institution in the hiring process. 

The database provides three main measures: CiteScore, SJR, and SNIP. CiteScore counts the citations received in a four-year window (e.g. 2020-2023) by texts published in that span and divides this figure by the number of papers published in the same interval. SJR and SNIP – which our institution uses to rank journals – are more complicated, and their full algorithms are not publicly available.
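
The CiteScore arithmetic described above can be sketched in a few lines. This is only an illustration of the publicly stated formula, not Scopus's actual implementation, and the paper count used in the example is invented:

```python
# Sketch of the CiteScore arithmetic: citations received during a four-year
# window (e.g. 2020-2023) by items published in that window, divided by the
# number of items published in the same window.

def citescore(citations_in_window: int, papers_in_window: int) -> float:
    """Return citations per paper over the four-year counting window."""
    return citations_in_window / papers_in_window

# A hypothetical journal with 541 counted citations to 60 papers:
print(round(citescore(541, 60), 2))  # → 9.02
```

Because the denominator is simply the paper count, a small journal whose papers cite each other heavily can reach a very high score with a modest absolute number of citations.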

We checked the Scopus philosophy list and discovered that three journals published by Addleton Academic Publishers – a publisher we had never heard of – are in the top 10 of the 2023 CiteScore ranking: Linguistic and Philosophical Investigations (3rd on the list of 806 philosophy journals indexed by Scopus in 2023), Review of Contemporary Philosophy (5/806), and Analysis and Metaphysics (6/806). All three are also in the top 100 of the 2023 SJR ranking.

Addleton publishes two other journals indexed by Scopus: Knowledge Cultures (67/806 in CiteScore Philosophy ranking and 8/1,106 in Literature and Literary Theory ranking) and Contemporary Readings in Law and Social Justice, which is indexed as a social science journal (33/1025 in Law).  The publisher initially had nine journals in Scopus, but four of them have now been removed because of “Publication Concerns,” according to the most recent Scopus update on May 5. Early in May the company announced on their website that Linguistic and Philosophical Investigations, Review of Contemporary Philosophy, Analysis and Metaphysics, and Contemporary Readings in Law and Social Justice will have a new publisher: Auricle Global Society of Education and Research, based in India.

How was it possible to get into the Scopus top 10 in philosophy? The trick is simple: The Addleton journals extensively cross-cite each other. For example, of 541 citations to Linguistic and Philosophical Investigations used to calculate the 2023 CiteScore, 208 come from journals published by Addleton. Additional citations come mostly from Frontiers and MDPI journals.
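
Using the two figures just quoted, the publisher's self-citation share is easy to check; this quick arithmetic sketch uses only the numbers given above:

```python
# Cross-citation share for Linguistic and Philosophical Investigations,
# using the two figures quoted in the article.
citations_total = 541          # citations counted for the 2023 CiteScore
citations_from_addleton = 208  # of which came from other Addleton journals

share = citations_from_addleton / citations_total
print(f"{share:.0%}")  # → 38%
```

In other words, well over a third of the citations that produced the journal's top-10 CiteScore came from its own publisher's stable.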

These journals are filled with automatically generated papers, all using the same template, extensively using buzzwords such as “blockchain,” “metaverse,” “deep learning,” “immersive visualization,” “neuro-engineering technologies,” and “internet of things.” Most papers claim to examine the recently published literature on these topics by “a quantitative literature review of the main databases.” They also claim to analyze initially (always!) between 170 and 180 articles that satisfied the undisclosed “eligibility criteria.” 


The papers claim that after quality checks with tools called AMSTAR, AXIS, MMAT, ROBIS, etc., the authors decided to focus (always!) on between 30 and 35 articles. Then there are the phrases. “ROBIS assessed the risk of bias in systematic reviews,” “AXIS evaluated the quality of cross-sectional studies,” and “The quality of academic articles was determined and risk of bias was measured by MMAT” appear in Google Scholar 270 times as of early May, solely in journals published by Addleton. 

Although our quick search showed that some authors have real affiliations, mostly in Romania, Slovakia, and the Czech Republic, a substantial share of authors and their affiliations seem to be fake – for example authors at “The Cognitive Labor Institute in New York,” “The Sustainable Industrial Networks Research Unit at CLI in Springfield, IL, USA,” and “The Center for Sensing and Computing Technologies, Bradford at ISBDA” in England, all with “” email addresses.

That domain – also used by some of the editors in chief – belongs to the American Association for Economic Research, which on its website shares an address – 30-18 50th Street, Woodside, New York, NY, 11377 – with Addleton and seems to be a random house in the New York City borough of Queens. 

Authors also use fake grant numbers allegedly from fake institutions – “Grant GE-1420897 from the Internet of Things Sensing Infrastructures Research Unit, Newport, Wales;” “Grant GE-1764317 from the Cyber-Physical Process Monitoring Systems Laboratory, Norwich, England;” “Grant GE-1823847 from the Internet of Things Sensing Networks Research Unit, Plymouth, England.” The same editorial board serves for three journals, with 10 members who are dead. 

When we informally alerted some colleagues involved in introducing these rankings at our institution, we met with indifference. The presence of these fake journals on the relevant lists is apparently perceived to have negligible consequences. However, the lists as used in the employee evaluation process – for example, to nominate researchers for yearly awards – have firm percentile cutoffs. And the fact that three fake journals are among the leaders in the Scopus rankings has the practical consequence that three honest journals which should have received the top score from the perspective of our local evaluation have been pushed to the lower tier.

We have contacted three “authors” with “” email addresses, asking them to send their papers, but they have not replied. We also contacted various members of the journals’ editorial boards. One – Liz Jackson of Hong Kong – said she was invited to be on the board but has since not been asked to do anything. She said she would ask to have her name removed. One of us (LW) also contacted some apparently real authors from the University of Craiova to discuss the content of their papers. None has replied.

Rankings based on Scopus frequently serve universities and funding bodies as indicators of the quality of research, including in philosophy. They play a crucial role in decisions regarding academic awards, hiring, and promotion, and thus may influence the publication strategies of researchers. A recent philosophy article that provides a meta-ranking of philosophy journals claimed “CiteScore is the one [measure] that correlates most strongly with all four meta-rankings. Tentatively this might be a reason to prefer CiteScore as a source of information on a journal if it has no other rankings.” Our findings show that research institutions should refrain from the automatic use of such rankings. 

Tomasz Żuradzki and Leszek Wroński are professors at the Institute of Philosophy & Interdisciplinary Centre for Ethics at Jagiellonian University in Kraków.


16 thoughts on “How a widely used ranking system ended up with three fake journals in its top 10 philosophy list”

  1. Does a journal using a virtual address mean it is fake? How about not having an address at all? Calling a journal fake needs more evidence than that.

    1. May I suggest you read again the sections that describe the automatically generated papers, the fake affiliations and grant numbers, and the pseudo editorial board?

    2. You mean evidence like “The same editorial board serves for three journals, with 10 members who are dead”? I can believe a journal being slow to update information about its editorial board–but three different journals, each with _the same_ ten dead people on their editorial boards?

      The problem with the address isn’t just the unfamiliar email domain. It’s journal “authors” affiliated with three different alleged institutions, institutions that google and duckduckgo have never heard of, with those addresses.

      Yes, Woodside, Queens really exists, and

  2. “Recently our philosophy faculty at Jagiellonian University in Kraków, like many institutions around the world, introduced a ranking of journals based on Elsevier’s Scopus database to evaluate the research output of its employees for awards and promotions. This database is also used by our institution in the hiring process. ”

    As long as this continues, so will publication fraud and papermilling. It is inevitable.

    1. Yes. The human passion for ranking things inevitably collides with the human passion for gaming every system. Any unconsidered, automatic application of numeric rankings to complex human productions leads to absurdity.

  3. See also “Predatory journals in Scopus”:
    The paper is already a bit outdated. But this story is just another iteration on the same topic (of fake, predatory, or whatever you call the problem, questionable journals creeping in under the low bar of entry criteria in the Scopus database).

  4. Great work! In addition, there seems to be a cottage industry in India of acquiring Scopus-indexed journals that are currently defunct, languishing, or in this case outright fraudulent (a pump-and-dump business model?). I suspect that these acquisitions are marketed to a population of Indian faculty and graduate students who are required to publish in Scopus-indexed journals for graduation or promotion – similar to the requirements in China that have incentivized the paper-mill business model polluting the scholarly literature repositories.

    It appears Scopus is trying to get ahead of this with an aggressive attempt to stem the damage as its latest update (1st quarter 2024) has de-indexed a number of these journals.

    Once these journals are de-indexed, most do not reflect this “de-indexing” on their websites (you can imagine my surprise!) – e.g. Journal of Survey in Fisheries Sciences, or the hijacked Journal of Chemical Health Risks – and continue to promote their dumpster fires to the unsuspecting/lazy.

    It appears that once these journals are no longer indexed in Scopus, they are sold(?), renamed, or pawned off to the next Indian predatory publisher to carry the torch. It is a dizzying feat trying to track who the actual, current publishers are.

    Many of these journals follow a pattern described by Anna Abalkina on RW, where the number of publications increases significantly from prior years under the original publisher. A quick glance at the issue content on PSIref will show anyone with just over 2 semi-functional neurons that most articles have nothing to do with the aims and scope of the journal.

    For hundreds of examples, see
    Journal of Survey in Fisheries Sciences on PSIref @
    You don’t have to be an ichthyologist, but if you’re still not convinced, here are 1,000 more examples in
    Educational Administration: Theory and Practice

    The publishers responsible for a number of these have the DOI prefixes 10.53555/, reported as “Green Publication” and 10.52783/ reported as “Science Research Society” in Crossref, respectively.

    Not sure how we address this nonsense as a community (not going to bombard PubPeer and waste people’s time), but we’re open to ideas as long as the solution doesn’t become another data silo. We are starting by adding a visible flag to the journal pages in our indexes on PSIref in hopes of bringing attention to the unsuspecting. We like to think of it as “turning the lights on” and bringing a bit of transparency (and public scrutiny) to some dark corners.
    Here are the publishers mentioned with their current stable periodicals (under the “Periodicals” tab) that may or may not deserve some extra attention.

    Science Research Society

    Auricle Global Society of Education and Research

    Green Publication

    Addleton Academic Publishers

    Ninety Nine Publication (specifically, International Journal of Psychosocial Rehabilitation appears to be a sale/transfer from Hampstead Psychological Associates, to Green Publication, then de-indexed by Scopus and “acquired” by Ninety Nine Publication)

  5. The advertised list of contributors for Linguistic and Philosophical Investigations is pretty impressive. But when you look at the tables of contents for recent issues, they seem to match what you describe. In fact there seems to be a switchover, in 2020, from a fairly normal (and probably not highly cited) journal to what you describe.

  6. I’m a bit confused — what is the incentive to create a fake journal like this? Are they hoping that real philosophers will be duped into sending them articles for consideration? It seems like a lot of trouble to go to, just to have a few philosophers submit articles that, let’s face it, will probably not be read by very many people. It’s just strange to me.

    1. Journals with a high rank attract authors. So, once one of these journals has a high rank, the publishers will add a charge for publication and earn $$.

      1. I don’t buy this – these publishers are so obviously fake, anyone who is serious about their work surely would not deal with them? I think the question remains, who is benefiting from this elaborate hoax?

        1. Some authors presumably don’t care that journals are obviously fake (as paper mills otherwise would find no customers), as long as they can claim to the paper pushers who evaluate them that they published in a high rank journal.

        2. I think that’s a good question and one of the reasons may be to steal intellectual property.
          How, you ask? The fake journals will trick some people who are doing original work, and barely starting to develop ideas, into sending in their manuscripts. Why would these people be tricked? Because they might be working at institutions that do not have a strong research culture and do not have the natural safeguards of norms that institutions with a strong research culture have.
          Why would it be “theft” if the goal was to publish the research anyway? Because, as I said, there might be valuable original ideas just beginning to be developed. In a different context – an institution with a research culture, resources, and connections – they would be directed towards venues where they could be fully developed and used to create novel technology.
          Anyway, just an idea. Not inconsistent with some of the other motivations for fake journals that have been mentioned. The specializations of the fake journals/topics of the fake articles could lend support to the hypothesis.

  7. Thanks a lot for the insight, but why did you start producing a journal ranking anyway?
    Authors should never be evaluated by the “quality” of the journal they have published in. This is not good scientific practice, and avoiding journal-based metrics is one of the key recommendations of DORA. Academic institutions should stop doing this.

  8. So, the idea is to automatically generate a lot of papers that are citing each other, so they look as if they are valuable research, then publish them in your own scientific journals, so they look like they are valuable. Last step is to charge for publishing a paper for those who want to score a lot of points, even if their research isn’t very valuable. Did I get it right? Disgusting practice.
