The 14 universities with publication metrics researchers say are too good to be true

Saveetha Institute of Medical and Technical Sciences

More than a dozen universities have used “questionable authorship practices” to inflate their publication metrics, authors of a new study say. One university even saw an increase in published articles of nearly 1,500% in the last four years. 

The study, published January 5 in Quantitative Science Studies, “intends to serve as a starting point for broader discussions on balancing the pressures of global competition with maintaining ethical standards in research productivity and authorship practice,” study authors Lokman Meho and Elie Akl, researchers at the American University of Beirut in Lebanon, told Retraction Watch.

Universities manipulating publication metrics have made headlines recently. Highly cited researchers started cutting ties with schools in Saudi Arabia after an investigation revealed that institutions were offering cash in exchange for affiliation — all to boost rankings. In 2023, we covered a case in which a prominent researcher was offered money by a university senior administrator to add his name to publications, outing the scam after not getting paid. And our 2023 investigation with Science uncovered a self-citation scheme at Saveetha Dental College — affiliated with Saveetha Institute of Medical and Technical Sciences, a name you will see again below — used to boost rankings. 

“There has been speculation about universities gaming the metrics system for a while,” sleuth Dorothy Bishop said in an email, “but I’m not aware of any previous attempt to study this formally using bibliometric indicators.” Bishop said “motivations of the universities who take part in this are hard to understand, but it’s clearly done to improve performance on international rankings.”

Meho and Akl used data from Elsevier’s SciVal, Scopus, and Clarivate’s Web of Science to identify 80 universities that experienced growth in research output, as measured by number of published journal articles, of over 100% from 2019 to 2023 — far outpacing the global average of 20%. 

Of these, 14 institutions also saw their rates of first authorship decline by more than 15 percentage points over the four years. This drop, over five times the average decrease of 3 percentage points, could be a sign of questionable authorship practices such as sold or honorary authorship, the researchers said. “Such a dramatic decline often indicates a fundamental shift in how research contributions are distributed within an institution.”
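The two-filter screen described above can be sketched in a few lines of Python. The university names and figures below are hypothetical, invented for illustration; only the thresholds (output growth above 100% and a first-authorship decline of more than 15 percentage points) come from the study as described here:

```python
# Sketch of the study's two screening filters as described above.
# Data are hypothetical; only the thresholds reflect the reported criteria.

# name: (articles 2019, articles 2023,
#        first-authorship rate 2019, first-authorship rate 2023)
universities = {
    "Example University A": (127, 1368, 0.48, 0.25),
    "Example University B": (900, 1100, 0.45, 0.43),
}

def pct_growth(old, new):
    """Percentage growth from old to new."""
    return (new - old) / old * 100

flagged = [
    name
    for name, (a19, a23, fa19, fa23) in universities.items()
    # Filter 1: research output more than doubled (growth above 100%)
    if pct_growth(a19, a23) > 100
    # Filter 2: first-authorship rate fell by more than 15 percentage points
    and (fa19 - fa23) * 100 > 15
]

print(flagged)  # only Example University A passes both filters
```

Example University B grew by only about 22% and so never reaches the second filter; both conditions must hold for a university to be flagged.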

Those 14 universities are: 

| University | Articles published, 2019 | Articles published, 2023 | % change |
| --- | --- | --- | --- |
| Future University in Egypt, New Cairo | 127 | 1,368 | 977 |
| Chandigarh University, Punjab, India | 362 | 2,281 | 530 |
| GLA University, Bharthia, India | 259 | 1,521 | 487 |
| Lovely Professional University, Phagwara, India | 847 | 2,219 | 162 |
| Saveetha Institute of Medical and Technical Sciences, Chennai, India | 1,984 | 3,959 | 100 |
| University of Petroleum and Energy Studies, Uttarakhand, India | 307 | 1,557 | 407 |
| Al-Mustaqbal University College, Hilla, Iraq | 91 | 1,417 | 1,457 |
| Lebanese American University, Beirut | 316 | 2,600 | 723 |
| Al Imam Mohammad Ibn Saud Islamic University, Riyadh, Saudi Arabia* | 370 | 1,591 | 330 |
| King Khalid University, Abha, Saudi Arabia | 1,329 | 5,145 | 287 |
| King Saud University, Riyadh, Saudi Arabia | 4,493 | 11,906 | 165 |
| Prince Sattam Bin Abdulaziz University, Al-Kharj, Saudi Arabia | 750 | 4,388 | 485 |
| Princess Nourah Bint Abdulrahman University, Riyadh, Saudi Arabia | 486 | 4,465 | 819 |
| Taif University, Ta’if, Saudi Arabia | 516 | 2,381 | 361 |
*also referred to as “Imam Mohammad Ibn Saud Islamic University.”
Adapted with data from Meho & Akl (2025)

An earlier preprint of the study results used different thresholds, resulting in the addition of Saveetha University and the removal of three others from the list. None of the 14 universities responded to our request for comment. 

Together, the universities saw a 234% rise in total publications over four years and a 23% drop in rates of first authorship. Eight of the 14 schools ranked at the top of the list of most significant declines in first authorship rates out of the 1,000 most-published universities. By 2023, 11 ranked among the 15 universities with lowest first authorship rates. 

Researchers also looked at hyperprolific authorship, defining it as publishing 40 or more articles annually. Combined, the 14 universities had an increase in hyperprolific authors from 23 in 2019 to 177 in 2023, an increase of 670% and a growth rate 10 times the average. However, this rate wasn’t consistent over all schools: King Saud University, for example, went from four hyperprolific authors in 2019 to 63 in 2023, a 1,500% increase. 

Researchers defined many of the hyperprolific authors among the 14 universities as “noncore,” meaning they published articles with universities they are not directly affiliated with. An increase in authors with multiple affiliations can mean a rise in ethical collaborations. But a sharp increase — as seen in some of the listed universities — can indicate “strategic efforts to amplify research output,” the researchers wrote. 

From 2019 to 2023, the percentage of articles with authors with multiple affiliations remained stable at 18%. The control group, consisting of four universities known for conventional authorship practices — California Institute of Technology, Massachusetts Institute of Technology, Princeton University, and the University of California, Berkeley — had a much lower proportion of papers with authors with multiple affiliations, at 6%. 

Inflating academic output “directly biases the outcomes of ranking systems, compromising their reliability and usefulness,” Meho and Akl write in the study. They conclude with a call to action for universities, funding agencies, and policymakers, among other bodies, to create and enforce more stringent guidelines and review processes around authorship practices. 

Like Retraction Watch? You can make a tax-deductible contribution to support our work, follow us on X or Bluesky, like us on Facebook, add us to your RSS reader, or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at [email protected].


29 thoughts on “The 14 universities with publication metrics researchers say are too good to be true”

  1. The major actors are e.g., “Kanwarpartap Singh Singh Gill” with 160 papers in 2024 despite no prior research background, or other similar individuals, e.g., Vinay Kukreja with 270 articles in 2024, or Imran Ashraf with over 100 papers.

    1. There is huge push among some private universities in India to publish as many research papers and patents as possible to improve their rankings.

I am a student at Chandigarh University, one of the 14 universities mentioned in this paper, where I am currently doing my master’s.

      We are asked to publish at least 2 research papers in 2 years to get our degree. We also have to file at least 1 patent idea. We have a whole department that files patents on our behalf and we give them ideas.

All this is done for NIRF rankings, in which considerable weight is given to the volume of research papers and patents. Hence, to pump up its rankings, our university uses these tactics.

Research papers are generally low quality, with few exceptions. AI tools like ChatGPT and Claude are used by many. The AI content is then given to AI-to-human converters to avoid detection by plagiarism detectors.

Most of the research papers published are in the field of AI or machine learning. Just take a dataset and apply a model that has never been applied to that dataset. That is it. This way we can actually create a new research paper every single day. Sky is the limit. There are even students who sell these papers. We just have to give them 1,300 rupees ($15) and the topic we want. They provide the research paper, usually within a week.

There are universities near mine, like Chitkara University and Lovely Professional University. They all seem to be doing this. For example, the name you mentioned, “Kanwarpartap Singh Singh Gill,” belongs to that university. “Vinay Kukreja” is even a professor there and director of research. You can see they are all publishing these papers in AI and machine learning.

These papers are kinda fraud. No doubt there are a few exceptional students, but most are just publishing fraudulent papers to clear their degrees, because if at least 2 papers are not published we do not get our degrees.

  2. This analysis should exclude conference posters with a DOI; otherwise it is a flawed estimate.
    Such an analysis should focus solely on full manuscripts published yearly. It would be interesting to see if that’s what they did. If not, the authors should consider revisiting their data.

    1. What do you expect? Free journals take an eternity to publish. I submitted a manuscript to a journal in 2013 and got the reviewer’s report in 2024! By then I couldn’t remember anything about the manuscript, nor even trace where I had stored it!

  3. Whatever is written here is 100% true. Saveetha University, especially, follows the worst practices. No respect for researchers. I don’t recommend that anyone work at Saveetha.

  4. So an increase in overall publications must mean something is off, especially if there is a history and it doesn’t fit the historical parameters of older institutions.

  5. Open access journals are not the source; it’s the pressure from the universities to produce research, and the pressure to increase their rankings. They are deliberately looking the other way; they are aware of what’s going on. People getting promoted, winning accolades, increasing rankings. This is all a sham and the house of cards will come tumbling down. Academia is no longer immune to corrupt practices.

  6. Paid journal publications have increased, compromising quality, with multiple authors involved. It is all highly unethical.
    It is difficult to publish a standard research article nowadays.

    1. Well, it could be sort of an AI involved in these papers… And if you notice, most of these universities are in India, Saudi Arabia, and Iraq… wonder why… and who is outing these practices… Could it be a legacy Western university??

  7. The authors of this article must also find the reasons and factors behind this rise in percentages: the pressure from universities for rankings, delayed responses from subscription journals, more attraction and pushing toward open access, ChatGPT, buyers from rich countries, and less monitoring of lab development and fund handling by research funders. Moreover, they must also mention the profits of publishing groups earning billions of dollars. They earn from both authors and readers. These groups reject papers and push toward open access and paid publishing with fees of thousands of dollars. Instead, the authors should write their next article to expose this big mafia.

  8. I think those authors and universities already had many publications before these years, but lack of funding and long response times from open access journals pushed the authors to publish in local journals not indexed by Scopus or Web of Science. When the universities decided to fund them to build good research reputations, all the publications moved to paid Q1 and Q2 journals. This means the report’s analysis is not accurate or fair.

  9. It would be interesting to crosscheck this against retraction data: are these institutions also seeing high volumes of retractions? I also disagree with the comment to exclude conference proceedings; these are valid publications just like journal articles. They are a popular vehicle for paper mills due to low APCs, light peer review and less publisher oversight.

  10. The metrics for this could be applied to any group or universities that have a sudden influx of new researchers (Black, female, etc.).
    Most of the listed universities were started within the last 75 years (some of them with better funding than US schools).
    Those universities have just started to use the ‘western metrics’ of research valuation and are responding to that metric.

  11. Lots of universities are hiring people to write papers for them to boost their metrics. They take PG dissertations and Ph.D. theses, cook up further data, add data from electronic media, use imaginary data, plagiarize, engage in favored authorship, misuse office and official posts, etc., to boost their numbers, h-index and i10-index. Purely unethical practices.

  12. But isn’t the analysis flawed? Don’t get me wrong, I agree with the contents, but they are presenting their own screening methods as research results. For example, from the beginning they pick institutes with a declining % of first authors, and then they say “The study group exhibited a pronounced decline in first authorship publications, which may indicate an increased reliance on external contributions”
