A committee of scholars in Finland has decided to downgrade 271 journals from Frontiers and MDPI in their quality rating system, in a move that may discourage researchers from submitting manuscripts to the outlets.
Both publishers criticized the move, first reported in Times Higher Education, as lacking transparency and seeming to target fully open-access publishers.
Finland’s Publication Forum (JUFO) “is a rating and classification system to support the quality assessment of research output,” which factors into government funding for universities, according to its website. “The objective is to encourage Finnish scholars and researchers to publish their research outcomes in high-level domestic and foreign forums.”
JUFO rates journals on a scale from 1 to 3, with higher ratings corresponding to more points for funding. Publications in titles rated “level 0” count the same as popular articles or scientific articles that haven’t been peer-reviewed.
Earlier this year, as we reported, JUFO downgraded 60 journals, most of them from MDPI, from level 1 to level 0 after soliciting feedback from researchers about their experiences with the quality of publications in scientific journals.
JUFO later decided it would downgrade MDPI and Frontiers journals to level 0 in 2025, calling the publications “grey area journals” which “make use of the APC (Article Processing Charge) operating model and aim to increase the number of publications with the minimum time spend for editorial work and quality assessment.” The announcement continued:
One of the most important changes in scientific publishing in Finland is the sharp increase in the number of articles published especially in MDPI and Frontiers open access journals operating with APC fees. The scientific community’s key concern is, whether the costs of open access publishing increases unreasonably, and whether the increase happens at the expense of a thorough quality assessment.
On December 16, JUFO released a list of 271 journals that will be downgraded to level 0 next year, 193 from MDPI and 78 from Frontiers. Based on suggestions from discipline-specific panels of experts, JUFO kept 16 MDPI and 22 Frontiers journals at level 1, the announcement stated.
“We are deeply concerned by JUFO’s recent decision and find it challenging to understand the objectivity of the applied criteria,” Giulia Stefenelli, scientific communications lead for MDPI, said in a statement to Retraction Watch. “The simultaneous downgrade of 271 journals suggests a generalized evaluation process rather than a fair assessment of each journal’s merit.”
In October, the sudden death of a young MDPI employee raised questions about the workplace culture at the company.
In a statement to Retraction Watch, Shirley Dent, head of public relations for Frontiers, said all of the publisher’s downgraded journals meet JUFO’s criteria for level 1 ranking. “Targeting” publishers with author-funded open-access models “is both unfair and arbitrary – the assessment ranking of journals should simply be based on the quality and value of services provided,” Dent said.
Dent also said JUFO’s statement on their decision “does not provide any evidence” but points to “hearsay, anecdote and discredited lists.” Frontiers did not receive any “substantial feedback” about issues to address, “which is the core criterion of any evaluation process.”
“The decision can only be interpreted as an attack on a publishing model, rather than as an assessment of journal quality,” she said, in comments echoing those by Frontiers’ Fred Fenter in the LSE Impact Blog.
Great and sensible news. Hopefully more countries follow.
Many scientists may find it challenging to choose the right journal for publishing their research. To address this, more countries could adopt models like Finland’s Publication Forum. This model ranks journals based on quality and impact. Such systems help researchers and institutions identify suitable journals for their work. Following this approach could guide scientists to publish in reputable journals and improve their research visibility. It might also help institutions align their publishing strategies with global standards.
“To address this, more countries could adopt models like Finland’s Publication Forum. This model ranks journals based on quality and impact.”
To be sure, there are also many problems with the model. The major issue is that Finnish universities’ budgets are partially based on the model; i.e., they get more money when people publish in the high-ranked journals. Anecdotal evidence suggests that even individual scientists are evaluated based on the rankings, although this practice is strictly discouraged by the ranking institution. Someone already commented on the previous post linked that it is basically just another Goodhart’s law in practice.
The second big problem is that it tends to heavily favor the established big players (i.e., Elsevier, Springer, etc.). Thus, in reality, it correlates with impact factors and whatnot. If they used it to promote alternative, preferably society-based OA journals, it might have a positive impact in the long run, provided that other, bigger countries joined the effort.
That said, I think it is a good thing that they now downgrade these pay-to-play publishers heavily.
Publishing in level 0 is a good thing – scientists should communicate in trade and popular science magazines.
Frontiers and MDPI need their own negative rating to make it clear that they’re not on the same level as say, writing for New Scientist or Undark.
It’s indeed a bit strange that popular science magazines are lumped together with poor-quality primary research journals in this system. While they obviously don’t (at least usually) publish original research, the good ones are hard to get into, and in other systems an article there counts as an important contribution to outreach.
My personal experience with MDPI as an author and reviewer doesn’t match the criticisms I have heard. The peer reviews I receive are just as good as those from society journals (we have had very rigorous reviews), and as a reviewer, my reject recommendations are almost always honored. I honestly don’t get it.
I have the same experience, although my rejection was ignored one time at MDPI Cancers – the manuscript was from a moderate-sized, well-known biotech company. I suspect that had more to do with it than anything. By comparison, a popular Springer-Nature journal (Molecular Cancer, IF>20) has ignored my decisions to reject manuscripts for obvious, fatal scientific flaws in methodology and data and published them anyway. Overall, I think the quality of MDPI is not the greatest, but it’s no worse than Scientific Reports or similar journals in most subject areas. The anonymous blacklist seems suspicious considering MDPI was just starting to rise as a publisher. Perhaps it is also an acquisition target. I recall many scandals surrounding BMC journals before Springer-Nature acquired them.
This is very true. I have had the same experience when reviewing for Molecular Cancer. Some of the new Elsevier journals are basically run by the Chinese universities and they proceed to publish many of their own papers, even when I clearly indicate serious flaws in my report and recommend rejection.
The criteria used by the Finland Publication Forum are totally opaque. I am, however, also mindful of the potential impact of the large numbers of submissions at MDPI and Frontiers journals on the whole ecosystem, especially since turnaround times at traditional journals are longer than ever these days. I have been feeling this myself. We may be pushing the few willing reviewers to the edge with more and more ‘trivial’ submissions and publications.
Not surprised to hear that my experience with Molecular Cancer has been shared by others. I know that Chinese academics consider it to be a high impact “watering journal” along with Signal Transduction and Targeted Therapy and others. At this point I prefer to pay attention to research in older journals with strong reputations, but even the quality of those seems to be under attack. I’m getting to the point where I’m basically unwilling to review.
In any discussion I’ve seen, MDPI is only ever defended by those who have published in their journals. I’ve reviewed for several MDPI journals but never published with them. And based on my experience, which is not biased by any interest in defending the quality of my own papers, I fully agree with the JUFO Publication Forum.
Perhaps it is also related to research field. Indeed, the worst examples I’ve experienced as a reviewer over the years are from Elsevier (reported a paper mill and ignored/papers quickly published) and Springer-Nature (Reject decisions ignored repeatedly and papers published without revision), but I have only reviewed for MDPI about 7-8 times. I have published in MDPI twice but not recently. At this point I would not personally publish in MDPI, but that is just because of this reputation issue, not because I’ve experienced anything untoward. Overall, I’d prefer quality to be assessed on a journal level, rather than a publisher level.
For countries adopting the Finland Publication Forum model, this is great news and a big move. It will also bring new challenges.
This is a great move! MDPI is a low-quality publisher. Interestingly, this publisher will only send your paper out for a first round of review, then accept it right away, unlike reputable publishers such as Wiley and Elsevier, where submitted papers undergo several rounds of review ensuring quality. I believe this publisher does not screen papers and just accepts all submissions. Upon submission, authors are required to format their papers in the journal template, suggesting that all papers are accepted with no rejection. After all, this is a business-minded publisher that does not hold science in high regard.
I had a similar experience. I was sent a manuscript with badly flawed science. I (charitably) recommended withdrawal with the right of resubmission. Two weeks later the manuscript was published in its original form. I will not review for MDPI journals as a result.
Due to the current issues with MDPI journals, mentioned also in many comments, adopting the Finland Publication Forum model seems essential. It is true that these journals may have impact factors or might sit in the top quartiles, but a quality check by an independent model ensures adherence to ethics and integrity. This is why this model is becoming more popular and accepted in many countries, including Hungary.
My experience with MDPI Sensors and Energies was very positive. The feedback on my articles was constructive, and the review process was comparable in quality to that of IEEE Access. While I recognize that some MDPI journals may accept unrelated or substandard publications, this varies significantly from journal to journal.
Your mistake is to equate IEEE Access with high quality. You are comparing trash to trashier. MDPI Sensors has absolutely no selection criteria, and their special issues are unremarkable and nothing but a tool to prey on young assistant-professor types.
Sounds like you have a grudge from a previous dealing with MDPI? In my experience, MDPI is professional and will happily reject something that doesn’t meet standards. The fact that they’re open access, and hence aren’t controlled by established narratives, seems to annoy some people.
Anyone can easily see that any MDPI journal publishes hundreds of irrelevant papers in a single issue.
Both MDPI & Frontiers journals are really quite bad. So, this is good news.
Kudos to Finland.
It seems that the problem with the MDPI and Frontiers journals is not that every paper is poorly reviewed or of suspect quality, it is that whatever quality control is in place has allowed a huge number of poor or downright fraudulent papers into their journals.
Individuals saying they had a good experience don’t really say much against the overall analysis that went into the ratings downgrade (a pretty crude tool in its own right).
This means that researchers are left with an unmanageable number of papers to read and track in their fields of endeavour, and too much time is lost reading trash and working out the quality and informativeness of it all. This makes the journals nearly useless for anything other than parking manuscripts and making points on a CV. Hence the low rating.
Agree about the unmanageable number of papers. If we’re being honest – Elsevier and Springer-Nature also have a vast selection of low quality journals with limited QC and questionable peer review. We recently saw Elsevier’s Biomedicine & Pharmacotherapy, a notorious dumping ground that reached a decently high impact factor, placed “on hold” by WOS. Many other journals are avoiding that fate even though the patterns are clear when perusing PubPeer. Indeed, if things are assessed at a publisher level – Elsevier certainly does not reach level 1 either.
Great move. Some MDPI journals publish 9 special issues per day! Of course, most of those issues are filled with mediocre articles that only serve to inflate the authors’ CVs.
Publish papers in any quality journal approved by Clarivate (WoS), irrespective of publisher. However, authors must also inquire about the reputation of the journal. On-time review is important too. If a journal doesn’t follow the set guidelines (policies), it should be delisted by Clarivate (WoS) so that authors don’t submit manuscripts there.
Has this study of journal quality by the Publication Forum (JUFO) been published in a peer-reviewed journal, and is the raw data publicly available (such as on OpenAIRE)? It is quite important to know whether the decision was made using scientific methods.
As a person who has tried publishing in different journals, I must say that when you receive an answer like “we reject your paper because we have many more,” you come to the conclusion that the journal is not worth it. This is the case with Elsevier, IEEE, etc. At least MDPI rejected me because of the reviewers’ opinions. I managed to publish in one of those journals after I took out a yearly subscription, and two years later I saw my paper. Restructure every journal and make them evaluate articles in no more than three months; some scientific degrees depend on it.