A panel of scholars in Finland has downgraded 60 journals in its quality rating system, following months of review and feedback from researchers.
The Finnish Publication Forum (JUFO) classifies and rates journals and other scholarly publications to “support the quality assessment of academic research,” according to its website. JUFO considers the level of transparency, the number of experts on a publication’s editorial board, and the standard of peer review to make its assessment, which academics can use to determine the credibility of a given title or its publisher.
JUFO’s classification ranges from 3, for “supreme-level” publications, down to 1, which still counts as legitimate publication. Level 0 means a journal is excluded from the ranking, which may dissuade researchers from publishing in it, said James Heathers, a scientific sleuth. Finland’s university funding model relies on JUFO as a publication quality metric.
In February, JUFO posted an open call asking for negative experiences to help re-evaluate the ranking system.
Heathers and Veli-Matti Karhulahti, a senior researcher at the Faculty of Humanities and Social Sciences at the University of Jyväskylä, gathered roughly 200 different publication-related experiences from Bluesky and X users and submitted them to JUFO.
The 60 journals will be downgraded from level 1 to level 0 at the start of 2025. Of these, 21 are from MDPI, three from Wiley, and three from Frontiers.
In 2023, JUFO downgraded MDPI’s Sustainability to level 0 over doubts about the journal’s “procedures to ensure scientific quality work reliably,” as we reported. At JUFO’s June meeting, the group discussed the possibility of downgrading all MDPI journals to level 0 but did not take any action on the idea. MDPI did not respond to our request for comment.
Tom Ciavarella, head of public affairs for Frontiers North America, said:
Finland and Frontiers established a national Open Access agreement in 2022 and enjoy a strong collaborative relationship. We have been made aware of the recent decision by JUFO to list three Frontiers journals — Frontiers in Plant Science, Frontiers in Marine Science, and Frontiers in Sociology — at level 0 starting January 1, 2025. Frontiers is seeking information from JUFO about its decision. JUFO did not communicate any concerns to Frontiers about these journals prior to their change in status. These are journals that have strong support in the respective research communities, including in Finland.
“Wiley has been transparent about our commitment to cleaning the scholarly record, investing in research integrity, and strengthening publishing processes. These journals continue to serve their communities,” a spokesperson for Wiley said.
The X post by Heathers that gained the most traction specifically asked for experiences with MDPI, Hindawi and Frontiers, Karhulahti said, so the submissions aren’t entirely representative of experiences across all publishers.
Speaking about the submissions they received, Heathers said:
Many of these journals were making unreasonable demands of their reviewers, or rejecting their opinion entirely, providing slipshod review processes, are run by dismissive, disinterested or just completely bent editors, and so on.
Heathers and Karhulahti also submitted to JUFO reports of good practice by journals, he said.
“We recognise about half the journals from the submissions we made, so it’s likely that we provided a lot of the raw fuel that underpinned those downgrades,” Heathers told Retraction Watch. A spokesperson for JUFO declined to comment on individuals who gave feedback.
Although the campaign has ended, the scientific community can give JUFO feedback on the quality of publications at any time, a spokesperson for JUFO said.
Karhulahti urged more countries to stop relying on publisher-generated metrics of quality, such as indexing in Scopus, and instead use researcher- or community-produced indexes like JUFO.
“All in all, a very successful experiment,” Heathers said, “and one we would happily repeat.”
Update, 1930 UTC, 7/1/24: MDPI is “deeply disappointed to see 20 MDPI journals moved from Level 1 to Level 0, especially without proper explanation or reasoning,” said Rui Duarte, the public relations manager for MDPI. Despite this, Duarte said, the company wants to assure Finnish authors that they can continue to publish with MDPI with confidence, as the publisher has 274 journals included in Level 1.
“While we are perplexed by JUFO’s decisions, we understand that the evaluation of journals is part of the process. We will make efforts to establish open communication with JUFO to better understand their decisions, educate them about MDPI and our editorial process, and clarify any misunderstandings or concerns they may have,” he said.
I am glad such a re-evaluation can be done. For MDPI, the huge amount of money they charge for publication is not acceptable. In terms of the quality of review, they do make some effort to provide good feedback on papers, but the biggest problem is the fees they charge, which are just unreasonable.
With MDPI, things are pretty obvious. You are generally asked to complete revisions in five days, and if you have a semi-decent h-index, they will accept the paper with any trash put there instead of real answers.
Having a system in place that allows for “manual” downgrading of journals is certainly a good thing.
However, given the number of journals that JUFO needs to evaluate, you have to admit that gathering feedback from 180 individual experiences and then downgrading 60 journals based on it has issues. If we assume that ALL feedback was negative and ALL negative feedback resulted in a downgrade, there are 3 cases on average for each journal. In the parlance of evidence-based medicine, some of these cases might be anecdotes.
Don’t get me wrong: I am not saying these decisions are wrong, and neither is it wrong to gather feedback from researchers. In fact, my own experience with one of the downgraded journals (International Journal of Molecular Sciences) is pretty much indicative of predatory publishing. However, the data underlying these decisions are not the best, and it would be surprising if every single one of these decisions held up to scrutiny. It is difficult to imagine how such feedback could not be biased: it is self-reporting, and researchers might have a large incentive to report negative experiences.
The second big issue is transparency. The fact that I have to speculate in the paragraph above about the numbers that have led to the downgrades is worrisome:
– How much negative feedback results in a downgrade?
– What exactly counts as negative feedback?
– Is all negative feedback treated equally?
– Do we trust the feedback, or is evidence required?
– Is the feedback anonymous or not?
– Does JUFO communicate with the journal/publisher in order to allow them to respond to accusations of misconduct?
I cannot find satisfactory answers to these questions. As long as these decisions are made behind closed doors and without oversight, we just exchange old problems for new ones. Goodhart’s law applies not only to the “Impact Factor” but also to JUFO: “When a measure becomes a target, it ceases to be a good measure.”
Cases of editorial misconduct have also become public from journals classified as JUFO 1–3, and anecdotes are even more abundant. But the threshold of evidence ought to be higher than just anecdotes. JUFO has some protection from the fact that it is relevant only for less than 0.1% of the world’s population (about 5.5 million people live in Finland). This protection by size is, at the same time, JUFO’s weakness: does it make sense to spend so many resources on a classification system used by only 5.5 million people? The world would need more than 1,000 such classification systems to offer protection by size.
At the moment, most publishers won’t bother to specifically game JUFO, but that could change rather soon with the rise of AI. And there are dangers that do not require AI: what if a handful of Finnish researchers team up to downgrade a journal they don’t like? I wouldn’t be surprised to learn that this has already happened. The lack of transparency does not instil confidence.
Disclosure: The author has applied multiple times to join the JUFO expert groups to gain insight into JUFO’s inner workings but has never been selected. That raises the next question: how does JUFO ensure a balanced composition of its expert panels, and how are its members selected?
I can’t agree more with your comment! I really like how you approached and analysed the issue here.
To be honest, my experience reviewing and publishing with MDPI is FAR better than my experience with Frontiers. I have had some pretty bad experiences with Elsevier journals as well, especially those run by Chinese universities. Standards are almost non-existent.
In my opinion, all Frontiers journals should be downgraded – but again, that’s just my opinion, and decisions that can have wider implications should be made in a fair and transparent manner. The same applies to decisions by organisations like Clarivate to suppress a journal’s Impact Factor (IJERPH is no more or less problematic than IJMS, in my opinion!).
This type of rant about “transparency”, “robust criteria” and all similar stuff is nothing new. It is exactly how, years earlier, Grudniewicz and friends managed to talk down all organized attempts to combat predatory publishing. And it is why predatory journals still quite regularly appear in RW posts.
Let’s see if the community buys into it once again.
Good points made by Michael Jeltsch; I can add a bit more on the questionable quality of this forum.
First, to paraphrase my old professor, “you only get answers to the questions you ask.” In other words, if you ask for bad experiences with the ‘usual suspects’ MDPI, Hindawi, Frontiers, etc., you will get them.
More importantly, it is a big mystery why they are ‘downgrading’ certain titles only now and not many years ago, since for a number of titles it is already well known that they suffer from ‘issues’. See, for example:
– ECS Transactions, published by IOP Publishing, was discontinued in Scopus in 2022
– Journal of Advanced Oxidation Technologies used to be published by De Gruyter but, as far as I can see, has not been active since 2017/18!
– Arabian Journal of Geosciences, published by Springer Nature, has been discontinued in Scopus since 2021
– Asia Life Sciences has been discontinued in Scopus since 2020
– Life Science Journal: Acta Zhengzhou University Overseas Edition has been discontinued in Scopus since 2014!
– Journal of Advanced Zoology has been discontinued in Scopus since 2022
– NeuroQuantology has been discontinued in Scopus since 2022 and has been a journal with issues for a long time; see for example https://www.researchgate.net/post/What_is_the_current_status_of_the_journal_NeuroQuantology_indexed_in_Scopus
– American Journal of Neurodegenerative Disease has been discontinued in Scopus since 2017; it is published by E-Century Publishing Corporation, a publisher that managed to get all of its journal titles discontinued in Scopus ( https://www.scimagojr.com/journalsearch.php?q=E-Century%20Publishing%20Corporation&tip=pub )
– Oncoscience has been discontinued in Scopus since as far back as 2017!
– International Medical Journal has been discontinued in Scopus and is an identified example of a journal with hijacked versions ( https://retractionwatch.com/the-retraction-watch-hijacked-journal-checker/ )
– Library Philosophy and Practice has been discontinued in Scopus since 2021
– Migration Letters was discontinued in Scopus in 2022. See also https://www.researchgate.net/post/What_is_the_real_homepage_URL_of_migration_letters_journal
– Revista de Educacion de las Ciencias: I can’t find any evidence that this journal still exists
– Journal of Namibian Studies: History Politics Culture has been identified as predatory ( https://www.universityworldnews.com/post.php?story=20231213193936560 ) and discontinued in Scopus
So, to put it mildly, at the very least this shows how hard it is to keep your ‘indexing’ up to date. Journals and publishers unfortunately sometimes turn into predatory ones (or get hijacked).
Sometimes things all come together. In this weekend’s overview here on Retraction Watch ( https://retractionwatch.com/2024/06/29/weekend-reads-huge-cash-bonuses-for-publishing-in-nature-fines-for-buying-authorship-have-retraction-notices-improved/#more-129528 ) there is this astonishing story ( https://www.statnews.com/2024/06/20/richard-lynn-racist-research-articles-journals-retractions/ ) about a highly controversial person, which refers prominently to the ‘journal’ “Mankind Quarterly”. This is an obvious example of a journal with ‘issues’, one I personally would easily qualify as pseudo-scientific.
The last journal title in the list of 60 downgraded journals mentioned here is… “Mankind Quarterly”. I seriously wonder how it is possible that this ‘journal’ is only being downgraded now and not decades ago.
The author’s suggested solution, that whitelists should be based on the opinions of researchers who publish in those journals, is also not fail-safe. Authors have a vested interest in rating the journals in which they publish (and those journals’ reviewing processes) highly, especially if their promotions or research incentives are based on the ranking or stars of the journals where they publish.
Dear colleagues, let’s be honest: _any_ quantitative metric of a journal’s or a researcher’s “issues” or “high quality” can be fooled, in most cases with astonishing ease. Yet we need such metrics, as well as discussions of their flaws. And we very much need organizations and groups who invest their efforts in cleaning the field of, at the very least, the most evident problems. The criticism above, probably deserved by JUFO, mostly amounts to asking the group to be faster in its cleaning work (plus a generic call for transparency). But I would like to thank them for stepping up: better late than never.