Elsevier’s Scopus database has paused indexing content from Sustainability, an MDPI journal, while it reevaluates whether to include the title, Retraction Watch has learned.
Please see an update on this post.
Other MDPI titles were reevaluated in 2023, and its mathematics journal Axioms is no longer included in Scopus’ nearly 30,000 titles. Clarivate also delisted two MDPI journals, including the International Journal of Environmental Research and Public Health, from its Web of Science index earlier this year, meaning those journals will no longer receive impact factors.
Universities and funders use Scopus to create “whitelists” of journals in which authors are encouraged to publish, so removal from the index can influence submissions.
In 2022, Norway removed Sustainability from its list of journals that researchers get credit for publishing in, and Finland followed suit at the beginning of 2023. In the announcement of its decision, the Finnish Publication Forum wrote:
Sustainability also publishes high-quality articles, but the wide scope, large publication volume and fast publication processes have undermined confidence that the journal’s procedures to ensure scientific quality work reliably down the line. The large variability in quality is partly the result of thousands of special issues that are common also in other MDPI journals.
The number of articles from Sustainability indexed in Scopus has increased nearly every year since 2009, its first year of coverage, when 78 articles were indexed. In 2022, the journal published over 17,000 articles. Scopus indexed about 13,500 in 2023, before the pause.
Staff for Sustainability learned on October 30 that Scopus’ Content Selection and Advisory Board (CSAB) was reevaluating the journal, according to Elaine Li, the managing editor.
Li confirmed the journal’s indexing is paused due to the reevaluation. If the process concludes positively, the content put on hold will be indexed within four weeks, she said.
According to Scopus’ title reevaluation policy, the index identifies “outlier and underperforming journals” for scrutiny based on citation metrics and benchmarks compared to other titles in the same field, when “legitimate” concerns are raised about the journal or publisher, or if Scopus’ algorithm flags outlier behavior. The CSAB can also decide the journal should be evaluated again.
Li told us:
No specific concerns were raised when the editorial office was asked to provide information required for the re-evaluation, therefore we are assuming that the reason for this re-evaluation is the continuous curation based on CSAB feedback.
“Several other journals of MDPI’s portfolio have already undergone re-evaluation” in 2023, Li said, “and the majority was evaluated positively and is continuing to be indexed in Scopus.”
We asked Elsevier for more information, and a spokesperson responded:
We consider information about journal reviews as confidential. Once decisions are made relevant outcomes will be communicated through the regular channels.
Update, 1/2/24, 2030 UTC: MDPI CEO Stefan Tochev told us after this story was published:
As at January 2024, MDPI has 269 of journals indexed in Scopus. Notably, 54 of these journals were added in 2023, underscoring our dedication to maintaining rigorous academic standards across various disciplines. While we acknowledge Scopus’ reevaluation process for specific journals, it’s crucial to understand the scope and depth of our presence in this indexing database.
Like Retraction Watch? You can make a tax-deductible contribution to support our work, subscribe to our free daily digest or paid weekly update, follow us on Twitter, like us on Facebook, or add us to your RSS reader. If you find a retraction that’s not in The Retraction Watch Database, you can let us know here. For comments or feedback, email us at [email protected].
Here is to another one biting the dust! \o/
Now this was long overdue. At some point, these predatory journals had to be reined in.
The best way to check predatory, fake, or substandard research journals is for universities and research organisations not to give the publishers of such journals any credence when publishing proceedings, seminar, or workshop papers. Once such a journal is given publicity on any platform for researchers and academics, it becomes bait for early-career researchers and students. Publishers multiply their publications and journals without proper quality scrutiny, as the lure of speedy and easy publication is too big to ignore.
That is right, especially the IJSRT journal. The way their publicity has been appearing on so many credible platforms, such as Gmail and YouTube, is very bad. I have been a victim of such a predatory journal, and only realized it when no reviewer comments came. Such journals must be stopped ASAP.
That journal is pure crap.
Wait
In my field, traditional journals have become pretty much useless—with slow, opaque, and unpredictable review processes as well as what appear to be cliquey editorial/review practices which serve to feed into their own antiquated schools of thought/methods, etc. seemingly without accountability, etc. When a paper is rejected by reviewers who are stuck with methods and theories from decades ago, nobody (including the reviewers themselves) benefits. As a result, journals from publishers like MDPI, Frontiers, PLOS, etc. have become indispensable for younger (and more productive) researchers.
Perhaps as a result, more and more interesting/cutting edge papers have been coming out in journals by these publishers and fewer and fewer from the traditional publishers. I genuinely believe the days of determining the quality of the manuscripts by the outlets may be behind us at this point—although, based on the comments I see here, some people do unfortunately still seem to believe journals/publishers determine the quality of the papers.
God, I couldn’t disagree more with your comment. Not one respectable researcher in my field goes to PLOS, MDPI, Frontiers. Not one.
Having reviewed for MDPI several times myself, I cannot vouch for their selectivity; it is close to zero. Traditional journals, run by societies, are still the norm and the bread and butter of good research. To claim Frontiers publishes good research is moronic: they have an algorithm for selecting reviewers, and the editors themselves do nothing and control nothing. If you think this is the future, good riddance.
I think you are partly missing Joao’s point here. He is not arguing that MDPI journals are selective, but rather that they are more ‘inclusive’. A number of my colleagues have also reflected this view, because traditional subscription-based journals have very low acceptance rates (and seem to have become a bit of a ‘big boys’ club’, where you have to know someone or already be well established in your field to get in), and even after a paper is accepted, it can take MONTHS before it is finally published and available online.
I am not defending Sustainability, because in my opinion and experience reviewing for the journal, they really have no bar and accept almost every single submission. They did reject a number of the papers for which I STRONGLY recommended rejection, but the standards are definitely low. Still, I do take the point that there are some (however scarce) good papers here, and we should really judge a paper by its own merit and not the journal it is published in.
I concur: an article should be judged by the quality of its content and not simply on the journal it is eventually published in.
Correct, I totally agree. Moreover, the best referees are the readers and the recorded metrics.
Inclusivity in science is a justification for poor quality. Science is neither inclusive nor democratic. If one publishes crap, it should be erased.
“When a paper is rejected by reviewers who are stuck with methods and theories from decades ago”
which are…?
As Kuhn observed of “normal science”, there is strong resistance to accepting new or different ideas, especially if they are generated outside one’s own “realm”.
Is Elsevier doing anything about the predatory practices of some of its own editors, who publish in their own journals and have more than 30 publications per year?
I agree with your statement, as I have made the same observation before.
Nope, and honestly given how some of their journals look like and how much garbage they take in (Energy and Energy Reports are particular outliers in the papers I’ve seen), they might as well be in the MDPI tier of predatory publishing.
A few disparate thoughts on this, in no particular order.
1. I generally agree with Joao that many traditional journals suffer from slow, unclear and often cliquey reviews. For all the criticisms of MDPI, I can also say my experience publishing there has usually been accompanied by useful, good-quality reviews, certainly comparable to reviews I’ve received in similarly ranked traditional journals. And with MDPI, you have a reasonably good idea of what to expect (acceptance rate, speed, reach, etc.).
2. It is not clear to me why Scopus is reviewing Sustainability. But if it is because of deviations from its aims and scope, I can understand that. Honestly, Sustainability has increasingly struck me as off-scope (much more so than IJERPH), and this is even something I communicated in a number of peer reviews I did for the journal.
3. On some level, though, we also need to recognize that there is a massive conflict of interest here. In the last three months, I’ve received 2-3 invitations for special issues or APC waivers in Elsevier’s newer sustainability-related journals. Discrediting and even delisting Sustainability is in Elsevier’s direct commercial interest.
I totally agree with this point. It is a fierce competition for APCs between publishers, and the authors are stuck between the millstones. Unfortunately.
78 articles in year 1 and 17,000 articles in year 13: that’s a steep slope, roughly 1,300 articles times the number of years since launch.
That’s not sustainable!
Sorry, Chris, I did not seem to get the point you are making here.
Are you implying that a Journal should place a ceiling on the number of quality articles it publishes each year?
Sorry, just a play on words, and on the fact that, mathematically, unconstrained exponential growth functions, or even linear growth functions with a steep slope, eventually blow up.
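For what it’s worth, the arithmetic behind that quip is easy to check with the two figures reported in the article (78 articles indexed in 2009, roughly 17,000 published in 2022). A minimal sketch, fitting both a linear and an exponential model to those two points:

```python
# Growth of Sustainability's annual output, per the figures in the article:
# 78 articles in 2009 (year 1) and ~17,000 in 2022 (year 13).
first_year_count, last_year_count = 78, 17_000
intervals = 13 - 1  # 12 year-to-year steps between year 1 and year 13

# Linear model: a constant number of additional articles each year.
linear_slope = (last_year_count - first_year_count) / intervals
print(f"linear slope: ~{linear_slope:.0f} extra articles per year")

# Exponential model: a constant growth factor each year (compound growth).
growth_factor = (last_year_count / first_year_count) ** (1 / intervals)
print(f"compound growth: ~{(growth_factor - 1) * 100:.0f}% per year")
```

Under the exponential fit (roughly 57% per year), the journal would pass 100,000 articles a year within about four more years, which is the sense in which the curve “blows up.”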
In my experience, many paid journals that are indexed in Scopus are also predatory. They don’t consider the quality or significance of the paper; they only deal with money. Better journals should be independent and not depend mainly on income from the researchers.
The journal “Sustainability” has become a widely used way of easily publishing articles and, consequently, earning points in national ministerial systems for parametric assessment and citations, practically without a peer review process. The speed of publication is the best indicator of the impossibility of a thorough quality verification of the text. As an invited reviewer myself, I experienced a situation where, after writing a review and trying to send it to the editorial team, it turned out that the article had already been published, even though I was supposed to be one of the reviewers. My review was negative. The editorial team apologized to me, claiming it was an accidental situation, but I knew from others that it was not. In the Publons database, there were verifiable cases where one author had prepared 140 reviews for the journal, including 70 in a single year, while also publishing 4 papers during that same year. Especially in CEE countries, “Sustainability” has become a way to build scientific portfolios, unfortunately without quality verification. Removing this journal would prevent further deterioration in the quality of publication portfolios and, consequently, of scientific research.
Top authors of Sustainability in 2023 (10+ items), per Dimensions:
Jianxu Liu – Shandong University of Finance and Economics, China – 15
M Janice Javier Gumasing – Mapúa University, Philippines – 14
Lóránt Dénes Dávid – Magyar Agrár- és Élettudományi Egyetem, Hungary – 14
Baseem Khan – Hawassa University, Ethiopia – 13
Decai Tang – Nanjing University of Information Science and Technology, China – 13
Thowayeb Hassan Hassan – Helwan University, Egypt – 13
Aaron Kinyu Hoshide – University of Maine, United States – 13
Ahmed Hassan Abdou – King Faisal University, Saudi Arabia – 11
Segundo Jonathan Rojas-Flores – Universidad Autónoma del Perú, Peru – 11
Ardvin Kester Sy Ong – Mapúa University, Philippines – 11
Mohammad Ali Abdelkareem – University of Sharjah, United Arab Emirates – 10
Yonis Gulzar – King Faisal University, Saudi Arabia – 10
Hegazy Rezk – Prince Sattam Bin Abdulaziz University, Saudi Arabia – 10
At least two names are familiar to me, and not for good reasons.
Perhaps, some RW reader can make a counterpoint and tell the world about breakthrough discoveries of any of these researchers?
Indeed, screening the articles of the top three authors in the Sustainability journal indicates redundancy: repeating the same concept with no new data or breakthrough discoveries. It is sad. Generally, submitting numerous articles to one journal raises an ethical issue.
There is a connection between paper mills and the top-publishing authors in Sustainability; they cannot publish so easily in other journals.
Productivity cannot be considered a drawback for a researcher. Every single contribution of mine in Sustainability has its own novelty, holding a specific breakthrough discovery; otherwise it could not have been published. Because of the speed, we prefer this journal to communicate our results. I am happy to do science communication.
The evaluation is a good process to improve the performance of the journal, but I think the reasons are not reasonable. Sustainability is a good journal; the speed of reviewing and publishing helps manuscripts avoid losing their novelty and scientific value.
In my field there are around 10 ‘traditional’ journals, worldwide. At my university, every faculty member is, via performance plan, meant to publish at least 2 papers a year (on average, across different levels of seniority etc), and only papers in Scopus Q2 or higher count. That makes for 5 eligible journals world wide. Let’s say each journal publishes 40 papers a year, so there are 200 annual paper ‘vacancies’ per year, worldwide. Assuming that hundreds (or probably thousands) of other universities will also pressure their employees to publish in a similar fashion, how is this supposed to work if aiming for traditional journals only????
No wonder MDPI, Frontiers, etc. turned up to satisfy demand (and turn the publishing business into a big money-spinner along the way, I presume).
There are many things wrong with MDPI etc., no doubt, but the present system is inherently broken, and that’s not MDPI’s fault; they are just taking advantage of it. It’s also good to see that these ‘non-traditional’ publishers have better turnaround times. For many traditional journals these are now ridiculous. What is the point of giving a reviewer a three-month deadline, if he only does the review when the first reminder arrives (e.g. three months minus one week)? In non-traditional journals one can get a publication out within 2 months.
Not saying that this is necessarily desirable, but as long as universities require faculty to engage in industrial style publication schemes something will have to give.
Is the initial idea simply to cancel a journal? What about considering the implementation of some guidelines first? This appears to be primarily a financial matter. Elsevier’s best interest seems to involve the disappearance of publishers like MDPI and others, enabling them to regain control over researchers. I am not saying MDPI should be more regulated, but unless you are already well-established, getting published in certain journals can be extremely challenging. Furthermore, there are concerns about the opaque acceptance process and the high open access fees, ranging from 3000 to 10000€. Additionally, waiting six months for your article to be published, only to find similar research in other journals, can be frustrating.
Two examples of “high ethics standards” of Sustainability.
https://www.pubpeer.com/publications/5AE7DBDFC357D19088DDCE2986B148. It is a fake roughly based on the first author’s MSc thesis from 2016. The reported study period moved to 2019, and all the numbers magically scaled up 4 times. Having learnt of this article only after publication, the middle author (who was the supervisor of the original thesis) requested her removal from the authorship list, which was denied by the journal.
https://www.pubpeer.com/publications/6FBCD64F0F700B41524AB102F97CC9. This is a citation delivery vehicle, where one author (the one in common with the previous study) also reports an affiliation which he never had. The university, the name of which was abused, requested the incorrect affiliation to be removed. This was also denied by the journal.
P.S. The relevant correspondence is in my possession.
Isn’t this cherry picking?
I suppose. But let’s do some more cherry-picking.
How about these 7 pieces identified by Anna Abalkina as coming from “International Publisher LLC,” aka “123mi” papermill?
https://doi.org/10.3390/su12103968
https://doi.org/10.3390/su13020640
https://doi.org/10.3390/su12166571
https://doi.org/10.3390/su12114412
https://doi.org/10.3390/su13179670
https://doi.org/10.3390/su132212667
https://doi.org/10.3390/su12166420
Reported in mid-2021, no action taken.
Source: Abalkina, A. (2023), Publication and collaboration anomalies in academic papers originating from a paper mill: Evidence from a Russia-based paper mill. Learned Publishing, 36: 689-702. https://doi.org/10.1002/leap.1574
I see an inconsistency. Here you presented some PubPeer comments that any journal might generally receive. Yet the top Sustainability contributing authors received no PubPeer criticism.
Strange but true. Random Pubpeer comments might appear in any journal.
Even top journals receive PubPeer comments every day. Comments won’t say anything about the quality of a journal. Comments are expressions of opinion, and respectful as they are, they cannot be used to judge a journal that was built with the contributions of hundreds and hundreds of researchers over decades of hard work.
“It is essential to analyze these issues more thoroughly, as in my view, they are not as simple as they seem. Historically, until around the year 2000, we faced a highly closed and oligopolized scientific publishing market. This dynamic began to change with the emergence of online or electronic scientific journals, significantly democratizing access to publication. This technological advancement was particularly beneficial for researchers and academics who, although they needed to publish their work, did not have sufficient resources or did not aim for journals with millions of daily accesses.
However, it is important to note that the concept of ‘free’ in scientific journals is often a misconception. Even those publications that declare themselves free are usually supported by government funds or other sources of financing. This raises questions about the independence and impartiality of these publications.
In this context, I believe that the best scientific journals are those that manage to balance quality and accessibility. They not only conduct a rigorous and fair assessment of submitted articles but also establish a financing model that is transparent and fair to authors. The ideal would be a model that does not financially overburden researchers, especially those from less resourced institutions, while maintaining editorial integrity and high quality.
Moreover, it is essential to consider the impact of these journals on the advancement of knowledge and the dissemination of scientific information. A journal that favors inclusion and diversity of voices in the scientific field contributes significantly to the development of a more robust and varied academic community. Therefore, when evaluating scientific journals, we must consider not just their business model but also their role in promoting more inclusive and diverse scientific research.”
As a researcher, I sometimes feel that the hierarchy of quartile rankings, citations, h-index, and impact factors has a chilling effect on researchers, now that publication in journals indexed in Scopus and Web of Science is all but mandatory for academic promotion at most leading universities. I have observed friends on social networking platforms proudly posting “HOT OFF THE PRESS”. In private communiqués, they cut a sorry figure when asked to explain the basic concepts of the published work.
Secondly, such journals charge skyrocketing publication fees, which makes it unfavorable for some genuine researchers to get their manuscripts published within a limited timeframe. A third important factor I have noticed in the recent past is the desire of reviewers to see their own published works cited in the works they review.
Such phenomena cast serious doubt on the authenticity of these journals’ scopes, special issues, and authors.
What about another MDPI journal, the Journal of Risk and Financial Management? Open access APC charges should not be paid by universities.
Sustainability was removed from Australia’s ABDC list. It used to be ranked as an “A” journal. The problem with Sustainability is they publish everything from science to business to sociology. That seems disingenuous for a scholarly journal.
Scholars must stop taking shortcuts and produce honest and useful work.
The leadership of Prof. Marc Rosen, the Editor-in-Chief of the journal, ensures the high quality and future success of the journal.
See this special issue from this journal, where 7 out of 11 articles are by one of the editors (Rezk)!!!!
https://www.mdpi.com/journal/sustainability/special_issues/renewable_energy_and_sustainable_energy
Another special issue scam!
Every single contribution of mine in Sustainability has its own novelty, holding a specific breakthrough discovery; otherwise it could not have been published. Because of the speed, we prefer this journal to communicate our results. I am happy to do science communication. Productivity cannot be considered a drawback for a researcher.
So you say 😉