Clarivate, the company behind the Impact Factor, a closely watched — and controversial — metric, is calling out more than 20 journals for unusual citation patterns.
The 21 journals — 10 of which were suppressed, meaning they will not receive an Impact Factor for 2020, and 11 of which received an expression of concern — represent fewer than half of the nearly 50 that the company suppressed or flagged with an expression of concern in last year's Journal Citation Reports (JCR). The suppressions, the company notes, represent 0.05% of the journals listed — a total that increased dramatically this year from about 12,000 to about 20,000.
Clarivate suppressed 10 journals for excessive self-citation, which inflates the Impact Factor, or for “citation-stacking,” sometimes referred to as taking part in “citation cartels” or “citation rings”:
- Archivos Latinoamericanos de Nutricion (stacking)
- Journal of Intelligent & Fuzzy Systems (stacking)
- Materials Express (stacking)
- Hellenic Journal of Cardiology (self-citation)
- International Journal of Engine Research (self-citation)
- Journal of Enhanced Heat Transfer (self-citation)
- Journal of Family and Economic Issues (self-citation)
- Mechanics-Based Design of Structures and Machines (self-citation)
- Journal of Biomolecular Structure and Dynamics (self-citation)
- Liquid Crystals (self-citation)
As we’ve noted before:
Given many universities’ reliance on journal rankings to judge researchers’ work as part of tenure and promotion decisions, Clarivate’s suppression of a journal — meaning denying it an Impact Factor — can have far-reaching effects. Impact Factors are based on average citations to articles in a journal over a particular period of time. Many, including us, have argued that Impact Factor is not the best way to judge research — for reasons including relative ease of gaming such metrics.
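For readers unfamiliar with the metric, the standard two-year Journal Impact Factor is a simple ratio; the sketch below reflects the widely cited definition, not Clarivate's exact computational details:

$$
\mathrm{JIF}_{2020} \;=\; \frac{\text{citations in 2020 to items published in 2018 and 2019}}{\text{citable items published in 2018 and 2019}}
$$

Because the numerator counts citations from any source, a journal citing its own recent papers heavily, or a ring of journals citing one another, can raise the ratio without any change in outside attention.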
At least one university agrees.
In addition, 11 journals earned expressions of concern because they had “one or more published items with an atypically high-value contribution to the JIF numerator and a pattern of journal citations disproportionately concentrated into the JIF numerator.”
Alison Mitchell, chief journals officer at Springer Nature, which publishes one of the suppressed journals and four of those that received an expression of concern, told Retraction Watch:
In partnership with our editorial community we are committed to publishing the highest quality research. At this point we are looking into the questions raised about the mentioned journals in more detail, so cannot comment further, but will ensure that all questions are addressed appropriately going forwards.
Other journals on the lists include those published by Elsevier, Sage, Taylor & Francis, Wiley, and Wolters Kluwer.
Last year, several journals appealed Clarivate’s decisions. Some were successful, while others were not. For more on how the company approaches suppressions, see this guest post from Nandita Quaderi, the editor in chief of Clarivate’s Web of Science.
This year’s JCR is the first to include the Journal Citation Indicator, which according to Clarivate,
represents the average category-normalized citation impact for papers published in the prior three-year period, providing a single journal-level metric that can be easily interpreted and compared across disciplines. The Journal Citation Indicator will be calculated for all journals in the Web of Science Core Collection – including those that do not have a Journal Impact Factor (JIF)™.
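To make "category-normalized citation impact" concrete, here is a toy sketch of the idea: each paper's citation count is divided by the average for its subject category and publication year, and those ratios are averaged. The function name, data, and baselines are illustrative assumptions, not Clarivate's actual implementation or data.

```python
# Toy sketch of a category-normalized citation average, the idea behind
# the Journal Citation Indicator. All names and numbers are illustrative.

def normalized_impact(journal_papers, category_baselines):
    """Average of each paper's citations divided by the mean citation
    count for its category and year (the category baseline)."""
    ratios = [
        cites / category_baselines[(category, year)]
        for cites, category, year in journal_papers
    ]
    return sum(ratios) / len(ratios)

# (citations, category, publication year) for each paper in a hypothetical journal
papers = [(10, "cardiology", 2019), (2, "cardiology", 2020), (6, "nutrition", 2019)]

# mean citations per paper across all papers in that category and year
baselines = {
    ("cardiology", 2019): 5.0,
    ("cardiology", 2020): 2.0,
    ("nutrition", 2019): 3.0,
}

print(normalized_impact(papers, baselines))  # (10/5 + 2/2 + 6/3) / 3 ≈ 1.67
```

A value of 1.0 means the journal's papers are cited at the average rate for their fields; this normalization is what lets the indicator be compared across disciplines with very different citation habits.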
Update, 2100 UTC, 6/30/21: Alan Daugherty, the editor of Arteriosclerosis, Thrombosis and Vascular Biology (ATVB), a title that received an expression of concern for the second year in a row, told us:
As the publisher for Arteriosclerosis, Thrombosis and Vascular Biology (ATVB), the American Heart Association has continued to work with Clarivate during the past year to gain additional insight into their data regarding citations. Started in 1981 as Arteriosclerosis, ATVB is a specialized journal that gradually expanded to encompass the ever-growing fields of thrombosis and vascular biology. Given the detailed work in these specific yet related research areas, it is unsurprising that ATVB and this article have a higher than usual rate of self-citation. It’s important to note the highly technical basic science research required for this paper, while the library of peer-reviewed research is extremely limited. We appreciate Clarivate’s efforts to monitor self-citation. We are confident in our continually evolving peer-review process that includes careful evaluation of citations.
Like Retraction Watch? You can make a one-time tax-deductible contribution or a monthly tax-deductible donation to support our work, follow us on Twitter, like us on Facebook, add us to your RSS reader, or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at [email protected].
Do you know where I can find a list of the 11 journals with an expression of concern? I could only find the other 10 that did not receive a JIF due to self-citations.
There is a link in this paragraph:
There are also researchers who excessively cite their own work, even if it is only very marginally related to their new work, or perhaps not that much more important than research from others in their field. When, say, 15-20 out of 50-60 references are to one’s own work, should that not raise concerns too? Or am I missing a discussion that has been going on outside my (admittedly limited) view?
>> When, say, 15-20 out of 50-60 references are to one’s own work, should that not raise concerns too?
No, that should not raise concerns. At least, that is what Springer Nature say, with COPE’s approval.
https://pubpeer.com/publications/7657C82A850180C1A0F179E411107F
Let’s be frank. Every breach of proper research conduct, every instance of plagiarism, every fake Western blot, every image manipulation, the paper mills, the fake authors, etc. we read about in the pages of this blog are due solely to the absolutely ridiculous fixation on citations, impact factors, number of papers published. The useless attempt to measure research quality by quantity is a joke. What makes matters worse is that the parameters measured are easily manipulated.