Scopus is broken – just look at its literature category

Aleksandar Stević

As Retraction Watch recently reported, three of the top 10 philosophy journals in the highly influential Scopus database turned out to be fakes: Not only did these dubious journals manage to infiltrate the list, but they also rose to its top by trading citations. This news is embarrassing in itself, but it is hardly shocking. Our rankings-obsessed academic culture has proven time and again that it is prone to data manipulation. Rankings for both publications and institutions are routinely hacked by scholars, editors, and administrators who are ready to tweak or even falsify numbers as needed. 

The problems with the Scopus journal rankings, however, run much deeper. The issue is not that inflated citation numbers have occasionally propelled impostor journals to the top of the list. Rather, at least in my own field of literary studies, the ranking makes no sense whatsoever: the list is full of journals that have no business being there at all because they belong to entirely different areas of scholarly enquiry, and even when the ranking gets the field right, it systematically places marginal publications close to the top. In what follows, I briefly break down the major ways the Scopus Literature and Literary Theory Ranking is not just skewed but downright nonsensical.

Scopus ranks journals based on SJR (SCImago Journal Rank), which it defines as a “measure of journal’s impact, influence or prestige.” The ranking includes a little over 1,000 titles, but for the purposes of this analysis I have focused mostly on the top 100. My main finding: the majority of journals in the upper part of the list quite simply do not belong to the fields of literature and literary theory. 

The top 10 are particularly telling. Within that group is exactly one (!) journal – an annual publication dedicated to the work of the Spanish Golden Age dramatist Lope de Vega – that specializes in literature. Another two are wide-ranging humanities journals in which literary scholars do publish, although these, too, are highly idiosyncratic. The first is South Atlantic Quarterly, a long-standing journal dedicated to “urgent political, cultural, and social questions,” whose editorial board consists primarily of literary theorists, and the second is Journal of Cultural Analytics, a relatively new open-access publication “dedicated to the computational study of culture.”

The rest of the top 10 belong to translation studies (two), criminology (one), and writing studies (two); one is an interdisciplinary journal with a primary focus on sociology and gender studies, and one (Poetics) was founded as a literary journal in the 1970s and continues to address issues related to literature (among other topics), but it does so from a distinctly sociological perspective. Its editors are sociologists, who also make up the bulk of its contributors.

In short, Scopus’ list of top 10 literature journals includes one journal specializing in literature, two general humanities journals with some footing in literary studies, one journal that used to specialize in literature but no longer does, and six journals with absolutely no relation to literary studies.

RANK | JOURNAL TITLE | ACTUAL FIELD
1 | Translation Spaces | Translation Studies
2 | Criminology and Public Policy | Criminology
3 | Anuario Lope de Vega | Literature
4 | Journal of Writing Research | Writing Studies
5 | Men and Masculinities | Sociology
6 | Perspectives: Studies in Translation Theory and Practice | Translation Studies
7 | Poetics | Sociology of Culture
8 | Written Communication | Writing Studies
9 | South Atlantic Quarterly | Interdisciplinary Humanities
10 | Journal of Cultural Analytics | Digital Humanities

                    Table 1. Top 10 journals in the Scopus Literature and Literary Theory Ranking

The situation becomes only slightly better when we analyze the top 100. In that cohort the number of journals belonging to the fields of literature and literary theory rises to somewhere between 35 and 45, depending on how one counts. Even with that improvement, however, well over half of the titles on the list do not belong there. They belong to a range of other fields – primarily language and linguistics, but also education, library science, anthropology, history, theology, and so on.

What accounts for this failure of the most basic classification? One might assume that the creators of the list have simply lumped literature and some adjacent fields together: Although it is deeply frustrating to scholars who share little in terms of publication venues, research methodologies, and departmental affiliations, the use of joint categories that encompass fields like literature, linguistics, and writing is not unprecedented.

But Scopus seems to be doing something different. The database has a separate category for Language and Linguistics, and although some journals may publish in both fields (none of the remotely good ones do), many of those included in the Literature category are in fact pure linguistics journals, full stop. Besides, even if overlapping classifications can account for some of these intruders, one cannot justify the presence of journals like Performance Measurement and Metrics or Men and Masculinities on the list. No methodological choice, however dubious, can explain this ranking. One has to assume that those who created it simply couldn’t be bothered to properly attend to the task at hand.

To make things worse, even when we remove the noise created by the inclusion of journals from other fields and focus only on actual literature journals, the ranking’s ability to identify quality publications does not seem to improve. A look at the top 100 journals reveals that publications with no discernible international footprint systematically outrank highly selective, world-leading venues. To use just one startling example, the Malaysian pay-to-play publication 3L: Language, Linguistics, Literature (40th overall) far outranks almost every major literature journal published in the US and Europe, including such cutting-edge publications as PMLA (59th), English Literary History (110th), and Diacritics (415th).

Many other interesting quirks would require further scrutiny. Consider the curious overrepresentation of journals published in Slovenian, a language with barely 2 million speakers. Or consider the equally curious overrepresentation of journals focusing on the Spanish Golden Age – an important period in early modern literary history, but surely not one that could plausibly attract more high-quality scholarship than the entire fields of German and French literary studies, Comparative Literature, and Medieval Studies, all of which are completely absent from the top 100. And yet, here we are.

The Scopus ranking under the rubric of “Literature and Literary Theory” is many things, but a meaningful ranking of journals in the field of literary studies it is not. It contains scores of journals that have absolutely nothing to do with that particular field, and it routinely awards very high rankings to publications with no international relevance whatsoever; on what criteria it does so is impossible to say. Unsurprisingly, most of the publications that are ranked highly in Scopus are not good enough to be included in Clarivate’s much more selective Arts & Humanities Citation Index. Some have made it into the less selective Emerging Sources Citation Index, but most are not picked up by Web of Science at all.

I wish I could offer some more constructive criticism, but such criticism would imply that the Scopus ranking, such as it is, has some redeeming features, which it does not. How do you improve a list that ranks Criminology and Public Policy as one of the top literature journals? That would be like taking seriously a list that treats Shakespeare Quarterly as a top journal in the field of chemical engineering. It would probably be easier to just start from scratch.

Of course, the biggest problem with Scopus is that, despite its profound unseriousness (at least in my field), it is generally taken seriously and produces real-world consequences. A tenure case at a major US research university is unlikely to hinge on a journal’s standing in Scopus, but in many parts of the world, especially in developing countries, administrators routinely rely on Scopus rankings as a proxy measure of research quality. And yet, in its existing form the ranking is not only an utterly inadequate tool for achieving that goal but may well be counterproductive, given the sheer number of marginal journals on which it confers the veneer of respectability.

I don’t know what the situation in the sciences is, but in the humanities, Scopus must get its act together. Until then, its rankings will not be a serious indicator of journal quality. 

Aleksandar Stević is assistant professor of English at Lingnan University in Hong Kong.



2 thoughts on “Scopus is broken – just look at its literature category”

  1. This is a narrow view of literature as a study of literary works (not texts, which might include cultural artefacts and phenomena). If one does, however, see cultural studies as a part of literature, then, say, criminology and public policy are part of it.

  2. It is not good even for the sciences.
    But why even try to categorize journals by discipline?
    Personally, despite all the problems, I think Google Scholar still captures the wider “impact” best. In your field, for instance, it might record references in theses, popular magazines, professional literature outlets, museum pieces, and whatnot. In many cases such references convey more “impact” than references in scholarly journals. Still, gaming and other bad practices are present with Google Scholar as well.
    And, of course, it is not good that any single company controls the citation business. That is why I am quite pleased with the progress that Semantic Scholar has recently made.
    When I thought about this minefield recently, I concluded that the best available option would be for something like Crossref to record the citations. They already have the metadata, after all. It would also be good for science in the long run if a non-profit were in charge.
