Weekend reads: Naughty journals; whistleblowers’ frustration; new misconduct definition?

The week at Retraction Watch featured revelations of fraud in more than $100 million in government research, and swift findings in a much-discussed case. Here’s what was happening elsewhere:


4 thoughts on “Weekend reads: Naughty journals; whistleblowers’ frustration; new misconduct definition?”

  1. Regarding “Which 18 naughty journals were delisted from Thomson Reuters’ Journal Citation Reports for excessive self-citation and citation stacking?”:

    It might be worthwhile for RW to dig a bit deeper than TR’s announcements into why journals lose their journal impact factor. The delisting of Springer’s Environmental Earth Sciences (formerly Environmental Geology) caught my eye, as my colleagues have published in it, and it has long been one of the many mid-tier, niche journals that publish respectable but not highly selective work. Its last JIF was 1.7, according to the journal.

    I dug into it in Scopus, and there are indeed some anomalies, but it’s not obvious that anomalous = naughty. In the early 2000s, as Environmental Geology, it published about 300 articles a year, but output took off from 297 articles in 2010 (its first year as Environ Earth Sci) to 1,137 in 2015. Total citations shot up by a factor of 10 over that period. Self-citation rates were indeed high: from 2011 through 2015 they were 11, 29, 29, 40, and 35% per year, respectively. For comparison, rates at two other journals in the field (JAWRA and Hydrology and Earth System Sciences) looked to be under 10%. I glanced through the most highly cited article published in 2013, on landslide susceptibility in Korea, with 53 citations in Scopus. Many of the citing articles were indeed self-citations from the same journal, but they all seemed relevant to the topic.

    Because TR calculates the two-year JIF with an arithmetic mean rather than a median, a single highly cited article can move a low JIF quite a bit. Maybe this is a gotcha, and there is indeed a heavy-handed company editor behind the scenes twisting authors’ arms to boost citations, but maybe the journal is simply influential within its niche, and authors read and cite others in that niche. Looking at the 25 most recently published articles, only two were from authors in Western Europe or North America; Chinese authors were strongly represented. Certainly editors can misbehave, but if TR delists solely on statistical anomalies without digging deeper, and a community of authors selectively reads and publishes in a niche journal, there is a risk of discriminating against that community.
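
    To make the mean-versus-median point concrete, here is a small Python sketch with invented citation counts (not real figures for this journal, just an illustration of how sensitive a mean-based metric is to one outlier):

      # Hypothetical citations received this year by the articles a journal
      # published in the previous two years. The numbers are made up purely
      # to illustrate the arithmetic, not to describe any real journal.
      from statistics import mean, median

      citations = [0, 0, 1, 1, 1, 2, 2, 2, 3, 3]
      print(mean(citations), median(citations))   # 1.5 1.5

      # Add one highly cited paper (cf. the 53-citation landslide article).
      with_outlier = citations + [53]
      print(mean(with_outlier))     # ~6.2: the mean-based figure quadruples
      print(median(with_outlier))   # 2: the median barely moves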

    1. I cannot support sacrificing legitimate journals on the altar of the Holy Church of the Impact Factor. Who decides how many journal self-citations are too many? 70%? 60%? 20%? Are the publishers who maintain these citation indices (and who also control the journals themselves) accountable to anyone for their arbitrary decisions?

  2. I have some concerns about the methodology employed by Dadkhah regarding the use of Google to find journals. While I agree that this is likely a poor technique for locating quality publishers, I question why he chose the 19 search phrases that he did. The overwhelming majority of his search phrases included some combination of fast/easy publication/review. Is there any evidence showing that authors actually use these phrases when searching Google? IMO, if you’re searching for a journal based on these criteria, then you’re all but asking to find a venue with potentially unsavory publishing ethics. 


    1. Agree with Anonymous above. In addition, who is “we”/“our”, considering there is only one author?

      “We used Beall’s list and Jalalian’s hijacked journals list to detect questionable journals. In our opinion, the best ways to select a suitable journal for publishing your research are to use journal finders from publishers or do a search in scientific sites, such as Web of Science or Scopus.”
