Would you consider a donation to support Weekend Reads, and our daily work?
The week at Retraction Watch featured:
- Wiley journal retracts two papers it said were fine following criticism years ago
- Journal hijackers still infiltrate Scopus despite its efforts
- Nature retracts highly cited 2002 paper that claimed adult stem cells could become any type of cell
- Elsevier reopens investigation into controversial hydroxychloroquine-COVID paper
- ‘Exhausting’: Author finds another’s name on an Elsevier book chapter she wrote
- Journal investigating follow-up study that didn’t mention patients had died
- Superconductor researcher loses fifth paper
Our list of retracted or withdrawn COVID-19 papers is up past 400. There are more than 49,000 retractions in The Retraction Watch Database — which is now part of Crossref. The Retraction Watch Hijacked Journal Checker now contains more than 250 titles. And have you seen our leaderboard of authors with the most retractions lately — or our list of top 10 most highly cited retracted papers? What about The Retraction Watch Mass Resignations List — or our list of nearly 100 papers with evidence they were written by ChatGPT?
Here’s what was happening elsewhere (some of these items may be paywalled, metered access, or require free registration to read):
- “Vietnamese scientist in the top ‘world’s most influential’ explains why he publishes an article every 3 days.”
- “Science Is Full of Errors. Bounty Hunters Are Here to Find Them.”
- “The case for criminalizing scientific misconduct.”
- “A Global Network for Early Career Research Integrity Practitioners.”
- “The race to rank universities mainly serves enrollment goals.” And: “University ranking tricks: Ranking and promotion tricks.”
- “Google’s DeepMind leads European scoreboard in AI citations.” And for such work, “Peer review seems to have ‘almost completely fallen out of fashion.'”
- “Open access must not come at the expense of the review.”
- “Antiabortion Lawsuits Leaned on Discredited, Disputed Research.” Our previous coverage.
- “Evaluating Open Access Advantages for Citations and Altmetrics (2011-21): A Dynamic and Evolving Relationship.”
- “Positive feedback for distributed peer review.”
- “Is This Journal Legit? Open Access and Predatory Publishers.”
- “Materials scientist explains why he started commenting on PubPeer.”
- “Not all science papers (or “articles”) are the same”: “Science article genres” from The Grumpy Geophysicist.
- “Tobacco industry conflicts of interest cannot go undeclared in scientific publishing – Authors’ reply.”
- “Faked results lead to retraction of high-profile cancer neuroscience study.”
- “In political science research ethics is women’s work.”
- “Judge open science by its outcomes, not its outputs.”
- “How shameful should retraction be?”
- “Can We Trust Social Science Research? Issues of bias, credibility, politics, reliability and reproducibility.”
- “Navigating the challenges of imposter participants in online qualitative research.” And: “Protocol for Increasing Data Integrity in Online Research (PRIOR).”
- “Swiss philosopher Iso Kern sues former Chinese student Ni Liangkang for alleged plagiarism.”
- “Retracted studies and new treatments reveal the confusing state of Alzheimer’s research.”
- Medical whistleblowers “have been humiliated, scorned and ostracized. No wonder so few of us have the courage to speak up.”
- “The Realities of Increasing Open Access Charges.”
- “Retract Now: Negating Flawed Research Must Be Quicker.”
- “The Sydney student who uncovered a ‘shocking’ problem with global cancer research.”
- “[T]he case for an ‘open forum’ model of peer review in philosophy.”
- “Dealing with subpar academic research” in Pakistan.
Like Retraction Watch? You can make a tax-deductible contribution to support our work, subscribe to our free daily digest or paid weekly update, follow us on Twitter, like us on Facebook, or add us to your RSS reader. If you find a retraction that’s not in The Retraction Watch Database, you can let us know here. For comments or feedback, email us at [email protected].
The h-index calculation needs to be changed. It should be reduced by some amount for every paper that is corrected and by a much larger amount for every retraction. I’ll start: subtract 5 points for each correction, 20 points for every retraction. The h-index is allowed to be negative.
Interesting … But should retractions for misconduct be penalized the same number of points as retractions for inadvertent or other types of errors? Also, corrections range from minor and inconsequential to major mega-corrections or those that are issued to cover for suspected misconduct (e.g., copying others’ work without citation, then adding a ‘correction’ by mentioning the ‘inadvertent’ lapse). Anyway, having a meaningful scoring system could get complicated.
I considered that and agree that trying to sort out the various flavors of corrections is unworkable. So, perhaps a -1 for each correction, regardless of type, might be better. Such a correction to an author’s h-index is not going to make or break anyone, but might be enough to cause authors, especially corresponding authors, to be more diligent in their papers. The -1s will begin to pile up for those with problematic track records.
I stand by the -20 for retractions, regardless of reason.
Something similar should also be considered for journal Impact Factors.
An example calculation for Ming-Hui Zou (Georgia State U), with 23 retractions and an uncorrected h-index of 85:
https://scholar.google.com/citations?user=KX2tciMAAAAJ&hl=en
85 – 20(23) = 85 – 460 = –375 (corrected h-index)
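The proposed adjustment is simple enough to sketch in a few lines of Python. Note that the penalty weights (1 point per correction, 20 per retraction) are the commenters' suggestion in this thread, not any established metric:

```python
# Sketch of the "corrected h-index" proposed in this thread:
# subtract a fixed penalty per correction and a larger one per retraction.
# The weights are the commenters' proposal, not an established metric,
# and the result is allowed to go negative by design.

def corrected_h_index(h_index, corrections=0, retractions=0,
                      correction_penalty=1, retraction_penalty=20):
    """Return an h-index adjusted downward for corrections and retractions."""
    return (h_index
            - correction_penalty * corrections
            - retraction_penalty * retractions)

# The example from the comment: uncorrected h-index 85, 23 retractions
print(corrected_h_index(85, retractions=23))  # -375
```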
That’s quite a high bar as a starting benchmark. Kudos to GSU.
So if the journal inadvertently publishes your paper twice and then retracts the duplicate, or someone slaps your name on a paper that you had nothing to do with and that is then retracted for the fake authorship, you should take a hit to your h-index? Neat.
Care to propose a fix or alternative? Or, are you happy with the status quo?
The h-index should just be ignored, not changed.
And in world news, the head of India’s National Testing Agency has been dismissed. https://www.bbc.com/news/articles/c1eejd3vx0no
Is it just me, or did the linked story “Vietnamese scientist in the top ‘world’s most influential’ explains why he publishes an article every 3 days.” not explain, in any meaningful way, “why he publishes an article every 3 days”? Seemed like a fluff piece.
Also, this is a good time to remind all readers that the tuoitre.vn story promulgates questionable information, as there is no such thing as a "top most influential scientists in the world" ranking by Professor John P.A. Ioannidis and colleagues at Stanford University. There is just a scientometrics dataset published by Ioannidis on a data-sharing repository for use in research. Of course, we cannot let facts stand in the way of academic gamesmanship (in the sense of van den Berghe).