Would you consider a donation to support Weekend Reads, and our daily work?
The week at Retraction Watch featured:
- Yale professor’s book ‘systematically misrepresents’ sources, review claims
- Nature flags doubts over Google AI study, pulls commentary
- Anthropology groups cancel conference panel on why biological sex is “necessary” for research
- After resigning en masse, math journal editors launch new publication
We also added The Retraction Watch Mass Resignations List.
Our list of retracted or withdrawn COVID-19 papers is up to well over 350. There are more than 43,000 retractions in The Retraction Watch Database — which is now part of Crossref. The Retraction Watch Hijacked Journal Checker now contains over 200 titles. And have you seen our leaderboard of authors with the most retractions lately — or our list of top 10 most highly cited retracted papers?
Here’s what was happening elsewhere (some of these items may be paywalled, metered access, or require free registration to read):
- “If you take the sleuths out of the equation, it’s very difficult to see how most of these retractions would have happened.”
- “Co-Authors Seek to Retract Paper Claiming Superconductor Breakthrough.”
- “The Banality of Bad-Faith Science: Not every piece of published research needs to be heartfelt.”
- “Why Crossref’s acquisition of the Retraction Watch database is a big step forward.”
- As Cyriac Abby Philips’ X account is suspended by court order, read about the time Elsevier retracted a paper he’d written after legal threats from Herbalife.
- “China set to outlaw use of chatbots to write dissertations.”
- “When authors play the predatory journals’ own game.”
- “Science will suffer if we fail to preserve academic integrity.”
- “Three ways to make peer review fairer, better and easier.”
- “Can generative AI add anything to academic peer review?”
- “While data citation is increasingly well-established, software citation is rapidly maturing.”
- A discussion of publication misconduct.
- “Where’s the Proof? NSF OIG Provides Insights on Crafting That All-Important Investigation Report.”
- “The Dos and Don’ts of Peer Reviewing.”
- “Maternal health points to need for oversight on scientific research.”
- “The INSPECT-SR project will develop a tool to assess the trustworthiness of RCTs in systematic reviews of healthcare related interventions.”
- “There are several pitfalls in the publication process that researchers can fall victim to, and these can occur knowingly or unknowingly.” The view from a journal.
- “Peer Review and Scientific Publication at a Crossroads: Call for Research for the 10th International Congress on Peer Review and Scientific Publication.”
- “[T]he number of retractions in Spanish research grows.”
- “‘We’re All Using It’: Publishing Decisions Are Increasingly Aided by AI. That’s Not Always Obvious.”
- “Key findings include that 72% of participants agreed there was a reproducibility crisis in biomedicine, with 27% of participants indicating the crisis was ‘significant’.”
- “Should research results be published during Ph.D. studies?”
- “An Overdue Due Process for Research Misconduct.” And: “Harvard Should Protect Whistleblowers.”
- “Journals That Ban Replications–Are They Serious Scholarly Outlets At All?”
- “Replication games: how to make reproducibility research more systematic.”
- “This retraction is in acknowledgement of the fact that the publication was incomplete, inaccurate and misleading due to misapprehension of the facts as established during the parliamentary hearings on the transaction.”
- “Ex-Tory MP threatens to sue University after being named in slavery research.”
- Missed our webinar with Crossref about an exciting development for The Retraction Watch Database? Watch here.
Like Retraction Watch? You can make a tax-deductible contribution to support our work, follow us on Twitter, like us on Facebook, add us to your RSS reader, or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at [email protected].
“Key findings include that 72% of participants agreed there was a reproducibility crisis in biomedicine, with 27% of participants indicating the crisis was ‘significant’.”
So that means the remaining 45% (the 72% who agreed there was a crisis, minus the 27% who called it “significant”) think there’s an insignificant “crisis”? A minor “crisis”? A hiccup?
Table 2 has your answer.
Thanks. A “slight crisis.” Whether there’s an actual “crisis” is a separate issue, but it seems the authors and participants were really talking about a “problem.”
I know it may just sound like a nitpick, but as the old saying goes: “When everything’s a crisis, nothing’s a crisis.” We should all strive to have important words retain their impact.