Paper rejected for AI, fake references published elsewhere with hardly anything changed

One journal’s trash is another’s treasure – until a former peer reviewer stumbles across it and sounds an alarm.

In April, communications professor Jacqueline Ewart got a Google Scholar notification about a paper published in World of Media that she had reviewed, and recommended rejecting, for another journal several months earlier.

At the time, she recommended against publishing the article, “Monitoring the development of community radio: A comprehensive bibliometric analysis,” in the Journal of Radio and Audio Media, or JRAM, because she suspected it had been written by AI. She also noticed several references, including one she had supposedly written, were fake.

Continue reading Paper rejected for AI, fake references published elsewhere with hardly anything changed

10 years after the downfall of a same-sex marriage canvassing study, tenure, some better practices — and an engagement

A study on how conversations can change minds grabbed headlines when it was published — and again five months later when retracted.

“Gay Advocates Can Shift Same-Sex Marriage Views,” read the New York Times headline. “Doorstep visits change attitudes on gay marriage,” declared the Los Angeles Times. “Cure Homophobia With This One Weird Trick!” Slate spouted.

Driving those headlines was a December 2014 study in Science, by Michael J. LaCour, then a Ph.D. student at the University of California, Los Angeles, and Donald Green, a professor at Columbia University.

Researchers praised the “buzzy new study,” as Slate called it at the time, for its robust effects and impressive results. The key finding: A brief conversation with a gay door-to-door canvasser could change the mind of someone opposed to same-sex marriage. 

By the time the study was published, David Broockman, then a graduate student at the University of California, Berkeley, had already seen LaCour’s results and was keen to pursue his own version of it. He and fellow graduate student Joshua Kalla had collaborated before and wanted to look more closely at the impact canvassing could have on elections. But as the pair deconstructed LaCour’s study to figure out how to replicate it, they hit several curious stumbling blocks. And when they got a hold of LaCour’s dataset, or replication package, they quickly realized the results weren’t adding up. 

Continue reading 10 years after the downfall of a same-sex marriage canvassing study, tenure, some better practices — and an engagement

Fourth retraction for Italian scientist comes 11 years after sleuths flagged paper

PLOS One has retracted a 2011 paper first flagged for image issues 11 years ago. The retraction marks the fourth for the paper’s lead author, Gabriella Marfè of the University of Campania “Luigi Vanvitelli,” in Caserta, Italy. 

The article, “Involvement of FOXO Transcription Factors, TRAIL-FasL/Fas, and Sirtuin Proteins Family in Canine Coronavirus Type II-Induced Apoptosis,” has been cited 41 times, according to Clarivate’s Web of Science.

Elisabeth Bik flagged the article on PubPeer in 2014 for apparent image manipulation and duplication in six figures. In a 2019 email to PLOS staff, pseudonymous sleuth Claire Francis drew attention to Bik’s findings. The journal retracted the paper on May 6 of this year.

Continue reading Fourth retraction for Italian scientist comes 11 years after sleuths flagged paper

‘Anyone can do this’: Sleuths publish a toolkit for post-publication review

For years, sleuths – whose names our readers are likely familiar with – have been diligently flagging issues with the scientific literature. More than a dozen of these specialists have teamed up to create a set of guides to teach others their trade.

The Collection of Open Science Integrity Guides (COSIG) aims to make “post-publication peer review” more accessible, according to the preprint made available online today. The 25 guides so far range from the general, such as “PubPeer commenting best practices,” to the field-specific, such as spotting issues with X-ray diffraction patterns.

Although 15 sleuths are named as contributors on the project, those we talked to emphasized the project should be largely credited to Reese Richardson, the author of the preprint.

Continue reading ‘Anyone can do this’: Sleuths publish a toolkit for post-publication review

Guest post: NIH-funded replication studies are not the answer to the reproducibility crisis in pre-clinical research


President Trump recently issued an executive order calling for improvement in the reproducibility of scientific research and asking federal agencies to propose how they will make that happen. I imagine that the National Institutes of Health’s response will include replication studies, in which NIH would fund attempts to repeat published experiments from the ground up, to see if they generate consistent results.

Both Robert F. Kennedy Jr., the Secretary of Health and Human Services, and NIH director Jay Bhattacharya have already proposed such studies with the objective of determining which NIH-funded research findings are reliable. The goals are presumably to boost public trust in science, improve health-policy decision making, and prevent wasting additional funds on research that relies on unreliable findings. 

As a former biomedical researcher, editor, and publisher, and a current consultant on image data integrity, I would argue that systematic replication studies of pre-clinical research are neither an effective nor an efficient way to identify reliable research. Such studies would be an impractical use of NIH funds, especially in the face of extensive proposed budget cuts.

Continue reading Guest post: NIH-funded replication studies are not the answer to the reproducibility crisis in pre-clinical research

Can a better ID system for authors, reviewers and editors reduce fraud? STM thinks so

Unverifiable researchers are a hallmark of paper mill activity. While journals have clues for identifying fake personas, such as a lack of professional affiliation, no ORCID profile, or strings of random numbers in email addresses, there isn’t a standard template for doing so.

The International Association of Scientific, Technical, & Medical Publishers (STM) has taken a stab at developing a framework for journals and institutions to validate researcher identity, with its Research Identity Verification Framework, released in March. The proposal suggests identifying “good” and “bad” actors based on what validated information they can provide, using passport validation when all else fails, and creating a common language in publishing circles to address authorship. 

But how this will be implemented and standardized remains to be seen. We spoke with Hylke Koers, the chief information officer for STM and one of the architects of the proposal. The questions and answers have been edited for brevity and clarity.

Continue reading Can a better ID system for authors, reviewers and editors reduce fraud? STM thinks so

Dozens of Elsevier papers retracted over fake companies and suspicious authorship changes

One of several retraction notices noting that “the existence and nature” of a company couldn’t be confirmed.

Since March of last year, Elsevier has pulled around 60 papers connected to companies in the Caucasus region that don’t seem to exist. The retraction notices attribute the decision to suspicious changes in authorship and the authors being unable to verify the existence of their employers. Online sleuths have also flagged potentially manipulated citations among the articles. 

Each of the retracted papers appears to follow the same pattern, based on the details given in the retraction notices. First, a solo author submits a paper claiming affiliation with a company that doesn’t appear in any business registry. During the revision process, the author adds several other authors to the paper, including new first and corresponding authors, despite their having no clear contribution to the original work. This behavior is typical of paper mills and authorship-for-sale schemes.

When asked by the editors, the original authors are unable to explain why they added the other authors, or to validate the “nature” or “existence” of the companies they claimed affiliation with, according to the retraction notices.

Continue reading Dozens of Elsevier papers retracted over fake companies and suspicious authorship changes

Web of Science delists bioengineering journal in wake of paper mill cleanup

Bioengineered has lost its spot in Clarivate’s Web of Science index, as of its April update. The journal has been working to overcome a flood of paper mill activity, but sleuths have questioned why hundreds of papers with potentially manipulated images have still not been retracted.

A spokesperson for Taylor & Francis, which publishes the journal, said it has taken action against the paper mill; the journal has retracted 86 papers since January 2022. The spokesperson said they are “disappointed” by the delisting decision. The journal now faces an embargo of up to two years before it can rejoin the citation index.

Bioengineered publishes bioengineering and biotechnology research. In 2021, journal editors launched an investigation when submissions spiked and several authors of submitted and accepted articles asked for authorship changes – both hallmarks of paper mill activity. 

Continue reading Web of Science delists bioengineering journal in wake of paper mill cleanup

Biochemist with previous image duplication retractions loses another paper 

Dario Alessi

A researcher whose two papers were retracted last year following a years-long investigation has lost another, this one two decades old.

The same journal also corrected two papers for image duplication within days of the retraction.

The moves followed comments about image similarities on PubPeer. The retraction marks the third for biochemist Dario Alessi, a professor at the University of Dundee in Scotland. Two of his papers were retracted in 2024, a process that took six years and included a four-year investigation by the university. 

Continue reading Biochemist with previous image duplication retractions loses another paper 

Chinese funding agency sanctions 26 researchers in latest misconduct report

The organization responsible for allocating basic research funding in China has issued misconduct findings against 26 researchers for violations ranging from breach of confidentiality to image manipulation, plagiarism, and buying and selling authorship. 

The National Natural Science Foundation of China, or NSFC, released the results of 15 misconduct investigations on April 11. Several of the investigations involved teams of researchers and many included specific published papers, 53 in total. China has been taking steps to crack down on academic fraud, calling last year for a review of all retracted articles in English- and Chinese-language journals. 

Penalties for the researchers ranged from bans on applying for funding or serving as a reviewer to having research funding revoked, which includes having to return funds already disbursed. In most cases, the restrictions on applying for funding were for three to seven years.

Continue reading Chinese funding agency sanctions 26 researchers in latest misconduct report