A reviewer stole a manuscript and published it himself. But you wouldn’t know it from this retraction notice.

Fish off someone else’s peer review!

So writes (in somewhat different words) Mina Mehregan, a mechanical engineer at Ferdowsi University of Mashhad in Iran. Mehregan and a colleague recently discovered that they’d been victimized by a group of unscrupulous reviewers who used the pretext of a long turnaround time to publish a hijacked version of their manuscript in another journal.

In a guest editorial for the Journal of Korean Medical Science — which wasn’t involved in the heist — Mehregan began by noting the toll that protracted peer review can take on authors: Continue reading A reviewer stole a manuscript and published it himself. But you wouldn’t know it from this retraction notice.

The case of the reviewer who said cite me or I won’t recommend acceptance of your work

Some peer reviewers evidently are tempted to ask authors to cite their work, perhaps as a way to boost their own influence. But as a recent episode at the journal Bioinformatics suggests, the risk can outweigh the reward.

We’ll let the editors — Jonathan Wren, Alfonso Valencia and Janet Kelso — tell the tale, which they did in “Reviewer-coerced citation: Case report, update on journal policy, and suggestions for future prevention:” Continue reading The case of the reviewer who said cite me or I won’t recommend acceptance of your work

Deputy director of U.S. gov’t watchdog leaves to run another gov’t office

The second-in-command at the U.S. Office of Research Integrity (ORI), which oversees investigations into scientific misconduct, will be leaving the agency.

Scott Moore has been at ORI since 2016. He had previously been at the National Science Foundation’s Office of Inspector General, where he was an investigative scientist for 13 years. He was appointed by former director Kathy Partin, who, after a tumultuous two years, left the ORI in November 2017 and is now the intramural research integrity officer at NIH.

Moore was named acting deputy director of the Office of the Assistant Secretary for Health’s (OASH) Office of Grants Management in July, and has been serving in both that role and as ORI deputy director since then. According to a memo from Assistant Secretary for Health Brett P. Giroir that was circulated at that time: Continue reading Deputy director of U.S. gov’t watchdog leaves to run another gov’t office

Total recall: Brazilian journal issues “total retraction” of plagiarized paper

We’ve seen partial retractions and retract-and-replacements, but here’s a first (cue the timpani): The Total Retraction.

A Brazilian journal has pulled a 2018 paper on food security for plagiarism — at least, that’s what really happened; the stated reasons are a bit sauced up.

According to the notice: Continue reading Total recall: Brazilian journal issues “total retraction” of plagiarized paper

Can a “nudge” stop researchers from using the wrong cell lines?

Anita Bandrowski, a neuroscientist at the University of California, San Diego, works on tools to improve the transparency and reproducibility of scientific methods. (Her work on Research Resource Identifiers, or RRIDs, has been previously featured on Retraction Watch.) This week, Bandrowski and colleagues — including Amanda Capes-Davis, who chairs the International Cell Line Authentication Committee — published a paper in eLife that seeks to determine whether these tools are actually influencing the behavior of scientists, in this case by reducing the number of potentially erroneous cell lines used in published studies.

Such issues may affect thousands of papers. Among more than 300,000 cell line names in more than 150,000 articles, Bandrowski and her colleagues “estimate that 8.6% of these cell lines were on the list of problematic cell lines, whereas only 3.3% of the cell lines in the 634 papers that included RRIDs were on the problematic list,” suggesting “that the use of RRIDs is associated with a lower reported use of problematic cell lines.” 

Retraction Watch spoke with Bandrowski about the role of these tools in the larger movement to improve transparency and reproducibility in science, and whether meta-scientific text-mining approaches will gain traction in the research community.

Retraction Watch (RW): Your study presents RRID as a behavioral “nudge,” beyond its primary goal of standardizing method reporting. What other nudges can you envision to prevent misuse of cell lines in scientific research? Continue reading Can a “nudge” stop researchers from using the wrong cell lines?

“Evidence of fabricated data” leads to retraction of paper on software engineering

A group of software engineers from academia and industry has lost a 2017 paper on web-based applications over concerns that the data were fabricated.

The article, “Facilitating debugging of web applications through recording reduction,” appeared online in May 2017 in Empirical Software Engineering, a Springer publication.

According to the retraction notice, which was released in December: Continue reading “Evidence of fabricated data” leads to retraction of paper on software engineering

Showdown over a study of abortion policy leads to a retraction, and leaves no one happy

Elard Koch

A paper in Contraception that purported to show serious flaws in an earlier study of abortion laws and maternal health has been retracted, after the authors of the original study found what were apparently significant flaws in the study doing the debunking.

That’s the short version of this story. The longer version involves years of back-and-forth, accusations of conflict of interest and poor research practice, and lawyers for at least two parties. Be warned: We have an unusual amount of information to quote from here that’s worth following.

As the editor of Contraception, Carolyn Westhoff, put it:

I got to make everybody angry.

Continue reading Showdown over a study of abortion policy leads to a retraction, and leaves no one happy

Group issues model retraction over antibody error

The authors of a 2013 paper on antibody production in patients with rheumatoid arthritis have retracted the work in what looks to us like a case study in how to handle operator error.

The paper, “Monoclonal IgG antibodies generated from joint-derived B cells of RA patients have a strong bias toward citrullinated autoantigen recognition,” was published in the Journal of Experimental Medicine by a group from the Karolinska Institutet (KI) in Sweden and elsewhere, and has been cited 128 times, according to Clarivate Analytics’ Web of Science. The last author, Vivianne Malmström, is a specialist in cellular immunology at the KI.

Here’s an excerpt from the lengthy notice: Continue reading Group issues model retraction over antibody error

Even potential participants of a research integrity conference commit plagiarism, organizers learn

One would hope that researchers submitting abstracts for a meeting on research integrity would be less likely to commit research misconduct. But if the experience of the 6th World Conference on Research Integrity is any indication, that may not be the case. Here, the co-organizers of the conference — Lex Bouter, Daniel Barr, and Mai Har Sham — explain.

Recently the 430 abstracts submitted for the 6th World Conference on Research Integrity (WCRI) were peer reviewed. After an alarming report of apparent plagiarism from one of the 30 reviewers, text similarity checking with Turnitin was conducted on all the abstracts received. This identified 12 suspected cases of plagiarism and 18 suspected cases of self-plagiarism. Abstracts with a Turnitin Similarity Index above 30% (ranging from 37% to 94%) were further assessed and labelled as potential self-plagiarism if the overlapping texts had at least one author in common. Continue reading Even potential participants of a research integrity conference commit plagiarism, organizers learn
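The screening rule the organizers describe reduces to a simple decision procedure: flag any abstract whose similarity index exceeds 30%, then call it potential self-plagiarism when the matched text shares at least one author with the submission, and suspected plagiarism otherwise. Here is a minimal, hypothetical sketch of that rule in Python; the data structure and field names are assumptions for illustration, since the actual similarity scores and matched sources come from Turnitin.

```python
from dataclasses import dataclass, field

@dataclass
class Submission:
    # Hypothetical record: in practice the similarity index and the authors
    # of matched sources would come from Turnitin's report, not from code.
    title: str
    authors: set
    similarity_index: float              # percent overlap reported by the checker
    matched_source_authors: set = field(default_factory=set)

SIMILARITY_THRESHOLD = 30.0              # cutoff the WCRI organizers report using

def classify(sub):
    """Apply the organizers' stated rule to one submission."""
    if sub.similarity_index <= SIMILARITY_THRESHOLD:
        return "not flagged"
    # Above the threshold: self-plagiarism if the overlapping text shares
    # at least one author with the submission, plagiarism otherwise.
    if sub.authors & sub.matched_source_authors:
        return "potential self-plagiarism"
    return "suspected plagiarism"

# Illustrative example (made-up names): 37% overlap with a source that
# shares an author with the submission.
abstract = Submission("Example abstract", {"A. Author", "B. Coauthor"},
                      37.0, {"A. Author"})
print(classify(abstract))                # -> potential self-plagiarism
```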

Oft-quoted paper on spread of fake news turns out to be…fake news

*See update at end of post

The authors of a much-ballyhooed 2017 paper about the spread of fake news on social media have retracted their article after finding that they’d botched their analysis.

The paper, “Limited individual attention and online virality of low-quality information,” presented an argument for why bogus facts seem to gain so much traction on sites such as Facebook. According to the researchers — from the Shanghai Institute of Technology, Indiana University and Yahoo — the key was the sheer volume of bad information, which competes for limited attention spans and time and swamps the brain’s ability to discern the real from the merely plausible or even the downright ridiculous.

As they reported: Continue reading Oft-quoted paper on spread of fake news turns out to be…fake news