An article estimating how many people might have died during the first wave of the COVID-19 pandemic due to the off-label use of hydroxychloroquine in hospitals was retracted in August after advocates for the drug launched a campaign criticizing the study.
In a statement to Retraction Watch, the journal stood by its decision to retract the article, citing “some clear fatal flaws” identified in letters to the editor. It said it declined to publish the letters themselves because their tone was “not suitable for publication in a scientific journal.”
Retraction Watch readers may have noticed an uptick in items in the RW Daily and Weekend Reads about scientific integrity issues in Vietnam over the past year. Many of those items had their genesis, and were circulated, in a Facebook group that now has close to 100,000 members — and was recently removed temporarily by Facebook. We asked Van Tu Duong, a researcher at Purdue University in West Lafayette, Indiana, USA, who founded the group, to tell us more about the history of the effort. This email interview has been lightly edited for flow and clarity.
Tell us about the history of the group. Why did you found it, and when?
Last year, a researcher at a U.S. university received an email offering what the subject line described as a “great opportunity to publish an article.”
The author of the email, Mahdi Shariati, an adjunct professor of civil engineering at Ton Duc Thang University, in Vietnam, said he had read one of the researcher’s papers and was impressed by its quality. “It would be an honor for me to collaborate with you and jointly present your remarkable work,” Shariati added.
The Declaration of Helsinki on ethical principles for research involving human participants now includes a statement on scientific integrity and research misconduct.
Adopted in 1964 by the World Medical Association, the Declaration of Helsinki was conceived in response to the atrocities committed during World War II in the name of medical research on human subjects. The initial document – which has been updated many times over the last 60 years – included five key principles, among them the primacy of informed consent, the need for a rigorous calculation of a study’s risks and benefits, and a consideration of its scientific value – that is, the experiment should be valuable to science and to the subjects involved.
In the recent process of revising the declaration, the World Medical Association added the following two sentences to the “general principles” section of the document:
Citing eLife’s unusual practice of publishing articles without accepting or rejecting them, Clarivate says it is re-evaluating the inclusion of the open-access biology journal in Web of Science, its influential database of abstracts and citations.
In contrast to the other journals whose indexing was recently placed on hold, including Elsevier’s Science of the Total Environment, Clarivate has cited a specific policy as the reason for re-evaluating eLife: “Coverage of journals/platforms in which publication is decoupled from validation by peer review.”
A Clarivate spokesperson described the policy as applying to “journals that do not make an editorial decision to accept or reject based on peer reviewers’ comments.”
A cancer researcher at the University of Cambridge in the UK has retracted a paper from Cell after commenters on PubPeer questioned aspects of 10 images in the article.
Steve Jackson
Though an institutional investigation found the figures were “not reliable,” another of the authors objected to the retraction as “an overreaction.”
Steve Jackson, the University of Cambridge biology professor and lab leader, previously retracted two papers – one in Nature and one in Science, posted on the same day – after a Cambridge investigation found a co-author, Abderrahmane Kaidi, had falsified data.
The sudden death of a 27-year-old woman in the Romanian offices of MDPI, a major open-access publisher with a worldwide presence, has grabbed national headlines and raised questions about the conditions under which the firm’s employees work.
Local news reports said the woman had initially fainted in MDPI’s Bucharest office on Friday, October 4, but that her superiors refused to call an ambulance or let her go home after she revived. She later collapsed again and died from a heart attack after efforts to resuscitate her failed, according to the reports.
But in an interview with Retraction Watch, a colleague of the deceased woman, identified as Maria Alexandra Anghel, contested the media’s account of events.
“At a time when scientists and scientific research are already being criticised by persons who identify science with technology and who deplore some of the consequences of technology, dishonesty among scientists causes unease among scientists themselves and regretful or gleeful misgivings among publicists who are critical of science.”
Daryl Chubin wrote that in 1985 — a time when institutions we now take for granted, like the U.S. Office of Research Integrity, did not yet exist. We asked him to reflect on what has happened in the intervening four decades.
The phrase “misconduct in research” today is a quaint reminder of how much science has been captured by for-profit, politicized, international interests. As a four-decades-removed social researcher of misconduct, I marvel at how an investigation industry has emerged to monitor, analyze, report and decry the mischief around us. This “watcher community” represents an industry in an era of science most of us never envisioned.
In the days before the Office of Research Integrity, many accused researchers and their academic institutions were grasping for an accountability structure that was fair to all parties – adhering to due process – and swift in its resolutions. Good luck! Today, the headlines in Retraction Watch reflect a publishing industry seemingly under siege – awash in retractions, plagiarism, AI mischief, undeclared conflicts of interest, whistleblowing, and a subset of ills that are dizzying and disconcerting to degrees never seen before.
Retraction Watch monitors an industry ever more self-conscious about misdeeds in research, from analysis to interpretation to reporting. By setting the threshold low, it focuses on misdeeds that may be rare in a particular field, but substantial when aggregated across fields. Yes, there is a risk of overgeneralizing from statistical anomalies, but readers care about violations that sully “their” field.