Before we present this week’s Weekend Reads, a question: Do you enjoy our weekly roundup? If so, we could really use your help. Would you consider a tax-deductible donation to support Weekend Reads, and our daily work? Thanks in advance.
The week at Retraction Watch featured a reminder that sometimes science just needs more bullshit; a call to make misconduct investigation reports public; and a puzzle about why retractions took so long. Here’s what was happening elsewhere:
- “The incoming dean of a leading Canadian pharmacy school has ‘voluntarily withdrawn’ from the new position after a book review he wrote was retracted from The Lancet in May.”
- “A commission of inquiry of the University of Tübingen has found scientific misconduct by the renowned brain researcher Niels Birbaumer and his colleague Ujwal Chaudhary.”
- “A less frequently used strategy is to submit a research manuscript to medium-impact journals. If the journal accepts the manuscript with only minor suggestions for improvement, authors then withdraw the paper and aim for a higher–impact factor journal.”
- “Journalists must realize the harm that can be caused when they fail to detect spin and promote it to their readers.” Our Ivan Oransky is one of the authors of a new study of how spin in coverage of medical studies affects perceptions.
- “About 6 months ago, a reporter from the New York Times drew our attention to the possibility that the disclosure of conflicts of interest of authors of a research article that we published about 12 months prior may not have been complete.” And now the journal, the Clinical Journal of the American Society of Nephrology, has made four corrections.
- “Given the nature of authors’ claims, which would constitute a serious criticism of a methodology that has been tried and tested over many years, we strongly urge your editorial board to reflect on the validity of this research and whether it is in fact a publishable item.” Clarivate pushes back on a study that suggested a flaw in its methodology for counting citations.
- “Credit data generators for data reuse: To promote effective sharing, we must create an enduring link between the people who generate data and its future uses, urge Heather H. Pierce and colleagues.”
- “Respondents said that the most important element that would enable the better reproducibility of published research would be that authors describe methods and analyses in detail.”
- “If all funding agencies were to mandate posting of preprints by grantees—an approach we term Plan U (for “universal”)—free access to the world’s scientific output for everyone would be achieved with minimal effort.”
- “[W]hilst a shift to gold (pay to publish) open access would deliver wider access to research, the lack of price sensitivity amongst academics presents a risk that they will be locked into a new escalating pay to publish system that could potentially be more costly to researchers than the previous subscription model.”
- “That’s when disaster struck – not because of a problem with the research, nor an unlucky break, but because of a reckless act that triggered a grim change in the course of my life.”
- “We find continued increase in the representation of women as authors in academic medicine but demonstrate that disparity still exists, especially in the last author position.”
- “A priest accused of decades of plagiarism will no longer be attending the annual conference of a left-wing American priests’ association.”
- “After outcry, USDA will no longer require scientists to label research ‘preliminary.’” We had written about the policy earlier.
- “Research integrity is much more than misconduct,” writes Nature. “All researchers should strive to improve the quality, relevance and reliability of their work.”
- IEEE has lifted a ban on peer reviewers from Huawei, after clarifications from the US government about sanctions.
- This paper was so badly done, the journal didn’t just withdraw it — they withdrowned it.
- The “Next step of Plan S will require publishers to release acceptance rates and review times.” (Rachael Pells, Times Higher Education)
- India’s Council of Scientific & Industrial Research has called for an investigation into more than 100 papers questioned on PubPeer.
- A preprint with impact: “For instance, the overlap between the two Cabell lists reportedly was due to an internal system error by Cabell’s Scholarly Analytics and was rectified after we published the preprint.”
- When it comes to misconduct, what makes a story a story? Our Ivan Oransky’s slides from WCRI2019 in Hong Kong.
- The former rector of the University of Amsterdam is facing charges of plagiarism.
- The British Journal of Anaesthesia publishes two conclusions for a single paper, “to broaden replicability efforts beyond just methods and results.”
- “Clinical researchers can now share initial versions of their manuscripts through a free preprint server.”
- A Stanford researcher who studied chronic fatigue syndrome has been fired for sexual harassment and misconduct.
- A unique correction: “The badges for Open Data and Open Materials were initially not awarded to this article because the authors had not applied for them. The Editor in Chief subsequently noted that the article was eligible for both badges, and since the data and materials had been uploaded prior to the article’s acceptance, he asked the authors if they wished to apply.”
Like Retraction Watch? You can make a tax-deductible contribution to support our work, follow us on Twitter, like us on Facebook, add us to your RSS reader, sign up for an email every time there’s a new post (look for the “follow” button at the lower right part of your screen), or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at [email protected].
RE: https://www.thehindu.com/sci-tech/science/csir-orders-probe-into-journals/article27473309.ece
“On June 3, the Council of Scientific & Industrial Research (CSIR) began an investigation into the large-scale manipulation and/or duplication of images within the same paper or in different papers by scientists at the Lucknow-based Indian Institute of Toxicology Research (CSIR-IITR).
At last count, 130 papers published in peer-reviewed journals by scientists from the institution have problems with the images. A chief scientist at the institute Dr. Yogeshwer Shukla alone has published 40 such papers.”
Another CSIR institute, this time in Kolkata: https://www.thehindu.com/sci-tech/science/csir-indian-institute-of-chemical-biology-scientist-has-28-papers-with-manipulated-duplicated-images/article27697386.ece
When it comes to the firing of Montoya, I just do not understand why it is the university, and not the legal system, that should judge facts concerning sexual harassment. Moreover, we have no evidence by which to judge whether the university is acting correctly. The legal system is supposed to be more independent of other contingent concerns. That story definitely makes me feel uneasy.
Regarding the Birbaumer study, it’s a bit disturbing how long PLOS Biology delayed publication of the evidence questioning the validity of the statistical analysis. This commentary https://www.sueddeutsche.de/wissen/niels-birbaumer-locked-in-syndrom-als-1.4478914 (in German) suggests that many fellow scientists had serious questions about this work but, due to the prominence and influence of Birbaumer, were reluctant to come forward.
Today the German Research Foundation (DFG) decided to sanction Prof. Birbaumer and Dr. Chaudhary. They will be ineligible to apply for DFG funding, and Prof. Birbaumer must return funds received from the DFG. They have furthermore been asked to retract two joint publications.
https://www.forschung-und-lehre.de/forschung/dfg-beschliesst-massnahmen-gegen-hirnforscher-birbaumer-2141/
(in German)
“A less frequently used strategy is to submit a research manuscript to medium-impact journals. ”
Less frequently than submitting to high-impact journals for peer review rather than acceptance, an occurrence they describe as entirely hypothetical (based on the ludicrous suggestion that higher-impact journals produce higher-quality peer review). So I’m skeptical that anyone has ever submitted to a mid-tier journal, then withdrawn after a favourable review to submit to a higher-tier journal under the naive impression they’d also get a favourable review there.
This editorial made little sense. Even if there are authors that withdraw manuscripts after receiving comments on their work (which I agree has to be exceedingly rare), what’s the downside if ultimately a better study ends up being published? The real problem is that peer review remains a largely altruistic exercise, with few tangible benefits for the referee.
“The real problem is that peer review remains a largely altruistic exercise, with few tangible benefits for the referee.”
And that should definitely change. What mechanism could enforce that peer review gets professionalized?
Brian, COPE cites one such similar case, https://publicationethics.org/case/unethical-withdrawal-after-acceptance-maximize-impact-factor, and I have heard of at least one or two other cases.