Before we present this week’s Weekend Reads, a question: Do you enjoy our weekly roundup? If so, we could really use your help. Would you consider a tax-deductible donation to support Weekend Reads, and our daily work? Thanks in advance.
The week at Retraction Watch featured a neither-correction-nor-retraction that made no one happy, a debate over an obesity intervention that ended without a resolution, and the retraction of a study that led to hyped claims about the dangers of tuna. Here’s what was happening elsewhere:
- A new scam: Fake acceptance letters. One society has seen seven over the past five years. (Angela Cochran, The Scholarly Kitchen)
- “What happens when concerns about the reproducibility crisis in science get picked up by political activists?” Michael Schulson wonders in Undark as a new report on reproducibility is released.
- Predatory journals are committing cybercrimes and should be reported to relevant authorities, says a new paper. (International Journal of Nursing Practice)
- The editor of the American Journal of Political Science has resigned, effective immediately, after his use of the journal to defend himself against allegations of sexual harassment led to an uproar. And a number of the journal’s editorial board members have urged that the leadership of the society that publishes the journal “appoint immediately a team of empowered subfield co-editors to work with Jacoby on the transfer of editorial control.”
- An NIH research project on opioids won’t be partially funded by the pharmaceutical industry after all. (Lev Facher, STAT)
- “A new study estimates that it will take 16 years for women and men to publish papers in equal numbers. For physics, it will take 258.” (Ed Yong, The Atlantic)
- “No matter where librarians choose to start, they represent excellent partners in the research enterprise — that can help overcome the reproducibility crisis.” (Letisha Wyatt, JOVE)
- “What those and many other stories failed to note, however, was that three of the scientists behind the study in question had financial conflicts as tangled as a bowl of spaghetti, including ties to the world’s largest pasta company, the Barilla Group.” BuzzFeed’s Stephanie Lee takes a look at research into pasta.
- A student at the University of Johannesburg had claimed that a professor had stolen his PhD research data. The university says that’s not true. (Kgaugelo Masweneng, Times Live)
- “How much research goes completely uncited?” wonders Simon Baker of Times Higher Education. See an analysis of the same question by Nature’s Richard van Noorden from December.
- India’s top business school, the Indian Institute of Management Ahmedabad, “is hit by another plagiarism controversy with a senior faculty member at the centre of it.” (Niyati Rana, Ahmedabad Mirror)
- As plagiarism allegations grow at India’s Jawaharlal Nehru University, the Delhi High Court ruled that such behavior can’t be tolerated among faculty. (Business Standard)
- The president of Hobart and William Smith Colleges has resigned, after facing allegations he had plagiarized in his PhD thesis. (Rick Seltzer, Inside Higher Ed)
- “The world is waiting for the citation graph to become a public good,” writes Dario Taraborelli. (BoingBoing)
- How do researchers perceive research misconduct in biomedical science, and how would they prevent it? (Accountability in Research, sub req’d)
- “Missing the point: are journals using the ideal number of decimal places?” (Adrian Barnett, F1000 Research)
- “An early halt to a trial of deep brain stimulation for depression reveals little about the treatment but more about the changing nature of clinical trials.” (David Dobbs, Mosaic)
- A 28-year-old allegation of plagiarism continues to dog a physics professor facing other charges. (Rick Karlin, Albany Times-Union)
- “The Karolinska case exemplifies the power of institutional actors to support dubious operators with a gift for international marketing and rapid publishing and to protect them in contexts of mounting external criticism.” Two management scholars dissect the Paolo Macchiarini case. (Research Policy; sub req’d)
- “Reluctance to shame those who breach editorial ethics has dented confidence in research integrity,” say the authors of a new paper. (Times Higher Education, sub req’d) We interviewed one of the authors, Adam Cox, last month.
- “From the point of view of statistical reporting consistency, [experimental philosophy] seems to do no worse, and perhaps even better, than psychological science.” (PLOS ONE)
- “Nature authors say a reproducibility checklist is a step in the right direction, but more needs to be done.” (Nature)
- What open peer review at The BMJ may reveal: Conflicts of interest. (Kevin Lomangino, Health News Review)
- “With these changes, we sought to publish fewer papers, but with a higher quality.” A prominent physics journal decides to publish dramatically less. (Smriti Mallapaty, Nature Index)
- “We all have them. Somewhere in a desk drawer or a forgotten folder lies the zombie paper, waiting.” (Jonathan Downie, The Research Whisperer)
- “Your CV would have to list retractions, with an explanation.” (Alan Finkel, The Conversation)
- “Are there really ‘only so many ways to write the Methods’?” asks Stephen B. Heard. (Scientist Sees Squirrel)
- “The problem is not that I, as a gay white man, was told by a colleague in scholarly publishing that I ‘shouldn’t prance so much.’” (John Linton, The Scholarly Kitchen)
- Craig Wright, the “self-proclaimed Bitcoin creator purportedly borrowed mathematical equations from another paper…without giving citations.” (Neer Varshney, Hard Fork)
- “What happens when a scientific journal publishes information that turns out to be false?” asks Michael Spagat, referring to questionable data about deaths in Iraq due to the war fought there. (The Conversation)
Like Retraction Watch? You can make a tax-deductible contribution to support our growth, follow us on Twitter, like us on Facebook, add us to your RSS reader, sign up for an email every time there’s a new post (look for the “follow” button at the lower right part of your screen), or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at [email protected].
Will definitely consider making a donation. Need to read a few more issues to make up my mind.
Very interested in seeing retractions broken down by field, insofar as this can be done. My suspicion is that papers guided by comprehensive theory (e.g., physics, chemistry) are less likely to be retracted. Comments?
Would be delighted if you made a donation, thank you. Feel free to click through our archives for more posts — we’ve published more than 4,000.
Re: retractions by field, are you familiar with our database? It’s a far more comprehensive look at retractions than is possible on the blog. retractiondatabase.org
I wonder how much more limelight this pointless “Is it plagiarism to reuse, in the Methods section of your new paper, a two-sentence paragraph that already has the density of a neutron star?” pseudo-argument will receive. I wish this were the most nagging problem with scientific publishing these days.
I like the way you select tempting quotes to entice us to click on articles. One minor suggestion though: reading this long list each week takes work. It might take less work if the articles were grouped into categories, so that thematically similar topics appeared adjacent to one another.
I’m not sure what topics would be appropriate (systematic academic studies versus opinion pieces versus articles about particular incidents?). Regardless, I think some organization might help.
Thanks for considering the idea!
Very interesting to see Spagat continue the spat over the Burnham et al. Lancet Iraq paper. Spagat published a mathematical model tendentiously claiming that Burnham’s results were due to a “Main Street Bias” whereby the sampling method used was more likely to find deaths because it *began* household sampling from a house selected two streets away from a main street (apparently such streets are “a natural habitat for patrols, convoys, police stations, road-blocks, cafes, and street-markets,” thereby meaning household members are somehow more likely to be casualties.)
The article makes several dubious claims. For instance, it is not surprising that comprehensive Lancet casualty estimates differ greatly from those of the Iraq Body Count project, which tallies newspaper reports of deaths; not every Iraqi death during the war was reported in a newspaper.
Burnham’s paper caused a great deal of anger and was politically inopportune. There were some minor corrections that had to be made to the original Lancet paper, and there were some difficulties sharing data because of the confidentiality promised to respondents and the dangerous nature of Iraq at the time, but Spagat’s characterisation of it here is heavily biased.