This week at Retraction Watch featured the retraction of a widely covered paper on marriage and illness, and the resignation of a high-profile lab head in Toronto. Here’s what was happening elsewhere:
- “Is there fame bias in editorial choice?” A debate continues (paywalled).
- Retractions are “indicators of science’s unique and growing penchant for telling the truth,” write David Broockman and Joshua Kalla, who uncovered the fraud in the LaCour Science paper on gay canvassing earlier this year.
- “Scientists are hoarding data, and it’s ruining medical research,” says Ben Goldacre.
- Should all researchers start posting their work on pre-print servers? Michael Eisen comments on a piece by Ron Vale. Relevant: Climate scientist James Hansen and colleagues basically did just that earlier this week with a piece of research they hoped would influence upcoming talks.
- “Getting the killer paper is not as important as doing science properly and rigorously, because you want to contribute positively to the scientific community, not find yourself on Retraction Watch,” says Nature Chemical Biology editor Catherine Goodman.
- Want a refund of your (hidden) author fees? Just cite your own paper six times, says Chinese Chemical Letters (via Jeffrey Beall).
- A tour of academia’s silly side, from Glen Wright of Academia Obscura.
- Researchers who patent their work tend to have higher h-indices, according to a new study (paywalled).
- When submitting a manuscript, “does the cover letter really matter?” (paywalled)
- Another way to spot SCIgen-generated manuscripts, from Diego Raphael Amancio (paywalled).
- “Some doctors don’t read medical journals.” Louise Radnofsky reports on why many pregnant women get ultrasounds they don’t need.
- That “killer kale” meme is based on terrible science, says Julia Belluz.
- “CaseLabs, a manufacturer of computer cases, has issued a letter of retraction and apology for its recent attacks upon competitor Thermaltake, Inc., another manufacturer of computer cases and other computer components.”
- “Following cries of foul play,” Ohio “retracted its recent evaluations for charter-school sponsors, citing concerns with ‘methodology’ and a desire to make sure the ratings are ‘credible, accurate and compliant’ with state law,” The Cincinnati Enquirer reports.
- Science needs more female mice, argues The New York Times.
- Nature ran with another short embargo for journalists this week. This time, Science played along, too.
- “How Authorship Is Defined by Multiple Publishing Organizations and STM Publishers,” from Jaime A. Teixeira da Silva and Judit Dobránszki (paywalled).
- When it comes to reporting on climate change, journalists “have radically redefined the component of objectivity known as ‘balance,’” report Sara Shipley Hiles and Amanda Hinnant. “They now advocate a ‘weight-of-evidence’ approach, where stories reflect scientific consensus.” (paywalled)
- A study by David R. Johnson and Elaine Howard Ecklund suggests “that ethics training in science should focus not only on fabrication, falsification, and plagiarism and more routine forms of misconduct, but also on strategies for resolving ethically ambiguous scenarios where appropriate action may not be clear.” (paywalled)
- A debate over whether books on Hinduism should be banned has led to charges of plagiarism against one of the proponents.
- “The media in China was so misled that it ranked one sponsored story among world top 10 news.” A report on university-sponsored supplements in leading journals (paywalled).
- Russia’s “escalating encroachment on democratic freedoms undermines the nation’s claim of support for science.”
- “Global Advanced Research Journals” are none of the above, says Jeffrey Beall.
- Could a new tool strengthen the voice of science in journalism? Emmanuel Vincent hopes so.
- In The Pipeline, a popular blog about chemistry and related issues that often discusses fraud and scientific error, is moving to Science Translational Medicine.
- How are academics portrayed in children’s books? An interview with Melissa Terras.
- Three ways to survive grant writing, from Jessica Breland.
- A group of scientists has joined forces to create a database of flawed chemical reagents, to save others time and money.
- When it comes to statistics, “Be on full alert when you see something you really want, or don’t want, to believe,” says Hilda Bastian. “The biggest bias we have to deal with is our own.”
- The U.S. Department of Health and Human Services’ definition of statistical significance is pretty bad, says Andrew Gelman.
- One consequence of open data sharing, from Jack Gallant: “*sigh*. A paper that we’ve been working on for 1.5 years was just scooped by another lab that re-analyzed open data from my lab. Ah well…”
- The American Society for Biochemistry and Molecular Biology — which publishes the JBC — offers “good practices for reporting numerical and statistical results.”
Like Retraction Watch? Consider supporting our growth. You can also follow us on Twitter, like us on Facebook, add us to your RSS reader, and sign up on our homepage for an email every time there’s a new post. Click here to review our Comments Policy.
Related to Moustafa’s paywalled JSEE paper on the relevance of the cover letter, I have also provided an alternative open-access perspective elsewhere:
Teixeira da Silva, J.A. (2015) Make the cover letter extinct. Journal of Educational and Social Research 5(2): 11-12.
http://www.mcser.org/journal/index.php/jesr/article/view/6549
DOI: 10.5901/jesr.2015.v5n2p11
“Scientists are hoarding data, and it’s ruining medical research,” says Ben Goldacre.
The title of the article seems very misleading given the content. Goldacre’s main issue, stated towards the end of the article, is that “there is a replication crisis throughout research”, since the original analysis often doesn’t hold up after independent reanalysis. That may be true, but his case that it’s due to “hoarding” is weak at best. Goldacre doesn’t define “hoarding”, but suggests it happens when scientists refuse to hand over an entire data set (“all too often the original researchers duck, dive, or simply ignore requests”; “a trial withheld by a zealot, or by a company with money to lose from transparency”).
In two of the three cases he describes (both related to deworming), Goldacre makes no suggestion that the authors involved refused to hand over data. In fact, Goldacre specifically praises Edward Miguel and Michael Kremer for handing over all original data from the 2004 study to independent reviewers in 2013. He also praises Richard Doll and Richard Peto for their large and methodologically sound study. The only criticism of Doll & Peto is that their data checking and analysis took many years – due to lack of funding – and the results were not published in time for a recent Cochrane Review. There was no suggestion that they hoarded data.
The Johnson and Ecklund paper on ethical ambiguity in science is paywalled at the journal Science and Engineering Ethics, but they posted a commentary in Physics Today on the same study that is freely accessible. The link is here:
http://scitation.aip.org/content/aip/magazine/physicstoday/article/68/6/10.1063/PT.3.2796