This week at Retraction Watch featured some big numbers: How a request to correct a single paper turned into 19 retractions, and 18 tips for giving horrible presentations. Here’s what was happening elsewhere:
- Paid peer reviewers: A new paper explores how they could promote scientific quality and offer new career paths. (bioRxiv)
- “The revolution will not be embargoed.” Scientific embargoes let journals, universities, and government agencies, rather than journalists, decide what’s newsworthy, says our co-founder Ivan Oransky in Vox. But longtime PR pro Brian Reid warns against scrapping embargoes altogether. And “Publication before peer review could undermine the use of embargoed press releases, but the benefits outweigh the risks,” writes Stephen Curry. (ResearchResearch)
- Both the number of data repositories and the amount of data they contain are growing too quickly for biomedical researchers to make sense of it all. Richard Harris discusses the “embarrassment of riches” in biomedical research. (NPR)
- “It’s now clear that research is increasingly relying on very, very large datasets, and in Nature Physics we’re seeing a disconnect between the narrative contained in an article and the sheer amount of data, analysis and interpretation required to understand the paper,” says editor Andrea Taroni. (Rebecca Pool, Research Information)
- Post-publication criticism is vital to the scientific process, but it must be constructive if it is to do any good, says Nature.
- “He sent the paper to 11 different journals to be assessed 16 times by 22 reviewers before it was published and wrote thousands of emails about it.” Why did it take five years to get this paper published? (Holly Else, Times Higher Education)
- Scientific fraud isn’t the biggest problem in science. Our co-founders explain what is. (STAT)
- Should peer reviewers be responsible for catching scientific fraud, or is that beyond their scope? asks Neuroskeptic. (Discover)
- Ten simple rules for structuring papers, courtesy of Konrad Kording and Brett Mensh. (bioRxiv)
- “Do highly productive researchers have significantly higher probability to produce top cited papers? Or do high productive researchers mainly produce a sea of irrelevant papers—in other words do we find a diminishing marginal result from productivity?” A new paper in PLOS ONE tries to answer.
- “The boom in co-authorship more than compensated for the drop in individual productivity.” Why research papers have more and more authors. (The Economist)
- There’s a biology journal that can teach physics journals a bit about peer review, says Raymond Goldstein. (arXiv)
- What makes a paper worthy of peer review and acceptance by a journal? A Q&A with Helle V. Goldman, Chief Editor of Polar Research. (Editage Insights)
- “…BMC Psychology is launching a pilot to trial a new ‘results-free’ peer-review process, whereby editors and reviewers are blinded to the study’s results, initially assessing manuscripts on the scientific merits of the rationale and methods alone. The aim is to improve the reliability and quality of published research, by focusing editorial decisions on the rigour of the methods, and preventing impressive ends justifying poor means.”
- The peer review process is biased toward prominent researchers, says Brian Resnick. (Vox)
- “Reproducibility is a Cinderella problem, in need of some attention and investment if it’s to flourish.” (Mike Taylor, Digital Science)
- “Is the removal of this paper a victory for good sense over the irrational theory of vaccine denial? Or is it, on the contrary, censorship of a brave dissenting voice?” (Neuroskeptic, Discover)
- “Are footnotes a way to game the Impact Factor?” asks Zen Faulkes. (NeuroDojo)
- Preprints have been slow to take root in biology, but lately efforts to increase their use in the life sciences have found some success. (Elie Dolgin, Nature)
- A new paper suggests ways to identify likely false-positive findings. (Biological Reviews)
- Psychological research could play a helpful role in assisting the Office of Research Integrity’s effort to understand the root causes of research misconduct, says ORI scientist-investigator Ann A. Hohmann. (Association for Psychological Science)
- “Time for Elsexit?” asks Tim Gowers, unpacking the recently negotiated deal between Elsevier and UK universities.
- Meta-analysis is rare in basic biology, but could help improve reproducibility, says Nature Methods.
- European Research Council grants are “biased against women,” according to a new analysis. (ResearchResearch)
- The idea that there’s little evidence for the benefits of flossing is based on “misconceptions about the relation between scientific research, evidence, and expertise,” says Jamie Holmes. (New York Times)
- “How can you evaluate a research paper?” Andrew Gelman has some ideas, but he’s not sure.
- Singapore universities have adopted a “unified set of standards for research publications, to advance research ethics.” (Laxmi Iyer, Biotechin Asia)
- More than 500 Wiley journals will now require ORCID IDs of submitting authors. (Press release) Major chemistry publishers are also joining the push. (Emma Stoye, Chemistry World, sub req’d)
- Elsevier has implemented the FORCE11 Joint Declaration of Data Citation Principles for more than 1,800 journals, which “means that authors publishing with Elsevier are now able to cite the research data underlying their article, contributing to attribution and encouraging research data sharing with research articles.” (Press release) More on how this will work at Elsevier Connect.
Like Retraction Watch? Consider making a tax-deductible contribution to support our growth. You can also follow us on Twitter, like us on Facebook, add us to your RSS reader, sign up on our homepage for an email every time there’s a new post, or subscribe to our daily digest. Click here to review our Comments Policy. For a sneak peek at what we’re working on, click here.