The week at Retraction Watch featured revelations about a cancer researcher in Canada and an author’s worst nightmare come true. Here’s what was happening elsewhere:
- Meet one of the most-cited papers ever, with more than 150,000 citations. (Jack Grove, Times Higher Education)
- A journal is holding a competition — funded by Novartis — to reward positive peer-reviewed articles about a Novartis drug. (Larry Husten, Cardiobrief)
- James Hartley found that “to my surprise, some pieces that I felt had made major contributions were hardly cited at all.” So he cited 20 of them. (Scientometrics, sub req’d)
- The National Natural Science Foundation of China disclosed 117 cases of scientific misconduct, hoping to improve the country’s research integrity. (Global Times) See our coverage of misconduct in China here.
- Some scientific disciplines have way more listed authors per paper than others. PhD Comics explains.
- Are replications even worth it if most subsequent papers will only cite the original paper? asks Tom Coupe. (The Replication Network)
- “First, do not bring a blog post to a primary literature fight.” Never cite blog posts during peer review, argues Timothee Poisot. (Medium)
- A comparison of the Impact Factor and Elsevier’s new CiteScore metric. (Eigenfactor)
- Don’t discard the Impact Factor, say Lutz Bornmann and Alexander I. Pudovkin. (arXiv)
- Although “citation data and perceived impact do not align well” in a new survey, “when scientists have full information and are making unbiased choices, expert opinion on impact is congruent with citation numbers.” (arXiv)
- University College London launches an investigation into its ties with controversial surgeon Paolo Macchiarini. (Hannah Devlin, The Guardian) See our timeline of the case here.
- Sweden’s misconduct board will grow, partly because of an increase in recent cases including that of Paolo Macchiarini. (Press release, in Swedish)
- German research libraries intend to cancel their Elsevier subscriptions over the publisher’s refusal to adopt open access practices. (Cory Doctorow, Boing Boing)
- “Ultimately I think the only only only solution here is post-publication review.” Pre-publication review is a waste of everyone’s time, argues Andrew Gelman.
- The rector of King Juan Carlos University plagiarized six other authors, the University of California says. (Pilar Alvarez, El Pais, in Spanish)
- Not all replications are created equal, and there are nuances to what a replication study means, says Dorothy Bishop.
- “Female biomedical scientists tend to publish fewer articles as last author than their male colleagues and accrue fewer citations per publication.” A new study tried to figure out why. (Epidemiology, sub req’d)
- “Students are affected by research metrics even before they enter university, most of the time without even knowing it.” (Matthias Tinzl, Metrics in Research)
- “Do preprints ‘count’ for research assessment? Is it ok to post preprints in more than one place?” Cameron Neylon et al discuss. (bioRxiv)
- “The Board of Trustees of the International Human Frontier Science Program Organization (HFSPO) has decided that for competitions starting in calendar year 2017, applicants may list preprint articles in the publication section of HFSP proposals.” (via Richard Sever)
- “Can the behavioral sciences self-correct?” asks Felipe Romero. (Studies in History and Philosophy of Science Part A, sub req’d)
Like Retraction Watch? Consider making a tax-deductible contribution to support our growth. You can also follow us on Twitter, like us on Facebook, add us to your RSS reader, sign up on our homepage for an email every time there’s a new post, or subscribe to our daily digest. Click here to review our Comments Policy. For a sneak peek at what we’re working on, click here.
Weird to see you refer to the Bradford paper as the most cited paper in history.
According to a Nature piece from about two years ago, it is a paper by Lowry that holds that distinction, with Bradford’s paper only in third place, behind a paper by Laemmli:
http://www.nature.com/news/the-top-100-papers-1.16224
Using Google Scholar, Bradford is likewise “only” third.
THE gets it right: it is “one of the most cited papers”.
Fixed, thanks.
Regarding the most cited paper in the history of science, the first author commented in an autobiographical account [1] that after the first submission of this “not very original” paper to the JBC, it was returned by the editors for drastic shortening. According to Oliver Lowry, “this shortening may have improved the paper, but forced us to omit some details that perhaps would have lessened the plethora of papers by others describing improvements and precautions”.
The case is discussed in a very interesting study by Juan Miguel Campanario (Univ. de Alcalá de Henares, Spain), who reviewed a collection of such high-profile papers that were at first ignored by editors and/or readers [2].
[1] O. H. Lowry “How to Succeed in Research Without Being a Genius”, Annu. Rev. Biochem. 59 (1990), 1-27. doi: 10.1146/annurev.bi.59.070190.000245
[2] J. M. Campanario “Consolation for the Scientist: Sometimes It is Hard to Publish Papers That are Later Highly-Cited”, Social Studies of Science, 23 (1993) 342-362.
http://www.jstor.org/stable/285483
(Paywalled; however, I have a hard copy if anyone is interested.)