The week at Retraction Watch featured a look at whether we have an epidemic of flawed meta-analyses, and the story of a strange case involving climate research and pseudonyms. Here’s what was happening elsewhere:
- Here’s how to use a new scientific finding to create a tabloid headline, in five easy steps. (Gretchen Vogel, Science)
- “Science is not supposed to work this way.” 50 years ago, the sugar industry paid for favorable research that shifted nutritionists’ attention away from the harmful effects of their product. (Melissa Bailey, STAT)
- Review articles: “The Black Market of Scientific Currency?” So argue Lee W. Cohnstaedt and Jesse Poland. (Annals of the Entomological Society of North America)
- “These are parasitic jobs, and an unhealthy way of doing things.” As Iran has risen in science, so has a bazaar for buying theses and academic papers, reports Richard Stone in Science.
- “It’s a bit sarcastic.” But a proposal to build a monument to peer review is a bit serious as well. (Quirin Schiermeier, Nature)
- The EurekAlert! press release clearinghouse goes dark after being hacked, and our co-founder wonders whether the episode will show us a future without scientific journal embargoes. (Embargo Watch)
- “Under what circumstances could scientific misconduct constitute a civil or criminal wrong?” ask attorneys Callan Stein and Jim Kinnier-Wilson.
- “If authors are citing works from these predatory journals, do they really care about the science?” asks Gary W. Miller. Or is the entire process like the Gold Rush: sifting through data looking for anything that glitters? (Toxicological Sciences)
- “Capitalist science is the solution to socialist science’s replication process,” argues Bruce Knuteson. Isn’t money already a powerful force? (arXiv)
- If today’s discussion over preprints sounds familiar, maybe that’s because of this suggestion from nearly 50 years ago. (Misha Angrist, Twitter)
- These are two incredibly productive researchers, courtesy of Jeffrey Beall.
- “If you open up any journal to see if you can get the data, for most articles you can’t.” Monya Baker of Nature talks to Victoria Stodden, a new “reproducibility editor” at a statistics journal.
- Medical research can’t progress if clinical trial results are suppressed, so please share, writes Anita Slomski. (Proto)
- If research is funded by the taxpayer, it should be freely available to the public, right? It’s not that simple, writes Jeffery Salmon of the U.S. Department of Energy.
- The Lancet and The BMJ are “at war over statins.” (Larry Husten, CardioBrief)
- Confused about what exactly post-publication peer review is? Here’s a handy explanation. (Tony Ross-Hellauer, OpenAIRE)
- Maybe there should be a special data access rate for anyone calling from Retraction Watch? A bit of fun from Andrew Gelman.
- New federal U.S. rules aim to improve the dismal reporting of clinical trial results, even for drugs and devices that never make it to market. (Charles Piller, STAT)
- The new editor of The Leadership Quarterly – a journal that has seen its share of retractions – wants to make some changes.
- “So be very careful when you think, ‘this is a good study,’” says Hilda Bastian. “That’s a big trap.”
- “You may not be interested in peer review,” says Andrew Gelman, “but peer review is interested in you.”
- What do you think about open peer review? OpenAIRE would like to know.
- What role does gender bias play in the peer review process? asks Emma Sayer. (Wiley Exchanges)
- Should citations be normalized across disciplines? ask John Ioannidis and colleagues.
- “As long as public engagement remains unimportant in the process of hiring, tenure, and grant funding,” says John Hawks, “the students and early career academics who are most out of touch will continue to win positions and prestige, and students who succeed at engagement and public impact will continue to seek other opportunities.”
- “How can editors and editorial boards improve the peer-review system?” Pamela Silver introduces a new series of editorials. (Freshwater Science)
- A better editorial staff could mean a better Impact Factor, say authors of an editorial in the Journal of Plastic, Reconstructive, and Aesthetic Surgery. (sub req’d)
- Two trends – “calls for dramatically increased emphasis on replicability as the core indicator of research quality and, on the other, listing everybody who had anything to do with a research project as an author of the project report” – could combine to lead to stagnation, argues Arthur Cropley. (Psychology of Aesthetics, Creativity, and the Arts, sub req’d)
- “Asking a Loch Ness Monster enthusiast to review a book on fraud in science: kinda weird, huh?” Andrew Gelman looks back a few decades.
- “[B]etter access to quality information resources for scientific researchers lead to an increase in its use and results in higher quality research shown by the gradual increase of publications of research results in higher impact scholarly journals,” concludes a new paper in the Journal of Librarianship and Information Science. (sub req’d)
- Heather Joseph looks at the “evolving U.S. policy environment for open research data.” (Information Services & Use)
- What do we know about the links between citations and open access? (Elsevier Connect)
- Margaret Kosmala explains what pushed her to post a preprint. (Ecology Bits)
- “Numerous variables can torpedo attempts to replicate cell experiments, from the batch of serum to the shape of growth plates,” writes Monya Baker in Nature. “But there are ways to ensure reliability.”
- “Manuscripts should include all the experimental and statistical details that are needed to replicate the experiments and analyses reported in them,” says M. Dawn Teare. (eLife)
- “The trend in big, interdisciplinary science is towards multiple authors on a single paper; in bioinformatics this has created hybrid or fractional scientists who find they are being positioned not just in-between established disciplines but also in-between as middle authors or, worse still, left off papers altogether.” (Minerva)
- What’s the future of peer review? A variety of answers from the contributors to The Scholarly Kitchen.
- “A new librarian, an experienced librarian and an archivist got together to do a research project. What happened along the way was not quite what we expected.” (Letters to a Young Librarian)
- “Science journals have arcane publishing rules and unfair biases,” says Laurie Vasquez. “Here’s how scientists can fix them.” Neuroskeptic has another idea: Boycott.
- “The European Commission has announced long-awaited plans to make it easier for researchers to harvest facts and data from research papers,” writes Declan Butler in Nature, “by freeing the computer-aided activity from the shackles of copyright law.”
- We need “greater transparency on ethics committee decisions to improve trial design,” argue Jonathan Mendel and colleagues. (The BMJ)
Like Retraction Watch? Consider making a tax-deductible contribution to support our growth. You can also follow us on Twitter, like us on Facebook, add us to your RSS reader, sign up on our homepage for an email every time there’s a new post, or subscribe to our daily digest. Click here to review our Comments Policy. For a sneak peek at what we’re working on, click here.
“The firm, in its flier, advertises a knack for placing manuscripts in journals published by Springer and Elsevier.”
http://www.sciencemag.org/news/2016/09/shady-market-scientific-papers-mars-iran-s-rise-science
Will Springer and Elsevier investigate, and retract such papers that do not reflect scientists’ own intellectual efforts but instead reflect their buying power?
I find the referencing used in Bruce Knuteson’s paper quite astonishing:
https://arxiv.org/pdf/1609.03223v1.pdf
For example, one small sentence is supported by no less than 41 references, as follows: “Incremental improvements to the current system are proposed [50–90] [163].”
I believe that this is a serious abuse of citations, and that possibly 40 of those references are receiving an undeserved citation. In my publishing rule book, one fact = one citation.
The link to “looks back a few decades.” (Gelman’s second entry) is incorrectly formatted.
Fixed — thanks.
On the Beall vs Frontiers saga:
https://forbetterscience.wordpress.com/2016/09/14/beall-listed-frontiers-empire-strikes-back/
Update on Philipp Jungebluth, Paolo Macchiarini’s co-author:
https://forbetterscience.wordpress.com/2016/09/12/macchiarini-acolyte-philipp-jungebluth-lost-surgeon-job-in-heidelberg/
In full appreciation of your hard, sincere work in bringing ongoing fraud and scams in research studies to the notice of the public, we wish to inform you that we cite your website at every seminar and health course.
While plagiarism in research studies is the theft of someone’s personal property, deliberately fraudulent research studies are tantamount to making fake currency to cheat all of humanity.
Best Regards.
Talat Kamal
http://www.unitedglobalhuman.org