Weekend reads: Science is “show me,” not “trust me;” pressure to publish survey data; what peer review misses

Before we present this week’s Weekend Reads, a question: Do you enjoy our weekly roundup? If so, we could really use your help. Would you consider a tax-deductible donation to support Weekend Reads, and our daily work? Thanks in advance.

The week at Retraction Watch featured the University of Alabama’s request for 20 retractions of papers by one of its former researchers; a sturgeon researcher who’s up to 13 retractions for fake peer review; and what happens when researchers from several high-profile institutions can’t reproduce findings. Here’s what was happening elsewhere:

Like Retraction Watch? You can make a tax-deductible contribution to support our growth, follow us on Twitter, like us on Facebook, add us to your RSS reader, sign up for an email every time there’s a new post (look for the “follow” button at the lower right part of your screen), or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at team@retractionwatch.com.

2 thoughts on “Weekend reads: Science is “show me,” not “trust me;” pressure to publish survey data; what peer review misses”

  1. “Professor Moosa said the fact the Retraction Watch website existed was evidence of the growing problem of academic misconduct.” (Erica Servini, The Australian)

    This illustrates why the written scientific literature, not oral debate, is the heart of science. No matter how smart you are, it is all too easy to say something in writing that is obviously stupid.

    For those who do not see the fallacy: could you start a Moosa Watch blog? That would then be evidence that there is something wrong with Australian science.

    P.S. It would be nice if links to paywall sites were marked as such.

  2. “39.2% revealed having been pressured by a principal investigator or collaborator to produce “positive” data. 62.8% admitted that the pressure to publish influences the way they report data.”

    Giving such numbers to three significant digits is justified only if the study included on the order of 1,000,000 or more participants, to keep the shot noise alone below the last reported digit. But then, it is well-known that 67.9273446567326232 % of all statistical studies report way too many digits.
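    The commenter's back-of-the-envelope claim can be checked with the standard binomial (shot-noise) model: the standard error of an estimated proportion p from n respondents is sqrt(p(1-p)/n), and the last reported digit is only meaningful if that error is no larger than about half a digit. A minimal sketch (the helper name and the half-digit criterion are illustrative assumptions, not from the study):

    ```python
    import math

    def min_sample_size(p, last_digit=0.001):
        """Smallest n at which the standard error of a proportion p
        is at most half the last reported digit (binomial model)."""
        half_digit = last_digit / 2  # e.g. 0.0005 = 0.05 percentage points for "39.2%"
        return math.ceil(p * (1 - p) / half_digit ** 2)

    # Reporting "39.2%" implies a resolution of 0.1 percentage points:
    print(min_sample_size(0.392))  # -> 953344, i.e. on the order of a million
    ```

    For p near 0.4, p(1-p) is about 0.24, so the required n is roughly 0.24 / 0.0005² ≈ 10⁶, matching the comment's figure.
    
    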
