This week in Nature, Daniele Fanelli at Stanford made an interesting proposal: Set up a system of “self-retraction” that makes it crystal clear when a paper is pulled for honest error, rather than misconduct.
Fanelli, whose work we have frequently covered, rightly notes that honest error accounts for only a minority of retractions, around 20%. To remove any suggestion of misconduct from such cases, Fanelli proposes designating as self-retractions those in which all of a paper's authors sign the retraction notice:
To remove ambiguities, journal policies should allow authors to sign only retractions that the researchers have solicited spontaneously, because of a documentable flaw. In all other cases, retraction notes should not be signed — at least not by the authors recognized as responsible for misconduct.
Of course, authors would have to provide proof that the mistakes were due to honest error rather than misconduct. Once that has been validated, they should be praised for correcting the record, he notes in "Set up a 'self-retraction' system for honest errors":
Self-retractions should be considered legitimate publications that scientists would treat as evidence of integrity. Self-retractions from prestigious journals would be valued more highly, because they imply that a higher sacrifice was paid for the common good. Scientists who committed misconduct would be unable to benefit.
Indeed, we’ve seen evidence to suggest that researchers who initiate their own retractions after discovering mistakes suffer no citation penalty as a result. In other words, “doing the right thing” is simply that — the right thing.
What do you think of Fanelli’s proposal? Tell us in the poll below.