Retraction Watch

Tracking retractions as a window into the scientific process

Weekend reads: Why a vice-chancellor uses Impact Factors; plagiarizing principals; time to publish less?

with 4 comments

The week at Retraction Watch featured the tale of a scientist whose explanations for misconduct kept changing, and revelations in a big legal case involving Duke University. Here’s what was happening elsewhere:

Written by Ivan Oransky

July 1st, 2017 at 9:30 am

Posted in weekend reads

Comments
  • Andy Patterson July 2, 2017 at 5:25 pm

    “This seems challenging: The incoming principal of an Idahoan school resigns …”
    The school is in Montana, not Idaho — the principal was moving to Montana from a position in Idaho…

  • BG July 3, 2017 at 4:49 am

    Here is something to potentially add: Alexandra Elbakyan has posted “Some facts on Sci-Hub that Wikipedia gets wrong”.

    https://engineuring.wordpress.com/2017/07/02/some-facts-on-sci-hub-that-wikipedia-gets-wrong/

  • Frederick Guy July 3, 2017 at 8:34 am

    Crossley (“I confess, I do look at impact factors”) writes as a manager of people whose areas of specialization he doesn’t understand, for which reason he needs numbers, even if the numbers aren’t very good. That could be read as an argument that university decision-making has become too centralized and managerial, but it does not seem that he intended it that way.

    Everybody must make decisions based on imperfect indicators, but it is a real problem when the indicators produce behavior that degrades the thing they are meant to measure. As Crossley must know, his comparison with the heights of basketball players ignores the simple fact that athletes don’t choose their heights, while academics’ publishing strategies are chosen, and are shaped by incentives. That these high-powered incentives are creating perverse outcomes is a central claim of those criticizing their use. Crossley allows this in passing (“…can be gamed in various ways…”), but in the end he gives exactly zero weight to the problem and leaves us with a picture of metrics as measurements that are not affecting what they measure: “simply indicators or messengers… hard, cold numbers.”

  • Albert Henderson July 5, 2017 at 3:28 pm

    Input = Output! It is well established that the number of papers published rises with the amount of money spent on academic research. The notion that quality is simply diluted by quantity misses the forest for the trees. Meanwhile, the politics of academy budgeters encourages grant income but decries spending on the results.
