This week at Retraction Watch featured a look at the huge problem of misidentified cell lines, a check-in with a company that retracted a paper as it was about to go public, and Diederik Stapel’s 58th retraction. Here’s what was happening elsewhere:
- What do PhDs earn, and where do they end up working? A new study in Science takes a look (sub req’d). Here’s Nature’s coverage of the paper.
- King’s College London doesn’t want to release data to James Coyne from a study of chronic fatigue syndrome. See if the absurd reasons make your blood boil as much as ours: “The university considers that there is a lack of value or serious purpose to your request. The university also considers that there is improper motive behind the request. The university considers that this request has caused and could further cause harassment and distress to staff.” And here’s more from someone else who submitted a related request for data.
- “Authorship abuse is the dark side of collaboration,” says Bruce Macfarlane in Times Higher Education.
- “Let’s stop pretending peer review works,” write Julia Belluz and Steven Hoffman at Vox.
- “Should post-publication peer review be anonymous?” Paul Benneworth, Philip Moriarty, and PubPeer founder Brandon Stell debate in Times Higher Education.
- Some scientists are removing journal titles from their publication lists. In a journal that begins with the letter N and ends in the letter E, Dalmeet Singh Chawla explains why.
- “Would you trust a plagiarizing doctor?” Our newest column for STAT.
- Pay peer reviewers, says Joseph Ting in The Australian (sub req’d).
- Weary of poor job prospects, postdocs are disappearing, writes Beryl Benderly in Science. And that’s just fine, says Josh Nicholson in STAT.
- Reporting statistical significance causes p-hacking, says Nicole Janz.
- “Our results also suggest that a bias toward accepting statements as true may be an important component of pseudo-profound bullshit receptivity.” A new study in Judgment and Decision Making.
- “Make Science More Reliable, Win Cash Prizes.” Ed Yong writes about the Leamer-Rosenthal Prizes for Open Social Science.
- “How do we fix bad science?” asks Laurie Zoloth in Cosmos.
- In ecology, “a rank order and hierarchy has been gradually formed among countries,” according to a new analysis in PeerJ.
- Patrick Harran, a UCLA professor who settled charges following the death of a research assistant in his lab, has been honored by the AAAS, The Daily Bruin reports.
- Will academic journals still exist in 2035? Cameron Neylon gives his take, looking back at the history of scientific publication.
- Dead metrics: Why won’t these go away? asks Jeffrey Beall.
- “Should a uniform checklist be adopted for methodological and statistical reporting?” asks Aner Tal in Public Understanding of Science (sub req’d, though SAGE made this article freely available following our post).
- Dutch universities and Elsevier have reached a deal over open access, reports Times Higher Education.
- “Many researchers don’t share their raw data like they’re supposed to,” reports Cynthia McKelvey at The Daily Dot, riffing on a new PLOS ONE paper.
- Journal self-citation practices revealed, by Philip Cohen.
- Does PubMed Central increase citations? Phil Davis takes a look at The Scholarly Kitchen.
- Peer reviewers should make open practices a pre-condition for more comprehensive review, say the creators of the Peer Reviewers’ Openness Initiative.
- Really, Justice Scalia? “Aspiring black scientists may do better in ‘lesser schools’ where they don’t feel that ‘they’re being pushed ahead in classes that are too fast for them,’” the Supreme Court justice said this week (STAT).
- The ResearchGate Score is a good example of a bad metric, say Peter Kraker, Katy Jordan and Elisabeth Lex.
- How did the new head of a Belgian research funding agency end up with a fake prize? (in Dutch)
- What does the “publishing exception” to U.S. trade sanctions laws really mean? Elsevier’s Mark Seeley discusses a recent response from the responsible agency.
- The Royal Society will now publish the citation distributions of all of its journals, reports Stephen Curry, and Nature Chemistry will do the same. The Society will also require ORCID IDs for authors, reports Times Higher Education.
- “The UK’s learned society of educational researchers has been accused of seeking to take ‘editorial control’ over one of its journals, prompting the resignation of several editorial team members,” Times Higher Education reports.
- What will anyone actually learn from teaching metrics? asks Athene Donald in Times Higher Education.
- Stop specializing so much, scientists, says Thomas Bateman in The Conversation. “Big ideas come from understanding the big picture and making cross-boundary connections, not only from eking out incremental advances in an esoteric subfield.”
- Here are twelve bad reasons for rejecting scientific studies, brought to you by The Logic of Science.
- “[T]he more elitist a journal, the more biased its decisions, unavoidably,” says Khaled Moustafa in Scientometrics (sub req’d).
- “Is science being skewed by a gender bias?” asks Joe Humphreys in The Irish Times.
- Not The Onion: “The Least Interesting Unit: A New Concept for Enhancing One’s Academic Career Opportunities.” A paper in Science & Engineering Ethics (sub req’d).
- Researchers are grappling with authorship issues, this time on social media, Dalmeet Singh Chawla reports for Nature.
- Neuroskeptic writes that “one of the most popular approaches to analyzing fMRI data is flawed,” according to a new pre-print.
Retractions Outside of The Scientific Literature:
- Ruffling some feathers: A retraction involving the Birds of Paradise Condomiums.
- Donald Trump has had an honorary degree revoked.
- “The story was wrong and should not have been published,” says The Financial Times of a piece on interest rates. “The article was one of two pre-written stories — covering different possible decisions — which had been prepared in advance of the announcement. Due to an editing error it was published when it should not have been.”
Like Retraction Watch? Consider making a tax-deductible contribution to support our growth. You can also follow us on Twitter, like us on Facebook, add us to your RSS reader, and sign up on our homepage for an email every time there’s a new post. Click here to review our Comments Policy.
Boris Barbour and Gabor Brasnjo did most of the heavy lifting for our debate with Philip Moriarty. They did a great job and deserve the credit, not me.
There’s an error in the link about Patrick Harran’s election to the AAAS reported in the Daily Bruin.
Fixed, thanks.
Delighted to see you bringing attention to the refusal of the authors of the £5 million, taxpayer-funded PACE trial of exercise therapy for chronic fatigue syndrome to hand over their data to Professor James Coyne.
Professor Coyne requested the data of one of PACE’s papers that was published in PLOS One. He cited PLOS One’s requirement that authors who submit papers to the journal agree to share their data.
The PACE authors bizarrely treated this as a request under the Freedom of Information Act rather than under PLOS One’s rules, made him wait the maximum 20 working days for no good reason, and then refused his request as “vexatious”.
Patients, who have long criticised the appallingly bad science in PACE, are hoping that PLOS One will be the first scientific institution to stand up to the PACE authors and enforce its data-sharing policy on pain of retraction of the paper.
Patients want good science, and they want bad science exposed. By insisting on keeping the data from independent researchers, the PACE authors are dragging their own reputations, those of their universities, and that of this £5 million trial through the dirt.
Some may find my most recent papers published this week of interest.
Teixeira da Silva, J.A. (2015) What’s not being discussed, or considered, in science publishing? The Journal of Microbiology & Biology Education 16(2): 130-132.
http://jmbe.asm.org/index.php/jmbe/article/view/928/pdf_221
DOI: 10.1128/jmbe.v16i2.928
Teixeira da Silva, J.A., Dobránszki, J. (2015) The role of the anonymous voice in post-publication peer review versus traditional peer review. KOME 3(2): 90-94.
http://komejournal.com/files/KOME_Silva-Dobranszki.pdf
DOI: 10.17646/KOME.2015.27
Be careful what you wish for: peer review is like democracy – a flawed and sometimes ridiculous system that is much better than the alternatives.
Sacha noted: “Professor Coyne requested the data of one of PACE’s papers that was published in PLOS One. He cited PLOS One’s requirement that authors who submit papers to the journal agree to share their data.” That being true, the obvious action of the authors or PLOS editors is to retract their paper in PLOS, because it violates the journal’s requirements. It is an interesting situation.
I don’t know about a retraction at this time, but perhaps the PLOS editors are considering issuing an expression of concern. They have done so on at least one occasion in the recent past, for an instance in which authors failed to share a strain of bacteria.
The PLOS ONE Editors (2014) Expression of Concern: Bacillus pumilus Reveals a Remarkably High Resistance to Hydrogen Peroxide Provoked Oxidative Stress. PLoS ONE 9(7): e100716. doi:10.1371/journal.pone.0100716
Reading the university’s claim that a request to release data is “vexatious” reminds me of Yogi Berra’s “déjà vu all over again.”
https://en.wikipedia.org/wiki/Freedom_of_Information_requests_to_the_Climatic_Research_Unit
http://www.theguardian.com/environment/georgemonbiot/2010/apr/08/hacked-emails-freedom-of-information
The fact that large organizations (including publishers) want to be secretive and protect their own, even when it involves disregarding their own policies, shouldn’t come as a surprise to anyone.
It’s what they do.
See https://jcoynester.wordpress.com/2015/12/12/formal-request-to-plos-one-to-issue-an-expression-of-concern-for-pace-cost-effectiveness-study/ for a formal request to PLOS ONE to issue an Expression of Concern. I also propose in this request to retract the paper if Professor Coyne has not received full access to all raw research data within one month.
See http://www.plosone.org/annotation/listThread.action?root=87754 for a “Notification from PLOS staff” relating to http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0040808
(…..) “PLOS staff are following up on the different concerns raised about this article as per our internal processes. As part of our follow up we are seeking further expert advice on the analyses reported in the article, and we will evaluate how the request for the data from this study relates to the policy that applies to the publication. These evaluations will inform our next steps as we look to address the concerns that have been noted.”
Oh wow, I was reporting 12 years ago on the mess that U.S. economic sanctions created for scientific publishing. Amazing that OFAC is still being called on for clarification. Hopefully it’s now clearer!
I don’t want to be picky, but isn’t the http://www.apache.be article about the Fake Prize in Flemish rather than Dutch?
Maybe. However, Wikipedia states [1]:
“Linguistically and formally, Flemish is not and does not refer to a current language or dialect but refers to the region, culture and people of (West) Belgium or Flanders. Flemish people speak (Belgian) Dutch in Flanders, the Flemish part of Belgium.”
From what I understand, the distance between Flemish and (Netherlands) Dutch is similar to that between French spoken in France and Belgium.
translate.google.com readily detects the apache article as being in Dutch.
[1] https://en.wikipedia.org/wiki/Flemish
Well, if Wikipedia says it…
FYI, Google Translate doesn’t have Flemish listed, and Dutch is the nearest thing. The two differ by about 10k words in common use.
The link to http://www.ncbi.nlm.nih.gov/Vaztw, labelled as “someone else who submitted a related request data” in the second item, persistently generates a “Can’t find the requested web page” response from the NCBI website.
Thanks for flagging that, the URL shortener appears to have stopped working. Fixed, along with the missing “for.”
A Tweet today by Elisabeth Bik adds an excellent reason for the need for anonymity in post-publication peer review:
“I enjoy anonymous peer review or posting on @PubPeer because those are rare occasions that I feel “equal” to men.”
https://twitter.com/MicrobiomDigest/status/676510351332999169
But why only in post-publication peer review and not, for example, in the initial peer review as well, or even in anonymous manuscript submission, to avoid any bias?
Peer review is ailing, so anonymous submission and anonymous peer review may be a good remedy to limit the damage.
Post-publication peer review on its own risks turning publication into a vicious cycle of continuous, endless quarrels, and thus a waste of time.
A retraction may be brewing re. Harran’s AAAS Fellow election – http://chemjobber.blogspot.com/2015/12/patrick-harrans-nomination-as-aaas.html
Here is some background on the problem with Patrick Harran’s election:
http://wavefunction.fieldofscience.com/2015/12/the-aaass-bizarre-nomination-of-prof.html
===|==============/ Keith DeHavelle
Not a retraction, but the AAAS “will not move forward” with the nomination of Patrick Harran as an AAAS Fellow. http://www.aaas.org/news/aaas-chemistry-section-will-not-proceed-nomination-patrick-harran-fellow