Archive for the ‘psychology’ Category
The study, published today in the Proceedings of the National Academy of Sciences, used data from the psychology replication project, which found that only 39 out of 100 experiments lived up to their original claims. The authors conclude that more “contextually sensitive” papers — those whose results are more likely to depend on background factors — are slightly less likely to be reproduced successfully.
They summarize their results in the paper.
After allowing the authors of a paper on chronic fatigue syndrome to remove a dataset from it, the editors of PLOS ONE are now “discussing the matter” with the researchers, given the journal’s requirements about data availability.
As Leonid Schneider reported earlier today, the 2015 paper was corrected May 18 to remove an entire dataset; the authors note that they were not allowed to publish anonymized patient data, but can release it to researchers upon request. The journal, however, requires that authors make their data fully available.
Scientific fraud isn’t what keeps Andrew Gelman, a professor of statistics at Columbia University in New York, up at night. Rather, it’s the sheer number of unreliable studies — uncorrected, unretracted — that have littered the literature. He tells us more, below.
Whatever the vast majority of retractions are, they’re a tiny fraction of the number of papers that are just wrong — by which I mean they present no good empirical evidence for their claims.
I’ve personally had to correct two of my published articles.
The Open Science Framework (OSF) has pulled a dataset covering 70,000 users of the online dating site OkCupid over copyright concerns, according to the study author.
The release of the dataset raised privacy concerns, since it made personal information — including personality traits — publicly available.
Emil Kirkegaard, a master’s student at Aarhus University in Denmark, told us that the OSF removed the data from its site after OkCupid filed a claim under the Digital Millennium Copyright Act (DMCA), which requires the host of online content to remove it under certain conditions. Kirkegaard also submitted a paper based on this dataset to the journal he edits, Open Differential Psychology. But with the dataset no longer public, the fate of the paper is subject to “internal discussions,” he told us.
After a journal began tagging papers that adopted open science practices — such as sharing data and materials — a few other scientists may have been nudged into doing the same.
In January 2014, Psychological Science began awarding digital badges to authors who committed to open science practices such as sharing data and materials. A study published today in PLOS Biology looks at whether publicizing such behavior helps encourage others to follow their lead.
The authors summarize their main findings in the paper.
The case raises important questions about when retractions are appropriate, and whether they can have a chilling effect on scientific discourse. Although Hanna Kokko of the University of Zurich, Switzerland — who co-authored both papers — agreed that the academic literature needed to be corrected, she didn’t want to retract the earlier paper; the journal imposed that course of action, said Kokko.
It’s a somewhat unusual notice — it explains that the paper has been retracted and replaced with a new, corrected version.
The study, which included 452 adults with major depressive disorder, concluded that cognitive therapy plus medication works better to treat depression than pills alone. But after it was published, a reader pointed out that some of the numbers in a table were incorrect. The authors reviewed the data and redid their analysis, and discovered “a number of pervasive errors.”
The notice (termed “notice of retraction and replacement”) explains the consequences of those errors:
How easy is it to change people’s minds? In 2014, a Science study suggested that a short conversation could have a lasting impact on people’s opinions about gay marriage – but left readers disappointed when it was retracted only months later, after the first author admitted to falsifying some of the details of the study, including data collection. We found out about the problems with the paper thanks to Joshua Kalla at the University of California, Berkeley and David Broockman at Stanford University, who tried to repeat the remarkable findings. Last week, Kalla and Broockman published a Science paper suggesting what the 2014 paper showed was, in fact, correct – they found that 10-minute conversations about the struggles facing transgender people reduced prejudices against them for months afterwards. We spoke with Kalla and Broockman about the remarkable results from their paper, and the shadow of the earlier retraction.
Retraction Watch: Let’s start with your latest paper. You found that when hundreds of people had a short (average of 10 minutes) face-to-face conversation with a canvasser (some of whom were transgender), they showed more acceptance of transgender people three months later than people with the same level of “transphobia” who’d talked to the canvasser about recycling. Were you surprised by this result, given that a similar finding from Michael LaCour and Donald Green, with same-sex marriage, had been retracted last year?
We all know replicability is a problem – consistently, many papers in various fields fail to replicate when put to the test. But instead of testing findings after they’ve gone through the rigorous and laborious process of publication, why not verify them beforehand, so that only replicable findings make their way into the literature? That is the principle behind a recent initiative called The Pipeline Project (covered in The Atlantic today), in which 25 labs checked 10 unpublished studies from the lab of one researcher in social psychology. We spoke with that researcher, Eric Uhlmann (also last author on the paper), and first author Martin Schweinsberg, both based at INSEAD.
Retraction Watch: What made you decide to embark upon this project?
An unusual article that considered the concept of change from a systems perspective — including change in medicine, economics, and decision-making, for instance — has, well, changed from “published” to “retracted.”
After commenters on PubPeer called the 2014 paper “gibberish” and even suggested it might be computer-generated, Frontiers in Computational Neuroscience retracted it, noting it “does not meet the standards of editorial and scientific soundness” for the journal, according to the retraction notice. The paper’s editor and author maintain there was nothing wrong with the science in the paper.