Retraction Watch

Tracking retractions as a window into the scientific process

Archive for the ‘psychology’ Category

Context matters when replicating experiments, argues study

Background factors such as culture, location, population, or time of day affect the success rates of replication experiments, a new study suggests.

The study, published today in the Proceedings of the National Academy of Sciences, used data from the psychology replication project, which found that only 39 out of 100 experiments lived up to their original claims. The authors conclude that more “contextually sensitive” papers — those whose background factors are more likely to affect their replicability — are slightly less likely to be reproduced successfully.

They summarize their results in the paper:

Written by Dalmeet Singh Chawla

May 23rd, 2016 at 3:00 pm

PLOS editors discussing authors’ decision to remove chronic fatigue syndrome data

After PLOS ONE allowed authors to remove a dataset from a paper on chronic fatigue syndrome, the editors are now “discussing the matter” with the researchers, given the journal’s requirements about data availability.

As Leonid Schneider reported earlier today, the 2015 paper was corrected May 18 to remove an entire dataset; the authors note that they were not allowed to publish anonymized patient data, but can release it to researchers upon request. The journal, however, requires that authors make their data fully available.

Here’s the correction notice:

Written by Alison McCook

May 20th, 2016 at 3:45 pm

Retractions aren’t enough: Why science has bigger problems

Andrew Gelman

Scientific fraud isn’t what keeps Andrew Gelman, a professor of statistics at Columbia University in New York, up at night. Rather, it’s the sheer number of unreliable studies — uncorrected, unretracted — that have littered the literature. He tells us more, below.

Whatever the number of retractions, they’re a tiny fraction of the number of papers that are just wrong — by which I mean they present no good empirical evidence for their claims.

I’ve personally had to correct two of my published articles.

Publicly available data on thousands of OKCupid users pulled over copyright claim

The Open Science Framework (OSF) has pulled a dataset from 70,000 users of the online dating site OkCupid over copyright concerns, according to the study author.

The release of the dataset generated concerns because it made personal information — including personality traits — publicly available.

Emil Kirkegaard, a master’s student at Aarhus University in Denmark, told us that the OSF removed the data from its site after OkCupid filed a claim under the Digital Millennium Copyright Act (DMCA), which requires the host of online content to remove it under certain conditions. Kirkegaard also submitted a paper based on this dataset to the journal he edits, Open Differential Psychology. But with the dataset no longer public, the fate of the paper is subject to “internal discussions,” he told us.

In place of the dataset on OSF, this message now appears:

Written by Alison McCook

May 16th, 2016 at 3:10 pm

“I shared:” Can tagging papers that share data boost the practice?

Psychological Science

After a journal began tagging papers that adopted open science practices — such as sharing data and materials — a few other scientists may have been nudged into doing the same.

In January 2014, Psychological Science began awarding digital badges to authors who committed to open science practices such as sharing data and materials. A study published today in PLOS Biology looks at whether publicizing such behavior helps encourage others to follow their lead.

The authors summarize their main findings in the paper:

Written by Dalmeet Singh Chawla

May 12th, 2016 at 2:00 pm

Biologist critiques own paper, journal retracts it — against her wishes

The journal Evolution has retracted a 2007 paper about the roles of the different sexes in searching for mates, after the same author critiqued the work in a later paper.

The case raises important questions about when retractions are appropriate, and whether they can have a chilling effect on scientific discourse. Although Hanna Kokko of the University of Zurich, Switzerland — who co-authored both papers — agreed that the academic literature needed to be corrected, she didn’t want to retract the earlier paper; the journal imposed that course of action, said Kokko.

Let’s take a look at the retraction note:

Authors retract, replace highly cited JAMA Psych paper for “pervasive errors”

Authors have retracted a highly cited JAMA Psychiatry study about depression after failing to account for some patient recoveries, among other mistakes.

It’s a somewhat unusual notice — it explains that the paper has been retracted and replaced with a new, corrected version.

The study, which included 452 adults with major depressive disorder, concluded that cognitive therapy plus medication works better to treat depression than pills alone. But after it was published, a reader pointed out that some of the numbers in a table were incorrect. The authors reviewed the data and redid their analysis, and discovered “a number of pervasive errors.”

The notice (termed “notice of retraction and replacement”) explains the consequences of those errors:

“Science advances incrementally:” Researchers who debunked gay canvassing study move field forward

David Broockman

Joshua Kalla

How easy is it to change people’s minds? In 2014, a Science study suggested that a short conversation could have a lasting impact on people’s opinions about gay marriage – but left readers disappointed when it was retracted only months later, after the first author admitted to falsifying some of the details of the study, including data collection. We found out about the problems with the paper thanks to Joshua Kalla at the University of California, Berkeley, and David Broockman at Stanford University, who tried to repeat the remarkable findings. Last week, Kalla and Broockman published a Science paper suggesting that what the 2014 paper claimed was, in fact, correct – they found that 10-minute conversations about the struggles facing transgender people reduced prejudice against them for months afterward. We spoke with Kalla and Broockman about the remarkable results from their paper, and the shadow of the earlier retraction.

Retraction Watch: Let’s start with your latest paper. You found that when hundreds of people had a short (average of 10 minutes) face-to-face conversation with a canvasser (some of whom were transgender), they showed more acceptance of transgender people three months later than people with the same level of “transphobia” who’d talked to the canvasser about recycling. Were you surprised by this result, given that a similar finding from Michael LaCour and Donald Green, with same-sex marriage, had been retracted last year?

Written by Alison McCook

April 14th, 2016 at 9:30 am

What if we tried to replicate papers before they’re published?

Martin Schweinsberg

Eric Uhlmann

We all know replicability is a problem – consistently, many papers in various fields fail to replicate when put to the test. But instead of testing findings after they’ve gone through the rigorous and laborious process of publication, why not verify them beforehand, so that only replicable findings make their way into the literature? That is the principle behind a recent initiative called The Pipeline Project (covered in The Atlantic today), in which 25 labs checked 10 unpublished studies from the lab of one researcher in social psychology. We spoke with that researcher, Eric Uhlmann (also last author on the paper), and first author Martin Schweinsberg, both based at INSEAD.

Retraction Watch: What made you decide to embark upon this project?

Written by Alison McCook

March 31st, 2016 at 2:00 pm

Neuroscience journal retracts paper for lack of “scientific soundness”

An unusual article that considered the concept of change from a systems perspective — including change in medicine, economics, and decision-making — has, well, changed from “published” to “retracted.”

After commenters on PubPeer called the 2014 paper “gibberish” and even suggested it might be computer-generated, Frontiers in Computational Neuroscience retracted it, noting it “does not meet the standards of editorial and scientific soundness” for the journal, according to the retraction notice. The paper’s editor and author maintain there was nothing wrong with the science in the paper.

Here’s the full note for “Sensing risk, fearing uncertainty: systems science approach to change:”