Archive for the ‘psychology’ Category
After a journal began tagging papers that adopted open science practices — such as sharing data and materials — a few other scientists may have been nudged into doing the same.
In January 2014, Psychological Science began awarding digital badges to authors who committed to open science practices, such as sharing data and materials. A study published today in PLOS Biology looks at whether publicizing such behavior helps encourage others to follow their lead.
The authors summarize their main findings in the paper:
The case raises important questions about when retractions are appropriate, and whether they can have a chilling effect on scientific discourse. Although Hanna Kokko of the University of Zurich, Switzerland — who co-authored both papers — agreed that the academic literature needed to be corrected, she didn't want to retract the earlier paper; the journal, she said, imposed that course of action.
It’s a somewhat unusual notice — it explains that the paper has been retracted and replaced with a new, corrected version.
The study, which included 452 adults with major depressive disorder, concluded that cognitive therapy plus medication works better to treat depression than pills alone. But after it was published, a reader pointed out that some of the numbers in a table were incorrect. The authors reviewed the data and redid their analysis, and discovered “a number of pervasive errors.”
The notice (termed “notice of retraction and replacement”) explains the consequences of those errors:
How easy is it to change people’s minds? In 2014, a Science study suggested that a short conversation could have a lasting impact on people’s opinions about gay marriage – but left readers disappointed when it was retracted only months later, after the first author admitted to falsifying some details of the study, including how the data were collected. We found out about the problems with the paper thanks to Joshua Kalla at the University of California, Berkeley and David Broockman at Stanford University, who tried to repeat the remarkable findings. Last week, Kalla and Broockman published a Science paper suggesting that the 2014 paper’s central claim was, in fact, correct – they found that 10-minute conversations about the struggles facing transgender people reduced prejudice against them for months afterwards. We spoke with Kalla and Broockman about the remarkable results from their paper, and the shadow of the earlier retraction.
Retraction Watch: Let’s start with your latest paper. You found that when hundreds of people had a short (average of 10 minutes) face-to-face conversation with a canvasser (some of whom were transgender), they showed more acceptance of transgender people three months later than people with the same level of “transphobia” who’d talked to the canvasser about recycling. Were you surprised by this result, given that a similar finding from Michael LaCour and Donald Green, with same-sex marriage, had been retracted last year?
We all know replicability is a problem – across many fields, papers routinely fail to replicate when put to the test. But instead of testing findings after they’ve gone through the rigorous and laborious process of publication, why not verify them beforehand, so that only replicable findings make their way into the literature? That is the principle behind a recent initiative called The Pipeline Project (covered in The Atlantic today), in which 25 labs checked 10 unpublished studies from the lab of one researcher in social psychology. We spoke with that researcher, Eric Uhlmann (also last author on the paper), and first author Martin Schweinsberg, both based at INSEAD.
Retraction Watch: What made you decide to embark upon this project?
An unusual article that considered the concept of change from a systems perspective — including change in medicine, economics, and decision-making, for instance — has, well, changed from “published” to “retracted.”
After commenters on PubPeer called the 2014 paper “gibberish” and even suggested it might be computer-generated, Frontiers in Computational Neuroscience retracted it, noting it “does not meet the standards of editorial and scientific soundness” for the journal, according to the retraction notice. The paper’s editor and author maintain there was nothing wrong with the science in the paper.
High-profile social psychologist Jens Förster has earned two retractions following an investigation by his former workplace. He agreed to the retractions as part of a settlement with the German Society for Psychology (DGPs).
The papers are two of eight that were found to contain “strong statistical evidence for low veracity.” According to the report from an expert panel convened at the request of the board of the University of Amsterdam, following
an extensive statistical analysis, the experts conclude that many of the experiments described in the articles show an exceptionally linear link. This linearity is not only surprising, but often also too good to be true because it is at odds with the random variation within the experiments.
One of those eight papers was retracted in 2014. In November, the American Psychological Association received an appeal to keep two of the papers, and Förster agreed to the retraction of two more:
A psychiatric journal has pulled a 2014 paper that found electroconvulsive therapy and exercise helped people with depression, after the authors determined they had mistakenly analyzed the wrong data.
According to the retraction notice from the Journal of Psychiatric Research, the researchers had “erroneously analyzed” data from a previous study they had published the year before.
Here’s more from the note for “Electroconvulsive therapy and aerobic exercise training increased BDNF and ameliorated depressive symptoms in patients suffering from treatment-resistant major depressive disorder:”
A psychology journal is retracting a 2015 paper that attracted press coverage by suggesting women’s hormone levels drive their desire to be attractive, after a colleague alerted the last author to flaws in the statistical analysis.
The paper, published online in November, found women prefer to wear makeup when there is more testosterone present in their saliva. The findings were picked up by various media including Psychology Today (“Feeling hormonal? Slap on the makeup”), and even made it onto reddit.com.
However, upon discovering a problem in the analysis of the data, the authors realized that central finding didn’t hold up, according to Psychological Science‘s interim editor, Stephen Lindsay:
After a group of researchers noticed an error that affected the analysis of a survey of psychologists working with medical teams to help pediatric patients, they didn’t just issue a retraction — they published a commentary explaining what exactly went wrong.
The error was discovered by a research assistant who was assembling a scientific poster and noticed the data didn’t align with what was reported in the journal. The error, the authors note, was:
an honest one, a mistake of not reverse coding a portion of the data that none of the authors caught over several months of editing and conference calls. Unfortunately, this error led to misrepresentation and misinterpretation of a subset of the data, impacting the results and discussion.
Needless to say, these authors — who use their “lessons learned” to help other researchers avoid similar missteps — earn a spot in our “doing the right thing” category. The retraction and commentary both appear in Clinical Practice in Pediatric Psychology.
Their first piece of advice in “Retraction experience, lessons learned, and recommendations for clinician researchers” — assume errors will happen, rather than assuming they won’t: