Archive for the ‘psychology’ Category
The editor’s note — which reads like an Expression of Concern — reiterates the journal’s policy that authors make data and materials available upon request, and notes that staff are following up on “concerns” raised about the study.
There have been numerous requests for data from the “PACE” trial, as the clinical trial is known, which the authors say they have refused in order to protect patient confidentiality. On November 13, James Coyne, a psychologist at the University Medical Center, Groningen, submitted a request for the data from the PLOS ONE paper to King’s College London, where some of the authors were based. According to Coyne’s WordPress blog (he also has a blog hosted by PLOS), the journal asked him to let them know if he “had any difficulties obtaining the data.” He did — KCL denied the request last Friday (the whole letter is worth reading):
The university considers that there is a lack of value or serious purpose to your request. The university also considers that there is improper motive behind the request. The university considers that this request has caused and could further cause harassment and distress to staff.
Last author Peter White at Queen Mary University of London, UK, told us the journal had not asked them to release the data, but said he would work with PLOS to address any questions:
We understand PLOS One are following up concerns expressed about the article, according to their internal processes. We will be happy to work with them to address any queries they might have regarding the research.
Here’s the editor’s note for “Adaptive Pacing, Cognitive Behaviour Therapy, Graded Exercise, and Specialist Medical Care for Chronic Fatigue Syndrome: A Cost-Effectiveness Analysis,” in full:
Obesity has retracted a study that suggested overweight people may be less depressed than their slimmer counterparts in cultures where fat isn’t stigmatized, after realizing the authors lied about having ethical approval to conduct the research.
The authors claimed their research protocol had been approved by Norwegian and Bangladeshi ethical committees, but, according to the retraction note, part of the study “was conducted without the required approval of the university ethics board.” The journal’s managing editor told us there is no evidence of harm to the study subjects.
Social psychologist Diederik Stapel has notched his 58th retraction, after admitting he fabricated data in yet another article.
He’s holding onto his 4th place spot on our leaderboard.
This latest retraction is for “Correction or comparison? The effects of prime awareness on social judgments,” published in the European Journal of Social Psychology. As usual for Stapel, this paper has been retracted because he fabricated data.
Here’s the note:
Making error detection easier – and more automated: A guest post from the co-developer of “statcheck”
We’re pleased to present a guest post from Michèle B. Nuijten, a PhD student at Tilburg University who helped develop a program called “statcheck,” which automatically spots statistical mistakes in psychology papers, making it significantly easier to find flaws. Nuijten writes about how such a program came about, and its implications for other fields.
Readers of Retraction Watch know that the literature contains way too many errors – particularly, as some research suggests, in my field of psychology. And there is evidence that the problem is only likely to get worse.
To reliably investigate these claims, we wanted to study reporting inconsistencies at a large scale. However, extracting statistical results from papers and recalculating the p-values is not only very tedious, it also takes a LOT of time.
So we created a program known as “statcheck” to do the checking for us, by automatically extracting statistics from papers and recalculating p-values. Unfortunately, we recently found that our suspicions were correct: Half of the papers in psychology contain at least one statistical reporting inconsistency, and one in eight papers contain an inconsistency that might have affected the statistical conclusion.
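The actual statcheck tool is an R package that parses APA-formatted test statistics (t, F, r, χ², z) and flags reported p-values that don’t match the recomputed ones. As a rough illustration of the core idea only, here is a minimal Python sketch restricted to z statistics; the regex, the example sentence, and the consistency tolerance are all illustrative assumptions, not statcheck’s actual rules (statcheck checks rounding consistency rather than a fixed tolerance).

```python
import math
import re

def p_from_z(z):
    """Two-sided p-value for a z statistic, via the normal CDF (math.erf)."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Illustrative APA-style pattern, e.g. "z = 1.80, p = .04"
STAT_RE = re.compile(r"z\s*=\s*([\d.]+),\s*p\s*=\s*(\.\d+)")

def check(text, tolerance=0.005):
    """Yield (reported_p, recomputed_p, consistent) for each reported statistic."""
    for m in STAT_RE.finditer(text):
        z, reported_p = float(m.group(1)), float(m.group(2))
        recomputed = p_from_z(z)
        yield reported_p, recomputed, abs(recomputed - reported_p) <= tolerance

# An inconsistent report: z = 1.80 actually gives p ≈ .072, not .04
for reported, recomputed, ok in check("the effect was reliable, z = 1.80, p = .04"):
    print(reported, round(recomputed, 3), ok)  # prints: 0.04 0.072 False
```

Scaled up over thousands of papers, this extract-and-recompute loop is what turns a tedious manual audit into an automated one.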
The origins of statcheck date back to 2011. Read the rest of this entry »
Following questions about the veracity of multiple papers by his former employer, high-profile social psychologist Jens Förster has agreed to retract two papers as part of a deal with the German Society for Psychology (DGPs).
Last year, Förster had a paper retracted at the request of his former employer, the University of Amsterdam (UvA). In May, an investigation commissioned by UvA found that many of his experiments looked “too good to be true,” and eight papers showed strong signs of “low veracity.”
Just two of those papers are acknowledged in the settlement of a case by the DGPs against Förster, who currently works at Ruhr University Bochum. Here’s a translation of the DGPs notice, provided by One Hour Translation:
A study that found a 15-fold increase in the rate of sexual trauma among men in the U.S. military — and sparked suggestions in conservative media outlets of “an epidemic of male-on-male sex crimes” in the military — has been retracted because of a flaw in the analysis.
The study, published just last week, appeared in Psychological Services, an American Psychological Association (APA) journal. In an announcement Sunday titled “American Psychological Association Retracts Article Positing Excessively High Rates of Sexual Trauma Among Military Men,” the APA said that “Scholars raised valid concerns regarding the design and statistical analysis which compromise the findings.” Here’s the text: Read the rest of this entry »
A paper published in August that caught the media’s eye for concluding that feeling sad influences how you see colors has been retracted, after the authors identified problems that undermined their findings.
The authors explain the problems in a detailed retraction note released today by Psychological Science. They found that sadness influenced how people perceived blues and yellows, but not reds and greens; to support that conclusion, however, they needed to test whether the two effects actually differed from each other. Once they ran that additional comparison, the conclusion no longer held up.
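The statistical point here — that one effect being significant and another not does not, by itself, show the two effects differ — can be illustrated with made-up numbers. The effect sizes and standard errors below are purely hypothetical (not the paper’s data), and simple z-tests stand in for whatever analysis the authors actually used:

```python
import math

def two_sided_p(z):
    """Two-sided p-value from a z statistic (normal approximation)."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical effect estimates with standard errors:
blue_yellow, se_by = 0.20, 0.09   # nominally significant on its own (p < .05)
red_green,  se_rg = 0.08, 0.09    # not significant on its own (p > .05)

p_by = two_sided_p(blue_yellow / se_by)
p_rg = two_sided_p(red_green / se_rg)

# The claim "sadness affects blue-yellow but not red-green" needs a direct
# test of the *difference* between the two effects:
diff = blue_yellow - red_green
se_diff = math.sqrt(se_by**2 + se_rg**2)
p_diff = two_sided_p(diff / se_diff)   # well above .05 here

print(round(p_by, 3), round(p_rg, 3), round(p_diff, 3))
```

With these numbers, the first effect clears the .05 threshold and the second doesn’t, yet the difference between them is nowhere near significant — exactly the kind of missing comparison the retraction note describes.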
In the retraction note for “Sadness impairs color perception,” the editor reinforces that there was no foul play:
A disability journal is “paying significant attention” to papers authored by Anna Stubblefield, a former Rutgers researcher recently convicted of sexually assaulting a disabled man who participated in her research.
Stubblefield was convicted of sexually assaulting “DJ,” a man in his thirties with cerebral palsy who was “declared by the state to have the mental capacity of a toddler,” according to a lengthy piece in the New York Times. Stubblefield and DJ published papers in Disability Studies Quarterly; in one, Stubblefield describes a controversial technique which she claimed helped DJ communicate. But when she eventually used the technique to say DJ was in love with her, his family took her to court, and she was convicted of aggravated sexual assault.
In the original paper, the authors claimed that three out of eight patients who underwent a procedure that used gamma rays to kill brain cells showed improvements 12 months later (versus zero in the group who underwent a “sham” procedure). But after a reader noticed an “inadvertent” error in the calculation of how many patients had improved, the authors realized that only two of the patients had responded meaningfully to the procedure.
The new results “did not reach statistical significance,” the authors write in a “Notice of Retraction and Replacement.” JAMA Psychiatry published it yesterday, along with a new version of the article, a letter from psychiatrist Christopher Baethge pointing out the error, and an editorial. The original article is available in the supplemental material of the new version, with the errors highlighted.
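The notice doesn’t say which test the authors reran, but as a rough illustration of why the corrected counts fall short of significance, a Fisher’s exact test on 2 of 8 responders versus 0 of 8 gives a p-value well above .05. A stdlib-only sketch for a 2×2 table:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table.
    """
    row1, row2, col1 = a + b, c + d, a + c
    total = comb(row1 + row2, col1)

    def prob(k):  # probability that row 1 contains k of the col-1 successes
        return comb(row1, k) * comb(row2, col1 - k) / total

    p_obs = prob(a)
    return sum(prob(k)
               for k in range(max(0, col1 - row2), min(row1, col1) + 1)
               if prob(k) <= p_obs + 1e-12)

# Corrected counts: 2 of 8 responders with the procedure, 0 of 8 with sham.
p = fisher_exact_two_sided(2, 6, 0, 8)
print(round(p, 3))  # ≈ 0.467, far from significance
```

With only two responders against zero in a sample this small, the exact p-value is roughly 0.47 — consistent with the authors’ statement that the replaced results “did not reach statistical significance.”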
Here’s the note in full for “Gamma ventral capsulotomy for obsessive-compulsive disorder: a randomized clinical trial,” which explains the error:
A study that looked at how entrepreneurs’ confidence levels change depending on market conditions has been corrected to fix an error that flipped the results of one of the experiments.
The paper was published in 2013 by the Strategic Management Journal, and explored how entrepreneurs stay confident in difficult marketplaces by studying how people reacted to tasks of varying difficulty. In one experiment, participants were asked how well they thought they did on an easy quiz and how well they did on a hard quiz. Results showed that “participants underestimated their scores on the easy quiz” and “overestimated their performance on the difficult quiz.” However, the authors wrote the opposite in the final paper.
Here’s the correction notice for “Making Sense of Overconfidence in Market Entry”: