What if we tried to replicate papers before they’re published?

Martin Schweinsberg
Eric Uhlmann

We all know replicability is a problem – consistently, many papers in various fields fail to replicate when put to the test. But instead of testing findings after they’ve gone through the rigorous and laborious process of publication, why not verify them beforehand, so that only replicable findings make their way into the literature? That is the principle behind a recent initiative called The Pipeline Project (covered in The Atlantic today), in which 25 labs checked 10 unpublished studies from the lab of one researcher in social psychology. We spoke with that researcher, Eric Uhlmann (also last author on the paper), and first author Martin Schweinsberg, both based at INSEAD.

Retraction Watch: What made you decide to embark upon this project?

Neuroscience journal retracts paper for lack of “scientific soundness”

An unusual article that considered the concept of change from a systems perspective — including change in medicine, economics, and decision-making, for instance — has, well, changed from “published” to “retracted.”

After commenters on PubPeer called the 2014 paper “gibberish” and even suggested it might be computer-generated, Frontiers in Computational Neuroscience retracted it, noting it “does not meet the standards of editorial and scientific soundness” for the journal, according to the retraction notice. The paper’s editor and author maintain there was nothing wrong with the science in the paper.

Here’s the full note for “Sensing risk, fearing uncertainty: systems science approach to change:”

Psychologist Jens Förster earns second and third retractions as part of settlement

Jens Förster

High-profile social psychologist Jens Förster has earned two retractions following an investigation by his former workplace. He agreed to the retractions as part of a settlement with the German Society for Psychology (DGPs).

The papers are two of eight that were found to contain “strong statistical evidence for low veracity.” According to the report from an expert panel convened at the request of the board of the University of Amsterdam, following

an extensive statistical analysis, the experts conclude that many of the experiments described in the articles show an exceptionally linear link. This linearity is not only surprising, but often also too good to be true because it is at odds with the random variation within the experiments.

One of those eight papers was retracted in 2014. In November, the American Psychological Association received an appeal to keep two of the papers, and Förster agreed to the retractions of two more:


Authors used wrong dataset in study on shock therapy, exercise in depression

A psychiatric journal has pulled a 2014 paper that found electroconvulsive therapy and exercise helped people with depression, after the authors determined they had mistakenly analyzed the wrong data.

According to the retraction notice from the Journal of Psychiatric Research, the researchers had “erroneously analyzed” data from a previous study they had published the year before.

Here’s more from the note for “Electroconvulsive therapy and aerobic exercise training increased BDNF and ameliorated depressive symptoms in patients suffering from treatment-resistant major depressive disorder:”

Makeup use linked to testosterone levels? Not so fast, says retraction

A psychology journal is retracting a 2015 paper that attracted press coverage by suggesting women’s hormone levels drive their desire to be attractive, after a colleague alerted the last author to flaws in the statistical analysis.

The paper, published online in November, found women prefer to wear makeup when there is more testosterone present in their saliva. The findings were picked up by various media including Psychology Today (“Feeling hormonal? Slap on the makeup”), and even made it onto reddit.com.

However, upon discovering a problem in the analysis of the data, the authors realized that central finding didn’t hold up, according to Psychological Science’s interim editor, Stephen Lindsay:

What to do when you make a mistake? Advice from authors who’ve been there

After a group of researchers noticed an error that affected the analysis of a survey of psychologists working with medical teams to help pediatric patients, they didn’t just issue a retraction — they published a commentary explaining what exactly went wrong.

The error was discovered by a research assistant who was assembling a scientific poster and noticed that the data didn’t align with what was reported in the journal. The error, the authors note, was:

an honest one, a mistake of not reverse coding a portion of the data that none of the authors caught over several months of editing and conference calls. Unfortunately, this error led to misrepresentation and misinterpretation of a subset of the data, impacting the results and discussion.

Needless to say, these authors — who use their “lessons learned” to help other researchers avoid similar missteps — earn a spot in our “doing the right thing” category. The retraction and commentary both appear in Clinical Practice in Pediatric Psychology.

Their first piece of advice in “Retraction experience, lessons learned, and recommendations for clinician researchers” — assume errors will happen, and not vice versa:

So, pot may not be as harmless as a recent study suggested


Researchers are correcting a widely covered study that suggested chronic use of pot might not put users at risk of problems later in life.

It turns out that initial, unexpected finding — covered by Newsweek, The Washington Post, Quartz, and (of course) The Stoner’s Cookbook (now known as HERB) — wasn’t quite right, and a reanalysis found users had a small uptick in risk for psychosis. The authors have issued a lengthy correction in Psychology of Addictive Behaviors that includes some supplemental analysis, too.

Not surprisingly, the study’s findings engendered some controversy, which prompted the authors to reanalyze their data, collected from 408 males with varying levels of marijuana use, who were followed from their teens into their 30s.

Now, an American Psychological Association press release that accompanied the initial findings in August contains an editor’s note explaining why those findings aren’t quite correct:


PLOS ONE issues editor’s note over controversial chronic fatigue syndrome research


After a request for the original data was denied, PLOS ONE editors have flagged a 2012 sub-analysis of a controversial clinical trial on chronic fatigue syndrome with an editor’s note.

The editor’s note — which reads like an Expression of Concern — reiterates the journal’s policy that authors make data and materials available upon request, and notes that staff are following up on “concerns” raised about the study.

There have been numerous requests for data from the “PACE” trial, as the clinical trial is known, which the authors say they have refused in order to protect patient confidentiality. On November 13, James Coyne, a psychologist at the University Medical Center, Groningen, submitted a request for the data from the PLOS ONE paper to King’s College London, where some of the authors were based. According to Coyne’s WordPress blog (he also has a blog hosted by PLOS), the journal asked him to let them know if he “had any difficulties obtaining the data.” He did — KCL denied the request last Friday (the whole letter is worth reading):

The university considers that there is a lack of value or serious purpose to your request. The university also considers that there is improper motive behind the request. The university considers that this request has caused and could further cause harassment and distress to staff.

Last author Peter White at Queen Mary University of London, UK, told us the journal had not asked them to release the data, but he would work with PLOS to address any questions:

We understand PLOS One are following up concerns expressed about the article, according to their internal processes. We will be happy to work with them to address any queries they might have regarding the research.

Here’s the editor’s note for “Adaptive Pacing, Cognitive Behaviour Therapy, Graded Exercise, and Specialist Medical Care for Chronic Fatigue Syndrome: A Cost-Effectiveness Analysis,” in full:


Authors lied about ethics approval for study on obesity, depression


Obesity has retracted a study that suggested overweight people may be less depressed than their slimmer counterparts in cultures where fat isn’t stigmatized, after realizing the authors lied about having ethical approval to conduct the research.

The authors claimed their research protocol had been approved by Norwegian and Bangladeshi ethical committees, but, according to the retraction note, part of the study “was conducted without the required approval of the university ethics board.” The journal’s managing editor told us that there is no evidence that there was harm to the study subjects.

Here’s more from the retraction note for “In Bangladesh, overweight individuals have fewer symptoms of depression than nonoverweight individuals:”


Diederik Stapel now has 58 retractions

Social psychologist Diederik Stapel has notched his 58th retraction, after admitting he fabricated data in yet another article.

He’s holding onto his 4th place spot on our leaderboard.

This latest retraction is for “Correction or comparison? The effects of prime awareness on social judgments,” published in the European Journal of Social Psychology. As usual for Stapel, this paper has been retracted because he fabricated data.

Here’s the note:
