Journal adds concern notice to paper by psychologist Jens Förster

A social psychology journal has added an expression of concern to a paper by prominent social psychologist Jens Förster, whose work has been subject to much scrutiny.

This is the latest development in a long-running saga involving Förster. The 2012 paper, published in the Journal of Experimental Social Psychology, had been flagged in a 2015 report describing an investigation into Förster’s work; the report concluded the paper likely contained unreliable data. Several other papers that received similar designations in that report have either been retracted or received expressions of concern (EoCs).

The (paywalled) notice provides a lengthy explanation for why the journal chose to add an EoC, rather than retract the paper, as the University of Amsterdam had recommended. Here is an excerpt:

This Expression follows a request to retract the article received in late 2015 from the Rector Magnificus of the University of Amsterdam, where the underlying research was conducted, based on analyses of its published data provided in a report (Peeters, Klaassen, & van de Wiel, 2015a,b,c,d). This report judged the evidence for low scientific veracity of the publication to be “strong” according to their criteria for two tests of the excessive linearity of three means in multiple measures of multiple studies in the paper. That is, even assuming that the means for three conditions are perfectly linear in a population, the sample means are said to be unusually linear in pattern, in a way that calls into question the veracity of the data.

However, looking only at the article and its analysis, and having considered a number of exchanges on points of fact between the report authors, the article authors and others, my judgment is that the case for retraction is only at a borderline level. It warrants instead an Expression of Concern.

The notice also provided more detailed comparisons between the methods used by Förster and his co-author and those used by the authors of the 2015 report.

“When any Worx looks typical to you: Global relative to local processing increases prototypicality and liking” has been cited six times since it was published in 2012 (none since 2015), according to Clarivate Analytics’ Web of Science.

A long process

We obtained a draft copy of the EoC dated February. The final version contains slight wording changes; it adds, for example, the phrase “the case for retraction is only at a borderline level.” The published version also describes the nature of the statistical analysis differently. For instance, the draft EoC says:

However, this criterion at its most conservative level is associated with a type I error of .085, which is not a viable standard for strong suspicion in a journal which publishes over 100 articles a year.

The final version says:

However, the upper bound for the type I error probability in a single experiment is .0809, and thus the probability of 4 errors in 12 experiments is .0124, which to me is not a viable standard for strong suspicion in a journal which publishes over 100 articles a year.
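
For readers who want to check the arithmetic in that sentence: treating each of the paper’s 12 experiments as an independent trial with a false-positive probability of .0809, the chance of four or more flagged results is a binomial tail probability. The short sketch below is ours, not the journal’s, and assumes exactly that independence model; it reproduces the notice’s figure up to rounding.

    from math import comb

    p = 0.0809  # upper bound on the per-experiment type I error probability
    n = 12      # experiments in the paper
    k = 4       # experiments flagged as excessively linear

    # P(X >= k) for X ~ Binomial(n, p): chance of at least k false positives
    tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
    print(f"{tail:.4f}")  # prints ~0.0125, in line with the .0124 in the notice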

Journal editor Roger Giner-Sorolla at the University of Kent explained some of the changes:

Comparing the draft to the final version, my additions were sometimes aimed at clarifying things and sometimes in response to communications with the various parties. For example, in 1c I added a sentence answering the point that some of the individual tests were taken as anomalous. The point about not being enough evidence for retraction was a further explanation of my decision, in light of the request having originally been for retraction. Those are the main differences I can find, and I also edited some more to make clearer references, such as “the report authors” changing to “PKW.”

He also explained why it took months to release the final version:

I sent a draft version out for comment to the authors and a number of interested parties in the Netherlands on Feb 17 this year. On March 28, after processing and responding to the comments, I sent the final Expression of Concern to Elsevier for publication. Any delay since then has been a matter of typesetting, proof checking and coordinating the large amount of supporting material to be hosted online, refining the references as they related to the actual documents to be included, and in particular, getting further permissions from various parties to host material authored by them online.

He added:

The retraction request was disputed from the start by the authors, so a lot of fact finding and decision making had to go on, and many eyes had to look at and approve each stage of the process. I took over the case from the previous editor-in-chief in January 2016. Just as we were ready to make a decision in September 2016, the authors contacted us with a new criticism of the report, which had to be evaluated just as fully. I had the final draft expression of concern ready in January of this year, and in the interest of transparency I wanted the major documents of the discussion to be openly available. Getting those together and approved by all parties took still more time. I wish this could have happened more rapidly, but being open and careful about the decision process and background materials has to come first.

(Incidentally, the final, published version of the notice was temporarily removed; it’s since been restored. Giner-Sorolla told us he didn’t ask for it to be removed.)

He explained the journal’s logic for why it chose not to retract the paper:

Looking at the evidence for the JESP paper in question, individually, I thought it was at a marginal level.

First, the report authors used two methods, delta-F and V, to calculate whether the data were unusually regular in their linear pattern. In the case of the JESP paper in question, only the V method met their own criteria for “strong” indications of low veracity.

Second, this V method is a technique that the report authors developed themselves but hasn’t yet been peer reviewed. It gives fairly arbitrary cut-offs for “strong” indications, which under the best circumstances could be produced by chance by 1 in 12 to 1 in 20 papers (Peeters et al. report, table on p. 3). I didn’t think that by itself this was a strong enough standard to warrant outright retraction. You can compare this to one of the papers that was retracted, published by Social Psychological and Personality Science in 2012, where the delta-F method indicated that the results would only be seen by chance once in 33 million papers (Peeters et al. report, p. 39).

Third, it turned out that one of the studies that was too linear in its pattern was linear in a way that didn’t support the authors’ hypothesis, unlike the others in the paper. While there was some disagreement about whether this should be counted, excluding this study would have also meant there was no longer “strong” evidence by the report authors’ own standards.

Giner-Sorolla added:

I’m open to reconsidering the decision if additional evidence surfaces, either statistical or pertaining to what went on at the University of Amsterdam. Because the original data records have been lost, we have been operating in the dark, both statistically and procedurally. Unless we find out more, the EOC will have to stand. But I hope the amount of material I’ve chosen to provide with this EOC allows people to make up their own mind about the studies and the body of work in question.

Indeed, a representative of the journal sent us correspondence cited by the notice, which is included as supplementary material with the EoC.

Stakeholders react

A representative of the University of Amsterdam, which requested the journal retract the paper, told us:

…the University of Amsterdam has published all materials pertaining to this online, including the report, analysis, response and so forth. Following the report, the journals in which eight articles in the first category (strong statistical evidence for low veracity) were published received a request to retract the articles or to consider doing so. The decision to do so, or to publish an EoC, subsequently rests with the journals.

We asked one of the authors of the 2015 report, Carel Peeters of VU University Medical Center, to comment on the journal’s decision to issue an EoC; he told us:

We had concluded in our report, commissioned by the University of Amsterdam (Dr. Foerster’s previous employer), that this publication showed ‘strong statistical evidence for low scientific veracity of the reported data’. Prof. Giner-Sorolla, following COPE guidelines, posed several questions to better understand the debate regarding our report and its conclusions. A further enquiry, containing remaining questions, was made to us in December of 2016. Prof. Giner-Sorolla sent all interested parties a draft EoC on February 17th, 2017. All parties were given the opportunity to reply, which we did. We are not aware if the other parties (i.e., Jens Foerster and relevant coauthors) replied to the draft. Our comments were incorporated in the final draft. So, indeed it took a while for this EoC to be issued. But we fully agree with the diligence and care with which the Editor approached this sensitive issue.

Peeters added:

We have always viewed our report as a contribution to the debate regarding the scientific value of publications by Dr. Foerster. We have never made recommendations regarding procedural decisions such as retractions. Not to the board of the University of Amsterdam, nor to the Editors of the journals involved. We feel that such decisions must lie with third parties. Hence, in our responses to the inquiries by Prof. Giner-Sorolla we have always clearly stated that our answers were intended to provide clarification, not to influence the final decision. Irrespective of the labeling (retraction notice or EoC) we stand by the conclusions of our report.

Among the eight papers that Peeters and his colleagues concluded had “strong evidence for low veracity of the reported data,” three have been retracted (1, 2-3). Two others have received expressions of concern (in addition to the latest notice); the American Psychological Association has announced it won’t retract the remaining two papers, despite the university’s recommendation.

We’ve contacted Förster, who has a position at Ruhr-Universität Bochum. In 2015, he turned down a prestigious professorship, citing the toll the investigation had taken on him.

Hat tip: Rolf Degen
