Dirk Smeesters, the former psychology professor at Erasmus University found to have committed misconduct, has had another paper retracted.
Here’s the notice:
The following article has been retracted by the Editor and publishers of Psychological Science:
Liu, J. (E.), Vohs, K. D., & Smeesters, D. (2011). Money and mimicry: When being mimicked makes people feel threatened. Psychological Science, 22, 1150–1151. doi:10.1177/0956797611418348
The retraction follows the results of an investigation into the work of author Dirk Smeesters. The Smeesters Follow-Up Investigation Committee of Erasmus University Rotterdam has determined the following in regard to the retracted article:
The paper indicates that the variable Liking of the confederate consists of two items with α = 0.91, whereas a reconstruction of the data proves that three items were used with α = 0.90. The authors state that respondents were randomly assigned to the different experimental conditions. However, a test of independence of gender with the experimental conditions shows that this is not the case (p < 0.001). In a response, Smeesters acknowledged this observation. The Committee considers this to be a major methodological mistake that can affect the interpretation of the paper, referring to Criterion 7: committing imputable inaccuracies when undertaking research. As Smeesters was in charge of data collection, the Committee holds him solely responsible. The Committee recommends retraction of this paper. (Smeesters Follow-Up Investigation Committee, 2014, p. 7)

The committee found no blame on the part of Smeesters’s coauthors, who have seen and agreed to this retraction.
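The notice does not say which test of independence the committee ran; a standard choice for a gender-by-condition cross-tabulation is Pearson's chi-square test. As a minimal sketch, with entirely made-up counts (not the study's data), such a check looks like this:

```python
from math import erfc, sqrt

def chi2_independence_2x2(table):
    """Pearson chi-square test of independence for a 2x2 table.

    Returns (statistic, p_value). With df = 1, the chi-square
    survival function is erfc(sqrt(x / 2)).
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, observed_row in enumerate(table):
        for j, obs in enumerate(observed_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat, erfc(sqrt(stat / 2))

# Hypothetical men/women counts per condition (illustration only).
counts = [[40, 10],   # condition A: 40 men, 10 women
          [15, 35]]   # condition B: 15 men, 35 women
stat, p = chi2_independence_2x2(counts)
print(f"chi2 = {stat:.2f}, p = {p:.2g}")
```

With counts this lopsided, p falls well below 0.001, which is the kind of result that contradicts a claim of random assignment: under genuine randomization, gender should be (approximately) independent of condition.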
The study has been cited six times, according to Thomson Scientific’s Web of Knowledge.
The university urged a total of seven retractions. This is the fourth.
Hat tips: Rolf Degen, Jelte Wicherts
This is what should be done in all established cases of misconduct. The university takes responsibility to look beyond the articles in the original complaint and tries to determine whether the field can still trust other papers by the accused. Regardless of what caused the experimental groups to differ so strongly on gender in this particular study, we should not trust the results, and hence the paper should be retracted. Big universities in the US can learn a lot from this. For instance, Harvard University has not given any information about the work by Marc Hauser that was not retracted or corrected, and the field still has no clue whether it can trust any of his other papers (we don’t even know to what degree he was involved in data collection for other papers). Also, the Univ. of Michigan Ann Arbor and the Univ. of North Carolina Chapel Hill have not provided any information about the Lawrence Sanna investigation. So the field is kept in the dark about what he did and whether the investigation covered other papers authored by him. If science is at stake, the lawyers should leave the room, and the assessment of studies need no longer be concerned with the legal issue of misconduct (establishing wilful deceit etc.) but rather with the question of whether the data are any good (a scientific assessment).
This is also a study in the “social priming” tradition, which has been publicly criticized by Daniel Kahneman. There are more retractions to come from that corner.
http://www.nature.com/news/nobel-laureate-challenges-psychologists-to-clean-up-their-act-1.11535
As little as I like what Smeesters has done, doesn’t it strike anyone as a bit strange that just because a scale consisted of three instead of the stated two items and gender and experimental condition were not as independent as claimed (which, after all, can happen even with random assignment), a paper is retracted instead of corrected through an erratum? It looks like the committee (and the journal) pulled out a bazooka when a dagger would have sufficed. After all, this appears NOT to be a case of data fabrication or outright fraud, but rather as a case of sloppy reporting.
“this appears NOT to be a case of data fabrication or outright fraud, but rather as a case of sloppy reporting.”
It seems to me that what they cite in the retraction is what they can _prove_ happened. It might not be all that they suspect. If there are significant problems with a dataset, the set may be entirely made up. If the authors can prove that the set is legitimate, a correction may be all that is required. But in this case the claim is that there was systematic misconduct.
True, but think about this from the perspective of Psychological Science. The journal has a broad reputation for publishing flashy findings. So they are now going overboard in this case. The guy is already damaged goods, so why not beat him to a pulp just to show that they are really serious about this issue?…
Surprised