Study of a “nudge” to use hand sanitizer retracted

Image by Adriano Gadini from Pixabay

A group of researchers in the United States and China have retracted their 2018 paper on hand hygiene, admitting that they can’t account for “data anomalies” in their work.

The article in question, “The decoy effect as a nudge: Boosting hand hygiene with a worse option,” appeared in Psychological Science last May. Meng Li, of the Department of Health and Behavioral Sciences at the University of Colorado Denver, and Hui Chen, of the Chinese Academy of Sciences, reported results from experiments designed to increase the use of hand sanitizer in the workplace through the use of a “decoy” bottle:

This article provides the first test of the decoy effect as a nudge to influence real-world behavior. The decoy effect is the phenomenon that an additional but worse option can boost the appeal of an existing option. It has been widely demonstrated in hypothetical choices, but its usefulness in real-world settings has been subject to debate. In three longitudinal experiments in food-processing factories, we tested two decoy sanitation options that were worse than the existing sanitizer spray bottle. Results showed that the presence of a decoy, but not an additional copy of the original sanitizer bottle in a different color, drastically increased food workers’ hand sanitizer use from the original sanitizer bottle and, consequently, improved workers’ passing rate in hand sanitation tests from 60% to 70% to above 90% for 20 days. These findings indicate that the decoy effect can be a powerful nudge technique to influence real-world behavior.

Shortly after an in-press version of the article appeared online, however, Leif Nelson, of UC Berkeley’s Haas School of Business — and a founder of the Data Colada blog — read the study and became suspicious about the plausibility of the findings.

We won’t re-hoist all of the red flags here; for that, read the account on Data Colada. But the issues were compelling enough that the editor of the journal agreed action was needed.

In December, the journal issued a lengthy correction stating, in part, that:

The original online-first version of this article included some errors that are now being corrected. These changes do not affect the statistical results or the conclusions of the experiments.

The journal also published a detailed expression of concern. That notice, which thanks Nelson et al. for their efforts, lays out the various flaws with the article, including “uncertainty regarding the provenance of the data” and “peculiarities” with the results.

However, the correction/expression of concern didn’t satisfy the Data Coladans, who wrote:

Even after the correction, and the clarifications of the Expression of Concern, we still believe that these data do not deserve the trust of Psychological Science readers.

Now, more than a year after publication of the original article, the journal is retracting the work.

According to the notice:

The following article has been retracted at the request of the first two authors (Meng Li and Yan Sun): Li, M., Sun, Y., & Chen, H. (2019). The decoy effect as a nudge: Boosting hand hygiene with a worse option. Psychological Science, 30, 139–149. doi:10.1177/0956797618761374

Li and Sun notified the Editor as follows:

In December 2018, we requested that an Expression of Concern be issued for our article because of anomalies in the data (brought to our attention by Leif Nelson, Frank Yu, Uri Simonsohn, and two anonymous researchers) and because we were not able to recover records of the messages containing the original data files that the factories sent to the third author (Chen), who was the sole point of contact with the factories where the data were collected. Since then, we have made many attempts to obtain more information about the data collection process. Unfortunately, Chen has not provided additional information, and we have not found a convincing alternative explanation for the data anomalies. While there is no direct proof that the data were tampered with, our faith in the data is substantially reduced, and we judge that it is necessary to retract the article.

Simonsohn told us that he is “quite satisfied with the paper getting retracted.”

Like Retraction Watch? You can make a tax-deductible contribution to support our work, follow us on Twitter, like us on Facebook, add us to your RSS reader, sign up for an email every time there’s a new post (look for the “follow” button at the lower right part of your screen), or subscribe to our daily digest. If you find a retraction that’s not in our database, you can let us know here. For comments or feedback, email us at [email protected].

2 thoughts on “Study of a “nudge” to use hand sanitizer retracted”

  1. While it’s right that this paper should be retracted (as no one seems able to trace the original data), I’m a bit concerned with the data analysis by Simonsohn, Nelson, and Yu, which I think is illogical. They correctly state that the final digit of a set of genuine measurements with normal random errors ought to be uniformly random; it should have an equal chance of being anything from 0 to 9. The (unstated) alternative hypothesis is that if the data have been made up, then the distribution may not be uniform. But the final digit of random data made up using any half-way respectable random number generator (even something as simple as Excel’s RAND function) will also be uniformly distributed.

    So, by common sense, if we have a non-uniform distribution, then either the data were made up by hand — someone sat there thinking up ‘random’ numbers and wasn’t very good at it (surely any sensible criminal would use Excel? It’s so easy) — or we’re looking at real data with some underlying issue in how they were collected.

    For Problem 3, the last digit in Experiment 3, I have this image of someone sitting watching a balance that tends to creep upwards (many lab balances creep exponentially towards some end point). They watch the next-to-last digit until they think it’s stable, pause to check, and write down the whole number… meaning that there is a roughly constant delay between when the next-to-last digit turned over and when they read the whole thing… so the last digit is now non-uniformly distributed (because it depends on their typical, roughly constant delay and the roughly constant rate of creep). So we have a simply human hypothesis about how the data can end up systematically deviating from uniformity without actually being invalid (the weigher perhaps knew that the last digit was pretty meaningless anyway) or deliberately biased. This is only one hypothesis.

    It doesn’t excuse the loss of the raw data, nor does it say that the data were genuine, nor that this human way of collecting data was good enough. But it is not logical to reject data as obviously falsified if the obvious way to falsify data would pass the same test as true data; one should look a little deeper. Still, hats off to them for spotting a situation that couldn’t be backed up by producing the raw numbers.
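    The core of the commenter’s argument — that a last-digit uniformity check cannot separate genuine noisy measurements from numbers fabricated with an ordinary random number generator — is easy to demonstrate. The sketch below is purely illustrative and is not the Data Colada analysis; the simulated “weights,” sample sizes, and noise levels are all assumptions made up for the example.

    ```python
    import random
    from collections import Counter

    random.seed(42)  # fixed seed so the simulation is reproducible

    def last_digit_counts(values):
        """Count the final digit of each value when formatted to two decimals."""
        return Counter(f"{v:.2f}"[-1] for v in values)

    def chi_square_uniform(counts, n):
        """Chi-square statistic against a uniform distribution over digits 0-9."""
        expected = n / 10
        return sum((counts.get(str(d), 0) - expected) ** 2 / expected
                   for d in range(10))

    # "Genuine" measurements: a true value plus normal random error
    # (hypothetical weights; the parameters are invented for illustration).
    genuine = [50.0 + random.gauss(0, 0.5) for _ in range(10_000)]

    # "Lazy fabrication": values drawn straight from a standard RNG,
    # the easy route the commenter says any sensible fabricator would take.
    fabricated = [random.uniform(49.0, 51.0) for _ in range(10_000)]

    # With 9 degrees of freedom, the 5% critical value is about 16.9;
    # both statistics should typically land near the chi-square mean of 9,
    # so the test cannot tell these two data sources apart.
    print(chi_square_uniform(last_digit_counts(genuine), 10_000))
    print(chi_square_uniform(last_digit_counts(fabricated), 10_000))
    ```

    Both samples pass the uniformity check, which is exactly the commenter’s point: a non-uniform last digit signals either hand-typed numbers or a quirk in how real data were recorded, not fabrication per se.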

  2. I am Meng Li, the lead author of the paper.

    I (along with the 2nd author) decided to retract the paper after an extensive look into all the data and records, and months of delving into all relevant information.

    If anyone is interested in more details: when the Expression of Concern (which we initiated) came out in December 2018, I wrote a detailed response to the first Data Colada blog post to lay out everything I had discovered and knew about the data collection process at the time. See the blog post here: https://openmethods.wordpress.com/2018/12/05/response-to-datacolada/

    It is still a mystery to me why the data look the way they do, as the 3rd author (a student of the 2nd author at the time) was the sole contact person with the factory where the data were collected. But after much deliberation, I (along with the 2nd author) decided to retract the paper because our faith in the data has been significantly reduced (https://openmethods.wordpress.com/2019/05/29/upcoming-retraction/).

    I informed RetractionWatch about the retraction about a week before it came out so that the issues with the data and the retraction would be transparent and widely distributed. Happy to answer any questions through email: [email protected]
