Retraction Watch

Tracking retractions as a window into the scientific process

Prominent food researcher retracts paper from JAMA journal, replaces it with multiple fixes


Brian Wansink

Earlier this week, we reported that high-profile food researcher Brian Wansink — who’s faced months of criticisms about his research — had issued his second retraction. On Thursday, he issued his third.

The retracted paper — a 2012 research letter in the Archives of Pediatrics and Adolescent Medicine, now JAMA Pediatrics — reported that children were more likely to choose an apple over a cookie if the apple included an Elmo sticker. The same day the paper was retracted, Wansink and his co-authors published a replacement version that includes multiple changes, including to the methodology and the results.

The retraction notice lists the mistakes, which the authors say they made “inadvertently”:

we inadvertently provided an incorrect description of the study design and sample size, used an inadequate statistical procedure, and presented a mislabeled bar graph. This study explored the association of branding apples with cartoon characters with their selection among 208 students aged 8 to 11 years. The data were collected as an independent part of a larger semester-long project by Cornell University personnel, under the direction of a laboratory manager, from March to June 2008; the overall study was under the supervision of the first author (B.W.). These errors were recently discovered in the course of rechecking the data and the labeling of the Figure following a letter we received from a reader on February 12, 2017.

The notice lists multiple changes to the description of the methodology — for instance, children were offered either a cookie or an apple, not both — and explains why certain individuals were excluded. The authors add that they re-ran the data using a different statistical test (rather than the one they “inappropriately” used originally) and corrected the figure.
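The notice does not reproduce the analysis itself, but for readers unfamiliar with the kind of re-analysis involved, a minimal sketch may help. A common test for comparing selection frequencies across two conditions (e.g., branded vs. plain apples) is a Pearson chi-square test of independence. The counts below are made up purely to show the mechanics; they are not the study's data, and this is not necessarily the test the authors used.

```python
# Illustrative sketch only: hypothetical counts, not the study's data,
# and not necessarily the authors' actual procedure.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: branded apple vs. plain apple; columns: apple chosen vs. cookie chosen.
counts = [[30, 20],
          [18, 32]]

stat = chi_square_2x2(counts)
# The critical value for df = 1 at alpha = 0.05 is about 3.84; a statistic
# above it would indicate an association between branding and selection.
print(f"chi-square = {stat:.2f}")
```

The general point is that the choice of test matters: as commenters note below the post, re-running the same counts under a different (arguably more appropriate) procedure can move a result across the conventional significance threshold.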

The notice concludes:

We confirm that there are no other errors or omissions in the original article. The overall conclusions of the study as originally reported remain after correcting for these errors: branding an apple with a cartoon character was associated with an increase in student selection of an apple.

We regret these errors as well as the confusion they caused the readers and editors of JAMA Pediatrics. We have requested that the editors retract and replace the original Research Letter. The text and the Figure have been corrected and replaced online. We have also added a new Reproducible Research Statement with a link to the data, analysis scripts, and output files in an archive of the Cornell Institute for Social and Economic Research. An online supplement has been added that includes a PDF version of the original article with the errors highlighted and a PDF version of the replacement article with the corrections highlighted.

“Can Branding Improve School Lunches?” has been cited 21 times since it was published in 2012, according to Clarivate Analytics’ Web of Science.

The appendix of the replacement paper includes a copy of the original, retracted version with the errors highlighted; in that copy, the entire results section is marked.

We contacted Wansink, and received a statement from the Food and Brand Lab Team at Cornell:

At the time this was originally published, Research Letters to JAMA Pediatrics were limited to 600 words and to only one figure or table. This accounts for the brevity of the original article. While we regret the mistakes in the original publication, having the opportunity to reanalyze and republish this research allowed for corrections to be made and for more clarifying detail to be added to the method section. It also offered the opportunity to use a more sophisticated analysis procedure which showed the results to be even stronger than in the original paper.

The statement concluded:

The Food and Brand Lab has implemented an extensive set of Standard Operating Procedures (SOP) for research that is being used by all Lab researchers to ensure that proper procedure is adhered to and that tracking of all research activity is appropriately documented. This SOP is designed to help better coordinate, standardize, and track a wide range of researchers who have been trained in a wide range of disciplines and traditions.

Journal editor Frederick Rivara at the University of Washington told us the journal didn’t consider simply retracting the article, and would have published the now-revised version if it had been the original submission.

The notice says the researchers decided to check the data after receiving a letter on February 12; on February 15, researcher Nick Brown posted a lengthy critique of the paper, pointing out some inconsistencies in the data and methodology.

We asked Brown what he thought about the retract/replace notice; he told us:

Having read the cover letter that accompanies the retraction, my reaction would be to question just how chaotic a lab has to be to “inadvertently” submit an incorrect report of the study design and sample size to a journal with an impact factor of over 10. I also wonder how a misleading figure “inadvertently” came to replace a considerably less misleading one that was apparently in a draft of the article less than three months before it was published, as detailed in my blog post…

He added:

Given this apparent degree of chaos, I wonder how many other “inadvertent” errors are waiting to be discovered in the entire research output of this lab, and what the impact of those errors might be for the conclusions of those studies, some of which have major implications for public policy and have been behind the spending of substantial amounts of tax dollars on intervention programs.

Earlier this week, Wansink retracted another paper that had been critiqued by Brown and two co-authors in a PeerJ paper. Wansink has faced heavy scrutiny since November 2016, when a blog post he wrote prompted a backlash from readers who accused him of using problematic research methods to produce questionable data. In April, an internal review by Cornell University concluded that Wansink had made numerous mistakes in his research, but that he had not committed misconduct.

We’re seeing more instances in which journals issue retract-and-replace notices for papers affected by honest error (including one from JAMA issued this week).


Comments
  • herr doktor bimler September 22, 2017 at 10:07 pm

    “Can Branding Improve School Lunches?”
    It probably depends on who is branded.

  • Nick September 23, 2017 at 10:15 am

    The replacement article seems to have problems of its own, notably the fact that the data are not consistent with the reported method, and if this issue is fixed in what I believe to be a reasonable way, the authors’ principal claim no longer reaches conventional levels of statistical significance (p = .149 instead of p = .023). See http://steamtraen.blogspot.com/2017/09/problems-in-cornell-food-and-brand-labs.html

  • Mary Kuhner September 24, 2017 at 9:49 pm

    I note that the retract-and-replace does not address a key issue brought up in Nick’s original critique, namely that the paper says these were 8-12 year olds but multiple descriptions of this work in Wansink’s books refer to them as daycare children, and the original paper used the word “preliterate” of them. I don’t think we know how old they were, if indeed any children were ever tested at all.

    The study design is bonkers in all iterations. If you want to know whether putting branded, nonbranded, or no stickers on apples leads to a change in apple consumption, you should vary the stickers on the apples, keeping everything else constant. Varying the stickers on the cookies at the same time just muddies things up. I guess it offers more chances to p-hack. And the person doing the original writeup consistently conflates “unbranded” and “no sticker” which suggests that they did more conditions than they actually wanted (p hacking again?)

    I am strongly of the opinion, based on Nick’s follow-up article, that the paper needs to be retracted again, and should stay retracted. Giving one retry was already rather generous given the scope of the problems. Since the researchers’ own Excel files show that their data are not what they say they are, it seems unreasonable to expect valid results at this point.

  • Tom September 26, 2017 at 7:27 am

    At my institute, we’ve been discussing sustainable research for years now. I have no idea what precisely went on in this particular lab, but everything I read from it – with the PI’s own blog post at the center – makes me question the legitimacy of their work.
