Matlab mixup sinks Journal of Neuroscience paper

A team of neuroscientists at the University of Oregon and the University of California, San Diego (UCSD) has retracted a paper from The Journal of Neuroscience after realizing their analytic code contained an error.

The authors state in the notice that their conclusion remains accurate after correcting the mistake in the program Matlab. However, the paper — which examined the role of neuronal oscillations in working memory — still contained “some findings that we no longer believe to be robust.”

It’s a very useful notice:

At the request of the authors, The Journal of Neuroscience is retracting “Induced Alpha Rhythms Track the Content and Quality of Visual Working Memory Representations with High Temporal Precision” by David E. Anderson, John T. Serences, Edward K. Vogel, and Edward Awh, which appeared on pages 7587–7599 of the May 28, 2014 issue.

We regret that there was an error in the analytic code used to compute oscillatory power in our article. Specifically, there was a matrix transposition error in the code (see abs(hilbert(eegfilt(data,Fs,f1,f2))).^2 on page 7588, right column, end of second full paragraph). The data matrix was oriented correctly for the call to eegfilt, but the output of the call to eegfilt was not correctly transposed in the standard Matlab format before passing into the built-in Matlab ‘hilbert’ function, as the EEGLAB function ‘eegfilt’ and the built-in Matlab function ‘hilbert’ require the data matrix to have different dimensions in order to operate correctly across time. Fortunately, this error had a relatively modest impact on the overall pattern of data, because even though the imaginary component of the complex modulus of the hilbert transformed data was incorrect, the real component was not affected and averaging across trials largely washed out the perturbations in our estimates of instantaneous power. Thus, because our analysis was focused on the spatial distribution of power, the broad empirical patterns that we reported remain intact even after the matrix transposition was corrected. The spatial distribution of alpha power across electrodes still covaries with the orientation of the remembered stimulus, and thus still enables time-resolved tracking of the contents of visual working memory (i.e., Fig. 2A,C), and this neural activity is sensitive to whether or not the observer is voluntarily maintaining the memorandum in working memory. Nevertheless, the corrected analysis no longer shows a robust correlation between behavioral mnemonic precision and the tuning properties of the neural activity as reported in the original article. Thus, we chose to retract the paper because it was evaluated by reviewers in the context of some findings that we no longer believe to be robust.
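
For non-Matlab readers, here’s a minimal sketch of what went wrong and of the one-character fix, reusing the call from the notice. (The fake data, sampling rate, and band edges are ours for illustration; eegfilt comes from EEGLAB, and hilbert requires Matlab’s Signal Processing Toolbox.)

    % hypothetical setup: 20 channels x 2 s of fake EEG, alpha band
    Fs = 250; f1 = 8; f2 = 12;
    data = randn(20, Fs * 2);

    % buggy: eegfilt returns a channels x time matrix, but Matlab's
    % hilbert transforms each COLUMN, i.e. it assumes time runs down
    % the rows
    power_bad  = abs(hilbert(eegfilt(data, Fs, f1, f2))).^2;

    % corrected: a single apostrophe transposes to time x channels first
    power_good = abs(hilbert(eegfilt(data, Fs, f1, f2)')).^2;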

We should note that not all Journal of Neuroscience retractions are this useful or detailed. The journal’s policy has been to give no reason for retractions unless the authors provided one. The journal got a new editor as of January 1 this year, so we’ve reached out to her to see if such complete explanations for retraction will become the norm. We’ll update if we hear back.

We also got in touch with author Edward Awh, who said that the whole team agreed to retract. He also told us:

We have ongoing projects that already replicate the basic features of the new analytic approach, one of which is under review. Thus, it may be that those studies will instead become preliminary data for other reports that will more fully develop the individual studies reported in the retracted paper.

Hat tip: Rolf Degen 

14 thoughts on “Matlab mixup sinks Journal of Neuroscience paper”

    1. I guess that they eventually tried something that gave them results that they weren’t expecting.

      Courses on scientific computing should put more focus on detecting bugs. One simple method is to generate a data set from the fitted results, refit it, and check that the new estimates match the original fit reasonably closely. Doing that a number of times is known as parametric bootstrapping, and it also yields standard errors that should be close to those of the original model fit.
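
      For instance, a minimal sketch with a toy linear model (all the numbers here are made up):

        rng(1);                          % reproducibility
        X = [ones(100, 1), (1:100)'];    % toy design matrix
        beta_hat = [2; 0.5];             % pretend these came from the
        sigma_hat = 1;                   % original fit

        nboot = 1000;
        boot = zeros(nboot, numel(beta_hat));
        for b = 1:nboot
            y_sim = X * beta_hat + sigma_hat * randn(100, 1);  % simulate
            boot(b, :) = (X \ y_sim)';                         % refit
        end

        mean(boot)   % should land close to beta_hat'; big gaps flag a bug
        std(boot)    % parametric-bootstrap standard errors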

  1. Kudos to the authors. This is a gut-wrenching case. The error was (if I understand it correctly) literally a missing apostrophe, since that’s how you do a matrix transpose in Matlab.

    So the ‘line’ between a high-impact paper and a retraction was, in this case, a very small line: ‘

  2. In the second para above:

    “The authors state in the notice that their conclusion remains accurate after correcting the mistake in the program Matlab.”

    might be read as suggesting that the problem was a bug in Matlab – not in the authors’ code.

    1. “It’s not a bug, it’s a feature” (heheh.)
      This is the best retraction notice I’ve ever read, and I even understood most of it.

  3. That was how I read both it and the title of the Retraction Watch post. It should read: “matrix transposition mixup sinks…”. Matlab had nothing to do with it.

  4. Yep, and this is why I build and test my code one function at a time, and make sure that when things are nested I know exactly what is going on (since it isn’t always obvious when stringing functions together in your head).
    I imagine hilbert(eegfilt(data,Fs,f1,f2)) would have produced some kind of warning…
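
    A per-function sanity check makes the orientation issue visible even if nothing warns: a pure sinusoid should have near-constant instantaneous power, and only one orientation delivers it. A minimal sketch (the sampling rate and frequency are arbitrary; hilbert needs the Signal Processing Toolbox):

      Fs = 250;
      t = (0:2*Fs-1) / Fs;                 % two seconds of samples
      data = repmat(cos(2*pi*10*t), 2, 1); % 2 channels x time, as eegfilt returns

      pow_wrong = abs(hilbert(data)).^2;   % hilbert runs down columns, so this
                                           % transforms across channels
      pow_right = abs(hilbert(data')).^2;  % transposed: time down the rows

      % the correctly oriented power is nearly flat; the other oscillates
      fprintf('untransposed: std of power = %.3f\n', std(pow_wrong(1, :)));
      fprintf('transposed:   std of power = %.3f\n', std(pow_right(:, 1)));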

  5. This is yet another reason why all in-house code should be made available as supporting material when a paper is submitted to a peer-reviewed journal, and provided when it is published.

    1. In that specific case, the faulty line of code was explicitly written out in the paper. That’s what the retraction notice says…

      1. R Markdown allows for precisely that now, enveloping everything (data, the exact analysis code, the results, the text, and the figures) in one markdown document, making it easy for reviewers and subsequent readers to check the data and analyses, apply different analyses, and so on. When knitted (with knitr), it can output to many document types (HTML, PDF, LaTeX, even Word—although why anyone would want the last is beyond me). I have written to the chief editors of some top journals in psychology suggesting that R Markdown should be the preferred submission format and preferred supplementary material format, but have yet to hear anything back.

        1. I have written to the chief editors of some top journals in psychology suggesting that R Markdown should be the preferred submission format and preferred supplementary material format, but have yet to hear anything back.

          I’m unsurprised, given that there’s no obvious way to get it into an XML-based production stream.

  6. It is usually the case that the coding is left to the student authors, while the PI just looks at the results and plots and does not go over the code. That’s where a student’s lack of experience can come back to bite you.

  7. Re: FooBar

    Then more shame on the PI. That’s negligence by another name. If you teach a craft, you have to guide and monitor the student every step of the way. Same thing in science. A PI who lightly skips this step is a sloppy master.
