Retraction Watch

Tracking retractions as a window into the scientific process

Negligence by stressed-out postdoc led to retraction of high-profile paper, supervisor says

with 11 comments

The timing was tight, but Sergio Gonzalez had done it. Gonzalez, a postdoctoral researcher at the Institute for Neurosciences of Montpellier (INSERM) in France, had a paper accepted in a top journal by the end of 2015, just in time to apply for a small number of highly sought-after permanent research positions that open up in France each year.

If Gonzalez had missed the January deadline for this system of advancement, known as concours, he would have had to wait until the following cycle to apply.

Once his paper was accepted by the Journal of Clinical Investigation, Gonzalez could breathe a sigh of relief. He began receiving interview invitations. But then a comment showed up on PubPeer.

Over the next several months, it was revealed that Gonzalez had fallen prey to the intense pressures many researchers face. Although his research itself was top-notch, according to his supervisor Nicolas Tricaud, a group leader at INSERM, the scramble to publish led Gonzalez to introduce unintentional errors into several figures, prompting a corrigendum, then an expression of concern, followed by an institutional review, and finally a retraction.

Ultimately, Gonzalez never obtained a permanent position, and dropped out of academia, Tricaud told us.

We contacted Gonzalez at his INSERM email address, but haven’t heard back. Tricaud told us he hasn’t heard from Gonzalez in a while.

As a result, everything we know about this story comes from Tricaud, who explained the rigorous concours system in France, and why it’s so important for young researchers to publish in a prestigious journal before applying:

Most of the candidates have at least one top ten publications on their CV as first author.

Tricaud described the dedication and focus required for Gonzalez to meet his goal:

The guy worked like hell in my lab for around 2 years on a highly technical project … solving technical problems, collecting and analysing a huge amount of data, developing the project with me further toward cell signalling and drug therapy for peripheral nerve diseases … He had his goal [in] mind and pushed himself very hard to get the paper published in time to be ready for the concours. And he did it.

Once Gonzalez had completed the research, he began working on the figures while Tricaud wrote the paper. The paper, “Blocking mitochondrial calcium release in Schwann cells prevents demyelinating neuropathies,” was officially accepted in the Journal of Clinical Investigation (JCI) in December 2015 and first appeared online in February 2016. In the paper, Gonzalez explored drug therapy for peripheral nerve diseases, presenting four different animal models of diseases in mice and rats as well as four independent motor functional tests at different times after treatment, Tricaud explained.

Even with his paper accepted in a high-impact journal, Gonzalez’s stress continued to grow, Tricaud said, as he still faced the intense scrutiny of concours. Gonzalez had applied to three different concours categories within the CR2 INSERM and CNRS competitions, and was awaiting word on whether he would proceed to the next stage—an oral defense to a committee of specialists.

Tricaud told us Gonzalez was selected for his first oral defense “thanks to the JCI paper (and previous ones such [as] a Nature Protocol he did before in the lab).”

But on February 18, 2016, just two weeks before this first defense, a comment appeared on PubPeer flagging a problem with a figure. This is when things began to unravel. Tricaud explained:

Indeed, one of the control picture[s] was actually part of a larger picture that was taken on EM samples of treated animals … I was upset because I knew we had plenty of EM samples and EM images of these samples. I asked him to give me a new correct image and I told him to check very carefully if any other mistake occurred in his Figures.

Tricaud explained that Gonzalez came to him in tears, swearing that all other figures were okay. Tricaud then asked JCI to add a corrigendum with the new image, which was published in March 2016.

Gonzalez went to his first defense and “did great,” Tricaud said.

Several days before his second defense, on March 18, a new comment popped up on PubPeer, flagging similar problems, this time with several supplementary figures, Tricaud said:

Looking carefully I realized that it was a similar problem: the control pictures were actually taken on treated animal samples; it was not the same sections because the focus changed but it was the same samples.

The main issue was that Gonzalez “had collected pictures on the microscope but he had not archived them correctly,” which meant that some of the treated sample pictures ended up in the control sample folder, Tricaud said.

Tricaud was upset because “it was my job and my responsibility to check everything,” plus the supplementary figures were not even needed in the paper:

We did not use them for our analysis and I had told him so but he wanted to add them because our colleagues had shown them in a previous paper.

At this point, the problems continued to snowball. Tricaud wanted the paper to be accurate so he asked the editors to replace the incorrect images with the correct ones. But it was too late. The editorial board had grown suspicious after the first corrigendum. The editors issued an expression of concern in April 2016, and asked for an institutional review.

According to Tricaud, the expression of concern plus the comments on PubPeer and institutional review had created sufficient doubt in the minds of the concours committee, which “killed” Gonzalez’s application in all categories. Gonzalez left the lab for many months and “would not even talk to me,” Tricaud said.

According to the retraction notice and Tricaud, the institutional review ultimately found “no evidence of intention to falsify results and concluded that errors were made due to negligence during the assembly of figures.” Still, the journal decided to retract the paper because of the image errors. Tricaud was able to negotiate special terms for the retraction, which would allow him to “re-publish the excellent data we have somewhere else.”

Here’s the retraction notice for “Blocking mitochondrial calcium release in Schwann cells prevents demyelinating neuropathies,” published in the Journal of Clinical Investigation:

Following an institutional investigative review of multiple errors in data presentation in this paper, including several instances of reuse of the same images to represent independent samples in Supplemental Figures 7 and 11, the Editorial Board is retracting this paper. The institutional review found no evidence of intention to falsify results and concluded that errors were made due to negligence during the assembly of figures. The institutional review panel did not question in any way the authenticity of the published results. The paper is being retracted because JCI editorial policy prohibits image duplication and misrepresentation of data.

Even though these errors were unintentional and Tricaud was transparent about the problems as well as swift in his efforts to correct them, he has still had to face the stain that can accompany a retraction:

I am now stuck with this infamous retraction.


Comments
  • TL March 14, 2017 at 10:41 am

    Why would you ever need to create rotated and slightly offset sections of the same micrograph? What legitimate use is there to generate such duplicates?

    • Anna March 14, 2017 at 1:14 pm

      My experience with electron microscopy is minimal, but I’d guess that what could happen is that you choose the settings which seem optimal, you take a picture, and then it turns out it could have been better, so you change the focus a bit, you rotate a bit, and you retake the image. And you end up having a folder with mixed good and bad images, because this is the folder with raw data and you shall not tamper with raw data.

    • imohacsi March 14, 2017 at 6:34 pm

      Pretty much as Anna said. You are taking images one after another to finish before your time ends. If you have diverse samples and don’t change filenames, within a few hours you will forget which picture is which. Then you end up just dumping the best-looking images into the paper without actually knowing what’s on them.

  • herr doktor bimler March 14, 2017 at 6:17 pm

    I am now stuck with this infamous retraction.

    It is important to know who is the real victim.

    • Sylvain Bernès March 14, 2017 at 10:46 pm

      Science? (I mean the building of knowledge, not the journal)

  • Cancer doc UK March 15, 2017 at 6:02 am

    Anna
    My experience with electron microscopy is minimal, but I’d guess that what could happen is that you choose the settings which seem optimal, you take a picture, and then it turns out it could have been better, so you change the focus a bit, you rotate a bit, and you retake the image. And you end up having a folder with mixed good and bad images, because this is the folder with raw data and you shall not tamper with raw data.

    Having performed EM for several years, I disagree. Each image is labelled carefully during experiments – how else can any future analysis be performed if the images are mislabelled? It’s impossible. I’m not blaming the postdoc; it’s the culture in the group, and it appears to be an acceptable practice to the PI (who should check ALL RAW DATA prior to paper submission and grant application).

    • TL March 15, 2017 at 9:44 am

      Agreed. I never understood why editors accept these excuses for sloppy micrograph acquisition and presentation. Either they form the basis of your scientific argument, in which case they need to be professionally managed and not doing so is a form of misconduct, or they don’t and there’s zero reason to paste them all over your paper.

      The sob story also isn’t particularly convincing. I’m sure there are lots of hard working students who didn’t rush their papers through in time for the yearly panel deadline and consequently didn’t even get interviewed.

  • art March 15, 2017 at 8:45 am

    Anna
    My experience with electron microscopy is minimal, but I’d guess that what could happen is that you choose the settings which seem optimal, you take a picture, and then it turns out it could have been better, so you change the focus a bit, you rotate a bit, and you retake the image. And you end up having a folder with mixed good and bad images, because this is the folder with raw data and you shall not tamper with raw data.

    But good practice is to name all of the files with the date and specimen number. Also the treatment group if you want to save digging through notebooks later. If more than one picture is being taken of a specimen, the file names get appended with an image number. Even if the folder has a mix of good and bad images, there should never be any question of exactly what each image is of.

    • DocMartyn March 18, 2017 at 5:40 pm

      My file names for optical images are huge: plate position, magnification, cell type, treatment, RGB channel label, and whether backgrounded. I keep a virgin file, do background subtraction, and save as with “MB” appended.
      A typical name:
      E2 x60 NHA 150 uM TMZ&OBG, MT 50ms, H2DCF 150 ms & Hoe 25 ms MB

      The files are also date stamped, in a folder with a huge file name, and the deconvoluted Excel file goes into the same folder.
      After deconvolution and stats, I select the image that is closest to the average for that treatment group, rather than cherry-picking. It does mean that you take more images than you need, but you will not end up on RW.

      • Lee Rudolph March 18, 2017 at 8:48 pm

        That sounds to me (but, hey, I’m a mathematician, I don’t do data) like an excellent methodology. Is it allowed/required/forbidden to include such a description—including precisely how the representative data published in the paper is derived from the collected data, as well as how the data is collected and preserved—in the “methodology” section of an experimental scientific paper? Because such a practice would appear to greatly simplify eventual decisions about whether retraction-worthy errors in the presentation are culpable or excusable.
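    [Editor’s note: the naming and selection discipline the commenters above describe can be sketched in a few lines of Python. The field names and the “closest to the group mean” rule are illustrative assumptions drawn from the comments, not any commenter’s actual scripts.]

    ```python
    from datetime import date

    def micrograph_filename(day: date, specimen: int, group: str, image_no: int) -> str:
        """Build an unambiguous filename encoding date, specimen,
        treatment group, and image index, so a treated-sample image
        can never silently end up in a control folder."""
        return f"{day.isoformat()}_spec{specimen:03d}_{group}_img{image_no:02d}.tif"

    def closest_to_mean(measurements: dict[str, float]) -> str:
        """Given filename -> measured value for one treatment group,
        return the image closest to the group mean, instead of
        cherry-picking the best-looking one."""
        mean = sum(measurements.values()) / len(measurements)
        return min(measurements, key=lambda name: abs(measurements[name] - mean))
    ```

    For example, `micrograph_filename(date(2016, 2, 18), 7, "control", 3)` yields `2016-02-18_spec007_control_img03.tif`, and `closest_to_mean` applied to a group’s measurements picks the representative image mechanically rather than by eye.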

  • Joulik March 16, 2017 at 12:23 pm

    I feel sorry for Gonzalez. Now it is publish AND perish. It is really disgusting how we academics are treated sometimes. We work a lot, what we do costs little money in the end, but the outcomes are useful to mankind. Yet it seems we are treated like criminals. We’d rather fake a sample and start a war based on it to kill thousands of innocents without solving any problem and waste billions in money. Then we may be called heroes.
