Transparency in action: EMBO Journal detects manipulated images, then has them corrected before publishing

As Retraction Watch readers know, we’re big fans of transparency. Today, for example, The Scientist published an opinion piece we wrote calling for a Transparency Index for journals. So perhaps it’s no surprise that we’re also big fans of open peer review, in which all of a paper’s reviews are made available to readers once a study is published.

Not that many journals have taken this step — medical journals at BioMed Central are among those that have, and they even include the names of reviewers — but a recent peer review file from EMBO Journal, one publication that has embraced this transparent approach, is particularly illuminating.

Alan G. Hinnebusch, of the U.S. Eunice Kennedy Shriver National Institute of Child Health and Human Development, submitted a paper on behalf of his co-authors on November 2, 2011, at which point it went out for peer review. The editors sent those reviews back to the author on January 2, 2012, and Hinnebusch responded with revisions on April 4. So far, the process looks much like the one any scientist goes through — questions about methods, presentation, and conclusions, followed by answers from the authors.

But what caught the eye of frequent Retraction Watch commenter Dave, who brought this to our attention, was what happened starting on May 18, when the editors responded to the authors again (that letter is labeled as page 6 but is actually page 16 of the linked document):

We have now finally heard back from all three of the original referees regarding your revised manuscript. All of them consider the study considerably improved and would now in principle be supportive of publication without additional changes (see comments below).

Before we shall be able to further proceed with the manuscript, there are however some important issues regarding the Western blot data in the manuscript that need to be clarified. On our routine pre-acceptance checks of the figures, we noted that the images in several such panels appear to be composites of distinct images that have been overlaid or spliced together. In order to allow proper comparison and assessment of these data, and to avoid potential misrepresentation, I therefore need to ask you to kindly send us all files containing the original, unprocessed scans used to assemble the various Western blot figure panels. These files/images should just be sufficiently annotated to allow interpretation of their contents and how they were used. For all figures where composite images had been assembled, we will require explanations of the rationale behind this processing, as well as clarification of whether the composite images could still be considered a faithful representation of the actual experimental data.

Hinnebusch responds on May 29. Excerpt:

In going over the construction of the Western figures with Dr. Qiu, the first author, I realized that in both Fig. 1D and Fig. 3B of the revised manuscript we had indeed spliced together results from experiments conducted previously for the original version of the paper with new data obtained recently in response to reviewers’ requests for experiments on Ser7P CTD peptides or an additional Cdc73 mutant. I am convinced that Dr. Qiu’s intention was to consolidate findings and make the presentation easier to follow with fewer figures; however, this was clearly inappropriate because it gives the false impression that the results derive from single experiments. Hence, to correct these two errors, I propose that we replace the “offending” composite figures with the corresponding original figures from the first version of the paper and insert additional panels comprised of the new results, complete with controls, all in the manner described in detail in the next paragraph. This remedy will eliminate all splicing of lanes from separate experiments, without altering any conclusions made in the “1st revised” version of the paper that was just reviewed.

The fixes went a long way toward allaying the editors’ concerns, they wrote Hinnebusch on May 31, but not quite far enough:

Thank you for your message and original as well as revised files that you sent us – I very much appreciate your taking these matters very seriously. I also understand that all the instances of inappropriate figure assembling you found and discussed were, as I suspected, owed to well-meant attempts of streamlining the data presentation. Importantly, all the examples you mention are indeed sufficiently clarified by your explanations, and the proposed revisions should for the most part address them. Nevertheless, I am afraid that several issues still remain:

In response to those issues, Hinnebusch’s team repeated some experiments, and recreated some images before sending back revisions on June 15. On June 20, the editors wrote to say “that there are no further objections towards publication in The EMBO Journal.”

And with that, the paper was published online on July 13, along with the review process file.

We asked EMBO Journal editor Bernd Pulverer what the journal meant by “routine pre-acceptance checks of the figures”:

Our editors visually check every image panel before acceptance for quality (e.g. resolution, contrast), modifications and inconsistencies; we also assess that adequate statistical and scale information is provided. Of course, our referees are also encouraged to do so. If any questions arise, our trained data editor undertakes a series of standard image forensic tests in Photoshop that can reveal any hidden break points and duplications. As you know, we also routinely check all manuscripts with iThenticate/Crosscheck technology.

Did the editors consider rejecting the paper once the issues with the figures came to light? Or sending it to reviewers for another review?

We consult with referees as necessary in such cases. Our editors are scientifically trained experts who also assess data for publication where appropriate. Since the editor uncovered these issues, he also evaluated the revisions with the help of expert colleagues. Given the exchanges fully documented in the review process files, there was no need to contemplate any decision other than the one taken in this case.

How did the editors determine that the figures were “well-meant attempts of streamlining the data presentation?”

We look at the nature of the problem – including the type of manipulation and the data in question – to determine if we are dealing with a case of beautification, incompetent data presentation or fabrication. In the vast majority of cases the problems we see are patently one of the first two categories; often these derive from a lack of understanding of what processing is acceptable and what is not acceptable. This is why it cannot be overemphasized that training in data processing and ethics is crucial.

Pulverer added:

We are pleased to see that the EMBO Transparent peer review practice of publishing important editorial communication alongside the referee reports added significant transparency and accountability in this case, and we continue to actively encourage other journals to adopt such standards.

We agree. We also asked Hinnebusch for comments, and will update with anything we hear back.

Update, 10 a.m. Eastern, 8/6/12: Hinnebusch responded after a vacation:

With the guidance of The EMBO Journal editor, we revised the manuscript to eliminate presentation errors and to ensure that the origins of the data are transparent, in a manner fully described in the published transaction report. All findings presented in the published paper have been confirmed in replicate experiments, and we stand solidly behind the conclusions of our study.

8 thoughts on “Transparency in action: EMBO Journal detects manipulated images, then has them corrected before publishing”

  1. You have to applaud EMBO for the transparency, but having read the review file many times, I can only conclude that the authors were very lucky that they got away with one here. I’m still on the fence regarding what should have happened, but I am certainly a little uneasy that the paper was eventually published. I agree that for the most part it is inappropriate presentation but, in this case, I feel like there was a deliberate attempt to “pull the wool over the eyes” of the EMBO staff. They even went through and changed a lot of the figures in the supplement! They didn’t get away with it in the end, but I’m surprised that they did not take more offense to this at EMBO.

  2. This won’t please our resident splice spotters here. Although I salute their doggedness, determination and downright dedication to detail, I have always felt that if you are using Photoshop to falsify results, you are doing it wrong. But I acknowledge it is a concern when it is found and important to bring up. Occasionally I feel it is done shorn of any scientific context — i.e., the point a western was making didn’t particularly seem out of the ordinary.

    I have often wondered how a truly artistic fraudster should do western blots — a maestro who does fraud not to cut corners but for the sheer exhilaration of it. True fraud, in my view, should take longer and have more virtuosity than genuine work.

    I would buy some commercial loading control proteins from recombinant sources and then spike them into your individual lanes according to the effect you are trying to achieve. In theory this would show up with a Ponceau stain of the membrane — perhaps (if anyone asked for it). You could get around that with additional spikes of either cross-species crude protein prep and/or immunoglobulins. It might take a bit of trial and error to get a handle on what’s needed to create a seamless western, but well worth it.

    Any other suggestions?

    PS. I am always interested in a green card if any struggling US lab would like to put my unique skill set to use.

    1. Oh well, worth a try.

      As the saying goes: Good girls go to heaven, bad girls go to conferences in Milan, Graz, Hawaii, Cartagena, Port Moresby – OK maybe not Port Moresby.

  3. As one who has been reading the Western Blot stories on this blog with great interest, I think the editors did a great job, making the authors go back and do over all the spliced Westerns. I wish all editors would pay attention to detail like this.
    As to artistic fraud: of course it is harder than just doing the real thing (assuming it’s even possible to do the real thing) and it’s better looking, too. Sorry, dear rabbit, no takers on a real artistic aptitude.

  4. Basically they should make the authors sign a statement that they followed the explicitly outlined proper guidelines about presenting data from different experiments or gels and splicing them together and other appropriate behaviors regarding loading controls etc. At least they would then get exposed to what is expected of them. I think a lot of people don’t even realize it is inappropriate to splice lanes together if they feel they are presenting true results and appropriate controls. I do not think this is fraud, just a basic misunderstanding of what is acceptable.

  5. “I think a lot of people don’t even realize it is inappropriate to splice lanes together if they feel they are presenting true results and appropriate controls.”

    I’m not sure about this statement. If you don’t think there is anything wrong with it, why go to (often) great lengths to hide it or cover it up?

  6. You can splice westerns and any other image, for sure. You just can’t act like you’re trying to cover anything up! I have presented westerns, albeit all from the same experiment, where I have spliced out lanes that contain MW markers or are extraneous and beyond the point of the figure. The difference is that I made the splices obvious, leaving blank space where lanes were spliced out.

    As long as you’re up front and not trying to pull a fast one, you’re cool!

  7. Agree 100%. If you read the review file, I think that in this case they were clearly trying to hide something though.
