Retraction Watch

Tracking retractions as a window into the scientific process

Cell Press investigating possible image manipulation in influential yeast genetics paper


Cell Press is looking into anonymous allegations that a pair of influential papers on gene activation in yeast may contain more than two dozen instances of image manipulation, according to a spokesperson for the journal.

The accusations first appeared in March on PubPeer, where they triggered a small avalanche of comments, including one asserting “unambiguous and repeated examples of data re-use.”

The concerns raised on PubPeer have even sparked an investigation by an institution in Spain, which found no evidence to support the allegations. But not everyone agrees with that verdict.

The images are of Western blots and chromatin immunoprecipitation (ChIP) assays, and appear in two reports published by Cell Press — one in Cell from 1999, the other in Molecular Cell from 2001 — and in a 1998 report in the American Society for Microbiology’s Molecular and Cellular Biology.

According to the comments on PubPeer, a number of figures in the papers contain duplications — and even triplications — of individual bands and lanes.

A spokesman from Cell Press said the publisher is looking into the allegations, but would not share details of the investigation.

The first author of all three papers is Maria Pia Cosma, now with the Centre for Genomic Regulation (CRG) in Barcelona, who earned her PhD in 2000 from the University of Naples Federico II in Italy.

In a comment on PubPeer from April 9, Cosma noted that the only data available today from the studies are low-resolution images in PDF files.

She added that her conclusions have been “confirmed, reproduced, and expanded” by other studies, such as papers in Genes & Development and PNAS. As for the data:

The figures were made after scanning pictures of multiplex PCR amplifications ran in agarose gels. The final pictures were submitted as print out documents and finally re-scanned. This is a 14 year-old paper, at that time, journals had no clear guidelines on how to present gel data.

Cosma did not respond to emails or phone calls from Retraction Watch, despite asserting in the comment that she was “open to a constructive discussion about the matter with anybody willing to contact me.”

One outside expert agreed the images look suspicious.

“After looking at all the images in these three papers, my personal opinion is that the similarities are too many to have arisen by chance, or to have resulted from accidentally incorporating parts of the same original images twice when composing the figures,” said David Vaux, a cell biologist at the Walter and Eliza Hall Institute of Medical Research in Melbourne (and a member of the board of directors of the not-for-profit Center for Scientific Integrity, our parent organization).

Vaux, who emailed us while attending the 4th World Conference on Research Integrity in Rio de Janeiro, said he had formed his “opinion based on examining results of many such experiments as a researcher, as laboratory head, journal peer reviewer, journal editor, and journal reader.” But he issued this caution:

However, because I do not have access to the original images, and cannot interview all of the authors, I cannot be certain that the images were falsified, or, if so, which of the authors were responsible.

The Cell paper from 1999, “Ordered recruitment of transcription and chromatin remodeling factors to a cell cycle- and developmentally regulated promoter”, has been cited 536 times, according to Thomson Scientific’s Web of Knowledge.

“This was a very influential paper,” said Craig L. Peterson from the University of Massachusetts Medical School in Worcester, who has done similar research. “It was one of the first papers to use ChIP to monitor sequential recruitment of transcriptional regulators to a eukaryotic gene, yeast HO.”

Peterson noted that he understands “why people might think that the bands in question were duplicated,” but doesn’t believe the “possible issues” would “refute the actual conclusions made from the experiment.” Even his work upholds their findings, he added:

Our data are mostly consistent with the Cosma paper, though the exact experiments in the same genetic background (i.e. ash1 mutants) have not been performed as far as I am aware. In general, the results from Cosma et al. ARE NOT considered controversial. I think this is a non-issue that does not warrant further discussion.

The other papers, “Mutations in the extracellular domain cause RET loss of function by a dominant negative mechanism” (Molecular and Cellular Biology), and “Cdk1 triggers association of RNA polymerase to cell cycle promoters only after recruitment of the mediator by SBF” (Molecular Cell), have been cited 44 and 109 times, respectively.

Kim Nasmyth of the University of Oxford, the senior author on two of the papers, said he has “No idea how to explain the similarities” in the images:

If someone comes to me with clear evidence for wrongdoing then I will be more than happy to respond. However, I am not aware that this has happened. For all I know, the allegations are part of a personal vendetta against Pia.

Nasmyth added that an independent investigation had not substantiated the claims.

That investigation was done by Jose M. Rodriguez Sanchez, a senior engineer in computer science from the Universitat Politècnica de Catalunya in Barcelona.

“The conclusion of the expert’s analyses, available upon request, is that none of the allegations on [sic] data manipulation are valid, i.e. the assessment unequivocally disproved the posted claims,” say CRG’s director, Luis Serrano, Juan Valcarcel, also of CRG, and Jaume Bertranpetit of the Institucio Catalana de Recerca i Estudis Avançats in a joint comment on PubPeer.

The analyses were done as pixel-by-pixel comparisons of images that, according to the PubPeer comment by Cosma, had been scanned, printed out, and then re-scanned. As described in his three reports, Rodriguez magnified the images and scrutinized them using various tools in Photoshop. Whenever he found a difference, no matter how tiny, he took it as evidence against duplication.

Of the 30 allegations from PubPeer that he examined, Rodriguez concluded that every single one was false.

But this method doesn’t pass muster with everyone, including Vaux:

I do not dispute his analysis, but I disagree with his conclusions, because he was not provided with the original images (just the published ones), and he did not consider the possibility that images could be altered after they were digitally duplicated.

An anonymous PubPeer commenter offered this view:

There are such an amazing number of similarities in several of these cases, but the “analysis” does not seem to evaluate the extent of similarities and instead seems to be looking for the faintest speck of non-identity. As if two identical pictures of the night sky were juxtaposed, and one star was slightly smudged in one, the “expert” would conclude “they are different”. Not only is this expert analysis unconvincing, it almost seems amateurish.
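The dispute over the forensic method can be made concrete with a small sketch. This is purely illustrative and not part of the commissioned report: it assumes, as Vaux and the PubPeer commenter argue, that a duplicated image which is then printed and re-scanned picks up noise and intensity shifts. Under that assumption, a strict pixel-by-pixel identity test (the criticized approach) reports "different" for every pair, while a similarity measure such as normalized cross-correlation still flags the copy as essentially the same image.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "original band": a smooth 2D intensity blob, standing in for a gel band.
x = np.linspace(-3, 3, 64)
band = np.exp(-x[None, :] ** 2) * np.exp(-(x[:, None] / 2) ** 2)

# Duplicate it, then simulate print-and-rescan degradation of the copy:
# a slight contrast/brightness shift plus small additive noise.
copy = 0.95 * band + 0.02 + rng.normal(scale=0.01, size=band.shape)

# Pixel-by-pixel identity test: any speck of difference yields "not a duplicate".
identical = np.array_equal(band, copy)

def ncc(a, b):
    """Normalized cross-correlation: robust to noise and linear intensity shifts."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

print(identical)        # False: no pixel matches exactly after "re-scanning"
print(ncc(band, copy))  # very close to 1.0: the underlying image is the same
```

The point of the sketch is the commenter's night-sky analogy in code: treating the faintest non-identity as disproof of duplication builds in the conclusion, whereas a similarity-based test asks the relevant question of whether two regions are far more alike than independent experiments could plausibly produce.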

Update 5:08 p.m. eastern 6/4/15: Valcarcel has provided more details about how the report was commissioned:

A committee was set up at CRG to examine the allegations in Pubpeer. The committee urged Dr. Cosma to address the allegations and seek an independent assessment. The standard procedure to obtain the most authoritative, independent and legally-binding professional assessments of this kind in our legal system is to request them from the relevant Official College, in this case the Official College of Computer Engineers of Catalonia. Dr. Cosma contacted the College and the College assigned an expert with competence for the specific tasks requested. The expert acted independently to carry out this task, was not contacted by anyone at CRG during the course of the assessment and legally declared that he had no conflict of interest to carry out his job on behalf of the CRG.


Comments
  • Leonid Schneider June 4, 2015 at 1:12 pm

    I am sorry, but I don’t follow Craig Peterson’s argument. So even if the bands were found to have been duplicated, meaning in turn data was intentionally manipulated, it doesn’t matter at all, scientifically? What kind of science would it be and who needs such science?
As for the judgement by Kim Nasmyth, even if this is indeed “a personal vendetta against Pia”, so what? What does it matter who calls out data irregularities?
    Why is it so difficult to provide the original data, at least for some of the criticised images, instead of hiding behind a questionable forensic report (please read this opinion: https://pubpeer.com/publications/43D229CE50CAC900509F635F611EBA#fb30695)?

  • Dave June 4, 2015 at 2:32 pm

    I do not agree with Craig Peterson’s comments.

  • Pinko Punko June 4, 2015 at 3:17 pm

As noted in discussions on PubPeer, for some of the figures in question there are JPGs on the journal website that are of slightly better quality than the PDFs.

Scientifically, the simple model proposed by Cosma 1999 was not entirely correct, and that is fine; models are incorrect or incomplete all the time. There are details that emerge when both URS1 and URS2 are examined for all factors at all time points and when ASH1 and ash1∆ cells are compared. These details are noted in subsequent papers from others. This lack of completeness may not necessarily have been a problem in the original papers, but when reading Cosma 1999 at a distance (when it came out it was absolutely stunning), one might reflect on a feeling that the results presented are selective in some ways, and that the experiments that are missing all add complexity to the model.

Whether the images were merely doctored to make some figures look nicer, rather than being completely fabricated, is sort of an unnecessary distinction if we are considering what is ethical behavior. Claims of vendetta do not seem supported; the images are extremely troubling and seem to fall in the “no doubt” category. I disagree with Dr. Peterson that this case is closed. Biologically, many of the results do stand, but they don’t explain the figure abnormalities. What is depressing to many in the field is that these papers were considered classic (especially Cosma 1999); we referred to it as THE Cosma paper for how powerfully it showed what chromatin-IP could do, and how clean the data were.

  • Sylvain Bernès June 4, 2015 at 3:40 pm

Please, authors… STOP using the argument that “conclusions have been confirmed, reproduced, and expanded by other studies”. Drowning the fish only makes matters worse.
    Disclaimer: it’s just meant to be a general and respectful comment. I haven’t read the PubPeer thread and I don’t know anything about the research carried out in the group headed by M.P. Cosma.

  • Vladimir Svetlov June 5, 2015 at 7:47 am

Craig Peterson’s argument is fundamentally flawed. It completely ignores the problem of anchoring in human heuristics. Anchoring forces people to cling to the notion put in front of them, often in the face of conflicting evidence. It works like this: one is trying to replicate findings from the Cell paper. Barring a personal vendetta against the authors or a particularly acute scientific acumen, one doesn’t run 20 gels to evaluate the validity of the original report. One runs a gel, and if it looks consistent, one declares that the result was replicated. If the gel doesn’t look “right”, one is confronted by suspicions of being technically incompetent – because a Cell (Science, Nature) paper says otherwise. And one has a choice – to keep plugging at it, risking more contempt and ridicule, or just run it “right”.
I have seen many cases where one pile of bad data was accepted as “good” because another pile of similarly bad data was “consistent” with the first. And neither was properly questioned.

  • Pinko Punko June 5, 2015 at 9:09 am

    Reading Dr. Peterson’s comment again, I realize there is a more charitable way to understand it. He may not be making excuses for the Cosma papers. He may instead just be saying what happens to be true in this case. The outcome for those papers will not overturn any knowledge. I think the way this was said lacks clarity. I do not think that he is arguing that this is less of a problem for the authors, he is stating the truth about the lack of effect on how we will understand gene regulation. In this case we will understand from other labs who have done experiments in this system.

  • Paul Brookes June 5, 2015 at 11:24 am

Peterson is wrong.

Regardless of whether a scientist “gets it right in the end”, the route they arrived by is important. What if we were all allowed to publish what we think should be the correct result, and have the plebs figure out the details post-hoc? Those who are best at predicting outcomes would get a jump ahead of the competition for jobs, promotion, grant awards, etc., versus those who diligently plow through the tedium of careful experimentation.

Did Peterson ever consider the possibility that, had things happened differently, maybe his papers “confirming” the findings would have been published first, and everyone in the field would now refer to “the Peterson paper”?

    An ethical analogy would be a guy abducting a child he claims is his, with a later court ruling that indeed he is the father and is granted custody. Technically he wasn’t “stealing” anything since it was his kid anyway, and he was right in the end, but I have a feeling the mother and the police might view things differently at the time of such an incident. Being “right in the long run” does not excuse rash or lawbreaking behavior in the short term.

    • Pinko Punko June 5, 2015 at 5:57 pm

I think that, as written, it is unclear whether this is what he meant. In some cases fraud would mean there needs to be a reevaluation of knowledge. In this case, there need not be a reevaluation of knowledge for the field, because the result stands independently. This does not have to be an argument about the original paper or how it should be investigated; it can be read as a statement of fact regarding what is known in the field. Of course, what can be reevaluated is the priority of discovery and any other fruit from this poisonous tree. I note that Dr. Peterson is not quite clear in this regard, and that is why we are discussing what he said and what it might mean.

  • AMW June 5, 2015 at 11:57 am

There are certain similarities with the Saad cases in Brazil (although the papers in question here are much older): the authors use citations of their work as part of their defence, and a clearly flawed investigation is undertaken that dismisses the allegations.

    The key issue is what the journal will do. These are major league journals with reputations to uphold. Given the extent of the problems in the papers they can’t simply ignore the issue, can they?

  • Educated guesswork June 5, 2015 at 1:53 pm

The Peterson comment lays bare all that is rotten in science. The end justifies the means. It is acceptable to jump to conclusions once an educated guess about the likely true result is possible. From then on, further serious experimentation with replications and controls becomes a superfluous waste of time. Later work may “confirm” this guesswork irrespective of “beautifications” or other fabrications, and all is well. If the guesswork got it wrong, bad luck for those unfortunate enough to find out.
