JCO expresses concern over images from Spanish group that had aroused earlier concern

The Journal of Clinical Oncology has issued an expression of concern about a 2003 article by a group of researchers in Spain who appear to have had recurrent problems with images.

The paper has been cited 56 times, according to Thomson Scientific’s Web of Knowledge. Here’s the notice:

It has been brought to our attention, regarding Figure 1, Part A, that similarity exists between bands 3, 4, and 5 of the top row and bands 1, 2, and 3 of the bottom row, including the presence of artifacts, in the April 15, 2003 article by [Jose] Roman-Gomez et al, entitled, “Cadherin-13, a Mediator of Calcium Dependent Cell-Cell Adhesion, Is Silenced by Methylation in Chronic Myeloid Leukemia and Correlates With Pretreatment Risk Profile and Cytogenetic Response to Interferon Alfa” (J Clin Oncol 21:1472-1479, 2003). This similarity has raised concerns about whether these rows represent independent data. We alerted the corresponding author, Dr. Roman-Gomez, of this concern, and he has repeated the experiment shown in the original Figure 1, with the results below. Although the results of the figure appear to be unchanged, we wish to inform the readers of this issue, so that they may make their own independent assessment.

That’s transparent as far as it goes. But if Roman-Gomez was able to supply new figures consistent with genuine data, why be concerned? And if the journal believed that Roman-Gomez or someone in his lab had doctored images, wouldn’t the group be just as likely to submit bogus images to shore up the initial deception?

It turns out that Roman-Gomez and his colleagues had run into trouble with their images in the journal on another occasion.

A paper from the group was the subject of a 2010 expression of concern, again for image issues:

It has been brought to our attention that Figure 1 in the October 1, 2005, article by Roman-Gomez et al, entitled, “Lack of CpG Island Methylator Phenotype Defines a Clinical Subtype of T-Cell Acute Lymphoblastic Leukemia Associated With Good Prognosis” (J Clin Oncol 23:7043–7049, 2005), was originally published by Clinical Cancer Research (Clin Cancer Res 10:6126–6133, 2004). A Correction Note has been published to correct this error as of September 20, 2010.

That correction notice read as follows:

The October 1, 2005, article by Roman-Gomez et al, entitled, “Lack of CpG Island Methylator Phenotype Defines a Clinical Subtype of T-Cell Acute Lymphoblastic Leukemia Associated With Good Prognosis” (J Clin Oncol 23:7043–7049, 2005), contained an error.

An erroneous image was given as Figure 1. The corrected figure is reprinted here in its entirety.

We take the correction and the 2010 expression of concern to indicate that the researchers submitted to the JCO a partial, and thus manipulated, image from a previous publication in a different journal.

We emailed Roman-Gomez for comment but have yet to hear back.

Since the JCO wants its reading public to adjudicate the veracity of the images, here are a couple to consider:

From the 2005 publication that was corrected:

Fig 1.

And from the 2003 paper that’s the subject of the most recent expression of concern:
Fig 1.
An earlier version of this post referred erroneously to Western blots. Thanks to a few eagle-eyed Retraction Watch readers for pointing that out.

Comments on “JCO expresses concern over images from Spanish group that had aroused earlier concern”

  1. You may need to check the veracity of this RW post. Those are not Western blots, but PCR gels for determining gene methylation status (very different methods).

  2. This is a bit like sight-reading music! More exciting though!!

    2005 corrected figure 1 (that really is the corrected figure).

    What I notice in the SYK panel is that the upper and lower halves do not line up; the lanes are offset. (It is true that the authors have included a clear white horizontal line showing that the halves have been spliced together.)

    What I notice in all the other panels is that about halfway down there is a horizontal discontinuity in the quality of the image between the upper and lower halves of each panel. This discontinuity is not so noticeable in the NES-1 panel, one up from the bottom, but if you move your head up and down with respect to the screen you can see that there is a clear discontinuity in the image.

    The bottom panel, WIF-1, is really badly put together. You can clearly see that the lanes do not line up: UM is shunted to the left with respect to M.

    In the sFRP-5 panel the horizontal discontinuity in quality does not extend all the way across the panel, but stops short at the last lane on the right. In the first and third lanes (from the left) of this panel you can see that the upward tails of the bands have been covered up by the upper half of the panel. The upward tail in the rightmost lane does not look overlaid, but it still does not look as though the tail ends gradually.

    The panels have been spliced together, and, with the exception of the top panel, the authors leave you to figure this out for yourself. It might as well all be in free space (does anybody know what non-free space is?).

    Pulling the other one is more effective. It has got bells on it.

    1. I agree with Clare. There is definitely a horizontal discontinuity in all of the images shown in the first figure. It is less obvious in the second image from the bottom. The background in all the images shows a sudden change from darker to lighter gray at about the midpoint from the top. There are other, fainter changes visible in most of the images. These images appear to have been spliced together; it is hard to believe that the change in the background could be so sudden in a continuous gel. If there were a white line across the image in all the panels, as there is in the first one, it would be understandable; but there is no such line in the others.
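      For readers who want a rough way to check this sort of background discontinuity for themselves, here is a minimal sketch, assuming a grayscale crop of a single panel saved locally (the file name and threshold are illustrative, not taken from the papers). It simply looks for the largest row-to-row jump in mean intensity; bands create jumps too, so this is a hint, not proof of splicing.

      ```python
      # Minimal sketch: flag an abrupt horizontal change in background intensity
      # within a single gel panel. "panel.png" and the threshold are illustrative.
      import numpy as np
      from PIL import Image

      def row_profile(path):
          """Mean intensity of each pixel row of a grayscale image."""
          img = np.asarray(Image.open(path).convert("L"), dtype=float)
          return img.mean(axis=1)

      def largest_jump(profile):
          """Row index and size of the largest row-to-row intensity jump."""
          diffs = np.abs(np.diff(profile))
          row = int(diffs.argmax())
          return row, float(diffs[row])

      if __name__ == "__main__":
          row, jump = largest_jump(row_profile("panel.png"))  # hypothetical file
          if jump > 10:  # arbitrary threshold on a 0-255 intensity scale
              print(f"Largest row-to-row intensity jump ({jump:.1f}) at row {row}")
          else:
              print("No pronounced horizontal discontinuity found")
      ```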

      That’s the good thing about these “primitive” techniques: you can see when they start manipulating the images. That’s probably why there have been so many retractions recently: it’s so easy to see. When authors realize that people are actually looking at their figures, this will probably taper off (but it may never go away altogether.)

      Nothing more to say about Western blots (these aren’t Western blots, but they use a similar-looking end point, so they’re equally transparent). Everything (and I mean everything) was said in the comments on the last post.

      Monkey liked the editorial from Ivan/Adam and feels sorry for them because of the intense antagonism, which he felt was over the top. Nonetheless, the antagonism resulted in a very detailed explication of the benefits and risks of WB from people who really use it a lot, those best qualified to judge. So it wasn’t all bad that Dave got his feathers ruffled. Monkey thinks he was already mad before he read the editorial.

  3. To me the events are as follows:

    In 2010 the authors replaced their 2005 JCO paper Figure 1 with a new figure (which has the bizarre features that Clare has pointed out). Why replace a clear figure with a bizarre version of the same thing? Because their original Figure 1 was stolen, lock, stock and barrel, from another group’s paper (Takahashi et al., http://clincancerres.aacrjournals.org/content/10/18/6126.full.pdf) relating to patients with gall bladder disease. The lane titles were simply changed to indicate that the samples came from T-ALL patients. This looks like plagiarism and fabrication combined (? fabroplagiarism, plagibrication?)

    Now we learn that Figure 1 in their 2003 paper also has a ‘problem’. This can be seen at:

    http://jco.ascopubs.org/content/21/8/1472.full.pdf

    and clearly shows three lanes of a gel copied and pasted directly below, but with a two-well shift to the left. This is only simple fabrication. We were just discussing on RW how not just Western blots but any technique can be subverted if investigators choose to do so, and along comes an example (an agarose gel) that illustrates the point.

    Journals now seem to be prepared to risk their own reputations in order to avoid conflict, but in this case I don’t think there is room for manoeuvre. JCO has now had two papers demonstrating fraud (yes I think I can say it) from the same authors. As well as issuing an expression of concern, the journal needs to contact the host institution and ask that an inquiry be undertaken to investigate these papers (and all related literature – there may be other examples that are yet to come to light) where there seems to be clear evidence of scientific fraud. If no investigation is forthcoming then the papers should both be retracted by the journal.

    If JCO takes no further action then it will be in danger of damaging its own reputation for scientific integrity in order to protect the reputation of authors who (from what we can see here) don’t have any.

    If you keep falsifying your data, eventually someone will notice.

  4. Hang on… check this out regarding a Retraction of a 2004 Blood paper by Roman-Gomez et al. in 2009:

    http://bloodjournal.hematologylibrary.org/content/113/10/2370.2.full.pdf

    It says:

    ‘The authors retract the October 15, 2004, paper cited above. Figure 1 in that paper was inappropriately reproduced from a previously published paper, namely Dong SM, Kim HS, Rha SH, Sidransky D. Promoter hypermethylation of multiple genes in carcinoma of the uterine cervix. Clin Cancer Res. 2001;7:1982-1986.
    The authors deeply apologize to the authors of that paper and to the scientific community.’

    So on the one hand Roman-Gomez et al. apologise to the scientific community for fabricating a paper, but on the other they do nothing to alert readers to the fact that they have done exactly the same sort of thing in other papers. When will science learn that people like this cannot be trusted to comment on their own papers?

    I’m no private investigator, and it took me about 10 minutes to find evidence of this other case of plagiarism / fabrication from the same authors, which came to light in 2009. JCO is the official journal of the American Society of Clinical Oncology, boasting an impact factor of 19 (4th in Oncology), and it has now published two statements from these authors even after they retracted the paper from Blood. These papers refer to clinical samples, and hence there are far-reaching ethical implications of fabrication.

    Interestingly, Blood could also have helped with this matter when it was presented, three years ago, with the above retraction, which is an apparent admission of scientific fraud.

    JCO needs to get its act together rapidly and, instead of asking readers to sort the mess out for themselves, contact the host institution with a very clear message that the institution needs to clear up its own mess, or admit that it can’t. In case they need help, the host institution is Department of Haematology, Hospital Reina Sofía, Maimonides Institute for Biomedical Research, Córdoba, Spain.

  5. “We were just discussing on RW before how not just Western blots but any technique can be subverted if investigators choose to do so, and along comes an example (agarose gel) that illustrates the point.”

    Precisely, and this illustrates everything that we have been trying to say over in that discussion! Stand by for an article by Adam and Ivan entitled “Can we trust PCR?” LOL.

    1. I think we now all agree people are indeed the problem – and I don’t think we should shoot the messengers.

      To me this case is breaking new ground in scientific fraud because the problem has been so obvious for so long. Roman-Gomez et al. appear to have fabricated at least three papers in a high-profile field involving clinical samples, and no one seems to have noticed. They even admitted one instance of apparent scientific misconduct in 2009 (retracting a paper), yet no one at the journal or institutional level seems to have thought it worth considering the possibility that there might be serial cases of misconduct and other fabricated papers. Instead, JCO allowed the same authors to comment on cases of clear fraud in the authors’ own terms, and JCO’s Expression of Concern now, although welcome, comes woefully late.

      Another notable thing I just noticed is that although the Roman-Gomez Blood paper (2004) was retracted in 2009, it was cited 16 times in 2010, 13 times in 2011, and twice already in 2012. Is the methylation / malignancy world out there asleep? Does anyone in this field actually care about the validity of the papers they are citing?

      Still shaking my head in disbelief….

      1. amw, I suspect that people who continue to cite Roman-Gomez et al Blood (2004) don’t know it’s been retracted (or maybe don’t understand that it’s been retracted – several of the citing papers are from all-Chinese or all-Korean groups).

        I can imagine that these groups have Roman-Gomez (2004) in their database and they link it into their papers whenever they consider that it’s appropriate to cite it in the context of their work.

        If you’ve downloaded a paper into your database you simply may not be aware that it’s been retracted. Perhaps when electronic database programs become sufficiently sophisticated, retractions will be flagged automatically. Again, it’s one of the things publishers might find a way to screen for. It shouldn’t be beyond the wit of mankind to have a depository of retractions against which all electronic submissions are screened. Then the authors can at least be prompted as to whether they really want to cite the retracted paper.
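        As a rough illustration of how simple such screening could be, here is a minimal sketch. The file names are hypothetical (one DOI per line in each); a real tool would query a curated retraction database rather than a local list, and would match on more than DOIs.

        ```python
        # Minimal sketch: flag cited DOIs that appear in a list of retracted DOIs.
        # "manuscript_refs.txt" and "retracted_dois.txt" are hypothetical files,
        # one DOI per line; a real tool would query a curated retraction database.
        def load_dois(path):
            with open(path) as fh:
                return {line.strip().lower() for line in fh if line.strip()}

        def flag_retracted(refs_path, retracted_path):
            return sorted(load_dois(refs_path) & load_dois(retracted_path))

        if __name__ == "__main__":
            for doi in flag_retracted("manuscript_refs.txt", "retracted_dois.txt"):
                print(f"Warning: cited reference {doi} appears in the retraction list")
        ```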

        And it does lead on to a more interesting and almost philosophical point! Maybe the citing author considers that there is something sufficiently valuable in the retracted paper that s/he considers it worth citing. Maybe the paper was retracted for a specific reason (e.g. plagiarism) and the citing authors consider that important elements of the data are fine.

        That seems like an interesting question to me. Does a retraction mean that a paper becomes entirely a “non-paper” deemed for the purpose of scientific enquiry never to have existed? Or is retraction a flag to subsequent researchers to carry on but proceed with caution??

  6. I think an issue with some of the posts made in this blog is that many people do not have the technical expertise to interpret blots, PCR, etc.

    Methylation-specific PCR protocols are easy to find online, and I recommend that clare francis, monkey, and whoever else is interested read and understand the method first. The multiple panels in the corrected figure from 2005 are not supposed to depict single gels. You are actually seeing two gels placed on top of one another in each panel. In methylation-specific PCR you essentially use two near-identical sets of primers (usually differing in a few nucleotides) to determine the methylation status of a genomic region. So one primer set was used for the UM row, and another for the M row. Many people run the PCR products from the methylated- and unmethylated-specific primers side by side, but in this case the authors have run them on separate gels. You would not be able to load the PCR products in the same gel lane because they produce bands of the same size, so you would have no way of distinguishing them. (It has been a few years since I have done one, so anyone should feel free to correct me if I’m wrong.)
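    To make the two-primer-set logic concrete, here is a minimal in-silico sketch. The sequences and primers are toy examples invented for illustration, not taken from any real assay: bisulfite treatment converts unmethylated C to T (via U, after PCR), while C in a methylated CpG is protected, so the “M” and “UM” primer sets match different converted templates.

    ```python
    # Toy sketch of methylation-specific PCR logic (illustrative sequences only).
    # Bisulfite conversion: unmethylated C -> T; C in a methylated CpG is protected.
    def bisulfite_convert(seq, methylated_cpg=False):
        out = []
        for i, base in enumerate(seq):
            if base == "C":
                in_cpg = i + 1 < len(seq) and seq[i + 1] == "G"
                out.append("C" if (methylated_cpg and in_cpg) else "T")
            else:
                out.append(base)
        return "".join(out)

    def primer_binds(template, primer):
        """Crude stand-in for primer annealing: exact substring match."""
        return primer in template

    region = "ACGTCGATCCGTA"                                      # toy genomic region
    m_template = bisulfite_convert(region, methylated_cpg=True)   # "ACGTCGATTCGTA"
    u_template = bisulfite_convert(region, methylated_cpg=False)  # "ATGTTGATTTGTA"

    m_primer = "ACGTCGATT"  # designed against the methylated, converted sequence
    u_primer = "ATGTTGATT"  # designed against the unmethylated, converted sequence

    for name, template in [("M reaction", m_template), ("UM reaction", u_template)]:
        print(name, "| M primer binds:", primer_binds(template, m_primer),
              "| UM primer binds:", primer_binds(template, u_primer))
    ```

    In other words, a band in the M lane and none in the UM lane (or vice versa) reflects which primer set found a matching template, which is why each sample needs both reactions.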

    Another comment made by monkey: “Nothing more to say about Western Blots (these aren’t, but they use a similar-looking end point).” Not really. Western blot images are from films (for the most part). The images shown above are actually images of the DNA gel itself.

    1. This is absolutely correct – the problem is not about whether gels (or their replacements) are technically correct. These gels escaped notice because they look fine! The critical fact though is that in two papers the gels were simply stolen from other methylation papers (that’s why they looked fine). In the third case the same gel was copied and pasted to fabricate the unmethylated results (in the lower part of the Figure – or vice-versa). This doesn’t look fine.

      It would be very helpful if someone could upload the relevant images:

      Case 1: The stolen image (Figure 1) in Roman-Gomez et al. JCO 2005, and the valid original image (Takahashi et al., Clin Cancer Res 10:6126, September 15, 2004) – these are identical

      Case 2: The stolen image (Figure 1) from Roman-Gomez et al. JCO 2004 and the valid original image (Dong SM, Kim HS, Rha SH, Sidransky D.Clin Cancer Res. 2001;7:1982-1986) – again these are identical.

      Case 3: The fabricated lower gel (Figure 1) from Roman-Gomez et al. J Clin Oncol. 2003 Apr 15;21(8):1472-9.

      Then you can all see why I’m ranting about this (instead of doing something more constructive….)

    2. “I think an issue with some of the posts made in this blog is that many people do not have the technical expertise to interpret blots, pcr, etc.”

      Couldn’t agree more and you beat me to it with this entire post.

      1. Yes, puzzled monkey – they are supposed to be two different gels placed one above the other, and that is why you see the horizontal discontinuity. That is how they should be. In the first one the authors do have a line to show that, but have omitted it in the others, assuming that anyone who is going to read that paper would be familiar with the way these figures are assembled and so it’s probably not needed.

      2. “Assuming that anyone who is going to read that paper would be familiar with the way that these are done…”
        I don’t feel that’s an appropriate assumption. I would have been happier (and never would have embarrassed myself agreeing with Clare) if all the images had that horizontal white line separating them.

        I feel that I, as a monkey, should be pandered to (sorry.) Make it clear that your figures represent two different gels, lined up with matching lanes, to show the differences.
        I know that’s hard for sophisticated people to accept, but making it clear enough for even the dumbest readers to understand is important to avoid controversy and confusion… to me, it’s important, anyway.

        That’s why, for example, when using an acronym, you always define it the first time it appears in the text.

        After all, your audience is not just the other researchers who are using the same techniques you use, but the general (scientifically interested) public. Science has become so complicated that researchers in one field have little knowledge of the intricacies of another field, thus increasing the need for clarity and simplicity in writing.

        If I just identified a drug as Viibryd, you would say “what?” But if I gave its generic name, vilazodone, you’d at least have a hope of knowing what drug class it is part of and maybe remember that it is a relative of trazodone.
        (“Drugs” that are monoclonal antibodies have the ending “mab” in their generic name–a handy way of recalling what they really are: proteins of a specific class.)

        Unless, of course, you don’t want me to read the text of your papers…only the abstracts.

      3. I agree with you that the figures should be explained with sufficient clarity so that the readers can follow the data presented. I guess one would need to read the paper and the figure legends to make that out.

    3. I have done those assays. I am not so naive. A description is a place to start. Do we agree on what we see?

      Within the corrected 2005 Figure 1 there is an inconsistency in presentation. Not breaking the rules, but an inconsistency. It could all be fine. The corrected 2003 Figure 1 could all be fine too, but then the editor-in-chief issues an “expression of concern” with that final sentence. What is all that about?

      “Although the results of the figure appear to be unchanged, we wish to inform the readers of this issue, so that they may make their own independent assessment”.

      An underlying question is why the editor-in-chief of the Journal of Clinical Oncology, Stephen Cannistra, has been so cryptic. What does an “expression of concern” mean in this case? Is he only 50% sure the corrected 2003 Figure 1 is true? 25% sure…? He writes that he leaves it up to the readers to make up their own minds. This is not in itself a bad thing, but the readers do need to be given the relevant information, such as how much the editor-in-chief knows. I imagine that he knows about some of the other papers mentioned in this post, in his own journal and in other journals. He has added a bit of context, the fact that he is not sure, but has not given the full picture.

      You would be left staring until the end of time (and be none the wiser) if it were not for electronic communication, Retraction Watch included as an important catalyst. When there were just a few people in biology perhaps you could know everybody and nearly everything, but not now. Some people think it is clever to be opaque and believe that their schemes of arrows with a PLUS or a MINUS at their ends explain what is going on. People are browbeaten into not asking for the data, or even asking a question.
      Editors act as if they were in purdah and are affronted when you point out inconsistencies, or highly unlikely consistencies, or even simple things such as people publishing the same articles more than once, let alone some of the same data more than once.

      The editor-in-chief of the Journal of Clinical Oncology, Stephen Cannistra, should explain his reasons for issuing the “expression of concern”. What did he take as evidence and how did he weigh the evidence?
      If the readers knew the evidence they would be better able to weigh it themselves. What he has written is that he is not sure and the readers can be unsure too. They can make an independent assessment, but do need to know what to look at.

      I think that in this case the editor-in-chief of the Journal of Clinical Oncology has unwittingly done medicine and science a real favour, because others have found even more.

      1. Clare – Yes, it is right that if the editor has a reason to be concerned then the reasons should be fully disclosed to the readers, and the concerns should have been communicated to the author’s institute for further investigation. After all, journals do not go ahead with publication of any manuscript unless they are convinced of the data and its interpretation. How many other publications do we have where it’s left to the readers to decide whether the data are right or not? Reading some of the posts below doesn’t leave much scope for doubt for the readers here… I guess it’s time for the editor to decide now!

    4. Splicing of gel images, in my opinion (and that of many journal editors), should be clearly indicated by visual cues, such as in the top panel of the first figure – a white or black line separating the two gels helps the reader in the correct interpretation of the data. It could also be argued that a molecular weight (MW) marker should be included in a DNA gel so that there is at least some minimal control against which to check the size of the PCR products.

      It might be a question of technical expertise that a lack of knowledge about the particulars of methylation PCR (or should I rather say methylation-specific PCR of bisulfite-converted DNA?) leads to misinterpretation of the presented figures, but I believe that results should be presented in such a manner that the potential for misinterpretation or confusion of the reader is minimized…

    1. FigureSleuth – please can you look very closely at Figure 1 of this Takahashi article and tell me if you see any problem within the gels. Perhaps my eyes are deceiving me but there seems to be repetition of band patterns across 3 of the 4 GAPDH panels (spanning Fig. 1A and 1B). There’s a small artefact in a band towards the right which seems to repeat itself, and the pattern of band height and intensity seems very similar across the different gels. Many thanks

    1. Thanks, FigureSleuth, I thought I must be going mad. Again, in the Leukaemia paper, you can see identical gel artefacts in lanes that are claimed to be different samples. So that’s now 5 cases of clear falsification of data, involving 4 journals.

      Within the 5, there are 2 cases of copying and pasting a gel that might actually be the authors’ own (so far we haven’t found a source paper from which the image was stolen). But in the other 3 cases the authors clearly took someone else’s gel image from an independent published paper, and applied it to what must now be considered an imaginary set of novel clinical data.

      Seems like in 2003 – 2004 these authors were fabricating their own gels ‘in-house’. This is riskier since people can spot the copying and pasting – the crime is visible in one figure. Then in 2004 – 2006 they started simply lifting whole gels, or portions thereof, from other people’s papers. That’s quite clever since only the authors of the original papers (Takahashi et al., twice, and Dong et al., once) would be likely to pick up the fraud, and only if they happened to notice. They even managed to correct one of these 3 fabroplagiarism articles by inserting an entirely new figure (Case 1) and get away with it (JCO 2005, corrected in 2010 as noted above by Adam and Ivan).

      Systematic fraud on a very large scale.

      1. Actually not so clever – in the Oncogene paper they lift a gel from Takahashi and colleagues, and then use it twice for two different sets of experiments (lower panels of Figure 1b and 1d show the same gel artefacts). Everywhere you look you find a new problem.

  7. The blog is a little confusing with its presentation of the Journal of Clinical Oncology 2003 paper update.

    The image they show is the re-performed experiment, which clearly is not a duplication; the original 2003 image that caused the concern is here:
    http://jco.ascopubs.org/content/21/8/1472/F1.expansion.html
    I don’t think you could be absolutely sure that this is an actual duplication as the lower half is too truncated.

    Having said that, this from FigureSleuth:
    “Agirre, Roman-Gomez and colleagues, Oncogene, 2006
    http://www.nature.com/onc/journal/v25/n13/full/1209236a.html
    Figures 1B and Figure 1D

    were ‘stolen’ from:

    Takahashi and colleagues, Clin Cancer Res, 2004
    Figures 1A and Figure 1B
    http://clincancerres.aacrjournals.org/content/10/9/2928.full”
    is certainly true, although who has the time, or the kind of memory, to go through and find these borrowings?

    It’s mysterious why anyone should do this level of cheating. It might be something as simple as the lab not having a digital gel camera, and the scans of the photographic prints not being of publication quality.

    This is Spain after all

    1. If you can’t afford a digital camera to take pictures of your gel should you be doing research at all?

      1. To put it another way, I am sure he can get bands on a gel if he wants to – since we have no idea what really goes into each well, his habit of “borrowing” gels from other people seems unnecessary if he was really committed to fraud. The question then becomes whether he has actually got these methods to work at all and whether his case-control results are reliable. In theory he could just be inventing everything, but I think that is unlikely.

        Prof Richardo Veroz Herradon is the name of the University of Cordoba Ombudsman, email: [email protected], which is probably the first port of call in such incidents. I wouldn’t hold your breath expecting any results from it, though. If something was going to happen it would have happened back in 2009.

        The most likely result will be for repercussions to rebound on anyone who complains, and J Roman-Gomez to sail on unaffected.

    1. Reassuring to know others are seeing these issues. The image link above is very helpful.

      Littlegreyrabbit, as I suggested above, can you also look at Figure 1 of the Takahashi article:

      http://clincancerres.aacrjournals.org/content/10/9/2928.full

      and tell me if you are concerned about any of the gels. Perhaps my eyes are deceiving me but there seems to be repetition of band patterns across 3 of the 4 GAPDH panels (spanning Fig. 1A and 1B). There’s a small artefact in a band towards the right which seems to repeat itself, and the pattern of band height and intensity seems very similar across the different gels.

      1. AMW: Right, it appears that the GAPDH bands are repeated. Someone needs to investigate that in detail. Did we all agree that it is OK to use loading controls across gels and throughout a paper?!! Sometimes we can even use the same loading control for different papers? Abnormal Science has discussed this issue in depth earlier. If this practice is not all right, you are hitting another big person in Texas.

      2. Is this what you mean?

        http://imageshack.us/photo/my-images/839/temptm.jpg/
        I have marked in a blue box the portion of the gel that Roman-Gomez borrowed. In the red box at the top I have put the GAPDH panel that is repeated, with fewer and fewer gel wells, down the figure.

        Having said that, this is more “naughty” than anything else. Since we aren’t looking at quantitative changes but just an on/off signal, the role of the GAPDH is just to demonstrate that everything worked. They still shouldn’t do it.

        With Roman-Gomez I would still be willing to give him the benefit of the doubt that what he published reflected what he found in his lab. But it is extraordinary.

      3. @ littlegreyrabbit – when controls are switched or missing, it’s not just ‘naughty’; the entire dataset becomes questionable. As you said, GAPDH is just to show that the experimental setup was working, so how do we know whether an absent signal in these samples means what the authors claim, or whether those particular samples simply failed for some other reason (e.g. poor-quality template) while the authors went ahead and copied GAPDH bands to make it look as though they had worked?

  8. Thanks Ressci, littlegreyrabbit

    Yes, http://imageshack.us/photo/my-images/839/temptm.jpg/ is exactly what I mean (now we’re talking about the Takahashi paper, which really needs to be dealt with separately). To my understanding these lanes are completely different samples and can’t be explained by being the same loading controls. If these are indeed the same images being copied and pasted, then this is fabrication, and one has to assume that people whose integrity is compromised enough to fabricate gel images in one section of a paper may have fabricated other elements of their paper. At the very least the authors need to explain what happened and correct or retract the paper.

    Moving back to the Roman-Gomez story, I just can’t see how there is any doubt about this being fraud. It’s inconceivable that a group would obtain data and then publish instead plagiarised images from other papers that somehow represented what they already had in their hands exactly… on three separate occasions. There ARE no gel data – that’s why they have to steal them, and/or use copy & paste. By extension, the clinical series referred to within these five fraudulent papers (with differential survival rates according to methylation status) must also be presumed to be non-existent.

    And there are almost certainly more papers out there from this group if an effort is made to find them…

    1. You might well be right on the non-existence of any actual data. It’s so easy to produce a gel with bands in the desired positions by just exchanging the samples / no-substrate controls in the PCR reactions that I wonder why they would bother faking it with Photoshop, where falsifications are easy to detect.

      1. I agree, but I am no longer surprised by the fact that people fake their data in easily detectable ways (although remember these authors also steal other people’s gels, which is incredibly hard to pick up).

        Firstly, I don’t think people who commit scientific misconduct (or indeed any form of criminality) think like ordinary people – they don’t generally think about the possibility that they will be detected (we can think of many examples from human history where people behave in ways that seem bizarre after the event).

        Secondly, there’s the big problem of ascertainment: the ‘careful’ fraudsters who falsify data in a way that can’t be detected by outsiders are only discovered if they are reported by a whistleblower; even then institutions can sweep the matter under the carpet if no one from outside is looking.

        So by implication, we only get to see obvious and indisputable cases of fraud of the type now coming to light with these authors.

      2. A fox in charge of the chicken-coop?

        I’d be interested to hear what the editor-in-chief of the Journal of Clinical Oncology, Stephen Cannistra, has to say on the issue of falsifications of the “easy to detect” variety, such as Photoshop jobs, versus the harder-to-detect variety, such as making it all up and assembling collages. Stephen Cannistra might also enlighten the reading public as to his view of plagiarism, in his own journal, by one of the deputy editors of his own journal. Apart from the retracted editorial there seem to be no other consequences.

        http://jco.ascopubs.org/site/misc/edboard.xhtml

        EDITOR-IN-CHIEF Stephen A. Cannistra, MD, Boston, MA

        Deputy Editor: Translational Oncology Mary L. Disis, MD, Seattle, WA

        Deputy Editor: International Editions David M. Khayat, MD, PhD, Paris, France

        http://www.retractionwatch.com/2012/01/05/jco-retracts-article-from-major-french-cancer-group-over-apparent-plagiarism/

        The retracted editorial was not by a “guest” editor, but by one of the permanent editors.

    1. Yes, that’s seven papers now by my calculations. In the Blood 2007 paper there’s again a little fleck in the gel that clinches it.

      Looks like we saw the JCO 2009 flipping at the same time – to me there are also other copy / pastes there which you can see at

      http://img687.imageshack.us/img687/1223/figure1labels.jpg

      The green one has a distinctive curly defect in the band in the left-hand lane.

  9. A 6th paper containing apparent fraud from Roman-Gomez et al. (and a third for JCO to sort out) showing that the problems span 2003 – 2009 (at least): Roman-Gomez et al., J Clin Oncol. 2009 Mar 10;27(8):1316-22,

    http://jco.ascopubs.org/content/27/8/1316.full.pdf

    Figure 1 – there are two and probably three pieces of image manipulation here which are labelled in the following site:

    http://img687.imageshack.us/img687/1223/figure1labels.jpg

    The most obvious case is the horizontal flipped image because there are three gel fleck artefacts which give the fraud away. The other two cases are less clear, but again there are small artefacts which look very suggestive of image duplication.

  10. Littlegreyrabbit – you suggest it is unlikely that the Roman-Gomez group invented everything, presumably because you can’t imagine exposing yourself to the risks that they took.

    But people who commit scientific fraud do not behave normally. This group steals gels from other labs’ papers and presents them as their own. When discovered (Blood 2004) they simply do what is necessary to carry on without admitting systematic fraud (issue an apology, presumably enforced by Blood). Someone else notices that a figure in a JCO paper is also stolen. No problem – they get away with replacing it with a new one. This is not normal behaviour.

    This time from what I can see (with a gel that doesn’t seem to have been stolen, only fabricated), JCO have realised something is up, and issued an Expression of Concern rather than simply allowing the correction (hence triggering this thread).

    So Retraction Watch seems to have uncovered a case of large-scale fraud that no one has acted on at the institutional or journalistic level. To me this is what Retraction Watch should be about; reports of ‘cut and dried’ cases are interesting but are only the tip of the iceberg.

    So the big question is – is anything going to be done? Would RW consider contacting JCO and other relevant journals to make sure something happens this time? A summary of the 7 cases of likely fraud uncovered so far can be found at:

    http://img42.imageshack.us/img42/3324/summaryq.jpg

    1. I completely agree with you… misconduct repeated over a period of time, and getting away with it, will only encourage such behavior. JCO and other relevant journals need to take a firm stand against such fabrications and fabricators. Interestingly, the stolen image has also exposed the Takahashi paper… that too needs to be dealt with.

      1. Thanks WB – reassuring to know there are others of like mind.

        Just to clarify the issue with respect to the Takahashi papers, what seems to have happened is re-use within the 2004 CCR paper (see below) of a gel image describing control data (GAPDH) for very different sets of lymphoid and hematopoietic cell lines / conditions (incredibly this is also one of the gel images stolen by Agirre / Roman-Gomez et al. and used in their 2006 Oncogene paper).

        Having come across this, I looked at a few other papers. I think Takahashi et al. also took the same gel image, flipped it vertically and claimed it to be the control GAPDH gel for a set of entirely different colorectal cancer lines described in two papers (IJC 2005 and 2006).

        This link shows what they did:

        http://img85.imageshack.us/img85/8323/takahashigelreuse.jpg

        Unfortunately I don’t think there can be an innocent explanation for false representation of data on two separate occasions. The GAPDH data are indeed the controls, but the first rule you learn in science is that the controls are the most critical part of the experiment. Again, this really needs to be investigated. If my above experience with the Roman-Gomez group is anything to go by, I’m not sure anything from this field is believable and there may be other instances of falsification by these groups.

        1. Takahashi T, Shivapurkar N, Reddy J, Shigematsu H, Miyajima K, Suzuki M, et al. DNA methylation profiles of lymphoid and hematopoietic malignancies. Clin Cancer Res. 2004 May 1;10(9):2928-35.
        2. Takahashi T, Suzuki M, Shigematsu H, Shivapurkar N, Echebiri C, Nomura M, et al. Aberrant methylation of Reprimo in human malignancies. Int J Cancer. 2005 Jul 1;115(4):503-10.
        3. Takahashi T, Shigematsu H, Shivapurkar N, Reddy J, Zheng Y, Feng Z, et al. Aberrant promoter methylation of multiple genes during multistep pathogenesis of colorectal cancers. Int J Cancer. 2006 Feb 15;118(4):924-31.

      2. amw – with regard to the Takahashi paper, I too find it difficult to believe that it is just an innocent mistake. If the entire GAPDH gel band had been reused I could have believed that they did it by mistake while selecting the images, but since they have gone on trimming the image to suit the number of samples they have, that seems highly unlikely. As for flipping the image and reusing it in another paper: just one look would tell anyone who has run a DNA gel that it is upside down from the pattern of the bands, so even if it was not known that it had already been published once, didn’t anyone wonder why it was flipped one way or the other?
        If they actually did the GAPDH controls, then why would they not show those? And if they did not do them, how do we know that the absence of bands is not due to a problem with the sample, because of which there was no amplification in those samples? After all, there is a reason for doing the GAPDH control!!

        This group too seems to be plagued with image issues and needs to be investigated by the journals / university concerned. Since both these groups work in the same area, I would think that they would have seen the paper from the Roman-Gomez group and should have recognized their own image. Did they remain silent about it because speaking up would have exposed their own image manipulation? Something seems to be terribly wrong here.

      3. @amw: it is a good job. It is so investigative that one case uncovers the next? Chain reactions everywhere. If the above Takahashi papers are proven, it might be similar to the MD Anderson case http://md-anderson-cc.blogspot.com/
        This may need help from Abnormal Science and/or 11jigen – both of them are quiet these days (http://www.blogger.com/profile/03513633746083109180). I guess… the senior author of the Takahashi paper is very famous… scary thought, though… let us wait and see how this progresses…

      4. My earlier response is still under moderation, maybe because of the links I have provided. It is a good job. As WB points out, the lab from which the Takahashi paper was published is a big group working in the field of clinical cancer research. It would be informative if 11jigen or Abnormal Science helped with this… hopefully, we will find something really interesting.

  11. So the Roman-Gomez (University of Cordoba) story now seems pretty clear as far as scientists are concerned – whether and how it will be dealt with by the institution and the journals is of course another question.

    There is now a concern over gels from the Hamon Center for Therapeutic Oncology Research, University of Texas Southwestern Medical Center. I would be very interested in people’s opinion on the following gels (for references see end):

    http://img696.imageshack.us/img696/6578/suzukibands.jpg

    http://img703.imageshack.us/img703/4101/suzukicloseup.jpg

    In the first image, featuring gels from different papers and samples, the two bands circled in red look identical. They drew my attention not just because of their similar intensities and shapes, but also because, unlike most other bands on these gels, one can see background above and below them. Most other bands on these gels are surrounded by total blackness.

    Most importantly, in the second image, I show what happens when you closely examine one of these bands, taken from the PDF file of Suzuki et al. Cancer Letters 2006. To me there is a very abrupt junction between the band and the lane on the right which is consistent with the band having been pasted in vertically from a separate image with different background.
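      For anyone who wants to compare two suspect band crops by something other than eye, here is a minimal sketch, assuming two equally sized grayscale crops saved locally (the file names are hypothetical; the crops would have to be made by hand from the published figures). A normalized correlation very close to 1.0 between supposedly independent samples would be hard to explain by chance, though this is only a crude screen, not proof.

      ```python
      # Minimal sketch: compare two equally sized grayscale band crops.
      # "band_a.png" and "band_b.png" are hypothetical crops from the figures.
      import numpy as np
      from PIL import Image

      def load_gray(path):
          return np.asarray(Image.open(path).convert("L"), dtype=float)

      def normalized_correlation(a, b):
          """Pearson correlation between two images of identical shape."""
          a = (a - a.mean()) / (a.std() + 1e-9)
          b = (b - b.mean()) / (b.std() + 1e-9)
          return float((a * b).mean())

      if __name__ == "__main__":
          a, b = load_gray("band_a.png"), load_gray("band_b.png")
          if a.shape != b.shape:
              raise SystemExit("Crops must have the same dimensions for this check")
          r = normalized_correlation(a, b)
          print(f"Normalized correlation: {r:.3f}")  # ~1.0 suggests near-identical pixels
      ```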

      As a secondary point, I also ringed a number of bands in green which also look identical. I should say that I would not normally have been concerned about these, but given what seems to be evidence of image re-use by this group before (the Takahashi papers referred to above), and the above concern about pasting of individual wells, I now wonder if this is the same band being pasted into different positions.

    Very keen to hear what people have to say about these. It would indeed be good for others such as 11jigen and Abnormal Science to comment.

    Refs:
    1. Suzuki M, Shigematsu H, Shames DS, Sunaga N, Takahashi T, Shivapurkar N, Iizasa T, Frenkel EP, Minna JD, Fujisawa T, Gazdar AF. DNA methylation-associated inactivation of TGFbeta-related genes DRM/Gremlin, RUNX3, and HPP1 in human cancers. Br J Cancer. 2005 Oct 31;93(9):1029-37.
    2. Suzuki M, Shigematsu H, Shivapurkar N, Reddy J, Miyajima K, Takahashi T, Gazdar AF, Frenkel EP. Methylation of apoptosis related genes in the pathogenesis and prognosis of prostate cancer. Cancer Lett. 2006 Oct 28;242(2):222-30.

  12. “In the first image, featuring gels from different papers and samples, the two bands circled in red look identical.”

    Not especially.

    In both this case and that of Roman-Gomez the PCR results shown are only indicative – to demonstrate that the assays work. The “meat” of their papers is genotyping (epigenotyping?) their cases and/or controls and/or library of cell lines and correlating these results with clinical outcomes. So the vast bulk of their gels are not shown, as they are not interesting.

    I have never done methylation assays myself, but they don’t read as being particularly complicated – so I don’t find the idea that labs would be faking them particularly likely. The Cordoba group had a problem obviously – but I don’t think it will turn out to be a very interesting problem. The Texas group just got lazy on one figure. If you think either group won’t be able to show lab journals with pages and pages of gels backing up their respective papers, then I think you will be in for a disappointment.

    I don’t know how useful this candidate gene approach is or was – (since these papers are relatively old) – but that is another issue.

    1. There seems to be a suggestion that the Roman-Gomez lab had large quantities of solid data on agarose gels, but presented stolen gel images in their papers. To me this is much harder to conceive than the alternative: that there are simply no gels.

      In one case, Roman-Gomez et al. stole a gel image from another group and used it in their 2004 Blood paper. The paper was retracted in 2009. The retraction says nothing about lab journals or pages and pages of gels (the ‘meat’ you refer to). To me it’s natural for anyone reading that retraction letter to assume that the paper was fabricated in its entirety, and that nothing is said about other gels because there were no gels at all.

      I agree that with the Texas group things are by no means clear, but the matter deserves attention and not sweeping under the carpet. The idea that re-applying images to new samples is only ‘lazy’ seems to me to undermine fundamentally what science is about. Just because a technique is easy does not mean the authors are doing it. These gels are there to provide the referees and readers with evidence that these authors are capable of obtaining robust data with their system, before the results from wider application of the method are presented. If they have to copy and paste, or flip horizontally old gel images, to produce just one example gel, this suggests to me that they are not actually producing any data.

      In general terms, I know that to ordinary people the idea of stealing images from other papers, or your own old gels, might seem odd when it should easily be possible to generate new gels by less detectable means. But people who commit research fraud are not ordinary. Scientists who commit fraud are leaving behind all their years of lab training and academic development; they also have to deal with a team of people who need to be persuaded to go along with it (or need to be kept unaware of it). It should not be a surprise that people don’t commit fraud in the most ‘logical’ ways. I thought most people were wise to this now given the large number of proven cases of systematic fraud where the evidence of fraud was very obvious to anyone who looked carefully at the papers.

      The specific issue of the red bands is indeed not cut and dried, but it is suspicious; these bands do look very similar. This is exactly why I also pointed out that in the second paper the band has what looks very much like a splicing artifact, which would be consistent with copying and pasting it from another gel. As I said, it would be helpful for readers to look at that issue:

      http://img703.imageshack.us/img703/4101/suzukicloseup.jpg

    2. @ littlegreyrabbit – the methylation PCRs for patient samples are indicative, and no other control is required as long as they have a +ve and a -ve sample, since the DNA can either be methylated or unmethylated. So if they do not get a signal in either reaction, that would imply a problem with the template / DNA sample; in that sense the assay is internally controlled.

      The GAPDH bands (Takahashi et al) republished as Abl bands (Agirre et al, Oncogene 2006) are RT-PCR data to show the correlation of expression with DNA methylation status (Fig 1b and d, Oncogene 2006, and Takahashi et al). These are not from patient samples but from cell lines, and you will find the same band being applied by both groups to different sets of cell lines / treatment conditions, which is definitely not an acceptable practice. Here it has the same role as b-actin / GAPDH in data normalization for qRT-PCR, even though they have shown it as detectable versus undetectable levels of mRNA (non-quantitatively).

      As for why someone would fabricate when they could actually do the experiment, that is a question only the fabricators can answer. So far, what has been caught is not due to the inability of the people involved to perform the technique, or to their not using it regularly (they all knew how to do Western blots, PCR, histology, etc.), or to their not doing the work (read about the long hours the postdocs put in at some labs, yet they still made up the figures in their papers). In every case it does feel like: why even put in this much effort when everything could be made up with little possibility of detection unless someone tries to replicate the experiments? These people could serve as case material for psychological studies of abnormal human behavior.

  13. More evidence of image duplication from the Texas Lab. Kind of obvious once you notice it.

    http://img804.imageshack.us/img804/1761/takahashi2004fig2.jpg

    Or would anyone like to suggest the red images are not the same? Bear in mind this is the second separate instance of image duplication in the same paper (both Figures 1 and 2 are affected), and further that the authors appear to have duplicated images across publications relating to very different samples.

    I don’t see how it is now possible to have faith in anything this lab has produced in this area (DNA methylation in a variety of cancers) until a very simple question is answered: does this lab have gel data for these patients or not? That should not be a difficult question to answer. If they do, good news – but they still need to explain the duplicated gel images we are seeing, and correct the relevant papers. If not, they need to retract the relevant papers. I don’t see this as being very complicated.

    As this is the US, there is a clear process for getting this question answered.

  14. Maybe I am hallucinating… the leftmost panel for p16UM appears to be a mirror image of the middle and last panels (with some addition/deletion of lanes here and there). I might be wrong – sorry for that.

    1. I know what you mean given the curve and general appearance of the bands, but considered on its own the leftmost panel looks fine.

      But the idea that there might be lane splicing is reasonable; I already pointed out an image from this group with possible splice artefact (see above).

      Along these lines, a more recent paper from this group contains a gel with a strange appearance if you look closely (needs a clean screen!). I would be interested to know what people think about the upper panel of:

      http://img195.imageshack.us/img195/7833/suzukiaso2007fig.jpg

      1. I believe a case can be made for splicing of several bands in the upper panel. The lower panel also looks like it’s been fiddled with, probably a non-specific band that might have been erased (the clear black region below the bands looks very different from the region above them).

      2. To me too it does seem that the lanes in the top panel of the image in the link have been spliced together, as the lane background extends horizontally beyond the well. This possibly explains why they remained silent when someone else was taking their images.

  15. Ressci Integrity, regarding that left-hand curved panel (p16UM) in the lymphoid / haematopoietic malignancy paper from Takahashi et al. 2004, it does turn out to be relevant; the whole span of bands was re-used by the same authors in a paper on colorectal cancer two years later:

    http://img198.imageshack.us/img198/5770/takahaship16um.jpg

    Regarding the Texas lab, there is now clear evidence of image re-use across different sample sets and on multiple occasions, and two convincing cases of gel-splicing artefacts where specific lane images appear to have been inserted. The issues affect both methylation PCRs and controls.

    The more one goes into this, the more it resembles fraud on a large scale.

  16. Well, regretfully the new image is not good either. Panels labeled “M” in both sFRP-2 and sFRP-5 are upside down (the band whiskers point down, rather than up as they should). Potential concern?

  17. Thanks – no surprise that another of Roman-Gomez’s papers has been found to contain an image taken from a completely different paper from another group. For those who follow the morbid details, the Roman-Gomez 2007 paper discusses 7 genes, but the valid Battagli paper had only 6 in its figure. He got around this by taking the 6 panels from Battagli, but also repasting the third panel (horizontally flipped) at the bottom to make a 7th. This could be termed plagioduplifabrication.

    I wonder how this was discovered since there is still no evidence that University of Cordoba is undertaking an investigation. Instead we have piecemeal retraction / correction / expression of concern. This is extremely unsatisfactory because while all of the methylation work originating in Cordoba seems to have been fabricated, there are wider questions. Were others involved? It is hard to believe that no one else knew about this. And does more recent work involving Roman-Gomez and other Spanish universities also involve misconduct?
