Retraction Watch

Tracking retractions as a window into the scientific process

Lifted figure prompts retraction of Oncogene paper by Roman-Gomez

The journal Oncogene has retracted a 2005 paper from a group led by Jose Roman-Gomez, a Spanish researcher who appears to be a serial image manipulator/misappropriator.

The article, “Promoter hypomethylation of the LINE-1 retrotransposable elements activates sense/antisense transcription and marks the progression of chronic myeloid leukemia,” was published online in September 2005 and has been cited 106 times, according to Thomson Scientific’s Web of Knowledge.

Now comes this:

The authors wish to retract the November 3, 2005, paper cited above. Figure 6a in the published manuscript was inappropriately reproduced from figure 1 in a previously published paper from Rosas SL, Koch W, da Costa Carvalho MG, Wu L, Califano J, Westra W, Jen J, Sidransky D. Promoter hypermethylation patterns of p16, O6-methylguanine-DNA-methyltransferase, and death-associated protein kinase in tumors and saliva of head and neck cancer patients. Cancer Research 61, 939–942, 1 February 2001. The manipulation of the figure was performed by Dr Roman-Gomez. None of the other authors were involved in or were aware of these events. The authors apologize to the readers, reviewers, and editors of Oncogene for publishing this erroneous image. All the authors agree to the retraction except Dr Roman-Gomez.

In two previous expressions of concern, the Journal of Clinical Oncology has allowed Roman-Gomez’s group to keep its papers despite the questionable figures. But Oncogene evidently felt the best solution in this case was retraction — a decision with which we’re inclined to agree given the history here.

We tried to reach Oncogene’s editor for comment and will update this post if we learn more.

We spoke with Sidransky, who was surprised to hear about the purloined figure — although it wasn’t the first time the Hopkins researcher has been on the receiving end of such a misappropriation.

Perhaps, we wondered, he might consider making his own figures a little less attractive…

Written by amarcus41

February 8, 2013 at 11:00 am

17 Responses

  1. I’ve seen re-use of figures from one’s own lab but stealing something from another group is a new one for me.

    StrongDreams

    February 8, 2013 at 11:27 am

    • Why should it be different from the rest of human activity?

      fernando pessoa

      February 11, 2013 at 6:23 pm

  2. Yes – it happens – you could call it ‘plagiofabrication’ or just simply ‘poaching’. Roman-Gomez has taken entire images from the papers of at least 3 different groups. It’s virtually impossible to detect (note publication 8 years ago) as the images look perfect – unless you were the one who produced the originals…

    For sure this paper requires retraction, but one could ask wider questions – was ANY of the work in this paper actually undertaken? Do the patients described exist? Put another way, is Roman-Gomez the fall guy for institutional fraud? When did the fraud stop? Surely other unretracted papers must still exist. As no institutional committee is involved, how can anyone know what happened?

    Without answers to these questions, the community should simply regard everything from this group, past and present, as unreliable.

    amw

    February 8, 2013 at 1:31 pm

  3. It is interesting to read his answer in a recent interview after receiving a research prize two years ago.
    Q What would you say to your MD colleagues to encourage them to do research?
    JR-G The important thing is to get started in research. Research, they say, is almost an ethical matter: bringing the experience of what you do and know to the benefit of other colleagues and patients, a wonderful way to improve clinical practice and a great antidote to burnout at work.
    (Probably getting the figures from other papers was less exhausting.)

    Pablo

    February 8, 2013 at 2:26 pm

  4. This got me musing – what if the thief inadvertently stole falsified data from another paper? I’d pretty much define irony that way.

    Reminds me of the transgenic sorghum that (eventually) never was, but that somehow generated strong positive results in subsequent papers.

    BoDuke

    February 8, 2013 at 2:31 pm

    • My previous comment absolutely in no way refers to the situation reported in this blog post, by the way. As I said, just musing on a Friday afternoon.

      BoDuke

      February 8, 2013 at 2:32 pm

    • See RW post for 5/18/12 — pretty close to your speculation.

      Toby White

      February 11, 2013 at 1:34 pm

  5. Oncogene. 2004 Mar 18;23(12):2177-87

    http://www.nature.com/onc/journal/v23/n12/pdf/1207327a.pdf

    My eyesight is getting weaker. Confusion sets in with old age. O me miserum!

    Could somebody explain figures 1a and 1 b?

    fernando pessoa

    February 8, 2013 at 3:35 pm

  6. There are automated tools to identify near-duplicate text; why isn’t there something like this for figures?

    Morgan Price

    February 8, 2013 at 6:59 pm

    • Great idea.

      I did try the ‘search using image’ option on Google Images and I did get one hit with these Roman-Gomez poaching examples.

      But generally the outputs are crudely similar e.g. DNA gels pick up black & white photographs of almost anything.

      amw

      February 8, 2013 at 7:32 pm

      There isn’t. Yale has a database of 1.2 million images, but it didn’t find the SW Lee reuse of images. You have to do as the departmental secretary said: “look at all the papers”. There isn’t an “app” for it. Sometimes the images are altered: “re-sized” like French fries, stretched, or with things added to distract your eye, so a computer may have a hard time. It is not the best of all worlds.

      fernando pessoa

      February 8, 2013 at 8:53 pm

      • I agree – most cases of image fraud will pass through peer review – how often do we hear the phrase ‘eagle-eyed’ to describe those who find problems?

        Perhaps we shouldn’t be too depressed about the lack of such a tool – if for the sake of argument it existed, and it was applied by journals at the review stage, what would happen? The journal could only point out to authors that their images appeared to be the same as unrelated images. In some cases this might have a positive effect and alert authors until then unaware of the fraud, so that the deeper issue of why images were being duplicated could be dealt with. But I suspect in most cases the authors would simply find another image and resubmit (perhaps to another journal) and be more careful with the fraud next time.

        A relevant additional argument is that most fraud doesn’t involve images but falsification / fabrication of numerical data – this is impossible to spot (although error analysis can help in theory). I imagine, but have no evidence to back this up, that data fraud dwarfs image fraud in frequency.

        Continuing this argument, image fraud is actually a specific marker of a fundamentally flawed approach to science. Put another way, the sections of a paper referring to images aren’t the real problem. For example the figures that Roman-Gomez poaches are often claimed to be ‘representative’ e.g. 10 patient samples out of 500. What’s more important to consider is that if the representative figure is stolen, there can’t be any figures for the other 490 patients i.e. the whole thing is a fiction. So image fraud is a marker of deeper problems. It also completely undermines the review process. Unfortunately, many readers, institutions and journal editors misunderstand this point (perhaps deliberately) and allow authors simply to replace individual falsified images without further ado (as noted above, Roman-Gomez has actually managed to do this on a couple of occasions with other instances of image fraud – incredibly there is still only an erratum for Roman-Gomez et al. J Clin Oncol. 2005 Oct 1;23(28):7043-9).

        If I had time (and no fear for my career) it would be relatively easy to examine a large number of cases of clear image fraud (I imagine there are at least 20-30 stories, with probably over a hundred papers, on the Science Fraud website) and describe how journals handle the problem i.e. do they insist on retraction or do they allow correction? One could write it in non-judgmental language without using the word ‘fraud’ and compare responses in different journals and fields; from what I know JBC would score ‘fastidious’ as they now insist on retraction while Nature would probably come out ‘easygoing’ (I think there have been at least 5 cases where they allowed megacorrections of false images in the last couple of years – who remembers the one about the problem being due to poor record-keeping?).

        If journals and institutions are mishandling and misunderstanding image fraud, which we can at least see as outsiders, you can be sure they are doing the same and worse for data fraud, which I am sure we never hear about. To me that is why image fraud remains the clearest window we have into understanding scientific misconduct.

        amw

        February 9, 2013 at 2:04 am

        • “image fraud is actually a specific marker of a fundamentally flawed approach to science”.

          “So image fraud is a marker of deeper problems. It also completely undermines the review process.”

          “image fraud remains the clearest window we have into understanding scientific misconduct.”

          Couldn’t say that better.

          fernando pessoa

          February 9, 2013 at 10:14 am

        • I did post this on the SW Lee page, but perhaps it is now buried.

          http://retractionwatch.files.wordpress.com/2013/02/cancer-res-2001-yellow-arrow-1.ppt

          It is relevant to Oncogene, as one of the publications is in Oncogene.

          How to explain?

          fernando pessoa

          February 9, 2013 at 10:29 am

    • There is at least one Photoshop plug-in to investigate manipulation within individual images (splices, erasures, etc.). Unfortunately the one I found is very expensive. Finding manipulations across publications would require a large database of identified images. I suspect the plagiarism detection sites built up their databases via volunteer submissions (you submit text, it becomes part of their database for the next person). I don’t know how one would make a comprehensive database of scientific images.

      StrongDreams

      February 15, 2013 at 12:31 pm

  7. I couldn’t agree more with amw. Terms like “image manipulation” or “misappropriation” can easily serve to hide fraud and fabrication of data. The distinction between mere plagiarism – presenting someone else’s ideas or words as your own, and copying data or images, which involves stating that you did experiments that you didn’t do, is enormous. Retraction notices must start making this distinction. Authors for whom the latter category of misconduct is proven should simply be suspended from science – publication, grants, conferences etc until a thorough investigation of all their work and that in their lab has been undertaken.

    Michael Kovari

    February 9, 2013 at 9:29 am

  8. and there is more: http://www.ncbi.nlm.nih.gov/pubmed/22898606 and http://www.ncbi.nlm.nih.gov/pubmed/19264925
    maybe all of his papers should be reconsidered?

    Leonid Schneider

    February 19, 2013 at 5:12 am
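Several commenters above ask why no automated tool flags near-duplicate figures. One common approach to the problem is perceptual hashing, which reduces each image to a short bit string so that visually similar images yield similar hashes. The sketch below is a toy Python illustration of “average hashing”, assuming a grayscale image supplied as a 2D list of pixel values; it is not any journal’s actual screening pipeline, and real tools (such as the pHash library) decode image files and use more robust transforms.

```python
# Toy sketch of perceptual "average hashing" for near-duplicate image
# detection. Assumes a grayscale image as a 2D list of pixel values,
# at least `size` pixels in each dimension.

def average_hash(pixels, size=8):
    """Downscale to size x size by block averaging, then emit one bit
    per cell: 1 if that cell is brighter than the overall mean."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # Average the block of source pixels mapped to cell (r, c).
            r0, r1 = r * h // size, (r + 1) * h // size
            c0, c1 = c * w // size, (c + 1) * w // size
            block = [pixels[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if v > mean else 0 for v in cells]

def hamming(h1, h2):
    """Count differing bits; a small distance suggests a near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))
```

Hashes of exact or lightly altered copies (uniform brightness shifts, rescaling) differ in few of the 64 bits, while unrelated images differ in roughly half of them. As the thread notes, the harder obstacle is not the hashing but the absence of a comprehensive database of published figures to compare against.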

