Cell reviewing allegations of image reuse in human embryonic stem cell cloning paper

Cell is looking into whether the authors of a widely hailed study published last week, claiming to have turned human skin cells into embryonic stem cells, manipulated images inappropriately, Retraction Watch has learned.

The potential image problems came to light on PubPeer, a site designed to allow post-publication peer review. A commenter, identified as Peer1, noted “several examples of image reuse which might be of interest to PubPeer members and readers:”

– Fig. 2F is a slightly cropped version of the cell microscopy image in Fig. 6D top left.

– In Fig. 6D top right, the cell microscopy image is a slightly cropped version of supplementary Fig. S5, top right. The cells in 6D are labelled as “h-ESO-NT1 Ph”, yet in Figure S5 they are labelled as “hESO-7”. We understand the former to inherit caffeine-treated somatic nuclei, whereas the latter are original stem cells.

Under pressure to assemble the figures for rapid publication, one can understand a cut-and-paste mistake during figure assembly. Nevertheless, it should be noted that image cropping does take extra work.

– Figure S6 top centre and top right are the same image.

– Figure S6 middle left and lower right are reported to be biological replicates of microarray expression quantitation. In those cases, however, the narrow spread indicates that the data are extremely similar and are only understandable as technical replicates (where the same RNA sample is hybridised to two different arrays). It is useful to do technical replicates to control experimental reproducibility, but biological replicates are more valuable when reporting results. They are not the same thing and should not be conflated. (For the record, we did check the microarray data deposited at the Gene Expression Omnibus (GSE46397).)

Cell tells us that

Our editorial team is currently assessing the allegations brought up in the PubPeer piece.

It’s not clear whether these alleged manipulations are central to the claims of the paper. We’ll keep an eye on this and update as we learn more from Cell.

Please see an update on this post, with more details from Cell.

Hat tip: Ed Yong

69 thoughts on “Cell reviewing allegations of image reuse in human embryonic stem cell cloning paper”

  1. That sounds like the template reply I’ve gotten from Nature/Cell/etc. when I’ve brought up issues with other papers in the past. I never know if “currently” means that someone else contacted them previously about the same issue (i.e. it’s not news to them) or that they are currently assessing the issue as a result of my message…

  2. Looking for “dead giveaways” (thank you, Charles Ramsey!), just eyeing the mtDNA genotyping in Fig. 6B raised an eyebrow. Look carefully at the Oocyte donor sequence and you will see that the text and sequence letters are separated from the plot, while all other plots have the text and letters directly adjacent to them. Why?
    Supplementary Fig. 2E of the Nature paper (ref below), Egg donor plot 6 (also a bit like Egg donor 7, but 6 more similar vertically) looks suspiciously similar to the Oocyte donor plot of Fig. 6B of the Cell paper. The heights of the hills and the distances between the hills and the valleys are suspiciously analogous.

    Towards germline gene therapy of inherited mitochondrial diseases.
    Tachibana M, Amato P, Sparman M, Woodward J, Sanchis DM, Ma H, Gutierrez NM, Tippner-Hedges R, Kang E, Lee HS, Ramsey C, Masterson K, Battaglia D, Lee D, Wu D, Jensen J, Patton P, Gokhale S, Stouffer R, Mitalipov S.
    Nature. 2013 Jan 31;493(7434):627-31. doi: 10.1038/nature11647. Epub 2012 Oct 24. PMID: 23103867

    1. Those are sequence data, and they were already extensively normalised by the sequencing software.
      This software tries to normalise the distances between the signals from each base (= hill) and their heights somehow.
      So I don’t agree with Junk Science’s argument about plot similarity.

      1. @Shin-ichiro Hiraga. You might be right, but I would argue that despite the normalization it is very unlikely that you get exactly the same plot as in the Nature paper. However, my main point, and the central question in my previous comment, is why do all the other sequence plots have their sample text etc. next to the plot, while one doesn’t? It seems like the raw data have simply been pasted in, except in one case, which makes it look very suspicious. And I see now that it is not the Oocyte donor but the hESO-NT1 that is the odd one out, so I would like to correct my first comment. If this paper had been properly reviewed, this might have been one of the questions asked.
        An analogy would be if I had been asked to review an earlier Stem Cells paper (ref below) by Mitalipov: I would have asked why there are so many identical numbers in Table 2 of their gene expression experiment. The fold changes range from 14 to 3326 for the 25 genes, and it just seems very improbable that so many numbers are identical. It might be some normalization process, but I would like to hear from the authors, because it seems very unlikely to have so many identical values in that table (e.g. 415 eight times, 675 seven times, 588 six times, 1024 six times, 222 five times, etc.).

        Isolation and characterization of novel rhesus monkey embryonic stem cell lines.
        Mitalipov S, Kuo HC, Byrne J, Clepper L, Meisner L, Johnson J, Zeier R, Wolf D.
        Stem Cells. 2006 Oct;24(10):2177-86. Epub 2006 Jun 1. PMID: 16741224

        1. Sharp observation. Table 2 in that paper does have a very curious frequency of individual values. It seems rather improbable; I doubt you would find many values to be exactly the same with that kind of spread. Here is the total breakdown of the 125 values in that table.

          1 times 2194, 3326, 1782, 3104, 2048, 2702, 2896, 1351, 14, 25, 1176, 194, 24, 42, 724, 238, 111, 97, 45, 84, 147, 137, 128, 68
          2 times 2352, 194, 477, 1260, 776, 630, 181
          3 times 548, 477, 315, 337, 512, 207
          4 times 1097, 274, 388, 362, 445, 955, 294
          5 times 256, 222, 168
          6 times 1024, 588, 675
          8 times 415
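          A tally like the one above is straightforward to reproduce. Here is a minimal Python sketch – using made-up stand-in numbers, not the actual Table 2 values:

```python
from collections import Counter

# Stand-in fold-change values; the real numbers are in Table 2 of the paper.
values = [415, 415, 675, 1024, 588, 222, 415, 3326, 14, 675, 588, 415]

counts = Counter(values)  # how often each value occurs
by_freq = {}              # group values by their frequency
for value, n in counts.items():
    by_freq.setdefault(n, []).append(value)

for n in sorted(by_freq):
    print(f"{n} times: {sorted(by_freq[n])}")
```

          Running the same tally over the real table should reproduce the breakdown above.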

      2. Another example, are the gels for Hetero-2 and Hetero-3 identical in Supplementary Figure S1C (ref below, open access so please check it out)? Looks very similar and they have for example the white dot below the second to last band at exactly the same location.

        Rapid mitochondrial DNA segregation in primate preimplantation embryos precedes somatic and germline bottleneck.
        Lee HS, Ma H, Juanes RC, Tachibana M, Sparman M, Woodward J, Ramsey C, Xu J, Kang EJ, Amato P, Mair G, Steinborn R, Mitalipov S.
        Cell Rep. 2012 May 31;1(5):506-15.
        PMID: 22701816

        1. Junk Science – good find. I’ve just looked very carefully at the entire gels.

          S1C available here:
          you’ll need to scroll down to supplementary material and download the 1Mb file.

          The entire hetero-2 gel is identical to the entire hetero-3 gel. The hetero-3 gel has slightly increased exposure with decreased brightness.

  3. Way to go PubPeer!

    Regardless of whether the “manipulations are central to the claims”, manipulation per se has no place in science, so if these allegations prove true, the paper should be retracted, or at least subject to an expression of concern visible on PubMed.

    It’s also notable that the “comments” tab on the enhanced version of the paper at the Cell website says there are 7, but click on the tab and there are “no items to display”, suggesting they removed the comments following this event. Good to see nice open dialog is alive and well elsewhere on the Internet… catch up, old publishers, you’re being left behind.

    1. The seven missing comments looks bad. However, in this case the appropriate axiom may be:

      “Never ascribe to malice that which is adequately explained by incompetence.”

      I had tried to read the comments and they were not working earlier this week, before the “event”. Apparently just broken – unless these issues had first been raised there of course…

  4. Cell also removed the PDF file of a suspect paper from its website before issuing a correction for image duplication. It seems that they want to hide an inconvenient truth.

    Cell. 2012 Jun 8;149(6):1298-313. doi: 10.1016/j.cell.2012.03.047.
    Complement C1q activates canonical Wnt signaling and promotes aging-related phenotypes.
    Naito AT, Sumida T, Nomura S, Liu ML, Higo T, Nakagawa A, Okada K, Sakai T, Hashimoto A, Hara Y, Shimizu I, Zhu W, Toko H, Katada A, Akazawa H, Oka T, Lee JK, Minamino T, Nagai T, Walsh K, Kikuchi A, Matsumoto M, Botto M, Shiojima I, Komuro I.
    Department of Cardiovascular Medicine, Osaka University Graduate School of Medicine, Japan.


    1. So, arguably the two most prestigious science journals on Earth let this pass through their review process. I wonder how many grants have been written based upon this work.

      For those reading this with no real scientific interest – it’s equivalent to winning the heavyweight boxing gold medal at the Olympics, only to find out later that the gold medallist had an iron bar in his gloves.

      We need PubPeer, but it would be better to have a site where we could see something like big red arrows or equivalent (as shown in jigen’s blogspot above).

      1. Peer 1 scores a bullseye!

        Well, apparently some retractions prompted by PubPeer will soon follow – and it is such a recent initiative.
        I do think some peers will be capable of finding sites to host their images showing red arrows or whatever, and of posting all the relevant links under their comments in PubPeer for interested readers to see.
        One good example is the excellent site http://www.anony.ws/.

  5. This is nuts. Given the history of other polemical publications in this area, my mind boggles at the thought that any or all of the authors thought they could get away with this. Supposing it’s the ubiquitous “errors during figure assembly” – and I say that with a few kilos of salt thrown in – you’d think that someone along the chain of authors would have spent some reasonable time triple-checking every single figure. As for the editors and reviewers, they should hang their heads in shame.

    1. Yeah, this is a really important point – there’s already been outright fraud in stem cell research, and you’d think, given all the hype in this field, people (especially editors) would have learned to be more suspicious. So depressing to see the same things happening all over again.

      1. Suspicion in stem cell research has also been raised about the research of the Nobel Prize winner (Physiology or Medicine, 2012) Shinya Yamanaka, MD, PhD.

        EMBO J. 2000 Oct 16;19(20):5533-41.
        Essential role of NAT1/p97/DAP5 in embryonic differentiation and the retinoic acid pathway.
        Yamanaka S, Zhang XY, Maeda M, Miura K, Wang S, Farese RV Jr, Iwao H, Innerarity TL.

        In Fig.3A, “+/-” band is similar to “-/-” band.


        1. I am seeing this in utter disbelief after viewing

          “Shinya Yamanaka received the Wolf Prize in Medicine in 2011 with Rudolf Jaenisch; the Millennium Technology Prize in 2012 together with Linus Torvalds. In 2012 he and John Gurdon were awarded the Nobel Prize for Physiology or Medicine for the discovery that mature cells can be converted to stem cells. In 2013 he was awarded the $3 million Breakthrough Prize in Life Sciences for his work”

          A lot more skilled science-fraud hunters are needed.

    2. BoDuke

      The excuse that the authors did not have much time putting figures together doesn’t ring true. We go through figures even for internal presentations with a fine-tooth comb.

      For conferences it’s practice after practice, every word on every slide checked by multiple people. For publications we go through raw data together, and the figures are checked by the team, as are the statistics. Mistakes do happen, but they are always spotted by the team.

      We often send potential publications for higher impact journals to professors/senior academics etc to check prior to sending off. Rejection hurts less when we have approval of an expert! 🙂

  6. As a science journalist, I was wondering why this complex paper was supposedly received on April 30, 2013, revised on May 3, 2013, accepted on the SAME DAY, and published two weeks later. How can scientific review happen in such a short time frame? Am I missing something here?

    1. Well, Journals like Cell / Nature / Science compete for publishing high-impact papers, for instance by approaching Authors at conferences and offering to consider their papers. They may offer fast review and select reviewers who offer to review the papers in a short time. It is in the Journal’s interest to start the citation clock ticking asap.
      Mind you, this does not necessarily mean they do a bad job. One can easily thoroughly review a paper in one or two days, and if the Authors then submit a convincing revision, it is logical the Editor accepts it on the same day.
      (as a Reviewer, having a paper on your desk for three weeks and then reviewing it in half an hour does not mean you did a good job…)
      Of course, in this case, it appears clear there were some oversights by the Authors, the Reviewers and the Journal – let’s see whether they turn out to be important for the central claim.

    2. You’re basically saying it yourself: you’re missing the peer review process for this paper.

      If we just take peer review step by step:
      1. It is unlikely that the paper was assigned to reviewers on the day the manuscript was submitted.
      2. It is unlikely that the reviewers completed their reviews the day they received the manuscript.
      3. It is unlikely that the initial decision on the manuscript was made the day of receiving the reviews.
      4. It is unlikely that the revision was completed the day the reviews were communicated to the authors.
      5. It is unlikely (but this is still the most likely) that the final acceptance came the same day as the submission of the revised manuscript.

      I’m seeing at least five days. With epsilon probability of decent peer review actually being done.

      1. I don’t agree, 90% of “peer review time” is the manuscript waiting on someone’s desk. If an Editor is really interested in getting a paper published fast, peer review CAN be done thoroughly in 3 days, the rest is administration, and can be done in hours. I’d dare to wager plenty of good papers were accepted in a few days – of course, this does not appear to be one of them, here some corners were clearly cut.

      2. PS As Editor of a couple of lower-IF Journals, I make most decisions on the same day of receiving the reviews or revisions.

    3. I don’t think peer review is an issue here. The purpose of peer review is not to check for cheating. The significance of the finding is beyond doubt, the authors have the responsibility for the quality of the work.

    1. The figure in question appears to represent technical replicates from within the same cell line and it is pretty clear from the text that the authors are actually assessing this deliberately, but they use the term “biological replicate” incorrectly. However, with cell culture, the issue of technical vs. “biological” replicates is not straightforward given that there really is no such thing as a biological replicate in a cell culture system. This is why in the text they state:

      This assay demonstrated 99% transcriptional correlation within each cell type, suggesting that minimal variations existed between biological replicates collected from different culture plates (Figure S6)

      Different culture plates does not represent biological replicates; it’s the same thing just grown on different days, or in different plates. No way around this in cell culture; every cell culture experiment is a technical replicate if the same line was used, regardless of how many days or weeks separate the experiments, or how many different plates were used. Thus, the only “mistake” the authors appear to have made here is in using the phrase “biological” when that is clearly not the case (nor is it even possible). Nothing fraudulent or suspicious about it in my opinion and, to me, what they have done in S6 is important. Personally a correction is not even needed for this since it is obvious from reading the manuscript what is going on.

      People here should read the text of the actual paper…

      1. While this may be your opinion of what a technical replicate and a biological replicate should be – it is not the generally accepted usage of the words. This sets out the issue very well:
        “Different culture plates does not represent biological replicates” – in a cell-based system, yes it does.
        I should add I have never done gene expression arrays, so I am a little hesitant with the data, but I have always found gene expression levels – individually assayed – can be quite varied from experiment to experiment. Are the numbers in the files in this instance on a logarithmic scale?
        Unfortunately when I work, I work long shifts and I have not yet had time to read the paper or try some analysis myself – hence I was interested in others opinions.

        Just on another point raised – I heard mention that it should be easy to test whether the nuclear DNA is from the donor cell line and the mitochondrial DNA is from the embryo line. While I think it is a little far-fetched, hypothetically this could be faked.
        Another test would be whether the nuclear DNA is from the donor line and the cells are pluripotent. Or maybe chromatin ChIP assays to look at chromatin structure between the donor and the derived cell lines. However, I expect in the fullness of time all ambiguity will be removed.

        1. This may be my “opinion”, but I can assure you that I am correct. The opinion of a statistician who doesn’t seem to understand the nature of cell culture experiments does nothing to sway me. If I repeat an experiment using the same cell line on a different day, but just one passage later and in a different culture plate, and I use the same media, plastics, reagents etc., where exactly is the biological variability? The only “variability” I am testing is my ability to accurately recreate an experiment with the passage of time and, thus, this is purely technical. It’s still valid variability, to be sure, but it’s not biological in any way. Of course, cell lines can change over time and over passages, and I will entertain that discussion to a point, but this can and should be controlled for by limiting passage number and by handling the cells properly (confluency etc.). But, in my opinion, this still falls under technical variability, and any “biological” variability would be artificial and artefactual in this case. The only way cell culture experiments incorporate true biological variability is if a different cell line from the same tissue, immortalized from different sources, is used to validate the results. Very few researchers will do this, for perfectly valid and understandable reasons.

          The bottom line is this: by definition there can only be technical variability in cancer cell line experiments when a single cell type is used. Period. End of story.

          1. “This may be my “opinion”, but I can assure you that I am correct.”
            iloveresearch, demonstrably that is not how the term is actually used by the scientific community, however incorrectly. Since language is about communication, if the general (ill-educated) consensus is that biological replication can be applied to cell culture, then when you use that term you have to abide by that consensus in order to communicate. If you wish to use a term in a non-standard way, then you need to define it in your text. As it is, the dispute is not about what is a technical and what is a biological replicate – but that the data match so closely that they must represent 1 experiment and 1 RNA purification, probed on 3 different chips – which is indisputably technical replication.

            Now you ask how a cell culture could possibly present different results if treated exactly the same; all I can say – conceding I was a woeful scientist – is that it just does. For example, let’s say I did a control and 3 treatments for an RNA expression experiment, and for whatever reason the RNA purification for one of the treatments didn’t work. I couldn’t just do that single treatment a few days later and slot it into the other data – the variability is just too great. You have to repeat the entire experiment.

            but the data match so closely that they must represent 1 experiment and 1 RNA purification, probed on 3 different chips

            No, no it doesn’t. If I plate out the same cells in 3 different plates on 3 different days, extract the RNA from each and run a microarray or RNA-Seq experiment, I would expect very low variability. That’s what they show.

      2. iloveresearch, have you looked at this type of data before? This high level of correlation between two of the replicates in each line may be possible but it is unlikely.

        It seems possible that RNA extracted from two cultures was used, and that on two of the microarrays for each line, samples from the same culture were hybridized. Even if we agreed on the way you use the terms “biological” and “technical” replicate, would you agree that if this were true, these are two different types of technical replication and thus misleading? Your distinction does not explain the differences in the numbers of off-diagonal points between the lower-left and lower-right panels, for instance.

        In addition the upper-middle and upper-right panels are clearly duplicates even though they show different correlations.
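        The replicate question the thread keeps circling can be made concrete with a quick simulation: correlate one expression profile against a same-RNA rehybridisation versus an independent culture. Here is a sketch in pure Python with invented noise levels (not the paper’s actual array data):

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
# Hypothetical log-expression values for 1,000 probes in one sample.
sample = [random.gauss(8.0, 2.0) for _ in range(1000)]

# Technical replicate: same RNA rehybridised, only small measurement noise.
technical = [v + random.gauss(0.0, 0.1) for v in sample]
# Biological replicate: an independent culture, so much larger variation.
biological = [v + random.gauss(0.0, 1.0) for v in sample]

print(f"technical  r = {pearson(sample, technical):.3f}")
print(f"biological r = {pearson(sample, biological):.3f}")
```

        With these assumed noise levels the technical replicate correlates near-perfectly, while the biological one scatters visibly – which is why a ~99% correlation between supposed biological replicates raised eyebrows.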

  7. Just a small update… the comments column now seems to work (and check the comments – they are mostly about the ethical considerations around “donors” and “ova”).

    And another consideration… how on earth is it possible that a paper in Cell gets accepted in 3 days? Some friends who tried to publish in Cell got peer reviewed in 1–2 months, at minimum…

      1. Stewart… yes, unfortunately I know 🙂 … I did not make myself clear, obviously… I was talking about when the paper (in high-I.F. journals) goes to the reviewers… in Nature Neuroscience some friends (after it got past the editorial comb) waited 3 months for a review, and it was a deep review… it took another 3 months to reply 🙂

  8. These pictures seem to be manipulated. What will be very interesting to see is what happens next. In many cases the researchers later say that it was a mistake, everything is fine, etc., and an updated version of the article is accepted by the journal…

  9. Another thought: suppose you deliberately want to fake pictures you need but don’t have – wouldn’t you rather take several different pictures of the ‘wrong’ colonies instead of taking only one and cropping it differently? Even if you are running out of time and just discovered you need a pic, there are more elegant ways to fake it, for example asking other lab members, or taking a generic one from the internet and cropping it (hoping no one does a picture search). I’m not trying to find the best way to cheat, just saying the malice option seems… wondrously stupid. Deadlines sometimes make people superficial.

  10. @Xardram

    You are right, these are “funny” mistakes in faking pictures. I can’t understand why people fake scientific results, but I also can’t understand why they make it so easy to find out that they manipulated. In this particular case, for example, why didn’t they use another picture of the stem cells? I would think they have many of them.

      1. It deserves the benefit of the doubt, but I’ll definitely believe it when some of the 10 independent institutions have sequenced the clones and confirmed the provenance of the genomic and mitochondrial DNA.

  11. I am a scientist myself. At present, we are really endangering the foundation of science itself. Science needs to become much more serious again. Away from hype, away from impact and h-factors; forward to slow, solid and substantial work. Otherwise, all these errors and retractions will erode the basis of science and, with that, its public support.

      1. @ilovescience: Other than the supplementary array comparison plots (which are not central to the issues about this sorry affair), as has been noted elsewhere, this all happened before. As greenfly proliferate on rosebuds in spring, so image fabrications multiplied in First Supreme Scientist Hwang’s stem cell fraud. There were two main ways to show what was going on: (1) those created organisms were “fictitious” when retrospectively tested and (2) there were manifold image frauds in the publications.

        Go here:


        Eg it says “many photos presented in the paper were also fabricated.”

        How, then, should we rule out (1) when (2) has in the past been so strongly correlated with it?

        We can keep on loving research as it is, but it is not in such a state that many of us are able to. It is worthwhile also to consult the US expert on Hwang:


        For a motivating picture of fraudulent image overlap, there is this contemporary article still happily available on the intarwebs:


        After all, a picture tells a thousand words.

  12. Accepted in 5 days, reviewed in 4 days or so? Apparently my research sucks, as it takes similar journals more than a year and 3 rounds of reviews and responses to get it accepted – or, after 4 months of reviews, rejected after all.

  13. It takes an experienced reviewer a few hours (half day max) to review a paper properly. If a Cell editor emailed me a paper and asked me to get it done in a few days as an emergency, I could do it no problem, especially if I was excited to read it. I don’t understand what the fuss is all about.

    1. @iloveresearch: you responded already. By the way, are you involved with this work at all? Just curious – you know so much about the paper…

      1. Not convinced they are identical. The relative intensities of the bands don’t match exactly.

        1. iloveresearch, I would value your opinion.
          On s1c, draw a line between the ‘O’ in red/green on the bottom row of % labels and the label “hetero 3” and “hetero 2” on the respective gels.
          Do you see a white speck in an identical position on both gels? What is your opinion on this?

        2. iloveresearch, you are correct, the relative intensities don’t match. There has been a slight alteration of brightness or contrast so they appear different. But there are tell-tale signs…

          If you wish – copy both images into powerpoint. Alter the brightness and contrast on each image. Identify regions where there are similar marks. There are several – blow the images up 400-500% – it makes it easier to see.

          You make a good point though – it is not easy to see to the untrained eye.

          More Big Red Arrow blogs are needed to make it easier for all to see (are you listening SF?) 🙂
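          The recipe above – equalise brightness/contrast, then look for matching marks – can also be mimicked numerically. Here is a toy sketch on invented greyscale pixel rows (not the actual gel scans, which would need an imaging library to load):

```python
def normalise(pixels):
    """Rescale pixel values to zero mean and unit spread."""
    n = len(pixels)
    mean = sum(pixels) / n
    spread = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5 or 1.0
    return [(p - mean) / spread for p in pixels]

def similarity(a, b):
    """Mean product of normalised pixels; 1.0 means identical structure."""
    na, nb = normalise(a), normalise(b)
    return sum(x * y for x, y in zip(na, nb)) / len(na)

gel = [10, 200, 30, 180, 15, 220, 25, 190]
# Same gel with exposure increased and brightness reduced:
altered = [int(p * 1.3) - 20 for p in gel]
# An unrelated lane pattern:
other = [50, 60, 200, 10, 180, 40, 90, 210]

print(similarity(gel, altered))  # close to 1.0: same underlying image
print(similarity(gel, other))    # noticeably lower
```

          Normalising first removes exactly the kind of exposure/brightness tweak described above, so a duplicated gel still scores near 1.0.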

          1. The author guidelines for Cell Reports state “Groupings and consolidation of data (e.g. cropping of images or removal of lanes from gels and blots) must be made apparent and should be explicitly indicated in the appropriate figure legends.” This figure (S1C in Cell Rep. 2012 May 31; 1(5): 506–515) is an assembly of multiple parts and does not comply with the guidelines.

          2. In reply to Stewart. “More Big Red Arrow blogs are needed to make it easier for all to see ”

            There you go:


            Refers to Stewart’s comment:
            “Scroll down to supplementary material and download the 1Mb file.
            Fig. S1C:
            The hetero-2 gel is identical to the hetero-3 gel.”

            I pointed out some similarities (1-6). There are many more, proving that those blots are the same.

          3. Sorry, I made a little mistake in the .jpg behind the link above. I made a correction and put some text for clarification.

            Here is the link to the corrected .jpg:

            Does Hetero-2 really carry the resident mtDNA, or what is happening here?

      2. Yes, both images are indeed from the same sample. Intensities have been altered, which makes it even more suspicious as an intentionally performed duplication.

        1. Hans, the blog image you posted is great. Sadly, I had not seen that all FOUR gels are from the same image.

          I only saw 2.


          All four gels from the same image – with varied contrast intensities – that is very worrying indeed.

  14. Lots of coverage in the press, most of it quite balanced in the sense that they understand that most researchers (well, there is the occasional apologist) don’t like it when things like this happen, but at the same time we all don’t know what the final outcome will be.

    Gotta love the superimposed images with the built-in slider in the Spiegel article. You don’t need to understand any German at all to appreciate it.



  15. I think you all will appreciate the no-nonsense wisdom of Natalie DeWitt’s comment on this whole cloning paper fiasco: wp.me/p1xWpk-48p .


    1. I think it is worth noting that stem cell research is highly competitive, and the rewards for apparent success are great. It would surprise me if a lone researcher (or, as so often happens, a researcher working as part of a team) who got away with smudging figures once and reaped the reward would not do it again and again until reprimanded.

      A case from ORI:


      “Specifically, ORI finds that the Respondent knowingly and intentionally:

      1. Falsified three (3) figures for immunocytochemistry and alkaline phosphatase (AP) staining images, karyotyping and real-time reverse transcription polymerase chain reaction (RT-PCR) results by using experimental results from her prior work in Korea with human embryonic stem cells (hESCs) to confirm the generation, differentiation, and verification of human induced pluripotent stem cells (iPSCs). The false data were included in:

      a. Figures 1c and 2i (panels 4 & 13) in the Nature 2009, Science 2009, and Nature Biotechnology 2009 manuscripts and Supplementary Figure 4 in the Nature 2009 manuscript
      b. Supplementary Figure 5 in the Nature Biotechnology 2009 manuscript
      c. Figures S1B and S1D (panels 4 & 13) in the Blood 2009 manuscript
      d. Supplementary Figures 8B and 8D (panels 4 & 13) in the Nature Medicine 2009 manuscript
      e. Figure 9 in the RC1 GM092035 grant
      f. Figure 8 in the R01 HL079137 grant
      g. Figure 2 in the RC1 HL100648 grant
      h. Figure 8 in the RC2 HL101600 grant
      i. Figure 3 in the R01 HD067130 grant
      j. Figure 1 in the RC4 HL106748 grant
      k. Figures 1C, 1H, and 1I (panel 3) in the R03 HL096325 grant
      l. Figure 5 in the U01 HL107444 grant
      m. Figures 2C and 3I (panels 4 & 13) in the poster presented at the 2009 AHA meeting
      n. The presentations `Figures–Sinae Kim–120808.ppt’ and `Figures–Sinae Kim–121508.ppt’
      o. The image file `HiPS–E1–x100.jpg’

      2. Falsified one (1) figure for the real-time RT-PCR data for endogenous SOX2 expression in human iPSCs derived from dermal (HiPS-E1) and cardiac (HiPS-E2) fibroblasts and iPSCs generated from peripheral blood mononuclear cells derived from coronary artery disease patients (HiPS-ECP1, HiPS-ECP2, and HiPS-ECP3) by substituting real-time RT-PCR data for endogenous OCT4 expression in the forementioned cell lines. Specifically, the false data were included in:

      a. Figure 2i (panels 2 & 5) in the Nature 2009, Science 2009, and Nature Biotechnology 2009 manuscripts
      b. Figure S1D (panels 2 & 5) in the Blood 2009 manuscript
      c. Supplementary Figure 8D (panels 2 & 5) in the Nature Medicine 2009 manuscript
      d. Figure 3I (panels 2 & 5) in the poster presented at the 2009 AHA meeting
      e. The presentations `Figures–Sinae Kim–120808.ppt' and `Figures–Sinae Kim–121508.ppt'

      3. Falsified data in two (2) PowerPoint presentations for RT-PCR data of osteogenic-specific gene expression in bone marrow cells by substituting data for RT-PCR data in primary bone-derived and Saos2-osteosarcoma cells.

      4. Falsified one (1) figure for the real-time RT-PCR data of OCT4, SOX2, KLF4, c-MYC, NANOG, hTERT, REX1, and GDF3 fold-change expression levels in H1 hESCs, human cardiac and dermal fibroblasts, HiPS-E1, HiPS-E2, HiPS-ECP1, HiPS-ECP2, and HiPS-ECP3 cell lines by substituting data from various other cell lines that did not exist. Specifically, the false data were included in:

      a. Figures 2a-h in the Nature 2009, Science 2009, and Nature Biotechnology 2009 manuscripts
      b. Figure 10 in the RC1 GM092035 grant
      c. Figure 9 in the R01 HL079137 grant
      d. Figure 5 in the R01 HD067130 grant
      e. Figure 3A-H in the poster presented at the AHA meeting
      f. The presentations `Figures–Sinae Kim–120808.ppt' and `Figures–Sinae Kim–121508.ppt'”
