Mysterious retraction in the Journal of Biological Chemistry for Takashi Tsuji’s group

The authors of a paper in the Journal of Biological Chemistry (JBC) have retracted it, but don’t ask us why.

This being the JBC, the retraction notice for “Human T-cell Leukemia Virus Type I Tax Down-regulates the Expression of Phosphatidylinositol 3,4,5-Trisphosphate Inositol Phosphatases via the NF-κB Pathway” is the very definition of opaque:

This article has been withdrawn by the authors.

The paper has been cited 13 times, according to Thomson Scientific’s Web of Knowledge.

The corresponding author of the paper, Takashi Tsuji, appears to be a fairly prominent researcher. Last year, he and his colleagues published a study in PLoS ONE in which they showed how a tooth could be grown from stem cells. The paper garnered significant media attention. He also studies hair regeneration.

We’ve contacted Tsuji, and the JBC, for more information, and will update with anything we find out.

Update, 9:30 a.m. Eastern, 1/31/12: See this post for an explanation from Tsuji.

Hat tip: Clare Francis

9 thoughts on “Mysterious retraction in the Journal of Biological Chemistry for Takashi Tsuji’s group”

  1. Some of the blots in Fig 1 have “odd looking seams” under high magnification. There is also clear evidence of deletion of material above the main band in one of the b-actin control blots in panel A.

    Fig 2A blots also have seams.

    Figure 3 A/B: the p65 and b-actin blots are the same. These are loading controls for some RT-PCR data in the graphs below, so conceivably they could have originated from the same sample (but if so, why not just say so and show one set of blots?).

    Fig 4C: p65 has been IP’ed from 2 different samples (p300 N and p300deltaN transfected cells), but the p65 westerns are identical – they should not originate from the same sample. The “input” lanes below the p65 are different, so the p65 blots should be too.

    Thus, this is probably a simple case of the PI finding problems with the images and withdrawing the paper to avoid being ordered to do so at a later date.
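The duplication claims above (the identical blots in Fig 3 and Fig 4C) are in principle testable: two crops said to be the same image should match pixel-for-pixel, within a small tolerance for re-scanning. A minimal sketch of such a check, assuming the crops are available as equally sized grayscale arrays (all data hypothetical):

```python
import numpy as np

def near_duplicate(img_a, img_b, tolerance=2):
    """True if two equally sized grayscale crops match pixel-for-pixel
    within a small intensity tolerance (suggesting a duplicated blot)."""
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    if a.shape != b.shape:
        return False
    return bool(np.max(np.abs(a - b)) <= tolerance)

# Hypothetical crops: a re-used image should flag, an independent blot should not.
crop = np.array([[10, 200, 90], [15, 180, 80]])
copy = crop + 1   # the same blot after re-scanning, with a slight offset
other = np.array([[40, 90, 200], [60, 70, 210]])
print(near_duplicate(crop, copy))    # True
print(near_duplicate(crop, other))   # False
```

A real forensic comparison would also need alignment and scaling before the pixel-wise test; this sketch only covers the final comparison step.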

    1. It’s not at all surprising to hear about a retraction from JBC. New editorial leadership is often accompanied by changes in the journal’s policy, and at the moment JBC really seems, in my opinion, to have a very strange policy.
      For a reputable journal it should be clear that:
      1) articles that do not meet basic technical standards are not acceptable
      (e.g. western blot quantification when the bands are saturated and no longer in the linear range; qRT-PCR results with inappropriate housekeeping genes; use of viruses without equalizing the multiplicity of infection);
      2) articles rejected by reviewers should not be reconsidered by the editor just because some authors complain. Reviewers reject for good reasons.
      So, after all, it is not surprising to see retractions from JBC. It will happen more frequently in the future.
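The housekeeping-gene point above is mechanical: in the standard 2^-ΔΔCt analysis of qRT-PCR data, every reported fold change is scaled by the reference gene, so an unstable housekeeping gene distorts all results. A minimal sketch with hypothetical Ct values:

```python
# Relative quantification by the 2^-delta-delta-Ct method.
# All Ct values below are hypothetical, for illustration only.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Fold change of the target gene, treated vs. control,
    normalized to a reference (housekeeping) gene."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# With a stable reference gene (same Ct in both conditions):
stable = fold_change(24.0, 18.0, 26.0, 18.0)    # 4-fold induction
# If the "housekeeping" gene itself shifts by one cycle under treatment,
# the reported fold change is halved (2.0 instead of 4.0):
unstable = fold_change(24.0, 17.0, 26.0, 18.0)
print(stable, unstable)
```

Since one PCR cycle is a factor of two, even a single-cycle shift in the reference gene doubles or halves every fold change built on it, which is exactly why an inappropriate housekeeping gene invalidates the quantification.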

        1. Really, David Hardman? Presumably you publish only in Cell and Science! Or PNAS at a push, eh?
          And what about the 3,000 journals published by the big publishing houses?

  2. JBC has an impact factor of around 5.3 and is a good journal.
    In the past it has published good to excellent articles. For a while now, however, I have found it hard to understand the decisions taken by the editorial board of JBC.
    It’s important that the editors rely on the reviewers and their expertise, and that the decisions of the reviewers are taken seriously and communicated clearly to the authors.
    When reviewers ask for further experiments because the manuscript objectively lacks evidence, the authors need to deliver all the missing experimental pieces. When the reviewers reject a manuscript, the editorial board should have the strength and the fairness to communicate this clearly to the authors. It shouldn’t be possible for aggressive scientists to force articles through the editorial board despite rejections by the reviewers.
    This is incomprehensible to the scientific community, it’s unfair, and in the end it leads to retractions like this one.
    To respond to the previous post (“If you can’t get in anywhere else try JBC”): there is some truth to that. In JBC you can maybe talk a manuscript through.

  3. ““If you can’t get in anywhere else try JBC”, or would that be unfair?”
    Not at all, my friend.
    It seems, at least in my opinion, that in JBC you can easily talk a manuscript through,
    even if the reviewers reject it. I think that’s very bad for the scientific community and leads to retractions in the end. In fact, many of the retracted articles covered on this site are from JBC.
    It has to be emphasized, however, that JBC was a very good journal before. It’s just that the current editorial board seems far too lenient.

  4. The main problem seems to be that under the current management of JBC some articles are accepted too easily.
    Despite the fact that they do not fulfil basic requirements for publication, they end up in JBC. Apparently, the editors are too weak to stand up to the authors, or they are simply unable to adequately judge the articles presented to them.
    What I have noticed is that authors in JBC often reach for quick solutions, like measuring mRNA expression levels by qRT-PCR or protein levels by Western blot, and then completely overstate their findings to give more relevance to their work. They use terminology like “signaling” when in reality they have only measured the expression of some genes or proteins (instead of assessing binding of the ligand/activator to the receptor, activation of downstream events, etc.). They talk about “hormones, cytokines, secreted substances” on the basis of mRNA levels alone (instead of actually measuring these parameters by ELISA in the circulation). They use terms like “activity” when they have only measured mRNA or protein expression.

    Once published, scientists quickly notice the weak points of these studies and attack them, which results in retraction in the end.
    To offer some constructive comments to the editors of JBC:
    1) The journal should reject, as a matter of principle, all articles based on cell culture experiments alone. Cell culture experiments do not necessarily reflect what really occurs under physiological conditions; they are often artefactual.
    2) qRT-PCR data alone are insufficient. Articles based mainly on such data should be rejected outright; measurements of activity, etc., need to be requested instead.
    3) More attention needs to be paid to the authors’ statements. Statements (especially in the title, running title, abstract and discussion) have to be strongly backed up by experimental data.

    That should help the journal regain its reputation and avoid future retractions.

  5. Absolutely right,

    there is a currently published paper in which we encounter exactly the problems mentioned above. It seems JBC didn’t learn its lesson.

    I refer to the article “The PGC-1 coactivators repress the transcriptional activity of NF {kappa} B in skeletal muscle cells,” recently published by Eisele et al. in the Journal of Biological Chemistry. While reading the article I encountered several issues that make me doubt the integrity of the published data.

    -Figure 1 A: It is shown that IL-6 is about 3.5 times HIGHER in control cells infected with PGC-1{alpha} compared to GFP. The difference is significant, with a p-value of <0.05, so a repetition of the experiment should give the same result even when performed by an independent researcher, and all the more so when performed by the same person. However, in Figure 2 D, under the same experimental conditions, there is suddenly NO DIFFERENCE between PGC-1{alpha}- and GFP-infected control cells. In Figure 3 A, the authors present IL-6 relative gene expression again, and this time PGC-1{alpha}-infected cells express IL-6 levels around 3 times LOWER than GFP-infected cells; the p-value there is even indicated as <0.01! How can results from the same lab be diametrically opposed, and with such high significance? The data are clearly not reproducible.

    -Similar discrepancies are observed for other genes. E.g., TNF{alpha} levels in Figure 1 A (PGC-1{alpha} compared to GFP, controls) are not different, yet in Figure 3 A there is suddenly a significant reduction (p<0.05) within the same experimental context. Likewise, Mip-1 levels in Figure 1 A are not different between PGC-1{alpha} and GFP controls, but are suddenly highly significantly reduced (p<0.01) in Figure 3 A!

    -There are other examples of gene expression where things do not really fit.

    -In Figure 6 there are substantial problems with the western blots: it is claimed that the density of the protein bands has been measured, but the tubulin blots, to which everything has been normalized, are completely saturated. No program can perform meaningful densitometry on saturated blots!
    That also holds true for other quantified blots, e.g. p65.

    -Independent of that basic quantification problem, there are further inconsistencies. In Figure 6 B it is shown that GFP-infected cells treated with TNF have about 3.2 times more Pp65 than GFP control cells, a highly significant difference (p<0.01). But look at the blots: tubulin (although saturated) is at least 2 times stronger in GFP TNF than in GFP control, while the Pp65 band in GFP TNF is not that much denser than in GFP control. If Pp65 was really normalized to tubulin, a 3.2-fold induction is impossible. And if the blot shown was an outlier and not representative, it is statistically implausible that such a p-value would emerge from only 3 independent experiments!

    -A similar issue arises if you compare PGC-1{alpha} with TNF to PGC-1{beta} with TNF in Figure 6 A. In the quantification there is no difference; in the blots shown in Figure 6 A there is a huge difference.

    -The quantification of p65 also seems to be incorrect. If the GFP control and GFP TNF bands shown in the blots (around 2 times different) are really normalized to the corresponding p65 shown (also around 2 times different), then there is no difference, and the quantification of these blots cannot lead to the significant result shown in Figure 6 B.

    -Figure 7 A: two publications have already demonstrated that PGC-1{alpha} strongly induces Akt, which is the complete opposite of what is shown here.

    -In Figure 7 B, the p65 control blots always show higher levels of p65 at 5 min. If Pp65 at 5 min is normalized to that, then there is no increase in phosphorylation. The results shown in Figure 7 C do not fit the blots shown in Figure 7 B at all; in my opinion it is impossible to obtain 4-fold increases from those blots.

    -In Figure 7 C they claim that at 120 min TNF induced a more than two-fold increase in Pp65 in GFP-infected cells, but in Figure 7 D, under exactly the same conditions, no such increase can be seen, even though the statistical significance was <0.01 before.

    If you go through the whole article there are many more such issues. It is very obvious that the presented results have to be doubted. Furthermore, the published results are diametrically opposed to what has already been published repeatedly. That would be acceptable if there were not issues that raise considerable doubt about the integrity of the presented data. In my opinion the article by Petra Eisele should never have been published. Now it should clearly be retracted from JBC ASAP.
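The saturation objection running through the points above is mechanical: once pixels reach the detector maximum, band intensities are clipped, and any densitometry ratio built on them is meaningless. A minimal sketch of a saturation check, using NumPy on hypothetical 8-bit blot crops (names and thresholds are illustrative, not from the paper):

```python
import numpy as np

def saturated_fraction(band, max_value=255):
    """Fraction of pixels in a band region clipped at the detector maximum."""
    band = np.asarray(band)
    return float(np.mean(band >= max_value))

def check_band(band, threshold=0.01, max_value=255):
    """Flag a band as unquantifiable if too many pixels are saturated."""
    frac = saturated_fraction(band, max_value)
    return {"saturated_fraction": frac, "quantifiable": frac < threshold}

# Hypothetical bands: one in the linear range, one clipped at 255.
linear_band = np.array([[120, 180, 200], [150, 210, 190]])
clipped_band = np.array([[255, 255, 240], [255, 255, 250]])
print(check_band(linear_band))    # quantifiable
print(check_band(clipped_band))   # not quantifiable
```

A check like this only detects clipping; staying within the film or camera's linear response range in the first place is what actually makes the tubulin normalization valid.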

  6. “The results published furthermore are diametrically opposed to what has already been published repeatedly. That would be ok, if there were not such issues that rise considerable doubt of the integrity of the presented data”

    this is a typical example of “I want, but I can’t”. There are several ways to fabricate experiments besides image manipulation. You should be more worried about the growing amount of data published without proper controls than about what is going on with the images. Science is about confidence. You don’t trust your colleagues, you see something strange in an image and you immediately think the work is fake? It is very sad how things are going.
