The Journal of Biological Chemistry has a fairly gory correction — we’d call it a mega-correction — for a 2010 paper by Levon Khachigian, an Australian researcher whose studies of a new drug for skin cancer recently were halted over concerns about possible misconduct, including image manipulation. As we reported earlier this year, Khachigian has already lost four papers, including one in the JBC — which the journal simply noted had “been withdrawn by the authors.”
The new correction involves the article “c-Jun regulates shear- and injury-inducible Egr-1 expression, vein graft stenosis after autologous end-to-side transplantation in rabbits, and intimal hyperplasia in human saphenous veins,” which Khachigian wrote with Jun Ni and Alla Waldman. The paper has been cited nine times, according to Thomson Scientific’s Web of Knowledge. Here’s the notice:
The original versions of Figs. 1, 5, and 6 contained incorrect micrographs.
In Fig. 1C, the “Dz13scr/Elastin” panel contained the same micrograph as the “Dz13/Elastin” experimental condition. A replacement micrograph for the “Dz13scr/Elastin” panel from a replicate experiment performed at the time of the original experiment is provided.
The “fresh isolate/PCNA” and “fresh isolate/MMP-2” micrographs in Fig. 5D, also used in Fig. 5F to represent the base-line condition, contained the same image. Replacement micrographs from a replicate experiment performed at the time of the original experiment are provided for the two panels in Fig. 5, D and F. The panels labeled “Adeno-LacZ/c-Jun” and “Adeno-LacZ/PCNA” in Fig. 5F were reversed.
The same micrograph used to represent the “vehicle” condition in Fig. 6C was also used in Fig. 6B as the “no shear” condition. A replacement micrograph from a replicate experiment performed at the time of the original experiment is provided for the “no shear” panel in Fig. 6B.
The corrections do not affect the interpretation of the results or the conclusions of the original paper.
We wonder whether readers will agree with that last bit.
As a larger point, however, we wonder whether journals need to be so passive when it comes to cases like this. In a one-off situation, it makes sense to give the benefit of the doubt. But from an author with multiple retractions and an ongoing misconduct investigation, such an involved correction really ought to prompt a little concern, and that concern ought to be expressed, no?
Australia’s ABC News reported last month that concerns had been raised about another of Khachigian’s papers, in PLOS ONE. Khachigian denied misconduct, calling the problem “a simple formatting error.”
“But from an author with multiple retractions and an ongoing misconduct investigation, such an involved correction really ought to prompt a little concern, and that concern ought to be expressed, no?”
No. Each retraction has to be judged on its own merits. There is no shortcut. And if an investigation is ongoing, there can by definition be no final conclusion. There should be no rush to retract papers. And there should be no scatter-gun effect on retractions. Even within the same lab, experiments can be done to different levels of integrity. I would hate for an honest, ethical junior scientist to lose papers because other people in the same lab had faked experiments in other, different projects.
The statement is that each retraction should be judged on its own merits. I don’t think we can be completely factual about this; there is also an element of personal judgment, since it is well known that past behavior correlates with future behavior. Somebody who has been found to have plagiarized or falsified in the past is open to a higher level of scrutiny, and that is fair. However, such scrutiny attaches to a particular individual, not to others (the honest, ethical junior scientist). People can also repent, and later in their professional lives adopt even higher standards. Nevertheless, the policy of Retraction Watch to report past retractions appears to me justified.
Given that most “science” these days consists of:
1. Professor has great idea
2. Students generate data until the data fits the idea
3. Publish!
or even
1. Students generate data
2. Professor makes up ‘theory’
3. Publish!
The actual data in the papers don’t really ever change the conclusions. Not a big surprise.
I think the students in Levon Khachigian’s lab would like this post.
http://crypto.junod.info/2013/09/09/an-aspiring-scientists-frustration-with-modern-day-academia-a-resignation/
Readers, another extraordinarily faulty Australian paper that in my opinion should suffer a “fairly gory correction — we’d call it a mega-correction” or retraction is the notorious “Australian Paradox” paper, self-published by two senior University of Sydney scientists in the MDPI journal Nutrients.
I have discussed in detail the self-published (the lead author also was the “Guest Editor” of Nutrients) Australian Paradox paper’s glaring faults with MDPI CEO Dietrich Rordorf on these pages but, alas, competence and integrity seem not to be a priority. Our discussion can be viewed in comments at http://www.retractionwatch.com/2013/08/22/journal-to-feature-special-issue-on-scientific-misconduct-seeks-submissions/
More recently, I have written to BioMedical Central officials to alert them to the Australian Paradox authors’ latest outrageous efforts to misrepresent their faulty paper as flawless, in BMC’s Public Health journal: http://www.australianparadox.com/pdf/LetterBioMedCentral.pdf.
A senior BMC official has assured me that the origins and quality of the University of Sydney’s Australian Paradox paper are under investigation.
Readers, if you have two minutes to compare the paper’s (false) “finding” – down – with the trends in the authors’ published charts – up, or at least up for those series that do not feature conspicuously flat falsified data – then I’ll be surprised if you do not agree that the University of Sydney’s “shonky sugar study” should be corrected or retracted without further unreasonable delay: http://www.australianparadox.com/pdf/GraphicEvidence.pdf
We are left to ponder: What is the purpose of the University of Sydney’s Academic Board – http://sydney.edu.au/ab/about/members.shtml – if the University recklessly pretends that obviously faulty self-published research is top-notch “peer reviewed” science? http://www.smh.com.au/national/health/research-causes-stir-over-sugars-role-in-obesity-20120330-1w3e5.html
amarcus41 wrote: “As we reported earlier this year, Khachigian has already lost four papers, including one in the JBC — which the journal simply noted had ‘been withdrawn by the authors.’”
He has actually already withdrawn three papers from the JBC:
Ets-1 positively regulates Fas ligand transcription via cooperative interactions with Sp1.
Mary M. Kavurma, Yuri Bobryshev and Levon M. Khachigian
VOLUME 277 (2002) PAGES 36244–36252
This article has been withdrawn by the authors.
Histone deacetylase-1 is enriched at the platelet-derived growth factor-D promoter in response to interleukin-1β and forms a cytokine-inducible gene-silencing complex with NF-κB p65 and interferon regulatory factor-1. Mary Y. Liu and Levon M. Khachigian
VOLUME 284 (2009) PAGES 35101–35112
This article has been withdrawn by the authors.
Injury-induced platelet-derived growth factor receptor-α expression mediated by interleukin-1β (IL-1β) release and cooperative transactivation by NF-κB and ATF-4. IL-1β FACILITATES HDAC-1/2 DISSOCIATION FROM PROMOTER.
Ning Zhang and Levon M. Khachigian
VOLUME 284 (2009) PAGES 27933–27943
This article has been withdrawn by the authors.
The replacement of the “Dz13scr/Elastin” panel in Fig. 1C remains the major problem.
In this paper, the authors claimed that Dz13, but not Dz13scr, inhibited intimal thickening of human saphenous veins as shown in Figure 1B and 1C (two “Elastin” panels).
However, examining the top two panels of the original Fig. 1C, Dz13/Elastin (left) and Dz13scr/Elastin (right) were actually identical, just shown at different magnifications.
Both images were labeled 200X, but in fact the magnification of the Dz13/Elastin image was much lower than that of the Dz13scr/Elastin image. The authors blatantly used different magnifications of the Dz13/Elastin and Dz13scr/Elastin images to show a difference in intimal thickness that was in fact not due to the Dz13 and Dz13scr treatments in the original Fig. 1C.
Now the authors have provided a replacement for the top right panel of Fig. 1C (the Dz13scr control). The replacement panel appears to be at the same magnification as the original Dz13scr/Elastin panel when the two are compared. So again, the difference in thickness of the intima is just due to the different magnification, not due to Dz13.
I agree with your comments.
The authors have provided a correction to the Fig. 1C top right panel, but there is no scale bar, so it is not possible to know whether it is shown at the same magnification as the Dz13-treated section. The corrected panel appears to be at the same magnification as the panels below it, rather than at the same magnification as the Dz13/Elastin panel on the top left.
The difference in thickness of the intima is just due to the different magnification, not due to Dz13 and Dz13scr treatment.
The authors have explicitly indicated that the images in the four Dz13 panels (left) and four Dz13scr panels (right) in Fig. 1C are from two sets of four corresponding sister slides (cross-sectioned from a paraffin-embedded Dz13-treated vein and a paraffin-embedded Dz13scr-treated vein). The authors also indicated that the magnification of all six c-Jun, PCNA, and α-SM actin panels is 400X (high-power field). As discussed by Peer 9, the Dz13/Elastin and the original Dz13scr/Elastin images are identical. Therefore, the eight panels in Fig. 1C are indeed from the same sample (if the images of the intimal layer(s) of Dz13/Elastin and/or the original Dz13scr/Elastin are enlarged and compared with the c-Jun, PCNA, or α-SM actin images, the similarity of those images can be seen).
The replacement of Dz13scr/Elastin does not alter the fact that the other seven panels [the four Dz13 panels (left) and the three Dz13scr c-Jun, PCNA, and α-SM actin panels (right)] are still from the same sample.
@Dan Zabetakis is correct that each paper should be judged on its own merits. However, the question remains: should the paper be corrected or retracted, with the authors then free to pull the data together properly in a new paper? @Peer 9 suggests that there are still problems.
Allied to this is the fact that we do seem to operate under a double standard in these types of corrections, with students at universities facing more severe penalties than their professors. This is something I have posted about at length before, e.g., http://bit.ly/1eg4lYO and http://bit.ly/1bVw8cc. I do not think that such a situation is acceptable; the professors, who after all enforce academic standards, should be leading by example. I cannot see any viable argument to the contrary.
Dear Ferniglab,
Many thanks for your comments. I also liked your earlier comments on RW this year, and I would like to put them here for the researchers in Khachigian’s lab. Thanks again.
I wish the executive teams of universities and research funding bodies would read these comments.
“We should also remember that the fraud we are talking about has gone through a number of “filters” prior to publication. The PI must be aware of the raw data and of the rules of the game. So if the PI “pushes” a student or postdoc to manipulate (or cherry pick, etc., etc.) data prior to publication, the PI is not just contravening established practice for reporting results, but also for training. There is consequently a question of corruption of the integrity of others by bullying, since the young are in a weak position and may not be able to stand up to the PI.
On those occasions when they do, they don’t get a PhD in that lab or they lose their postdoc, have a publication gap in their CV and a problem with getting a reference. Career wise they will lose several years and they are likely to be lost to science. That is one of the major tragic outcomes of fraudulent science (the other being the death of people).
We should have funds to spend on an initiative to support the young who were both brave and foolhardy enough to challenge a fraudulent PhD supervisor or postdoctoral mentor. Young scientists who challenge their PI with data are the dream of all real scientists and clearly have the right ingredients to make important contributions to science in the future.
We should also remember, in terms of what punishment fits the crime, that a fraudulent lab will produce a mix of young scientists with a tendency to commit fraud and/or who leave science. This is where we lose talent, since the young were recruited in the first place because they have talent.
The fraudsters may be very smart, but they are flawed in some way and so are unlikely to be able to continue in science. Again noted by others, we might take a leaf out of how sports deal with cheating. First a ban for a couple of years, then a lifetime ban and the loss of medals and in some cases a demand for a return of prize money. ”
Ferniglab January 9, 2013 at 7:26 pm
PubPeer threads reporting on early Khachigian works:
https://pubpeer.com/publications/8596917
https://pubpeer.com/publications/10807774
The 2018 retraction of the paper that is the topic of this post:
http://www.jbc.org/content/293/52/20307.short
This article has been withdrawn by the authors. The article was the subject of independent external investigations commissioned by the University of New South Wales in 2014 that made no finding of error beyond those specified in the November 2013 correction. However, the Journal raised questions regarding Figs. 1A and 5B, which the authors were not able to address as the raw data were no longer available.