Fired Kalasalingam prof Gurunathan’s retraction count stands at eight

We’ve found another retraction for a paper by Sangiliyandi Gurunathan, the former researcher at Kalasalingam University in India who was fired over multiple instances of data fabrication, a case that also got six Ph.D. students kicked out of their program.

The retraction was published in In Vitro Cellular & Developmental Biology – Animal in October 2011, but we only saw it last month. It reads, in full:

The paper, “Isolation and characterization of goat retinal microvascular endothelial cells,” by Haribalaganesh Ravinarayanan, Banumathi Elayappan, Sheikpranbabu Sardarpasha, Deepak Venkataraman, Sirishkumar Namagiri, and Gurunathan Sangiliyandi has been retracted at the request of the Editors as it contains fraudulently manipulated data. Our investigation has found at least two instances of data misrepresentation in this paper. In figure 2a, the 12 h/10%FBS and 24 h/1%FBS photomicrographs are identical as are the 24 h/10%FBS and 36 h/10%FBS photomicrographs. We have also found instances of image duplication in the 0 h/1%FBS, 0 h/10%FBS, 12 h/1%FBS and 24 h/1%FBS photomicrographs. As such, this article represents a severe abuse of the scientific publishing system. The scientific community and the Editors take a very strong view on this matter, and apologies are offered to readers of the journal that this problem was not detected during the submission and review process.

—The Editors

The current online version of the paper makes no mention of the retraction, but instead has the following warning after the abstract, keywords and editor’s name:

An erratum to this article is available at http://dx.doi.org/10.1007/s11626-011-9460-1.

Typically, Springer stamps article titles and watermarks PDFs with “RETRACTED ARTICLE” as fair warning to researchers.

Springer didn’t seem to be making the distinction a priority. Jacco Flipsen, editorial director of life sciences at Springer Dordrecht, which publishes the journal, told Retraction Watch:

I don’t have any information on that for you. I’m not that interested to dig into this at this point in time.

Sangiliyandi, who has had eight papers retracted by our count — two in Angiogenesis, one in Biotechnology Advances, two in Experimental Eye Research, one in In Vitro Cellular & Developmental Biology – Animal, one in Life Sciences, and one in Vascular Pharmacology — did not respond to email requests for comment.

5 thoughts on “Fired Kalasalingam prof Gurunathan’s retraction count stands at eight”

  1. hmmm… “In Vitro Cell. Development. Biol., Animal” – Impact Factor (IF) 1.31

    recent Retraction Watch posts have involved retractions in:

    Immunopharmacol. Immunotoxicol. – IF 1.8
    Korean Circulation J. – IF 0.37
    Hip International – IF 0.76
    Protein Pep. Lett. – IF 1.78
    J. Systems Architec. – IF 0.44
    J. Software – IF 0.44
    J. Clin. Biochem. Nutrition – IF 1.9
    etc.

    There are simply far too many low-rank journals that are of little interest to anyone (since hardly anyone cites their articles) other than the authors who obtain publications for their CVs. It’s not surprising that this is associated with a rise in journal retractions, since it’s simply far too easy to prepare a fraudulent manuscript (often involving plagiarism) and get it past the lax editorial practices of very low-rank journals where nobody seems to care too much. The huge slew of new electronic journals in the last 10-15 years only compounds the problem.

    This is a particular issue with the vast expansion of “you pay – we publish” journals. These are simply recipes for the promotion of unethical behaviour, as an article in tomorrow’s Nature describes:

    Predatory publishers are corrupting open access

    Journals that exploit the author-pays model damage scholarly publishing and promote unethical behaviour by scientists, argues Jeffrey Beall.

    Although Retraction Watch describes itself as a “window into the scientific process”, it seems to me that much of what is described here is a window into the seedy side of the lower echelons of dodgy scientific publishing and dubious efforts at career advancement. The “scientific process” proceeds much as it always has, with good work done by good scientists published in good journals.

    That’s not to minimize the fact that some disgraceful practices occasionally manifest themselves in the meaningful scientific literature. Can’t fault Retraction Watch for highlighting problematic science in journals across the board and encouraging/shaming editors and publishers to address this properly. But if we’d like some insight into the rise in retractions in recent years, we shouldn’t neglect the elephant in the room.

    1. Well, apparently these journals weren’t all that lax, nor were their readers, considering that the papers were retracted…

  2. I don’t think that any “business model” for journal publication offers protection against sloppy science.
    The only protection is a vigilant scientific community.

  3. I think you have to separate those journals that are vanity publishers from those that have a normal peer-review process. Impact factor derives largely from a very small number of papers in a journal (and a cunning way of using reviews) and is a measure of popularity, not prestige. Review journals have very high impact factors, but contain no new information. A journal that does contain new information is likely to have a lower impact factor. If you rank journals by a prestige indicator (so trying to rank according to the utility of the primary information), the ranking changes. ISI’s Eigenfactor attempts to do this, but the PageRank algorithm is probably better – after all, Page got very rich and everyone uses it! When you do this, review journals sink and many of the “older” established learned-society journals, tarred by some with the “sound but dull” label, rise to the top. So journals with modest impact factors contain excellent articles, well worth reading. It is a lot easier to repeat an experiment described in “In vitro” than in Nature, because the former actually has a methods section worthy of the name. I’ll take “sound but dull” any day over “popular fiction”.
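    To make the PageRank point concrete, here is a minimal sketch; the journal names and citation counts below are invented purely for illustration. Rank flows along citation links, so a heavily cited journal accumulates rank even when its impact factor looks modest.

    ```python
    # Minimal PageRank sketch over a toy journal citation graph.
    # citations[a][b] = number of times journal a cites journal b.
    # Assumes every cited journal also appears as a top-level key.
    # (All names and counts here are hypothetical.)
    def pagerank(citations, damping=0.85, iterations=100):
        journals = list(citations)
        rank = {j: 1.0 / len(journals) for j in journals}
        for _ in range(iterations):
            # Everyone gets the "teleport" share, then rank flows along citations.
            new_rank = {j: (1.0 - damping) / len(journals) for j in journals}
            for src, targets in citations.items():
                total = sum(targets.values())
                if total == 0:
                    continue  # simplification: journals citing nothing leak rank
                for dst, count in targets.items():
                    new_rank[dst] += damping * rank[src] * count / total
            rank = new_rank
        return rank

    toy = {
        "SoundButDull": {"ReviewJ": 10, "GlamourJ": 10},   # cites sparingly
        "ReviewJ": {"SoundButDull": 200, "GlamourJ": 50},  # reviews cite heavily
        "GlamourJ": {"SoundButDull": 100},
    }
    print(sorted(pagerank(toy).items(), key=lambda kv: -kv[1]))
    ```

    In this invented graph the heavily cited workhorse journal comes out on top while the review journal sinks, which is the sense in which a prestige metric reorders the impact-factor ranking.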

    1. I agree with some of what you say. No question that the vanity publishers are particularly problematic. But there are simply far too many journals, and so there is a vast tail of published research that has very little impact on the progression of scientific fields. Over the years, in any particular field I’ve worked in, there have always been on the order of 10-12 core journals that we might consider publishing in, and which I would routinely read (or at least scan the contents of). In those same fields one might uncover maybe 50-100 other journals, many of which one never hears of (the first time I’d heard of quite a number of journals in fields broadly related to my own was when they appeared in posts on Retraction Watch!).

      I’m not sure I agree completely about Impact Factors. Of course review journals should be removed from consideration when discussing the value/impact of primary research journals. But Impact Factors do bear some relation to the quality/importance of the work. This morning I did a quick perusal of the citations of papers published in J. Am. Chem. Soc. (IF ~ 10) over a two-month period in Jan/Feb 2010, and almost all of the papers have high citations in the intervening period (obviously there is a distribution, with some having lower and some higher, even much higher than the Impact Factor). But if you work in a chemistry/biological chemistry field, you know that the important work in those fields is likely to be found there – the high impact factor is a reflection of that. It’s not a matter of popularity, nor the effect of a few highly-cited papers. Maybe things are different in other fields.
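      For what it’s worth, the two-year impact factor behind a figure like “IF ~ 10” is just an average citation rate:

      $$\mathrm{JIF}_y = \frac{C_y(y-1) + C_y(y-2)}{N_{y-1} + N_{y-2}}$$

      where $C_y(x)$ is the number of citations received in year $y$ by items published in year $x$, and $N_x$ is the number of citable items published in year $x$. An IF of about 10 therefore means the average paper from the two preceding years was cited roughly ten times that year, though of course the distribution around that mean can be wide.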

      The same can be said for the very high impact multi-disciplinary journals (Nature/Science/PNAS). One could do quite well in following the progression of scientific fields by restricting one’s reading to these journals. They’re simply where much of the important work in a field is published. I don’t share the view that there’s something inherently problematic about these journals – even if they are sometimes provocative in their selection of papers. I also don’t think the less-than-full Methods section of Nature is such a problem either. Nowadays this is usually supplemented by a more detailed supplementary document, and if I have ever had problems reproducing an experimental protocol from a paper, I’ve contacted the authors directly or, if it’s important enough, visited their lab. The idea that one should always be able to immediately reproduce a methodology from a protocol in a published paper has always been a bit of a fiction.

      You’re right, though, about prestige and old prestigious journals. In my field, J. Biol. Chem. has particularly suffered from the rise of metrification – it’s a prestigious journal that has seen its impact factor drift downwards (around 6 now), although it’s probably near the top of the Biol Chem/Mol Biol journals when it comes to PageRank.

      I’ve probably been a little unfair in picking out the journals on the list in my post to make a point about the ludicrous number of journals now in circulation, and the likelihood that unscrupulous authors wishing to boost their CVs will target these with shoddy work. They’re just a selection of journals that have been highlighted on this site in the last few weeks. Obviously good practice should be encouraged across the board and bad practice condemned wherever it appears.
