Journal pulls four breast cancer papers for duplication

The journal Breast Cancer: Targets and Therapy, a Dove Medical Press title, has retracted four articles from a group of Indian researchers over what it said were “unacceptable levels” of duplication with other published work. (Such a construction leaves us wondering what might constitute “acceptable levels” of duplication, but that’s for a different post.)

The articles were submitted by Rajeev Singhai, who is listed as being with Grant Medical College and the Sir J J Group of Hospitals, in Mumbai. According to his After College page, Singhai received his PhD in 2011 and is now a research fellow.

As the notice states:

The Editor-in-Chief and Publisher of Breast Cancer: Targets and Therapy have been alerted to unacceptable levels of duplication with previously published papers. All four papers were submitted by Dr Rajeev Singhai.

It is worth noting that all papers were peer-reviewed by a minimum of two peer-reviewers and the Editor-in-Chief of Breast Cancer: Targets and Therapy before publication. The papers concerned are:

Patil VW, Tayade MB, Pingale SA, et al. The p53 breast cancer tissue biomarker in Indian women. Breast Cancer: Targets and Therapy. 2011;3:71–78.

Singhai R, Patil AV, Patil VW. Cancer biomarker HER-2/neu in breast cancer in Indian women. Breast Cancer: Targets and Therapy. 2011;3:21–26.

Patil AV, Bhamre RS, Singhai R, Tayade MB, Patil VW. Estrogen receptor (ER) and progesterone receptor (PgR) in breast cancer of Indian women. Breast Cancer: Targets and Therapy. 2011;3:27–33.

Patil VW, Singhai R, Patil AV, Gurav PD. Triple-negative (ER, PgR, HER-2/neu) breast cancer in Indian women. Breast Cancer: Targets and Therapy. 2011;3:9–19.

Hat tip: Clare Francis

Comments on “Journal pulls four breast cancer papers for duplication”

    1. I doubt that the authors paid the fees (£1,249). Paying for all four papers would cost more than the annual salary of the corresponding author, who is just a research fellow. Dove has a provision for waiving the fees, and I think they would have published only if they got the waiver.

      I think the more pertinent issue is their review process. According to the article processing statistics on the journal’s home page, the time to a decision on a manuscript is merely 6 days, which includes both the editor’s assessment and peer review. It makes me wonder whether they are compromising the quality of the process for the sake of speed.
      Speed of publication seems to be their USP, though there is a mismatch between the processing times presented on the homepage under processing stats and those on the ‘call for papers’ page, which states: “Breast Cancer: Targets and Therapy has one of the fastest turnaround times of any medical journal in the world. Generally peer review is complete within 2-3 weeks and the editor’s decision within 24 hours of this. It is therefore very rare to have to wait more than 4 weeks for a final decision.”

  1. As an Indian researcher in the US, I am appalled by the recent spate of researchers of Indian origin accused of misconduct (Bharat Aggarwal, Anil Potti, Dipak Das). What is wrong with our society? I know, having received my basic training in India, that the system is deeply hierarchical, authority is rarely questioned, and research integrity is never taught as a course. Nonetheless, I also feel there is something deeply flawed in the moral compass of our upbringing (there is a reason why India routinely ranks as one of the bottom dozen nations on the Corruption Index), and that flaw permeates science as well. It is deeply disappointing and troubling.

  2. Even if there were some correlation between cultural background and scientific misconduct, you still have to judge each case on its merits, wherever it occurs and whoever committed it. Not everybody from one country, or culture, should be tarred with the same brush. It is possible that political correctness makes people from the majority group less likely to accuse people from recognized minorities of scientific misconduct (say, in the US), but we do not have any evidence that this is so, and there could be many other reasons. Perhaps a sterile argument? The just and practical thing to do is simply to examine each case as it arises.

    Some countries are better equipped to cope with scientific misconduct.

    The U.S. does have the Office of Research Integrity; the UK is attempting something like it with UKRIO, one man and an office, perhaps a dog too, http://www.ukrio.org/.

    http://www.nature.com/news/british-science-needs-integrity-overhaul-1.9803

    Germany does not have an Office of Research Integrity equivalent,

    http://abnormalscienceblog.wordpress.com/2011/06/10/germany-needs-ori-style-investigative-panel-to-cope-with-the-borstel-scandal/

    although institutes do have ombudsmen and ombudswomen, as do the larger organisations such as the Max Planck Society and the largest funder, the DFG. The problem is that they are not independent of those organisations.

    You can get a feeling for which places are better or worse at dealing with scientific misconduct (or, conversely, at covering it up) by looking up retractions by country on this website:

    Retraction posts by author, country, journal, subject, and type

    or from glancing at the “Alleged cases” section of http://en.wikipedia.org/wiki/Scientific_misconduct,

    although you can see that the Cyril Burt and Andrew Wakefield cases clearly belong under the “Great Britain” subheading (people should learn that there is no country called Great Britain; the country is the United Kingdom of Great Britain and Northern Ireland, the United Kingdom for short). The table is hardly complete.

    Time will tell, and Retraction Watch is gathering the data points.

  3. There are another six articles by the same group, published elsewhere, which should also be retracted.
