In July 2015, DNA and Cell Biology began routinely scanning manuscript submissions for plagiarism using iThenticate; since then, it’s rejected between four and six manuscripts each month for that reason alone. Additional submissions have been rejected after the journal realized the authors had digitally altered figures. The level of misconduct “shocked” editor-in-chief Carol Shoshkes Reiss, as she wrote in a recent editorial for the journal. She spoke to us about the strict measures the journal has adopted in response to these incidents.
Retraction Watch: Why did you decide to begin scanning submissions for plagiarized text using iThenticate™ software in July, 2015? Did something prompt that decision?
Carol Shoshkes Reiss: Scientists evaluating more than one paper that summer indicated in their critiques that the text was strikingly similar to material they had read. Until that point, we had run the scans only when alerted by reviewers. We also were alerted by a careful reader of the literature that one paper published by DNA and Cell Biology appeared to have text in common with another paper.
We elected to err on the side of caution, to maintain high standards, and therefore to scan each submission before considering the paper. I know that at least one journal scans accepted manuscripts, but in my opinion, accepting a manuscript is a contract, and too late in the process to reject for plagiarism.
RW: You say you now reject 4-6 articles each month that include long stretches of copied text. Out of how many monthly submissions, approximately? (Just trying to get a sense of the rate of plagiarism.)
CSR: The number of papers considered each month varies. I do not include the number of papers sent back (unsubmitted) to meet our Instructions to Authors; there are eight unsubmitted papers in our active system today. The statistics I keep are the number of decisions made each month; these include rejection as inappropriate for the journal, rejection following peer review, major or minor revisions, redirection to the Open Access companion journal, or acceptance. As of 5/2/16, just under 200 papers have had decisions: 31 accepted, 5 referred to BioResearch Open Access, 11 minor revisions, 46 major revisions, 17 rejected following peer review, and 28 rejected for either scientific misconduct (including digital manipulation of data and, most recently, dual submission of the identical paper to two journals; again, we were alerted by the fortunate choice of a reviewer) or plagiarism. One accepted paper was withdrawn by the authors, who found they could not replicate their work.
RW: How did you feel when you saw how many papers contained plagiarism?
CSR: Frankly, I am shocked by the abundance of this unprofessional behavior. Most would have “slipped through the cracks” if we were not vigilant.
RW: As you know, there are many other forms misconduct can take, besides plagiarism. What steps are you taking to identify duplicated or manipulated images, for instance, which one study recently suggested could affect as many as 1 in 25 published papers?
CSR: With the assistance of my section editors, we have examined the data figures (gel bands and microscopic images). In most cases these have not raised flags but, as I said, so far in 2016 three submissions have been flagged (one involving photographs of cells, the other two gel bands).
RW: You note that you now inform the authors’ institutions as well as funders if you find plagiarized papers. Do you know of other journals that take the same steps?
CSR: I do not know if other Editors do more than reject papers when they detect plagiarism or scientific misconduct. I offered two other Mary Ann Liebert editors the database I am developing of contact individuals in universities — currently there are 50 institutions, some with more than one instance of an issue in the last 10 months. Some of these email addresses are harder to obtain than others, and some foreign universities have web sites that are completely opaque; I have found that published fax numbers are often disconnected and I am unwilling to spend money to FEDEX packages of documents to foreign universities. I don’t know how to reach the integrity officers at companies, when the authors are associated with industry.
Yes, this has been a burden, no question about that.
RW: Some people have criticized plagiarism detection software, saying it can flag non-plagiarized text if it contains clichés or other familiar phrases. Have any authors defended their innocence?
CSR: We are not talking about a few words here and there; it is much more than phrases or common language describing materials and methods (which we all know can be cookbook).
This week I received an email from one author, begging me to support him with his university; it appears that although the universities do not generally respond to me, at least a few read the messages I send. This author indicated that he was very busy with his numerous duties and not a native English speaker. For those reasons he borrowed language from papers he read.
Was Mrs. Reiss making any distinction between normal plagiarism and self-plagiarism? Although both are problematic, I think they should be considered separately. If an author re-uses too much of their own text, without reusing their previously published data, I’d say that is simply laziness or a bad habit (it’s also quite common). Double-publishing data or presenting someone else’s work as your own are higher levels of misconduct.
http://www.nature.com/news/not-all-plagiarism-requires-a-retraction-1.15517
It is actually “Dr. Reiss”….
I think in some cases self-plagiarism can even be good practice. Think about a mathematician finding some new equation which he uses for further work. Would it make sense to change the ordering of additive or multiplicative terms just to avoid “self-plagiarism”?
Typically, plagiarism detection software is set to a 20% identity threshold, with references (obviously) filtered out. Four to six per month is a low estimate, as those who plagiarize often find a way to bypass the software (I’d rather not say how here), and then you can see it only by “manual” reading and only in “special”, if you will, circumstances (e.g., drastic changes in the style or literary quality of the text).
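How iThenticate actually matches text is proprietary, but the general idea the commenter describes (overlap with previously published texts, flagged above roughly a 20% threshold) can be sketched with shared word n-grams. Everything below, function names included, is illustrative and not any real tool’s API:

```python
def ngrams(text, n=5):
    """Lower-case word 5-grams; a crude stand-in for a similarity index."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(submission, prior, n=5):
    """Fraction of the submission's n-grams that also appear in the prior text."""
    ga, gb = ngrams(submission, n), ngrams(prior, n)
    if not ga:
        return 0.0
    return len(ga & gb) / len(ga)

def flag_for_review(submission, corpus_texts, threshold=0.20):
    """Flag a submission whose overlap with any prior text reaches ~20%."""
    return any(similarity(submission, prior) >= threshold
               for prior in corpus_texts)
```

A real service compares against a huge indexed corpus and reports per-source match percentages; this toy version mainly shows why common boilerplate (e.g., methods sections) inflates the score unless it is excluded, as the commenter notes for references.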
A special case is image plagiarism, which is almost impossible to track; in all the cases I have seen, I discovered it either accidentally or along with plagiarized text.
A second special case is copycat papers. This is not even plagiarism per se, but rather a situation where you see two articles on somewhat different subjects with very similar writing style and figure aesthetics. Although these articles have no overlapping authors, it is likely that the texts were written by the same person (or people) and certainly illustrated by the same artist. Technically, not much can be done about these papers. It often makes me wonder, however, whether the experiments described in them were ever conducted.
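On the image-plagiarism point above: one automated technique sometimes used to surface near-duplicate figures is perceptual hashing. Here is a minimal difference-hash (“dHash”) sketch over a tiny grayscale pixel grid, purely illustrative; real tools first resize each image to a small fixed grid (e.g., 9x8) and compare the resulting 64-bit hashes:

```python
def dhash(pixels):
    """Difference hash of a grayscale image given as rows of ints (0-255):
    each bit records whether a pixel is brighter than its right neighbour."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return tuple(bits)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def likely_duplicate(img_a, img_b, max_distance=4):
    """Images whose hashes differ in only a few bits are suspiciously similar."""
    return hamming(dhash(img_a), dhash(img_b)) <= max_distance
```

Because the hash encodes only brightness gradients, a uniformly brightened, cropped-and-rescaled, or re-compressed copy keeps (nearly) the same hash, while genuinely different images differ in many bits; this catches reused figures that exact byte comparison would miss.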
There is an Association of Research Integrity Officers. Surely, they must have an updated list of RIOs with contact information and more. The association even sponsors an annual conference, https://www.mskcc.org/ario2016, though I cannot seem to find the Association’s actual website at the moment.
Are you concerned about the corpus of the literature? Or is it acceptable to let previous plagiarists get away with their ill-gotten gains? IMO, journals should data-mine previously published research, because they will be doing a great disservice to honest authors if they do not start scanning backdated publications!
Usually one needs a subscription for Turnitin or iThenticate. How did DNA and Cell Biology manage to pay for this service?
It would just be one more cost of doing business, and they would have cut costs somewhere else or increased subscriptions.
The publisher, Mary Ann Liebert, is a member of Crossmark and has a subscription to iThenticate.
I like the idea of notifying the institutions and the funders. I don’t know how much these entities would care, but it’s a nice move to inform them.
The “31 accepted to 28 rejected for either scientific misconduct …or plagiarism” ratio is a lot for me to take in. I believe it is accurate, but I am astounded.
“I don’t know how to reach the integrity officers at companies when the authors are associated with industry.”
In law this would be called “assuming a fact not in evidence,” the fact being that there are integrity officers at companies. [I’m sure there are some companies that do, but many that do not, and won’t.]
But I think most companies would, even without such an officer, take any concerns very seriously.
Companies in the PRC are quite different, in many aspects and standards, from those in Western countries.
A point worth discussing: in the “exact sciences,” the more concise and on-point your wording is, the more “unique” it gets. The ultimate case of this would be mathematical formulae. The question is: once “the best” wording has been found for one and the same thing, is it really scientific to change it just to keep plagiarism software from flagging it?
How does iThenticate work? Does it detect cross-language plagiarism? Can an article published in 1990 be scanned for plagiarism?
Aceil,
Does it detect cross-language plagiarism?
Yes. It has 30 or more languages that it scans through.
Can an article published in 1990 be scanned for plagiarism?
Yes.