Elisabeth Bik, a microbiologist at Stanford, has for years been a behind-the-scenes force in scientific integrity, anonymously submitting reports on plagiarism and image duplication to journal editors. Now, she’s ready to come out of the shadows.
With the help of two editors at microbiology journals, she has conducted a massive study looking for image duplication and manipulation in 20,621 published papers. Bik and co-authors Arturo Casadevall and Ferric Fang (a board member of our parent organization) found 782 instances of inappropriate image duplication, including 196 published papers containing “duplicated figures with alteration.” The study is being released as a preprint on bioRxiv.
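For scale, those headline numbers work out to a roughly 3.8% duplication rate, the figure the study itself reports. A quick sanity check, using only the counts above:

```python
# Sanity check on the headline numbers reported in the study
papers_screened = 20621   # total published papers examined
papers_flagged = 782      # instances of inappropriate image duplication

rate = papers_flagged / papers_screened
print(f"{rate:.1%}")  # prints "3.8%"
```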
An example the paper uses of “duplication with alteration” is this Western blot where a band has been duplicated:
Bik’s procedure for finding these kinds of duplications is disarmingly simple. She pulls up all the figures in a paper and scans them. It takes her only about a minute to check all the images in a PLoS ONE paper, a little longer for a paper with more complicated figures. In some cases, Bik adjusts the contrast on an image to better spot manipulations.
My initial screen is without software, just quickly flipping through images. When I see something that might be a duplication, I will download the figure(s) and inspect them using Preview (Mac software). I use simple color or brightness adjustments in Preview. I don’t use Photoshop or other advanced programs.
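The kind of simple brightness or contrast adjustment Bik describes can make faint bands, and faint duplicated bands, easier to see. As a rough illustration only (the pixel values and stretch factor below are invented, not from the study), a basic contrast stretch rescales each intensity away from the image’s mean brightness:

```python
# Illustrative sketch of a simple contrast stretch, the kind of adjustment
# available in Preview. Values are hypothetical 8-bit grayscale intensities.

def stretch_contrast(pixels, factor=2.0):
    """Scale each 0-255 intensity away from the mean brightness, clamped to range."""
    mean = sum(pixels) / len(pixels)
    return [max(0, min(255, round(mean + factor * (p - mean)))) for p in pixels]

# A faint band (the 140 amid ~120s of background) stands out more after stretching:
row = [120, 122, 125, 140, 124, 121]
print(stretch_contrast(row))
```

Real screening tools operate on two-dimensional images, of course; the point is only that a linear rescale of intensities can reveal edits that are invisible at the published contrast.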
It gets easier to spot problems the more you look, Bik told us on the phone:
In Western blots, every band has their own characteristics, they’re like faces. I think if you train people, immediately they will see something is wrong. It takes you less than a second to recognize a face.
After screening the papers for this study, she then puts together detailed reports on the duplications and sends them to Casadevall and Fang, who both have to agree there was inappropriate duplication before inclusion in the paper.
For me, it’s very obvious. It sort of shouts out to me. Some of these examples are almost funny, in a disturbing way – they’re so obviously copied…On the other hand, these papers have been reviewed and seen by editors, and downloaded hundreds or thousands of times.
She admitted that there are almost certainly examples that she missed, noting that she got better and faster over time. But she said false positives are less likely, because the team required consensus from all three members to include a paper.
About 10% of the papers I flagged, we didn’t agree so I took them out…When three people agree that an image looks very similar, it might still be a different image, but I think it’s reason to flag that something is possibly wrong, and then talk to the author.
In 2013, Nature reported the findings of a previous screen of published papers — conducted by BioDigitalValley, an Italian bioinformatics startup founded by Enrico Bucci, and focused mostly on gel-electrophoresis images in Italian studies — which found about one in four papers had inappropriate duplications of images. Those included repeated use of the same image, as well as copied and pasted gel bands.
Mike Rossner of Image Data Integrity, former managing editor of the Journal of Cell Biology, was impressed by how many papers Bik screened, and how fast she was able to spot problems. “To look [for duplications] between different figures takes really good visual memory, and she must have that,” he told Retraction Watch. (Note: Rossner spoke to us about his new company in February.)
None of the papers included in Bik’s analysis had been retracted at the time she screened them. She has since submitted over 700 reports to journal editors showing the duplications, and written to about 10 institutions where she found what she calls “clusters” — three to six papers from the same group containing duplications.
This has resulted in six retractions, four of them in Fang’s journal, Infection and Immunity (which we covered at the time) — much lower than the 42% retraction rate she has received in the past from reporting plagiarism she found through Google Scholar searches. About 60 inappropriately duplicated images have been corrected since Bik first began reporting these findings to editors in the spring of 2014. She estimates an average of six months between her reporting and the retractions or corrections over the course of this project.
I do plan to write another paper on the lack of outcome with these reports…It’s not in [journals’] interest to retract the papers, because it will bring down their impact factor. If we publish their response rate, then maybe we can motivate them to show they care about the science.
Bik has struggled with journals in another respect — getting this latest analysis published. The paper has been rejected by three journals, one after peer review. The other two journals chose not to send it out at all.
“I expect this to be a controversial paper, no journal wants to hear a percentage of their papers is considered very bad,” Bik told Retraction Watch when asked why she thinks it was rejected. “One reviewer said, oh, this has to be published. Most of the others said it’s very controversial, that it’s not novel.”
At one of the journals that didn’t send it for peer review, the editors told Bik it was important work, but not suitable for their audience.
“The fact that image manipulation is going on is a problem with the scientific record,” Rossner told Retraction Watch. “It seems to me she should be able to find a place that will publish this.”
Casadevall is the editor in chief of mBio, which showed an inappropriate duplication rate of 1.71%; at Infection and Immunity, that rate was 2.8%.
David Vaux, a cell biologist at the Walter and Eliza Hall Institute of Medical Research in Melbourne and a board member at our parent organization, told us he believes this study is an important step towards a better scientific record:
The paper from Bik, Casadevall and Fang provides strong evidence supporting the idea that a major reason for the lack of reproducibility of science papers is deliberate falsification of data.
They completed the Herculean task of visually inspecting the figures in over 20 thousand papers, looking for image duplications. They then looked at suspicious images more closely using image processing software. In this way they detected duplicated images in 3.8% of papers. Although they could not distinguish accidental duplications (e.g. by incorporating the same image file twice in a multi-part figure), they did sub-categorize the papers into less worrying classes such as “cuts”, “beautification”, and “simple duplications” and more worrying classes in which the duplicated images were repositioned or further altered such as by stretching or rotating.
Although they mainly focused on papers with Western blots, their findings in no way suggest that Western blotting is a flawed method. Indeed, it suggests that Western blots are harder to fake in an undetectable way than other experimental data.
The strength of their evidence should be enough to convince everyone that there is a major problem with how research is being conducted. Now we need to determine what to do with this information. Should journals implement similar visual screens, should they use computerized screens, or some sort of combination? What is the best way to handle suspicious images once they are detected?
Bik told us she believes peer reviewers and editors should be more aware of inappropriate duplications, and take the time to look for this kind of problem before a paper gets published, so journals don’t have to deal with retractions after the fact.
“You should be flagging this when you see it, give authors a chance to do something” like fix the questionable figures, she said. “It’s better to reprimand somebody in private, rather than in public.”