Plague of anomalies in conference proceedings hints at ‘systemic issues’

Hundreds of conference papers published by the U.S.-based Institute of Electrical and Electronics Engineers (IEEE) show signs of plagiarism, citation fraud and other types of scientific misconduct, according to data sleuths.

“I am concerned that the issue with these particular conferences is widespread enough such that it indicates systemic issues with their peer review systems,” Kendra Albert wrote last August in an email to IEEE that Retraction Watch has seen. 

Albert is a clinical instructor at Harvard Law School and a lecturer in women, gender, and sexuality at Harvard University. On the side, Albert has been working with Guillaume Cabanac, a professor of computer science at the University of Toulouse, in France, to ferret out research misconduct using a computer system called the Problematic Paper Screener.

The tool flags tortured phrases, which suggest a publication’s authors have copied work from other researchers and tried to disguise the offense by using paraphrasing software that also renders scientific terminology near-unintelligible. “Breast cancer” might become “bosom peril,” for instance, whereas “artificial intelligence” could turn into “counterfeit consciousness.”
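In essence, this kind of detection boils down to matching manuscript text against a curated list of known “fingerprint” phrases. The sketch below illustrates the idea in Python; the phrase list (drawn from the two examples above) and the function name are ours, for illustration only, and are not the actual Problematic Paper Screener code.

```python
# Minimal sketch of tortured-phrase flagging (illustrative only;
# not the actual Problematic Paper Screener implementation).
import re

# Known "tortured" rewrites paired with the established term they replace.
# These two pairs come from the examples in the article; a real screener
# relies on a much larger, community-curated list.
TORTURED_PHRASES = {
    "bosom peril": "breast cancer",
    "counterfeit consciousness": "artificial intelligence",
}

def flag_tortured_phrases(text: str) -> list[tuple[str, str]]:
    """Return (tortured phrase, expected term) pairs found in the text."""
    hits = []
    lowered = text.lower()
    for tortured, expected in TORTURED_PHRASES.items():
        # Whole-phrase match, case-insensitive.
        if re.search(r"\b" + re.escape(tortured) + r"\b", lowered):
            hits.append((tortured, expected))
    return hits

# Example: an abstract using a tortured variant of "artificial intelligence".
abstract = "We apply counterfeit consciousness to detect lung disease."
print(flag_tortured_phrases(abstract))
# [('counterfeit consciousness', 'artificial intelligence')]
```

A hit does not prove misconduct on its own; as Albert notes later in this piece, flagged papers still need human review before any conclusion is drawn.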

“IEEE keeps publishing proceedings riddled with tortured papers and other anomalies,” Cabanac told Retraction Watch.

The publisher said it believes its “preventive measures and efforts identify almost all papers submitted to us that do not meet our standards,” and told Retraction Watch that it is “currently evaluating the papers in question.”

IEEE has faced problems with peer review in the past, as we reported last year when it pulled 400 conference papers at once. Over the years, IEEE has retracted thousands of papers, accounting for a sizeable chunk of the retractions in our database.

In their email, Albert cited a paper from the proceedings of the 2021 International Conference on Computer Communication and Informatics whose abstract begins: “Tumor growth is the real reason for death on the planet, furthermore, lung disease is the most every now and again observed sort of Tumor growth among others.”

The paper also contains tortured phrases – say, “arbitrary backwoods” instead of “random forest” – as detailed in a PubPeer comment.

“This paper claims to be a survey of machine learning tools for detection of lung cancer, however, the abstract is incoherent, many of the citations are to irrelevant literature on epilepsy detection via EEG, and much of that literature is incorrectly summarized as being about lung cancer detection,” Albert wrote. “It is unclear to me how any reasonable peer review process would accept such an article.” 

The article’s senior author, K. Venkata Rao, professor and head of the department of computer science and systems engineering at Andhra University in India, did not respond to requests for comment from Retraction Watch.

According to Albert: 

Many other papers published in IEEE conference proceedings and flagged by the Problematic Paper Screener and reviewed by human assessors found clear plagiarism, signs of citation fraud, or errors of a type that should have been caught by an editing process. Of note, few, if any, of the authors have responded to PubPeer inquiries about the irregularities in their papers.

How so many suspicious papers could slip through peer review is unclear. But an interview with a university researcher familiar with the issue suggests one possible answer. 

The researcher, who requested anonymity for fear of reprisal from his institution, said he recently submitted an article to a conference that is “technically sponsored” by IEEE and organized by his university (we are withholding the name of the conference to protect his identity); proceedings from such conferences are available for purchase from IEEE. When the researcher learned that a pair of colleagues would be reviewing his paper, he reported this conflict of interest to the conference chair. The chair, however, told him it was standard procedure to recruit reviewers from the host institution, which also supplied most of the papers for the conference, according to the researcher.

“What they do is to motivate all the faculty members and students to submit a low quality article which will be accepted through a fake peer review by a fellow colleague,” the researcher said in an email. “If the submitted articles in such events were reviewed independently by the international experts in the field no article can get accepted.”

Not only does this system provide a large number of ostensibly peer-reviewed publications for the host institution, it can also be exploited to ensure that the organizers’ work is cited in the submitted papers, the researcher explained. He added that the “systematic misconduct” was “not limited to IEEE conferences,” and cautioned that not all of the publisher’s conferences are tainted. 

“There are numerous correct IEEE conferences with” acceptance rates of “only 30%,” the researcher said, “where the independent or external chairs handle peer reviews.”

As a potential starting point for an investigation, Albert supplied IEEE with a list of 15 conferences whose proceedings contained up to 49 flagged articles each. 

“Of course, not all papers flagged by the Problematic Paper Screener are indicative of scientific misconduct,” Albert wrote, “but the high numbers from these conferences, combined with the findings from our manual reviews suggests that this is likely to be a systemic problem rather than an isolated one.”

We reached out to the chairs of three conferences — the 2022 Fourth International Conference on Emerging Research in Electronics, Computer Science and Technology, the 2021 International Conference on Artificial Intelligence and Smart Systems, and the 2022 5th International Conference on Contemporary Computing and Informatics — whose proceedings have drawn criticism on PubPeer, but did not hear back.

In an email to Retraction Watch, Monika M. Stickel, an IEEE spokesperson, said:

IEEE continuously reviews and improves our processes to ensure that we detect articles that do not meet our standards and that are reported to IEEE. We believe our preventive measures and efforts identify almost all papers submitted to us that do not meet our standards, and we act accordingly when we are alerted to an error and take the appropriate level of care and time in our review. IEEE is currently evaluating the papers in question. Our policy for removing a paper or providing a label is contingent on the outcome of our evaluation.

Meanwhile, IEEE is still selling hundreds of suspicious papers on its website. The gibberish-laden article about lung-cancer detection, for example, can be purchased for $33.
