ORI: Ex-grad student “falsified and/or fabricated” data in PNAS submission

A former graduate student falsified or fabricated data in a manuscript submitted to the Proceedings of the National Academy of Sciences, according to the Office of Research Integrity at the U.S. Department of Health and Human Services.

In a finding released Dec. 8, ORI said that Matthew Endo, a former graduate student at the University of Illinois at Urbana-Champaign, “intentionally, knowingly, or recklessly” caused false data to be recorded, and “falsified and/or fabricated data and related images” by altering, reusing, or relabeling them.

Endo has agreed to a settlement, effective Nov. 16, which requires him to work under supervision for three years on projects supported by the U.S. Public Health Service, among other conditions.

The manuscript, entitled “Amphotericin primarily kills human cells by binding and extracting cholesterol,” was submitted to PNAS but was withdrawn prior to peer review.

Specifically, ORI found that Endo used tactics to make results look better than they actually were, such as altering a laboratory test result to make a drug preparation “appear more pure than in the actual results of experimentation,” and lying about the number of times he’d run an experiment. As an example:

Continue reading ORI: Ex-grad student “falsified and/or fabricated” data in PNAS submission

New feature aims to draw journals into post-publication comments on PubPeer

Brandon Stell

When a paper is challenged on PubPeer, is a journal paying attention? A new feature recently unveiled by the site makes it easier to find out. The Journal Dashboards allow journals to see what people are saying about the papers they have published, and allow readers to know which journals are particularly responsive to community feedback. We spoke with co-founder Brandon Stell to get more information.

Retraction Watch: Can you briefly describe the Journal Dashboards and how they work?

The dashboards are a collection of features that we created to make it easier for journal editors to track and react to comments on their journal.  The dashboards allow journals to create teams whose members receive immediate alerts to new PubPeer comments.  They will also be able to access other information such as statistics of commenting trends across the journal.  Specialized searches will also be available. At the moment the dashboards are available to journal editors only but we hope to offer a similar service for institutions in the near future.

RW: What prompted PubPeer to create the Journal Dashboards?

Continue reading New feature aims to draw journals into post-publication comments on PubPeer

“(Hundreds of hours of) work vindicated:” Critic of food researcher reacts to new retraction

Nick Brown

Ever since Cornell food researcher Brian Wansink wrote a blog post one year ago praising a graduate student’s productivity, things have gone downhill for him. He initially lauded the student for submitting five papers within six months of arriving at the lab, but after the research community began scrutinizing the work, all four of the papers about pizza have since been modified in some way; two have been outright retracted. On Friday, Frontiers in Psychology retracted the fifth paper, about the shopping behavior of military veterans, with a notice stating that a journal probe found “no empirical support for the conclusions of the article.” The retraction — covered by BuzzFeed — was likely not a surprise to Nick Brown, a PhD student at the University of Groningen, who had expressed concerns about the paper in March.

Retraction Watch: You note that this newly retracted article was co-authored by the graduate student Wansink initially blogged about, but wasn’t as heavily scrutinized as the four papers about pizza consumption she also co-authored. Why do you think this paper wasn’t as closely examined?

Continue reading “(Hundreds of hours of) work vindicated:” Critic of food researcher reacts to new retraction

Make reviews public, says peer review expert

Irene Hames

After more than 30 years working with scholarly journals, Irene Hames has some thoughts on how to improve peer review. She even wrote a book about it. As the first recipient of the Publons Sentinel Award, Hames spoke to us about the most pressing issues she believes are facing the peer review system — and what should be done about them.

Retraction Watch: At a recent event held as part of this year’s Peer Review Week, you suggested that journals publish their reviews, along with the final paper. Why?

Irene Hames: I don’t think that saying something is ‘peer reviewed’ can any longer be considered a badge of quality or rigour. The quality of peer review varies enormously, ranging from excellent through poor/inadequate to non-existent. But if reviewers’ reports were routinely published alongside articles – ideally with the authors’ responses and editorial decision correspondence – this would provide not only information on the standards of peer review and editorial handling, but also insight into why the decision to publish has been made, the strengths and weaknesses of the work, whether readers should bear reservations in mind, and so on. As I’ve said before, I can’t understand why this can’t become the norm. I haven’t heard any reasons why it shouldn’t, and I’d love the Retraction Watch audience to make suggestions in the comments here. I’m not advocating that the reviewers’ names should appear – I think that’s a decision that should be left to journals and their communities.

Continue reading Make reviews public, says peer review expert

Former Emory, Georgetown postdoc falsified cancer research data: ORI

A former postdoc at Emory and Georgetown Universities falsified data in manuscripts and a grant application to the U.S. National Institutes of Health, according to the Office of Research Integrity (ORI) at the U.S. Department of Health and Human Services.

Mahandranauth Chetram committed misconduct while at Georgetown, the ORI said in a finding released today

Continue reading Former Emory, Georgetown postdoc falsified cancer research data: ORI

Weekend reads: Ethical issues could cost university millions in funding; Stolen bone raises questions; Ingelfinger rides again

The week at Retraction Watch featured the story of how a nonexistent paper earned 400 citations, a lawsuit filed against a journal for publishing criticism, and the retraction and replacement of a paper by a group of anti-vaccine advocates. Here’s what was happening elsewhere:

Continue reading Weekend reads: Ethical issues could cost university millions in funding; Stolen bone raises questions; Ingelfinger rides again

Newly released AI software writes papers for you — what could go wrong?

This week, we received a press release that caught our attention: A company is releasing software it claims will write manuscripts using researchers’ data. 

The program, dubbed “Manuscript Writer,” uses artificial intelligence (AI) to generate papers, according to the company that created it, sciNote LLC. A spokesperson explained that the software generates a first draft the scientist should revise, and that it won’t write the Discussion, “the most creative and original part of the scientific article.” But can it provide any coherent text?

According to the release from sciNote, Manuscript Writer (an add-on to the company’s Electronic Lab Notebook, or ELN):

Continue reading Newly released AI software writes papers for you — what could go wrong?

Looking to avoid a bad lab? A new site wants to help

We’ve all heard horror stories of lab disputes that can quickly spin out of control. (Such as a graduate student obtaining a restraining order against his supervisor, which we covered earlier this year for Science.) Naturally, prospective students want to do their homework before committing to a particular laboratory or supervisor. A new website, QCist, is trying to make that process easier, by letting students rate labs. It’s still new – only several dozen lab heads have been rated so far, mostly from the U.S. – but founder and Executive Director Qian-Chen Yong has plans for it to grow much bigger. We spoke with Yong, currently a research fellow at the Cancer Research Institute, Baylor Scott & White Health in Texas — who completed a postdoc at Texas A&M Health Science Center and a PhD at the National University of Singapore — about the plan to keep the site from becoming a place to smear a tough boss’s reputation.  

Retraction Watch: What inspired you to create this site?

Continue reading Looking to avoid a bad lab? A new site wants to help

Weekend reads: Death penalty for scientific fraud?; Why criticism is good; Cash for publishing

The week at Retraction Watch featured revelations about a case of misconduct at the University of Colorado Denver, and the case of a do-over that led to a retraction. Here’s what was happening elsewhere:

Continue reading Weekend reads: Death penalty for scientific fraud?; Why criticism is good; Cash for publishing

RAND withdraws report on child welfare reform for further analysis

Last week, Emily Putnam-Hornstein, an associate professor at the University of Southern California, was reading what seemed like a noteworthy new report from the RAND Corporation on the child welfare system. But then she realized that some of the key estimates were off. When she sent the report to some colleagues, they agreed.

Curious, Putnam-Hornstein and some of her colleagues tuned into a RAND webinar on Thursday, May 25, to discuss the report, Improving Child Welfare Outcomes: Balancing Investments in Prevention and Treatment, which had been released two days earlier. They asked the report’s lead author, Jeanne Ringel, about the numbers, and Ringel responded by saying they were on target. (Ringel recalls acknowledging that the numbers were conservative, but that revised inputs would not change the overall results substantially.) The Pritzker Foundation, which had funded the study, also dismissed the concerns.

Ringel, however, contacted Putnam-Hornstein to suggest a phone call. The Memorial Day holiday weekend was just about underway, so the call was scheduled for Wednesday, the 31st. In the meantime, Putnam-Hornstein and other researchers drafted a letter explaining their concerns. When the conference call took place on the 31st, the critics laid out those concerns and said they would publish the letter online if the report was not retracted swiftly.

Apparently, the critics were persuasive:

Continue reading RAND withdraws report on child welfare reform for further analysis