Should researchers guilty of misconduct go to “rehab”?


A report on the first few years of “researcher rehab” suggests that three days of intensive training have a lasting impact on participants.

Specifically, among participants — all of whom had been found guilty of at least one type of misconduct — the authors report that:

A year later, follow-up surveys indicate that the vast majority have changed how they work.

The authors claim this shows the program is worth the time and investment — a $500,000 grant from the National Institutes of Health, and a cost of $3,000 per participant for the three-day course. Do you agree? Tell us what you think in our poll at the end of the story.

Infractions ranged from consent issues for human subjects to plagiarism and outright fraud. Still, researchers who need this training aren’t much different from everyone else, the authors note in “Lessons of researcher rehab,” published today by Nature:

We have now trained 39 researchers from 24 institutions. Researchers in our programme do not display personality traits that are distinct from the general population of scientists. We believe that most researchers may be susceptible, and that the busiest ones are most likely to err.

Half of participants in the program — initially called RePAIR, now dubbed the Professionalism and Integrity Program, or PI Program — had enrolled because of “failure to provide oversight, leading to problems,” the authors note:

Three causes played a part in most cases: paying too little attention to details or oversight; being unsure about relevant rules; and not prioritizing compliance. All these could be attributed to other, more basic causes. For example, many participants provided too little oversight of their teams because they were overextended or understaffed. People sometimes were unsure of rules after moving into a new area of research. They also encountered regulations that had grown more complex since they completed their training.

None were serious cases, they add:

There are high-profile cases of serial fraudsters who have consciously built their careers on fabricated data and who, some research suggests, have personality disorders. We do not encounter such individuals in our programme. In general, we work with talented faculty members who seek to do good research and whom institutions wish to retain.

The project has revealed some “myths” about misconduct, according to James M. DuBois at Washington University in St. Louis and his colleagues — namely, that “only bad apples get in trouble:”

…participants’ infractions rarely resulted from a conscious intent to mislead or break rules.

Another myth: “the more publications and grants the better:”

By the metrics that institutions use to reward success, our programme participants were highly successful researchers; they had received many grants and published many papers. Yet, becoming overextended was a common reason why they failed to adequately oversee research. It may also have led them to make compliance a low priority. People who are too busy must triage, and what scientist wants to prioritize checking patient signatures above data gathering?

Principal investigators should protect themselves and their labs by taking on no more projects than they can responsibly oversee and adequately staff.

In all, DuBois and his co-authors argue the program is worth the time and investment:

Following the workshop, our participants demonstrate more positive attitudes toward compliance, improved problem-solving skills and better lab-management habits… In our view, intense, individualized training following a breach can be remarkably effective. And it is unquestionably much more cost efficient than letting problems fester until even bigger problems arise for investigators and institutions. Our participants have gone from analysing their own lapses to customizing solutions, such as holding more face-to-face meetings or developing [standard operating procedures].

Do you agree with the authors? In 2013, after the program began, we ran a poll asking that question, and one-third of readers told us they didn’t believe researchers found guilty of misconduct can be rehabilitated (and posted nearly 90 comments in response). Now that the first results are out, we’re asking the question again. Tell us what you think, below.


Update: 6/8/16 2:31 p.m. eastern: We heard from Donald Kornfeld at Columbia University, who noted that — as the authors say — this program did not include researchers evaluated for serious offenses, such as those sanctioned by the U.S. Office of Research Integrity (ORI):

If this program is limited to individuals who have either committed plagiarism or failed to follow various Regulations re animal welfare and informed consent, they are not working with the 88% of individuals found guilty of research misconduct annually by ORI. Over the ten years I studied ORI Reports, only 12% of guilty offenders were guilty of plagiarism; 45% were guilty of fabrication and 66% of falsification.


21 thoughts on “Should researchers guilty of misconduct go to “rehab”?”

  1. James DuBois has been a friend and colleague for many years. I have been uniformly impressed by his work, including RePAIR and the PI Program.

    However, I think it’s important to consider some characteristics of the program that suggest (to me, anyway) that it would be difficult and expensive to ramp up to serve a significant percentage of researchers found guilty of bad behavior. For example, my understanding is that the program has a small but highly qualified staff of the kind that cannot be found just anywhere.

    I haven’t read the report yet, but from what I’ve heard over the years, including some statements by DuBois, the program addresses several kinds of bad management of science, and “research misconduct” in the narrow U.S. federal definition is just one concern.

    I hope that the PI Program continues to do such good work, and I hope (with less confidence) that franchises can be created around the country – but I don’t think it’s a panacea.

  2. I see this in two ways:
    1. There should be criminal charges + rehabilitation for serious offenses and fraud. And a permanent ban from science.
    2. There should be rehabilitation for minor offenses and serial offenders. And a temporary ban from science (e.g., 1-3 years).

    Two other questions:
    A. Is this an extension of the for-profit penal system in the US?
    B. Is the $3,000 per participant a fair price for only 3 days, and what does it cover? And is 3 days enough to rehabilitate an “offender”?

  3. Not every kind of avoidance behavior ought to be labeled with the quasi-honorific term “rehabilitation,” especially if the behavior is an adaptive response to having been discovered violating a standard. Many, if not most, research institutions not only provide incentives to cut corners, e.g., taking personal risks for “the good of the group,” but also shy away from procedures for discovering violations and enforcing sanctions.
    “Rehabilitation” is the au courant label for punishment visited on privileged members who get caught in a violation, but whose mea culpa and personal vows of reform are accepted as sufficient to readmit them to the communities they may have embarrassed.
    If not based on misinformation, trying to cheat is rational. Cheating pursues efficiency, which morality tends to disregard. See http://goo.gl/hIBc6c
    Cordially — EGR

  4. Instead of spending money on people who know better and choose not to do the right thing, bar them from public funding for LONG terms. Why spend even more money on cheaters?

  5. I voted no on the principle that the taxpayer shouldn’t be stuck with the bill. If institutions want to retain the offenders, then either they or the employees themselves should foot the bill for rehab.

  6. It seems to me that any institution receiving federal funding ought, as a condition of receiving it, to be required to conduct regular (re)training of its PIs in ethics, compliance, research integrity, employment law, etc.

    For example, my industry employer requires completion of self-study refreshers in ethics / harassment / discrimination annually. My particular work group has additional domain-specific legal compliance refresher requirements. There is additional training for managers. The time cost I incur is about 1-2 hours/year.

    That is: Especially if it is believed that “infractions rarely [result] from a conscious intent to mislead or break rules” — why provide this training only AFTER a bad thing happens (and, potentially, there is collateral damage to multiple careers)?!?

    As EGR implies above, it’s also important for the institution to convey that it takes such issues seriously BEFORE problems occur.

    I’m not a professional scientist. My impression from reading RW and from talking with professional scientists I know has been that there is no such training requirement in academia, and that really leaves me scratching my head. I’m happy to hear a reason why I’m all wet.

    PS
    Re: “federal funding” — I write with a U.S.A. perspective, but perhaps the sentiment is generalizable globally.

    1. Training requirements (at various levels of specificity, focused on a set of topics, and directed to certain classes of individuals conducting research) are already in place for NSF and NIH grantees. Grantee institutions may choose to meet these requirements at a minimum, or may choose to expand the training and the classes of individuals required to complete it. Given that these federal funding requirements were put in place several years ago, researchers recently found to have committed research misconduct in federally funded research have often already completed some training, either at the minimal required level or sometimes even at an expanded level. That previous training therefore becomes a factor in devising an appropriate response following a finding of research misconduct.

  7. I think everybody deserves a second chance, but I also think a researcher guilty of misconduct should suffer some penalties.

    1. I agree, including paying for their own rehabilitation. Expecting taxpayers to pay for it makes no sense to me. Or let their institution pay for it out of their over-the-top “overhead”.

  8. There is secondary gain here. It is a crime and should be treated as such. No need to medicalize it with “rehab”.

  9. I voted no because it is not addressing the root causes of the problem: current reward structure, and journals’ preferences for significant ‘interesting’ effects. You can’t expect all researchers to do the right things when they operate in a culture that is not ‘right’, to begin with.

    1. If a PI bullies his/her PhD student/post-doc because they aren’t getting results, causing them to falsify gel lanes with Photoshop to get something published, who exactly should go to rehab?

  10. Rather than rehab fraudsters, it seems more important to help whistleblowers. They are the ones who can get truly scarred in the process, then drop out of science, discouraged. No one even notices.

  11. An even simpler solution: just eject them from the system altogether. There are so many hard-working and deserving scientists – ranging from young to retired – who are unemployed but who would die for a chance to make an honest career from their skills. But they can’t, because the system is clogged in certain circles by bad and powerful apples. Throw out the bad apples, and you create job competition (competition to do better, not an attempt to rehabilitate bad apples), and you save taxpayers’ money.

    1. I truly appreciate efforts to rehabilitate offenders and strongly believe in second chances. But at a time when fewer than 15% of newly minted STEM Ph.D.s will have the opportunity to land a tenure-track position, it will be difficult to justify to them, let alone to taxpayers, any expenditure of public funds on behalf of those who have so blatantly violated our trust.

  12. “…they are not working with the 88% of individuals found guilty of research misconduct annually by ORI.”

    Huh? 88% of individuals are found guilty of research misconduct annually? What’s the universe from which that 88% is calculated?

    1. What Don Kornfeld meant was that 88% of the individuals found by ORI to have committed research misconduct committed acts of falsification and/or fabrication, not plagiarism.

      In my 25 years of handling research misconduct cases for ORI and with public and private research institutions, most of the 500 or so cases were like those outlined in this piece – but 5-10% involved seriously sociopathic and/or self-convinced researchers who are likely never to sign up for rehab nor to be rehabilitated.

      But that is not the group that James DuBois and his unique, sophisticated, and fine program targets and gets: those who agree they need help, and who personally, or whose institution, are willing to pay for rehabilitation training. I support that effort.

    2. For clarification: Of the total sample of those ORI found guilty of research misconduct in the ten-year period studied, 88% were guilty of either fabrication, falsification, or both. The remaining 12% were guilty of plagiarism.

      Don

  13. I am not a fan of these workshops. The ‘original’ one may be useful in a limited sense, but when expanded, these things automatically and seamlessly devolve into a webinar, web-questionnaire, online course or some such, with ridiculous ‘case studies’, followed by multiple-choice questions and a certificate / Record of Completion that has to be submitted to some administrator. I’ve seen quite a few of these – COI workshops, sexual harassment web ‘course’ etc. etc. Most of them were a total waste of time! It’s just yet another way to expand the bureaucratic and admin bloat.

    I think the problem is better addressed by expanded access to tools for lab management, fraud detection, well designed electronic lab notebooks and the like – tools that can be deployed before a manuscript is submitted.
