Why do scientists commit misconduct?

Cristy McGoff

What makes a person fabricate data? Pressure from different corners of his or her life to get published, win funding, or earn promotions? Are there personality traits that occur more often among those caught committing misconduct? We spoke with Cristy McGoff, director of the research integrity office at the University of North Carolina at Greensboro – who also has a master’s degree in forensic psychology from the John Jay College of Criminal Justice – about the minds of serial fraudsters.

Retraction Watch: Let’s start with your background. Why did you make the switch from forensic psychology to research compliance?

Cristy McGoff: My first position utilizing my degree was at an alternative-to-incarceration program in Brooklyn, New York. After two years, I was considering other career options, and a PhD program in forensic psychology at an accredited university was hard to find at that time, so I needed to think about other areas where my master’s degree might be an asset. Though I’d spent years studying criminal behavior, my favorite part was researching the reasons people learn to engage in dangerous behavior. I realized I wanted to be part of research and use my degree in some capacity.

My social science background was the reason I was originally asked to take my first research compliance position at Teachers College-Columbia University reviewing human subjects research protocols, as there is a need to think analytically and to apply an understanding of human behavior, as well as the federal regulations, to the review process. I also enjoyed being part of research advancement in that role. In my current position, I oversee the Institutional Review Board (IRB), the Institutional Animal Care and Use Committee (IACUC) and the Institutional Biosafety Committee (IBC), so I have expanded my knowledge of research in general. I am also the Research Integrity Officer, and in this role, I have discovered that my education and experience are an asset in understanding the behaviors involved in the act of research misconduct.

RW: How often do you apply what you learned about the minds of criminals to your work in research integrity?

CM: I don’t want to be dramatic in this analysis, but I think that there are levels of narcissism and even sociopathy within some cases of those who engage in research misconduct. I think some institutions may see glimpses of it in the category of noncompliance and repeated noncompliance — a lesser, though sometimes reportable, form of research misconduct under the federal guidelines. While there are certainly researchers who make mistakes or may not be trained adequately within their fields, there are some that may be showing evidence of future behavior and possibly eventual research misconduct. I might use my degree and my experience with the criminal population as a way of predicting patterns. The environment of research advancement contains levels of ego, competitive behaviors, and the need to be respected both by peers and students. While those involved may not have started out in their field with these traits, such traits can be bred just by being within this culture daily and seeing them modeled all around.

RW: So many types of misconduct are perplexing, because there’s such a high risk the fraudster will get caught. For instance, plagiarism and duplication are easily spotted. Then there are researchers who doctor data repeatedly, knowing each time it’s a huge risk. What compels people to take such risks, in your opinion?

CM: I think that while competition plays a part in the quest for funding, ego can be a large barrier to ethical conduct of research in these instances. I think that the self-perception of being respected and all-knowing can lead someone to “push the envelope.” They might think that since they are an expert in their field at such a great level, no one else would even understand the manipulations they attempt to hide. Many researchers also see the research integrity efforts of an institution as a hindrance to their success, but the role of a research integrity officer is essentially to protect the public from dangerous behaviors based on this self-perception. Of course there is a level of laziness to data manipulation, falsification and plagiarism, but cutting corners is not what I would consider the motivation for the majority of these individuals, especially serial “fraudsters.” Although many people would consider the risks involved, I am not sure the risk of engaging in research misconduct enters into their minds. I think it is fleeting if it does, and it is mostly because they are out for themselves.

RW: Are there any traits of serial fraudsters that overlap with traits typical of some personality disorders?

CM: I don’t think that all researchers accused of or found to have committed research misconduct have a history of mental health issues per se, but I think some of them fall under the pressures and culture of the research environment. If there are hierarchies that only allow room for a few, and the only way to succeed in your area of expertise is to test the ethical waters or commit fraud as a way to achieve that higher status, then this behavior may be engaged in without any previous history relative to the acts. That said, I do think it is possible that there are individuals in this area with personality disorders. Personality disorders, though they appear in the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), are not considered diseases with traditional pharmaceutical treatment, but rather a series of traits inherent to the way an individual gets through life and interacts with people. One that comes to mind is Narcissistic Personality Disorder, which easily could describe many features of a highly competitive researcher. Narcissistic Personality Disorder is described in the DSM-5 by exaggerated self-importance, the need for admiration and power, lack of empathy, a sense of entitlement, and selfish behavior that takes advantage of others (such as graduate students and postdocs).

RW: Have there been any cases of research misconduct you’ve directly been involved with where you’ve applied your training in forensic psychology?

CM: Unfortunately, yes. My knowledge of this field did help me try to understand the motives of a particular individual, and it also disturbed me to realize how complicated the investigation I was involved in might be beneath the surface. I felt at the time I was dealing with someone just learning how to get away with this behavior, and it appeared they would probably end up fine-tuning it the next time. I was also left wondering if they had done it before. There were other serious issues outside of research that grew out of this investigation, so I think I was correct in my assumptions.

RW: What can universities and institutions do to minimize cases of serial misconduct?

CM: I think that a strong Responsible Conduct of Research program is key. I think a culture change, or an effort to change the culture of a research lab, is equally important. The history of research ethics is fairly young, and the regulations that govern research are just as young, but also rapidly evolving. If you consider that there is a generational gap between who is teaching and who is learning to perform research, you’ll realize that there are a lot of years between them in most situations. Add to that the individual training and experience of the individual principal investigators and those they mentor, and you have a recipe for inconsistency in the way research is carried out. If the ultimate goal is to find something new and gain respect and recognition in your field, then the path to that goal needs to be consistent in its values. These efforts may be difficult and take a long time, but I think eventually they will be worth it. It is worth reminding members of a research lab that they can fail to confirm their hypothesis and still find important results. Maybe it will be those results that bring them recognition one day.


23 thoughts on “Why do scientists commit misconduct?”

  1. Based on my 17 years in the federal OSI/ORI, I fully support Cristy McGoff’s expert opinion that some research misconduct perpetrators (the worst of the senior ones) are very narcissistic and even sociopathic. Has she or anyone else published on such observations?

      1. Not sure if the following couple of references will help you, but there is a significant connection between the number of serial killers in a country and the number of Nobel laureates per capita in the same country. This result may be obtained via a double correlation:

        1) Correlation between countries’ annual per capita chocolate consumption and the number of Nobel laureates per 10 million population:
        Franz H. Messerli (2012) http://dx.doi.org/10.1056/NEJMon1211064
        (see Fig. 1: R=0.8, p<0.0001, n=23)

        2) Correlation between Countries’ Annual Per Capita Chocolate Consumption and the (log) number of serial and rampage killers per capita:
        Seán Roberts & James Winters (2013): http://dx.doi.org/10.1371/journal.pone.0070902
        (See Fig. 10: R=0.52, p=0.02, n=18).

        Now, more seriously, remember that "Cum hoc ergo propter hoc".

  2. RW makes the statement that “So many types of misconduct are perplexing, because there’s such a high risk the fraudster will get caught.”
    This statement is based on an unsupported assumption, which is that the risk of detection is high. I do not believe that there are significant data on the risk of detection. However, based on the emerging understanding that much and perhaps most of what is published is irreproducible, the prevalence of fraud may be much higher than is generally recognized. Thus, the risk of detection may in fact be very low and people who commit fraud may be balancing the risks and rewards and concluding that the rewards in terms of publishing more papers and getting more grants are worth what may be a very low risk of detection. Could it be true that in science, unlike bank robbery, crime does pay?

    1. Indeed. In eLife 2014;3:e02956, Fang & colleagues combine their previous observation that 0.02% of published papers get retracted for misconduct (PNAS 2012;109:17028-17033) with Fanelli’s observation that 2% of scientists admit to misconduct (PLoS ONE 2009;4:e5738) to conclude that perhaps only 1% of misconduct is detected (0.02% ÷ 2% = 1%).

    2. Perhaps a better way of expressing it would be that it’s surprising because so many fraudulent behaviours do not stand up to the slightest scrutiny. While the odds of someone checking may be low, the odds of them spotting it should they look are high. I think one of the most surprising things about many of the cases of misconduct we see here on RW is that they are so basic, obvious and very unsophisticated.

      1. I disagree that it is surprising that the fraudulent behaviors that are caught are so basic, obvious and unsophisticated. Rather, I think that this is exactly what would be expected given how hard it is to catch fraudulent behavior. Thus, the only behaviors that can be definitively attributed to fraud are those that are obvious, brazen, and unsophisticated. What I believe is likely is that the vast majority of fraud and misconduct is never caught. For example, graphs based on numerical data, which make up a high percentage of data presented in the literature, are easily manipulated and almost impossible to catch. Unless the datasets are very large and thus amenable to statistical analysis for irregularities, such data almost never form the basis for misconduct/fraud allegations precisely because it is almost impossible to catch.

        1. It’s impossible to know how much doesn’t get caught, but I think what’s surprising is these very long running frauds and how unsophisticated many of them are. There probably are some who are perpetrating highly skilled frauds, but when you look at some of the biggest scandals, most of them got away with it because no one looked, not because they were clever.

  3. Yes – indeed!

    This is very important. It is high time that the link between Narcissistic Personality Disorder and science fraud was made publicly!

    I support Ms. McGoff and Dr. Price’s observations that the worst of the perpetrators of scientific misconduct fulfill the DSM criteria for NPD.

    The fact that NPD and science fraud are increasing together is definitely not a coincidence.

    Bravo!

  4. At one level, I don’t understand how the immediate perception about detection, the risk of being caught, ever plays into the decision to falsify data. More specifically, I never stumbled across any respondents in my years at ORI who falsified or fabricated data thinking they might get caught. Part of the evidence for that is the varying degrees some went to disguise their falsification; sometimes elaborate, other times no effort whatever in what would be immediately apparent to the uneducated eye! Virtually all of them thought their efforts would not be detected. Sociopathy contributes more to this event, as stated by Alan Price. Take a general level of ~one percent sociopathy in the population, whether it is lawyers, scientists, politicians, or televangelists, and you are still at a level that is 25-fold greater than the yearly number of retractions in science.

    1. I agree. I think that the grandiosity involved cloaks the acknowledgement of risk. Since the PI engaging in research misconduct believes they are the only expert, they may not think that anyone else could detect it.

      1. That certainly fits my experience. I can recall multiple cases where respondents provided data to ORI that they had originally held back from the institution (and even their own experts upon one notable occasion), perhaps thinking that nobody would really look at it carefully. My favorite was the original ‘film’ of a black and white western with the linearly arrayed, telltale yellow, magenta, and cyan spots deposited by a YGB printer (i.e., not your standard emulsion)!

      2. I think the average sociopath does not even realize what they are doing is wrong. They just ‘know’ what the answer should be, so they are (in their own mind) not making stuff up.

        In other cases you really wonder what a person was thinking. Here in Denmark, Milena Penkowa, who committed various acts of research misconduct, sent an e-mail in 2003, informing her boss that on May 14 her mom and sister had died in a car crash in Belgium. She also claimed to have spent quite some time in hospital and planning the funeral. Six years later, mom and sister miraculously had risen from the grave and attended Penkowa’s defence of her dissertation (the Danish version of “habilitation”, she already had a PhD). Who makes up stuff like the death of two family members?!

  5. A pet peeve: RW does a disservice to the intelligent discussion of misconduct by continuing to use the indiscriminate and meaningless term “fraudster”, an unfortunate and loaded ‘shorthand’ that immediately forecloses critical thinking.

  6. Reading this, you would think that Ms. McGoff is the singular thumb in the research-integrity dike, holding back the floodwaters of research fraud and misconduct that would, without her efforts, occur at UNCG. In fact, the focus of her position at UNCG is on compliance with ethical standards for research and essentially has nothing to do with research misconduct of the kind talked about in this article. It should also be noted that Ms. McGoff is not a licensed psychologist and does not have the training required to diagnose Narcissistic Personality Disorder (or any other disorder for that matter). Indeed, and somewhat ironically, it is generally considered to be a violation of professional ethical standards to speak far beyond one’s true level of expertise, which is exactly what Ms. McGoff is doing here.

    In this context, let me comment more specifically on one of Ms. McGoff’s statements: “While there are certainly researchers who make mistakes or may not be trained adequately within their fields, there are some that may be showing evidence of future behavior and possibly eventual research misconduct. I might use my degree and my experience with the criminal population as a way of predicting patterns.” Basically, Ms. McGoff is claiming that some faculty who have not engaged in research misconduct show signs (that she is able to detect based upon her experience with convicted criminals) that they are likely to engage in fraudulent research behaviors in the future. Wow! That is an extraordinary claim for an administrator in Ms. McGoff’s position to make. If I were a faculty member at UNCG I would be outraged. Does Ms. McGoff intend to intervene in particular cases based upon her self-proclaimed expertise at predicting the likelihood of future research misconduct actions? A truly chilling thought.

    1. CC, I don’t think Ms. McGoff is speaking as a professional, just as someone with a research integrity background who happens to have an MA in the area of Forensic Psychology. She seems rather to be speaking about the culture of the lab and what she might do if she were to apply her experience and education in Forensic Psychology to a misconduct investigation. I don’t read any assumptions or direct diagnoses here.

    2. I disagree with all that CC said as well.

      Is a scientist with NPD traits at risk for perpetrating scientific misconduct or fraud? In my experience most definitely yes!

  7. The term fraud almost always applies to the most severe forms of scientific misconduct: falsification, fabrication and plagiarism.

    On the other hand, the majority of scientific misconduct, for example conducting human research without ethical approval, reviewing your competitor’s grant/paper and delaying then killing it, lying about someone else’s research, personal demonization of competitors, failing to disclose conflicts of interest, is not fraud, and those that perpetrate this behavior should not be called fraudsters.

  8. This interview focuses more on the “narcissism” and “sociopathy” of the scientists (I won’t use the term ‘fraudster’). But I don’t completely agree. Scientists, especially in academia, are not paid much monetarily for their work. The “currency” they most often receive is in the form of respect AND inclusion from their peers, not to mention tenure. This creates an environment where one becomes desperate to produce ‘positive’ results.

    Many of these people are not sociopathic. They have a moral compass; they just choose to ignore it. A sociopath has no moral compass. They simply were not born with one.

    I have a friend who thinks it is like the doping scandal among athletes. They don’t get paid much (if anything) either unless they come back with the gold. Then they get the respect and prestige of being on a Wheaties cereal box. The environment they are in pushes them in some ways to cheat to win.

  9. I reviewed 146 ORI reports of those found guilty of research misconduct.
    They were divided into three groups of almost equal size: trainees, senior scientists, and support staff.
    The trainees feared failure to get an academic appointment; the senior scientists sought funds and fame; and the support staff were usually trying to reduce their workload or make an extra buck.
    Among the trainees were also those perfectionists who hadn’t gotten a grade below A+ since first grade.

    The trainees needed really good mentors or psychotherapy.
    The senior scientists, narcissists, psychopaths, etc., needed to fear detection, best achieved by really protecting whistleblowers from retaliation.
    The support staff would benefit from a better understanding of the impact their phony data could have on science and even patient care.

    All achievable goals. What would it take to make it all happen?

    Don Kornfeld

  10. Congratulations for this article. It is time to link misconduct to unhealthy narcissism. I take the opportunity of this article to mention my recent book “An Essay on Science and Narcissism: How do high-ego personalities drive research in life sciences?”, which discusses the good and bad points of narcissism (see more at http://brunolemaitre.ch/). Concerning misconduct, it should be noted that experts on narcissism consider that narcissists make riskier decisions (fraud, bad investments …), not necessarily because they fail to appreciate the risk associated with their decision, but rather because the lure of success is irresistible. Along these lines, narcissistic scientists seem to neglect the warning signals that indicate a weakness or a trivial interpretation of the dataset, because they are so much more attracted to the fanciness of the final story, the furore it will make among their colleagues and the media and, last but not least, their personal benefit in terms of promotion. This is in line with the idea that narcissists tend to distort reality to maintain positive illusions about themselves. More in the book.
    Best,
    Bruno

  11. This is an important and well-timed article that is deserving of widespread readership. The author’s forensic background and insight provide an important commentary on why certain personality traits, such as narcissism and, perhaps more importantly, the destructive traits of sociopathy, are worth considering when scientists are faced with colleagues or collaborators engaging in research misconduct and ethical compromise, behaviors that are often difficult to spot and catch and thus often overlooked. Scientists should be encouraged to remain ever vigilant when it comes to seeing patterns of behavior that reveal these two traits. Traits like these may initially lead such scientists to large and quick volumes of publications or slick avenues to funding, but ultimately they will bring about the sad collapse of their research, a dreadful tarnishing of their own name and, sadly, potentially the names of their colleagues and collaborators. Cristy McGoff offers a strong conversation that shows why people with these traits tend to engage more easily in research misconduct and ethical compromise, and why ‘the little foxes’ can ruin the vine.
