“Ethical ambiguity:” When scientific misconduct isn’t black and white

David Johnson
Elaine Howard Ecklund

Some types of misconduct are obvious – most researchers would agree that cooking data and plagiarizing someone else’s work are clear no-nos. But what about overhyping your findings? Or using grant funding on an unrelated project, if it keeps a promising young student afloat? In these so-called “gray” areas of research behavior, people aren’t so sure what to do. A few years ago, David R. Johnson at the University of Nevada, Reno and Elaine Howard Ecklund at Rice University interviewed hundreds of physicists; their conclusions appeared recently in Science and Engineering Ethics (and online in 2015).

Retraction Watch: Your paper discusses “ethical ambiguity” – what does that mean? Can you provide examples of such behavior?

David R. Johnson and Elaine Howard Ecklund: Ethical ambiguity refers to circumstances where the line separating legitimate and illegitimate behavior is gray rather than black or white. In short, the same behavior can be open to different ethical interpretations based on the stakeholders who are involved and the intended or actual outcomes of the behavior. Physicists talked to us about ethically gray scenarios that included accepting funding for military research, misuse of research funds, plagiarism, allocation of credit and authorship, cronyism, overhyping of research results, and exploitation of subordinates (graduate students and postdocs).

Consider a scenario in which a scientist receives a grant for one activity and then uses a portion of that money to support a graduate student on a project unrelated to the grant. Many scientists would regard this practice as a black and white instance of unethical conduct. But some scientists we interviewed view this as an ethically gray scenario, indicating that the use of funds for purposes other than those specified in a grant is justifiable if it means supporting the careers of their students or keeping their lab afloat. In these and other circumstances, scientists cope with ambiguity through decisions that emphasize being good over doing things the “right” way.

RW: You found 48 physicists who answered “yes” to this question: “Do you find yourself confronting any ethically gray areas in your own research, where you’re not sure what’s the responsible thing to do?” Did that surprise you?

DJ and EE: What surprised us most in the beginning stages of this research was how frequently some scientists’ conceptions of responsibility in science were limited to comments like “don’t make stuff up” and “be honest.” In many respects, this reflects the tendency of research on scientists, and of science policy, to treat misconduct as a black and white issue. Having interviewed hundreds of scientists in our broader research, we expected more nuance. But as we continued, a narrative about ethically ambiguous scenarios became more apparent, and at that point we added the interview question about ethical gray areas. The forty-eight instances we report reflect affirmative responses to the new question or incidents in which respondents brought up the topic unprompted (before the question was added).

RW: Your subsequent questions focused on how people behave when faced with these ethical ambiguities. What did you find?

DJ and EE: We found that scientists select what they believe to be an appropriate course of action and rationalize it using one of three themes: altruism, consequences, and preservation of the status quo. Take the example of a scientist misusing funds to send a student to a conference only distantly related to the original project: the underlying justification may be altruistic, in that it promotes the career of another scientist. Another practice is to allow what one believes to be unethical behavior to go uncontested because the outcomes are viewed as inconsequential. For example, we interviewed one scientist who described how he did not take any corrective action when a foreign student visiting his lab “stole” an idea, because he viewed it as an isolated incident that did not have consequences for his own graduate student (who had not come up with the idea, and so would not likely have played a major role in developing it). Other scientists emphasized that levels of competition are so high in science that it is hard to separate competitive behavior from unethical behavior. A number of scientists, for example, spoke of the regularity with which good ideas are taken from other scientists’ proposals as an ethical gray area. Many would not view this as an appropriate action, but some might justify it by saying that competition is the status quo in science.

RW: Were you surprised to find similar narratives from all 48 scientists who encountered “gray” areas, whether they were based in the US or the UK, and whether they held positions at elite or non-elite institutions?

DJ and EE: There is surprising similarity in scientists’ narratives at elite and non-elite universities and between scientists in the US and UK. One of the key differences between elite and non-elite universities is the level of resources for scientific research. A lot of scholarship suggests that one might find a higher incidence of misconduct at universities with limited research resources, because it can be more challenging there to succeed through traditional means. So you might expect, for example, to see specific types of ethically ambiguous issues in a physics department at a regional state university that you would not see in a department at a top-tier research university. One might also expect UK and US scientists to have different experiences due to differences in funding structures or culturally specific practices. We were indeed surprised at the similarities, and we take them as a sign of the pervasiveness of ethical ambiguity.

RW: You conclude by suggesting that ethical training also include help for how to deal with these “gray” areas of behavior. What might that training look like?

DJ and EE: The first step is to emphasize that the ethical questions scientists encounter will likely be unrelated to black and white issues like fabrication, falsification, and plagiarism. Scientists need ethical training, but courses, discussions, and seminars on ethics in science should spend more time addressing the issues scientists are most likely to encounter. For example, scientists face difficult scenarios related to questions like “what merits authorship?” or “what separates intensive demands of a graduate student from overwork and abuse?” The second step is to make clear that scientists will encounter ambiguous scenarios where the appropriate action is unclear. Third, training should emphasize ethical frameworks, such as virtue ethics, that acknowledge ambiguity. Virtue ethics recognizes that ethical decision-making requires consideration of circumstances, situational factors, and one’s motivations and reasons for choosing an action, not just the action itself. It poses the question, “what is the ethically good action a practically wise person would take in this circumstance?” When possible, consulting with senior and trusted colleagues to think through such circumstances is a valuable practice.


7 thoughts on ““Ethical ambiguity:” When scientific misconduct isn’t black and white”

  1. Perhaps this is a US versus UK thing, but these days in the life sciences you need so much evidence to support a grant application, and the turnaround is so slow, that frequently the work is already done by the time you get the money – so in reality you spend the money on the next phase of the research. On the other hand, UK grants are much more tightly controlled than US grants (in my experience): the grant budget is separated into salary, consumables, overheads, and so on, and moving money between those categories is difficult.

      1. There is a difficult balance between accountability in public funding and the uncertain reality of science. In my opinion, funding agencies that treat researchers as subcontractors who are contractually bound to do exactly what it says, and only what it says, on the proposal are living in utopia. It also doesn’t help that negotiating changes in funding can take several months – e.g., when you discover that a research line is actually not viable and you need to reallocate resources. Should you stop everything and sit around for half a year waiting for the updated DoW to be approved before you move on to the new task?

  2. What a cop-out by the scientists. The examples provided don’t look gray to me. A student stole an idea. Presumably this meant they got credit for intellectual work that was not theirs. This is potential plagiarism that was given the nod. A student got to a conference that wasn’t in the research budget? This is misuse of research funds. Unless all other students in the lab got the same opportunity, this is also favoritism (perhaps also latent cronyism, sexism, racism, etc., depending on who the winners and losers are).

    What a cop-out by Johnson and Ecklund, who recommend the generic panacea of “education.” “Education” to address unethical behavior excused as ignorance avoids the moral imperative to hold researchers accountable for their actions. Calling examples of clear ethical violations “gray areas” remedied by education alone is the equivalent of believing that my dog ate my homework. Even if my dog DID eat my homework, I would still be required to present my homework, which would require remedial action and effort on my part.

    Why no recommendation for reasonable consequences, in addition to education, when a researcher decides to err on the side of “I’ll do what furthers my interests and not think about the implications”? Such consequences send the message that acting in a way THAT POTENTIALLY HARMS OTHERS (there’s a clue for the clueless scientists), including other students, taxpayers, etc., is NOT acceptable.

    If these scientists ever played cards, a board game or a sport as a kid, they KNOW what it feels like when another player cheats. They would be justifiably angry and wouldn’t accept, “I didn’t know the rules of the game” as an excuse. Their disingenuous explanations as adults look like rationalization.

    Reasonable consequences would include formal documentation of the ethical “blind spot” and its remediation (education) in an employee’s HR file, repayment of unauthorized funds, and/or requirement to add/delete an author to accurately reflect authorship of a published paper. In other words, action is required to set the record straight, remediate an inadvertent injustice, and provide clear, specific examples of actions that are not acceptable.

    Experiencing consequences can be very educational, and might encourage scientists to err on the side of ethical action the next time they encounter a self-defined ethical “gray area”.

    1. I am inclined to agree with you. Saying that misappropriating money from a grant to send a student to a conference is “altruism” seems to stretch the definition of that word. What exactly is the respondent giving up of him/herself to assist the student? Stealing good ideas from others is okay because science is competitive? “That’s just the way it is.” You use grant money intended for one project on another project because the first project didn’t pan out – that’s gray? What if your second project doesn’t pan out either? Not to mention that the second project was never peer reviewed. Suppose you go on to publish your findings from the second project – do you indicate how you were funded?
      I also agree that education isn’t the answer. People who see the above examples as ethically gray will just roll their eyes while they play on their smartphones.

  3. Grants are not contracts. Student training is an important and allowed goal in itself, not one narrowly defined and limited to specific line items. If you choose to interpret grants that way, then pay overhead and full fringes on your trainee costs to remain ethical in your own eyes.
