Last month, the community was shaken when a major study on gay marriage in Science was retracted following questions on its funding, data, and methodology. The senior author, Donald Green, made it clear he was not privy to many details of the paper — which raised some questions for C. K. Gunsalus, director of the National Center for Professional and Research Ethics, and Drummond Rennie, a former deputy editor at JAMA. We are pleased to present their guest post, about how co-authors can carry out their responsibilities to each other and the community.
Just about everyone understands that even careful and meticulous people can be taken in by a smart, committed liar. What’s harder to understand is when a professional is fooled by lies that would have been prevented or caught by adhering to community norms and honoring one’s role and responsibilities in the scientific ecosystem.
Take the recent, sad controversy surrounding the now-retracted gay marriage study. We were struck by comments in the press by the co-author, Donald P. Green, on why he had not seen the primary data in his collaboration with first author Michael LaCour, nor known anything substantive about its funding. Green is the more senior scholar of the pair, the one with the established name whose participation helped provide credibility to the endeavor.
The New York Times quoted Green on May 25 as saying: “It’s a very delicate situation when a senior scientist makes a move to look at a junior scientist’s data set.”
This comment is perplexing in many respects, not least of which is the plain language of the Submission Requirements and Conditions of Acceptance of Science, the journal in which the study was published. Upon acceptance, Science requires each author personally to attest to its terms. The terms include, prominently, the following statement:
The senior author from each group is required to have examined the raw data their group has produced.
The reason for it is obvious. The reader is happy to give credit for good work, but neither the peer reviewers, nor the editors, nor the readers were there as witnesses, so it is up to the authors to certify what took place. Co-authors are the only ones who can do this. This is fundamental to the compact with the reader that all co-authors accept. Yet we know that Green agreed to be the second (of two) authors of a manuscript based on original data he had never seen or asked to see, and then signed an attestation that he had examined them.
If the co-author is not standing for the integrity of the paper, who is? Whom shall the reader trust?
WHO DID WHAT?
The rules set forth by Science go further than requiring examination of data. They also include the following:
All authors of accepted manuscripts are required to affirm and explain their contribution to the manuscript.…
These words, like most regulations, rest on the rubble of past fights: in this case, publications felled over decades. Soman/Felig was one of the earliest, and Duke/Nevins/Potti one of the latest. A necessary common factor in such debacles is a set of co-authors who put their names on work they know little or nothing about. We’re all clear on the benefits of authorship; some get a little hazier on the corollary responsibilities.
In 2001, Science and Nature were both badly burned, each having to retract half a dozen papers from a young researcher at Bell Laboratories, Hendrik Schoen. An investigating panel spent a great deal of time concluding that Schoen had committed scientific misconduct, but that his colleagues were guiltless.
In 1996, some clinical journals began introducing a system of disclosing authors’ contributions to the reader. One of us (DR) wrote to Science in 2002 to suggest that if Science had had such a system in place, the whole Schoen investigation would have been unnecessary, because the co-authors who were found innocent would have declared that they contributed nothing.
Change finally came to Science only later, after its editors in 2005 had found themselves having to retract several studies by the fraudulent South Korean stem cell researcher Hwang; an internal investigation of the journal called for a declaration of contributions in the way other journals had, by then, required for some time.
Given this history and the requirement, how are we to understand what, exactly, Green’s contributions to this work were? We cannot find a public version of the contributions statement in the journal Science, nor in the supplementary materials. Merely disclosing to the editor is insufficient. Of what value are these statements if the reader cannot understand the role of each author?
AND WHAT ABOUT THE FUNDING?
In this particular case, the funding was another matter that the senior researcher characterized as too delicate to raise. Green acknowledged that he hadn’t known that the funding referenced in the article didn’t exist:
Michael said he had hundreds of thousands of funding, and yes, in retrospect, I could have asked about that. But it’s a delicate matter to ask another scholar the exact method through which they’re paying for their work.
He has got to be kidding.
Both authors are political scientists and must surely have realized how carefully their results would be scrutinized. One can easily imagine the existence of funding that could bias the results in this area.
Indeed, the instructions in Science about funding and conflict of interest provide:
Authors must agree to disclose all affiliations, funding sources, and financial or management relationships that could be perceived as potential sources of bias, as defined by Science’s conflict of interest policy.
Exactly how does the senior author imagine he can fulfill this requirement if asking his co-author about funding is too delicate a matter?
Unfortunately, it now appears that little of what was printed about the funding sources can be substantiated. Too bad that discussing—and accurately documenting—that wasn’t a part of the collaboration process.
WE ARE NOT THAT BUSY
A commonly raised objection to honoring — as opposed to just signing — co-authorship requirements is that everyone is so busy, and interdisciplinary collaborations are so complicated in these modern times that it is not reasonable to ask coauthors to examine data they will not understand anyway, or to offend others by asking about funding. This overlooks — or willfully ignores — some critical points.
Attesting to Science’s requirements of authors may seem a tedious chore. But the requirements only outline the simple ethical rules the profession of science has adopted governing how we must behave towards those to whom we are responsible: our colleagues, our profession, our readers, and the public. None of this shambles would have happened if those involved had understood the reason such openness and honesty is so crucial, and had followed these rules. (Yes, the reward system and incentives often are weighted to actions that undercut the fundamental mission and purpose of our entire enterprise. Yes, systemic fixes are necessary. None of that lets us off as individuals.)
Of course, everyone is busy. But the busier you are, the more critical these habits and best practices become. In fact, when thoughtfully adopted, these practices will protect you, much like the rules of the road. However much of a rush you are in, if you drive on the wrong side of the road into oncoming traffic, you put yourself and others in jeopardy.
The complexity of collaboration and cross-disciplinary expertise is not new. Researchers have been raising this issue for decades. That is precisely why the contributions system was introduced, to provide accountability for members of interdisciplinary teams to specify their areas of expertise and who did what.
At a larger level, our question is this: if the coauthors cannot understand the data, how can the reader? Isn’t it the duty of those on the collaboration to understand it well enough to stand behind it?
WITH POWER COMES RESPONSIBILITY
We don’t wish to suggest that it is easy to fulfill the obligations of a co-author or mentor. In reality, given the complexities of interpersonal relationships and the trust science requires, it can be difficult and daunting. Still, it’s the job.
Nobel Laureate David Baltimore endured a decade of questions about his impulse to defend a collaborator—without ever reviewing any data—when presented with questions about work on which he was senior author. Yet even as late as 2002, he said:
In science, we assume that a colleague is trustworthy and only in extreme do we doubt it.
The reality is that a fundamental betrayal of the trust of collaborators is often at the root of a scientific fraud. The roster of saddened senior scientists who have been profoundly betrayed includes some truly great scientists: Efraim Racker, by a protégé he called “the son I never had”, and Francis Collins, who said of a fraud in his lab:
It was my darkest professional hour when I found out that a talented student who I had great hopes for systematically manipulated data. It changed me forever.
It’s a conundrum: in a community based on trust, how do you find a reasonable balance of trust and skepticism? Where are the lines?
The crux of the matter is that, as human beings, we are prone to self-deception and are poor judges of character and credibility. We all too often see what we want to see and hear what reinforces our existing beliefs. We are influenced (often without realizing it) by those around us. Remember the old saying, “if it sounds too good to be true, it probably is”? Someone who approaches you to join in publishing results that correspond to your own theories, and who provides ample funding to support the work, should signal that this is precisely the time for collaboration-hygiene habits and good practices. Using them will protect you, each and every participant in the collaboration, and the scientific process.
Of course, knowing what you should do, and knowing how to do it are sometimes two different things. If you are a senior researcher and wish to do the right thing, what should you do if you are approached because your participation (read: reputation) is a desirable addition? If you begin to have questions about data with which your name will be associated, how do you raise them without seeming to trample on your juniors?
Here are some possible approaches.
- Start As You Mean to Go On
Set up the collaboration with clear understandings and protocols about contributions, roles, and access to data. This is to clarify who does what. It can be done matter-of-factly, without any heat or malice. If you set up the ground rules and expectations from the beginning, asking questions will not be “rude” or intrusive; it will simply be a necessary part of doing good science.
It gets increasingly difficult to ask questions once a whole set of assumptions has been made (and confirmed implicitly by actions). Think about advice you’ve likely received about teaching: start out strictly enforcing rules and relax as you go, because it’s almost impossible to tighten them later.
There are some useful resources that can be consulted for developing a set of questions or personal policies for how to go about establishing collaborations. If you’ve come across something particularly useful—and practical—please do share it with us, as well as your experience implementing the ideas.
One way to do this is to establish, up front, a process to specify each person’s contributions – perhaps using guidelines in your field or from the journal to which you hope to submit. Then, make it clear that you’ll revisit this discussion at the end, after finishing a manuscript, to clarify who did what.
This conversation can be easier than you think. You can say something like:
So we don’t have any miscommunication, let’s go through what we plan to do and assign each of our efforts to one of the categories from the International Committee of Medical Journal Editors and discuss. We’ll circle back at the end and revise according to how things develop over time to reflect what we planned to do and what actually happened.
It’s my policy to ask all those with whom I collaborate to….[fill in the blank]
You may be pleasantly surprised how easy this can make an otherwise awkward process.
- Institute and Implement Good Practices—Build in Your Own Checks and Balances from “Go”
Build practices, appropriate to the discipline, for open sharing of data, and a system to raise and address questions. Examples:
- Timelines and expectations
- Where data are kept, and in what form
- Who takes the lead, after publication, in responding to queries about the data
- Lab practices for replication/repetition of work before manuscripts are submitted
- Who has access to raw data and/or notebooks, and under what conditions
- What documentation standards are expected for code, where programs are kept, and who has access
- What requirements are imposed on the research by funder or institutional policies in terms of the research plan, approvals, intellectual property, etc.
One extremely valuable practice comes from the management literature, in a classic article by Peter Drucker, “Managing Oneself.” Drucker suggests having what he calls a “contributions conversation,” which involves saying, as you begin each new working relationship:
This is what I am good at. This is how I work. These are my values. This is the contribution I plan to concentrate on and the results I should be expected to deliver.
And then continue by asking:
And what do I need to know about your strengths, how you perform, your values, and your proposed contribution?
To this, one of us (CKG) teaches the value of adding one more component:
This is how I would like you to approach me if you have a problem with something I’ve done. How would you like me to approach you if I have a concern with your work?
Some people like to talk things through; others prefer to read and mull before talking. Knowing the most constructive way to raise an issue is helpful in itself, and it signals that of course we’ll have difficulties, and that we’ll approach them like the grown-up professionals that we are. (Or aspire to be.)
Human beings have misunderstandings, frustrations, and friction. If you acknowledge that from the first, and develop between you a clear understanding about how you each will handle such a situation, that will help reduce awkwardness as well.
- Develop Personal Scripts
No one ever expects to be named in the New York Times as the co-author of a manipulated study. Not Donald Green and not Francis Collins. This makes it imperative to have on hand some words (your personal scripts) for interactions from which you might otherwise cringe.
Here are some possible examples:
Let’s look at our project as a reviewer might. For each conclusion in our paper, where is the support in our data?
If we are questioned, where are the primary data kept, and who knows how to interpret them, or our code/protocols/methods? How did the data become conclusions?
Our protocol of sharing processed data is efficient and saves time. To fulfill my obligations to you and the partnership, there may be times when I need to look at raw data to think this through. Where can I find them and the accompanying documentation, even at off-hours?
It’s our practice to keep a complete file of foundational documents. Please send the IRB /IACUC/funding/whatever documents for our documentation file as we begin our process.
The journal is requiring me to certify that I’ve seen the raw data. What process will we implement so that we can sign our affirmations honestly?
And what about if you have concerns about data? The first, most important, most effective rule is to start by asking questions, in a non-confrontational way. It’s all too easy to be confused or be misunderstood. Only escalate if the answers are not satisfactory or seem contradictory.
These data confuse me, as they seem not to include as many data points as we had discussed. Can you help me understand where I’m getting confused?
I thought we’d agreed all the data would be deposited by now. Is there a technical hangup or something else keeping this from happening?
- Designate Contributions for All Projects
Make it one of your personal professional policies to always clearly designate contributions of all participants for each and every manuscript.
Set out your criteria for authorship at the beginning, with examples, and then revisit at the end when the work is done. Set aside time to go through the criteria from the International Committee of Medical Journal Editors (ICMJE) or the journal to which you plan to submit, and assign the contribution of each participant into one or more categories. Again, if you have indicated (in writing) at the beginning of the collaboration that this is your policy or practice (matter-of-factly, just the price of admission), it will go more smoothly.
HOW COULD ALL THIS HAVE HELPED THE GAY MARRIAGE STUDY?
As a thought experiment, consider this: Imagine that Green had said his routine policy for collaborating was that he be provided with: a) a copy of the approved IRB protocol; b) a full research design including the survey instrument, procedures and analysis protocols, including information about the survey companies doing the survey; c) a copy of the funding agreement so he could assure he was fulfilling his obligations as a collaborator; d) periodic check-ins to discuss the progress of the project; and e) access to the raw data before manuscript submission. How do you think the events that have absorbed so much attention might have played out differently?
We do not know if Green did all of these things. If he did, we are perplexed about how the paper got as far as it did. But such policies provide more opportunities to discover the flaws in any study earlier; at best, they might deter problematic conduct altogether.
Bottom line: if you think it’s rude to ask to look at your co-authors’ data, you’re not doing science.
What is your good name worth to you?
Drummond Rennie is an Adjunct Professor of Medicine, the Phillip R. Lee Institute for Health Policy Studies, University of California San Francisco and former deputy editor, west, JAMA. C. K. Gunsalus is director of the National Center for Professional and Research Ethics (http://www.nationalethicscenter.org), research professor, Coordinated Science Laboratory, professor emerita, College of Business at the University of Illinois at Urbana-Champaign. She also runs a consulting practice.
Contributions: Gunsalus wrote an initial draft. Rennie discussed and rewrote every part of each successive draft and idea collaboratively with Gunsalus, and provided additional insights and references. Each approves and takes full responsibility for this final version. The authors received no funding for this article.