A highly cited paper has received a major correction after a prominent, and polarizing, critic in the ongoing battle over attitudes towards gay people showed that its findings could not be replicated.
In December 2017, researchers led by Mark Hatzenbuehler of Columbia University corrected the paper, originally published in Social Science & Medicine in February 2014, which reported that gay people living in areas of high anti-gay prejudice had a significantly shorter life expectancy. The corrigendum came more than a year after a researcher who has testified against same-sex marriage was unable to replicate the original study.
“Structural stigma and all-cause mortality in sexual minority populations” has been cited 102 times, according to Clarivate Analytics’ Web of Science, and attracted coverage from outlets such as Reuters and U.S. News & World Report when it was published.
In November 2016, polarizing researcher Mark Regnerus at the University of Texas at Austin published the results of his (failed) attempt to replicate the study. After Regnerus’s study, Hatzenbuehler hired a colleague at Columbia, Katherine Keyes, to try to replicate the findings as well. She found a variable coding error. Hatzenbuehler requested a correction in September 2017 [see update below], despite the fact that the error nullified the main findings. The journal agreed to simply correct (not retract) the paper, and issued the notice Dec. 11, 2017.
We asked Social Science & Medicine why it agreed to a correction, given that the paper’s main finding was no longer valid. The journal’s Editors-in-Chief Ichiro Kawachi and S.V. Subramanian, both of Harvard University, told Retraction Watch the authors claim they can support their original findings with additional data, which they are preparing as a new submission:
If the findings do not stand up to peer review, we will proceed to retract the original paper. But in the meantime, we asked the authors to issue the Correction.
When we contacted Hatzenbuehler, we received a response from a spokesperson for Columbia, who confirmed the authors are preparing a re-analysis of the paper that will be submitted “in the coming months.”
Needless to say, the notice has a lot of backstory.
Regnerus has been accused of harboring anti-gay bias. What’s more, several researchers have said they’ve identified problems in Regnerus’s own research results. In 2012, 200 researchers wrote a letter to the editor of Social Science Research criticizing a paper Regnerus had published that year. In addition to questioning the review process that led to that paper, the critics wrote:
The methodologies used in this paper and the interpretation of the findings are inappropriate.
Despite the fact that Regnerus’s critique of the 2014 paper was ultimately proven correct, Nathaniel Frank, a lawyer and public policy scholar who runs the What We Know project, a catalog of research related to LGBT issues housed at Columbia Law School, told Retraction Watch:
…Mark Regnerus destroyed his scholarly credibility when, as revealed in federal court proceedings, he allowed his ideological beliefs to drive his conclusions and sought to obscure the truth about how he conducted his own work. There’s an enormous body of research showing the harms of minority stress, and Regnerus is simply not a trustworthy critic of this research.
In September 2017, a year after Regnerus published the results of his failed replication study, he mentioned the debate over the Hatzenbuehler paper in a brief to the U.S. Supreme Court, submitted in support of a Colorado baker who refused to make a wedding cake for a gay couple. The brief said:
Ironically, both the Hatzenbuehler study and the study documenting its inability to be replicated are published in the same academic journal—even though both cannot be correct. This reinforces impressions of disarray in this new and politicized field of research.
Regnerus denied that his replication study was an attempt to influence the battle over attitudes towards gay people. He told Retraction Watch:
There are parties I’ve worked with who think ahead down the line, but I do not. I’m a social scientist who loves to mess with data.
Regnerus said he didn’t come across the Hatzenbuehler study until the summer of 2015, more than a year after it first appeared in February 2014. He said replicating studies is “not usually something I do.” However, he also tried to replicate a different Hatzenbuehler study, published a month earlier in January 2014, that used the same dataset. Regnerus claims he found a coding mistake in that paper as well, which his research assistant communicated to Hatzenbuehler.
Regnerus told us the February 2014 study caught his eye because:
I just didn’t believe there could be a 12 year loss of lifespan because of the attitudes of one’s neighbors… It didn’t seem valid in my mind. If smoking knocks a decade off your life, how could anti-gay attitudes of people you don’t know be more harmful than something you would do to yourself?
Regnerus’s 2016 replication paper itself was picked up by the New York Post and by conservative publications, including the National Review.
Regnerus said that he never zeroed in on the particular coding error discovered by the second replication attempt:
It was always in my head that someone made a coding mistake…
Stanford University Professor John Ioannidis, who has studied reproducibility but was not involved in either replication attempt, and whom Regnerus cited in his Supreme Court brief, told us:
I suspect [coding errors are] a common problem, but it is rare for authors to go back and recheck what they did. A failed replication may be an incentive to go back and recheck the data and the analysis code, so this is one extra benefit from replication attempts.
Update, UTC 21:00, 2/1/2018: A spokesperson for the journal told us that on Sept. 28, 2017:
Hatzenbuehler contacted us to say he had hired a research group to replicate his 2014 findings and as a result of their findings wanted to publish a corrigendum.
It is incorrect to say that “the paper’s main finding was no longer valid.” It is true that a replication did not support the findings. That is not the same as showing lack of validity. Similarly, the writer here states “Despite the fact that Regnerus’s critique of the 2014 paper was ultimately proven correct”; it is important to avoid terms like “proven correct”. These terms are too strong. “Supported/not supported by additional studies”, “conclusions of subsequent studies differed” and so forth are the kinds of statements that should be made. Criticism of studies is important, but the language of criticism should be muted and careful, and should avoid terms like “no longer valid” or “proven correct”. These phrases are not appropriate to science.
Well said.
The article says “She found a variable coding error. Hatzenbuehler requested a correction … despite the fact that the error nullified the main findings”. This does seem to indicate that “the paper’s main finding was no longer valid.”
If a study is not replicated, then the original is much more likely to be invalid/wrong than it is to be valid/right. Every scientist experiences it – semantics won’t cure it. Get over it and move on.
Absolutely correct: “Semantics won’t cure it!”
“If a study is not replicated, then the original is much more likely to be invalid/wrong than it is to be valid/right.” I’m pretty sure this is not something you can know with any degree of certainty, at least under frequentist approaches. We know the number of failed replications we can expect if H0 is true, but there’s no way of knowing what number of failed replications to expect if indeed reality is not-H0. You would need to know both quantities to be able to state what you’ve just stated. The removal of data from the literature simply damages the efficiency of meta-analyses in finding whether H0 or not-H0 is true. Yet, in case your statement was actually based on Bayesian thinking, could you give me references showing that, in general, when a study is not replicated, the original is much more likely to be invalid/wrong?
To be clear: if there was an error in the analysis of the paper, as this article says there was (an error in coding), the study needs to be heavily corrected (in the case that showing H0 is probably true is interesting) or removed from the literature if its main conclusion doesn’t hold. My point is simply that removal of articles based on non-replication is damaging to meta-analyses and, as such, counter-productive.
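A minimal sketch, in Python, of the Bayes’-rule arithmetic behind the comment above. Every number here is an illustrative assumption rather than a quantity from either study: a 5% significance level, a 50/50 prior on H0, and a range of guesses at the replication’s power, with “failed replication” read as a non-significant result in the replication study.

    # Hypothetical illustration: how likely is the original finding to be a
    # false positive, given one non-significant ("failed") replication?
    def prob_original_wrong(prior_h0, alpha=0.05, power=0.8):
        """Posterior P(H0 | non-significant replication), by Bayes' rule."""
        p_fail_h0 = 1 - alpha   # non-significant replication if H0 is true
        p_fail_h1 = 1 - power   # non-significant replication if H1 is true
        num = p_fail_h0 * prior_h0
        return num / (num + p_fail_h1 * (1 - prior_h0))

    # With an assumed 50/50 prior, the answer swings with the assumed power:
    for power in (0.5, 0.8, 0.95):
        print(f"power={power}: P(original wrong | failed replication) = "
              f"{prob_original_wrong(0.5, power=power):.2f}")

The outputs range from about 0.66 to 0.95 as the assumed power rises, which is the commenter’s point: without knowing what to expect under not-H0, a failed replication alone does not pin down how likely the original is to be wrong.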
I could not disagree more with Nathaniel Frank. If Regnerus has come forward with valid criticism, then Hatzenbuehler is obliged to hear it and correct any errors. It does not matter one whit that Regnerus’s own study was fatally flawed, structured in a way as to guarantee a particular result, deliberately misleading (in that it called itself a “family structures study” but then went on to measure something other than family structure), ethically suspect, and riven with errors (which he has admitted). His study and proffered expert testimony were roundly rejected by the US District Court in Michigan, which held: “The Court finds Regnerus’s testimony entirely unbelievable and not worthy of serious consideration.”
But none of that has anything to do with the present dispute. If Bernie Madoff and Satan jointly presented a valid critique of Hatzenbuehler’s study, the author and the journal would be bound to consider their data, and make corrections if warranted. Mr. Frank seems to operate from the “social justice” mindset that is infecting the academy today. He cares more about the person making the argument, and the purity of his soul, rather than the data or the merits of the argument itself.
Indeed. Nailed it, Mike. Data is data. It does not care how we feel. Objectivity does seem to be getting lost of late; people take things much too personally and forget the merits of an argument.
I would argue that all science students should study scholastics, rhetoric and philosophy in order to better prepare them for the real world of science. The areas are not mutually exclusive.
What Nathaniel Frank said is that Regnerus is not a trustworthy critic of this body of research. This is correct. Regnerus finding that something is wrong with a study that critiques homophobia is not a reason to mistrust the study.
It may be, however, a reason to review the study. Mistrust [Regnerus] but verify [his target].
A stopped clock may be right twice a day, but if you know it’s a stopped clock, don’t set your watch by it.
It’s surprising to me that Regnerus is described with the pejorative “polarizing.” What does that have to do with the fact that the original Hatzenbuehler study was obviously not analyzed correctly or perhaps even falsified?
Whatever Regnerus’s politics, the data are what count. Serious questions about the original study were raised. This is apolitical and should be treated as such.
My assumption is that Hatzenbuehler has political views as well. Those politics would undoubtedly be viewed as “polarizing” by people with different views.
Please tell the story straight. This is Retraction Watch for god’s sake.
And… now the original article is retracted.
https://www.sciencedirect.com/science/article/pii/S0277953613003353
“The reason for the retraction is that the authors discovered an error in the study, which, once corrected, rendered the association between structural stigma and mortality risk no longer statistically significant in the sample of 914 sexual minorities. The authors published a Corrigendum (Corrigendum to “Structural stigma and all-cause mortality in sexual minority populations” [Soc. Sci. Med. 103 (2014) 33–41], Volume 200, March 2018, p 271), pending a re-analysis of the data. Re-analysis confirmed that the original finding was erroneous and the authors wish to fully retract their original study accordingly.”
I think there should be a high bar for attributing misconduct. But the fact that a “coding error” just happened to coincide with a politically congenial conclusion is extremely worrisome.