Two psychology researchers are retracting a meta-analysis after discovering errors they believe may affect the conclusions.
We’re giving this a “doing the right thing” nod, as last author Pankaj Patel of Villanova University in Pennsylvania contacted us about his plan to retract the paper, and resubmit for publication once he and co-author Sherry Thatcher — at the University of South Carolina — have performed all their recalculations.
Here’s the retraction notice for “Demographic faultlines: A meta-analysis of the literature,” published in the Journal of Applied Psychology:
At the request of the editor and in consultation with the American Psychological Association, the article is being retracted. This action is a result of a review by the editor and two additional experts that determined that there are significant errors in Tables 1, 2, and 3 which may affect the overall conclusions of the article.
Co-author Pankaj C. Patel led the analysis, and both authors acknowledge that inaccuracies were made.
The retraction of this article does not preclude resubmission of a new article that addresses the issues noted in the retraction.
The 2011 paper has been cited 57 times, according to Thomson Reuters Web of Science.
Patel told us they discovered problems with their calculations after concerns were raised by the journal’s editor, Gilad Chen (note: the explanation will make the most sense to readers familiar with statistics):
At this point, we retraced our steps since late 2008, when the work on the paper started. The query focused on high rho values in Tables 1 and 2 (for rows 1-8) and Table 3.
Patel goes on to explain why those high rho values were such a concern:
1. The rho values in rows 1-8 in Tables 1 and 2 were added based on a request from the editorial team in a late round of revisions and details about our assumptions were inadvertently left out of the manuscript (these details on calculations of the reported rho values are available from the authors upon request). Although these calculations were not used in subsequent calculations, they were used in some of our inferences discussed in the paper.
2. The second point of the query was the mismatch between the actual sample size and the number of studies reported in the random-effects correlations in Table 3. Upon re-examination, we found errors in manually calculating several of the 78 random-effects correlations in our Excel spreadsheet. These errors were a result of selecting an incorrect number of cells for inclusion in calculating the random-effects correlations, resulting in inaccurate random-effects correlations for several of the relationships in our matrix in Table 3. The random-effects correlations were inputs for estimates in Figure 1. These errors were honest mistakes. As our registration on Open Science Framework indicates, we are fully committed to addressing these issues in a public domain.
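Neither the notice nor Patel's account specifies which estimator the authors used, but a random-effects pooled correlation is commonly computed along the following lines. This is a minimal illustrative sketch (the function name and inputs are ours, not the authors' code) using the DerSimonian-Laird estimator on Fisher-z-transformed correlations — exactly the kind of multi-step calculation that is easy to get wrong when selecting cell ranges by hand in a spreadsheet:

```python
import math

def random_effects_correlation(rs, ns):
    """Pool study correlations with a DerSimonian-Laird random-effects model.

    Each correlation is Fisher z-transformed, weighted by the inverse of its
    total variance (within-study variance plus an estimate of between-study
    variance tau^2), and the pooled z is back-transformed to r.
    """
    zs = [math.atanh(r) for r in rs]          # Fisher z transform
    vs = [1.0 / (n - 3) for n in ns]          # within-study variance of z
    ws = [1.0 / v for v in vs]                # fixed-effect weights
    z_fixed = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    # Cochran's Q and the DerSimonian-Laird estimate of tau^2
    q = sum(w * (z - z_fixed) ** 2 for w, z in zip(ws, zs))
    c = sum(ws) - sum(w ** 2 for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)
    # Random-effects weights and pooled estimate
    ws_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    return math.tanh(z_re)                    # back-transform to r
```

Scripting the pooling step this way makes the error class Patel describes — including the wrong cells in a sum — structurally impossible, since the inputs are passed explicitly rather than selected by hand.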
Although Patel and Thatcher will have to recalculate all of their random-effects correlations, Patel does not believe the results will change dramatically:
Due to the large sample size we expect the significance levels to hold. However, we cannot be certain of this until we recalculate all 78 random-effects correlations.
He added that he and Thatcher are “both deeply distraught over these errors”:
Our goal is to resubmit the manuscript to the Journal of Applied Psychology within the next few months using the meta-analysis protocol registered on Open Science Framework (https://osf.io/zca4j/).
Patel told us some of what they’ve already accomplished:
At this time, on OSF we have uploaded the protocol, collated the papers, recruited two co-authors and are close to completing the first round of coding. The coding tools, with the list of additional studies published since Thatcher and Patel (2011), have been updated periodically and uploaded on OSF.
Patel added:
In conclusion, it was our responsibility to ensure accuracy. We apologize to the scientific community for our errors. We are appreciative that the editor, Dr. Gilad Chen, sees value in this study and is willing to consider a new submission. We believe that Open Science Framework registration of the study, uploading all the coding and results, and conducting the study in a public domain is the most appropriate way for moving forward.
Hello:
The public access link to the project on Open Science Framework is: https://osf.io/zca4j/?view_only=8b7d2d8aa2e74c9e88b491516fed9966. We would welcome any comments, suggestions, and guidance from the academic community through the OSF public access link to the meta-analysis.
Thank you.
Interesting that the author contacted Retraction Watch, almost as a confessor. Not that they need to confess, just retract; no shame in that. To err is human, to forgive apparently the domain of Retraction Watch. 🙂
Thank you authors for setting an excellent example.
Death by EXCEL! Never, ever use EXCEL for anything close to a statistical analysis. There are better and free tools out there for that.
Just google it.
The OSF link in the text above is “frozen”.
I wonder why there are coefficients with obviously irregular confidence intervals (for example r = -.17, CV ranges from .17 to .15; table 1).