Concerns attached to three more papers by retraction-laden management researcher

Fred Walumbwa

Fred Walumbwa, a management researcher with eight retractions, has received three expressions of concern from two journals after he failed to provide raw data following an investigation into potential errors.

In the past, Walumbwa has said he only keeps data until his papers are published, but a lack of raw data has become a common theme in his notices, which now also include four corrections and one other EOC (making a new total of four). There are no standard rules about how long to store raw data, but one journal that issued two of the new EOCs has since updated its submission policy to require that authors keep data for at least five years.

Walumbwa currently works at Florida International University. When concerns were raised about the statistics in five of his papers in Personnel Psychology, the journal conducted an investigation that led it to flag two of those articles, as the expression of concern explains:

Concern has been raised in relation to five articles published in Personnel Psychology between 2007 and 2011. The substance of the complaint is that the fit statistics reported in the articles contain numerous errors. The journal has conducted an investigation into the matter, including seeking the opinion of a third party.

In the case of three of the articles, any errors or omissions were either reconciled or determined not to significantly affect the articles’ findings or conclusions. In the case of the two articles listed above, published online on 22 August 2011 in Wiley Online Library, and in Volume 64, pp. 739–770 and 771–802, respectively, the journal concluded that there may be errors in some of the reported fit statistics, but it is difficult to understand the nature of the potential errors and their implications unless the raw data is made available.

Unfortunately, the journal was not provided with the raw data, despite repeated requests to the authors. In the case of the second article (Hannah et al., 2011), a governmental agency confirmed that they collected the data and controlled its storage, but no longer maintained the data after five years and thus could not make it available. Thus, consistent with guidelines published by the Committee on Publication Ethics (COPE), this Expression of Concern is issued by agreement of the Editor, Bradford S. Bell, and Wiley Periodicals, Inc. to make our readers aware of the potential issues with the results reported in these articles. Dr. Walumbwa, a co-author on the five respective articles, has requested that readers be informed that he was responsible for the analyses.

The journal has updated its article submission process such that authors will now have to explicitly agree to retain their raw data for a minimum of 5 years after publication of the research and to provide their data for verification when requested by the journal and editor, providing confidentiality of participants can be protected and legal rights concerning proprietary data do not preclude their release.

Those flagged articles are:

The third expression of concern is for “Innovation Strategy, Human Resource Policy, and Firms’ Revenue Growth: The Roles of Environmental Uncertainty and Innovation Performance,” published in Decision Sciences:

The substance of the concern is that the fit statistics reported in the article may contain many errors. The journal has investigated the matter with the help of a committee of methodology experts in statistical technique. The committee has come to the conclusion that there may indeed be something wrong with the statistics provided in the article. However, it is difficult to ascertain the implications of these errors unless the raw data is made available. Unfortunately, the authors were not able to provide the raw data despite repeated requests. We feel it is important for our readership to be aware of potential issues with the results reported in this article. We would like to reiterate that authors should be able to provide raw data when requested by journals and editors. Providing access to data when requested is good scientific practice, and is encouraged by this journal.

That paper has been cited seven times.

Walumbwa has also earned a recent correction for “Moderating role of follower characteristics with transformational leadership and follower work engagement,” published in Group & Organization Management and cited 48 times. The correction is long (it may even count as a “mega-correction”), so we’re not posting it in full. The conclusion notes that:

…the empirical findings reported in the original Group & Organization Management (GOM) publication are valid and accurate, thus the revisions above do not change any conclusions made in the article. First, the new test for common source variance using the four-item scale of follower characteristics shows again that the three-factor model fits the data better than the one-factor model, indicating that common source variance was not a serious problem in this study. Second, the current four-item scale of follower characteristics used in this study, exhibits the best psychometric properties based on the current dataset.

We have also unearthed a correction from 2014 to “Accounting for the influence of overall justice on job performance: Integrating self-determination and social exchange theories,” published in the Journal of Management Studies and cited once. The correction notes:

…the journal has been informed about potential problems regarding the empirical analysis. In the process of clarification during which the authors provided the editors with the raw data in question, it has become clear that there was an error in the calculation of the Average Variance Extracted for the different concepts.

Walumbwa’s problems often come from not having the raw data to back up his work when questions arise. As we reported in 2014, in an investigation by his former employer, Arizona State University, Walumbwa stated that his practice was to keep data until a paper was accepted for publication. That rapid disposal has caused problems for him — but how long should researchers hang onto data?

According to Guidelines for Responsible Data Management in Scientific Research developed for the Office of Research Integrity and the US Department of Health and Human Services, there is “no set amount of time for which data should be stored:”

In some cases, the time period is at the discretion of the PIs; however, many sponsor institutions require that data be retained for a minimum number of years after the last expenditure report. For instance, the USDHHS requires that project data be retained for at least 3 years after the funding period ends. Other sponsors or funders may require longer or shorter periods.

If problems do arise, long-term data storage is helpful: After concerns were raised about a paper by MIT longevity researcher Leonard Guarente, he needed to find files that were a decade old. He told us that “it was a tremendous effort to get to the figures, but we did so in almost every case.” The first author had saved the files, and now the paper remains in the scientific record with a correction.

We’ve reached out to the editor in chief of Personnel Psychology, and will update this post with anything else we learn.


7 thoughts on “Concerns attached to three more papers by retraction-laden management researcher”

  1. The majority of Dr. Walumbwa’s papers appear to now have either been retracted or had expressions of concern and corrigenda attached to them. At the same time, the state of Florida appears to continue to pay him close to $180,000 a year (if the website that discloses salaries for the Florida State University System is to be believed).

    1. If he were based in Georgia (where I am), I would urge extreme caution with regard to those website numbers. I was a department head for a few years, and I became familiar with the website when an upset faculty member asked why everyone else in the department got paid so much more than this individual. As department head, I knew the real salary numbers, and the reported numbers were hugely inflated, and my informant’s inflated salary was right where my informant thought it ought to be relative to colleagues. Of course, maybe Florida’s public records are more accurate.

  2. This article says Walumbwa has eight retractions in total, but I can only identify seven (five in Leadership Quarterly, and one apiece in the Journal of Organizational Behaviour and the Journal of Operations Management). So far as I can tell, Retraction Watch has only reported on seven retractions. So where is the remaining one to be found?

  3. Given everything that is wrong with statistical practice in the social sciences, I have to wonder exactly how much less valid these findings really are versus findings from papers that only misuse statistics in ways that don’t lead to EOCs or retractions. Put another way, suppose the papers in question did not include (what seem to be) obvious errors in the reporting of statistical results: how much higher would your confidence in the substantive conclusions really be, given the travesty that is conventional statistical practice in the social sciences?

    I suppose I’m getting old, but the selective reporting of results, the p-hacking, and the use of statistical results to “confirm” theoretical models which were themselves ginned up to conform to those same results, make the bizarre reporting in these papers seem like a red herring.

  4. Ed, I agree with your sentiment but just wish that more senior members of the field with your stature made these points.
