Bucking the advice of university investigators, a journal founded by Hans Eysenck has issued expressions of concern — not retractions — for three articles by the deceased psychologist whose work has been dogged by controversy since the 1980s.
The move comes barely a week after other journals opted to retract 13 papers by Eysenck, who died in 1997. Those retractions were prompted by the findings of a 2019 investigation by King’s College London, where Eysenck worked until 1983. That inquiry concluded that:
we consider the published results of studies that included the results of the analyses of data collected as part of the intervention or observational studies to be unsafe and that the editors of the journals should be informed of our decision.
The report lists 26 “unsafe” papers, including three in Personality and Individual Differences (PAID), an Elsevier journal that Eysenck launched in 1980 and which is the official journal of the International Society for the Study of Individual Differences.
While the editors acknowledge the evidence against Eysenck, and the actions of other journals that published his work, they opted to leave his papers intact.
‘Spineless’
Critics of that decision accused the journal of covering for one of its own.
And they have a point. Some people affiliated with the journal are co-authors or former students of his. The editor of PAID, Donald H. Saklofske, a collaborator with Eysenck on many publications, has described him as a “creative genius.”
David F. Marks, the editor of the Journal of Health Psychology — which published psychiatrist Anthony Pelosi’s critique of Eysenck’s work, alongside a detailed editorial by Marks requesting “a thorough investigation of the facts together with retraction or correction of 61 publications” — called the journal’s move “spineless”:
The PAID response is as one predicted. It cannot bring itself to publicly acknowledge that HJE, PAID’s founder editor, was a charlatan. I note that many of those who signed PAID’s spineless expression of concern are Eysenck’s co-authors. No conflict of interest there then.
According to the lengthy statement:
On 13 September 2019, Professor Sir Robert Lechler (Provost and Senior Vice President (Health) Kings College London) notified the chief editor of Personality and Individual Differences (PAID) that an internal review had been conducted and noted that the committee recommended retraction of three articles published in PAID:
Grossarth-Maticek, R., H. J. Eysenck, & H. Vetter (1988). Personality type, smoking habit and their interaction as predictors of cancer and coronary heart-disease. Personality and Individual Differences, 9, 479–495.
Grossarth-Maticek, R., & H. J. Eysenck (1995). Self-regulation and mortality from cancer, coronary heart-disease, and other causes – A Prospective-Study. Personality and Individual Differences, 19, 6, 781–795.
Grossarth-Maticek, R., H. J. Eysenck, A. Pfeifer, P. Schmidt, & G. Koppel (1997). The specific action of different personality risk factors on cancer of the breast, cervix, corpus uteri and other types of cancer: A prospective investigation. Personality and Individual Differences, 23, 949–960.
Evidently, the editors of PAID decided things were a bit more ambiguous than the King’s College investigators did. The notice continues:
The Editor-in-Chief and Senior Editorial Board of Personality and Individual Differences (‘the Editors’) carefully considered the request to retract, and the proposed status of the three articles published in PAID as ‘unsafe’. In doing so we were minded of the previous articles, claims/counter-claims, and the formal investigation into this matter by the British Psychological Society during the early 1990s. The latter investigation declined to proceed with the complaint against Professor Eysenck. We also considered the constituent principles and the lines of arguments that would support a decision to retract vs. a decision to ‘warn’, considering Elsevier’s PERK guidelines, with regard to issuing an expression of concern. i.e., editors should consider such an action if:
– they receive inconclusive evidence of research or publication misconduct by the authors;
– they believe that an investigation into alleged misconduct related to the publication either has not been, or would not be, fair and impartial or conclusive;
– an investigation is underway, but a judgment will not be available for a considerable time;
After deep consideration, we feel it is advisable to consider the three articles in question as ‘potentially unsafe’, given the data upon which the results are based are considered ‘extraordinary’ in the context of typical medical-effect observational/interview data. It would simply be unwise to rely upon the veracity of the reported results until either evidence appears of deliberate intention to deceive (which would result in immediate retraction), or the results are replicated by an independent group of researchers (we note that Professor Pelosi detailed the failure of some attempts at replication in his 2019 article).
As a consequence, the Editors have linked this expression of concern with the aforementioned publications in their online records.
The policy of the journal Personality and Individual Differences in such matters is that retraction is reserved for articles for which there is clear evidence of purposeful malpractice or data fabrication, or an admission of malpractice by an author/authors. The detailed investigative work by critics of this work have failed to uncover such evidence.
The Editors reserve the right to take further action on these and other relevant publications in the light of any new pertinent evidence that might emerge in the future.
A dig at a critic
In an unusual twist for an editorial notice of this kind, the text then pivots to a discussion of one of Eysenck’s most vocal critics, the Scottish psychiatrist Anthony Pelosi. Pelosi, whose own digging led to the King’s College investigation, told The Guardian that Eysenck’s papers contain:
what must be the most astonishing series of findings ever published in the peer-reviewed scientific literature, with effect sizes that have never otherwise been encountered in biomedical research
The Guardian’s article, published some nine months ago, also states that PAID rejected an invited paper by Pelosi about the Eysenck data — a claim the journal is now trying to rebut.
According to the notice, Pelosi’s article was a libel trap for the journal, which was merely acceding to the concerns of Elsevier’s lawyers (which is an odd claim, given that under UK law, the dead can’t be libeled):
The article in question was submitted for publication consideration in the Special Issue commemorating the 100th anniversary of Professor Eysenck’s birth. The issue editor worked closely with Professor Pelosi to ready his adversarial article for publication, retaining his criticisms of Professor Eysenck’s work but removing the personal attacks on the man himself. The proposed final version attracted the interest of Elsevier’s legal advisers. The tone of the article was adjudged to possess the likelihood of a legal claim potentially being brought against Elsevier, with the ability of Elsevier to defend such a claim in some doubt. Although requested by the special issue editor to moderate the tone of his article, Professor Pelosi declined; so publication was declined by the editor. This was not a decision made to protect the scientific reputation of Professor Eysenck, but one made solely in recognition of the legal opinion on the potentially libellous nature of some of the statements made by Professor Pelosi.
Pelosi told us:
I am disappointed that Personality and Individual Differences has not retracted these three articles despite the findings and recommendations of King’s College London. The journal has however done Expressions of Concern and hopefully these will lead on to retraction.
In my opinion these articles – especially Grossarth-Maticek, Eysenck & Vetter (1988) “Personality type, smoking habit and their interaction as predictors of cancer and coronary heart-disease” and Grossarth-Maticek & Eysenck (1995) “Self-regulation and mortality from cancer, coronary heart-disease, and other causes – A Prospective-Study” – are a disgrace. They bring psychology and medicine and science in general into disrepute.
There is another paper solely authored by Professor Eysenck in the same issue of Personality and Individual Differences as the 1988 paper. It is: “The respective importance of personality, cigarette smoking and interaction effects for the genesis of cancer and coronary heart disease” by H J Eysenck, Personality and Individual Differences Volume 9, Issue 2, 1988, Pages 453-464. It was and it remains an absolute disgrace. In my opinion, it should also be retracted. Given the approach of Personality and Individual Differences they should do an Expression of Concern about this paper.
In my opinion anybody who cannot see, simply by reading these published articles, that they are a disgrace should not be carrying out scientific research on human participants.
I do kind of agree with the statement about the rejection of my paper. However, it is a complicated story. My invited manuscript went back and forward for 6 months to the Guest Editor of the Special Issue commemorating Eysenck’s Centennial. The Guest Editor suggested some major changes to the manuscript but I could never under any circumstances have put my name to these. However, I made almost all of the changes that were suggested by Elsevier’s Legal Counsel. The paper ended up being reviewed by 9 referees. I made a lot of the changes that emerged from the referees’ reports. I did not however make changes that were suggested by a couple of the later referees. This is because these would have made the article even more critical of Professor Eysenck and Professor Grossarth-Maticek. This was after I had spent months toning down the article at the request of the Editors and Legal Counsel.
The paper was eventually formally accepted by Personality and Individual Differences. I corrected the proofs and I even paid 2,327 Euros to make the paper fully Open Access. However, it was then “spiked” at the last minute by the then Editor-in-Chief on the grounds that it was “potentially defamatory”. I did get my money back.
Marks said:
PAID’s attempt to discredit Tony Pelosi’s paper on the legal front is nothing short of pathetic.
There will be more yet to be published about HJE’s deliberate misrepresentation of scientific evidence in the near future.
Another publisher, SAGE, has already expressed concern about many papers in the core area of HJE’s work, i.e. his research on personality. More detailed exposure is bound to follow.
The real question is: why was H J Eysenck allowed to get away with the fraud for so long? Too many vested interests in the British psychological establishment one can only assume. The fact that the British Psychological Society has washed its hands of the Eysenck affair and continues to publish supportive articles is shameful. One cause of the so-called replication crisis is research fraud. As long as journals such as PAID turn a blind eye to obvious research fraud, nothing can be expected to change.
Eysenck is up to 14 retractions and 64 expressions of concern.
This is a highly tendentious account of this case. I think some more restraint is in order. The net effect of this enthusiasm will be to destroy your own credibility, if Eysenck’s work survives scrutiny.
An “expression of concern” or marking a paper as “unsafe” is surely a necessary first step. Further action can later be taken. Demanding immediate “retraction” will end the problem, but it will almost certainly not help us to understand the motivation of this clearly complicated researcher. Your enthusiasm smacks of moral righteousness, which is never a good thing in a scientist.
The youngest of those articles is over 20 years old. Pelosi (and others) have been investigating these issues for a long time. Retracting them now would hardly be “immediate”. An EOC is a good start, but some urgency is also warranted in removing the destructive influence of these articles from the literature.
Why is “help us to understand the motivation of this clearly complicated researcher,” even of interest, let alone a goal? Ideally, scientific papers should stand (or fall) on their own.
It is relevant because unless you simply retract everything Eysenck has ever written, you are looking for papers where he was dishonest. The smoking gun seems to be the connection with the tobacco industry. Does that mean all the papers connected with this issue are problematic? Perhaps, yes. Does this mean all the papers he has ever written are problematic? I am not sure.
The other common element in these specific papers is the involvement of Grossarth-Maticek as the source of the raw data. There is evidence of routine manipulation (if not fabrication) in the data-collection sheets:
https://www.tandfonline.com/doi/pdf/10.1207/s15327965pli0401_13
This paper on data integrity in Grossarth-Maticek’s survey, by collaborator-turned-skeptic Vetter, is astonishing.
https://www.tandfonline.com/doi/pdf/10.1207/s15327965pli0203_17
“Yes, one of the data-collection minions waited until subjects were dead before retrospectively filling in some of the mortality-predictor items, based on knowledge of their mortality.”
“Later I was informed that it had turned out that on the occasion of changing the code numbers of subjects, those alive in 1982 (and only those) had received, by accident, completely wrong code numbers for those variables, so that prediction after 1982 was impossible, whereas the predictive success for deaths up to 1982 was genuine.”
That second quote is verbatim rather than my exaggeration.
Stockholm syndrome at PAID?
The unreliability of the works of H J Eysenck was an open secret when I took a non-major psychology class ca. 1965. The prof mentioned, in a class of 300, that Eysenck’s small computational errors always seemed to favor Eysenck’s conclusions. I remember this lecture because open accusations of dishonesty did not happen in any of my other classes. The prof was likely safe concentrating on demonstrable computational errors.
FYI: The overall subject that class day was opposition to “scientific racism” against blacks. (I am not a lawyer.)
So retraction of Eysenck’s works may be both overdue and unnecessary.
With regard to Smut Clyde and others, vis-à-vis:
“This paper on data integrity in Grossarth-Maticek’s survey, by collaborator-turned-skeptic Vetter, is astonishing.
https://www.tandfonline.com/doi/pdf/10.1207/s15327965pli0203_17”
I might respond that the reply by HJE in the same 1991 journal issue is equally astonishing.
https://doi.org/10.1207/s15327965pli0203_21
Who =really= knows for sure who is being ‘economical with the truth’ here?
I think what’s required now is a kind of deep investigative approach coupled with computational modeling/simulation work far beyond what we’ve seen to date. I’m not sure it is even possible to do this anymore given we are talking about interview records etc. at least 40 years old.
But clearly, this issue is not just about the data but about the ethics and morality of HJE.
That’s another matter altogether on which I am unable to offer anything but my opinion – which is just one among many and of no consequence.
And right now we are awash with ‘opinions’ on all sides. One more adds nothing but stray noise to the complex issues to be resolved.
Regards .. Paul
How can any scientist respond to statements from David Marks such as his comment on:
http://retractionwatch.com/2020/02/12/journals-retract-three-papers-by-hans-eysenck-flag-18-some-60-years-old/
“Hans Eysenck’s reputation as the third most cited person after Freud and Marx is now in tatters. The question is: how did Eysenck get away with this huge body of unsafe work for more than half a century? Where are his defenders and acolytes? Why are they not springing out to defend him? Are there more surprises yet to come?”
And from David Marks:
http://retractionwatch.com/2020/02/26/journal-founded-by-hans-eysenck-issues-expressions-of-concern-for-his-papers-despite-calls-by-university-to-retract/
“The PAID response is as one predicted. It cannot bring itself to publicly acknowledge that HJE, PAID’s founder editor, was a charlatan.”
I’m not sure what anyone is meant to respond to – beyond agreeing (as many of us have done via our respective journals’ “Expressions of Concern”) that the Grossarth-Maticek data is ‘extraordinary’, and that the papers which analyzed that data are “of concern/unsafe”. I note Grossarth-Maticek himself has now responded with a spirited defence:
https://www.krebs-chancen.de/referenzen-und-gutachten/denunziation-englisch/
Right now, there is no evidence showing the Grossarth-Maticek data was manipulated or the analyses fudged.
If that should eventuate, then some very big reputational consequences will follow as a matter of course.
It is not ‘spineless’ to defer retraction until clear evidence of malpractice/data manipulation is brought forward; it is merely prudent. Noting relevant papers with “Expressions of Concern” and drawing attention to the IOP report is quite sufficient to ‘warn’ any reader the data are no longer considered ‘robust’ or ‘trustworthy’.
Very careful readers will note that the IOP committee also took this view, by choosing the word ‘unsafe’ – with no use at all of the word ‘retract’ anywhere in the report. It appears only on the first page of the press release, not in the actual report itself, upon which we at PAID acted.
Of course, Marks and Pelosi et al are perfectly entitled to voice whatever opinions they choose .. but it is unwise to make sweeping claims about a body of papers when at least one of them used data from previously published work by independent authors (nothing whatsoever to do with cancer, illness or Grossarth-Maticek).
I’m just finishing up a Technical Report which highlights just such a paper, providing a complete re-analysis with output (using the readily available input correlation matrix from the independent publication), using the same programs as were used 27 years ago (quite a feat to coax the old Fortran code into running on a Windows machine) as well as Statistica (latest version) for a double-check.
Nothing to see here, as I’d expected (although I was nervous given the “Eysenck’s work is all crooked” doom-and-gloom spiel from Marks and Pelosi!), since I was responsible for all the original analyses – while Hans wrote it all up. A right pain in the proverbial butt as I’m busy on real work these days – but this had to be done to protect my own reputation from the veiled insults being doled out liberally by David Marks et al.
I’ll be posting it on my website in a day or two as a pdf (including the two original papers and analyses outputs), and will announce its availability in this comments area.
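For readers unfamiliar with what this kind of verification involves, here is a minimal sketch of the general approach – re-running a matrix-based analysis directly from a published correlation matrix and comparing the output against the originally reported values. The correlation matrix and the choice of a principal components analysis below are placeholders for illustration only, not the actual data or programs used in the Technical Report:

```python
import numpy as np

# Placeholder correlation matrix; in a real verification this would be
# transcribed from the table published in the original paper.
R = np.array([
    [1.00, 0.45, 0.30],
    [0.45, 1.00, 0.25],
    [0.30, 0.25, 1.00],
])

# A principal components analysis of standardised variables reduces to an
# eigendecomposition of the correlation matrix.
eigenvalues, eigenvectors = np.linalg.eigh(R)
order = np.argsort(eigenvalues)[::-1]   # sort components largest-first
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]

# Unrotated component loadings = eigenvectors scaled by sqrt(eigenvalues).
loadings = eigenvectors * np.sqrt(eigenvalues)

print("Eigenvalues:", np.round(eigenvalues, 3))
print("Unrotated loadings:")
print(np.round(loadings, 3))

# The final step of a verification is simply to compare these values,
# within rounding error, against the loadings reported in the paper.
```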
And, I would note that all the EPQ data which formed the basis for many cross-cultural publications was made freely available to anyone interested in using it for any purpose in 2013:
Eysenck, S.B.G., & Barrett, P.T. (2013). Re-introduction to cross-cultural studies of the EPQ. Personality and Individual Differences, 54(4), 485-489 (includes the complete 40,000+ case, 35-country EPQ cross-cultural data-archive, scorekeys, analysis outputs etc., and data dictionary for free download in the Elsevier article supplementary data area):
http://dx.doi.org/10.1016/j.paid.2012.09.022
The same is available for the Junior EPQ data – although not yet published (within a few weeks I think).
As to the Biosignal Lab work .. the experiments, data acquisition, and biosignal and other analyses were all conducted by myself and two marvellous ladies who acted as EEG technicians/experiment administrators, using an array of Unix real-time computer-controlled tasks set up by me. I wrote all those papers because they were so technical – and the interpretations within them reflected my attention to detail and my inherent scientific caution about forming any definitive conclusions.
This whole Eysenck issue is clearly complicated.
I’m all for investigative reporting and forensic/detective work if that helps resolve certain controversial issues. But the rather emotive statements I’ve seen recently on Retraction Watch from David Marks are quite unnecessary.
As to being called “spineless” .. might I direct those who think this to take a look at a 20-second video segment on my website – in Barrett View #1:
https://www.pbarrett.net/bview1-video/BView_1.html
That’s me to a tee (without the colourful language) .. and I even look vaguely like him!
But seriously, I don’t do waffle or engage in tit-for-tat squabbles (which is why I avoid all social media like the plague). I respond best to reasoned argument and empirical evidence, not simply someone’s opinions, personal interpretations, or beliefs. Which is why we responded as we did in PAID.
I’m a measurement expert and evidence-base constructor/evaluator these days – not an applied numerologist or ersatz statistician. I think a read of my recent position paper on these matters might inform some that they are not dealing with some cowboy or card-carrying ‘status-quo’, famous-name preserver:
Barrett, P.T. (2018). The EFPA test-review model: When good intentions meet a methodological thought disorder. Behavioral Sciences (https://www.mdpi.com/2076-328X/8/1/5), 8(1), 5, 1-22. [open-access] ..
as well as my paper with James Grice and colleagues:
Grice, J., Barrett, P., Cota, L., Felix, C., Taylor, Z., Garner, S., Medellin, E., & Vest, A. (2017). Four bad habits of modern psychologists. Behavioral Sciences (http://www.mdpi.com/2076-328X/7/3/53), 7(3), 53, 1-21. [open-access]
And no, I don’t do snappy one-liners like so many others these days!
So, here is one of Hans Eysenck’s “colleagues” responding, but not in the somewhat “Outraged from Tunbridge Wells” manner some were probably hoping for!
Back later with the Technical Report link.
Here is the Technical Report link to which I referred in my previous message:
https://www.pbarrett.net/techpapers/Verification_of_analyses_in_Schizotypy_1993_paper.pdf
with supplementary materials/program outputs in a zipped archive:
https://www.pbarrett.net/techpapers/Kendler_Hewitt_Schizotypy_reanalysis_materials_PB_6-Mar-2020.zip
Regards .. Paul