PLOS ONE issues editor’s note over controversial chronic fatigue syndrome research


After a request for the original data was denied, PLOS ONE editors have flagged a 2012 sub-analysis of a controversial clinical trial on chronic fatigue syndrome with an editor’s note.

The editor’s note — which reads like an Expression of Concern — reiterates the journal’s policy that authors make data and materials available upon request, and notes that staff are following up on “concerns” raised about the study.

There have been numerous requests for data from the “PACE” trial, as the clinical trial is known, which the authors say they have refused in order to protect patient confidentiality. On November 13, James Coyne, a psychologist at the University Medical Center, Groningen, submitted a request for the data from the PLOS ONE paper to King’s College London, where some of the authors were based. According to Coyne’s WordPress blog (he also has a blog hosted by PLOS), the journal asked him to let them know if he “had any difficulties obtaining the data.” He did — KCL denied the request last Friday (the whole letter is worth reading):

The university considers that there is a lack of value or serious purpose to your request. The university also considers that there is improper motive behind the request. The university considers that this request has caused and could further cause harassment and distress to staff.

Last author Peter White at Queen Mary University of London, UK, told us the journal had not asked them to release the data, but he would work with PLOS to address any questions:

We understand PLOS One are following up concerns expressed about the article, according to their internal processes. We will be happy to work with them to address any queries they might have regarding the research.

Here’s the editor’s note for “Adaptive Pacing, Cognitive Behaviour Therapy, Graded Exercise, and Specialist Medical Care for Chronic Fatigue Syndrome: A Cost-Effectiveness Analysis,” in full:

Several readers have raised concerns regarding the analyses reported in this article. We are also aware that there have been requests for the data from this study.

The article was published in 2012; the PLOS data policy that applies to the article is that for submissions prior to March 3, 2014, which is outlined here: http://journals.plos.org/…. The policy expects authors ‘to make freely available any materials and information described in their publication that may be reasonably requested by others for the purpose of academic, non-commercial research’. The policy also notes that access to the data should not compromise confidentiality in the context of human-subject research.

PLOS ONE takes seriously concerns raised about publications in the journal as well as concerns about compliance with the journal’s editorial policies. PLOS staff are following up on the different concerns raised about this article as per our internal processes. As part of our follow up we are seeking further expert advice on the analyses reported in the article, and we will evaluate how the request for the data from this study relates to the policy that applies to the publication. These evaluations will inform our next steps as we look to address the concerns that have been noted.

Competing interests declared: PLOS ONE Staff

We weren’t sure what the last line was referring to, so contacted Executive Editor Veronique Kiermer. She told us that staff sometimes include their byline under “competing interests,” so the authorship is immediately clear to readers who may be scanning a series of comments.

The trial followed 641 patients, and concluded that two forms of therapy that focus on mental aspects of the disorder were safe, and might be more effective than the therapy commonly favored by patients.

Patients and advocates have disputed these claims — arguing, among other things, that the findings may prompt some to believe chronic fatigue is a mental, not a physical, disorder. (No one knows what causes CFS, which can leave patients bedridden; some scientists think that there might be multiple causes.) Further, advocates argue, a program focused on cognitive behavioral therapy or graded exercise therapy — the two treatments supported by the PACE trial — could actually be harmful to patients by encouraging too much exercise. There have been calls from patients and academics alike for a re-analysis of the data.

The PLOS ONE paper compared the cost-effectiveness of various forms of therapy for CFS, and concluded that cognitive behavior therapy could be the most cost-effective approach to treatment. It has been cited 10 times, according to Thomson Scientific’s Web of Knowledge.

In September, we spoke to White about the choice to not release the data behind their findings (we were considering publishing an account of the trial by David Tuller; ultimately we could not agree on an approach). During that previous interview, White told us:

We have yet to release any raw data. It’s part of our patient’s consent form. We promised that we would keep it confidential. We think it’s really important for our trial — A, for if we do a long-term follow up, which I hope we will. B — When you make a promise to participants, and say ‘I’m going to protect your data, I’m not going to release it to any member of the public,’ we keep that promise. We made a promise and we have to keep it.

In his request to KCL, Coyne explained his plans for the data:

I am interested in reproducing your empirical results, as well as conducting some additional exploratory sensitivity analyses. Accordingly, and consistent with PLOS journals’ data sharing policies, I ask you to kindly provide me with a copy of the dataset in order to allow me to verify the substantive claims of your article through reanalysis.

To Coyne, the stakes are higher than this one trial. In a PLOS Blogs post titled “Why the scientific community needs the PACE trial data to be released,” Coyne explains that his reasons for requesting the data go beyond his skepticism of the PACE trial’s conclusions:

The crisis in the trustworthiness of science can be overcome only if scientific data are routinely available for reanalysis. Independent replication of socially significant findings is often unfeasible, and unnecessary if original data are fully available for inspection.

KCL’s response to Coyne’s request notes the controversy surrounding the trial, including allegations that the scientists have received death threats from activists:

There have been significant efforts to publicly discredit the trial following the release of the first article in the Lancet journal in 2011. Among other public campaigns, there is a Wikipedia page dedicated to criticisms of this project. The campaign has included deeply personal criticism to the researchers involved as well as significant comment around the decisions not to disclose data and information about the project.

We asked Kiermer what the journal will do if the authors refuse to release the data:

We are looking into the matter and we cannot speculate at this point.


34 thoughts on “PLOS ONE issues editor’s note over controversial chronic fatigue syndrome research”

  1. “The [PLoS] policy also notes that access to the data should not compromise confidentiality in the context of human-subject research.”

    PLoS is to be commended for leading on open access to data and for applying the new standard retrospectively.

    Institutions as well as past and current authors not wishing to comply with the NEW AND IMPROVED PLoS open data policy should not be allowed to play the patient “confidentiality” card in response to a reasonable data access request. It is a simple matter to redact patient data to remove personal identifiers.

    Here’s hoping PLoS sticks to its guns. If the [redacted] data is not produced after a reasonable request, then the paper must be retracted.

  2. Thank you, Retraction Watch, for covering this very important topic.

    A crucial point: Professor James Coyne wrote to the lead author of the PLOS One PACE paper and asked for the data under PLOS One’s data sharing policy: he didn’t write to King’s College London and he didn’t make his request under the Freedom of Information Act.

    https://jcoynester.wordpress.com/2015/12/13/why-i-dont-know-how-plos-will-respond-to-authors-refusal-to-release-data/

    Bizarrely, King’s College treated it as an FOI request, made him wait the maximum 20 days for a response, and then denied his request in the most insulting terms imaginable.

    The social media response makes good reading:

    https://twitter.com/bobbobme/timelines/676893810383802368

    and includes condemnation by the former editor of the BMJ of the withholding of the data.

    As Professor Chris Chambers of Cardiff University has tweeted, “If @KingsCollegeLon is seeking to do itself ‘reputational damage’, hiding trial data shd do the job.”

  3. The old “patient confidentiality” excuse… I guess KCL never heard of replacing names with strings of randomly-generated characters.

  4. Thank you so much for covering this important issue, but I think the article has omitted some key reasons why patients want a re-analysis of the PACE trial data in general.

    Previous reports of the PACE trial results are commonly viewed as inadequate due to: 1) the major, extensive, and questionable deviations from the published trial protocol (in some instances poorly or erroneously justified); 2) the misleading estimates of full recovery based on relaxed criteria presented as stringent (e.g. overlap between trial entry criteria for severe disabling fatigue and the ‘normal range’ or recovery for fatigue and physical function); 3) the multiple factual errors which undermine the reliability and interpretation of some of the results; 4) and the refusal to release pre-specified outcomes as outlined in the trial protocol published in BMC Neurology in March 2007 (which assumed there would be no such changes and strongly encouraged readers to check).

    Given the above problems, and the fact that the trial is repeatedly promoted to patients and clinical commissioners as definitive and highly robust, patients and independent researchers have every right to demand a re-analysis. This is especially important when the trial was non-blinded and the effectiveness of CBT and GET have been challenged by a total absence of meaningful improvements to the range of objective outcome measures used in PACE and similar trials. It appears that the interventions tested may only work for a small net minority of patients and are more palliative than rehabilitative as commonly promoted. We need to know more about the trial results using more stringent thresholds for improvement and recovery.

    With respect to Peter White’s assertion that they cannot release any raw data due to confidentiality agreements: the consent forms are somewhat ambiguous about who is specifically allowed to access the data. QMUL have also claimed in the past that these consent forms do not allow anyone outside the PACE group to access raw data. Yet their publication in PLOS One in 2012 and the sharing of data with the Cochrane Collaboration appears to challenge that. The key promise in the consent forms is that the identity of participants will be protected. Sufficiently anonymised or de-identified data is not regarded as personal or confidential under the FOIA or DPA. Multiple guidelines on confidentiality (e.g. ICO, NHS, GMC) are only concerned with personally identifiable information. On 27 October 2015, the UK’s Information Commissioner’s Office rejected all of QMUL’s arguments for why they cannot release a careful selection of de-identified individual-level trial data under the FOIA and ordered QMUL to disclose it (see FS50565190).

    1. Allowing the patients to be the arbiters of scientific work is not in their best interests. This topic is particularly contentious as patients wish certain results and are resistant to some other possibilities. People are suffering; dispassionate analysis is required, and patient activists are a hindrance, not a help.

      1. “Allowing the patients to be the arbiters of scientific work is not in their best interests.”

        Someone’s identity should not matter. The quality of their arguments and the weight of the evidence should be judged without bias.

        “This topic is particularly contentious as patients wish certain results and are resistant to some other possibilities.”

        I said ‘without bias’. This is certainly a condition surrounded by unpleasant prejudices, but it should not be too difficult for people to remember that they are no more able to read the mind of CFS patients than of any other group.

        “People are suffering, dispassionate analysis is required and patient activists are a hindrance and not a help.”

        Many academics would disagree with you, and now take the time to praise the work patients have done to highlight the serious problems with some of the research into their condition.

        Perhaps I am being unfair to you, but you give the impression of having not taken the time to look at the details of this matter.

  5. The PACE Trial authors changed the criteria after the trial finished so participants were “back to normal” in terms of physical functioning if SF-36 PF≥60. This is lower (worse) than the entry requirement, where at baseline participants needed to have severe disabling fatigue. This requirement is also part of the recovery criteria. In the published protocol, the recovery criteria was SF-36 PF≥85 but we have never been given any data using this threshold. I can’t see how it is in participants’ interest to say they are “back to normal” and “recovered” after treatment when most people would say SF-36 PF≥60 doesn’t represent anything close to normal functioning.

    The PACE Trial investigators have caused people to want to see some of the raw data because of actions like this. People have criticised such changes in published responses but the PACE Trial investigators are not willing to look at the issue another way such as publishing the recovery criteria promised in their protocol.

    1. The PACE Trial authors changed the criteria after the trial finished so participants were “back to normal” in terms of physical functioning if SF-36 PF≥60. This is lower (worse) than the entry requirement, where at baseline participants needed to have severe disabling fatigue. This requirement is also part of the recovery criteria. In the published protocol, the recovery criteria was SF-36 PF≥85 but we have never been given any data using this threshold. I can’t see how it is in participants’ interest to say they are “back to normal” and “recovered” after treatment when most people would say SF-36 PF≥60 doesn’t represent anything close to normal functioning.

      Indeed. I have CIDP, and I created an Excel spreadsheet to self-administer the SF-36 test monthly when I contracted the disease last year. From personal experience, it is possible to have a Physical Functioning score greater than 60 and still be substantially impaired, including not being able to walk.

      ===|==============/ Keith DeHavelle

  6. I’d be curious to know whether you think that what Professor White told you about his reasons for not releasing PACE’s data is consistent with the actual content of the patients’ consent forms, which can be seen online here (p. 113).

    http://www.meactionuk.org.uk/FULL-Protocol-SEARCHABLE-version.pdf

    You said that White told you, “We have yet to release any raw data. It’s part of our patient’s consent form. We promised that we would keep it confidential.”

    This is the relevant statement on the consent form that patients had to sign at trial entry:

    “14. 1 understand that information collected about me for the trial, including my personal details, a copy of this consent form and all of the questionnaires I complete for the trial, will be held securely by the local trial staff and at the PACE trial centre at Queen Mary, University of London. I give permission for this to happen.”

    I see no promise there to patients to keep their anonymised raw data confidential; I see an undertaking not to leave a list of patients’ names and addresses on the bus. But if that’s how the study authors interpreted that statement, they shouldn’t have submitted their paper to a journal that requires data-sharing as a condition of publication. They must now put up the data, or retract their paper.

    As an ME patient, I was offered a place on PACE and refused. But if I had taken part I would be horrified at the travesty of science that this trial has become. I wouldn’t have wanted to have risked my health in a clinical trial only for the study authors to publish bizarre and misleading analyses and then hide the data so that others couldn’t challenge them.

    More detail about these absurd analyses can be found in the background pages to this petition calling for their retraction, now signed by over 11,000 people:

    http://my.meaction.net/petitions/pace-trial-needs-review-now

    1. Professor Peter White is claiming that he promised to keep patients’ data securely; has he forgotten that some of this data was not held securely? The following extract comes from “Magical Medicine: How to make a disease disappear” (p256):

      http://www.meactionuk.org.uk/magical-medicine.pdf

      On 31st March 2006 Peter White wrote to the West Midlands Multi‐centre Research Ethics Committee to inform them of the theft of a digital audio recording (DAR) of GET sessions from Centre 03 (which is King’s College, ie. Trudie Chalder’s Centre). This confidential information was stolen from an unlocked drawer in the therapists’ office. Peter White informed West Midland MREC that: “There are no lockable cabinets in any of the therapists’ rooms so the drawer was not locked” (cf SSMC Participant Information Sheet). His letter continued:

      “The burglary was reported to Southwark police on the day that it happened, which was Wednesday 22nd March 2006. The crime number is 3010018‐06. The therapist was away on leave 22nd‐24th March and therefore the DAR was not found to be missing until Monday 27th March 2006”. It was only after the theft that Professor Trudie Chalder sought advice on how to secure the data properly.

      The letter also said: “The Principal Investigator for this centre, Professor Trudie Chalder, is awaiting advice from the Trust R&D as to whether the affected participants should be made aware of the theft”.

      The same letter stated that recordings were being downloaded to CD only on a monthly basis, a working methodology that is not compatible with the promises of confidentiality set out in the “Invitation to join the PACE trial” leaflet.

      The letter carries a handwritten annotation dated 13th April 2006: “Noted. Sad! No action needed”.

      It seems that the patients involved were not warned that confidential information about them had been stolen.

  7. This Retraction Watch article claims: “Patients and advocates have disputed these claims — arguing, among other things, that the findings may prompt some to believe chronic fatigue is a mental, not a physical, disorder.”

    In terms of access to data, the issues couldn’t be simpler: patients simply want transparent reporting of medical trial data. (Data that directly affects patients’ lives.) This includes reporting the PACE trial endpoints as set out in the trial’s protocol. (We currently only have access to the investigators’ post-hoc primary and recovery analyses.)

    The wider issues surrounding the PACE trial are nuanced and complex, incorporating questions about the appropriateness and safety of the interventions in clinical settings, and about how the PACE trial directly and indirectly affects health care policy. These issues should not be reported simplistically.

  8. In addition to my last comment, Goldsmith et al. of the FINE trial (sister to the PACE trial and both funded by the UK Medical Research Council) have recently published individual-level data with their paper in PLOS One. Furthermore, upon release of the dataset, PLOS One issued this statement with the paper: “The authors have prepared a dataset that fulfills requirements in terms of anonymity and confidentiality of trial participants, and which contains only those variables which are relevant to the present study.”

    http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0144623

    Here is an extract of a comment I recently left on James Coyne’s blog:

    https://jcoynester.wordpress.com/2015/12/15/plos-one-response-to-concerns-about-kings-college-refusal-to-share-pace-data/#comment-1351

    “By conducting a similar study to PACE and publishing in PLOS One with the revised data sharing policy, the FINE group have seriously undermined the assertions by QMUL/PACE that the above FOIA request is a violation of anonymity, confidentiality, MRC data sharing guidelines, etc. QMUL have attempted to avoid releasing data from the PACE trial by arguing that a minimal selection of ‘anonymised’ (de-identified) individual level data could still somehow be used to track down and harass trial participants. It is good to see that this far-fetched ‘concern’ has not prevented the FINE group from providing open public access to the relevant data up front. The risk of re-identification in both cases is remote, and to my knowledge the several PACE trial participants who have come forward on social media have received only appreciation for sharing their stories. QMUL are substantially exaggerating the risk of re-identification and perhaps generating the anxiety they claim they are trying to prevent. Misleading headlines prompted by poorly-worded press releases/conferences and published papers from the PACE trial have caused the patient community far more distress than will ever be caused by releasing the requested data. […] The obvious trend in the wider research community is towards increased transparency and open data in ways which protect patient confidentiality.”

    One more thing: the commonly promoted narrative of anti-science harassment against the PACE trial apparently conflates legitimate criticism (including that expressed in published letters to the editor and BMJ Rapid Responses) with malicious harassment. Irrespective of any occasional bad behaviour that is non-representative of the wider patient community, the narrative of harassment has been very effective at marginalising critics of the PACE trial as extremists, and I am glad that people such as James Coyne can see beyond it.

  9. Thank you for your continued and largely excellent coverage of this situation. However, there was a point that I felt deserved some clarification.

    “Patients and advocates have disputed these claims — arguing, among other things, that the findings may prompt some to believe chronic fatigue is a mental, not a physical, disorder. (No one knows what causes CFS, which can leave patients bedridden; some scientists think that there might be multiple causes.)”

    This is true enough, but doesn’t really get to the heart of why many patients and increasing numbers of scientists are concerned about PACE. David Tuller’s lengthy account (mentioned in the article), as well as recent blogposts by both Coyne and Keith Laws have uncovered important methodological concerns about the science in this trial, including the scrapping of objective measurements (which failed to show significant improvement) and overlapping entrance and recovery criteria, which meant that participants could deteriorate during the trial on some measurements and still be adjudged to have improved.

    Although for some who have undertaken these treatments PACE does significantly jar with their own experience of them, patients aren’t making a fuss here because it offends our view of our illness. There are important concerns about the trial that need to be resolved one way or the other, and this can be best accomplished by the data being released.

  10. It is a shame that White did not quote the section of the PACE consent form that he believes means that he is unable to share anonymised data. The closest thing I could find to this was section 14.1:

    “14.1 understand that information collected about me for the trial, including my personal details, a copy of this consent form and all of the questionnaires I complete for the trial, will be held securely by the local trial staff and at the PACE trial centre at Queen Mary, University of London. I give permission for this to happen.”

    If there were an explicit prohibition on sharing data, one would expect that to have been mentioned to the Information Commissioner’s Office, yet nothing was said of it, and the ICO ruled that the PACE data requested should be released (QMUL are appealing that judgement):

    https://www.whatdotheyknow.com/request/selected_data_on_pace_trial_part

    “Patients and advocates have disputed these claims — arguing, among other things, that the findings may prompt some to believe chronic fatigue is a mental, not a physical, disorder.”

    I understand that this is a complicated topic, and difficult to summarise, but I fear that this presents patients and advocates in the way White chooses to (which is not to say that White comes across well himself). The debate about whether CFS should be treated as a mental health condition often serves to distract people from more clear-cut problems with the PACE trial’s design and presentation of results.

    This was a non-blinded trial, where patients receiving CBT and GET were told that these interventions had been shown to be effective, and were given models of their illness which emphasised their own ability to improve their symptoms. Its primary outcomes were subjective self-report questionnaires.

    Results were not released in the manner laid out in the trial’s protocol. Some of the protocol’s primary outcomes still have not been released, and the trial’s ‘recovery’ criteria were watered down to the point that patients could report declines on the trial’s two primary outcomes and yet still be classed as recovered. (There are a range of ‘recovery’ claims emanating from the PACE trial, but the above applies to even their most conservative, which was described as “contradictory” in the AHRQ’s recent evidence review).

    Before the trial’s results were released patients were expressing concern about the lack of objective outcomes for this nonblinded trial, and the danger of results for questionnaires being biased; e.g., actometers were used at baseline, but had been dropped as an outcome measure. A meta-analysis released during the PACE trial of previously unreleased actometer data showed that CBT led to patients completing questionnaires somewhat more positively, but not being able to increase their activity levels. Of the three outcome measures classed as ‘objective’ by the PACE trial team, CBT was not associated with an improvement in any of them, and GET only led to a minor improvement (short of the trial’s criteria for clinical significance) in one: the 6 minute walking test.

    The PACE trial touches upon a number of important moral and political controversies, particularly in the UK, but the basic methodological problems with the trial must not be overlooked. Had a trial of ‘alternative medicine’ been conducted in this way, it would have been universally laughed at. For PACE we’ve instead had years of medical authorities pretending that the only problem is that critics are prejudiced against psychiatry or fail to understand the subtle and sophisticated ways that psychosocial factors can affect patients’ health. The smooth tongues and social connections of posh British Professors may provide the PACE trial with a facade of respectability, but that seems to be the only defense they have against the arguments [threatening harassment?] of their unwashed critics.

  11. If it were an issue of patient confidentiality, then why would they deny the FOI request on the grounds that it was “vexatious”? Why would Coyne be given one explanation, while White gives another explanation to Retraction Watch? The KCL letter says “The requested information relates to economic analysis undertaken by academic staff with considerable experience in this field. External sources were used as part of that analysis and the process took approximately one year to complete. We would expect any replication of data to be carried out by a trained Health Economist.” But why is this relevant at all if the reason is patient confidentiality? Basically all of the reasoning in the KCL letter is irrelevant then.

  12. Surely the right thing to do is to release the data to a totally independent group who have not expressed strong views in this area. The vicious personal attacks, which can take place when there are strongly held views by advocates and patients, militate against responsible science. I know this from MMR and autism.

    Such an independent group should set out a careful protocol for a re-analysis taking into account both the authors’ strategy and some of the criticisms.

    A problem is that it tends to be the case that the only people who have the time to do such a re-analysis unfunded are those with strongly held personal views who are prepared to devote (or divert) their time to getting answers that they wish to find. “Scare” stories are beloved by the media and there is a danger that academics can fall into the same trap as a form of publication bias.

    There is a need for funding authorities to allow for replication of analyses by independent groups but they like to have “new” research, rather than verifying what has been published.

    It must be a requirement for those who ask for data to publish their detailed statistical analysis plan (SAP), preferably peer-reviewed, prior to requesting the data. The journal concerned should be prepared to arrange for peer review of that SAP, and the original authors should be given the opportunity to publish their comments as well, prior to any re-analysis.

    We all know that it is possible to “torture the data until it confesses”. This is why, especially in the regulatory field, such SAPs are a vital component of research.

    We also have evidence that investigators change primary outcomes between protocol or trial registration and publication e.g. Lancet. 2008 Jul 19;372(9634):201.; JAMA. 2009;302(9):977-984. The human condition is that we like to get the results that favour our existing hypotheses and this is likely to apply to those wishing to do re-analysis as well.

    C of I: None to declare. I have never published or done research on CFS or similar topics.

    1. Stephen Evans states “It must be a requirement for those who ask for data to publish their detailed statistical analysis plan (SAP), preferably peer-reviewed, prior to requesting the data.”

      A starting point for patients wanting to look at the PACE data would be simply to submit the original protocol. One of the big issues is that the data reported does not match what was promised. Of particular interest would be to test the published protocol definition of recovery instead of the very much weakened one that was published.

      The next stage would be to address issues of objective vs subjective measures. The problem is that where treatments (CBT and GET) are designed to change perceptions of a disease it is very hard to measure success by asking about perceptions. Hence an analysis that looks at whether there are correlations between changes in things like the 6 minute walking test, step test data and job and benefit data and questionnaire reported results would be important. We do know the more objective measures show little if any improvement.

    2. Personally, I don’t see why anyone and everyone shouldn’t have access to the data and be able to reanalyse it. After all, PLOS policy calls for anyone with a good reason to be able to access the data, and that is what the authors agreed to when they published. While data can be tortured into confessing anything, I don’t think that is true of any believable confession. If a reanalysis looked contrived, people would see that; importantly, they would not have to take anyone’s word for it, because they could check any particular claim themselves just by looking at the available data directly. That is something people cannot do currently; instead, we must believe what we have been told.

      I am sure that all patients would welcome an independent analysis; patients and academics have been calling for one already, including in the open letter to the Lancet from a group of academics (linked in the main article above). I hope that happens, and that those from outside ME/CFS will study the data and comment on it. Patients want the science to speak for itself. It is worth pointing out, though, that Professor Coyne has no history with ME/CFS. Until a couple of months ago he had not been involved at all (so far as I am aware). I think it was the recently published follow-up paper that drew his attention, as it was hailed as a success for CBT and GET even though the between-group comparisons of the long-term follow-up data (which outweigh within-group comparisons) show a null result. Here is Coyne’s blog about it:
      http://blogs.plos.org/mindthebrain/2015/10/29/uninterpretable-fatal-flaws-in-pace-chronic-fatigue-syndrome-follow-up-study/

      The really obvious thing, though, would be simply to reanalyse the data according to the authors’ own original protocol. I don’t see what complaints the authors themselves could have about that. The current situation, in which all objective outcomes were dropped in favour of subjective ones for publication, and in which a participant could score worse on the published primary outcome measures than when they entered the study, clearly means that people do not have confidence in the published paper. That will only be resolved, one way or the other, by releasing the data so everyone can see what it tells us.

  13. I do not understand the statement White makes about perhaps doing a long-term follow-up.

    “We think it’s really important for our trial — A, for if we do a long-term follow up, which I hope we will.” – White

    They have just published a long-term follow-up paper reporting on patients’ progress around 2.5 years after the start of the trial. Data collection finished in April 2011, as this quote from their paper shows:

    “Between May 8, 2008, and April 26, 2011, 481 (75%) participants from the PACE trial returned questionnaires. Median time from randomisation to return of long-term follow-up assessment was 31 months (IQR 30–32; range 24–53).” – PACE long-term follow-up paper

    I believe they may have a five-year follow-up study in progress, but data collection for it should also be complete. Given that collection of the two-year follow-up data finished in April 2011, an additional three years would take data collection for a five-year follow-up to April 2014. As it is now late 2015, that data should have been collected.

    Perhaps the question is why it took them until October 2015 to publish the first (two-year) long-term follow-up, given that the data were available at the end of April 2011. Come to think of it, data for any five-year follow-up study should have been available for over a year.

    It is perhaps worth noting that the two-year follow-up study showed no significant differences between any arms of the trial, including the SMC treatment-as-usual group. Despite this, press coverage concentrated on reporting that scores for the favoured treatments (CBT and GET) did not deteriorate; the lack of a significant difference between the groups was due to the other groups improving. What was also clear is that patients had subsequently received other treatments, making the long-term follow-up data hard to interpret. That difficulty will only increase with longer follow-up times.

  14. One point in this article that I would like to clarify is that a lot of ME/CFS patients actually don’t give a fig about the results of the PACE trial, which many of us feel were completely unremarkable and only confirmed what we as patients have known for a long time (aside from the fact that five million pounds were wasted in obtaining them, money most patients feel would have been much better spent on biomedical research into the underlying pathophysiology of the disease).

    What we do care about, however, is how the PACE authors deviated significantly in their reporting of the results from virtually every pre-specified outcome measure described in the published trial protocol, in order to obtain even the ‘moderate’ results they have reported thus far. Compounding this, the PACE authors have engaged in an enormous public relations campaign, facilitated by the UK Science Media Centre, to promote their results, which has led to grossly incompetent, exaggerated, and sensationalized media coverage of the issue both in the UK and abroad. For a quick example, see how a null result (i.e. “There was little evidence of differences in outcomes between the randomised treatment groups at long-term follow-up”) (1) turned into grossly incompetent and sensationalized headlines following the most recent SMC PACE media briefing (2): ‘Chronic Fatigue Syndrome sufferers can overcome symptoms of ME with positive thinking and exercise’ and ‘ME can be beaten by taking more exercise and positive thinking, landmark study claims’ (3,4).

    Perhaps the most egregious example of the many changes the PACE authors made in their trial was the lowering of one of the thresholds for ‘recovery’ from a 20-25 point increase in the SF-36 PF sub-scale all the way down to a 5 point overlap between the trial’s entry criteria (which required that participants be suffering from ‘severe and disabling fatigue’) and ‘recovery’. A similar threshold substitution was made with the instrument for PACE’s other primary outcome measure, ie the Chalder Fatigue Scale. This is despite the fact that PACE’s Trial Steering Committee had previously recommended that a 5 point gap between entry criteria and ‘positive outcome’ (ie only about half the way to ‘recovery’) be increased due to it representing a merely ‘trivial improvement’. (5)

    Furthermore, when challenged about the changes made to the outcome measures, the authors have repeatedly denied having changed the outcomes, on the grounds that they are still using the SF-36 (and Chalder Fatigue Scale) to measure the results! (4) This would be analogous to a growth hormone trial with an entry criterion of 65 inches and a successful outcome of 85 inches, in which the researchers lowered the outcome criterion during the trial all the way down to 60 inches and claimed success. Then, when challenged that the ‘success’ was obtained only via altered outcome measures, the authors claim they have not altered the measure, since they are still using a tape measure or yardstick or whatever as their measuring instrument! This is a situation that many patients (and a growing number of academics and researchers) find absolutely outrageous.
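    The arithmetic behind this kind of threshold substitution is easy to sketch. The Python snippet below is purely illustrative: the cut-offs (trial entry requiring an SF-36 PF score of 65 or below, protocol-specified recovery at 85 or above, published recovery at 60 or above) follow the figures reported for PACE, but the participant scores are invented for demonstration and are not trial data.

```python
# Illustrative only: hypothetical end-of-trial SF-36 PF scores, not PACE data.
ENTRY_MAX = 65            # entry required a score of 65 or below
RECOVERY_PROTOCOL = 85    # recovery threshold in the original protocol
RECOVERY_PUBLISHED = 60   # relaxed threshold used in the published analysis

scores = [55, 60, 63, 70, 85, 90]  # invented example scores

# Classify the same data under each threshold.
recovered_protocol = [s for s in scores if s >= RECOVERY_PROTOCOL]
recovered_published = [s for s in scores if s >= RECOVERY_PUBLISHED]

print(recovered_protocol)   # 2 of 6 "recovered" under the protocol threshold
print(recovered_published)  # 5 of 6 "recovered" under the relaxed threshold

# The overlap: anyone scoring 60-65 simultaneously meets the entry criterion
# (ill enough to enrol) and the published recovery threshold.
overlap = [s for s in scores if RECOVERY_PUBLISHED <= s <= ENTRY_MAX]
print(overlap)
```

    Note that nothing about the measuring instrument changes in this sketch; only the cut-off moves, which is exactly the substitution the analogy above describes.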

    To top it all off, the PACE researchers have made a devoted practice of spreading out and burying the objective findings from the trial (nearly all of which were negative) deep down within other papers on ostensibly ‘positive’ findings, sometimes giving these vital outcomes all of a single sentence worth of space in an entire paper.

    Speaking for myself as an ME/CFS patient, what I basically want is for the PACE results to be re-analyzed strictly according to the published trial protocol and compared side by side with what the authors have published thus far. I think this alone would make crystal clear the problem that patients have with PACE and its authors. A logical next step would be to include objective outcome measures from the trial, such as six-minute walk test results, the self-paced step test, hours worked, etc., in the analysis of potentially ‘successfully treated’ patients as well. The damnedest part of the whole thing is how easily such calculations could be done, and how I and other patients are being smeared as part of ‘organized campaigns to discredit the trial’ for making such seemingly benign and common-sense requests.

    1. http://www.thelancet.com/pdfs/journals/lanpsy/PIIS2215-0366%2815%2900317-X.pdf

    2. http://www.sciencemediacentre.org/cfsme-pace-trial-follow-up-study/

    3. http://www.telegraph.co.uk/news/health/11959193/Chronic-Fatigue-Syndrome-sufferers-can-overcome-symptoms-of-ME-with-positive-thinking-and-exercise.html

    4. “The two primary outcomes for the trial were the SF36 physical function sub-scale and the Chalder fatigue questionnaire, as in the published trial protocol; so there was no change in the outcomes themselves.”
    http://www.virology.ws/2015/10/30/pace-trial-investigators-respond-to-david-tuller/

    5. “The outcome measures were discussed. It was noted that they may need to be an adjustment of the threshold needed for entry to ensure improvements were more than trivial. For instance a participant with a Chalder score of 4 would enter the trial and be judged improved with an outcome score of 3. The TSC suggested one solution would be that the entry criteria for the Chalder scale score should be 6 or above, so that a 50% reduction would be consistant with an outcome score of 3. A similar adjustment should be made for the SF-36 physical function sub-scale.” The PACE Trial First Meeting of the Trial Steering Committee, 22nd April, 2004.

  15. Professor Peter White is claiming that he promised to keep patients’ data securely; has he forgotten that some of this data was not held securely? The following extract comes from “Magical Medicine: How to make a disease disappear” (p256):

    http://www.meactionuk.org.uk/magical-medicine.pdf

    On 31st March 2006 Peter White wrote to the West Midlands Multi‐centre Research Ethics Committee to inform them of the theft of a digital audio recording (DAR) of GET sessions from Centre 03 (which is King’s College, ie. Trudie Chalder’s Centre). This confidential information was stolen from an unlocked drawer in the therapists’ office. Peter White informed West Midland MREC that: “There are no lockable cabinets in any of the therapists’ rooms so the drawer was not locked” (cf SSMC Participant Information Sheet). His letter continued:

    “The burglary was reported to Southwark police on the day that it happened, which was Wednesday 22nd March 2006. The crime number is 3010018‐06. The therapist was away on leave 22nd‐24th March and therefore the DAR was not found to be missing until Monday 27th March 2006”. It was only after the theft that Professor Trudie Chalder sought advice on how to secure the data properly.

    The letter also said: “The Principal Investigator for this centre, Professor Trudie Chalder, is awaiting advice from the Trust R&D as to whether the affected participants should be made aware of the theft”.

    The same letter stated that recordings were being downloaded to CD only on a monthly basis, a working methodology that is not compatible with the promises of confidentiality set out in the “Invitation to join the PACE trial” leaflet.

    The letter carries a handwritten annotation dated 13th April 2006: “Noted. Sad! No action needed”.

    It seems that the patients involved were not warned that confidential information about them had been stolen.

  16. Thanks to journalist David Tuller for all the work he did researching the PACE trial. He really dug into it and, judging from my dealings with him, spent a huge amount of time and energy doing so.

    I’d recommend people read his articles if they want to find more.
    The first one is here: http://www.virology.ws/2015/10/21/trial-by-error-i/

    More can be found: http://www.virology.ws/tag/mecfs/

    I believe James Coyne got interested after reading David’s pieces, and James’ involvement has now helped bring some of these issues forward.

    1. An example of the great work David Tuller did can be seen in part 3 of his initial piece, “TRIAL BY ERROR: The Troubling Case of the PACE Chronic Fatigue Syndrome Study (final installment)” http://www.virology.ws/2015/10/23/trial-by-error-iii/ (posted nearly two months ago on October 23).

      He exposed how the PACE Trial investigators’ claim that “sensitivity analyses revealed that the results were robust for alternative assumptions” in the cost-effectiveness paper was not true (this is the PLOS ONE paper that is the main focus of the Retraction Watch blogpost). This level of detailed investigating and reporting was, and is, very impressive.

  17. The PACE study is a weak one for all kinds of reasons (few trials are genuinely conclusive), and the university has made things much worse for itself by its response. However, I do feel tremendous sympathy for the PACE investigators themselves, none of whom I know personally and for whom I have no brief to speak. Once it was clear that the trial results were going to suggest that CFS is a complex multifactorial syndrome that may, for some people, include a psychological component, the events described in the university’s letter unfolded. I would never encourage my PhD students to do research in this area, and I see little interest in it among the top-flight young medical researchers at my university, because they will be damned if they do (find active psychological components in the syndrome) and damned if they don’t (find a pathophysiological basis for the disease).

    1. The PACE trial investigators have brought a lot of these problems on themselves by their spinning of results: claiming people were back to normal functioning with scores lower than the trial’s entry scores; drastically relaxing the recovery criteria; spinning the long-term follow-up study as evidence for CBT and GET when it was actually a null study; etc. I suggest you read more before taking their side.

    2. If you do poor-quality work that has a profound impact upon the lives of patients, some patients will be upset about that. It is the patients who are living with ill health, often unable to work and in poverty, who now also have to deal with misleading claims about their condition and the efficacy of the treatments available to them in journals, doctors’ offices and the mass media. The PACE trial researchers have been part of a campaign that attempted to discredit their patient critics by presenting them as motivated by a wish to remove any association between the illness and psychiatry, rather than by a pursuit of truth. They did not need to do this, and it has made it very difficult for them to now acknowledge that many of the patients’ criticisms were entirely valid and serve to fundamentally challenge the claims the researchers have made. My sympathy for researchers who have put themselves in this uncomfortable position is rather limited.

    3. I am a researcher in the ME/CFS field. Although my (and some of my colleagues’) views have not always jibed with those of patients, I have not faced the same criticism as the PACE researchers. I get criticism but it’s more about the science. Why? Because most patients feel that I and others like me are trying to work in their best interest. In my case, I have even involved patient focus groups in the early stages of some of my work. Some researchers even stay in this field, despite the funding challenges, because of the patients, who send them thank you cards, help them recruit subjects, and fund their work.

      What is required of researchers is that they keep an open mind and do not accept the mainstream views of ME/CFS without good evidence. The field really requires you to be on your toes and to watch out for your own prejudices.

  18. Dr Kenneth Witwer from the Johns Hopkins University School of Medicine, Baltimore, and author of “Data Submission and Quality in Microarray-Based MicroRNA Profiling” ( http://www.clinchem.org/content/59/2/392.long ), also reports problems with access to raw data from some papers in PLOS ONE.

    Copy/pasted from https://www.geneticliteracyproject.org/2015/12/16/seralini-feed-contamination-study-plos-fire-not-following-guidelines-data-access/

    “I interacted with the editor you mention and others at PLOS ONE after identifying data accessibility problems in microRNA datasets http://www.ncbi.nlm.nih.gov/pubmed/23358751

    Of a small number of PLOS ONE studies with missing data that I followed up on, two were retracted by the authors after they responded to journal requests for data and the data were faulty or didn’t support their claims. For the others, putative investigations continued, but there is no outcome almost three years later.

    In August, 2015, I contacted Puebla and some more senior people at PLOS ONE to learn about the outcome of these investigations and to request missing data from two more recent articles. After “we will get back to you soon” messages in September and October, I have heard nothing more. The editors have not responded to follow-up messages.
    To be fair, PLOS ONE is not the only journal with these problems; the incredible volume of studies they publish means editors might be swamped; and of course PLOS ONE is often a destination for manuscripts that won’t be published elsewhere. But PLOS ONE should be careful not to lose more credibility by not enforcing their guidelines.” [posted on 19 December 2015]

    The abstract of his nice paper has a worrisome conclusion: “These findings buttress the hypothesis that reluctance to share data is associated with low study quality and suggest that most miRNA array investigations are underpowered and/or potentially compromised by a lack of appropriate reporting and data submission.”

  19. If scientists and researchers are unwilling to subject their underlying data to peer review, then they should not be afforded the benefits of peer-reviewed publication. Peer-reviewed journals should consider that if they are unwilling to enforce their standards with regard to peer review, then their own reputation and value may be lost.
