Should residents and fellows be encouraged to publish systematic reviews and meta-analyses?

Michelle Ghert

The ‘publish or perish’ culture is no longer reserved for academic faculty and post-doctoral fellows. The paradigm has spilled over into (or bled into) medical training, aided by the digital revolution. The widespread availability of online library catalogs and referencing software has enabled the mass production of systematic reviews and meta-analyses.

In short, medical research no longer requires original ideas, just access to the internet, which is perhaps why, as one 2018 editorial put it, there is “Replication, Duplication, and Waste in a Quarter Million Systematic Reviews and Meta-Analyses.”

With all of that in mind, the orthopaedic surgery residents at McMaster University in Hamilton, Ontario, Canada, gathered virtually for their annual research day to debate the status quo: that residents should be encouraged to publish systematic reviews and meta-analyses. At the start of the debate, following an opening Visiting Professor Presentation on trends in retractions by Retraction Watch co-founder Ivan Oransky, 58% of the residents opposed the status quo, while 42% supported it.

Aaron Gazendam

Those arguing for the status quo acknowledged that systematic reviews have both positives and negatives, but emphasized the positives. They provide an opportunity for trainees to learn about research and the academic publication process. Ethics approval is not required, so this kind of research is accessible to trainees at all levels, particularly those who do not have the time or resources to conduct original medical research. Occasionally, systematic reviews and meta-analyses have clinical impact or can lead to future impactful research.

But those arguing against the status quo pointed out that the explosion of published systematic reviews has saturated the academic literature with studies that add little to no value – an enormous waste of resources. There is no incentive to stop publishing systematic reviews; instead, tremendous incentives exist to publish them. Once the relative ease of these studies became common knowledge in medical training, medical students, residents and fellows found themselves caught in the tsunami.

And that tsunami only worsened once medical trainees started to mass-produce systematic reviews, their CVs bloated to several pages of listed publications. The next generation must keep up to be competitive for residency, fellowship, and faculty positions. Medical trainees are now under tremendous pressure to ‘publish or perish’ at even earlier stages of their careers.

The pressure to present a ‘buffed up’ CV inevitably results in ‘quantity over quality,’ opponents of the status quo argued. To combat this phenomenon, academic institutions could provide avenues that allow fledgling academics to focus on larger, meaningful research questions without the fear of low academic output in the short term. Institutions could also take a more holistic view of academic productivity, considering clinical and academic impact and the trainee’s role in the research process, in addition to the raw number of publications.

But those arguments were not entirely persuasive – perhaps in part because one of the members of the team supporting the status quo was a past debate champion. The final vote still went to the opponents of the status quo, but by a margin of just six percentage points – 53% to 47% – compared with the 16-point margin before the debate.

Perhaps someone will write a systematic review on the subject.

Michelle Ghert is professor in the Division of Orthopaedic Surgery within the Department of Surgery at McMaster University, where Aaron Gazendam is a resident in orthopaedic surgery.

12 thoughts on “Should residents and fellows be encouraged to publish systematic reviews and meta-analyses?”

  1. As a layperson who likes to check the scientific literature before forming an opinion on something, I find systematic reviews and meta-analyses *immensely* helpful. It is vital (especially in medicine!) for non-specialists to have a way to quickly learn the state of knowledge on a topic without doing their own literature review, and those types of papers are essential for this.

    However, years ago when I was first learning to read the literature, they were fairly uncommon, and when I found one it would almost always be very good. These days, a lot of them are terrible – they often don’t make any attempt to assess the quality of the studies they review, or to actually be systematic in which studies they include, and they don’t provide the information necessary to assess quality oneself. (In these situations, duplication is good, because it’s not unusual to find two reviews on the same treatment or condition, a few months apart, with almost no overlap in the studies they cover, and that adds valuable data – the studies you linked seemed to count those as ‘overlapping’ even though they are not.)

    Also, compared to other fields especially, a huge number of these review papers in the medical field seem to be appearing in obscure journals that can’t be easily accessed without a subscription to the specific specialist journal, which makes them less than useless to the people who most need them.

  2. Nowadays reviews are mostly written to inflate one’s own citation count. They are pretty much useless, and the good ones are drowned in an ocean of crap. Personally, I remove them from the publication list when I have candidates who want to pursue a career in my group.

  3. I would vote “no” for none of the reasons above.

    If experienced experts are not on the team for one of these reviews, then it will inevitably be of questionable quality. And there’s no way the experienced experts are going to be carefully assisting with these reviews if there are so many.

  4. And what if the problem is not that “anyone writes reviews,” but rather “anyone writes anything?”

    There may already be hard data on this for the medical sciences – whoever wants to post it is welcome to.

    Meanwhile, I ran a cursory Google Scholar search on something that I understand a bit more than biomed.

    /allintitle: “iot”/ – ~350 results (2011), ~22800 results (2021);

    /allintitle: “iot” “survey” OR “review”/ – 6 results (2011), ~720 results (2021);

    /allintitle: “encryption”/ – ~2000 results (2011), ~6000 results (2021);

    /allintitle: “encryption” “survey” OR “review”/ – 18 results (2011), 65 results (2021);

    So, the number of review articles seems to grow roughly proportionally to the overall number of articles, or slightly faster for emerging topics.

    No conclusions here, just for info.

  5. I am increasingly skeptical of the requirement for residents to publish. Most of what is published is worthless. It’s often based on convenience samples. I worked for a large Upper Midwest tertiary care hospital. One fellowship program had an aggressive publication process, and part of my duties was to support it with statistical analysis. The research worked off mostly two convenience samples – the huge pool of cardiac screenings and a smaller pool of normal procedures. The data was of mediocre quality, really. But the surgical staff was adamant that all fellows do 2-3 small studies per year. Mostly they were inconclusive and, in terms of knowledge, did nothing. The whole exercise was “make-work”. Here’s a fact – 80-85% of all physicians are not interested in research, are not good at it, and have no new ideas. Making them publish these tedious analyses is not helpful, nor does it advance the science of medical care. Posting under an alias here.

    1. George, might these exercises serve to sensitize residents to be more attentive to, and perhaps even more discerning of, the quality of evidence-based recommendations? Frankly, I think I would prefer to be under the care of a physician who has had hands-on experience with actual medical research rather than one who hasn’t.

  6. As always, the answer should be “it depends on the circumstances”. Though I agree with most arguments pro and contra, here is some more input:

    1) A systematic review (and meta-analysis) is an excellent starting point for a PhD thesis
    2) If someone’s publication list consists predominantly of reviews (or perhaps just one), this is an obvious sign of just ticking boxes or buffing up
    3) From a reader’s perspective, a review can present a nice overview of the current state of evidence; however, instead of receiving less scrutiny during peer review, these reviews should receive an even more extensive analysis of their quality exactly because of this overview functionality

    That being said, I think academic curiosity is beneficial for anyone who practices medicine. But that is also true of teaching (school or sports), entrepreneurship, and legal and administrative experience. It’s just that publications are easy to measure.

  7. There are a lot of bad systematic reviews, but there are also a lot of bad studies. I find it very odd that some researchers are so opposed to the conduct of systematic reviews in general, but seem disinclined to apply the same logic to primary empirical research.

  8. What many funding agencies are now doing more and more is asking for a list of, for instance, five top publications instead of a researcher’s entire oeuvre. I think that should become a more general practice, to disincentivize publishing large quantities of worthless papers.

    Also, oral debates in general are a pointless waste of time. Debates are won by whoever is best at Gish gallop; being right or wrong has barely any impact.

  9. Academic medicine is broken. All of it is done as space filler in CVs for getting into specialty training and securing non-academic clinical jobs. Nobody cares about quality – only quantity, produced with the least effort. This is why medicine is bloated with low-quality systematic reviews.

  10. Nowadays this is common even among undergraduates in the biomedical sciences. They may not be systematic reviews, but reviews written on a particular topic. We have genius students who are still in the 3rd or 4th year of an undergraduate degree course whose supervisors encourage them to write review articles – and they have been successful in publishing them. I wonder what perspective they can give on a topic they have not worked on. Is superficial knowledge of a topic enough to write a systematic review, or a review for that matter? I would not be surprised if a pre-teen or teenager (with AI assistance) publishes in a top-tier journal in the near future – maybe a review, meta-analysis or systematic review. The future is here.

  11. Whether systematic reviews and meta-analyses should be encouraged or not is a meaningless issue. Whether a crucial question is being asked and whether novel insights are obtained should be the concerns. At present, huge amounts of data are lying around without interpretation. If you ask the right question and reveal some insight from existing data, it’s very valuable work. But a factory of meta-analyses run for the sake of publishing might do more harm than good.
