(This is the ninth post of our #AppliedMedEdMethods101 series. View the others here: Beyond the RCT; Pre-Post Simulation; Discourse Analysis; Retrospective Cohort Studies; Critical Validity; Phenomenography; and Generalizability Theory)
By Ryan Brydges (@rbrydges)
Imagine you are a clinician educator who has been asked to search the literature on a topic that is new to you, and to produce a knowledge synthesis that best analyzes and integrates the included studies. Typically, the request includes the words “systematic review” somewhere.
Systematic reviews, like randomized controlled trials, have a special place in science. I’d argue that, at times, we value these research labels more than their products are worth. That opinion comes from my experience conducting a number of knowledge syntheses using a variety of methods. My main goal in this blog is to help clinicians and researchers in medical education differentiate between systematic reviews (which I conceptualize as running a systematic and rigorous search of the literature) and knowledge syntheses (of which there are many, many options).
Many people seem to believe that ‘systematic review’ pairs only with ‘meta-analysis’, a belief I clearly disagree with. I run a systematic review in most of my literature searches, but I approach the synthesis of the included studies very differently depending on my questions, my team, and our goals for the project. My hope is that readers of this blog learn to distinguish the logistics of running a systematic search from the subsequent decisions about how to analyze and integrate knowledge from the included studies.
On running a systematic review: I’ve learned that we researchers often think we can translate the quirky searching techniques we’ve developed over our careers into something systematic. No, no, no! Why do we presume we can do, on our own, a job that others train hard for? Our key allies in conducting an actually systematic search of the literature are information specialists and research librarians. These professionals know how to search every database, how to account for the subtle (or extreme) differences between databases, and how to help us rationalize which databases we need to probe. When I review knowledge syntheses submitted to medical education journals, I’m the reviewer who gives a hard time to authors who conducted the search themselves, without a librarian’s input. At my institution, librarians now request authorship for the searches they construct, an appropriate form of recognition for the specialized knowledge and expertise they bring to a knowledge synthesis team.
On deciding how to synthesize evidence from the studies you’ve included: I’ve learned just how much I didn’t know. Sure, a meta-analysis can be quite informative, but I certainly do not think it deserves the gold-star status our field tends to bestow. I’ve personally been dissatisfied with the outcome of some meta-analyses, where we could only acknowledge that effects had been observed, rather than specify potential mechanisms or explanations for those effects. In my quest for other ways to synthesize the evidence I’ve reviewed systematically, I came across an important article from Kastner et al. (2012). That article is the first in a series from that team, in which they conducted a scoping review (a type of knowledge synthesis) to better understand which knowledge synthesis techniques best suit different research questions. How meta is that!? I found their Table 1 incredibly helpful – “Characteristics of a preliminary list of existing knowledge synthesis methods”. That table opened my eyes to the possibilities of combining quantitative and qualitative evidence (my teams had excluded all qualitative evidence from our previous knowledge syntheses), and gave me plenty of reading on the historical traditions of various techniques I hadn’t heard of up to that point, including realist synthesis, scoping reviews, and critical interpretive synthesis.
I don’t mean to dismiss all studies labeled ‘systematic review and meta-analysis’. I believe some I’ve been involved in have helped our field think differently about a topic (see Cook et al., 2013). And yet, I learned much more about myself as a researcher, about how to think about a topic, and about the nature of evidence through a project involving a scoping review and realist synthesis (Brydges et al., 2017). Notably, I believe the meta-analysis has seen, and will continue to experience, greater uptake than the realist synthesis, so I cannot deny the power of certain labels. These are factors that must be weighed when deciding how to conduct your knowledge synthesis. Ultimately, systematic reviews and meta-analyses will likely get more attention, more citations, and more accolades. Other techniques, like realist synthesis, may get you more insights and better answers to your questions. That’s the rub: why and how you choose to conduct the work can have drastically different outcomes for what you learn versus what your CV looks like. Read more about how you might best make these decisions, and about the current state of knowledge synthesis, in a series curated for the Journal of Clinical Epidemiology (http://www.jclinepi.com/content/jce-Knowledge-Synthesis-Series).
1. Kastner M, Tricco AC, Soobiah C, Lillie E, Perrier L, Horsley T, Welch V, Cogo E, Antony J, Straus SE. What is the most appropriate knowledge synthesis method to conduct a review? Protocol for a scoping review. BMC Medical Research Methodology. 2012 Dec;12(1):114.
A knowledge synthesis conducted to understand more about knowledge synthesis techniques. While this is only the protocol, the article has a great Table 1 that I share as a resource.
2. Cook DA, Hamstra SJ, Brydges R, Zendejas B, Szostek JH, Wang AT, Erwin PJ, Hatala R. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Medical Teacher. 2013 Jan 1;35(1):e867-98.
A systematic review and meta-analysis in which we studied the effects of different instructional design features in simulation-based education, and where I believe we contributed to our understanding of mechanisms beyond simply answering “Is there an effect, and if so, how big is it?”
3. Brydges R, Stroud L, Wong BM, Holmboe ES, Imrie K, Hatala R. Core Competencies or a Competent Core? A Scoping Review and Realist Synthesis of Invasive Bedside Procedural Skills Training in Internal Medicine. Academic Medicine. 2017 Nov 1;92(11):1632-43.
My (and my team’s) initial foray into realist synthesis, where we learned a great deal about how to analyze and integrate knowledge across studies producing both quantitative and qualitative evidence. A great learning experience for us, and hopefully a useful read for you!