Jason presents a study focused on mastery learning that examines the strategies early-career professionals use to acquire skills. The study seeks to answer: are different strategies associated with different learning trajectories? Jason praises the well-defined educational concepts, but how will he rate the paper as a whole? Listen here to find out.
KeyLIME Session 297
Mema et al., Using Learning Curves to Identify and Explain Growth Patterns of Learners in Bronchoscopy Simulation: A Mixed-Methods Study. Acad Med. 2020 Dec;95(12):1921-1928
Jason R. Frank (@drjfrank)
Currently, as we record this episode, there is a Netflix hit miniseries called the Queen’s Gambit about a young orphan who is discovered as a chess prodigy. Question: What does the main character, Beth Harmon, have in common with many #meded learners? Answer: Mastery learning. In fact, the highest achievers in any field display what psychologist Ellen Winner calls “a rage to master”: a hunger to focus on getting to the next level in an area of human endeavour.
In contemporary competency-based health professions designs, the goal is achievement of a pre-defined level of competence for all elements of a set of competencies. Time becomes a resource for progression towards competence, not the goal of education. The wonderful work of Bill McGaghie, Jeffrey Barsuk, Diane Wayne, & Saul Issenberg has created compelling evidence that such mastery learning approaches are superior for #meded and patient safety. Martin Pusic and others have done pioneering work describing medical trainee learning curves.
However, little is known about how learners actually learn in HPE. What are the strategies used to acquire skills in early career professionals? Are different strategies associated with different learning trajectories?
This unique study enrolled 20 pediatric subspecialty trainees and 7 experts (attending pediatricians) to characterize their patterns of learning while using a VR simulator of bronchoscopy.
To their credit, the authors nicely defined some of their educational concepts (a KeyLIME proskill):
- Learning curves = “rate at which trainees acquire a skill” and can graphically illustrate the relationship between effort and performance over time.
- Mastery learning = curriculum designed for demonstrated performance, not time spent on a task.
- Adaptive expertise = how to complete a task is complemented by the understanding of why an approach would work in a particular context.
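Learning curves of this kind are typically drawn as performance rising steeply with early practice and then tapering toward an asymptote. A minimal sketch of that shape in Python (the negative-exponential form and all parameter values here are illustrative assumptions, not taken from the study):

```python
import math

def learning_curve(trial, y0, y_max, rate):
    """Performance rises from a starting level y0 toward an asymptote
    y_max as practice trials accumulate; `rate` sets how quickly.
    All parameters are hypothetical, for illustration only."""
    return y_max - (y_max - y0) * math.exp(-rate * trial)

# Performance over 10 practice repetitions (arbitrary units).
curve = [round(learning_curve(t, y0=20, y_max=90, rate=0.4), 1)
         for t in range(10)]
print(curve)  # steep early gains, flattening toward 90
```

In these terms, a steep curve with an asymptote near expert level would describe a fast-growing learner, while a flatter curve plateauing early at a lower asymptote would describe slower growth.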
They also got points for using Messick’s validity framework in their study design.
The authors used the bronchoscopy simulator as the setting and task because it had valid measures of performance that were sensitive enough to detect learning changes over time.
The trainees completed instructor-led classroom teaching, then self-guided e-learning anatomy modules. Finally, all trainees practiced skills on the bronchoscopy trainer with supervision available nearby. Practice involved individualized feedback and independent practice. Learners could choose whether or not to involve the instructor during their practice.
Quantitative data were automatically provided by the simulator, including: accuracy (number of bronchi accessed), speed (time taken), and dexterity (number of collisions with the walls). A standardized composite score of these numbers was created and compared with the average score of expert physicians. This composite performance score was graphed against an index of learning effort (repetitions). Growth mixture modeling (GMM) was used to analyse the groups for patterns of learning curves.
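A standardized composite of this kind might be computed along the following lines; this is a minimal sketch assuming z-scores against the expert mean with an equal-weight average (the metric names, weights, and all numbers are hypothetical, not the authors' actual scoring):

```python
def composite_score(accuracy, time_s, collisions, expert_stats):
    """Standardize each simulator metric against the expert mean/SD and
    average the z-scores. Lower time and fewer collisions are better,
    so those z-scores are negated. The equal weighting is an assumption."""
    z_acc = (accuracy - expert_stats["acc_mean"]) / expert_stats["acc_sd"]
    z_spd = -(time_s - expert_stats["time_mean"]) / expert_stats["time_sd"]
    z_dex = -(collisions - expert_stats["coll_mean"]) / expert_stats["coll_sd"]
    return (z_acc + z_spd + z_dex) / 3.0

# Hypothetical expert reference values.
expert_stats = {"acc_mean": 18.0, "acc_sd": 2.0,
                "time_mean": 120.0, "time_sd": 15.0,
                "coll_mean": 3.0, "coll_sd": 1.0}

# A trainee matching the expert averages scores 0 by construction.
print(round(composite_score(18.0, 120.0, 3.0, expert_stats), 3))  # prints 0.0
```

Plotting such composite scores against repetitions yields the per-learner curves that the growth mixture modeling then groups into trajectory classes.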
Qualitative data were assembled from field notes from observations of trainee practice, including strategies used and contact with the instructor. Trainees were interviewed when they completed the sim training program; this included reviewing the trainee’s learning curves. The investigators also interviewed the instructors to characterize their experiences with the trainees’ learning behaviours and performances. Constant comparative analyses were used to create a stable set of themes. Adaptive expertise was used as a sensitizing concept.
Key Points on the Methods
20/29 trainees and 7/9 faculty provided complete data for the study.
GMM curve analysis identified 2 types of learning curves:
- Fast growth – rapidly improving scores that approached expert performance, and
- Slow growth – lower starting scores that plateaued early at a lower score.
Qualitative analysis explained 2 different patterns of learning and practice:
- “Learner on track” was a pattern characterized by behaviours reflective of adaptive expertise, including early struggle, curiosity, solution searching, instructor consultation, experimentation, and anatomy learning that led to deeper understanding and better performance; and
- “Mechanistic” learning pattern that involved repetitive practice without creativity and exploration.
3/8 learners displayed the “Learner on track” pattern of learning yet followed the slow growth trajectory. Most of the slow growth learners used the mechanistic practice pattern.
The authors conclude that this was one of the first studies to provide validity evidence for growth curves in medical education. They caution that learning analytics require more than numerical data to allow interpretation.
Access KeyLIME podcast archives here
The views and opinions expressed in this post and podcast episode are those of the host(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page