
Simulation Research You Should Know About! Part Dos

By Jonathan Sherbino (@sherbino)

On Tuesday, I discussed the Top Simulation Papers from ICRE. I hope that you also took a peek at the Simulation Summit. Why only read about simulation research when you can go hear about it live?!

So… Part Dos today.

#3. Watson K, Wright A, Morris N, McMeeken J, Rivett D, Blackstock F, Jones A, Haines T, O’Connor V, Watson G, Peterson R, Jull G. Can simulation replace part of clinical time? Two parallel randomised controlled trials. Medical Education. 2012; 46(7): 657-67.

This was a multi-centred non-inferiority RCT involving physiotherapy students. (I would direct you to the excellent methods section. It is encouraging to see medical education research employing high-calibre methodologies.) The study examined whether simulated musculoskeletal (MSK) patient encounters could replace real MSK patient encounters.

The authors demonstrated that replacing 25% of a 4-week rotation with simulated patient encounters made no difference to performance on the end-of-rotation assessment, measured with a validated score.

My $0.02: Hats off to the authors for their rigorous methodology. But I question their starting premise. Now, I’m not a physiotherapist (although my spouse is). Neither am I an Australian. But I’m surprised by the need to adopt a resource-intensive approach to learning (developing realistic standardized patient scenarios is hard!) for a condition as common as MSK problems. Simulation is best suited for uncommon, risky or complex learning tasks. I would hope that we never arrive at a place in medical education where the nuance and experience of real patient encounters are all but replaced with simulated encounters.

#4. Boet S, Bould MD, Sharma B, Reeves S, Naik V, Triby E, Grantcharov T. Within-team debriefing vs. instructor-led debriefing for simulation-based education: a randomized controlled trial. Annals of Surgery. 2013; 258(1): 53-8.

This RCT examined whether within-team debriefing (i.e. peer-to-peer debriefing) was as effective as the ‘gold-standard’ instructor-led debriefing in improving subsequent non-technical skills in an operating room. Anesthesia trainees, surgical trainees and nurses participated. Teams in the within-team arm were allowed to review video of their performance. Blinded reviewers scored post-debrief performance on a subsequent scenario. All teams improved from pre- to post-debrief, but there was no difference between within-team and instructor-led debriefing.

My $0.02: Again, a great study design, although there are some critiques about the sample size (again, a risk of a Type II error, where a ‘no difference’ finding may simply reflect an underpowered study) and about the ability of the scale to differentiate performance of non-technical competencies. However, this study again suggests that expensive simulation resources (e.g. instructor-led debriefing) may not provide a good return on investment.

So, the studies above suggest an important general theme in education that extends beyond simulation. Instructional methods must be “fit for purpose.” Learning objectives determine the options for instructional methods; feasibility narrows the list further. So, what do you think?

Images courtesy of the Royal College
