
Breaking down the silos in simulation-based education: Exploring, refining, and standardizing

Published online by Cambridge University Press: 23 November 2020

Andrew K. Hall*
Affiliation:
Department of Emergency Medicine, Queen's University, Kingston, ON
Erin E. Brennan
Affiliation:
Department of Emergency Medicine, Queen's University, Kingston, ON
Rob Woods
Affiliation:
Emergency Medicine, University of Saskatchewan, Saskatoon, SK
Correspondence to: Dr. Andrew K. Hall, Queen's University, Department of Emergency Medicine, Kingston Health Sciences Centre, 76 Stuart Street, Kingston, ON K7L 2V7; Email: [email protected]; Twitter: @AKHallMD.

Commentary

Copyright © Canadian Association of Emergency Physicians 2020

Simulation-based education in Canadian emergency medicine (EM) has been expanding for many years. This month's CJEM includes three papers that help readers understand both this collaborative expansion and some of the ongoing challenges. While local, siloed innovations may have driven much of the initial development of simulation-based education, current innovations are bringing individuals and sites together around common goals and processes. Despite this collective forward movement, the increased expectations placed on simulation have brought clear challenges, and there is still much work to be done.

In this issue of CJEM, Caners et al.1 demonstrate how the development of the novel EMSimCases Free Open Access Medical Education site, with its simulation case database and other resources for simulation educators, has fostered a culture of sharing and collaboration across the country. Simultaneously, Baylis et al.2 articulate an innovative response to the problem of site- or person-specific case development and structure by harnessing a national group of educators and stakeholders to create a standardized national simulation case template. Via diffusion from site leads to local simulation educators across training sites in Canada, this template is already being used broadly. The combination of a standardized case template and a reputable, peer-reviewed platform for sharing brings all simulation educators together in the creation, revision, and implementation of simulation-based education. This has given training programs with limited resources or experience the opportunity to engage in simulation-based training without re-inventing the wheel. The outcome will be an educational experience across Canada that is more standardized and consistent between sites.

Increasing use and availability of simulation has come with increased expectations of what we can do with it. With our recent transition to competency-based medical education (CBME)3 and the clear demand for accountability and documentation of competence, there is motivation for some of our assessments to be performed in the simulated environment. To supplement workplace-based assessment in the heterogeneous and clinically unpredictable emergency department, simulation-based assessment has been encouraged as a mechanism of standardized assessment across programs, for both summative and formative purposes, without any resultant harm or alteration to real-patient care.4 It makes intuitive sense to many that trainee performance in the simulated environment would be a surrogate for real-world performance. In this issue of CJEM, however, Prudhomme et al.5 highlight the complexities of the relationship between simulation-based assessment and real-world performance. They found, in fact, minimal correlation between trainee assessments of the same Entrustable Professional Activities (EPAs) in the simulated environment and in the real world. So, while there is evidence supporting the correlation between performance in simulated and real-world environments, particularly in procedural domains,6 the relationship is clearly more complicated for broad, multi-dimensional skills like resuscitation.

So, if simulation is being used across the country with increasing frequency, consistency, and more complex expectations, how can we respond to the challenges that may arise? Following in the collaborative spirit of Caners et al.1 and Baylis et al.,2 the Emergency Medicine National Simulation Education Researchers Collaborative (EM-SERC)7 and others engaged in research and evaluation studies can undertake multi-site simulation studies to start asking these important questions. In addition to encouraging the use of simulation for assessment, the transition to CBME has brought a substantial increase in assessment data. Thoma et al.8 have recently encouraged us to use these data for evaluation, with a national study aggregating and analysing EPA assessments across Canadian training programs to understand EPA assessment patterns and progression decisions. Moving forward, as more assessment data emerge from the simulated environment, we can compare qualitative and quantitative assessment data from both environments to understand their relationship and respective contributions.

In summary, we need to continue to move forward with our use of simulation, share cases and experiences using sites like EMSimCases,1 and use a national case template,2 while remaining aware that the challenges accompanying the increased use of simulation are still to be addressed. Overall, the understanding that we are all moving forward as a group, with shared goals and priorities,7 rather than as siloed institutions, is certainly reassuring. Being able to work and speak as one gives us global credibility and the potential to answer tough questions through national collaboration and research. Ultimately, we hope that EM education will be enhanced by ongoing collaborative efforts to standardize our approach to simulation and to critically explore its strengths and limitations.

Competing interests

None declared.

References

1. Caners K, Baylis J, Heyd C, Chan T. Sharing is caring: how EM Sim Cases (EMSimCases.com) has created a collaborative simulation education culture in Canada. CJEM 2020;22(6):819–21.
2. Baylis J, Heyd C, Thoma B, et al. Development of a national, standardized simulation case template. CJEM 2020;22(6):822–4.
3. Sherbino J, Bandiera G, Doyle K, et al. The competency-based medical education evolution of Canadian emergency medicine specialist training. CJEM 2020;22:95–102.
4. Hall AK, Chaplin T, McColl T, et al. Harnessing the power of simulation for assessment: consensus recommendations for the use of simulation-based assessment in emergency medicine. CJEM 2020;22:194–203.
5. Prudhomme N, O'Brien M, McConnell MM, Dudek N, Cheung WJ. Relationship between ratings of performance in the simulated and workplace environments among emergency medicine residents. CJEM 2020;22(6):811–8.
6. Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: a systematic review and meta-analysis. Acad Med 2015;90:246–56.
7. Chaplin T, Thoma B, Petrosoniak A, et al. Simulation-based research in emergency medicine in Canada: priorities and perspectives. CJEM 2020;22:103–11.
8. Thoma B, Hall AK, Clark K, et al. Evaluation of a national competency-based assessment system in emergency medicine: a CanDREAM study. J Grad Med Educ 2020;12:425–34.