
Determining content for a simulation-based curriculum in pediatric emergency medicine: results from a national Delphi process

Published online by Cambridge University Press:  20 May 2015

Ilana Bank*
Affiliation:
Division of Pediatric Emergency Medicine, Montreal Children’s Hospital, McGill University, Montreal, QC; McGill Centre for Medical Education, McGill University, Montreal, QC; Arnold and Blema Steinberg (McGill) Simulation Centre, McGill University, Montreal, QC
Adam Cheng
Affiliation:
University of Calgary, Calgary, AB; Research and Development, KidSIM-ASPIRE Simulation Research Program, Department of Pediatrics, Alberta Children’s Hospital, Calgary, AB
Peter McLeod
Affiliation:
McGill Centre for Medical Education, McGill University, Montreal, QC; Departments of Medicine, Pharmacology, and Pediatrics, McGill University, Montreal, QC
Farhan Bhanji
Affiliation:
McGill Centre for Medical Education, McGill University, Montreal, QC; Pediatrics, McGill University, Montreal, QC; The Royal College of Physicians and Surgeons of Canada, Ottawa, ON.
*
Correspondence to: Dr. Ilana Bank, Division of Pediatric Emergency Medicine, Montreal Children’s Hospital, 2300 Tupper #T121, Montreal, QC H3H 1P3; Email: [email protected]

Abstract

Objectives

By the end of residency training, pediatric emergency medicine (PEM) residents are expected to have developed the confidence and abilities required to manage acutely ill children. Acquisition of this competence requires clinical exposure to critical and noncritical medical presentations and/or supplemental formal education. Simulation can provide experiential learning and can improve trainees’ knowledge, skills, and attitudes. The primary objective of this project was to identify the content for a simulation-based national curriculum for PEM training.

Methods

We recruited participants for the Delphi study by contacting current PEM program directors and immediate past program directors, as well as simulation experts, at all Canadian PEM fellowship sites. We determined the appropriate core content for the Delphi study by combining the PEM core content requirements of the Royal College of Physicians and Surgeons of Canada (RCPSC) and the American Board of Pediatrics (ABP). Using the Delphi method, we achieved consensus among this national group of PEM and simulation experts; participants completed three Delphi rounds using a four-point Likert scale.

Results

Response rates for the Delphi were 85% for the first round and 77% for the second and third rounds. From the initial 224 topics, 53 were eliminated (scored <2). Eighty-five topics scored between 2 and 3, and 87 scored between 3 and 4. The 48 topics that scored between 3.5 and 4.0 were labeled “key curriculum topics.”

Conclusion

We have iteratively established consensus on the content of a national simulation-based curriculum.

Résumé

Objectives

By the end of their training, pediatric emergency medicine (PEM) residents are expected to have acquired the confidence and skills needed to manage acutely ill children. Acquiring these competencies requires exposure to critical and noncritical clinical presentations, or supplemental training in these areas. Simulation can provide experiential learning and improve trainees’ knowledge, skills, and attitudes. The primary objective of the study described here was to establish the content of a national simulation-based curriculum for PEM.

Methods

To recruit participants for the Delphi study, the authors contacted the current PEM program directors and their immediate predecessors, as well as simulation experts, at every Canadian institution offering PEM fellowship training. The core content for the study was developed from the relevant requirements of the Royal College of Physicians and Surgeons of Canada and the American Board of Pediatrics. The Delphi method was used to reach consensus within the national group of PEM and simulation experts over three rounds of consultation, using a four-point Likert scale.

Results

Response rates were 85% in the first round and 77% in the second and third rounds of the Delphi process. Of the 224 topics initially presented, 53 were eliminated (score <2). In addition, 85 topics scored between 2 and 3, and 87 scored between 3 and 4. Finally, 48 topics, with scores ranging from 3.5 to 4.0, were identified as the key topics to include in the curriculum.

Conclusion

An iterative consensus-building process established the content of a national simulation-based training curriculum.

Type
Original Research
Copyright
Copyright © Canadian Association of Emergency Physicians 2015 

INTRODUCTION

By the end of postgraduate training, pediatric emergency medicine (PEM) residents are assumed to possess appropriate knowledge, skills, and attitudes to treat acutely ill children. Achievement of competence in managing acutely ill children requires broad clinical exposure supplemented by educational opportunities.

The volume of acute trauma and resuscitation in pediatric emergency departments is limited.1 Nevertheless, treatment of seriously ill patients is time-sensitive and depends on careful coordination of team performance to ensure efficient and effective patient care. It is impractical and unethical for PEM trainees to rely entirely on clinical interactions with critically ill patients to develop and master the skills required to manage patient resuscitation effectively.2 In the United States and Canada, new duty hour regulations limit the time trainees are permitted to spend providing clinical care, potentially reducing opportunities for exposure to important clinical problems.3,4 These circumstances coincide with a growing push toward competency-based medical education (CBME) to ensure competent practitioners.5 Thus, the challenge for PEM training program directors is to minimize the risk of harm to critically ill children while ensuring that PEM physicians acquire the knowledge, skills, and attitudes necessary to become effective and competent clinicians.

PEM training program directors currently address these challenges through multiple methods, one of which is simulation-based education (SBE).2,6 Procedural skills training using simulation has been shown to improve the acquisition of skills such as lumbar puncture,7 chest tube insertion,8 and intubation.9 SBE provides on-demand, experiential learning that can be customized to the learning needs of specific trainees or groups.10 It is now widely used for teaching clinical expertise in domains that are infrequently encountered in the clinical setting11-13 and has proven effective in promoting the acquisition of the knowledge, skills, and attitudes necessary to perform effectively as an emergency physician.13-26 SBE affords residents the opportunity to apply and improve their individual and team crisis resource management (CRM) skills,27 as well as to practice recognition and medical management of rare events.28 The opportunity to practice in a safe learning environment with dedicated time for debriefing allows the development of competence that is not easily achieved in the clinical environment.29 Several simulation-based studies have demonstrated transfer of skills from simulation to the clinical environment.30,31 SBE has been shown to improve skill performance and task completion times in real patients while improving patient outcomes in both pediatric and adult populations.10,32 SBE can also serve as an educational activity within a CBME curriculum.5 This growing body of literature supports the structured integration of SBE into PEM training programs.10,12 Although many SBE programs are well described in the literature, consensus-based learning objectives for simulation-based training have not been formally addressed at a national level in the United States or Canada.1,33

Our research project was designed to identify the appropriate content required for a comprehensive simulation-based national curriculum for PEM training. We used the Delphi method34 to build consensus among a national group of PEM and simulation experts.

METHODS

Ethics

The Research Ethics Board of the Faculty of Medicine, McGill University, granted ethics approval for the study. We explained the Delphi study process to all participants. The completion of each step of the Delphi process served as informed consent.

Study population

Purposive sampling was used to select study participants. Using a list of PEM program directors from the Royal College of Physicians and Surgeons of Canada (RCPSC), we emailed the current program directors of all 10 PEM fellowship training programs in Canada, as well as immediate past program directors who had left the position within the previous 5 years (n=16), inviting them to participate in the study. We also asked each current program director to provide the names of one or more simulation experts at their site (n=10). Once all potential participants had been identified, we obtained informed consent from each for participation in a three-round Delphi consensus-building process.

Data collection—Delphi consensus building process

The Delphi method employs an iterative process to collect informed judgments from experts in the field.34 For the purpose of this research, simulation was defined as the learning process wherein trainees practice a procedure or routine in an authentic learning environment before treating actual patients. These environments use different scenarios and equipment and thus vary in realism.35 We included the following in our definition of simulation: 1) high-fidelity simulators: whole-body, computerized simulators (e.g., Laerdal SimMan, Laerdal SimBaby); 2) task trainers: lifelike models of body parts, such as an arm or pelvis, which are useful for breaking specific tasks down into steps; 3) low-fidelity mannequins: mannequins that are not computer based; and 4) standardized patients: trained actors who simulate patients in a standard manner.

Members of the research team (IB, AC, FB) developed an inclusive list of core content suitable for learning through simulation. The core content was determined by combining the PEM core content requirements of the RCPSC and the American Board of Pediatrics (ABP).36,37 The resulting list of 306 topics was then iteratively reviewed by the investigators with content and simulation expertise (IB, AC, FB). Based on group consensus, 82 content items were removed because they were deemed unsuitable for teaching with simulation; examples of removed items include fractures of primary and secondary teeth and management of dysmenorrhea. This step reduced the volume of Delphi content in order to maximize full participation of our expert panel. The remaining list included all topics that could potentially be taught using simulation as an educational modality.

We then approached the group of 26 PEM and simulation experts to request their input in defining the appropriate content for training. For each of the 224 items, spanning 22 content categories, each expert was instructed to rate the suitability of simulation as a pedagogical tool. Each item was rated on a four-point scale anchored by the following descriptors: definitely should be taught using simulation=4; should be taught using simulation=3; can be taught using simulation=2; best taught using methods other than simulation=1. Because we were not aware of any similar studies identifying a simulation curriculum, the descriptors were developed by consensus among the authors and reflect the use of simulation as an innovative, supplemental educational modality.

The experts were encouraged to add to the list any missing learning objectives that they felt should be included in a simulation training program for PEM. After each iteration of the Delphi process, items with an average rating below 2 on the four-point scale were eliminated; a score below 2 had been predetermined as the elimination threshold at any point in the process. It had also been predetermined that any item rated greater than 3.5 at the completion of the process would be considered a key learning objective. The results of the first iteration, including the average rating of each objective and the removal of the lowest-ranked objectives, were returned to the expert panel members, who were asked to reconsider their ratings.38 This rating process was repeated a second and a third time, wherein experts were asked to review their ratings while reflecting on the ratings of the entire group.33,39-41 The survey was administered using Survey Monkey42 and contained no identifying characteristics of the respondents, so rating submissions remained confidential.
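For readers who want to see the scoring logic in concrete form, the following is a minimal Python sketch of how a single Delphi round could be aggregated against the thresholds described above (items averaging below 2 are eliminated; items averaging above 3.5 at completion are flagged as key topics). The topic names, ratings, and function are hypothetical illustrations only, not the study’s instrument or data.

```python
# Minimal sketch of aggregating one Delphi round (hypothetical data, not the study's).
# Four-point scale used in the study:
#   4 = definitely should be taught using simulation
#   3 = should be taught using simulation
#   2 = can be taught using simulation
#   1 = best taught using methods other than simulation
from statistics import mean

ELIMINATION_THRESHOLD = 2.0  # items with an average rating below this are dropped
KEY_TOPIC_THRESHOLD = 3.5    # items averaging above this at completion are "key"


def aggregate_round(ratings):
    """Average each topic's expert ratings and split into retained, eliminated, and key sets."""
    averages = {topic: mean(scores) for topic, scores in ratings.items()}
    eliminated = {t for t, avg in averages.items() if avg < ELIMINATION_THRESHOLD}
    retained = {t: avg for t, avg in averages.items() if t not in eliminated}
    key_topics = {t for t, avg in retained.items() if avg > KEY_TOPIC_THRESHOLD}
    return retained, eliminated, key_topics


# Invented example ratings from four hypothetical experts:
round_ratings = {
    "Topic A (e.g., a resuscitation scenario)": [4, 4, 3, 4],   # average 3.75 -> key topic
    "Topic B (e.g., a low-acuity presentation)": [1, 2, 1, 1],  # average 1.25 -> eliminated
}
retained, eliminated, key_topics = aggregate_round(round_ratings)
print(retained, eliminated, key_topics)
```

In the actual study, the output of each such round (average ratings and the list of eliminated items) was returned to the panel for the next iteration, as described above.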

RESULTS

Sixteen past and present program directors and ten simulation experts participated in the process. All 10 accredited PEM program sites were represented among the participants.

The response rate for the first round of the Delphi was 85% (22 of 26 participants responded). Thirty-eight items were eliminated in the first round. The response rate for the second round of the Delphi was 77% (20 of the 26 participants). During that round, another 15 items were eliminated. In the third and final round of the Delphi, the response rate was 77%. In this last round, no items were eliminated (Figure 1).

Figure 1 Delphi Process

The final list of 172 topics fell into 19 different content categories (Table 1). The Delphi process led to the elimination of all of the dentistry, dermatology, and orthopedics topics. All of the topics in the resuscitation and CRM categories remained, and all were categorized as high priority. Of the 172 topics remaining on the list, 85 scored between 2 and 3, 87 scored between 3 and 4, and 48 scored in the predetermined range of “key learning objectives” (Table 2) (Appendix A).

Table 1 Final Delphi Categories and Topics

Table 2 Key Curriculum Topics

DISCUSSION

To identify content for a national simulation-based curriculum, we built expert consensus using a three-round Delphi process. We propose to use the learning objectives that scored above 3 to form the basis of a new national PEM simulation curriculum. Some PEM teaching programs have developed their own curricula using simulation as a key educational modality to supplement clinical experience. Two such curricula2,6 have improved education in their local environments. One is a detailed 2-year modular simulation-based curriculum for PEM trainees that received positive feedback from learners and was perceived as successful at addressing gaps in previous training.2 Another addressed the challenges of acute care training with a 1-day workshop built around six instructional simulated cases.6 That study produced a PEM instructional curriculum with modest subjective gains and demonstrated the need for more frequent and focused simulation education for trainees. Neither curriculum was developed using a formalized needs assessment incorporating external stakeholder input from other institutions, and neither used a national expert panel. As such, it may not be appropriate to assume that these curricula are generalizable to the needs of trainees across all PEM training programs.

In the past, the Delphi process has been used to create curricula, such as a national pediatric trauma curriculum41 and a procedural skills curriculum for medical students.43 Both projects highlight the importance of conducting structured, multi-institutional needs assessments to ensure that the results reflect generalizable content. The Delphi consensus method employed in this study ensured participation of experts from across Canada, who determined 48 key curriculum topics (scores of 3.5 or greater after three rounds). These 48 topics fall into four categories: 1) crisis resource management (CRM), 2) resuscitation, 3) trauma, and 4) medical procedures. The very high ratings of these items may reflect their relatively low-volume, high-impact nature in the clinical environment. Items that scored between 3 and 3.5 are also considered highly important in PEM training and thus should be included in the future national curriculum. These tended to be topics related to high-acuity scenarios for which clinical exposure may be more consistent, such as blunt abdominal trauma, croup, and bronchiolitis. Items rated between 2.0 and 3.0 were identified as objectives that are important but would not be part of the core curriculum; they could still be incorporated at specific training programs based on perceived local needs.

The next steps in developing the curriculum will include engaging governing bodies at the national level (e.g., the RCPSC Specialty Committee in PEM) as well as simulation experts and learners (fellows) across the country, supporting the development of local simulation resources and infrastructure at each institution, and building local interest in participating in a national curriculum. These factors will dictate whether the curriculum is structured longitudinally or in a “boot camp” (modular) style. Studies assessing courses delivered longitudinally have demonstrated longer retention of skills after course completion,44-46 yet such courses face challenges in maintaining full participation and attendance at every session. Modular or boot camp style courses, such as the standardized resuscitation courses offered by the American Heart Association and the Heart and Stroke Foundation of Canada (e.g., the Advanced Cardiac Life Support course), have demonstrated skill acquisition that decays over a 6- to 12-month period.47-51 However, these courses have the demonstrated benefit of ensuring higher attendance and participation. Further discussion with national stakeholders and simulation experts has resulted in a recommendation that the curriculum be implemented through longitudinal delivery of scenarios spread over the 2 years of fellowship (e.g., sessions every 2 weeks or monthly).

To assist in translating the objectives identified in this project into a national curriculum that members of the PEM teaching community can readily use, several concrete steps are already under way. These steps include the following:

  1. Identification of members of a national working group to develop the curriculum
  2. Determination of the simulation resources required for programs to be able to participate in the curriculum
  3. Creation of simulation cases
  4. Review of cases to ensure that all primary curriculum objectives are covered at least one time (some objectives may be covered more than one time)
  5. Pilot testing of the cases
  6. Creation of an assessment plan
  7. Implementation of the curriculum across all training sites

Once the curriculum has been implemented, it may be necessary to revise and edit the cases (and/or instructional design) to ensure that objectives are properly met. The current consensus across all PEM programs in Canada is to deliver the national simulation curriculum in a longitudinal manner. Each program will complete the national curriculum with the provided cases on a monthly to bimonthly basis, in addition to any other learning opportunities the program may provide. Over the 2-year training period, two simulation cases will be run every month, allowing each trainee to cover all objectives set out by the Delphi consensus and all cases created by the members of the national working group.
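As a rough, illustrative check of this schedule (not an analysis from the study), delivering two cases per month over the 2-year fellowship amounts to 2 × 24 = 48 case sessions, on the same order as the 48 key curriculum topics identified by the Delphi. The sketch below simply enumerates such a hypothetical schedule; the session labels and any pairing of cases to months are assumptions for illustration only.

```python
# Illustrative only: enumerate a hypothetical 2-year longitudinal schedule of
# two simulation cases per month, as proposed for the national curriculum.
CASES_PER_MONTH = 2
MONTHS_OF_TRAINING = 24  # 2-year fellowship

schedule = [
    (month, case_slot)
    for month in range(1, MONTHS_OF_TRAINING + 1)
    for case_slot in range(1, CASES_PER_MONTH + 1)
]

print(len(schedule))  # 48 case sessions over the 2-year training period
```

Any real mapping of specific Delphi objectives to specific cases would be decided by the national working group, as described above.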

Our study had one important limitation: learners were not included in the Delphi process. We chose to exclude learners (PEM fellows) because we thought they might not have sufficient content knowledge or simulation experience to complete the study. As SBE becomes increasingly accepted and available in training centres, it may be important to review the curriculum with current learners, recent graduates, program directors, and PEM experts to address any areas of PEM fellowship training that may be redundant or lacking.

CONCLUSION

PEM fellows are expected to acquire the knowledge, skills, and attitudes to care for all patients in emergency situations. It may not be possible to acquire these through clinical experience alone. We have attempted to address this dilemma by iteratively determining consensus, using the Delphi method, for the content of a national simulation-based curriculum that could be used to supplement clinical training and optimize education. The next steps in this critical process include engaging key stakeholders to develop, pilot test, implement, and refine the curriculum.

Acknowledgements

Ilana Bank received the John Meakins Campbell Fellowship Award in January 2011. Farhan Bhanji received the Cruess Scholar position, which provided protected time to assist in the completion of this manuscript. The authors would like to acknowledge the Pediatric Emergency Medicine Simulation Consensus Group investigators: Darcy Beer, University of Winnipeg; Barbara Blackie, Dalhousie University; Seen Chung, University of Toronto; Michelle Clarke, University of British Columbia; Navid Deghani, University of British Columbia; Vered Gazit, Dalhousie University; Vince Grant, University of Calgary; Elene Khalil, McGill University; Amina Lalani, University of Toronto; Arielle Levy, University of Montreal; Nathalie Lucas, University of Montreal; Kelly Millar, University of Calgary; Jonathan Pirie, University of Toronto; Gurinder Sangha, University of Western Ontario; Sandy Tse, University of Ottawa; and Bruce Wright, University of Alberta.

Supplementary material

To view supplementary material for this article (Appendix A), please visit http://dx.doi.org/10.1017/cem.2015.11

References

1. Krauss BS, Harakal T, Fleischer GR. The spectrum and frequency of illness presenting to a pediatric emergency department. Pediatr Emerg Care 1991;7(2):67-71.
2. Cheng A, Goldman RD, Aish MA, et al. A simulation-based acute care curriculum for pediatric emergency medicine fellowship training programs. Pediatr Emerg Care 2010;26(7):475-480.
3. Drolet BC, Spalluto LB, Fischer SA. Residents' perspectives on ACGME regulation of supervision and duty hours - a national survey. N Engl J Med 2010;363(23):e34, doi:10.1056/NEJMp1011413.
4. Brion LP, Neu J, Adamkin D, et al. Resident duty hour restrictions: is less really more? J Pediatr 2009;154(5):631-632.
5. Snell L, Frank JR, Stoneham G, et al. Competency based medical education. In: Competence by Design: Reshaping Canadian Medical Education. Royal College of Physicians and Surgeons of Canada; 2014:99-106. Available at: http://www.royalcollege.ca/portal/page/portal/rc/common/documents/educational_initiatives/rc_competency-by-design_ebook_e.pdf.
6. Adler MD, Vozenilek JA, Trainor JL, et al. Development and evaluation of a simulation-based pediatric emergency medicine curriculum. Acad Med 2009;84(7):935-941.
7. Gaies MG, Morris SH, Hafler JP, et al. Reforming procedural skills training for pediatric residents: a randomized, interventional trial. Pediatrics 2009;124(2):610-619.
8. Ballard HO, Shook LA, Iocono J, et al. Novel animal model for teaching chest tube placement. J Kentucky Med Assoc 2009;107(6):219-221.
9. Sudikoff SN, Overly FL, Shapiro MJ. High-fidelity medical simulation as a technique to improve pediatric residents' emergency airway management and teamwork: a pilot study. Pediatr Emerg Care 2009;25(10):651-656.
10. Cheng A, Lang T, Starr S, et al. Technology-enhanced simulation and pediatric education: a meta-analysis. Pediatrics 2014;133(5):e1313-e1323, doi:10.1542/peds.2013.2139.
11. Eppich W, Howard V, Vozenilek J, et al. Simulation-based team training in healthcare. Simul Healthc 2011;6(Suppl):S14-S19.
12. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. J Am Med Assoc 2011;306(9):978-988.
13. Cheng A, Duff J, Grant E, et al. Simulation in paediatrics: an educational revolution. Paediatr Child Health 2007;12(6):465-468.
14. Eppich W, Adler MD, McGaghie WC. Emergency and critical care pediatrics: use of medical simulation for training in acute pediatric emergencies. Curr Opin Pediatr 2006;18(3):266-271.
15. Gaba DM, Howard SK, Flanagan B, et al. Assessment of clinical performance during simulated crisis using both technical and behavioural ratings. Anesthesiology 1998;89(1):8-18.
16. Yee B, Naik VN, Joo HS, et al. Nontechnical skills in anesthesia crisis management with repeated exposure to simulation-based education. Anesthesiology 2005;103(2):241-248.
17. Issenberg BS, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. J Am Med Assoc 1999;282(9):861-866.
18. Curran VR, Aziz K, O'Young S, et al. Evaluation of the effect of a computerized training simulator (ANAKIN) on the retention of neonatal resuscitation skills. Teach Learn Med 2004;16(2):157-164.
19. Hall RE, Plant JR, Bands CJ, et al. Human patient simulation is effective for teaching paramedic students endotracheal intubation. Acad Emerg Med 2005;12(9):850-855.
20. Marshall RL, Smith JS, Gorman PJ, et al. Use of a human patient simulator in the development of resident trauma management skills. J Trauma 2001;51(1):17-21.
21. Gilbart MK, Hutchison CR, Cusimano MD, et al. A computer-based trauma simulator for teaching trauma management skills. Am J Surg 2000;179(3):223-228.
22. Shapiro M, Morey JC, Small SD, et al. Simulation-based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care 2004;13(6):417-421.
23. Wallin C-J, Meurling L, Hedman L, et al. Target-focused medical emergency team training using a human patient simulator: effects on behaviour and attitude. Med Educ 2007;41(2):173-180.
24. Ostergaard H, Ostergaard D, Lippert A. Implementation of team training in medical education in Denmark. Qual Saf Health Care 2004;13(Suppl 1):i91-i95.
25. Holcomb JB, Dumire RD, Crommett JW, et al. Evaluation of trauma team performance using an advanced human patient simulator for resuscitation training. J Trauma 2002;52(6):1078-1086.
26. Ilgen JS, Sherbino J, Cook DA. Technology-enhanced simulation in emergency medicine: a systematic review and meta-analysis. Acad Emerg Med 2013;20(2):117-127.
27. Gilfoyle E, Gottesman R, Razack S. Development of a leadership skills workshop in paediatric advanced resuscitation. Med Teach 2007;29(9):e276-e283.
28. Tsai T-C, Harasym PH, Nijssen-Jordan C, et al. Learning gains derived from a high-fidelity mannequin-based simulation in the pediatric emergency department. J Formos Med Assoc 2006;105(1):94-98.
29. Grant VJ, Duff J, Bhanji F, et al. Pediatric simulation. In: Levine AI, DeMaria S Jr, Bryson EO, Schwartz AD, editors. The comprehensive textbook of healthcare simulation. New York: Springer; 2013.
30. Burton KS, Pendergrass TL, Byczkowski TL, et al. Impact of simulation-based extracorporeal membrane oxygenation training in the simulation laboratory and clinical environment. Simul Healthc 2011;6(5):284-291.
31. Sturm L, Windsor JA, Cosman PH, et al. A systematic review of skills transfer after surgical simulation training. Ann Surg 2008;248(2):166-179.
32. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education. J Am Med Assoc 2011;306(9):978-988.
33. McLeod PJ, Steinert Y, Meagher T, et al. The ABCs of pedagogy for clinical teachers. Med Educ 2003;37(7):638-644.
34. Williams PL, Webb C. The Delphi technique: a methodological discussion. J Adv Nurs 1994;19(1):180-186.
35. Government of Western Australia Department of Health. Clinical Simulation Support Unit (CSSU). Available at: http://www.health.wa.gov.au/simulationlearning/home/what.cfm (accessed 20 February 2014).
36. Royal College of Physicians and Surgeons of Canada. Objectives of Training in the Subspecialty of Pediatric Emergency Medicine, version 1.0; 2013.
37. American Board of Pediatrics. Content outline for pediatric emergency medicine: subspecialty in-training, certification, and maintenance of certification examinations. Last revised October 2009. Available at: https://www.abp.org/sites/abp/files/pdf/emer2011.pdf.
38. McLeod PJ, Steinert Y, Meterissian S, Child S. Using the Delphi process to identify the curriculum. Med Educ 2004;38(5):548.
39. Smith KS, Simpson RD. Validating teaching competencies for faculty members in higher education: a national study using the Delphi method. Innov High Educ 1995;19(3):223-234.
40. Rohan D, Ahern S, Walsh K. Defining an anesthetic curriculum for medical undergraduates: a Delphi study. Med Teach 2009;31(1):e1-e5.
41. Valani RA, Yanchar N, Grant V, et al. The development of a national pediatric trauma curriculum. Med Teach 2010;32(3):e115-e119.
42. Survey Monkey. Available at: http://www.surveymonkey.com (accessed 30 January 2014).
43. Sullivan M, Nyquist J, Etcheverry J, et al. Development of a comprehensive school-wide simulation-based procedural skills curriculum for medical students. J Surg Educ 2010;67(5):309-315.
44. Wik L, Myklebust H, Auestad BH, et al. Twelve-month retention of CPR skills with automatic correcting verbal feedback. Resuscitation 2005;66(1):27-30.
45. Christenson J, Nafziger S, Compton S, et al. The effect of time on CPR and automated external defibrillator skills in the Public Access Defibrillation trial. Resuscitation 2007;74(1):52-62.
46. Riegel B, Nafziger S, McBurnie MA, et al. How well are cardiopulmonary resuscitation and automated defibrillator skills retained over time? Results from the Public Access Defibrillation Trial. Acad Emerg Med 2006;13(3):254-263.
47. Smith KK, Gilcreast D, Pierce K. Evaluation of staff's retention of ACLS and BLS skills. Resuscitation 2008;78(1):59-65.
48. Woollard M, Whitfield R, Smith A, et al. Skill acquisition and retention in automated external defibrillator (AED) use and CPR by lay responders: a prospective study. Resuscitation 2004;60(1):17-28.
49. Spooner BB, Fallaha JF, Kocierz L, et al. An evaluation of objective feedback in basic life support (BLS) training. Resuscitation 2007;73(3):417-424.
50. Berden HJ, Willems FF, Hendrick JM, et al. How frequently should basic cardiopulmonary resuscitation training be repeated to maintain adequate skills? Br Med J 1993;306(6892):1576-1577.
51. Woollard M, Whitfield R, Newcombe RG, et al. Optimal refresher training intervals for AED and CPR skills: a randomized controlled trial. Resuscitation 2006;71(2):237-247.