
CAEP 2019 Academic Symposium: Got competence? Best practices in trainee progress decisions

Published online by Cambridge University Press:  25 March 2020

Warren J. Cheung*
Affiliation:
Department of Emergency Medicine, University of Ottawa, Ottawa, ON
Teresa M. Chan
Affiliation:
Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON
Karen E. Hauer
Affiliation:
Division of General Internal Medicine, Department of Medicine, University of California San Francisco, San Francisco, CA
Robert A. Woods
Affiliation:
Department of Emergency Medicine, University of Saskatchewan, Saskatoon, SK
Jill McEwen
Affiliation:
Department of Emergency Medicine, University of British Columbia, Vancouver, BC
Lynsey J. Martin
Affiliation:
Department of Emergency Medicine, University of Saskatchewan, Saskatoon, SK
Catherine Patocka
Affiliation:
Department of Emergency Medicine, University of Calgary, Calgary, AB
Sandy L. Dong
Affiliation:
Department of Emergency Medicine, University of Alberta, Edmonton, AB
Munsif Bhimani
Affiliation:
Division of Emergency Medicine, Western University, London, ON
Tamara McColl
Affiliation:
Department of Emergency Medicine, University of Manitoba, Winnipeg, MB
*
Correspondence to: Dr. Warren J. Cheung, F6 Clinical Epidemiology Program, The Ottawa Hospital, Civic Campus, 1053 Carling Avenue, Ottawa, Ontario, Canada, K1Y4E9; Email: [email protected]

Abstract

Background

Competence committees play a key role in a competency-based system of assessment. These committees are tasked with reviewing and synthesizing clinical performance data to make judgments regarding residents’ competence. Canadian emergency medicine (EM) postgraduate training programs recently implemented competence committees; however, a paucity of literature guides their work.

Objective

The objective of this study was to develop consensus-based recommendations to optimize the function and decisions of competence committees in Canadian EM training programs.

Methods

Semi-structured interviews of EM competence committee chairs were conducted and analyzed. The interview guide was informed by a literature review of competence committee structure, processes, and best practices. Inductive thematic analysis of interview transcripts was conducted to identify emerging themes. Preliminary recommendations, based on themes, were drafted and presented at the 2019 CAEP Academic Symposium on Education. Through a live presentation and survey poll, symposium attendees representing the national EM community participated in a facilitated discussion of the recommendations. The authors incorporated this feedback and identified consensus among symposium attendees on a final set of nine high-yield recommendations.

Conclusion

The Canadian EM community used a structured process to develop nine best practice recommendations for competence committees addressing: committee membership, meeting processes, decision outcomes, use of high-quality performance data, and ongoing quality improvement. These recommendations can inform the structure and processes of competence committees in Canadian EM training programs.

Résumé

Contexte

Les comités de compétences jouent un rôle important dans les systèmes d’évaluation fondée sur les compétences. Les premiers ont pour tâches d'examiner les données sur la performance clinique et d'en faire la synthèse afin que soient étayés les jugements portés sur la compétence des résidents. Ainsi, des comités de compétences ont été mis sur pied récemment dans les programmes de formation postdoctorale en médecine d'urgence (MU) au Canada, mais leurs travaux reposent sur une documentation plutôt maigre.

Objectif

L’étude visait à élaborer des recommandations consensuelles dans le but d'améliorer le fonctionnement et les prises de décision des comités de compétences, dans les programmes de formation en MU, au Canada.

Méthode

L’étude consistait en des entretiens semi-structurés avec des présidents et présidentes de comités de compétences en MU, puis en leur analyse. Le guide d'entretien découlait d'un examen de la documentation sur la structure des comités, les processus et les pratiques exemplaires. L’équipe a par la suite procédé à une analyse thématique inductive de la transcription des entretiens afin de dégager de nouveaux thèmes. Des recommandations préliminaires, fondées sur les thèmes ont ensuite été formulées, puis présentées au Symposium 2019 de la section des affaires universitaires de l'Association canadienne des médecins d'urgence, sur la formation. Les délégués au symposium représentant la communauté de la MU à l’échelle nationale ont participé, par l'intermédiaire d'une présentation en direct et d'un sondage, à une discussion guidée par un animateur sur les recommandations. Les auteurs ont finalement intégré les commentaires recueillis et sont parvenus à dégager un consensus parmi les participants au symposium sur un ensemble définitif de neuf recommandations susceptibles de donner de bons résultats.

Conclusion

La communauté de la MU au Canada a appliqué un processus structuré afin d’élaborer des recommandations (9) en matière de pratiques exemplaires, à l'intention des comités de compétences, touchant la composition des comités, les processus de réunion, les résultats des prises de décision, l'utilisation de données de qualité sur la performance et l'amélioration continue de la qualité. Ces recommandations peuvent être utilisées pour guider la structure et les processus des comités de compétences dans les programmes de formation en MU au Canada.

Type: CAEP Paper

Copyright © Canadian Association of Emergency Physicians 2020

INTRODUCTION

Residency programs face the important task of determining whether their trainees have acquired the competencies necessary to provide high-quality, safe patient care. Historically, readiness for practice was defined by the completion of a prescribed number of years of training within an accredited residency program.1 However, in today's era of greater social accountability, the public expects programs to have a process in place to ensure that their graduating physicians are adequately prepared for independent practice.2 Specifically, there have been calls for increased transparency in confirming the competence of graduating residents.3 In response, specialty training programs in Canada are undergoing a fundamental transformation from a time-based to an outcomes-based approach to delivering medical education, aptly named Competence By Design.3,4 Adoption of the Competence By Design framework is still in its earliest stages, with novel innovations, curricular architectures, and assessment strategies in development that emphasize increased accountability and individualized learning.

Canadian emergency medicine (EM) specialty training programs implemented Competence By Design on July 1, 2018. With this transition, the responsibility for monitoring resident progress and making promotion decisions no longer falls solely on residency program directors. Instead, synthesis of resident assessment data and decisions about resident progress and promotion occur through a group process at the competence committee.5 Despite a year of experience formally implementing competence committees in Canadian EM specialty training programs, many questions remain about how these committees should best approach their work of rendering judgments of competence.6 What information sources should be used? What group processes should be applied to synthesize this information? How can one be sure that judgments and decisions about progress are trustworthy? Although some research on competence committees exists in the American context,1,6 there is a paucity of literature to guide the work of competence committees specific to the Canadian context.

The purpose of the 2019 Canadian Association of Emergency Physicians (CAEP) Academic Symposium on Education was to identify a consensus set of high-yield recommendations for educators and residency programs to enhance EM residency training. In this study, we describe the results of the consensus recommendations to optimize the function and decisions of competence committees in Canadian EM training programs.

METHODS

Formation of our expert panel

The CAEP Academic Section assembled an expert working group of EM physicians from across the country, with attention to geographic representation, language, scope of practice, and experience in education scholarship and leadership. The final working group comprised nine Canadian emergency physicians representing eight medical schools in five provinces, plus one content expert in competence committee scholarship from the University of California, San Francisco (K.E.H.). The group included emergency physicians certified by the College of Family Physicians of Canada with Special Competence in Emergency Medicine (CCFP-EM) and emergency physicians certified by the Royal College of Physicians and Surgeons of Canada (FRCPC), and it represented a variety of education leadership positions and a mix of advanced training in medical education. The group met monthly for one year by teleconference to design and implement the study and, ultimately, to develop preliminary recommendations on EM competence committee best practices.

Study design and ethics

This qualitative study involved semi-structured interviews with the competence committee chairs of all 14 FRCPC EM programs, including their satellite sites, to explore the purpose, structure, operations, and best practices of competence committees. Thematic analysis of the interviews informed a set of preliminary recommendations, which were presented, discussed, refined, and voted on at the 2019 CAEP Academic Symposium on Education in Halifax, Nova Scotia. This study was approved by the Ottawa Health Science Network Research Ethics Board (ID: 20180186-01H).

Development of interview guide

With the aid of a university librarian, we conducted a literature review of competence committee structure, processes, and best practices by searching the PubMed, Embase, and MEDLINE databases.7 Search terms included: competence committee, clinical competency committee, clinical competence, committee membership, group decision making, progression decisions, competence assessment, workplace-based assessment, resident progression, and resident competence. This review informed the development of the guide for the semi-structured interviews of competence committee chairs. All expert working group members participated in constructing the interview guide, and a content expert (K.E.H.) ensured that all relevant and emerging literature was captured. Interview questions addressed successes, challenges, and typical discussions occurring during competence committee meetings. W.J.C. and T.M. piloted the interview guide with five competence committee members who were known to the expert working group but were not study participants. Based on their feedback, minor modifications were made to the wording of the interview guide for clarity and comprehensibility. The final interview guide is available as Online Supplemental Appendix A.

Data collection and analysis

Seven expert working group members conducted individual phone interviews with participants, all of whom provided verbal consent. Interviews were audio recorded using Zoom Video Communications (San Jose, CA, USA), professionally transcribed, and de-identified during transcription. Three members (W.J.C., T.M., and T.M.C.) conducted an inductive thematic analysis using the qualitative approach described by Braun and Clarke: (a) become familiar with the data, (b) develop initial codes, (c) collate codes into themes, (d) review each theme, (e) define each theme, and (f) finalize the analysis.8 Through joint discussions, the three members synthesized the codes from each interview and identified larger themes. They compared the themes with the original transcripts to ensure that none were missed and that each accurately reflected the content of the interviews. All three investigators iteratively discussed differences in thematic interpretation until they reached agreement.

Box 1. Consensus recommendations to optimize the function and decisions of competence committees in Canadian EM training programs

Recommendation development

The expert working group reviewed the preliminary themes through a series of teleconferences and email correspondence. They synthesized them into five dominant themes: (1) Competence Committee Membership, (2) High-Quality Data, (3) Competence Committee Meeting Process, (4) Competence Committee Decision Outcomes, and (5) Continuous Quality Improvement, and drafted 11 preliminary recommendations within these themes. The recommendations were presented to 60 emergency physicians at the 2019 CAEP Academic Symposium on Education in Halifax, Nova Scotia on May 25, 2019. Through a live presentation and survey poll guided by an expert working group member (T.M.), the audience engaged in a facilitated discussion of the recommendations and provided feedback on their wording and organization. Revisions were made, and the audience then used an online platform (www.polleverywhere.com) to vote on the inclusion of each revised recommendation against a predetermined consensus threshold of 80%. Two recommendations did not achieve consensus (Online Supplemental Appendix B). Here we present a final set of nine best practice recommendations to optimize the function of, and decisions made by, competence committees in Canadian EM training programs (Box 1).

SUMMARY OF RECOMMENDATIONS

Recommendation 1 Competence committee members should participate in regular faculty development addressing how the committee functions within the larger structure of the local postgraduate medical education system, its intrinsic processes, and its expected outputs.

Committee members may vary in their experience and knowledge of different assessment methods, competency frameworks, and committee processes.9,10 Frame-of-reference training can help members develop a shared mental model of the purpose of the committee and the performance expectations at each stage of training.11 Faculty development that focuses on group process can also help members mitigate the potential influences of personal cognitive biases and groupthink on committee decisions and outputs.6,12 Member turnover and the expected evolution of committee processes (see Recommendation 8) make it essential that new and longstanding members receive regular training. Such training can be scheduled during committee meetings or can occur as separate faculty development sessions organized locally or by national stakeholders.

Recommendation 2 Postgraduate training programs should establish well-defined descriptions of competence committee member qualifications, terms of service, and statements of work in the committee's Terms of Reference.

Committee member characteristics can influence the diversity and representativeness of information available during committee deliberations.6 Individuals with a demonstrated commitment to improving trainee education should be recruited,13 and locally determined qualifications should be clearly outlined in the committee's Terms of Reference. Rotating membership, accomplished by defining terms of service, can introduce new and diverse opinions and minimize groupthink.14 The work of competence committees may require members to contribute time and energy outside of meetings (e.g., to review resident data); the member's statement of work should outline these expectations and the supports available. To streamline decision-making, the statements of work should also define individual member roles within the committee (e.g., Chair, Program Director) and delineate voting and nonvoting members.15

Recommendation 3 The competence committee should be provided with data from multiple sources and contexts to generate a comprehensive overview of resident progression.

To make summative judgments of resident competence, the committee must incorporate multiple assessment methods that provide data sampled across content areas and clinical contexts, as well as over time.16 Competence is specific, not generic: performance in one context does not predict performance in another. Regardless of what is being measured, or how it is being measured, assessment of resident competence and the observed performance are specific to the context.17 Additionally, the concern over rater idiosyncrasies in workplace-based assessment can be mitigated by ensuring that data are collected from multiple observers.18 Designing a program of assessment in which each competency domain is intentionally informed by multiple assessment sources, and each source informs multiple competency domains, can help programs ensure that adequately sampled performance data inform competence committee decisions.19

Recommendation 4 Discussion of trainee competence should follow clear processes and procedures to ensure fair and transparent resident progress decisions.

Applying structured group procedures facilitates information sharing among members and improves decision quality.6,20 Committees should follow clear processes that outline what information is reviewed before and during meetings, how information is shared, and the defined goals of committee discussions. Structured procedures that allow sufficient time for discussion, solicit multiple perspectives, and encourage consideration of alternatives can also help mitigate biased outcomes.21,22 Competence committees should develop a “Process and Procedures” document that details the approach to group decision-making, including the steps and requirements to reach a final decision (e.g., definition of quorum, process for reaching consensus or majority vote). A transparent deliberation and decision-making process will enhance the credibility of the committee's outputs for stakeholders.23

Recommendation 5 Competence committee progress decisions should be based on documented data presented to the committee with specific avoidance of undocumented personal attestations.

The high-stakes decisions made by the competence committee must be credible and trustworthy.24 The introduction of undocumented personal attestations threatens to obscure the committee's decision-making process and introduces potential bias. Dickey et al. have highlighted several cognitive biases to which committee members are susceptible (e.g., bandwagon bias),12 and these can emerge when discussions center on undocumented personal experiences. While data from informal sources, such as hallway conversations, may warrant further consideration, they may not always be captured by the formal program of assessment. Programs should therefore have a process to document these informal data and reconcile them with other resident performance data at the competence committee. Ultimately, the committee's judgments should stem from a transparent process that relies on data that can be audited by outsiders, if necessary.23

Recommendation 6 Outcomes and decisions of competence committee proceedings should be promptly communicated to the residents.

Regular communication and performance feedback are powerful learning tools for medical trainees and are essential for personal and professional development.25 Residents with identified deficiencies in core knowledge areas, procedural skills, attitudes, or any CanMEDS competency should receive high-quality, detailed feedback to guide their clinical and academic focus and, when needed, remediation.26 Early identification and correction of perceived deficiencies promote clinical advancement, resident well-being, and improved patient care.27,28 Regular feedback is equally important for residents progressing on track and for those identified on an accelerated path, to provide support and nurture them on a continued path toward mastery in EM.1 Communication should include not only the residents themselves but also relevant stakeholders, including their academic advisors and faculty coaches, when applicable.15

Recommendation 7 Competence committee progress decisions should inform residents’ development of individualized learning plans with guidance from a faculty coach.

Reflective practice, essential for critical thinking and professional development, is an important skill that residents must develop to achieve competence and progress through their training.29,30 Transparent progress decisions and consistent communication facilitate the development of trainee self-regulation and can inform individualized learning plans. These learning plans can help close the gap between current and desired performance and, for those on an accelerated path, support the pursuit of mastery.31 Ensuring that the trainee's faculty advisor or academic coach also has timely access to the committee's findings enhances the utility of feedback, encourages reflective practice, and gives the trainee a point of contact to help maintain their developmental trajectory.32 An effective coach can help advance the trainee's potential and maximize clinical, professional, and academic performance.33

Recommendation 8 Postgraduate training programs and local competence committees should engage in processes of continuous quality improvement to ensure high-quality data informs valid and defensible progress decisions.

Evaluation of individual competence committee processes and practices is crucial to reduce undesired variability in resident assessment and to maintain a transparent, innovative, and effective system for progress decisions.34 Both internal and external stakeholders can be involved in systematically identifying and prioritizing gaps and challenges in the system, with regular feedback from administration, faculty, and residents.35 Structured continuous quality-improvement practices give training programs an approach to evaluate their processes and effectively restructure their procedures as necessary.15,35

Recommendation 9 The national EM community should work collaboratively to share best practices and innovations in competence committee structure and process.

Competence committees across the country should engage in collaborative efforts and share information regarding challenges, best practices, and innovations. Online platforms now make cross-program communication and collaboration readily achievable. Collaboration between programs can increase efficiency, decrease the time and effort spent on program assessment and redesign, and provide mentorship and coaching from programs with greater infrastructure and resources. Such collaboration has the potential to raise the national standard for competence committee procedures and practices.36

CONCLUSION

Competence committees are the cornerstone of a program of assessment within a competency-based medical education framework. In this study, we describe nine key consensus recommendations for EM competence committees addressing committee membership, meeting processes, decision outcomes, the use of high-quality performance data, and continuous quality improvement. Implementing these recommendations can optimize the function and decisions of competence committees in Canadian EM training programs.

Supplemental material

The supplemental material for this article can be found at https://doi.org/10.1017/cem.2019.480.

Acknowledgements

The authors thank the competence committee chairs for their time and for sharing their insights and experiences. We also thank all of the many dedicated EM educators within our community who participated in the consensus conference in Halifax, Nova Scotia.

Financial support

Transcription costs for this study were kindly supported by the Academic Section of the Canadian Association of Emergency Physicians.

Competing interests

None declared.

References

1. Hauer KE, Chesluk B, Iobst W, et al. Reviewing residents' competence: a qualitative study of the role of clinical competency committees in performance assessment. Acad Med 2015;90(8):1084–92.
2. Cruess SR, Cruess RL. Professionalism and medicine's social contract with society. Clin Orthop Relat Res 2004;6(4):12–6.
3. Frank JR, Snell LS, Cate O, et al. Competency-based medical education: theory to practice. Med Teach 2010;32(8):638–45.
4. Royal College of Physicians and Surgeons of Canada. Competence By Design. 2017. Available at: http://www.royalcollege.ca/rcsite/cbd/competence-by-design-cbd-e (accessed December 11, 2019).
5. Royal College of Physicians and Surgeons of Canada. Competence Committees. 2017. Available at: http://www.royalcollege.ca/rcsite/cbd/assessment/competence-committees-e (accessed December 11, 2019).
6. Hauer K, ten Cate O, Boscardin C, et al. Ensuring resident competence: a narrative review of the literature on group decision making to inform the work of clinical competency committees. J Grad Med Educ 2016;8(2):156–64.
7. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005;8(1):19–32.
8. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006;3(2):77–101.
9. Ekpenyong A, Baker E, Harris I, et al. How do clinical competency committees use different sources of data to assess residents' performance on the internal medicine milestones? A mixed methods pilot study. Med Teach 2017;39(10):1074–83.
10. Oudkerk Pool A, Govaerts MJB, Jaarsma DADC, Driessen EW. From aggregation to interpretation: how assessors judge complex data in a competency-based portfolio. Adv Health Sci Educ Theory Pract 2018;23(2):275–87.
11. Holmboe ES, Hawkins R. Practical Guide to the Evaluation of Clinical Competence. Philadelphia, PA: Mosby/Elsevier; 2008.
12. Dickey CC, Thomas C, Feroze U, Nakshabandi F, Cannon B. Cognitive demands and bias: challenges facing clinical competency committees. J Grad Med Educ 2017;9(2):162–4.
13. Holmboe ES, Rodak W, Mills G, McFarlane MJ, Schultz HJ. Outcomes-based evaluation in resident education: creating systems and structured portfolios. Am J Med 2006;119(8):708–14.
14. Lewis K, Belliveau M, Herndon B, Keller J. Group cognition, membership change, and performance: investigating the benefits and detriments of collective knowledge. Organ Behav Hum Decis Process 2007;103(2):159–78.
15. Kinnear B, Warm EJ, Hauer KE. Twelve tips to maximize the value of a clinical competency committee in postgraduate medical education. Med Teach 2018;40(11):1110–5.
16. Van Der Vleuten CPM, Schuwirth LW, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach 2012;34(3):205–14.
17. van der Vleuten C, Sluijsmans D, Joosten-ten Brinke D. Competence assessment as learner support in education. In: Mulder M, ed. Competence-based Vocational and Professional Education. Switzerland: Springer International Publishing; 2017:607–30.
18. Carraccio C, Englander R, Van Melle E, et al. Advancing competency-based medical education. Acad Med 2016;91(5):645–9.
19. Schuwirth LWT, Van Der Vleuten CPM. Programmatic assessment: from assessment of learning to assessment for learning. Med Teach 2011;33(6):478–85.
20. Lu L, Yuan YC, McLeod PL. Twenty-five years of hidden profiles in group decision making: a meta-analysis. Pers Soc Psychol Rev 2012;16(1):54–75.
21. Mesmer-Magnus JR, Dechurch LA. Information sharing and team performance: a meta-analysis. J Appl Psychol 2009;94(2):535–46.
22. Kerr NL, Tindale RS. Group performance and decision making. Annu Rev Psychol 2004;55:623–55.
23. Donato AA, Alweis R, Wenderoth S. Design of a clinical competency committee to maximize formative feedback. J Community Hosp Intern Med Perspect 2016;6(6):33533.
24. Van Der Vleuten CPM, Schuwirth LWT, Driessen EW, Govaerts MJB, Heeneman S. Twelve tips for programmatic assessment. Med Teach 2015;37(7):641–6.
25. Renting N, Gans ROB, Borleffs JCC, Van Der Wal MA, Jaarsma ADC, Cohen-Schotanus J. A feedback system in residency to evaluate CanMEDS roles and provide high-quality feedback: exploring its application. Med Teach 2016;38(7):738–45.
26. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach 2010;32(8):676–82.
27. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med 2002;77(5):361–7.
28. Bing-You RG, Trowbridge RL. Why medical educators may be failing at feedback. JAMA 2009;302(12):1330–1.
29. Albanese MA. Crafting the reflective lifelong learner: why, what and how. Med Educ 2006;40(4):288–90.
30. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med 2008;15(11):988–94.
31. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud High Educ 2006;31(2):199–218.
32. Gonzalo JD, Wolpaw DR, Krok KL, Pfeiffer MP, McCall-Hosenfeld JS. A developmental approach to internal medicine residency education: lessons learned from the design and implementation of a novel longitudinal coaching program. Med Educ Online 2019;24(1):1591256.
33. Lovell B. What do we know about coaching in medical education? A literature review. Med Educ 2018;52(4):376–90.
34. Baartman LKJ, Prins FJ, Kirschner PA, van der Vleuten CPM. Self-evaluation of assessment programs: a cross-case analysis. Eval Program Plann 2011;34(3):206–16.
35. Brateanu A, Thomascik J, Koncilja K, Spencer AL, Colbert CY. Using continuous quality-improvement techniques to evaluate and enhance an internal medicine residency program's assessment system. Am J Med 2017;130(6):750–5.
36. Gill AC, West C, Watzak B, Quiram B, Pillow T, Graham L. Twelve tips for curriculum sharing and implementation: don't reinvent the wheel. MedEdPublish 2016;5(3). Available at: http://www.mededpublish.org/manuscripts/708/v1 (accessed December 11, 2019).