
Ascertaining top evidence in emergency medicine: A modified Delphi study

Published online by Cambridge University Press:  21 June 2018

Stephanie J. Bazak*
Affiliation:
Royal College Specialty Training Program in Emergency Medicine, McMaster University, Hamilton, ON
Jonathan Sherbino
Affiliation:
Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON
Suneel Upadhye
Affiliation:
Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON Niagara Health System and St. Joseph’s Healthcare Hamilton, Hamilton, ON
Teresa Chan
Affiliation:
Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON Michael G. DeGroote School of Medicine, Hamilton, ON
*
Correspondence to: Dr. Stephanie J. Bazak, McMaster University, Emergency Medicine Program, HHSC Hamilton General Hospital Site, McMaster Clinic, 2nd Floor, 237 Barton St E., Hamilton, ON L8L2X2; Email: [email protected]

Abstract

Objectives

The application of evidence-informed practice in emergency medicine (EM) is critical to improving the quality of patient care. EM is a specialty with a broad knowledge base, making it daunting for a junior resident to know where to begin the acquisition of evidence-based knowledge. Our study’s objective was to formulate a list of “top papers” in the field of EM using a Delphi approach to achieve an expert consensus.

Methods

Participants were recruited from all 14 specialty EM programs across Canada through a nomination process conducted by the program directors. The modified Delphi survey consisted of three study rounds, each sent out via email. The study tool was first piloted with McMaster University’s specialty EM residents. During the first round, participants individually listed top papers relevant to EM. During the two subsequent rounds, participants ranked the papers listed in the first round, with a chance to adjust their rankings based on the group’s responses.

Results

A total of eight EM specialty programs responded, providing 30 responses across the three rounds. There were 119 studies suggested in the first round, and, by the third round, a consensus of >70% agreement was reached to generate the final list of 29 studies.

Conclusions

We produced, via an expert consensus, a list of top studies relevant for Canadian EM physicians in training. It can be used as an educational resource for junior residents as they transition into practice.

Résumé

Background

The application of evidence-based practice in emergency medicine (EM) is critical to improving the quality of care. EM is a specialty that demands a broad knowledge base, so much so that junior residents do not know where to begin acquiring evidence-based knowledge. This study therefore aimed to compile a list of the “top papers” in the field of EM, using the Delphi method to establish a consensus among specialists.

Methods

Participants were recruited from the 14 specialty EM programs offered across Canada, following a process in which program directors nominated experts. The survey, conducted according to a modified version of the Delphi method, consisted of three voting rounds, each held by email. The survey tool was first piloted with residents enrolled in McMaster University’s specialty EM program. Participants then individually listed the top papers in EM in the first round and, in the two subsequent rounds, ranked the papers initially submitted and adjusted their rankings as needed according to the group’s responses.

Results

Representatives of eight specialty EM programs provided 30 responses over the three voting rounds. A total of 119 studies were suggested in the first round, and this number was narrowed to 29 on the final list after a consensus of >70% was reached.

Conclusion

Through consensus among specialists, the participants established a list of the studies most relevant to physicians training in EM in Canada. It can serve as an educational resource for junior residents as they transition from training to practice.

Type
Original Research
Copyright
Copyright © Canadian Association of Emergency Physicians 2018 

CLINICIAN’S CAPSULE

What is known about the topic?

EM is a specialty with a broad knowledge base, making it daunting for a junior resident to know where to begin the acquisition of evidence-based knowledge.

What did the study ask?

What “top papers” in the field of EM can be identified using a national Canadian Delphi approach to achieve an expert consensus?

What did the study find?

A list was produced of top studies relevant for Canadian EM physicians in training.

Why does this study matter to clinicians?

The list produced can be used as an educational resource for junior residents.

INTRODUCTION

The ability to critically appraise and apply evidence to clinical practice is a skill required by emergency medicine (EM) specialists and is an accreditation requirement for residency programs.1-4 However, because EM is a generalist specialty, it can be daunting for a junior learner to know where to even begin their acquisition of evidence-informed EM knowledge. There have been many attempts to create lists of sentinel papers for junior learners to use as a springboard to developing an evidence-based clinical practice. Lists of “top cited papers” in EM have been compiled previously, but these lists are often arbitrary, reflecting the idiosyncrasies and experiences of their curators.5,6 In the growing age of FOAM (Free Open Access Medical education), many online resources have created lists of “top papers.”7-9 These lists can be an excellent starting point but tend to be compiled from the opinions of one or two authors, or are focused on recent publications, often excluding sentinel papers published in the past.10,11,14

The Colorado Compendium created a list of 100 sentinel articles for EM residents.12,13 However, it reflects the opinions of a single American academic centre. Similarly, the Best Evidence in Emergency Medicine (BEEM) score15 was created to identify lists of more recent evidence, but it has not been used to aggregate papers that have historically changed practice. While other groups have reported aggregating papers via crowdsourcing from senior residents16,17 at a national review course,18 no systematic survey of Canadian academic EM faculty responsible for training junior emergency physicians has been conducted to date.

The objective of this study was to systematically generate a list of “top EM papers” (i.e., practice changing results based on strong methods) to assist junior Canadian EM trainees in developing evidence-informed clinical practice.

METHODS

Participants were recruited from all Royal College of Physicians and Surgeons of Canada specialist EM programs. All 14 program directors were contacted via email with a description of the study and a request to nominate two EM experts from their institution with an interest or background in emergency medical education or research.

Delphi survey

Our modified, three-round Delphi survey was conducted electronically, with the aim of gathering and aligning the opinions of a national set of experts with different perspectives in order to define a more generalizable list of initial papers.19-21 The Delphi rounds were distributed at 3-week intervals using Google Forms (Google, Mountain View, CA) and delivered via email. The nationwide Delphi was conducted between July and November 2016.

Round 1 consisted of an open-ended questionnaire in which participants were asked to generate a list of what they felt were the most important and influential papers related to the field of EM. The results of round 1 were compiled for use in round 2, in which participants were asked to rate each paper on the compiled list as “Include,” “Not include,” “Important but not top priority,” or “Unfamiliar.” Participants were also asked to provide a rationale for their scoring. In round 3, participants viewed the percentage of participants who thought each paper should be included in the final list, along with the free-text comments, and then selected whether they agreed or disagreed with the inclusion. Study results were then sorted into six categories for analysis: 100% inclusion, 70% inclusion, 50% inclusion, primarily “good to know,” no consensus (less than 50% agreement), and primarily exclude or unfamiliar.
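The categorization step described above amounts to bucketing each paper’s round-3 response tallies. A minimal sketch of that logic, assuming simple percentage thresholds and a modal-response tie-break, is shown below; the function name, response labels, and exact cut-off rules are illustrative assumptions, not the authors’ actual analysis code.

```python
from collections import Counter

def categorize(responses):
    """Bucket one paper's round-3 responses into an analysis category.

    responses: list of strings, one per participant, e.g.
    'include', 'good to know', 'exclude', 'unfamiliar'.
    Thresholds below are assumptions for illustration.
    """
    counts = Counter(responses)
    include_pct = counts["include"] / len(responses) * 100

    if include_pct == 100:
        return "100% inclusion"
    if include_pct >= 70:
        return "70% inclusion"
    if include_pct >= 50:
        return "50% inclusion"

    # Below 50% inclusion: classify by the most common non-include response.
    modal, _ = counts.most_common(1)[0]
    if modal == "good to know":
        return "primarily good to know"
    if modal in ("exclude", "unfamiliar"):
        return "primarily exclude or unfamiliar"
    return "no consensus"
```

For example, a paper rated “include” by 8 of 10 respondents would fall into the 70% inclusion bucket under these assumed thresholds.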

Ethics

We received approval from the Hamilton Integrated Research Ethics Board (HIREB) to conduct this study.

RESULTS

Participants

A total of 13 participants took part in the study across the three rounds, although not all participants were involved in every round. Eight of the 14 sites with Royal College training programs in Canada were represented. Appendix A lists the breakdown of the sites and participants.

Delphi results

From the first round of the Delphi, a total of 120 papers were nominated (Appendix B). By round 3, there were 10 studies with “100% agreement” for final inclusion. Nineteen other studies included in the final list were endorsed by more than 70% of participants as “must include” (Table 1).

Table 1. Top studies in Delphi survey

DISCUSSION

We identified 29 top papers for junior Canadian EM trainees to use in developing an evidence-informed practice. Interestingly, even amongst experts in the field, there is significant variability in what are considered “must know” studies. This study represents a structured, national consultation of experts to ascertain foundational papers for Canadian trainees.

Although other studies have created similar reading lists for EM,11-13,15-17 this study used a more robust methodology to systematically create a consensus list with national input. The variability seen in round 1 demonstrates how much literature exists that is relevant to the field of EM. In comparing our list with the popularly cited 2016 list from the Academic Life in Emergency Medicine blog,14 there is some degree of overlap; 11 studies were included on both lists. Reasons for the discrepancies between the two lists include a higher prevalence of Canadian-based EM literature on our list, newer studies, and our smaller final list. Future directions may include using a similar methodology to create annual “must read” lists for continuing professional development and revalidation.22-24

Limitations

First, the sample size was much smaller than originally anticipated, although there is no consistent standard in the literature for the size of a Delphi sample,19 and a total of 13 participants is within the typical range for this type of study. We were also able to achieve only 50% site representation, limiting the diversity of geographical regions; the exclusive use of an English-language survey may have been a barrier for francophone faculty.

CONCLUSIONS

Using a modified Delphi technique, we aggregated a nationally endorsed list of top papers for junior EM trainees. This list may anchor an evidence-informed reading list for junior EM trainees.

Acknowledgement

We thank Dr. Yasmin Jajarmi for her contributions on this project while she was a medical student.

Competing interests

None declared.

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit https://doi.org/10.1017/cem.2018.404

REFERENCES

1. Reisdorff EJ, Hayes OW, Carlson DJ, Walker GL. Assessing the new general competencies for resident education: a model from an emergency medicine program. Acad Med 2001;76(7):753-757. doi:10.1097/00001888-200107000-00023.
2. CanMEDs Framework 2005. Royal College of Physicians and Surgeons of Canada; 2015. Available at: http://www.royalcollege.ca/portal/page/portal/rc/canmeds/framework (accessed 16 April 2018).
3. Hatala R, Keitz SA, Wilson MC, Guyatt G. Beyond journal clubs: moving toward an integrated evidence-based medicine curriculum. J Gen Intern Med 2006;21:538-541. doi:10.1111/j.1525-1497.2006.00445.x.
4. McKibbon KA, Wilczynski NL, Haynes RB. What do evidence-based secondary journals tell us about the publication of clinically important articles in primary healthcare journals? BMC Med 2004;2(1):33. doi:10.1186/1741-7015-2-33.
5. Shuaib W, Acevedo JN, Khan MS, et al. The top 100 cited articles published in emergency medicine journals. Am J Emerg Med 2015;33(8):1066-1071. doi:10.1016/j.ajem.2015.04.047.
6. Tsai Y, Lee C, Chen S, Yen Z. Top-cited articles in emergency medicine. Am J Emerg Med 2006;24:647-654. doi:10.1016/j.ajem.2006.01.001.
7. Grayzel J, Wiley JF. What’s new in emergency medicine. Literature review up to May 2017. UpToDate, Inc.; 2017. Available at: http://www.uptodate.com/contents/whats-new-in-emergency-medicine (accessed 23 June 2017).
8. Hynaszkiewicz I. F1000Prime blog. F1000Prime most influential: critical care and emergency medicine; 2014. Available at: https://blog.f1000.com/2014/03/18/f1000prime-most-influential-critical-care-emergency-medicine/ (accessed 23 June 2017).
9. MDlinx. M3 USA Corporation. The 29 best journal summaries in emergency medicine in 2016; 2016. Available at: https://www.mdlinx.com/emergency-medicine/top-read-articles/best-list.cfm/2016/ (accessed 23 June 2017).
10. Huis in’t Veld MA, Nguyen TC, Martinez JP, Mattu A. “Need-to-know” emergency medicine articles of 2014. Int J Emerg Med 2015;8(5). doi:10.1186/s12245-015-0055-6.
11. Mattu A. Three must-read emergency medicine articles of 2015. Medscape Emergency Medicine; 2016. Available at: http://www.medscape.com/viewarticle/856669_2 (accessed 23 June 2017).
12. Druck J, Pearson D, Claud N. The Colorado Compendium: an article-based literature review program. West J Emerg Med 2008;10(1):21-22.
13. Claud J, Druck J, Pearson D. Colorado Compendium First Edition; 2006. Available at: https://www.emra.org/uploadedFiles/EMRA/Medical_Students/Educational_Materials/Colorado_Compendium.pdf (accessed 23 June 2017).
14. Junck E, Bender P, Ilgen J, Diller D, et al. Academic life in emergency medicine – 52 articles in 52 weeks. 2nd ed.; 2016. Available at: https://www.aliem.com/2016/10/52-articles-in-52-weeks-2nd-edition-2016/ (accessed 23 June 2017).
15. Worster A, Kulasegaram K, Carpenter C, et al. Consensus conference follow-up: inter-rater reliability assessment of the Best Evidence in Emergency Medicine (BEEM) rater scale, a medical literature rating tool for emergency physicians. Acad Emerg Med 2011;18(11):1193-1200. doi:10.1111/j.1553-2712.2011.01214.x.
16. Archambault P, Blouin D, Poitras J, et al. Emergency medicine residents’ beliefs about contributing to a Google Docs™ presentation: a survey protocol. J Innov Health Inform 2011;19(4):207-216. doi:10.14236/jhi.v19i4.815.
17. Archambault PM, Thanh J, Blouin D, et al. Beliefs about contributing to a Google Docs™ presentation: a survey protocol. J Innov Health Inform 2015;17(4):374-386.
18. Blouin D, Dagnone LE, O’Connor HM. Effect of a review course on emergency medicine residents’ self-confidence. Emerg Med Australas 2008;20(4):314-321.
19. Akins RB, Tolson H, Cole BR. Stability of response characteristics of a Delphi panel: application of bootstrap data expansion. BMC Med Res Methodol 2005;5:37. doi:10.1186/1471-2288-5-37.
20. Humphrey-Murto S, Varpio L, Gonsalves C, Wood TJ. Using consensus group methods such as Delphi and Nominal Group in medical education research. Med Teach 2016;39:1-6. doi:10.1080/0142159X.2017.1245856.
21. Humphrey-Murto S, Varpio L, Wood TJ, et al. The use of the Delphi and other consensus group methods in medical education research. Acad Med 2017;92(10):1491-1498. doi:10.1097/ACM.0000000000001812.
22. Levinson W. Revalidation of physicians in Canada: are we passing the test? CMAJ 2008;179(10):979-980. doi:10.1503/cmaj.081342.
23. Federation of Medical Regulatory Authorities of Canada Revalidation Working Group. Physician revalidation: maintaining competence and performance. Ottawa (ON): The Federation; 2007. Available at: www.fmrac.ca/committees/document/final_reval_position_eng.pdf (accessed 2 September 2017).
24. Consulting with the profession – the revalidation system under consideration. College of Physicians and Surgeons of Ontario; 2007. Available at: http://www.cpso.on.ca/Policies-Publications/Positions-Initiatives/Revalidation-Consultation (accessed 2 September 2017).
