Introduction
Although there has been significant investment in health-related research and the development of interventions, translation into policy and routine practice remains slow [Reference Green, Ottoson, Garcia and Hiatt1,Reference Proctor, Ramsey, Saldana, Maddox, Chambers and Brownson2]. For example, in national studies among US public health departments, an estimated 58%–64% of programs and policies were reported as evidence-based [Reference Brownson, Fielding and Green3]. In another study among public health practitioners in health departments in two US states, an estimated three-quarters (75%) of programs were reported as evidence-based [Reference Jacobs, Clayton and Dove6]. Ebell and colleagues also found that 51% of clinical recommendations for primary care practice were based on patient-oriented evidence from original research, with only 18% based on high-quality evidence [Reference Ebell, Sokol, Lee, Simons and Early7]. These studies reflect gaps and barriers and suggest that evidence-based interventions (EBIs) are not being disseminated effectively [Reference Green, Ottoson, Garcia and Hiatt1,Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8–Reference Green, Ottoson, García, Hiatt and Roditis10].
The lingering research-to-practice gap is attributed to interacting barriers at multiple levels, shaped by political, economic, cultural, scientific, and organizational contexts [Reference Glasgow and Emmons11]. These barriers include the limited relevance of research findings to practice, research findings not being packaged for ease of implementation, limited capacity and resources to disseminate or apply research, lack of organizational and structural supports to enhance access to and adoption of research, and lack of funding [Reference Glasgow and Emmons11–Reference Tabak, Stamatakis, Jacobs and Brownson13].
Dissemination, an active and intentional process of spreading EBIs to target audiences via determined channels using planned strategies [Reference Rabin, Viglione and Brownson14], is a critical step for effective adoption and implementation of these EBIs [Reference Baumann, Hooley and Kryzer15]. The process of dissemination is influenced by multiple factors related to characteristics of the individual, innovation, organization, and environment [Reference Dobbins, Ciliska, Cockerill, Barnsley and DiCenso16]. However, previous studies have shown that dissemination is too often passive and not aligned between those producing research evidence (often researchers) and those applying it (often practitioners) [Reference Brownson, Eyler, Harris, Moore and Tabak17–Reference Brownson, Fielding and Green19], contributing to low uptake of EBIs [Reference Bero, Grilli, Grimshaw, Harvey, Oxman and Thomson20]. Dissemination approaches by researchers typically include publication in journals and presentations at conferences. Although such practices are important and effective for reaching other researchers, they do not align well with the needs, communication channels, and preferences of practitioners, who are the target adopters and implementers of research evidence [Reference Brownson, Eyler, Harris, Moore and Tabak17]. Designing for dissemination (D4D) seeks to address this disconnect by better aligning how researchers produce and communicate research evidence (push), how practitioners or policymakers receive and use research evidence (pull), and the structural supports needed for evidence-based practice (capacity) [Reference Kwan, Brownson, Glasgow, Morrato and Luke18]. D4D is a process to ensure that products of research are designed and developed to match the contextual characteristics (i.e., needs, assets, and resources) of the target audiences, including practitioners, and their settings [Reference Rabin, Viglione and Brownson14,Reference Kwan, Brownson, Glasgow, Morrato and Luke18,Reference Brownson, Colditz and Proctor21].
Previous studies have examined the practice of D4D primarily among researchers. A study among researchers in academic and national research institutions in the USA found that 73% spent less than 10% of their time on dissemination, 53% had a person or team in their unit dedicated to dissemination, and only a third (34%) involved stakeholders in the process [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8]. A more recent study among dissemination and implementation researchers in the USA and Canada found that overall engagement in dissemination-related activities and stakeholder involvement in research were more common [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. However, the dissemination-related activities identified as most impactful to practice or policy (e.g., face-to-face meetings) were used by only 40% of respondents, while activities such as journal publications, conference presentations, and reports to funders were used by the majority (>70%) of respondents [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. Organizational structures and supports, including dissemination being expected by funding agencies and previous work in a practice or policy setting, were identified as the most important determinants of dissemination efforts by public health researchers to non-research audiences [Reference Tabak, Stamatakis, Jacobs and Brownson13].
Prior D4D studies have focused mostly on researchers (the push side), with few examining the perspectives of practitioners (the pull side). A qualitative study that explored the use of research evidence among public health officials in the USA found that most respondents used research to support grant writing and that their primary sources of research evidence were professional organizations and government agencies rather than research journals [Reference Narain, Zimmerman and Richards23]. In this same study, respondents also indicated a desire to participate in the planning phase of research projects and recommended simplifying and tailoring research evidence for diverse target audiences to enhance its usefulness [Reference Narain, Zimmerman and Richards23]. Previous studies in Canada found that public health decision-makers in public health departments and community organizations preferred executive summaries of research evidence [Reference Dobbins, Jack, Thomas and Kothari24,Reference Dobbins, Rosenbaum, Plews, Law and Fysh25]. Additionally, organizational characteristics, including the perceived organizational value placed on the use of research evidence and ongoing training, were shown to influence the use of research evidence [Reference Dobbins, Cockerill, Barnsley and Ciliska26]. The divergence in perspectives between researchers and practitioners is also reflected in training programs that have focused primarily on building capacity for implementation among researchers [Reference Brownson, Fielding and Green3,Reference Brownson, Fielding and Green19], and Kwan and colleagues stressed that dissemination strategies have focused much more on the push side (researchers’ perspective) than on the pull side (practitioners’ perspective), warranting more emphasis on practitioner engagement in dissemination [Reference Kwan, Brownson, Glasgow, Morrato and Luke18]. Dissemination practices may also have changed since the COVID-19 pandemic, and understanding practitioners’ perspectives and preferences is therefore needed to expand our understanding of D4D.
By ensuring that research and interventions are designed and developed to match the priorities and needs of adopters and implementers, the D4D approach has the potential to improve the translation of evidence into practice [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8,Reference Kwan, Brownson, Glasgow, Morrato and Luke18]. D4D provides an avenue to identify all key stakeholders and collaboratively develop dissemination and implementation approaches that reflect the experiences of adopters, implementers, and beneficiaries of research evidence – a critical step toward achieving health equity [Reference Kwan, Brownson, Glasgow, Morrato and Luke18,Reference Brownson, Kumanyika, Kreuter and Haire-Joshu27]. Thus, this study aims to examine and describe the practice and patterns of D4D among practitioners in the USA. Information from this study will guide the co-design of dissemination products (e.g., research findings and interventions) and strategies that engage relevant stakeholders to maximize the reach and adoption of research evidence and EBIs.
Methods
Study Design and Participants
This cross-sectional study was conducted across the USA among public health and clinical practitioners in spring 2022. Public health practitioners were defined as those working in local and state health departments. Clinical practitioners were primary care physicians working in the following specialties: pediatrics, family medicine, internal medicine, obstetrics and gynecology, and emergency medicine.
Survey Development and Measures
Survey development was informed by three theoretical frameworks, Diffusion of Innovations, the Knowledge to Action (K2A) framework, and Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM), as well as previous D4D studies [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8,Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22,Reference Brownson, Allen and Jacob28–Reference Evenson, Brownson, Satinsky, Eyler and Kohl30]. Diffusion of Innovations helps to explain the spread of new or innovative ideas (e.g., research evidence) and the characteristics of adopters, proposing that adoption of an innovation occurs in several stages, beginning with awareness of the innovation and ending with its continued use [Reference Dearing31,Reference Rogers, Singhal and Quinlan32]. The K2A framework highlights the key elements and outcomes of knowledge (e.g., research evidence) utilization in practice, including the need to identify and understand multi-level factors (barriers and facilitators) that influence knowledge use. RE-AIM focuses on implementation outcomes and dimensions (e.g., reach and implementation) that together determine the public health impact of a program or policy [Reference Graham, Logan and Harrison33–Reference Glasgow, Harden and Gaglio36]. The linkage between survey content and these theoretical frameworks is summarized in Supplementary material 1. The survey was pretested through cognitive interviews using the think-aloud technique [Reference Willis37,Reference Beatty and Willis38] among 10 practitioners: public health practitioners (n = 5) and clinical practitioners (n = 5). The practitioners who pretested the survey were recruited through professional networks within the Prevention Research Center at Washington University in St Louis. Responses from the interviews informed revision of the survey, including re-wording, addition, or deletion of survey questions to enhance its relevance and readability. The final survey had 24 questions (see Supplementary material 2).
The survey assessed individual (practitioner) and organizational factors. First, awareness and knowledge of research evidence (Diffusion of Innovations [Reference Dearing31,Reference Rogers, Singhal and Quinlan32]) included items assessing sources of information about research evidence and important characteristics of how it is presented, as well as whether an individual or team was designated in the organization or clinic to find and report research evidence. Second, adoption and implementation of research evidence (K2A [Reference Graham, Logan and Harrison33,Reference Wilson, Brady and Lesesne34] and RE-AIM [Reference Glasgow, Vogt and Boles35,Reference Glasgow, Harden and Gaglio36]) included items assessing the frequency, barriers, and facilitators of using research evidence. Third, engagement in research included items addressing ways in which respondents had been involved in research within the past 2 years, including the impact of the COVID-19 pandemic.
Data Collection
Practitioners were recruited online through local- and national-level rosters – Missouri local health departments for local public health practitioners, the National Association of Chronic Disease Directors (NACDD) for state public health practitioners, and the American Medical Association (AMA) for clinical practitioners. Surveys were self-administered and conducted online through Qualtrics (Qualtrics, 2020). An initial email invitation and three email reminders, each containing a unique link to the survey, were sent to a random sample of practitioners from each list. Data were collected from April to June 2022. Upon completion of the survey, participants received a $50 gift card. This study was approved by the Washington University in St Louis Institutional Review Board (IRB No. 202112167).
Analysis
Descriptive analyses were conducted to summarize the data using frequencies (percentages) for categorical variables and means (standard deviations) for continuous variables. Differences between public health and clinical practitioners were assessed in bivariate analyses using chi-squared tests. All analyses were conducted in SAS v9.4.
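As a minimal illustrative sketch (not the authors’ actual analysis code), a bivariate comparison of this kind can be run in SAS with PROC FREQ; the dataset and variable names below (survey, practitioner_type, uses_evidence) are hypothetical stand-ins for the study variables.

/* Hypothetical sketch: cross-tabulate practitioner type by frequency of research use */
proc freq data=survey;
  tables practitioner_type*uses_evidence / chisq;  /* CHISQ requests the chi-squared test */
run;

The CHISQ option produces the Pearson chi-squared statistic and its p-value for the cross-tabulation, which corresponds to the group comparisons reported in the Results.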
Results
After deleting 44 invalid responses (e.g., duplicates), 623 responses remained. The final analytic sample (n = 577) further excluded 45 respondents who considered themselves only researchers and 1 respondent who indicated they had retired. The overall response rate for the survey was 5.5% [9.1% (41/451) among local public health practitioners, 22.5% (262/1,162) among state public health practitioners, and 3.3% (320/9,648) among clinical practitioners]. Among the 577 respondents, 55% were clinical practitioners and 45% were public health practitioners (Table 1). State public health practitioners (n = 222, 85%) comprised the majority of public health practitioners. The highest proportion of public health practitioners (81%) worked in state health departments, and the highest proportion of clinical practitioners (64.8%) worked in outpatient health facilities. The majority of clinical practitioners (95.7%) had a doctoral degree, and the majority of public health practitioners had a master’s degree (58%). About one-fifth of respondents (19%) considered themselves both practitioners and researchers. The highest proportion of all practitioners ranked national government agencies (40%), followed by professional associations (27%) and researchers (21%), as their most trusted sources of information about research findings (Table 2). All practitioners most commonly got information about research findings from email announcements (31%), reading academic journals (27%), and professional conferences (10%). For public health practitioners, the most trusted sources of information were national government agencies (55%) followed by researchers (21%), and they most often got information about research findings from email announcements (44%) followed by government reports (14%) and reading academic journals (13%). For clinical practitioners, the most trusted sources of information were professional associations (37%) and national government agencies (28%), and their most commonly ranked source of research findings was reading academic journals (38%) followed by email announcements (20%) and professional conferences (13%).
Table 1 notes: 1 Ministry of Health. 2 Private practice, locum, VA, telemedicine, corporation. 3 Trade/technical/vocational education beyond high school. 4 Includes nutrition/dietetics, dentistry, public administration/social welfare, education, history, law, economics. 5 Virgin Islands, Micronesia, Palau.
Table 2 notes: 1 Includes internet search, news media, UpToDate website. 2 Includes academic journals, PubMed, UpToDate website. 3 Top-ranked sources of information listed in rank order (from most common to least common).
Overall, the majority of all practitioners reported that, when research findings are presented, it is very or extremely important that the information be relevant to the patients or populations served (92%), provide practical advice about implementation (89%), tell a story of how the patients or populations served are affected by an issue (69%), provide data on cost-effectiveness (57%), and be delivered by someone known and respected (50%) (Table 3). Compared to clinical practitioners, a significantly higher proportion of public health practitioners indicated that it was extremely or very important for research findings to be relevant to the populations served (95% vs. 88%, p = 0.005), to present practical advice about implementation (95% vs. 84%, p = 0.001), and to tell a story about how an issue affects the populations served (78% vs. 62%, p < 0.001).
Table 3 note: Bolded p-values are significant at p < 0.05, based on tests of differences between clinical and public health practitioners.
There were significant differences in the uses of research findings between public health and clinical practitioners (Table 4). Public health practitioners, compared to clinical practitioners, were more likely to use research findings every time or almost every time to promote health equity (80% vs. 56%, p < 0.001), evaluate programs/policies/services (83% vs. 52%, p < 0.001), address the spread of inaccurate information (64% vs. 57%, p < 0.001), modify existing programs/services (75% vs. 44%, p < 0.001), develop new programs/services (82% vs. 38%, p < 0.001), discontinue an existing program/service (45% vs. 35%, p < 0.001), and write a grant application (71% vs. 13%, p < 0.001). Lack of time to find research was ranked as the most common barrier by both public health (44%) and clinical practitioners (37%), followed by lack of relevance of research to work needs for public health practitioners (17%) and lack of a brief summary of research findings for clinical practitioners (15%). For both public health and clinical practitioners, easy access to a summary of research findings (30% and 36%), easy access to research findings or data sources (30% and 26%), and leaders or direct supervisors placing a high priority on research (23% and 13%) were the most important facilitators of using research findings.
Table 4 note: Bolded p-values are significant at p < 0.05, based on tests of differences between clinical and public health practitioners.
Table 5 presents organizational factors related to the use of research findings. A significantly lower proportion of public health practitioners compared to clinical practitioners strongly agreed or agreed that they had adequate staffing to implement research findings in their work (24% vs. 36%, p = 0.007) and adequate financial resources to implement research findings (23% vs. 36%, p < 0.001). The majority of practitioners (83%) placed a priority on promoting health equity in their work and indicated that it was extremely or very important for their organization/clinic to use research findings. Overall, 20% of respondents had a designated individual or team responsible for finding and disseminating research findings.
Table 5 note: Bolded p-values are significant at p < 0.05, based on tests of differences between clinical and public health practitioners.
The majority of practitioners in the survey (57%) indicated that their research involvement (e.g., serving on an advisory committee or as a research participant, disseminating research findings) had stayed the same since COVID-19. Overall, about a third of practitioners reported being involved in collecting data, interpreting data, and disseminating findings through personal or professional networks. In the past 2 years, a significantly higher proportion of public health practitioners than clinical practitioners were involved in collecting data (51% vs. 24%, p < 0.001), interpreting data (45% vs. 21%, p < 0.001), and disseminating findings through personal or professional networks (46% vs. 20%, p < 0.001). A significantly higher proportion of clinical practitioners compared to public health practitioners had not been involved in research in the past 2 years (45% vs. 20%, p < 0.001).
Discussion
This study addresses a critical gap in D4D by examining the perspectives of practitioners, who are the adopters and implementers of research. The most common sources of research findings were academic journals for clinical practitioners and email announcements for public health practitioners. Email announcements could include newsletters, reports, or web links with information about research findings. The majority of all practitioners in the survey, with significantly higher proportions among public health practitioners, frequently used research findings to promote health equity, address the spread of inaccurate information, and develop, modify, and evaluate programs or services. The most common barrier to using research was lack of time, while easy access to research evidence was the most common facilitator. Only a third of practitioners had adequate staffing and financial resources to find and implement research in their work.
Consistent with previous findings in both public health practice and health care [Reference Glasgow and Emmons11,Reference Olswang and Prelock12,Reference Narain, Zimmerman and Richards23,Reference Sullivan, Wayne, Patey and Nasr39–Reference Dodson, Baker and Brownson41], practitioners in our study reported time constraints as the most common barrier to finding and using research in practice. For example, in a survey among state-level public health practitioners, respondents commonly cited lack of time as a barrier to evidence-based decision-making in practice [Reference Dodson, Baker and Brownson41]. In qualitative studies among healthcare providers, including pediatric surgeons and allied health clinicians, the already demanding day-to-day workload left little time not only for patient care but also for prioritizing and using research evidence in practice [Reference Sullivan, Wayne, Patey and Nasr39,Reference Harding, Porter, Horne-Thompson, Donley and Taylor40]. As outlined in a review of factors influencing the translation of research to practice, many practice settings face competing priorities, tasks, and demands, which may exacerbate the challenge of finding and integrating research [Reference Glasgow and Emmons11]. Narain and colleagues note that within a practice or policy setting there is a need to identify and act quickly on feasible solutions, which is not always appreciated or accounted for in the process of research production and dissemination [Reference Narain, Zimmerman and Richards23].
The most commonly reported facilitators of using research evidence among practitioners in our study were easy access to research findings and to summaries of research findings. These data are similar to findings from a qualitative study among public health officials who favored summaries and systematic reviews as a way of consuming research evidence [Reference Narain, Zimmerman and Richards23,Reference Dobbins, Cockerill, Barnsley and Ciliska26]. This underscores the need for strategies that better align with practitioners’ preferences and simplify the access, retrieval, and integration of research evidence, which may, in turn, help to overcome time constraints within practice settings.
We found major differences in research involvement, a key aspect of D4D, between practitioners in our study and researchers in previous studies. Practitioners were more involved in the later stages of research: collecting data, interpreting data, and disseminating findings through personal or professional networks. In contrast, previous D4D studies among researchers reported engagement activities concentrated toward the beginning of the research process. In two D4D studies among researchers in the USA and Canada, the most common methods of involvement included development of research advisory committees (66%–72%), engagement of persons with diverse experiences, perspectives, and roles in research proposal development and implementation to enhance the relevance of research to practice settings (62%) and to stakeholders (59%), and participation on the research team (63%) [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8,Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. This suggests that there is still a need to identify ways to intentionally engage practitioners from conceptualization and throughout the research process. Such engagement is critical to enhancing the relevance and translation of research to practice. Although more than half of practitioners indicated that their involvement in research did not change, the discrepancy in findings may also reflect the challenges of bringing stakeholders together for research during the COVID-19 pandemic.
The second difference related to the dissemination activities of researchers and practitioners. We found that fewer than a third of all practitioners, and a much lower proportion of public health practitioners, most often got research information from reading academic journals. Yet this is the most common approach used by researchers to disseminate research evidence [Reference Brownson42]. Knoepke and colleagues found that dissemination and implementation researchers most frequently disseminated their work by publishing in academic journals (88%), delivering conference presentations (86%), and reporting to funders (74%) [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. In the same study, the dissemination activities rated as most impactful to practice were used less frequently [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. The third contrast between our findings and those of other studies related to organizational supports. Only 19.8% of practitioners reported having a designated individual or team for finding and disseminating research evidence. This is much lower than in a previous study in which over half (53%) of researchers had a person or team within their unit dedicated to dissemination [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8]. This may reflect differences in the capacity and structural supports available and accessible to practitioners compared to researchers, particularly researchers working in institutions with a significant focus on research, as reported in the 2018 survey of public health researchers [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8].
Ours is one of several recent studies highlighting the persistent push–pull disconnect between researchers and practitioners [Reference Kwan, Brownson, Glasgow, Morrato and Luke18,Reference Brownson42], which has implications for the translation and integration of research evidence into practice or policy [Reference Narain, Zimmerman and Richards23]. To improve the translation of research into practice, there is an urgent need to better align the approaches, preferences, and priorities of researchers and practitioners in the design, dissemination, and implementation of research. This necessitates designing research and interventions to take into account the needs and contextual characteristics of practitioners and the practice settings in which research findings are intended to have an impact [Reference Narain, Zimmerman and Richards23]. Bridging the gap and improving the alignment between the research and practice worlds will also require creating and sustaining an enabling environment for effective research engagement as well as dissemination, integration, and implementation of scientific evidence. Building capacity and organizational support structures will be essential, as proposed by the push–pull–capacity model [Reference Brownson42]. This model asserts that “for science to affect practice there must be a combination of the rationale for the science (push), a demand for the science by practitioners (pull), and the delivery ability of the public health and healthcare systems (the capacity) [Reference Brownson42].” However, based on our results and previous studies [Reference Jacobs, Clayton and Dove6,Reference Brownson, Fielding and Green19], gaps still exist. We found that only a third of all practitioners had adequate staffing and financial resources to implement research findings in their work. Similar organizational factors, including expectation by funding agencies and previous work in a practice or policy setting, were shown to be significant determinants of dissemination efforts among researchers [Reference Tabak, Stamatakis, Jacobs and Brownson13]. This suggests that capacity-building strategies, such as training, and strategies that build support structures at the organizational level, such as staffing and funding, in both practice and research settings may contribute to research translation. In building capacity, it will be critical to enhance equity by tailoring strategies to the specific needs of each setting and in consideration of contextual and social determinants and existing health disparities [Reference Brownson, Kumanyika, Kreuter and Haire-Joshu27].
Our findings should be considered in light of a few limitations. We used self-reported survey data, which may be subject to recall or response bias. Depending on respondents’ roles within their organizations/clinics/hospitals, they may not have had complete information for the survey questions focused on organizational settings. Given the low response rate (common in surveys of practitioners [Reference Robins, Leider and Schaffer43,Reference Cook, Wittich, Daniels, West, Harris and Beebe44]), our findings may be subject to nonresponse bias: those who responded may differ from those who did not respond in their perspectives on D4D. The low response rate also limits the generalizability of the findings. We surveyed primary care physicians, whose responses may not reflect the experiences of other healthcare professionals. Given that this study was the first to assess how clinical and public health practitioners in the USA learn about research evidence, future research is needed to examine D4D practice among other practitioner types within the healthcare professions as well as policymakers. Additionally, to gain a more comprehensive perspective on D4D and strategies to bridge the research–practice gap, it would have been helpful to have in-depth qualitative or mixed-methods data to supplement our results. Despite these limitations, this is one of the few studies to examine how practitioners access and integrate research evidence, addressing a pertinent gap in our understanding of D4D. In addition, we were able to capture diverse experiences by surveying practitioners in both public health and clinical settings.
Conclusion
This study described how practitioners in the USA receive and use research evidence, which is important for researchers undertaking D4D to reach this audience. We found differences in dissemination activities, research engagement, and organizational supports between practitioners in our study and researchers in previous D4D studies. This provides important insight into where the persistent disconnect between the two worlds exists and offers an opportunity to identify points of intervention. The current study and the existing literature suggest the need to identify and develop strategies and tools for more effective D4D, tailored to the needs of those adopting and implementing research evidence.
Supplementary material
The supplementary material for this article can be found at https://doi.org/10.1017/cts.2023.695.
Acknowledgments
The authors thank the local and state public health practitioners as well as the clinical practitioners for their contribution and participation in this study. The authors also acknowledge administrative support for this study from Linda Dix and Mary Adams at the Prevention Research Center in St Louis, Brown School at Washington University in St Louis.
Funding statement
This study was supported by the National Cancer Institute of the National Institutes of Health [grant numbers 3P30CA091842-19S4, P50CA244688, and P50CA244431], Prevention Research Center through the Centers for Disease Control and Prevention, and Barnes Jewish Health Foundation. The content of this study is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Competing interests
The authors have no conflicts of interest to declare.