
The Impact of Professional Training in Public and Policy Engagement

Published online by Cambridge University Press:  01 December 2023

Jordan Tama, American University, USA
Maria Rost Rublee, Monash University, Australia
Kathryn Urban, Massachusetts Institute of Technology, USA

Abstract

Engaging with audiences and communities beyond academia is now a common practice for political scientists. Yet, political scientists rarely are trained in how to conduct public or policy engagement, and we know little about the impact that training programs have on their preparedness to communicate with the public and policy makers. In this study, we evaluate whether professional training equips scholars with the skills needed to perform public and policy outreach. We find that a four-day training program generates remarkably large increases in the number of participants reporting that they possess high levels of knowledge, preparation, and confidence for public and policy engagement. This finding suggests that investments in public-engagement training by universities and the discipline of political science have the potential to significantly boost public outreach by faculty members.

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of American Political Science Association

Recently, there has been growing recognition in the discipline of political science of the importance of engaging with communities beyond academia, as well as increased interest among political scientists in conducting such outreach. Moreover, in some countries (e.g., the United Kingdom and Australia), public impact has become a key metric by which universities are evaluated. Yet, whereas political scientists are trained extensively in the conduct of research, they typically receive little or no training in how to communicate with public audiences and policy practitioners. In this study, we evaluated whether professional training could provide faculty with greater knowledge, skills, and confidence for the conduct of public and policy engagement. We found that a four-day training program had substantively large and statistically significant effects on faculty preparedness for these types of outreach across several measures.

PUBLIC ENGAGEMENT AND THE ROLE OF PROFESSIONAL TRAINING

In recent decades, public engagement has moved from the fringes of discourse within the political science discipline to become a widely accepted part of the work of political scientists. One key early marker in this movement for greater public engagement was Robert Putnam's 2002 American Political Science Association (APSA) presidential address. In his speech, Putnam argued that "an important and underappreciated part of our professional responsibility is to engage with our fellow citizens in deliberation about their political concerns, broadly defined," adding that "[p]olitical science must have a greater public presence" (Putnam 2003, 249). In the subfield of international relations, prominent scholars issued similar calls (Jentleson 2002; Nye 2008; Walt 2005). More recently, political scientists outlined different mechanisms through which scholars can connect with public audiences and policy communities (Busby 2018; Campbell and Tama forthcoming; Murphy and Fulda 2011); identified best practices for effective and responsible public and policy engagement (Barma and Goldgeier 2022; Berling and Bueger 2013; Levine 2020, 2021; Mikhael and Norman 2021; Toft 2018); and highlighted conditions that facilitate impactful links between research and policy (Avey et al. 2022; Desch 2019; Maliniak et al. 2020; Tama et al. 2023).

APSA has supported several initiatives intended to foster public engagement by political scientists. In 2015, an APSA task force on public engagement issued recommendations for how the discipline can pursue public outreach more effectively (Lupia and Aldrich 2015). Between 2016 and 2018, APSA coordinated workshops that were designed to train scholars in communicating about their research in the public arena (see footnote 1). In 2020, an APSA task force on new partnerships developed a series of programs to facilitate public scholarship and partnerships between political scientists and institutions outside of academia (Smith et al. 2020). APSA's new civic engagement section and Institute for Civically Engaged Research represent additional efforts within the discipline to connect with communities beyond academia (Bennion 2022; Dobbs et al. 2021; see footnote 2). At the same time, many political scientists have demonstrated their interest in public engagement by writing for nonacademic outlets. As of 2019, almost 3,500 political scientists had published in the Monkey Cage (Farrell and Knight 2019).

Yet, political scientists continue to face challenges when it comes to public and policy outreach. Perhaps most important, the professional criteria for evaluating university-based political scientists still do not tend to give much weight to public engagement (Desch et al. 2022; Kendrowski 2022; Lupia and Aldrich 2015; Maliniak, Peterson, and Tierney 2019). Even in countries where university "impact" is explicitly evaluated by government agencies, promotion criteria for individual faculty members may not value public engagement (Williams and Grant 2018). In addition, with a few exceptions (including the APSA programs noted previously and those run by Bridging the Gap [BTG]; see Jentleson 2015), political scientists typically are not trained in how to disseminate their work to nonacademic audiences or to engage with policy practitioners. There also do not seem to be any published studies evaluating the impact of programs designed to train political scientists in public and policy outreach.


However, there are empirical studies of public-engagement training in the natural sciences and across disciplines, and this research highlights the value of such training. Some of this work has been conducted in the United Kingdom, motivated in part by the adoption of the Research Excellence Framework, which requires universities to demonstrate impact beyond academia. Notably, studies have found that scholars in different disciplines are more likely to be willing to participate in public engagement if they have received training in it or if they feel confident or well equipped for it (Burchell, Sheppard, and Chambers 2017; Hamlyn et al. 2015; Seakins and Fitzsimmons 2020). Moreover, these studies reveal that many scholars are reluctant to engage in public outreach because they feel that they lack the skills or expertise needed to do so (Stylinski et al. 2018). One study of a program that trains natural scientists in public engagement found that the participants reported improvement in their engagement skills and an increase in the frequency of their engagement after undergoing the training (Stylinski et al. 2018).

We built on these findings by conducting the first study of the impact of training in public and policy engagement for political scientists. We hypothesized that training in public and policy engagement would make political scientists (1) more knowledgeable about options for public and policy engagement, (2) better prepared to conduct public and policy engagement, and (3) more confident in their public- and policy-engagement skills.

STUDY DESIGN

We conducted the study as part of the 2022 International Policy Summer Institute (IPSI), which is run by BTG, a multi-university initiative that aims to foster greater links between scholars of international policy issues and communities outside of academia. Admission to IPSI is based on a competitive application process. For the 2022 IPSI, BTG selected 24 faculty and postdoctoral scholars from various US and international universities to attend the four-day training program, which took place at American University in Washington, DC. Of the 24 participants, 21 were political scientists; the remaining three were trained in history, peace and conflict studies, and African studies. In our pre-workshop survey, 17 participants self-identified as female and the remaining seven self-identified as male (Tama, Rublee, and Urban 2023).

The program sessions were directed by six international relations professors affiliated with BTG. The sessions included discussions of engaging with policy officials in the areas of peace building, international development, and security policy; communicating with public audiences through newspaper op-ed articles, blog posts, and podcasts; writing for policy-oriented journals and magazines; developing a media strategy; interacting with think tanks; engaging with congressional staff; and pursuing policy engagement in a responsible and ethical manner. Speakers included academics who have conducted various forms of public and policy engagement, as well as current and former government officials, think-tank experts, newspaper and magazine editors, and communications professionals. The participants also gained experience being filmed for a mock television interview about their research, after which they received constructive criticism about their performance, and they workshopped their own draft op-ed articles and blog posts in small groups. (See the online appendix for the complete workshop agenda.)

To evaluate the effectiveness of IPSI, we developed pre- and post-workshop survey questionnaires to measure changes prompted by the workshop. Used widely across academia, the pre-test/post-test design is useful because it permits researchers to determine to what extent a workshop, training session, or other intervention makes a difference in respondent answers. A positive change between the pre- and post-workshop answers indicates that the intervention was effective (Davis et al. 2018). As Stratton (2019, 573) noted, "An advantage of a pre-test and post-test study design is that there is a directionality of the research, meaning there is testing of a dependent variable (knowledge or attitude) before and after intervention with an independent variable (training or an information presentation session)." To ensure that any changes can be attributed to the intervention rather than to other events, the post-test must be completed soon after the intervention.
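To make this logic concrete, the following minimal sketch (in Python, using invented Likert scores rather than the study's data) computes the pre-to-post shift on a single survey item:

```python
# Hypothetical illustration of a pre-test/post-test comparison.
# These Likert values (1 = lowest, 5 = highest) are invented for the sketch.
from statistics import mean

pre = [2, 1, 3, 2, 2, 1, 3, 2]    # responses before the workshop
post = [4, 3, 5, 4, 4, 3, 4, 4]   # responses after the workshop

shift = mean(post) - mean(pre)
print(f"pre mean = {mean(pre):.2f}, post mean = {mean(post):.2f}, shift = {shift:+.2f}")
# A positive shift indicates the intervention moved responses in the intended direction.
```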

We piloted the survey design with participants in a similar policy-engagement workshop, facilitated by BTG personnel, for Australian academics in September 2020 (Rublee 2023). Using the results from the pilot study, we designed the survey for IPSI participants to measure their perceptions of knowledge, confidence, and preparedness in policy engagement before and after the workshop. (The full surveys are available in the online appendix.) We asked questions related to three types of policy engagement: conducting media interviews, writing for nonacademic outlets, and engaging with policy officials and practitioners. The first question addressed knowledge about mechanisms and pathways for these three types of engagement, using a 5-point Likert scale: very knowledgeable, knowledgeable, somewhat knowledgeable, a little knowledgeable, and not at all knowledgeable. The second question asked how well prepared respondents felt to conduct these three types of policy engagement, using a 5-point Likert scale: very well prepared, well prepared, somewhat prepared, a little prepared, and not at all prepared. The third question asked how confident participants felt in their skills for conducting the three types of policy engagement, using a 5-point Likert scale: very confident, confident, somewhat confident, a little confident, and not at all confident. Other questions in the surveys asked about the participants' experience in policy engagement, whether the workshop met their expectations, and how the program could be improved in the future. The only demographic question asked was regarding gender: because we knew that the cohort would be small, including other demographic questions would have allowed us to identify respondents, and our priority was to maintain anonymity for participants.
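As an illustration of how responses on scales like these can be coded for analysis, the sketch below maps the knowledge-question labels to the numeric values described in the Findings section (5 for the top category, 1 for the bottom); the function names are ours and purely illustrative:

```python
# Sketch of coding 5-point Likert labels to numeric values, following the
# convention described in the Findings section (5 = top category, 1 = bottom).
KNOWLEDGE_SCALE = {
    "very knowledgeable": 5,
    "knowledgeable": 4,
    "somewhat knowledgeable": 3,
    "a little knowledgeable": 2,
    "not at all knowledgeable": 1,
}

def code_response(label: str) -> int:
    """Map a Likert label to its numeric value."""
    return KNOWLEDGE_SCALE[label.strip().lower()]

def in_top_two(label: str) -> bool:
    """True if the response falls in the top two categories (score of 4 or 5)."""
    return code_response(label) >= 4

print(code_response("Very knowledgeable"), in_top_two("somewhat knowledgeable"))  # 5 False
```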

The surveys were anonymous and administered through Qualtrics by email, with ethics approval granted through one of our universities. The pre-workshop survey was emailed to all 24 members of the 2022 IPSI cohort on May 31; all participants responded by June 7. The post-workshop survey was emailed on the last day of IPSI (June 15), and we closed responses on June 30. Of the 24 participants, 21 (87.5%) completed the post-workshop survey, and all of them completed it in its entirety.

FINDINGS

The survey results indicated that participation in the four-day IPSI greatly increased scholars’ knowledge, preparedness, and confidence regarding public and policy engagement. Across all three questions and types of policy engagement included in the questionnaire, the pre- and post-workshop surveys revealed large and statistically significant increases. These findings support our hypotheses that offering political scientists training in public and policy engagement will assist them in becoming more knowledgeable about pathways for engagement, better prepared to conduct engagement activities, and more confident in their public- and policy-engagement skills.


The following subsections report the results in two ways for each survey question. First, we report the percentage of respondents whose answers were in one of the top two categories (i.e., knowledgeable or very knowledgeable; well prepared or very well prepared; and confident or very confident). In doing so, we combined the top two categories into a single category (i.e., knowledgeable, prepared, or confident) to simplify the presentation of results. Second, we report the average respondent score using Likert-scale values, with 5 representing the top category of very knowledgeable, very well prepared, or very confident and 1 representing the bottom category of not at all knowledgeable, not at all prepared, or not at all confident. The results for each question revealed substantial shifts using both measures, and t-tests indicated that all of these increases were statistically significant (p < 0.001).
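The following sketch illustrates how these two summary measures could be computed from coded responses. The scores are hypothetical, and because the article does not specify which t-test variant was used, an independent-samples test is shown as one plausible choice for anonymous, unlinked pre- and post-surveys:

```python
# Illustrative computation of the two summaries reported below:
# (1) the share of responses in the top two Likert categories, and
# (2) the mean Likert score; plus a t-test comparing pre and post.
# All values here are hypothetical, not the study's data.
from statistics import mean
from scipy import stats

pre = [2, 1, 3, 2, 2, 1, 3, 2, 2, 3]    # hypothetical pre-workshop scores (1-5)
post = [4, 3, 5, 4, 4, 3, 4, 4, 5]      # hypothetical post-workshop scores (1-5)

def top_two_share(scores):
    """Percentage of responses scoring 4 or 5 (the top two categories)."""
    return 100 * sum(s >= 4 for s in scores) / len(scores)

print(f"top-two share: {top_two_share(pre):.0f}% -> {top_two_share(post):.0f}%")
print(f"mean score:    {mean(pre):.2f} -> {mean(post):.2f}")

# Independent-samples t-test (one plausible choice given anonymous,
# unlinked pre/post surveys with different numbers of respondents).
t_stat, p_value = stats.ttest_ind(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```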

Knowledge

Participant knowledge about public and policy engagement increased dramatically across the different domains of media interviews, nonacademic outlets, and engagement with policy officials and practitioners. (See Figure 1.) Before IPSI, none of the participants reported feeling knowledgeable about pathways or mechanisms for conducting media interviews. When the responses were evaluated using Likert-scale values, the average score for this question was 1.71. In the post-workshop survey, the percentage of participants reporting knowledge of pathways or mechanisms for conducting media interviews had increased to 81%, with an average score of 3.86. Similar increases were reported across other forms of policy engagement. The proportion of participants feeling knowledgeable about pathways or mechanisms to write for nonacademic outlets increased from 25% before IPSI to 95% after IPSI, with average scores increasing from 2.88 to 4.48. Only 4% of participants reported knowledge of pathways or mechanisms to engage policy officials and practitioners in the pre-workshop survey; afterwards, 67% did. The average Likert score increased from 2.12 to 3.81.

Figure 1 Knowledge of Forms of Public and Policy Engagement

Note: The bars show the percentages of survey responses in the top two categories for each question.

Preparedness

Participants also reported substantial increases in their preparedness for different types of public and policy engagement. (See Figure 2.) None of the participants felt prepared to conduct media interviews before participating in IPSI. By the end of the workshop, 52% reported being prepared to do so, with a corresponding increase in average Likert-scale values from 1.75 to 3.67. Only 25% of participants felt prepared to write for nonacademic outlets before IPSI, whereas a full 100% of respondents felt prepared after the workshop. Average scores for this question increased from 2.88 pre-IPSI to 4.57 post-IPSI. Finally, the proportion of participants who felt prepared to engage policy officials and practitioners increased from 4% to 76%, with a change in average scores from 2.33 to 3.90.

Figure 2 Preparedness for Forms of Public and Policy Engagement

Note: The bars show the percentages of survey responses in the top two categories for each question.

Confidence

The results were much the same for the participants' confidence in their ability to engage with the public and policy makers. (See Figure 3.) The proportion of respondents who reported feeling confident in their ability to conduct media interviews increased from 0% pre-IPSI to 57% post-IPSI, and the average value of the Likert scores for this question increased from 1.83 to 3.48. Before IPSI, 29% of respondents reported feeling confident in their ability to write for nonacademic outlets, with an average score of 3.00. After IPSI, those figures had increased to 86% and 4.24, respectively. Whereas only 8% of participants were confident in their ability to engage policy officials and practitioners before IPSI, 71% reported confidence in this ability after the workshop. Average Likert-score values for this question increased from 2.50 to 3.81 during the same period.

Figure 3 Confidence in Public- and Policy-Engagement Ability

Note: The bars show the percentages of survey responses in the top two categories for each question.

CONCLUSIONS

The results of this study suggest that professional training can greatly bolster the capacity of faculty members to interact with policy practitioners and disseminate their work to public audiences. In particular, we found that several days of focused training markedly improved the knowledge, confidence, and preparedness of international relations scholars to engage with policy officials and practitioners, publish in nonacademic outlets, and conduct media interviews. Because our survey involved only one cohort of one policy training workshop, the generalizability of our findings is limited. Therefore, further research involving larger samples of participants is needed; this also would enable comparisons of the effects of such training programs for scholars with different demographic characteristics. Indeed, if similar programs evaluated their effectiveness using our survey categories (see the online appendix), it would be useful to be able to compare results across them (see footnote 3). Additional research also could investigate whether training programs of this type have similar effects for political scientists in different subfields or for scholars in other disciplines.

A clear takeaway from this study is that it would be worthwhile for professional associations, political science departments, and schools of public and international affairs to invest in more programs designed to train scholars in public and policy engagement. IPSI is funded by the Carnegie Corporation of New York, a private foundation (Carroll 2023). Even with its generous support, BTG can accommodate only 24 IPSI scholars each year. To reach the larger community of political scientists with public- and policy-engagement training programs, departments, schools, universities, and professional associations must invest more in these programs. Although it may be infeasible for most institutions to support programs as extensive as IPSI, shorter programs or modules also could improve the capacity of faculty members to pursue outreach to practitioners and public audiences. Further research could investigate the effects of one-day, half-day, or modular training programs focused on the development of particular policy- or public-engagement skills.


This research also highlights the value of evaluating other types of professional-development programs for political scientists, which have proliferated in recent years but rarely publish results about their effectiveness. For example, "pay-it-forward" mentoring programs, which provide academic advice and networking opportunities to specialized cohorts (e.g., early-career women), are offered and/or funded by numerous organizations in international relations, including the International Studies Association, Women in Conflict Studies, Journeys in World Politics, and the Oceanic Conference on International Studies. Women and other historically excluded scholars welcome these types of professional-development programs (Rublee et al. 2020; Zvobgo et al. forthcoming). Assessment of these programs could not only increase their effectiveness but also encourage additional institutional support. However, to our knowledge, no assessment data for these programs are publicly available. Further study of the effectiveness of these and other professional-development programs is a key next step in efforts to enhance the skill set, reach, inclusiveness, and success of political scientists.

ACKNOWLEDGMENTS

The authors are grateful to the Carnegie Corporation of New York for supporting IPSI and to Emmanuel Balogan, Naazneen Barma, Brent Durbin, James Goldgeier, Bruce Jentleson, Megan Nerdig, and Rachel Whitlark for their work on the 2022 iteration of IPSI. The Australian pilot of the survey design was supported by the Australian Department of Defence Strategic Policy Grant Program (Grant No. 2020-1060-080, “Bridging the Gap: New Voices in Australian National Security”).

DATA AVAILABILITY STATEMENT

Research documentation and data that support the findings of this study are openly available at the PS: Political Science & Politics Harvard Dataverse at https://doi.org/10.7910/DVN/2NJKAU.

Supplementary Material

To view supplementary material for this article, please visit http://doi.org/10.1017/S1049096523000987.

CONFLICTS OF INTEREST

The authors declare that there are no ethical issues or conflicts of interest in this research.

Footnotes

1. See APSA, “Communication Training Workshops.” www.apsanet.org/publicengagement/workshops.

2. See Centennial Center for Political Science and Public Affairs, “The Institute for Civically Engaged Research.” https://connect.apsanet.org/centennialcenter/the-institute-for-civically-engaged-research.

3. The online appendix includes the workshop agenda; the pre-workshop survey; the post-workshop survey; the survey explanatory statement; and participant responses to the open-ended question, “Which parts of the workshop did you find most useful?”

REFERENCES

Avey, Paul C., Desch, Michael C., Parajon, Eric, Peterson, Susan, Powers, Ryan, and Tierney, Michael J. 2022. "Does Social Science Inform Foreign Policy? Evidence from a Survey of US National Security, Trade, and Development Officials." International Studies Quarterly 66 (1): sqab057. https://doi.org/10.1093/isq/sqab057.
Barma, Naazneen H., and Goldgeier, James. 2022. "How Not to Bridge the Gap in International Relations." International Affairs 98 (5): 1763–81. https://doi.org/10.1093/ia/iiac102.
Bennion, Elizabeth A. 2022. "Introduction: APSA's Civic Engagement Section Shines a Spotlight on Civic Engagement." PS: Political Science & Politics 55 (2): 385–86.
Berling, Trine Villumsen, and Bueger, Christian. 2013. "Practical Reflexivity and Political Science: Strategies for Relating Scholarship and Political Practice." PS: Political Science & Politics 46 (1): 115–19.
Burchell, Kevin, Sheppard, Chloe, and Chambers, Jenni. 2017. "A 'Work in Progress'? UK Researchers and Participation in Public Engagement." Research for All 1 (1): 198–224. https://doi.org/10.18546/RFA.01.1.16.
Busby, Joshua. 2018. "On Policy Engagement and Academia: 5 Approaches to Bridging the Gap." Texas National Security Review, February 20. https://tnsr.org/roundtable/policy-roundtable-bridge-gap-academics-policymakers.
Campbell, Susanna P., and Tama, Jordan. Forthcoming. "Bridging the Gap in International Relations." In Handbook of International Relations, ed. Thies, Cameron G. Cheltenham, UK: Edward Elgar Publishing.
Carroll, Kathleen. 2023. Bridging the Gap: How Scholarship Can Inform Foreign Policy for Better Outcomes. New York: Carnegie Corporation of New York.
Davis, George C., Baral, Ranju, Strayer, Thomas, and Serrano, Elena L. 2018. "Using Pre- and Post-Survey Instruments in Interventions: Determining the Random Response Benchmark and Its Implications for Measuring Effectiveness." Public Health Nutrition 21 (6): 1043–47. https://doi.org/10.1017/S1368980017003639.
Desch, Michael C. 2019. Cult of the Irrelevant: The Waning Influence of Social Science on National Security. Princeton, NJ: Princeton University Press.
Desch, Michael C., Goldgeier, James, Petrova, Ana K., and Peh, Kimberly. 2022. "Policy School Deans Want It All: Results of a Survey of APSIA Deans and Top-50 Political Science Department Chairs on Hiring and Promotion." International Studies Perspectives 23 (1): 41–70. https://doi.org/10.1093/isp/ekaa022.
Dobbs, Kristie Lynn, Hess, Douglas R., Bullock, Graham, and Udani, Adriano. 2021. "Civically Engaged Research and Political Science." PS: Political Science & Politics 54 (4): 711–15.
Farrell, Henry, and Knight, Jack. 2019. "How Political Science Can Be Most Useful." Chronicle of Higher Education, March 10. www.chronicle.com/article/how-political-science-can-be-most-useful.
Hamlyn, Becky, Shanahan, Martin, Lewis, Hannah, O'Donoghue, Ellen, Hanson, Tim, and Burchell, Kevin. 2015. Factors Affecting Public Engagement by Researchers: A Study on Behalf of a Consortium of UK Public Research Funders. London: TNS BMRB and Policy Studies Institute.
Jentleson, Bruce W. 2002. "The Need for Praxis: Bringing Policy Relevance Back In." International Security 26 (4): 169–83. https://doi.org/10.1162/016228802753696816.
Jentleson, Bruce W. 2015. "The Bridging the Gap Initiative and Programs." PS: Political Science & Politics 48 (S1): 108–14.
Kendrowski, Karen M. 2022. "Integrating Civic Engagement into Scholarly Reward Systems." PS: Political Science & Politics 55 (2): 400–401.
Levine, Adam Seth. 2020. "Research Impact Through Matchmaking (RITM): Why and How to Connect Researchers and Practitioners." PS: Political Science & Politics 53 (2): 265–69.
Levine, Adam Seth. 2021. "Single Conversations Expand Practitioners' Use of Research: Evidence from a Field Experiment." PS: Political Science & Politics 54 (3): 432–37.
Lupia, Arthur, and Aldrich, John H. 2015. "How Political Science Can Better Communicate Its Value: 12 Recommendations from the APSA Task Force." PS: Political Science & Politics 48 (S1): 1–19.
Maliniak, Daniel, Peterson, Susan, Powers, Ryan, and Tierney, Michael J. (eds.). 2020. Bridging the Theory–Practice Divide in International Relations. Washington, DC: Georgetown University Press. https://doi.org/10.2307/j.ctvz0hb31.
Maliniak, Daniel, Peterson, Susan, and Tierney, Michael J. 2019. "Policy-Relevant Publications and Tenure Decisions in International Relations." PS: Political Science & Politics 52 (2): 318–24.
Mikhael, Drew, and Norman, Julie. 2021. "Collaboration in Commissioned Research: Benefits and Challenges of Scholar–Practitioner Partnerships in Conflict Contexts." PS: Political Science & Politics 54 (3): 554–57.
Murphy, Ann Marie, and Fulda, Andreas. 2011. "Bridging the Gap: Pracademics in Foreign Policy." PS: Political Science & Politics 44 (2): 279–83.
Nye, Joseph S., Jr. 2008. "Bridging the Gap between Theory and Policy." Political Psychology 29 (4): 593–603.
Putnam, Robert D. 2003. "APSA Presidential Address: The Public Role of Political Science." Perspectives on Politics 1 (2): 249–55. https://doi.org/10.1017/S1537592703000185.
Rublee, Maria Rost. 2023. "Bridging the Policy–Academic Gap: Lessons from Australia." Duck of Minerva, September 7. www.duckofminerva.com/2023/09/bridging-the-policy-academic-gap-lessons-from-australia.html.
Rublee, Maria Rost, Jackson, Emily B., Parajon, Eric, Peterson, Susan, and Duncombe, Constance. 2020. "Do You Feel Welcome? Gendered Experiences in International Security Studies." Journal of Global Security Studies 5 (1): 216–26. https://doi.org/10.1093/jogss/ogz053.
Seakins, Amy, and Fitzsimmons, Alexandra. 2020. "Mind the Gap: Can a Professional Development Programme Build a University's Public Engagement Community?" Research for All 4 (2): 291–309. https://doi.org/10.14324/RFA.04.2.11.
Smith, Rogers, Rasmussen, Amy Cabrera, Galston, William, Han, Hahrie, King-Meadows, Tyson, Kirkpatrick, Jennet, Levine, Peter, Lieberman, Robert, Mylonas, Harris, Rigger, Shelley, Sinclair-Chapman, Valeria, Shay, Cammy, Van Vechten, Renee, and Grigg, Amanda. 2020. "APSA Presidential Task Force on New Partnerships." PS: Political Science & Politics 53 (4): 847–49.
Stratton, Samuel J. 2019. "Quasi-Experimental Design (Pre-Test and Post-Test Studies) in Prehospital and Disaster Research." Prehospital and Disaster Medicine 34 (6): 573–74. https://doi.org/10.1017/S1049023X19005053.
Stylinski, Cathlyn, Storksdieck, Martin, Canzoneri, Nicolette, Klein, Eve, and Johnson, Anna. 2018. "Impacts of a Comprehensive Public Engagement Training and Support Program on Scientists' Outreach Attitudes and Practices." International Journal of Science Education, Part B 8 (4): 340–54. https://doi.org/10.1080/21548455.2018.1506188.
Tama, Jordan, Barma, Naazneen, Durbin, Brent, Goldgeier, James, and Jentleson, Bruce. 2023. "Bridging the Gap in a Changing World: New Opportunities and Challenges for Engaging Practitioners and the Public." International Studies Perspectives 24 (3): 285–307. https://doi.org/10.1093/isp/ekad003.
Tama, Jordan, Rublee, Maria Rost, and Urban, Kathryn. 2023. "Replication Data for 'The Impact of Professional Training in Public and Policy Engagement.'" PS: Political Science & Politics. https://doi.org/10.7910/DVN/2NJKAU.
Toft, Monica. 2018. "Making Academic Work Relevant to Policymakers." Texas National Security Review, February 20. https://tnsr.org/roundtable/policy-roundtable-bridge-gap-academics-policymakers.
Walt, Stephen M. 2005. "The Relationship between Theory and Policy in International Relations." Annual Review of Political Science 8: 23–48. https://doi.org/10.1146/annurev.polisci.7.012003.104904.
Williams, Kate, and Grant, Jonathan. 2018. "A Comparative Review of How the Policy and Procedures to Assess Research Impact Evolved in Australia and the UK." Research Evaluation 27 (2): 93–105. https://doi.org/10.1093/reseval/rvx042.
Zvobgo, Kelebogile, Sotomayor, Arturo, Rublee, Maria Rost, Loken, Meredith, Karavas, George, and Duncombe, Constance. Forthcoming. "Race and Racial Exclusion in Security Studies: A Survey of Scholars." Security Studies. https://doi.org/10.1080/09636412.2023.2230880.