
Bridging the gap between science communication practice and theory: Reflecting on a decade of practitioner experience using polar outreach case studies to develop a new framework for public engagement design

Published online by Cambridge University Press:  20 March 2019

Rhian A. Salmon*
Affiliation:
Centre for Science in Society, Victoria University of Wellington, PO Box 600, Wellington 6140, New Zealand Te Pūnaha Matatini, University of Auckland, Private Bag 92019, Auckland 1011, New Zealand
Heidi A. Roop
Affiliation:
Centre for Science in Society, Victoria University of Wellington, PO Box 600, Wellington 6140, New Zealand University of Washington Climate Impacts Group, Box 355674, Seattle, Washington 98195-5672, USA
*Author for correspondence: Rhian A. Salmon, Email: [email protected]

Abstract

The International Polar Year 2007–2008 stimulated a wide range of education, outreach and communication (EOC) related to polar research, and catalysed enthusiasm and networks that persist ten years on. Using a multi-method approach that incorporates case studies, auto-ethnographic interviews, and survey data, we interrogate the opportunities and limitations of polar EOC activities and propose a new framework for practical, reflexive engagement design. Our research suggests that EOC activities are under-valued and often designed based on personal instinct rather than strategic planning, but that there is also a lack of accessible tools that support a more strategic design process. We propose three foci for increasing the professionalisation of practitioner approaches to EOC: (1) improved articulation of goals and objectives; (2) acknowledgement of different drivers, voices and power structures; and (3) increased practical training, resources and reporting structures. We respond to this by proposing a framework for the planning and design of public engagement that provides an opportunity to become more transparent and explicit about the real goals of an activity and what “success” looks like. This is critical to effectively evaluate, learn from our experiences, share them with peers, and ultimately deliver more thoughtfully designed, effective engagement.

Type
Research Article
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © Cambridge University Press 2019

Introduction

The International Polar Year 2007–2008 (hereafter referred to as the IPY) stimulated a range of polar science research activities and intentionally worked to integrate education, outreach and communication (EOC) activities into the key outcomes of this two-year initiative (Allison et al., 2009; Provencher et al., 2011; Rapley et al., 2004; Salmon et al., 2011). The IPY had a specific mandate to:

…attract, engage, and develop a new generation of polar researchers, engineers, and logistics experts and [to] engage the awareness, interest, and understanding of schoolchildren, the general public and decision-makers worldwide in the purpose and value of polar research and monitoring.

(Rapley et al., 2004, p. 11)

The resultant IPY EOC programme not only stimulated new activities related to polar science but also shed light on, and celebrated, well-established efforts that were hitherto little known beyond their immediate communities (e.g. Allison et al., 2009; Salmon et al., 2011). As a result, we are now aware of thousands of individual polar outreach activities from more than sixty countries that occurred during, or since, the IPY. The breadth, quality, quantity, and nature of outreach activities may therefore be easier to explore for polar science than for other sciences. An attempt to capture these is documented in Provencher et al. (2011). Here, we take a micro-approach to interrogating this broad and international scope of work by analysing and reflecting on individual EOC activities that we, the authors, were directly involved with. There is certainly ample scope for further research that analyses the larger dataset.

EOC was an umbrella term widely employed during the IPY to refer to formal and informal education (through teachers, schools and science museums), informal science outreach (through science centres and museums, and scientists), and professional/institutional media, marketing and communications. Here, we primarily consider what we refer to as “outreach” (a term that is often used interchangeably with “science outreach” or “public outreach”). Polar scientists often use the term “outreach” to refer to their varied public engagement activities (Salmon et al., 2011). While we appreciate that this term may appear outdated to Public Engagement with Science (PES) scholars, we use it consciously, as described by Salmon, Priestley, & Goven (2017):

In this context, outreach should be seen to encompass both one-way “communication” and two-way dialogue, or “engagement” activities, between scientists and different publics. Adoption of the term “engagement” would assume that the activities include dialogical interaction, where this may not be the case. From here on, we refer to communication activities by scientists as “outreach” in order to remain true to their original (ill-defined) form. (p. 54)

The term “outreach” has the advantage, over other more specific and theoretically defined terms, of including a wide array of public engagement activities that a single scientist might become involved in (e.g. visits to schools, science cafes, science festivals, blogging, public lectures and seminars, workshops, art–science collaborations, teacher–researcher collaborations, documentaries, short videos, recruitment events, webinars, etc.). A more complete description of outreach activities carried out during the IPY can be found in Kaiser, Allen, & Zicus (2010). Although many scientists use the term outreach to also include activities targeted at policy-makers and government officials, we restrict our study to the analysis of public-focused outreach for this work, although the later discussion does touch on this broader remit within the concept of public engagement.

The IPY is estimated to have engaged 14 million people in 70 countries (Provencher et al., 2011). However, a survey of a small sample of those involved in managing or participating in IPY EOC activities suggests that, at most, 28% carried out any form of evaluation, and, where this did occur, the evaluation component was not generally funded (Provencher et al., 2011). This low evaluation rate means not only that it is difficult to analyse and build on best practice (e.g. through a peer-review process), but also that there is very little data with which to carry out research in this field. This lack of professionalism by scientists regarding their outreach activities may be encouraged by the research-dominated priorities of many universities, including performance-based evaluations that place greater value on research, teaching and service commitments than on public engagement (People Science & Policy, 2009).

We, the authors of this paper, met in 2012. At this time, the IPY was still fresh in the memories of individuals, but the national and international coordination offices had been disbanded. While IPY networks and enthusiasm remained, the formal dissolution of the IPY created a new vacuum: there were no longer formal EOC champions, or channels through which this work could be recognised, celebrated and funded.

By 2012, we both had extensive experience in helping to create the EOC networks that expanded from the IPY. Roop was serving as the co-chair of the Association of Polar Early Career Scientists (APECS) EOC committee and was a founder of Polar Educators International (PEI), and Salmon had been the EOC coordinator for the IPY international office during 2006–2009 and was a co-founder of APECS. At the time we met, we shared common frustrations with our outreach projects. This led us to explore more deeply whether there were common barriers related to how we, in the polar outreach community, approached outreach.

At the heart of our frustration lay a lack of professionalisation of our field. This manifested itself in several ways. We, ourselves, did not feel equipped to design and evaluate our outreach projects in a professional and accountable manner. With no one to whom we were expected to report the success (or otherwise) of any given project, there was also no opportunity or mandate for honest evaluation, reflection and improvement. As is often the case, only “good news” reporting felt acceptable or appropriate, in an environment where no reporting at all (the default) is preferable to reporting that illuminates wasted money or initiatives of questionable value. Further, we felt that our time and expertise were regularly under-valued. Many of the outreach activities we, or our colleagues, had engaged in at this time were seen as an obligatory element of a project or organisation but were rarely rewarded with recognition or financial resources. This was compounded by the fact that the outreach objectives were often opaque or not clearly articulated, with post-hoc counts of participants being the only tangible measure of success.

In the decade since the IPY, we have both secured professionally valued, fully funded positions in science engagement. By 2018, at the time of writing, Salmon was a senior lecturer and co-founder of the Centre for Science in Society at Victoria University of Wellington, which offers both an undergraduate Minor and a Master’s programme in Science in Society. She had published articles about public engagement (Salmon, 2013a; Salmon & Priestley, 2015; Salmon, Priestley, Fontana, & Milfont, 2017; Salmon, Priestley, & Goven, 2017) and established the Engagement Programme (one of five programmes) in a $24M national cross-institutional research effort related to modelling, impacts and implications of climate change for New Zealand (Salmon, 2017). She had been working professionally in EOC practice, teaching or engagement research since 2006.

Roop, at the time of writing, was Lead Scientist for Science Communication at the University of Washington Climate Impacts Group. This role integrates communications work and research into a range of co-produced climate research projects. The Climate Impacts Group works directly with a range of state, federal and tribal entities to provide both useful and useable science to help build resilience to climate change around the Pacific Northwest region of the United States. Roop was also responsible for helping to design, develop and implement project-based communications work and the wider organisational strategic communications plan. Prior to this, Roop had over ten years of paid communication experience consulting with a range of organisations, including the National Aeronautics and Space Administration (NASA) and the Arctic Research Consortium of the United States (ARCUS), to develop and assess a range of EOC activities, and had also started publishing in this field (Illingworth & Roop, 2015; Roop & Dietze, 2013; Roop, Martinez-Méndez, & Mills, 2013; Roop, Wesche, Azhinhaga, Trummel, & Xavier, 2018).

Starting in 2012, we worked (together or individually) on four polar outreach projects: an Antarctic expedition with a remit for public engagement, an Antarctic-themed public festival, the launch of a documentary film raising awareness of Antarctic and climate research, and an educational event for a virtual network of international educators interested in polar research.

We present these here as case studies of polar EOC and interrogate the successes of each, attempting to assess their reach and impact, and to articulate associated shortcomings or flaws in their design and implementation. To scholars of PES, examination of these case studies will not present any surprising or theoretically challenging insights. However, there is scant literature that presents a diverse collection of researcher–practitioner outreach experiences and interrogates them for collective learnings of relevance to a disciplinary field (in our case polar EOC), rather than simply evaluating individual outreach events. That is to say, we are less interested in the findings from individual activities and more concerned with the insights they may hold for the field of practitioner-led engagement as a whole, and more specifically for the polar EOC community.

Contextualising IPY EOC in science communication literature

The closely related fields of Science Communication and Public Engagement with Science (PES) have grown rapidly over the last thirty years, as shown by the increase in related publications, dedicated journals, academic positions, professional memberships, email lists, conferences and academic programmes of study (Davies & Horst, 2016). What was once a sub-section of Science and Technology Studies (STS) is now a multi-disciplinary field comprising scholars from history and philosophy of science, STS, communication and marketing, science, language and rhetoric. This growth in the scholarly field, especially over the past decade, complements and supports our argument that there has been a substantial “professionalisation” of the field since the IPY.

Davies and Horst suggest that “one key aspect of this professionalisation [of science communication] is the development of a sense of community and of being part of a collective” (2016, p. 88). We argue that the IPY played a key role in stimulating such a community and sub-field, from here on referred to as “polar EOC”, which continues to this day. For example, IPY spurred the creation of APECS and PEI, both of which continue to play an active role in supporting a range of research scientists and educators in their efforts to connect polar science and society (Hindshaw et al., 2018; Provencher et al., 2011; Roop et al., 2019). The unique aspects that define “polar EOC” include it being:

  (a) geographically focused rather than focused on a discipline, mission, message/behaviour change or audience;

  (b) largely volunteer driven but closely associated with professional educators and communicators, with substantial institutional support on international, national and local levels;

  (c) stimulated by a long history of IPYs, all of which have had some kind of ongoing legacy (this was the fourth; the others started in 1882, 1932 and 1957, respectively);

  (d) focused on research that is cross-disciplinary, logistically challenging and expensive to carry out (hence potentially benefitting more than other fields from a collaborative and collective approach).

In 2008, an edited compilation, Communicating Science in Social Contexts: New Models, New Practices (Cheng et al., 2008), brought together a range of the latest thinking and a summary of key developments in PES at the time. Although more current overviews of Science Communication research are available (e.g. Bucchi & Trench, 2008; Davies & Horst, 2016; Gilbert & Stocklmayer, 2013), we highlight this collection specifically because it was published halfway through the IPY (which ran from March 2007 to March 2009). It is reasonable to assume, therefore, that this collection presents a snapshot of science communication scholarship at the time of the IPY.

The collection presents several useful models for understanding science communication (which are still pertinent today), such as the trajectory from deficit, to dialogue, to participation (Trench, 2008), and the transition from a model of diffusion, to deliberation, to negotiation (Horst, 2008). Both explore the transition from predominantly one-way, content-heavy communication to a scenario that is focused more on co-production of knowledge. When engaging with practitioners about these theories, we like to use the terms proposed by Susan Stocklmayer (2013): a transition from knowledge transfer, to knowledge sharing, to knowledge building.

For the purposes of this paper, one of the most relevant articles in this collection is that by Steve Miller (2008), in which he explores his perception of a gap between theory and practice:

Yet the impression remains: on the one hand are the practitioners, often with a background in the natural sciences, medicine or engineering, who organise and take part in public engagement with science activities of one sort or another; on the other hand are the researchers, usually with a background in the social sciences or humanities, writing articles for the journals, aloof from the blood and sawdust of the science communication arena. And the two just do not talk to one another.

(p. 275)

He explores this perceived gap by surveying delegates at the British Association for the Advancement of Science’s 2007 Science Communication conference in London. The conference focused its first day on science communication more generally, and its second day specifically on climate change. His summary of the survey results is “that the ‘average’ science communicator is a (relatively) young and middle-ranking woman, well trained in science but less so in communication, who does not pay a great deal of attention to the relevant research literature.” (p. 280)

In the same year as the conference observed by Miller, Salmon was the Education, Outreach and Communication coordinator for the IPY, grappling with the broad IPY EOC mandate and associated research that had a key focus on climate change. As a PhD atmospheric chemist in her early thirties with no professional science communication training (Salmon, 2013a), she was therefore, at that time, very closely described by the demographic identified in Miller’s survey.

Salmon was based at the IPY international office in Cambridge, UK, within easy travelling distance of the conference. Readers of this retrospective article may well delight in the uncanny timing of such an event, and hope that the connections and collaborations it catalysed led to a theoretically robust and innovative strategy for EOC delivery through the IPY. Alas, to the best of our knowledge, no-one from the IPY attended that conference, or suggested its relevance to the IPY programme. Further, despite being supported by highly skilled international science communication professionals with expertise across media, formal, and informal education (Salmon et al., 2011), the IPY EOC committee did not have any advisors from the PES research community. We therefore regretfully conclude that Steve Miller was validated in his supposition that there was a sizeable gap between theory and practice, at least within the international polar EOC community, at the time.

Although the IPY EOC programme did not have PES researchers on its committees, and most IPY scientist communicators probably did not read PES literature, this does not mean that the IPY outreach activities were entirely uninformed by theory. Members of the IPY EOC committee, and the associated sub-committees, attended dozens of international conferences focused on professional communities working in, for example, formal science education, informal science centres, science media, Antarctic tourism, and Arctic policy. Each of these had a range of keynote speakers and sessions that directly informed the work of the IPY committees and their associated activities (Salmon et al., 2011). Although the practitioners may not have been able to articulate their work within specific theoretical frameworks at the time, this does not mean that those same models were not influencing their work. We acknowledge, however, that a deeper understanding of theory and evaluation methodologies might have helped to push the design of the activities even further into the realms of knowledge building, participation and negotiation.

The call for greater connectivity between theory and practice comes not only from the science communication research community but also from practitioners (although this is documented less in peer-reviewed journals). In their 2017 paper entitled The reflexive scientist: an approach to transforming public engagement, Salmon, Priestley and Goven explored “how and why the PES literature does not ‘speak for itself’ to scientists but provides a starting point for conversation rather than a substitute for it” (Salmon, Priestley, & Goven, 2017). They identified that practitioners were frustrated by the language used in the literature, the lack of practical advice on how to do things differently, and a feeling of “being attacked” by the theorists. There appears, therefore, to be frustration on both sides of the theory–practice divide. Identification of this lack of engagement with theory by practitioners is critical, for what is the point of progressive research and new models of public engagement if those designing and delivering the interventions and activities are not informed by this thinking?

Rather than simply finger-pointing at scientists and science communicators for being ignorant and uninformed (a very “deficit-model” approach to bridging the gap), Miller asks whether PES researchers themselves might be partially responsible for this lack of collaboration between communities of theory and practice. Further, he suggests that “the science communication community should surely be a great source of information and experience for the research community—a living laboratory in science–citizen interchanges” (Miller, 2008, p. 273).

We present our experiences in that spirit, as a “living laboratory” of science communication that we can use to interrogate new mechanisms for public engagement, and greater collaboration between PES researchers and science communication practitioners.

Polar case studies

When we (Salmon and Roop) met in 2012, we shared a frustration at what felt like a lack of professionalisation of outreach work related to polar and climate science, considering the significant time and resources that were being invested in it. To a large degree, this lack of professionalisation was related to the lack of any clear articulation of what constitutes “success” in a given initiative, a lack of knowledge regarding how to evaluate against that, and a need for professional development in this field.

This situation was not unique to us, or to polar EOC more generally; as Horlick-Jones et al. (2007) share in their analysis of the “GM Nation” public debate about genetically modified crops and related matters (also published around the time of the IPY),

We view with some concern the tendency among those involved in conducting engagement processes to have no clear a priori ideas about what it means for their exercise to be a success. As a result, evaluations are not infrequently ad hoc affairs, which are conducted almost as an afterthought to the organisation of the exercise.

(p. 20)

They posit that post-event inductive “evaluations”, based on case studies, should be termed assessments rather than evaluations, as they are limited in the “extent to which their results may be replicated or generalised” (p. 20) as well as being necessarily subjective in nature.

Keeping this in mind, we present here our assessment of four 2012 case studies in which we were involved that (a) focused on similar scientific research, (b) had similar goals regarding public engagement, (c) had some form of coordinating outreach committee (as opposed to an individual outreach activity carried out by a single scientist), and (d) utilised different communication methods across the set. All four activities had strong input from Salmon and Roop, were focused on research carried out in the polar regions, and built on the legacy of the IPY. They therefore clearly emerged and occurred within the broader culture and community of “polar EOC” discussed above, in which the authors were both trained and embedded at that time. Future work and reflection related to the polar EOC community might consider how such outreach communities evolve and professionalise over time, and what lessons can be learnt from the associated literature on communities of practice and professional learning communities (e.g. Wenger-Trayner & Wenger-Trayner, 2015).

Table 1 summarises the level of funding, official purpose, and outputs, and details our rudimentary attempts at the time to incorporate some form of evaluation or assessment for the four case studies: an Antarctic expedition, a public festival, the launch of a documentary film, and a virtual educational event.

Table 1. A summary of the four polar case studies considered in the research.

Here we describe our attempts to interrogate these case studies. Although we did not have a well-articulated research question at the time (a common pitfall also identified by Horlick-Jones et al., 2007), we were broadly exploring what barriers or opportunities there were related to “professionalisation” of the outreach practice associated with each of these activities. This included barriers to assessment, opportunities for sharing best practice, funding and power structures, and implicit assumptions. We use select examples to illustrate recurring themes and challenges that arose throughout all case studies.

Antarctic expedition

Objectives

The Our Far South expedition was led by the Morgan Foundation, which carries out private research into public policy issues of relevance to New Zealand, usually resulting in the publication of a book (e.g. Morgan, 2014; Morgan & McCrystal, 2009; Morgan & Simmons, 2009, 2011, 2013). This four-week ship-based expedition to New Zealand’s sub-Antarctic islands and Antarctica included “forty everyday New Zealanders” and six experts in Antarctic science. The expedition focused specifically on raising awareness about climate change, conservation, and commercial interests in the area, such as fishing, tourism and mining (Salmon, Priestley, Fontana, & Milfont, 2017).

Outreach and outputs

The associated public engagement campaign began prior to the expedition, with media coverage and visits to local schools by expedition team members who lived around New Zealand. During the expedition, there were daily reports from the ship via a blog and Twitter, and answers to real-time questions from school students. Daily blogs, photographs, and videos in the fictional voice of “Shackleton Bear” were also posted to reach the primary school audience. The expedition received significant national and local media coverage and was profiled in a prominent exhibition at Wellington International Airport. Following the expedition, team members gave presentations about the journey to their local schools using standardised presentation materials, and the Morgan Foundation embarked on a national roadshow involving public lectures, interviews with local media, and school shows. Four books were also published (a photography book, a science book, an adventure book, and a children’s book), as well as an award-winning documentary film. The expedition and associated outreach are estimated to have engaged more than 10,000 school students, catalysed dialogue events reaching at least 3500 individuals, and instigated media articles that reached more than 200,000 people (Salmon, Priestley, Fontana, & Milfont, 2017).

Author involvement and attempts at assessment

Salmon was a paid member of a team that developed and designed the education and public outreach programme associated with Our Far South. As part of this work, although it was not expected of her, she collated data about all events and activities that the expedition catalysed, and synthesised these into an internal report for the organisers and funders. The report was well received as it showcased an unexpectedly high reach and, in the eyes of the funders, was seen to be an uncontested success. To probe deeper, Salmon also circulated a survey to expedition participants. The survey explored initial motivations for participation, actual experience, and expectations related to the future impact of the expedition.

Insights

Analysis of the survey results, together with analysis of documents about the expedition produced by the sponsoring organisations (such as annual reports and Board presentations), indicates that both participants and funders were highly satisfied with the expedition outputs and outcomes. One subset of the team was, however, somewhat dissatisfied: the scientists who participated in the expedition. Follow-up conversations to clarify their responses identified that their personal outreach goals (as distinct from their institutional goals or the goals of their managers) had differed substantially from those of the overall programme. Rather than simply raising awareness around climate change, these individuals were hoping the expedition would stimulate new actions and activities and lead to a greater investment in their research. As these outreach goals had not been clearly articulated at the start, and therefore had not been incorporated, acknowledged, or explicitly discussed with the expedition designers, no activities were designed to address these needs. In retrospect, this felt like a wasted opportunity to the scientist participants. This illustrates the need to articulate and make explicit the different drivers and objectives of all parties involved in an initiative, including the audience/stakeholders, participants, experts, funders, and coordinators.

Antarctic festival

Objectives

New Zealand IceFest 2012 was a four-week public festival aimed at “bringing Antarctica to Christchurch”. The event occurred a year after major earthquakes had devastated the region, and was explicitly funded to encourage residents to return to the city centre and have fun (Salmon, 2013b; Salmon, Priestley, Fontana, & Milfont, 2017).

Outreach and outputs

The festival was organised and supported by Christchurch City Council and cost over NZD 2 million. Held in a central park in the city, it included a substantial arts programme, a film series, an educational programme for schools, a writers’ festival, and a programme that profiled over 100 Antarctic experts in a range of dialogue events (e.g. science cafes, panel discussions, public lectures and seminars). There was also a wide range of drop-in weekend activities for visitors to engage with, including hands-on educational activities for children, augmented-reality journeys across Antarctica using tablet technology, an Antarctic field camp and live video-links to Antarctica (Salmon, 2013b; Salmon, Priestley, Fontana, & Milfont, 2017).

Author involvement and attempts at assessment

Salmon was the (paid) designer and coordinator of the science education and outreach programme, and Roop participated in the festival as a featured scientist, coordinator of a teacher–researcher initiative, and co-designer of the Flakes, Blobs and Bubbles installation and associated art project hosted at the festival hub (see Illingworth & Roop, 2015, for activity details).

A private company was contracted (prior to the festival) to carry out an overall evaluation of the festival, which was primarily focused on brand awareness, economic impact and participant enjoyment. The science education team had the opportunity to include a couple of generic questions in this survey, but these did not provide enough detail for helpful insight into how effective or valuable a given initiative was. In the official post-event report (Blair, 2012), the organisers describe themselves as “thrilled” to have attracted “97,000 visitors; exceeding our 75% satisfaction target with 81% of attendees satisfied with the content and delivery of the festival”.

To probe the science education and outreach programme in more depth, we distributed a survey to all speakers (before and after the event, with an adapted version for those who did not complete the pre-survey) and event facilitators, and made (different but related) surveys available to audience members. Responses were transcribed into Qualtrics and analysed using both qualitative and quantitative tools within that software.
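To make concrete the kind of paired pre/post comparison we attempted, the sketch below shows one minimal way such survey exports could be analysed. It is purely illustrative: the actual analysis was carried out within Qualtrics, and the file names, column names and keyword codes here are hypothetical.

```python
# Illustrative sketch only: the real analysis was done in Qualtrics; the file
# names, column names and keyword codes below are hypothetical.
import pandas as pd

# Hypothetical exports of the pre- and post-event speaker surveys.
pre = pd.read_csv("speaker_survey_pre.csv")    # respondent_id, confidence (1-5), goals
post = pd.read_csv("speaker_survey_post.csv")  # respondent_id, confidence (1-5), reflections

# Quantitative strand: pair responses by respondent and summarise the shift in
# self-reported confidence between the pre- and post-event surveys.
paired = pre.merge(post, on="respondent_id", suffixes=("_pre", "_post"))
paired["confidence_shift"] = paired["confidence_post"] - paired["confidence_pre"]
print(paired["confidence_shift"].describe())

# Qualitative strand: a crude keyword pass over free-text reflections, intended
# only as a starting point for manual thematic coding.
codes = {"audience": "audience|public", "dialogue": "question|discussion|dialogue"}
for code, pattern in codes.items():
    paired[code] = paired["reflections"].str.contains(pattern, case=False, na=False)
print(paired[list(codes)].sum())
```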

Insights

This was the only case study that had articulated Key Performance Indicators, included some form of formal evaluation, and had a clear reporting structure and explicit lines of accountability. A subsequent report, based on the data that we collected, was commissioned by the organisers of IceFest 2014 and informed the design of the science and education programme for a future festival. It is worth noting, however, that this report was only possible because we had captured the data at the event itself; it was commissioned several months after the event had finished, when it would have been too late, and methodologically questionable, to carry out a retroactive survey. This illustrates the need to consider evaluation metrics (and objectives) during the design process of any engagement initiative.

In addition, the need to develop a second evaluation instrument illustrates that there can be (and often are) multiple, overlapping purposes to a given initiative. In the case of this festival, the evaluation commissioned by the Council did not deliver significant insights into the outcomes of the education and outreach component specifically, as it was primarily focused on economic impact, brand awareness, and attendance. This reinforces the need to unpack the multiple purpose(s) or goal(s) of a given initiative and invest effort in articulating clear objectives for each.

Global launch of the Thin Ice climate documentary film

Objectives

Thin Ice is a climate documentary film directed by geologist Dr Simon Lamb, who aimed to create a documentary about “the inside story of climate science”, representing the voices and perspectives of climate scientists. The film was released in a global launch on Earth Day, April 2013, during which individuals were granted free access to stream the film over a 48-hour period.

Outreach and outputs

The original 73-minute film has a substantial focus on science carried out in, or related to, Antarctica, but also includes footage from northern Norway, New Zealand, North America, and Europe. Filming involved documenting scientists at work in the field and laboratory, and included perspectives from Arctic residents and from international policy meetings. In addition to free streaming access on Earth Day 2013, over 200 physical screenings were arranged around the world, on all seven continents and in 120 countries. The film was viewed over 19,000 times online during this 48-hour period. Many screenings also included a panel discussion or interview with local climate scientists. In addition, scientists with expertise in climate change made themselves available to answer any questions about the film or related topics via Facebook and Twitter.

Author involvement and attempts at assessment

Salmon and Roop played a key role in the design and delivery of the global launch, but not in the content of the film itself. Roop was also paid to support website content development and social media engagement.

A Qualtrics survey was circulated at the end of the Earth Day launch with the aim of understanding who screened the film, their motivations and the perceived outcomes of the launch and film. Despite there being over 200 physical screenings, there were only 17 complete responses to the survey, limiting any robust analysis of the outcomes and impact of the film launch. Although the project and film team largely considered the launch a success, this is based on the number of screenings and reach of the event, rather than any evaluation data from the event itself.

Insights

Although this initiative was celebrated as a great success (as were all of the case studies), evaluation (of either the film or its launch) posed several challenges. The initial goal or purpose was too broad to be able to usefully measure against, the intended audience was unspecified (although post-event data identified an audience with whom it resonated, this was not necessarily the initial target audience), and the funding and power structures driving the initiative were an opaque mix of individual and institutional. In addition, the various individuals working on the project were predominantly unpaid (or worked additional voluntary hours), working out of a commitment to the project or people involved, without clear deliverables or stated boundaries to their responsibilities. This expands on people-focused challenges mentioned earlier, reinforcing the need to clearly acknowledge the different roles, skills, time, expectations and commitments of people involved, especially where there is a substantial voluntary component to the work. Further, as with the aforementioned projects, there was a clear need to integrate an engagement and evaluation design that considers key elements of strategic communications planning at the outset of the project. What were the specific goals and desired outcomes of the film and launch and what was the desired audience response, knowledge gain, or behavioural change?

September 2012 International Polar Week

Objectives

Following the IPY, leaders in the APECS EOC committee continued the legacy of IPY’s quarterly International Polar Days and re-coined them as biannual International Polar Weeks (IPWs). IPWs aim to provide educators with an ongoing opportunity to engage students with polar science, enable APECS to fulfil organisational outreach goals, and provide opportunities to foster local collaborations between teachers and researchers (Association of Polar Early Career Scientists - International Polar Week, 2018; Xavier, Azinhaga, Seco, & Fugmann, 2019).

Outreach and outputs

Roop and researcher Dr Dan Zwartz co-designed an ice core art activity called “Flakes, Blobs and Bubbles” (Illingworth & Roop, 2015), which was chosen as the flagship activity of the September 2012 IPW. This activity was translated into 20 languages and in 2012 reached an estimated 5000 students and community members in 16 countries. It also featured as a physical educational activity and art installation that was profiled at NZ IceFest 2012 (see above).

Author involvement and attempts at assessment

Roop was co-chair of the APECS EOC committee and took the lead in executing four IPWs during 2012–2013. She also led the specific IPW activity profiled in 2012. In our attempt to evaluate this individual event, however, we realised that the lack of clearly articulated objectives for the IPW series overall made such an evaluation difficult.

Insights

Our consideration of the IPWs, and any activities within them, identified a need for clearly defined, measurable objectives against which evaluation can be carried out. The measure of success for the IPWs during this time was based primarily on “word of mouth” enthusiasm and basic participation numbers, as measured by a virtual balloon launch that was incorporated into each event and by sporadic post-event self-reporting. The EOC co-chairs requested that any APECS national committees engaged in an IPW report back their participation and an estimated number of people reached. However, this process was not formalised at the time and there was no set reporting structure or method to guide even simple reporting on the number of participants. As a result, the recorded data lack any form of quality control or standardisation. It is, therefore, hard to interrogate the numbers or to answer questions such as: what exactly did the IPWs achieve, what value did they have, and why should they continue? Although participant numbers are interesting, they do not reflect any of the value, reach or impact of the events themselves.

In the specific case of Flakes, Blobs and Bubbles, there was no entity to whom an evaluation report could be submitted such that it would make a difference to any future activity (Illingworth & Roop, 2015). This was common to all IPW activities: the volunteer base and lack of funding for these events meant there was no governance body that was either expecting or interested in these results. At most, such research would have been of value to the organisers, most of whom were carrying out this work in their spare time and none of whom had a mandate to fundamentally redesign the programme. This illustrates the need for an accountability and/or reporting structure. In the case of the 2012–2013 IPWs, very little data were gathered, and no IPW-specific report was ever written; instead, there was an expectation of an informal update to the APECS executive committee for their annual reporting requirements. Without any motivation or support, to the best of our knowledge, no formal evaluation or reporting structure has been put in place, even though very similar activities are being carried out now as were occurring ten years ago. Although the reasons for this are understandable (the lack of funding and the volunteer-based structure common in polar EOC), we feel this is a lost opportunity. Even some basic, well-designed evaluation or reporting (for instance, capturing what participants are aiming to learn, why participants are engaging and whether the event met these needs) might have identified repeated barriers or new opportunities for this series to expand, innovate, and change. This does not mean that the IPWs had no utility, but rather that the lack of evaluation and measurable objectives makes it difficult to refine, improve and leverage resources that might otherwise expand the reach and impact of these time-intensive efforts. This need for formative evaluation in order to improve future events is not specific to IPWs; rather, we feel it resonates with many of the polar EOC activities that we have been associated with. It is perhaps best illustrated by the IPWs here, however, as this is the only long-term, recurring event considered in the four case studies.

Common themes identified from the case studies

On the face of it, all four case studies were a great success: they reached large and diverse audiences; they resulted in extremely positive feedback from attendees, funders and “experts”; they were reported as successful; and they were celebrated by those who were involved in, or (where applicable) funded, these efforts. Collectively, however, our observation of these case studies also pointed to some common issues. We distil these down to three themes, described below.

Theme 1: A lack of (or need for) articulated objectives, including key goals, audiences and messages

This theme was identified from interrogation of the IPWs as a need for clearly defined, measurable objectives against which evaluation can be carried out, and from the experience with NZ IceFest as a need to unpack the sometimes multiple purpose(s) or goal(s) of a given initiative and invest effort in articulating clear objectives.

Theme 2: A lack of (or need for) acknowledgement of different drivers, voices and power structures

This theme was identified from analysis of the Our Far South expedition as a need to articulate and make explicit the different drivers and objectives of all parties involved; and from the documentary film as a need to clearly acknowledge the different roles, skills, time, expectations and commitments of people involved.

Theme 3: A lack of (or need for) training, resources and structures for strategic engagement, including practical and theoretical training in science communication, resources for evaluation and engagement design, and formalised structures for accountability and reporting

This theme was identified in the interrogation of the IPWs as a need for accountability and/or reporting structure, including consideration of evaluation metrics (and objectives) during the design process of any engagement initiative, and formative evaluation in order to improve future events. This theme was also illustrated by our experience working on the documentary project as a need to integrate an engagement design that considers key elements of strategic communications planning, including goals, audience, and desired knowledge gain or behavioural change.

These themes are not unique to polar EOC; Horlick-Jones et al. (2007) indicate similar concerns when considering engagement about genetically modified crops, including the “different values and perspectives of those involved (from the sponsors and organisers to the various participants themselves) each of whom may have different rationales for involvement” (p. 20). We believe, however, that it is critical for each community to identify, articulate and scrutinise these issues within their own context in order for their professional practice to mature. This would encourage the development of a community with both practitioners and scholars (and, hopefully, scholarly practitioners) who are able to learn from their experiences and subsequently adapt their practice to deliver more thoughtful, effective, and purposeful engagement.

Providing context for the case studies

In order to explore whether our own particular experiences and lessons, and the resultant themes, were generalisable, we distributed an anonymous survey across our wider professional network of scientist communicators in 2013. In addition, in 2017, we contracted Joanna Goven, a political and social scientist who had collaborated with Salmon on other projects, to interview us both (separately) to capture our reflections on those case studies (five years on) and on how and why our approach to outreach and engagement had changed during the ten years since the IPY. These two small forays were used to further test the robustness of the themes and to explore whether there were additional nuances or gaps that were overlooked or not obvious in the case study review.

Scientist and communications survey

In order to explore whether the experiences in the polar case studies and resulting insights were representative of a broader community, we gained ethics approval to disseminate a survey, primarily aimed at environmental scientists involved with EOC. This 38-question, qualitative survey was distributed across EOC and earth science networks in New Zealand and a range of professional (e.g. Past Global Changes) and online social networks (e.g. Twitter). The survey had 133 respondents from 27 countries, primarily with backgrounds in earth, physical and polar sciences. The interest in contributing to this survey in and of itself indicates a desire from this community for greater scrutiny of the context within which EOC by scientists occurs. Detailed results are currently being analysed and summarised for a future publication. For the purposes of this paper, we share here key factors that resonated, conflicted with or shed further light on themes that emerged from the case studies.

The survey did not further illuminate or contradict our insights into the first theme, the need for articulated objectives such as goals, audiences and messages. While this may reflect a weakness in the survey design, it could also, in and of itself, be an indicator that the need for articulated objectives was not even “on the radar” of most of these scientist communicators.

The need for acknowledgement of different drivers, voices and power structures, however, was strongly reinforced by the survey results. For example, on average 44% of EOC activities (and up to 100% in several cases) were conducted on the respondent’s own time. In addition, only 14% of respondents consistently received funding for their EOC activities and few received other institutional support in terms of time, professional recognition or financial resources.

The survey results also both supported and expanded on the third theme, the need for training, resources and structures for strategic engagement. Even though a majority of respondents identified as science communicators, there was a notable lack of training in the field. For example, while 90% of survey respondents actively communicated their research, 61% had never had any teaching or science communication training, not even a one-day skills-based workshop. There was also a general lack of theoretically grounded evaluation efforts: when describing their evaluation and assessment practices, respondents commonly characterised the success of an event using “metrics” such as “Subjective judgement of whether the audience were engaged”; “Were there questions & interaction?”; “Did anyone seem excited by what was being presented?”; or “My own perception from how people respond or feed back”.

The initial results from this survey reinforce our own experiences and insights from the case studies, especially by emphasising a lack of recognition, training and support for EOC activities identified in Themes 2 and 3. This would contribute directly to a lack of professional design and delivery of those same EOC activities.

Auto-ethnographic interviews

The auto-ethnographic interviews were useful for further illuminating, illustrating and articulating aspects of the themes that emerged from the case studies. This was extremely clear in the case of the first theme, the need for articulated objectives, which was re-emphasised in several forms throughout the interviews. For example, the clearly articulated goals of NZ IceFest made delivery on “fun” an appropriate goal:

… at times [during IceFest] I sort of wondered what the point was, but it felt more controlled and less random. You know, because somebody somewhere had made a strategic decision that this was something they were going to fund, and it was okay if the outcome was that a bunch of people turn up to Hagley Park and have fun and learn a bit about Antarctica. [RS]

The interviews also illustrated the different drivers, voices and power structures (Theme 2) that can exist in parallel within a given activity. For example, when describing her role as a scientist communicator within an endorsed IPY EOC project led by a professional education organisation, Roop explained:

I think there were big goals, but I think they were not refined in the sense that were appropriate… there were goals in that it was a programme with formal evaluation, but I was never really part of that; so my own role was sort of removed. [HR]

The interviews provided the most insight, however, into the identified need for training, resources and structures for strategic engagement (Theme 3). They provided more clarity (than either the case studies or the survey) specifically around strategic engagement, as we were able to draw on our subsequent experiences of both learning about public engagement theory and endeavouring to integrate it into our practice. In 2012, we were not in a position to comment on the resources needed for evaluation and strategic engagement design because we had never attempted these. We suspect the same is true for the majority of the 2013 survey respondents. As a result of our shifting perspective, the loose questions that we provided to the interviewer in 2017 (Appendix A) also drilled more deeply into questions around evaluation and public engagement theory than the 2013 survey questions did.

The interviews identified that we both continue to struggle with evaluation of our activities, and have not found sufficient support in the literature for how to approach practical evaluation of EOC and engagement initiatives:

I’m always trying to get my head around how do you figure out whether you’ve done what you were intending to do? [HR]

When probed on the value of public engagement theory for practitioners, both of us identified that reading associated literature had changed the way that we approach engagement design:

It’s been useful as a way of thinking about “what will we do?”, but it’s also been useful in terms of articulating what we are doing. [RS]

We also found it provided a useful tool for teaching and sharing new approaches with others:

It wasn’t that they were like: “Oh shit! That’s what we need to be doing!” They were like “oh that’s right, you’ve just articulated what I always knew that I was doing” … once you’ve articulated it, what it does is it changes the way you do things and it changes the kinds of activities that you do. [RS]

However, both of us also indicated that there is a barrier to entry in the literature for people like us with a science background.

Finally, a common thread identified by us both was the key role that individual “champions” played in our development as EOC practitioners and professionals:

[The name of a mentor], she sort of -- she is one of the threads that pulls through my whole career in science and education and outreach. [HR]

The importance of individuals as champions of EOC was not limited to ourselves, however, but was also a core aspect of the volunteer polar EOC networks that we were connected with:

[The virtual balloon launch on polar days] was awesome feedback, because we found out where we had good hubs, but we also found out where there were holes. So, for example, we hadn’t realised there was a massive hole -- we had no-one in China. So, we went out and found someone in the Met office in China, and she became fantastic, and before you knew it, the next Polar Day, China was covered in balloons… all you need is one person in that country. [RS]

This aligns with one of the key recommendations in Salmon et al. (2011): “nurture your volunteer networks. Keep the costs of their participation low and the visibility of their activities high.” It also, however, reflects the perceived acceptability of polar EOC being carried out by volunteers, for free, for the love of it. Many might reasonably argue that this is not only admirable, but also one of the core reasons that polar EOC developed such strong networks internationally. The reliance on volunteer time, however, also re-emphasises EOC as a somewhat amateur, voluntary activity requiring neither professional remuneration nor lines of accountability. While we greatly value the volunteer networks that continue to work in polar EOC, and acknowledge that these networks are arguably responsible for creating and sustaining the unique polar EOC community described earlier, we also believe that investment in time and upskilling in this area is key to the professionalisation of this field.

Interestingly, we both observed that we no longer rely on individual champions to represent or fight for us. Rather, the institutional context in which our current positions exist demands that we report on and deliver professional, justifiable and strategic engagement. That is to say, the system, rather than individuals, now champions our work. This still, admittedly, leaves support of our work at the “whim” of the funders at the time, but it is a shift worth noting.

A new framework for planning and improving public engagement design

Collectively, this research exposed a range of barriers that recur across EOC efforts. It also identified a lack of easily accessible tools that enable science outreach practitioners to consider strategic engagement design. While there are many excellent tools and frameworks for practitioners about engagement planning or evaluation, such as those developed by the International Association for Public Participation (International Association for Public Participation, 2018), the American Association for the Advancement of Science’s Center for Public Engagement with Science and Technology (Center for Public Engagement with Science & Technology, 2018) and the UK’s National Coordinating Centre for Public Engagement (NCCPE, 2018), these tend to start with the assumption that the user has already articulated relatively clear objectives, audiences and messages. In contrast, we propose, in line with other literature in this field (e.g. Pidgeon & Fischhoff, 2011), that many outreach activities are carried out using a somewhat amateur approach, based on “what feels right, what’s deliverable within the time and funding constraints, and what researchers have themselves experienced as being effective based on themselves as recipients of information”.

Current literature (e.g. Chilvers, 2012; Salmon et al., 2017) encourages researchers, practitioners and institutions to take a more “reflexive” approach to science communication and engagement design, including an analysis of the institutional, personal, historical and political assumptions that drive any given initiative. Building on this, as well as the themes that emerged from the case studies, we developed a prototype framework that helps identify common blind spots in the planning and execution of EOC activities, projects or sub-projects. Our goal is to develop a tool that is easy to use, helps practitioners shed light on project motivations, and facilitates the articulation of outcomes and goals in ways that increase the transparency of projects (e.g. What power dynamics exist in a given project? Which publics does the project aim to reach?). By helping articulate goals at the outset, it should expand practitioners’ capacity to measure, evaluate and explore the impact of their public engagement efforts, and prepare them to utilise some of the more sophisticated evaluation and design tools that are available.

This framework builds on and supplements recommendations from strategic communications planning and evaluation research that we found particularly useful for our current EOC design and implementation. For example, there is a wide range of strategic communications planning literature that facilitates the articulation of key strategic audiences and associated targeted messaging (e.g. Corner & Clarke, 2016; Grimm, 2013; Heath & Heath, 2007; Maibach, Leiserowitz, Roser-Renouf, & Mertz, 2011). Although strategic communications planning is most often used to support multi-stage initiatives and projects, it is also relevant to one-off engagement efforts, especially with regard to the drivers, audiences and power structures associated with a given effort.

Further research has focused on helping scientist-communicators improve and assess their self-efficacy (Robertson Evia, Peterman, Cloyd, & Besley, 2018) and reflect on their efforts and goals, so as to encourage EOC approaches that emphasise “reciprocal exchanges” between scientists and publics (Peterman, Robertson Evia, Cloyd, & Besley, 2017). These exchanges are thought to increase connectedness between scientists and the communities or publics with whom they interact through their EOC efforts (e.g. Mayhew & Hall, 2012; Peterman et al., 2017), and offer another way to describe dialogue-driven approaches. As an example, Peterman et al. (2017) developed an Outcome Expectancy Scale, a new evaluation scale intended to help measure “outcome expectations” for both scientists and publics engaged in EOC activities.

The intention of this prototype engagement design framework (Fig. 1) is to provide a structured approach to the planning and design of a given initiative that encourages the articulation of implicit motivations, incentives and actors, leading, we hope, to more transparent and deliverable objectives. The final stage of the framework highlights opportunities for evaluation and reporting, guiding practitioners towards comprehensive engagement design that can include scales such as those proposed by Peterman et al. (2017) and Robertson Evia et al. (2018). This framework aims to help by

  • providing a process to encourage increased reflexivity for scientist communicators and other communication project managers;

  • identifying opportunities, gaps and potential collaborators that would improve an initiative;

  • making explicit the drivers and assumptions that are often implicit or unspoken;

  • articulating and encouraging transparency about power dynamics within a project or programme; and

  • providing critical thinking points for planning project engagement design, evaluation and reporting.

Our aim is that this engagement design framework can be used both as a theoretical tool for considering the multiple components and facets of engagement and as a practical planning instrument. The framework is intended as an internal, guiding tool, used to facilitate and improve project design and implementation, with a focus on increasing transparency about the positions and motivations that guide a given initiative. It can be iterated throughout a project or initiative, used as a benchmark as work progresses, and applied by an individual or in a group setting (for suggestions on how to use it, see Appendix B).

Fig. 1. A prototype framework for building planning, design and reflexivity into engagement projects.

As with any framework, there is no unique solution; each use-case will benefit from this framework in a different way. This is also not intended to be a “silver bullet”; it will not solve the challenges and frustrations that can exist in projects or initiatives like those described above, but it may help make projects more resilient against some common pitfalls or identify areas that can be strengthened early in the design and planning stages.

The authors have mapped this framework against a wide range of active projects and refined it by interrogating planned and ongoing work, as well as completed projects retrospectively. These results are not shared in this paper because we did not seek ethics approval to share that work.

The framework was also trialled in an informal pilot with a cohort from the Wheelhouse Institute in January 2018 (Wheelhouse Institute, 2018). This cohort of seven people included Roop and a range of creative leaders who engage in public spaces, including an artist, a documentarian, an anthropologist, a journalist, and a research scientist. Their informal participation and feedback helped to refine the flow and content of the framework. Future work with this group will further refine the framework and identify the different spaces and initiatives where it might be applicable and useful beyond science communication and, more specifically, polar EOC. The pilot helped to identify this framework’s utility in exposing new facets of projects that had yet to be considered (e.g. power and motivations), the demographics of teams and where work was needed (e.g. the ratio of volunteers relative to paid staff), the personal motivations and drivers behind an initiative, and where audiences and goals needed refinement in order for engagement design to progress.

This is a first iteration of this framework, which we hope will be adapted and refined with further use. Additional research is underway to formally test, evaluate and assess this framework in use across a range of different engagement design planning and implementation efforts.

Discussion and conclusion

Davies and Horst (2016, p. 6) present science communication as an ecosystem, with “many niches in which different practices of communication sustain themselves and others in a complex web of interdependence and autonomy.” In this paper, we explore the “niche” of polar science education, outreach and communication (EOC), as defined and stimulated by the IPY 2007–2008. This is a unique subset of science communication in that it is focused geographically, and to some extent temporally, rather than by science communication output type. Indeed, one of the strengths of the polar EOC community is the diversity of outreach and education that the IPY stimulated and that the polar research and education community continues to sustain today. This includes activities carried out by the media, science centres and museums, journalists, artists, educators, scientists, tourism organisations, policy fora, and more (Salmon et al., 2011). The international EOC steering committee for the IPY (and its sub-committees) included representatives from each of these sectors, from around the world. The only notable representation lacking was from science communication scholarship.

Davies and Horst continue with a desire “to open up discussions using real-world practice in order to enrich both scholarly study and the ways in which science communication is imagined and carried out” (p. 9). In this paper, we reflect on our own “real-world practice” through interrogation of four polar EOC case studies with which we were involved, as both scientist communicators and engagement designers, and benchmark this with a survey of other scientist communicators in our community.

Through this process, we identified opportunities for increasing the professionalisation and transparency of this field and practice through (a) improved articulation of objectives, including key goals, audiences and messages; (b) acknowledgement of different drivers, voices and power structures within an initiative; and (c) increased training, resources and structures for strategic engagement, including practical and theoretical training in science communication, resources for evaluation and engagement design, and formalised structures for accountability and reporting. Although further research is required to confirm this, we propose that these opportunities are not unique to the polar EOC community, but rather have relevance across much science communication practice, internationally.

Articulation of these opportunities for professionalisation, combined with a lack of accessible tools for practitioners to develop their outreach and engagement activities, led us to develop a prototype engagement planning framework. This is our contribution towards re-imagining the ways in which science communication is carried out (Davies & Horst, 2016, p. 9). We hope that the ideas here will contribute both to the changing practice of science communication and to the ways that science communication is continually re-imagined in the literature.

This paper is also an attempt to document the practitioner voice and experience in a literature that is, unsurprisingly, dominated by scholars and critics of science communication practice. We are unapologetic about the post-hoc research approach utilised here. Although we appreciate that a more robust research methodology could have been applied to the case studies had it been integrated from the start, this is in many ways one of the key points of this paper: most EOC activities are not sufficiently articulated to embed useful evaluation, nor are there drivers or funding for this kind of research. As a result, the majority of EOC activities remain undocumented, and therefore unavailable for the wider community to learn from. This is partly because the practitioners leading the activities lack the expertise, confidence or theory to share them in a peer-reviewed context, and partly because the inherent engagement design process is not well matched to a research context. An unfortunate side effect is the lack of practitioners’ voices in the science communication literature. New mechanisms need to be developed for sharing professional practice and for upskilling in engagement research design that open, rather than close, doors for practitioners to engage more deeply in engagement theory and design, and that enable researchers to work more closely with practice and practitioners. We believe firmly that closer collaboration between science communication theory and practice will lead to more robust, transparent and effective science communication, as well as a body of literature that is relevant to, accessible to, and informed by practice.

Author ORCIDs

Rhian Salmon, 0000-0003-4402-551X; Heidi Roop, 0000-0002-6349-7873

Acknowledgements

The authors would like to acknowledge Joanna Goven for her collaboration and feedback, Rebecca Priestley for valuable feedback, and Max Soar for research assistance.

Funding

This work was supported by the University of Washington Climate Impacts Group Science Communication Fund and Victoria University of Wellington Faculty of Science (grant number 214414).

Ethical standards

Responses from the survey distributed to scientist communicators were gathered, analysed and are reported here in accordance with Victoria University of Wellington Human Ethics Committee approval HEC19653, 2013.

Appendix A

Interview framework used in the auto-ethnographic process. These questions were asked of each author in a separate call with J. Goven.

  1. Context

In order to provide some context for the subsequent interview, can you please map out your career trajectory in terms of science, education/outreach and engagement?

  2. Activity mapping

In each case, can you briefly explore the nature of the education/outreach/engagement activity as well as your role in it?

  • What were your goals during each activity?

  • What were the goals of the organisers?

  • What was your overall experience?

  • What was good about it?

  • What was frustrating about it?

  • Was it a key part of your career development?

  • Did it change the way you viewed engagement? If so, how?

  3. Current approach

What is your role now? What would you say were the key moments or experiences that led to you holding this position, or to the approach you use in this position?

Do you feel you have appropriate training and knowledge for this role? If so, where did you gain this? If not, what do you feel you are lacking?

  4. Journey

Can you comment on your own personal journey/maturation process through these events and activities?

  • Do you think this mirrors the experience of others? If so, why?

  • Do you think this mirrors a maturity of the field in general (or not)? If so, what evidence do you have for this (or not)?

  • Do you feel that you currently are ahead of the game/about average/behind the ball in terms of expertise in your current area of work? Can you say a bit more about that?

  5. Theory

Have you read any education or PES theory? If so, why and when? Did it make a difference? If so, in what way? What would you find helpful now to support your work?

  6. Future

Where would you like your work to progress to next?

Appendix B

How to use the Engagement Design Framework

Use of the framework is straightforward, with no required start or end point. For practitioners wishing to use it as a planning tool, we recommend first becoming familiar with the framework wheel by applying it to a past case study and using that experience to help articulate successes, barriers and design flaws (e.g. a lack of relevant skills such as infographic design or web editing, insufficient funding, or a lack of recognition or institutional support for an initiative).

For those wishing to interrogate a new or existing project, we recommend working around the framework wheel, addressing each section as it relates to a specific project or initiative and documenting all “first responses” using sticky notes, note paper or multiple whiteboards (see “Guiding questions for working around the Engagement Framework Wheel” below). We suggest starting with the section the team or individual finds easiest (our experience indicates that starting with “Purpose” can provide the foundation for considering the other sections). Continue around the wheel (linearly or in any order) until you have collected as much information as possible for each section. Some projects may not have all of the elements within each section; simply considering the presence or absence of these elements is part of the intended process. We have found it helpful to structure this work around a timer (e.g. seven minutes per section, so that the whole activity takes no more than 45 minutes).

The next, critical, step is to reflect on what has been learned from the responses to each segment. Below is a list of guiding questions that can be used or adapted for this purpose:

Guiding questions for working around the Engagement Framework Wheel

Use, or adapt, the following questions to reflect on what has been learned from the responses to each segment. We suggest users try to write out full answers, with as much detail as possible, and, if working as a team, talk through them as a group. In this example, we work around the wheel clockwise, starting with Purpose; an illustrative sketch of how these sections and prompts might be captured as a simple digital checklist follows the questions below.

  1. What is its purpose?

This section helps the design team (or individual) clearly articulate the project mission and mandates, audiences, messages and messengers, influencers and decision-makers, and any anticipated project outcomes and products.

  2. What are the drivers behind this initiative?

This section considers funding sources (e.g. mandates on scope or priority stated in a request for proposals), self-promotion (e.g. the ways in which the leader or project team benefit), individual drivers and motivations, whether a given initiative serves the “greater good”, and any political or institutional drivers.

  3. Who is involved and how are they supported, rewarded or otherwise incentivised?

This element helps to identify who is part of a given project (paid staff, volunteer staff) and how they are supported by both leadership and funding. It also asks what expertise is present, missing or required, and identifies any organisational support mechanisms and capacity-building needs or potential.

  4. Where does the funding come from and who holds the power?

The funding element is used to articulate who holds power (self, individuals, boards, funders, managers, indigenous or tribal bodies) and who has influence (such as co-funders), and whether funding is public, private and/or internal. It is also important for identifying whether the funding has any political or policy-driven elements, or is mission- or research-led.

  5. What is the design process behind this engagement or communication initiative?

The above elements help to frame a project’s objectives and feed into the scope and metrics against which a project or initiative can be evaluated. Does the planned approach have a theoretical grounding? What evaluation and success metrics will be used? What reporting processes and expectations are required or desired (e.g. peer-reviewed, internal, external, formative and/or summative)? What is the engagement strategy and implementation plan, based on the identified mission, audiences and intended impact of a given initiative or project?

  6. What can you learn from these responses?

Finally, collating thoughts and reflections from this process helps to map out weaknesses in project design or areas where more focus is needed. Guiding questions could include

  • What are the strengths of this project or effort?

  • What gaps exist and are there any clear weaknesses in the scope, design or team involved?

  • Are there new partners or collaborators who would help strengthen this initiative?

  • Are the mission and goals clear and can they be clearly evaluated or measured?

  • Are the purpose, messages and audiences clear?

  • Do the power dynamics influencing the design and execution of a project strengthen or limit the success of the project?

  • Do any individuals need more acknowledgement, support or training?

  • Are there processes that need to be clarified?

  • What achievements can or should be celebrated?
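
The sketch below is purely illustrative and not part of the published framework. It shows, under our own assumptions, one hypothetical way the wheel’s sections and guiding prompts could be captured as a lightweight, timed planning checklist; the section labels and prompts are paraphrased shorthand from this appendix, not necessarily the exact wording used in Fig. 1, and all names in the code are invented for illustration.

```python
"""Illustrative sketch only: a hypothetical helper that encodes the six
sections of the engagement design wheel (paraphrased from Appendix B) and
walks an individual or team through a timed note-taking session."""

from dataclasses import dataclass, field


@dataclass
class Section:
    name: str                 # shorthand label for a wheel section
    prompts: list             # guiding questions, paraphrased from the text
    notes: list = field(default_factory=list)  # "first responses" collected


WHEEL = [
    Section("Purpose", ["What are the mission and mandate?",
                        "Who are the audiences, messengers and decision-makers?",
                        "What outcomes and products are anticipated?"]),
    Section("Drivers", ["What funding mandates, personal motivations or "
                        "political/institutional drivers shape this work?"]),
    Section("People", ["Who is involved (paid or volunteer), and how are they "
                       "supported, rewarded or upskilled?"]),
    Section("Funding and power", ["Who holds power or influence, and is funding "
                                  "public, private or internal?"]),
    Section("Design process", ["What theory, evaluation metrics and reporting "
                               "processes underpin the engagement strategy?"]),
    Section("Reflection", ["What strengths, gaps, partners and achievements "
                           "emerge from the responses above?"]),
]


def run_session(minutes_per_section: int = 7) -> None:
    """Print each section's prompts and collect free-text notes;
    enter a blank line to move to the next section."""
    for section in WHEEL:
        print(f"\n== {section.name} (~{minutes_per_section} min) ==")
        for prompt in section.prompts:
            print(f" - {prompt}")
        while True:
            note = input("note> ").strip()
            if not note:
                break
            section.notes.append(note)

    # Summarise; an empty section is itself informative for the design team.
    for section in WHEEL:
        status = "; ".join(section.notes) if section.notes else "(no responses recorded)"
        print(f"{section.name}: {status}")


if __name__ == "__main__":
    run_session()
```

In practice, sticky notes or a whiteboard may serve the same purpose; the value of a structure like this is simply that it records which sections were left blank, mirroring the point above that the absence of responses is part of the intended process.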

References

Allison, I., Béland, M., Alverson, K., Bell, R., Carlson, D., Cutler, P., … Hovelsrund, G. (2009). The state of polar research. Geneva, Switzerland: World Meteorological Organization.
Association of Polar Early Career Scientists - International Polar Week. (2018). Retrieved May 31, 2018, from https://www.apecs.is/outreach/international-polar-week.html
Blair, J. (2012). New Zealand IceFest, event post report. Christchurch, New Zealand: Christchurch City Council.
Bucchi, M., & Trench, B. (2008). Handbook of public communication of science and technology. Abingdon, UK: Routledge.
Center for Public Engagement with Science & Technology. (2018). Retrieved May 26, 2018, from https://www.aaas.org/pes
Cheng, D., Claessens, M., Gascoigne, T., Metcalfe, J., Schiele, B., & Shi, S. (Eds.). (2008). Communicating science in social contexts: new models, new practices. Dordrecht, the Netherlands: Springer. Retrieved from https://www.springer.com/gp/book/9781402085970
Chilvers, J. (2012). Reflexive engagement? Actors, learning, and reflexivity in public dialogue on science and technology. Science Communication, 35(3), 283–310. doi: 10.1177/1075547012454598
Corner, A., & Clarke, J. (2016). Talking climate: from research to practice in public engagement (1st ed., 2017 edition). New York, NY: Palgrave Macmillan.
Davies, S. R., & Horst, M. (2016). Science communication culture, identity and citizenship. London, UK: Palgrave Macmillan.
Gilbert, J., & Stocklmayer, S. (Eds.). (2013). Communication and engagement with science and technology: issues and dilemmas. A reader in science communication. New York, NY: Routledge, Taylor & Francis Group.
Grimm, K. (2013, February 25). Want influence? Eliminate blind spots (SSIR). Retrieved May 31, 2018, from https://ssir.org/articles/entry/want_influence_eliminate_blind_spots
Heath, C., & Heath, D. (2007). Made to stick: why some ideas survive and others die (1st ed.). New York, NY: Random House.
Hindshaw, R., Mariash, H., Vick-Majors, T., Thornton, A., Pope, A., Zaika, Y., … Fugmann, G. (2018). A decade of shaping the futures of polar early career researchers: a legacy of the International Polar Year. Polar Record. doi: 10.1017/S0032247418000591
Horlick-Jones, T., Walls, J., Rowe, G., Pidgeon, N., Poortinga, W., Murdock, G., & O’Riordan, T. (2007). The GM debate: risk, politics and public engagement. New York, NY: Routledge.
Horst, M. (2008). In search of dialogue: staging science communication in consensus conferences. In Bucchi, M. & Trench, B. (Eds.), Communicating science in social contexts: new models, new practices (pp. 259–274). Dordrecht, the Netherlands: Springer. doi: 10.1007/978-1-4020-8598-7_15
Illingworth, S. M., & Roop, H. A. (2015). Developing key skills as a science communicator: case studies of two scientist-led outreach programmes. Geosciences, 5(1), 2–14.
International Association for Public Participation. (2018). Retrieved May 26, 2018, from https://www.iap2.org/default.aspx
Kaiser, B., Allen, B., & Zicus, S. (2010). Polar science and global climate: an international resource for education and outreach. Boston, MA: Pearson Custom Publishing.
Maibach, E. W., Leiserowitz, A., Roser-Renouf, C., & Mertz, C. K. (2011). Identifying like-minded audiences for global warming public engagement campaigns: an audience segmentation analysis and tool development. PLoS ONE, 6(3), e17571. doi: 10.1371/journal.pone.0017571
Mayhew, M. A., & Hall, M. K. (2012). Science communication in a Café Scientifique for high school teens. Science Communication, 34(4), 546–554. doi: 10.1177/1075547012444790
Miller, S. (2008). So where’s the theory? On the relationship between science communication practice and research. In Cheng, D., Claessens, M., Gascoigne, T., Metcalfe, J., Schiele, B., & Shi, S. (Eds.), Communicating science in social contexts (pp. 275–287). Dordrecht, the Netherlands: Springer. doi: 10.1007/978-1-4020-8598-7_16
Morgan, G. (2014). The Big Kahuna: Turning tax and welfare in New Zealand on its head (1st ed.). New Zealand: BookBaby.
Morgan, G., & McCrystal, J. (2009). Poles apart: Beyond the shouting, who’s right about climate change? Auckland, New Zealand: Random House New Zealand.
Morgan, G., & Simmons, G. (2009). Health cheque: the truth we should all know about New Zealand’s public health system. Auckland, New Zealand: Public Interest Publishing.
Morgan, G., & Simmons, G. (2011). Hook line and blinkers: everything kiwis ever wanted to know about fishing. Wellington, New Zealand: The Public Interest Publishing Company.
Morgan, G., & Simmons, G. (2013). Appetite for destruction: food - the good, the bad and the fatal. Wellington, New Zealand: The Public Interest Publishing Company.
NCCPE. (2018). Retrieved May 26, 2018, from http://www.publicengagement.ac.uk/
People Science & Policy. (2009). Reward and recognition of public engagement. Report for the Science for All Expert Group. Retrieved from http://webarchive.nationalarchives.gov.uk/20121106091242/http://scienceandsociety.bis.gov.uk/all/files/2010/02/Reward-and-recognition-FINAL1.pdf
Peterman, K., Robertson Evia, J., Cloyd, E., & Besley, J. C. (2017). Assessing public engagement outcomes by the use of an outcome expectations scale for scientists. Science Communication, 39(6), 782–797. doi: 10.1177/1075547017738018
Pidgeon, N., & Fischhoff, B. (2011). The role of social and decision sciences in communicating uncertain climate risks. Nature Climate Change, 1, 35–41.
Provencher, J., Baeseman, J., Carlson, D., Badhe, R., Bellman, J., Hik, D., … Zicus, S. (2011). Polar research education, outreach and communication during the fourth IPY: how the 2007–2008 International Polar Year has contributed to the future of education, outreach and communication. Paris, France: International Council for Science (ICSU). doi: 10.13140/RG.2.1.4451.5369
Rapley, C., Bell, R. E., Allison, I., Bindschadler, R., Casassa, G., Chown, S., … Zhang, Z. (2004). A Framework for the International Polar Year, 2007–2008. Paris, France: ICSU. Retrieved from https://academiccommons.columbia.edu/catalog/ac:144786
Robertson Evia, J., Peterman, K., Cloyd, E., & Besley, J. (2018). Validating a scale that measures scientists’ self-efficacy for public engagement with science. International Journal of Science Education, Part B, 8(1), 40–52. doi: 10.1080/21548455.2017.1377852
Roop, H. A., & Dietze, E. (2013). Challenges and solutions for enhanced paleoscience communication. PAGES News, 21(2), 95.
Roop, H. A., Martinez-Méndez, G., & Mills, K. (2013). The art of science communication: traps, tips and tasks for the modern-day scientist. PAGES News, 21(2), 90.
Roop, H. A., Wesche, D., Azinhaga, P., Trummel, B., & Xavier, J. (2019). Building collaborative networks across disciplines: a review of Polar Educators International’s first five years. Polar Record, in press.
Salmon, R. A. (2013a). Is climate science gendered? A reflection by a female “climate scientist.” Women’s Studies Journal, 27(1), 49–55.
Salmon, R. A. (2013b). New Zealand ICEFEST 2012 science & education programme summary and evaluation. Commissioned by Christchurch City Council, Christchurch, New Zealand.
Salmon, R. A. (2017). Deep South National Science Challenge engagement strategy 2015–2019. doi: 10.5281/zenodo.555995
Salmon, R. A., Carlson, D. J., Zicus, S., Pauls, M., Baeseman, J., Sparrow, E. B., … Kolset, T. (2011). Education, outreach and communication during the International Polar Year 2007–2008: stimulating a global polar community. The Polar Journal, 1(2), 265–285.
Salmon, R. A., & Priestley, R. (2015). A future for public engagement with science in New Zealand. Journal of the Royal Society of New Zealand, 45(2), 101–107. doi: 10.1080/03036758.2015.1023320
Salmon, R. A., Priestley, R., Fontana, M., & Milfont, T. L. (2017). Climate change communication in New Zealand. Oxford Research Encyclopedia of Climate Science. doi: 10.1093/acrefore/9780190228620.013.475
Salmon, R. A., Priestley, R., & Goven, J. (2017). The reflexive scientist: an approach to transforming public engagement. Journal of Environmental Studies and Sciences, 7(1), 53–68. doi: 10.1007/s13412-015-0274-4
Stocklmayer, S. (2013). Engagement with science: models of science communication. In Gilbert, J. & Stocklmayer, S. (Eds.), Communication and engagement with science and technology: issues and dilemmas. A reader in science communication. New York, NY: Routledge, Taylor & Francis Group.
Trench, B. (2008). Towards an analytical framework of science communication models. In Bucchi, M. & Trench, B. (Eds.), Communicating science in social contexts: new models, new practices (pp. 119–135). Dordrecht, the Netherlands: Springer.
Wenger-Trayner, E., & Wenger-Trayner, B. (2015). Communities of practice: A brief introduction. Self-published. Retrieved June 15, 2018, from http://wenger-trayner.com/wp-content/uploads/2015/04/07-Brief-introduction-to-communities-of-practice.pdf
Wheelhouse Institute. (2018). Retrieved May 26, 2018, from https://www.wheelhouseinstitute.com/
Xavier, J., Azinhaga, P., Seco, J., & Fugmann, G. (2019). International Polar Week as an educational activity to boost science–educational links: Portugal as a case study. Polar Record. doi: 10.1017/S0032247418000621