
Preparedness for emerging infectious diseases: pathways from anticipation to action

Published online by Cambridge University Press: 12 December 2014

V. J. BROOKES
Affiliation:
Faculty of Veterinary Science, The University of Sydney, Camden, NSW, Australia; Graham Centre for Agricultural Innovation, Charles Sturt University, Wagga Wagga, NSW, Australia
M. HERNÁNDEZ-JOVER
Affiliation:
Graham Centre for Agricultural Innovation, Charles Sturt University, Wagga Wagga, NSW, Australia
P. F. BLACK
Affiliation:
Essential Foresight, Canberra, ACT, Australia
M. P. WARD*
Affiliation:
Faculty of Veterinary Science, The University of Sydney, Camden, NSW, Australia; Marie Bashir Institute for Infectious Diseases and Biosecurity, The University of Sydney, Camperdown, NSW, Australia
*Author for correspondence: Professor M. P. Ward, Faculty of Veterinary Science, The University of Sydney, Camden, NSW, Australia. (Email: [email protected])

Summary

Emerging and re-emerging infectious disease (EID) events can have devastating human, animal and environmental health impacts. The emergence of EIDs has been associated with interconnected economic, social and environmental changes. Understanding these changes is crucial for EID preparedness and subsequent prevention and control of EID events. The aim of this review is to describe tools currently available for identification, prioritization and investigation of EIDs impacting human and animal health, and how these might be integrated into a systematic approach for directing EID preparedness. Environmental scanning, foresight programmes, horizon scanning and surveillance are used to collect and assess information for rapidly responding to EIDs and to anticipate drivers of emergence for mitigating future EID impacts. Prioritization of EIDs based on disease impacts and the importance of those impacts to decision-makers, using transparent and repeatable methods, can then be used for more efficient resource allocation for prevention and control. Risk assessment and simulation modelling methods assess the likelihood of EIDs occurring, define impact and identify mitigation strategies. Each of these tools has a role to play individually; however, we propose integration of these tools into a framework that enhances the development of tactical and strategic plans for emerging risk preparedness.

Type: Review Article
Copyright: © Cambridge University Press 2014

Introduction

Historically, emerging and re-emerging infectious disease (EID) events have had devastating health impacts, particularly on human populations. Records suggest that influenza pandemics have occurred regularly for at least 500 years (an estimated 50 million people died during the 1918 pandemic), and at least 35 million people might be currently infected with HIV [Reference Morens1, Reference Fauci and Folkers2]. The majority of EIDs are caused by pathogens of animal origin [Reference Jones3-Reference Woolhouse and Gowtage-Sequeria5]. For example, recently emerged viral diseases of animal origin that have caused high case-fatality rates in humans include coronaviruses (SARS and MERS-CoV), influenza A (H5N1 and H7N9), henipaviruses (Nipah and Hendra), and Ebola virus disease (World Health Organization Disease Outbreak News, www.who.int/csr/don/en/). Debate also continues about the animal origins of antimicrobial resistance, for example meticillin-resistant Staphylococcus aureus [Reference Heller6, Reference Butaye7]. The emergence of antimicrobial resistance is considered to be one of the greatest current threats to global human health [8]. Many drivers of EID events have been proposed, including interconnected economic, social and environmental changes that allow microbial adaptation through mutation, geographical spread, and altered host range [Reference Morse9, Reference Louria10]. Risk factors for EIDs include climate change, ecological changes such as intensification of agriculture and deforestation, changes in human demographics such as population growth and migration, and globalization of trade and travel [Reference Morse9, Reference Jones11, Reference Black and Nunn12]. A review of EID events occurring between 1940 and 2004 found that the number of reports was increasing [Reference Jones3]. This trend is expected to continue, particularly with an increasing global human population (www.census.gov/population/international/data/idb) and increasing trade and travel. Therefore, preparedness is essential to mitigate the potential risk of high-impact EIDs [Reference Brownlie4].

EID preparedness encompasses a range of activities to enhance prevention and control of high-impact EID events, in which the benefits of preventing or reducing the impact of the event far outweigh the investment required in such activities. Traditionally, these activities have been focused on tactical (immediate and short-term) planning. Surveillance has been the mainstay of EID preparedness, both for early identification of spatial, temporal and demographic clusters of adverse health events indicative of an EID, and for prevention of re-emergence of known infectious diseases via the early application of control measures. Consequently, preparedness is currently targeted at known emerging and re-emerging infectious diseases and the responses required should they be detected. However, with increasing recognition of the greater occurrence of EID events and the broad range of risk factors associated with this phenomenon, the scope of preparedness has widened. Through foresight programmes, information is now collected by environmental scanning to detect and assess events and trends that are not specific to health events – but are related to these known risk factors – allowing anticipation of future needs for EID prevention and mitigation [Reference Brownlie4]. Therefore, directing activities for EID preparedness now encompasses strategic (long-term) as well as tactical planning.

This review describes tools currently available for detection, prioritization and investigation of EIDs and threats to human, animal and environmental health (One Health), and how these tools might be integrated to form a systematic approach for directing EID preparedness. A range of tools with application in this field has been developed during the past century. Mathematical models of infectious diseases were first created in the early 1900s (for example, Ross's models of malaria) to explore how infectious diseases persist in populations and how they might be controlled, and were extended to simulation models in the 1980s and 1990s with the increasing power of modern computers and the availability of appropriate data. Risk assessment methods were developed in the 1970s and 1980s in response to disease risks associated with hazards in the environment and food, and in the 1990s as import risk assessment in response to increasing travel and trade and the risk of global spread of infectious diseases. Formal disease prioritization tools were first developed in the 1990s and have been extended during the past decade. More recently, environmental and horizon scanning methods have been developed as broad risk-mitigation tools. Some of these tools have been developed specifically for infectious diseases (disease modelling) or more broadly within other disciplines and fields of study (risk assessment and prioritization) and adapted for use in infectious disease research. Overarching these tools is disease surveillance, which has been undertaken in one form or another since the beginnings of recorded history. Given this history of tool development and adaptation, the integration of such tools to address emerging and re-emerging infectious diseases requires a framework so that the sum of efforts is effective. Current approaches that rely on the application of just a single tool can be successful; but with increasingly complex, multifactorial health problems – typified by EIDs – such approaches can be inadequate. For example, the emergence of West Nile virus (WNV) in the United States in 1999 was unanticipated despite data on air traffic movements (‘globalization’), niche modelling and risk assessment [Reference Garmendia, Van Kruiningen and French13]. Since then (after the fact), risk assessment modelling has been applied to determine the likelihood of disease emergence elsewhere [Reference Brown14, Reference Hernández-Jover, Roche and Ward15]. Similarly, the emergence of pathogenic bluetongue and Schmallenberg viruses in northern Europe was unanticipated, even though the spread of bluetongue viruses in southern Europe as a result of climate change was a well-established phenomenon [Reference Baylis16, Reference Guis17]. Pandemic influenza A(H1N1) 2009 is thought to have emerged in Mexico, a country that was not predicted by various tools to be a ‘hotspot’ for zoonoses emergence [Reference Jones3]. In this review, we first discuss methods to collect and assess information for the identification of EIDs and their drivers, including environmental and horizon scanning and surveillance. This is followed by a review of methods used to prioritize and investigate requirements for EID preparedness, including disease prioritization, risk assessment and simulation modelling. We then discuss current uses of these methods – individually and as integrated pathways – as well as methodological and external constraints that limit identification, prioritization and investigation of EIDs and future human and animal health threats.

Information collection and assessment

Information is a fundamental requirement to detect the presence of EIDs in a timely manner and to anticipate future potential human, animal and environmental health risks. The characteristics of this information – its volume, geographical and disciplinary scope, disease specificity and degree of certainty – are related to how the information is collected and assessed (using surveillance, or horizon and environmental scanning) and, ultimately, to the projected time-frame over which it is used to direct activities for EID preparedness (Fig. 1). For example, surveillance applications tend to have a narrow geographical and temporal scope, whereas environmental scanning is broad.

Fig. 1. The relationship between information type and collection method, and the projected time-frame of activities for emerging infectious disease preparedness.

Environmental scanning is the process of collecting and assessing information to identify events and trends in the global environment (for example, demographic, social, technological, behavioural and economic changes). This type of scanning is an input component to a group of activities known as ‘strategic foresight’, in which a vision of plausible future scenarios can be developed for the purpose of long-term (strategic) planning in organizations [Reference Horton18]. The basic steps of a foresight programme are shown in Figure 2. Foresight programmes, and hence environmental scanning, are not techniques that are specific to disease identification and investigation; they are used by many organizations to improve or secure their future positions in the global environment. However, during the last decade environmental scanning has been used to collect and investigate information about the drivers of infectious disease emergence with a view to enhancing long-term preparedness for EIDs through foresight programmes [Reference Brownlie4, Reference Willis19, Reference King and Thomas20]. The definition of ‘long-term’ is subjective; for example, a projection of 10-25 years was selected within the UK foresight programme for the detection and identification of infectious diseases (www.gov.uk/government/publications/infectious-diseases-preparing-for-the-future). Although ‘horizon scanning’ is a term often used synonymously with ‘environmental scanning’, in the context of identification and investigation of EIDs horizon scanning is also used to describe information collection that is targeted at health-specific sources. Therefore, for the purposes of this review, we separate the two terms: environmental scanning refers to information collection about the global drivers of infectious disease emergence, and horizon scanning refers to information collection about adverse health events.

Fig. 2. Steps of a foresight programme, modified from Horton [Reference Horton18] and Voros [Reference Voros23].

Systematic methods for environmental scanning to enhance EID preparedness are not well established, in part because environmental scanning is developed on an ad hoc basis to meet current and anticipated needs specific to the organization [Reference Slaughter21]. In addition, the drivers of emergence of infectious diseases include economic-, social-, environmental- and pathogen-associated factors that interconnect to form a continuously evolving global milieu. Therefore, environmental scanning collects information with little or no disease specificity, very broad geographical and disciplinary scope, and usually a high degree of uncertainty (Fig. 1). Consequently, a wealth of information is available at any point in time, and uncertainty forces collection over a relatively long time period in order to recognize topics of interest and detect trends; selecting the relevant information and assessing its quality is therefore challenging, and a potential limitation of environmental scanning [Reference Frishammar22].

Information sources that are scanned in this process include a wide range of literature (peer-reviewed and grey literature, government reports and web-based information; for example, blogs, list-servers and other information networks) and informal data sources such as public opinion polls and media reports, as well as expert opinion elicitation and industry workshops. Environmental scanning can be organized, or supplemented, by commercial services such as ‘Shaping Tomorrow’ (www.shapingtomorrow.com), which also provide database systems for storage of scanning hits (relevant information). Outputs include tangible information about disease drivers and emerging global issues, as well as intangible benefits such as increased collaboration within and between disciplines and organizations. Information collected from environmental scanning is used in foresight activities such as scenario planning, causal layered analysis, and backcasting [Reference Horton18, Reference King and Thomas20, Reference Voros23]. These activities aim to enhance strategy planning by developing future scenarios that are plausible given current information, then assessing what is required to achieve those scenarios or to mitigate the chances of reaching them. Strategy plans can include policy changes required today or research to develop systems and technology to meet future requirements. Established foresight programmes include the Australian Department of Agriculture's strategic foresight programme (http://www.daff.gov.au/animal-plant-health/animal/strategy), Foresight for Canadian Animal Health [Reference Willis19] and the UK foresight project for detection and identification of infectious diseases (www.gov.uk/government/publications/infectious-diseases-preparing-for-the-future). In a report published in 2006, the UK foresight programme identified future scenarios in which the threat of emerging infectious diseases in the UK, China and sub-Saharan Africa increased over the next 10-25 years; drivers that were consistently considered to be important were increased travel, migration and trade, increased exposure to exotic plants and animals, and adulterated or incorrectly used drugs leading to drug-resistant organisms [Reference Brownlie4]. Climate change was expected to influence disease distribution in both the UK and Africa. Factors that were particularly important in Africa were poverty, conflicts, systems of governance, urbanization, intensification of agriculture, and lack of disease prevention and control capacity. Drivers of future risk of disease in China also included increased amounts of animal waste, changing sexual lifestyles, changing public attitudes to risk, loss of genetic diversity in agriculture, and increased levels of wealth and education. Improved cross-disciplinary collaboration and threat detection, identification and monitoring systems were identified as requirements to meet the challenges posed by these future threats. Since this report was published, activities for EID preparedness have included research to develop surveillance systems and improve the diagnosis of infectious diseases, an example of which is the development of a biosecurity chip that can identify 1132 different viruses. This biochip was used in the diagnosis of equine encephalosis virus in Israel, a virus previously unreported north of southern Africa [Reference Mildenberg24].
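
The mechanics of triaging scanning hits can be illustrated with a minimal sketch. The code below tags free-text items with the EID drivers they mention and keeps the matches as scanning hits; the driver taxonomy, keywords and record structure are illustrative assumptions only and do not describe any commercial scanning service.

```python
# Minimal sketch of automated triage of environmental-scanning items.
# The keyword lists, source names and record fields are illustrative
# assumptions only; real scanning services use far richer taxonomies.
from dataclasses import dataclass, field

DRIVER_KEYWORDS = {
    "trade/travel": ["air traffic", "live animal export", "migration"],
    "land use": ["deforestation", "irrigation", "urbanization"],
    "agriculture": ["intensification", "antimicrobial use", "feedlot"],
    "climate": ["drought", "flooding", "vector range"],
}

@dataclass
class ScanningHit:
    source: str
    text: str
    drivers: list = field(default_factory=list)

def triage(items):
    """Tag free-text items with the EID drivers they mention; untagged items are discarded."""
    hits = []
    for source, text in items:
        drivers = [d for d, words in DRIVER_KEYWORDS.items()
                   if any(w in text.lower() for w in words)]
        if drivers:
            hits.append(ScanningHit(source, text, drivers))
    return hits

items = [
    ("news", "Rapid urbanization and deforestation reported near wetland reserves"),
    ("blog", "Local recipe ideas for the holidays"),
]
for hit in triage(items):
    print(hit.source, "->", hit.drivers)
```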

In the context of detection and identification of EIDs, horizon scanning is used to describe surveillance activities that collect and assess a broad range of data associated with adverse health events to complement traditional disease surveillance for early warning of EIDs [Reference Walsh and Morgan25-Reference Morgan27]. Information collected for this type of horizon scanning is not necessarily disease-specific, but still has broad geographical scope and comes from a large range of sources; it can be used in short-term activities such as improving time to disease outbreak detection and identification, as well as enhancing medium- to long-term strategy planning as an input to foresight programmes (Fig. 1).

Horizon scanning has been facilitated by advances in technology and the development of internet-based disease outbreak reporting systems such as the International Society for Infectious Diseases' Program for Monitoring Emerging Diseases (ProMED-mail, www.promedmail.org), the Global Public Health Intelligence Network (GPHIN, Centre for Emergency Preparedness and Response, Canada), HealthMap (www.healthmap.org), BioCaster (www.biocaster.nii.ac.jp/_dev/), EMPRES-i (www.empres-i.fao.org/eipws3g/), and Aquatic Animal Health (www.aquatic.animalhealth.org/home). Other sources of information for horizon scanning include reports of disease outbreaks from the World Health Organization (WHO) and the World Organization for Animal Health (OIE), peer-reviewed and grey literature, media reports and surveillance reports (such as laboratory data). Formal horizon scanning programmes include the Global Disease Detection (GDD) Program (Centers for Disease Control and Prevention; www.cdc.gov/globalhealth/gdder/gdd/default.htm), the Threat Tracking Tool used by the European Centre for Disease Prevention and Control (www.ecdc.europa.eu) and the risk analysis framework used by the Human Animal Infections and Risk Surveillance (HAIRS) group in the UK [Reference Walsh and Morgan25-Reference Morgan27]. The GDD Operations Center is a centralized electronic reporting system. Data from horizon scanning is collected and aggregated with information from GDD partners worldwide, and analysed to identify requirements for operational and financial support to strengthen global public health surveillance and response capacity via the Global Outbreak Alert and Response Network (GOARN) and the WHO [Reference Hitchcock28]. By contrast, the HAIRS group is an inter-departmental and cross-disciplinary group that meets monthly to assess the zoonotic and EID risk to the UK population of hazards identified through horizon scanning by organizations such as the Department for Environment, Food and Rural Affairs (Defra) and Public Health England. Using qualitative algorithms to estimate risk, potential hazards are classified according to the level of response required. This systematic process ensures that hazards are consistently assessed so that actions are justified and therefore defensible. Importantly, knowledge gaps are identified and lack of evidence of risk is differentiated from evidence of no risk. Recent reports from the HAIRS group include a qualitative assessment of the risk presented to human health by cats infected with Mycobacterium bovis, and an assessment of the zoonotic potential of Brucella species in marine mammals (www.hpa.org.uk/webw/HPAweb&HPAwebStandard/HPAweb_C/1317138638591).
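
The qualitative algorithms used by groups such as HAIRS are not reproduced here, but the general idea of mapping ordinal likelihood and impact levels to a response category can be sketched as follows; the levels, scoring rule and response categories are assumptions chosen for illustration only.

```python
# Illustrative qualitative risk matrix: combine ordinal likelihood and impact
# levels into a response category. The levels, scoring rule and categories are
# assumptions for illustration, not the HAIRS group's actual algorithm.
LEVELS = ["negligible", "low", "medium", "high"]

def response_category(likelihood: str, impact: str) -> str:
    score = LEVELS.index(likelihood) + LEVELS.index(impact)
    if score >= 5:
        return "immediate risk assessment and escalation"
    if score >= 3:
        return "monitor and gather further evidence"
    return "log; no further action unless new evidence emerges"

print(response_category("low", "high"))    # monitor and gather further evidence
print(response_category("high", "high"))   # immediate risk assessment and escalation
```

A matrix of this kind makes the classification repeatable and auditable, which is the property emphasized in the text, even though real assessments also record knowledge gaps and the evidence base for each judgement.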

Surveillance methods for the systematic collection of information about specific diseases or syndromes – for example, data from laboratory submissions, surveys and health records – are well established and described in both human and animal health contexts [Reference Dufour, Hendrikx, Dufour and Hendrikx29-Reference German31]. Surveillance data can be collected locally or regionally – The European Surveillance System (TESSy; http://www.ecdc.europa.eu/en/activities/surveillance/TESSy/Pages/TESSy.aspx) is an example of a regional system in which data from about 50 communicable diseases is collected from multiple surveillance sources. This type of data is likely to have a relatively high level of certainty and be applicable to disease-specific control measures in the short term, such as outbreak response and tactical planning (Fig. 1). In the context of EID detection it is recognized that traditional surveillance based on collection and analysis of disease- or syndrome-specific data has limitations due to the logistics and funding required to systematically collect and report this type of data in a timely manner or over sufficient time periods to detect trends. This is even more difficult in countries constrained by limited public or animal health systems and transport infrastructure, or by political and cultural constraints that limit reporting [Reference Halliday32]. While some systems have been developed at very low cost and with wide coverage, routine analysis remains problematic and a barrier to application for EID detection [Reference Ward and Kelman33]. There is also a spatial mismatch between surveillance systems and the areas in which infectious diseases emerge. Jones et al. [Reference Jones3] suggest that the risk of an EID event is greater in South and East Asia, sub-Saharan Africa and South and Central America, for all pathogen types except zoonotic pathogens from wildlife and drug-resistant pathogens (which are as likely to occur in Europe and some areas of North America) [Reference Jones11]. This is supported by a more recent study that found that over 50% of WHO-confirmed infectious disease outbreaks between 1996 and 2009 occurred in Africa [Reference Chan34]. However, a review of surveillance systems for emerging zoonoses (212 peer-reviewed articles describing 221 surveillance or monitoring systems for emerging zoonoses) found that nearly 70% of these systems were based in Europe and North America [Reference Vrbova35]. Although most EIDs are of animal origin, more than 50% of systems evaluated data solely from humans and 70% targeted known pathogens. Moreover, despite the existence of guidelines for evaluation of surveillance systems [Reference German31, Reference Buehler36], only 8% of the articles reported evaluation of the systems; evaluation is a critical requirement to ensure the accuracy of reports.

Syndromic surveillance has been facilitated by advances in technology, and can supplement traditional surveillance data to reduce the time to EID detection and identification. These methods include ‘infoveillance’ – collection of data via web-based sources and crowdsourcing – and mobile phone reporting (reviewed by Walker [Reference Walker37]). Current infoveillance examples include Google Flu Trends, which uses aggregated Google search-term data as an indicator of influenza-like illness (www.google.org/flutrends/), FluTracking.net, which invites people to complete a weekly online survey for influenza surveillance (www.flutracking.net/Info), and Flu Detector, which infers the incidence of influenza-like illness for England and Wales using Twitter feeds (geopatterns.enm.bris.ac.uk/epidemics/). Examples of mobile phone reporting systems include syndromic surveillance for adverse health events in humans in Papua New Guinea, and veterinary syndromic surveillance in Sri Lanka [Reference Robertson38, Reference Rosewell39]. The authors reported similar advantages and limitations. Mobile phone-based surveillance appeared to be acceptable and feasible in low-resource settings, and reporting time for events was reduced in some instances compared to existing traditional surveillance systems. However, validity was difficult to assess and sustainability was potentially limited by technical, geographical, political and social barriers. Recently, a mobile phone reporting system has been developed in Indonesia, primarily as a means to assist stakeholders involved in animal health in the field (farmers, veterinarians, veterinary technicians), rather than as a tool to gain information for regional or national disease surveillance [Reference Syibli40]. Initial reports indicate that this system is likely to be a comprehensive and sustained animal health information system (www.wiki.isikhnas.com). The ease of implementation of this system suggests that designing animal and human health information systems for the benefit of those who submit data might be an effective way to build syndromic surveillance systems that provide early warning of adverse health events.
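
A common analytical building block of such systems is aberration detection on a time series of counts. The sketch below flags weeks whose counts exceed the mean of a recent baseline window by more than a chosen number of standard deviations; the baseline length and threshold are illustrative choices, not those of any named surveillance system.

```python
# Minimal aberration-detection sketch for a weekly syndromic count series:
# flag a week whose count exceeds the baseline mean by more than k standard
# deviations. Baseline length and threshold k are illustrative assumptions.
from statistics import mean, stdev

def flag_exceedances(counts, baseline_weeks=8, k=2.0):
    alarms = []
    for t in range(baseline_weeks, len(counts)):
        baseline = counts[t - baseline_weeks:t]
        threshold = mean(baseline) + k * stdev(baseline)
        if counts[t] > threshold:
            alarms.append((t, counts[t], round(threshold, 1)))
    return alarms

weekly_ili_reports = [12, 15, 11, 14, 13, 16, 12, 14, 15, 13, 29, 41]
print(flag_exceedances(weekly_ili_reports))  # weeks 10 and 11 exceed the baseline
```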

The divisions between environmental scanning, horizon scanning and surveillance are not distinct – they form a spectrum of information collection methods across a spectrum of information types, and information from one area inherently supplements and influences collection of other types and sources of information. These methods have been developed in response to the need to respond rapidly to emerging infectious diseases as well as to understand and anticipate drivers of emergence to mitigate the impact of future EID events. The time to detection and public communication of EID events has improved within the last decade [Reference Chan34], but it is unknown whether this is the result of improvements in scanning and surveillance, or of the requirements of the International Health Regulations that came into force in 2007, under which member countries must immediately notify the WHO of any event that might constitute a public health emergency of international concern [41]. Evaluation of scanning and surveillance systems is essential to establish accuracy and assess benefit. A recent study suggested that information from Google Flu Trends could be unreliable as surveillance for influenza pandemics [Reference Olson42]. Although infoveillance is currently considered supplementary to traditional surveillance, this study highlights the need to develop guidelines and methods to evaluate electronic information collection and reporting systems as well as traditional surveillance systems. In the face of an increasing rate of emergence of infectious diseases and scarce resources for information collection and assessment, it is likely that reliance on electronic reporting systems – either formal or informal – will increase across both scanning and surveillance.

Collection and assessment of information is just the first stage in preparation for EIDs. As threats emerge or EID events unfold, prioritization is required to allocate resources, and further investigation using risk analysis and simulation modelling is needed to design the most appropriate prevention and control strategies. The following sections discuss these tools and their links to EID preparedness.

Prioritization

Understanding how those affected value the range of potential impacts of emerging threats and EIDs is essential to develop tactical and strategic plans appropriate to the social, cultural, economic and environmental context in which prevention and control activities take place. Resources (capital items and consumables, and the availability of the time and expertise needed to deliver effective prevention and control) are also limited, and this is compounded by the increasing occurrence of EIDs, which creates competing demands for resource allocation. Therefore, following detection and identification of emerging threats and EIDs, prioritization is required to direct resources for prevention and control, taking this complex background – against which the success of prevention and control is judged – into consideration. Defining the highest-priority emerging threats and EIDs is problematic. Diseases cause a variety of tangible and intangible economic, social and environmental impacts, and it is recognized that the perception of the importance of these impacts varies between stakeholders [Reference Wilson, Ward and Garner43]. For example, it has been suggested that the general public's perception of EIDs is disproportionately large compared to their actual impact, and that the opportunity cost associated with focusing on EIDs exceeds the benefit achieved in their control [Reference Butler44]. Focusing on tangible economic impacts and neglecting the many intangible social impacts of disease might explain this mismatch between priorities and impacts. Therefore, prioritization of EIDs and human, animal and environmental health threats must account for both the scale of disease impacts and the importance of those impacts to decision-makers. Further, the prioritization method must be rapid and transparent and must give consistent and repeatable results, so that resource allocation is timely and justified.

The main purpose of disease prioritization in the context of EIDs has been to direct surveillance. These studies have prioritized EIDs either alone [Reference Cox, Sanchez and Revie45, Reference Havelaar46], or together with zoonotic [Reference Ng and Sargeant47-Reference McKenzie, Simpson and Langstaff49] or communicable diseases in general [Reference Economopoulou50-Reference Carter53]. More recently, prioritization has been used as a tool to direct resources for a broader range of activities to improve EID preparedness, including assessment for immediate response and research (such as risk assessment and disease spread modelling), as well as surveillance [Reference Humblet54-Reference Brookes56]. Most prioritization studies have been undertaken in North America, Europe and Australasia. Until recently, disease prioritization used methods developed on an ad hoc basis. However, driven by requirements for transparency and repeatability [Reference Giesecke57], the methodology for prioritization has evolved to follow decision-science methodology using multi-criteria decision analysis (MCDA).

The steps for disease prioritization using MCDA are shown in Figure 3, and are briefly described as follows. Once the purpose of the prioritization is established, the relevant stakeholders and decision-makers are defined and the diseases to be prioritized are selected. A set of criteria is chosen that describes the disease impacts on which the prioritization decision is based, and objective measurements are collected for each disease according to these criteria. Stakeholder or decision-maker preferences are evaluated to weight the criteria to reflect their importance to the stakeholders. Separation of objective disease measurements from the subjective criterion weights is a key point in ensuring transparency of the prioritization process, because it removes bias due to decision-makers' opinions and level of knowledge about named diseases. It is important that preferences are evaluated using mechanisms that force stakeholders to make trade-offs between criteria presented within the scale and context of the prioritization. This ensures that criterion weights validly reflect opinion about the importance of disease impacts; Keeney [Reference Keeney58] and Steele et al. [Reference Steele59] provide further information regarding this, and Dodgson et al. [Reference Dodgson60] describe different methods for evaluating stakeholder preferences. Aggregation of disease measurements with criterion weights produces an overall score for each disease, and diseases can be ranked according to median or mean score. Prioritization is an iterative process; as new information becomes available (regarding either new threats or changes in stakeholders' values), or as the understanding of impacts is refined through simulation modelling, prioritization should be repeated to ensure that resources are justifiably allocated. MCDA used in disease prioritization has developed two distinct methods to evaluate stakeholder preferences: traditional MCDA, in which criteria are weighted directly, and MCDA in which criteria are weighted indirectly.
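
As an illustration of the aggregation step described above, the sketch below scores hypothetical diseases with a simple additive model: criterion measurements are normalized, multiplied by stakeholder-derived weights and summed. The diseases, measurements and weights are invented for illustration, and published studies typically also propagate uncertainty in both measurements and weights.

```python
# Minimal additive MCDA sketch: normalize criterion measurements, multiply by
# stakeholder-derived weights and rank by total score. Diseases, measurements
# and weights are invented for illustration only.
criteria = ["case_fatality", "transmissibility", "economic_impact"]
weights = {"case_fatality": 0.5, "transmissibility": 0.3, "economic_impact": 0.2}

# Measurements on each criterion's own scale (higher = worse).
measurements = {
    "disease A": {"case_fatality": 0.60, "transmissibility": 1.2, "economic_impact": 20},
    "disease B": {"case_fatality": 0.02, "transmissibility": 2.5, "economic_impact": 80},
    "disease C": {"case_fatality": 0.10, "transmissibility": 1.8, "economic_impact": 45},
}

def normalize(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

scores = {d: 0.0 for d in measurements}
for c in criteria:
    raw = [measurements[d][c] for d in measurements]
    for d, norm in zip(measurements, normalize(raw)):
        scores[d] += weights[c] * norm

for disease, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{disease}: {score:.2f}")
```

The point illustrated is the separation emphasized in the text: the measurements are objective inputs, while the weights carry the decision-makers' values, so either can be updated and the ranking re-run transparently.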

Fig. 3. Flowchart of steps for disease prioritization using multi-criteria decision analysis (MCDA). Modified from Brookes et al. [Reference Brookes56].

An example of prioritization using traditional MCDA is the decision-support tool known as e-THiR, developed for Defra's Veterinary Risk Group in the UK [Reference Del Rio Vilas55]. This tool prioritizes emerging animal health threats identified by horizon scanning or surveillance, and uses criteria that reflect public opinion, potential impacts of the threat, and capability for response as part of a decision-support framework for the management of emerging and existing animal health threats. Del Rio Vilas et al. [Reference Del Rio Vilas55] describe the use of this tool with real case examples. The benefits of e-THiR – and MCDA in general – include the ability to systematically and consistently evaluate threats weighted according to the values of decision-makers. Therefore, the process provides auditable output that can be used as a decision aid to justifiably direct tactical and strategic planning. A particular advantage of e-THiR – and other traditional MCDA methods for disease prioritization – is that threats can be rapidly assessed, both at initial implementation and during ongoing use of the tool. A general limitation of traditional MCDA is that evaluation of the opinion of large groups of stakeholders is difficult to implement, making these methods more suitable for use with small groups of experts. Del Rio Vilas et al. also noted that limitations of e-THiR included potential lack of comprehensiveness of criteria (a trade-off for simplicity, to increase acceptability of the tool within the organization), subjectivity of criterion measurements due to scarce or poor-quality data, and over-estimation of priority due to biased reporting of some threats. However, these limitations are not specific to this tool; balancing the complexity required to achieve useful information outputs against the simplicity needed to ensure that the process does not become intractable, as well as dealing with insufficient or uncertain data and biases in data availability, are challenges common to all forms of disease evaluation.

Disease prioritization using indirect weighting follows the same steps as traditional MCDA (Fig. 3). However, instead of asking stakeholders to directly evaluate criteria, stakeholders are asked to evaluate realistic disease scenarios. Mathematical techniques are then used to infer weights for the criteria; techniques for this include probabilistic inversion and conjoint analysis, both recently used in prioritization of EIDs in Canada, The Netherlands and Australia [Reference Havelaar46–48, Reference Brookes56]. Although these techniques are complex and slower to implement than traditional MCDA, disease prioritization using indirect weighting allows web-based survey administration in which non-technical terminology can be used to describe scenarios. This makes the prioritization process accessible to a wider range of stakeholders, including people who are not disease experts, such as the general public and farmers [Reference Ng and Sargeant47, Reference Brookes61]. Once implemented, prioritization of newly detected threats and EIDs is as rapid in MCDA frameworks that use indirect weighting as in those that use direct weighting methods such as e-THiR.
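
The inference step can be illustrated with a deliberately simplified sketch in which criterion weights are recovered from scenario ratings by ordinary least squares. Probabilistic inversion and conjoint analysis are more sophisticated than this, and the scenario levels, criteria and ratings below are invented for illustration.

```python
# Minimal sketch of indirect weight elicitation: stakeholders rate whole
# scenarios, and criterion weights are inferred by ordinary least squares.
# Scenario levels, criteria and ratings are invented for illustration.
import numpy as np

# Each row is a scenario described by criterion levels on a 0-1 scale:
# [human deaths, animal losses, trade disruption]
scenario_levels = np.array([
    [0.9, 0.1, 0.2],
    [0.1, 0.8, 0.7],
    [0.5, 0.5, 0.5],
    [0.2, 0.3, 0.9],
    [0.7, 0.6, 0.1],
])
# Mean stakeholder rating of each scenario's overall seriousness (0-10 scale).
ratings = np.array([8.5, 5.0, 6.0, 4.5, 7.5])

weights, *_ = np.linalg.lstsq(scenario_levels, ratings, rcond=None)
weights = np.clip(weights, 0, None)
weights /= weights.sum()          # normalize so the inferred weights sum to 1
for name, w in zip(["human deaths", "animal losses", "trade disruption"], weights):
    print(f"{name}: {w:.2f}")
```

Because stakeholders only ever see plainly described scenarios, the survey can be administered to non-experts, which is the accessibility advantage noted above.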

The greatest value in using MCDA for disease prioritization comes from its ability to quantify the importance of disease impacts. In particular, ‘public perception’ – the value that the public places on disease impacts – is recognized as an important driver of policy in animal and public health [Reference Ng and Sargeant62]. However, what constitutes ‘public perception’ is poorly understood and has previously been considered intangible [Reference Döring and Nerlich63]. MCDA, especially using indirect weighting of criteria, enables quantification of public perception.

Although this section has discussed the use of MCDA solely for disease prioritization, methods from decision science (such as MCDA) are used extensively as decision aids in other fields, including environmental science and homeland security [Reference Bragge, Ehrgott, Naujoks, Stewart and Wallenius64, Reference Linkov65], and currently have limited use in directing resource allocation in health settings [Reference Del Rio Vilas66, Reference Mintiens and Vose67]. These diverse applications of decision science demonstrate the potential for further extension of these methods to enhance the development of tactical and strategic plans for emerging risks and EIDs that are acceptable according to current social, cultural, economic and environmental values.

Risk assessment

Risk analysis methods have been used in animal and public health over recent decades to investigate how likely an undesirable event is, the broad-scale potential consequences should it occur, and the mitigation strategies that could reduce its occurrence. These methods provide objective, transparent and repeatable assessments. As MacDiarmid & Pharo [Reference MacDiarmid and Pharo68] described, risk analysis methods are used to help decision-makers answer the questions: ‘What can go wrong?’, ‘How likely is it to go wrong?’, ‘What would be the consequences of it going wrong?’, and ‘What can be done to reduce the likelihood or the consequences of it going wrong?’. However, accurate assessments of the potential risk associated with specific health events or diseases usually require a substantial amount of high-quality data. Often these essential data are lacking, in which case justified assumptions are needed. Threats and EIDs (‘What can go wrong?’) need to be identified initially as part of the risk analysis process and, depending on the aim of the risk analysis, identification of these threats and EIDs will follow different methodologies. The decision-maker will generally have a well-defined objective, which will drive identification of these threats and EIDs [Reference Vose69].

One of the main applications of risk analysis in animal health is the assessment of the potential risks associated with international trade in animals or animal products. Since the creation of the World Trade Organization (WTO) in 1995, trade in live animals and food of animal origin between countries has substantially increased, delivering benefits to both importing and exporting countries. The Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement), which came into force with the creation of the WTO, sets out the legal framework for all international trade to protect human, animal and plant life or health, while guaranteeing that these measures are not more restrictive than those applied at a national level. The agreement establishes that measures applied must be based on international standards and recommendations; however, when these do not exist, a science-based risk assessment must be conducted to set the trade measures. Risk analysis has since facilitated international trade, as well as protecting human and animal health in importing countries, through the assessment of the risk posed by potential hazards associated with a specific commodity and the measures that could be applied to reduce this risk to an acceptable level [70]. The World Organization for Animal Health (OIE) sets the standards for risk analysis in relation to animal health [71]. Import risk analyses, which are conducted by government agencies, are an important tool for biosecurity protection. The initial phase of an import risk analysis is the hazard identification process, during which pathogenic agents that could be present in the imported commodity and are exotic to the importing country are identified for further investigation during the subsequent risk assessment. Some recent examples of import risk analyses (IRAs) conducted by the Australian Government Department of Agriculture – Biosecurity Risk Analysis – are those for freshwater ornamental finfish (with respect to gourami iridovirus and related viruses) and for prawns and prawn products. These IRAs are conducted to classify potential quarantine risks and develop policies to manage them (http://www.daff.gov.au/ba/ira/final-animal).

Although increased international trade benefits the economies of trading partners, a consequence of this increased trade is a greater potential risk of spread of pathogens affecting animals and humans between countries. According to Brown [Reference Brown72], in the last two decades at least one new emerging disease has been identified every year. An example is the introduction and establishment of WNV in the United States in 1999 and its subsequent spread across North America, Central and South America and the Caribbean, causing severe neurological disease and many fatalities in humans, horses and birds [Reference Murray, Mertens and Despres73]. A WNV-infected mosquito on an intercontinental plane landing at a New York airport was considered to be the most likely route of entry into the United States [Reference Wilkins and Del Piero74, Reference Pollock75]. Risk assessment can also be used to understand why infectious diseases emerge. For example, since the introduction of WNV into the Western Hemisphere, risk assessments have been used to investigate its potential introduction into several other regions, including the Galapagos Islands [Reference Kilpatrick76], Hawaii [Reference Kilpatrick77], Barbados [Reference Douglas78] and Australia [Reference Hernández-Jover, Roche and Ward15]. The main aim of these assessments was to estimate the likelihood of introduction of the virus through different pathways, thus providing guidance for directing resources towards preventing such an introduction. Hernández-Jover et al. [Reference Hernández-Jover, Roche and Ward15] also investigated the potential spatio-temporal spread of WNV to susceptible species and the impact of the resulting outbreak on human and animal health. This study developed a generic framework that could be applied to assess the potential introduction of other mosquito-borne diseases via international aircraft movements.
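
A highly simplified release (entry) assessment for such an aircraft pathway might be sketched as a Monte Carlo estimate of the probability that at least one infected mosquito arrives and survives in a year. Every parameter value below is invented for illustration, and a full import risk analysis would model many pathways together with the exposure and consequence steps.

```python
# Minimal stochastic release-assessment sketch: probability that at least one
# infected mosquito enters per year via a single aircraft pathway. All
# parameter values are invented for illustration only.
import random

def annual_entry_probability(flights_per_year, mosquitoes_per_flight,
                             prevalence, survival_prob, iterations=2_000):
    years_with_entry = 0
    for _ in range(iterations):
        entered = 0
        for _ in range(flights_per_year):
            n_mosquitoes = random.randint(0, mosquitoes_per_flight)
            for _ in range(n_mosquitoes):
                if random.random() < prevalence and random.random() < survival_prob:
                    entered += 1
        if entered > 0:
            years_with_entry += 1
    return years_with_entry / iterations

print(annual_entry_probability(flights_per_year=500, mosquitoes_per_flight=3,
                               prevalence=0.001, survival_prob=0.3))
```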

Risk analysis methods are also being applied to investigate situations involving wildlife disease. The International Union for Conservation of Nature and the OIE have recently published specific guidelines for wildlife disease risk analysis (DRA) [79]. These guidelines aim to provide decision-makers (such as wildlife managers and government and industry representatives) with information on how to incorporate the wildlife DRA process into their day-to-day activities, supporting the identification of risk mitigation strategies. Overall, the DRA process provides a framework for investigating how to reduce the potential disease risks associated with wildlife that affect species conservation, animal and human health, agriculture and ecosystems.

Another recent example of a risk analysis framework for investigating the emergence of EIDs is that developed by Ward & Hernández-Jover [Reference Ward and Hernández-Jover80]. This framework was used to understand the emergence of rabies in the eastern islands of Indonesia, so that scarce resources could be targeted at surveillance activities and the sensitivity of surveillance systems increased. By integrating information on the historical spread of rabies, anthropological studies, and the opinions of local animal health experts, eight critical parameters defining the potential disease spread pathways were identified. Focusing on these key components can allow the identification of the areas (islands) most at risk of an emerging rabies event, a form of spatial risk mapping.
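
A minimal sketch of this kind of spatial risk ranking is shown below, combining a few pathway parameters into a relative annual probability of introduction for each island. The parameters and values are invented for illustration; the published framework drew on eight parameters informed by historical spread, anthropological studies and expert opinion.

```python
# Illustrative spatial risk-ranking sketch: combine pathway parameters into a
# relative annual probability of rabies introduction per island. Parameters
# and values are invented for illustration only.
islands = {
    # (boat movements per year, P(dog on board), P(that dog is infected))
    "island A": (1200, 0.10, 0.002),
    "island B": (300, 0.25, 0.001),
    "island C": (2000, 0.02, 0.004),
}

def introduction_probability(movements, p_dog, p_infected):
    p_per_movement = p_dog * p_infected
    return 1 - (1 - p_per_movement) ** movements   # at least one infected dog lands

ranked = sorted(islands.items(),
                key=lambda kv: introduction_probability(*kv[1]), reverse=True)
for name, params in ranked:
    print(f"{name}: {introduction_probability(*params):.3f}")
```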

Risk assessment supports EID preparedness by providing tools to assess the likelihood of introduction and spread of previously identified EIDs, to inform resource allocation and to identify mitigation strategies.

Disease simulation modelling

Disease simulation models aim to represent reality in a simplified form so that the behaviour of a disease system can be better understood. Although based on mathematical models, disease simulation models tend to focus more on estimating the impact of a disease on a population and therefore have a natural application when combined with risk assessments to define infectious disease impact at a finer and more dynamic scale [Reference Hagerman81]. Such models – if developed correctly and appropriately validated – can then be used to guide policy development and decision-making with a view to reducing the impact of a disease event, such as the spread of an emerging infectious disease [often, in veterinary medicine, exotic and transboundary diseases such as foot-and-mouth disease (FMD), highly pathogenic avian influenza and classical swine fever (CSF)]. Key requirements for developing such simulation models are a description of the population at risk – the structure of herds and flocks, their geographical distribution and how they come into contact through networks and spatially – and the factors that influence disease transmission events. In livestock systems, complex models have been developed for FMD [Reference Ward82] and CSF [Reference Cowled83]. Impact is often measured as the number of herds infected, animals culled, vaccine used, and the time to control an outbreak. Epidemiological simulation models have been linked with economic models to measure the impact of disease outbreaks and associated control efforts, including vaccination and traceability systems [Reference Hagerman81]. Such models have also incorporated capacity and resource constraints to allow realistic evaluation of control strategies [Reference Hagerman81, Reference Ward82]. An advantage of simulation models when used to explore EIDs is the inclusion of a broad range of drivers of disease spread. These can include population immunity, population turnover, and environmental, economic and behavioural drivers. The effect of modifying one or more of these drivers on disease emergence can be investigated.
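
A minimal stochastic sketch of this type of model is shown below: a Reed-Frost style chain-binomial simulation of spread among herds, with impact summarized as the number of herds infected and the outbreak duration. It is a toy illustration only, not one of the cited FMD or CSF models, and the herd numbers, transmission and removal probabilities are invented.

```python
# Toy chain-binomial (Reed-Frost style) outbreak simulation among herds.
# Impact is summarized as outbreak size and duration; all parameters are
# invented for illustration.
import random

def simulate_outbreak(n_herds=200, beta=0.3, removal_prob=0.2, seed_infected=5):
    susceptible = n_herds - seed_infected
    infected = seed_infected
    removed = 0
    day = 0
    while infected > 0:
        day += 1
        # Each susceptible herd escapes infection from each infected herd
        # independently; beta/n_herds is the per-pair daily transmission probability.
        p_infection = 1 - (1 - beta / n_herds) ** infected
        new_infections = sum(random.random() < p_infection for _ in range(susceptible))
        new_removals = sum(random.random() < removal_prob for _ in range(infected))
        susceptible -= new_infections
        infected += new_infections - new_removals
        removed += new_removals
    return removed, day  # total herds ever infected and outbreak duration (days)

results = [simulate_outbreak() for _ in range(100)]
sizes = sorted(r[0] for r in results)
durations = sorted(r[1] for r in results)
print(f"median outbreak size: {sizes[50]} herds; median duration: {durations[50]} days")
```

Re-running such a simulation with altered transmission or removal parameters is one way of exploring the effect of modifying individual drivers, as described above, before more detailed models are built.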

Traditionally, disease spread models have been developed for emerging (transboundary and epidemic) diseases that have a priori (even if qualitatively) been determined to have high impact. In these situations, the focus is on determining the most effective (generally the most cost-effective) approach to minimising impact. Thus, within a framework for emerging infectious diseases, disease spread modelling is generally seen as the ‘final’ step in a linear process. However, disease modelling – if approached as a generic tool – can be used to investigate which scenarios might have the greatest impact. For example, a scenario in which an EID affects only one animal species with high morbidity and mortality could be compared with a scenario in which a similar disease affects many species but with lower morbidity and mortality; exploring the impact of each could guide the process of scanning, prioritization and risk assessment. If certain scenarios are predicted to have substantially larger impacts, then these should be the focus of future scanning, prioritization and risk assessment efforts. Furthermore, there has been little focus on modelling the effect of the presence of several pathogens within a population on the emergence of one of them as a disease event. Integrating disease modelling tools with other components of emerging infectious disease surveillance and response is a critical need in order to manage these risks effectively.

Discussion

Developing appropriate tactical and strategic plans for the prevention and control of emerging threats and EIDs requires balancing control and prevention measures against the potential and actual risks of an EID, within a complex social, cultural and economic background in a global environment. This balance can be difficult to achieve. For example, in the absence of an EID outbreak, mitigation measures can be criticized as too stringent if they limit trade and the travel of people who perceive the risk or impact on them to be low or negligible. By contrast, in the event of an EID, measures can quickly be criticized as inadequate, particularly by those who are directly affected. For example, the UK government was widely criticized for its handling of both the bovine spongiform encephalopathy (BSE) epidemic in the 1980s and 1990s and the 2001 FMD outbreak. Responses to both were considered inadequate. Subsequent inquiries found that the lack of a systematic, science-based mechanism for assessing and effectively managing risk resulted in insufficient mitigation measures for BSE, and that inadequate preparation – partly because resources had been prioritized to mitigate the impacts of BSE – was a factor in the under-resourced response to FMD [Reference Anderson84, Reference Phillips, Bridgeman and Ferguson-Smith85]. There can also be unexpected events following the detection of emerging risks that complicate effective control. Panic due to the perceived risk of a suspected pneumonic plague outbreak in India in 1994 caused mass migration of people, potentially spreading the disease and hampering control efforts [Reference Deodhar, Yemul and Banerjee86]. Moreover, mitigation strategies can prove simplistic and inadequate once instigated. For example, complex social and cultural constraints are major barriers to control of the current outbreak of Ebola virus disease in West Africa, requiring increased collaboration between anthropologists, politicians, health professionals and their organizations [Reference Check87, 88]. Mechanisms must be in place to rapidly update tactical plans in the event of unexpected challenges. In addition to achieving an appropriate level of prevention and control and meeting unexpected challenges, tactical and strategic plans must be continuously updated as the global risk landscape changes and new information arises.

The examples above illustrate some of the difficulties encountered when implementing prevention and control measures for emerging threats and EIDs. As already discussed, prioritization, risk assessment and disease modelling can be used individually to assist tactical and strategic planning following information collection and assessment. However, when used together, these tools can provide a comprehensive understanding of emerging threats and EIDs: not only their potential impact and risk, but also the importance of the emerging threat or EID according to the current values of decision-makers and relative to the myriad health concerns that compete for limited resources. Figure 4 illustrates how prioritization, risk assessment and simulation modelling naturally integrate based on the flow of information from surveillance and scanning, and the cycle of information as it is refined by these tools into knowledge useful for tactical and strategic planning. Prioritization assesses information from surveillance and scanning according to the values of decision-makers, and research on high-priority emerging threats and EIDs using risk assessment and disease modelling refines knowledge and provides a more detailed understanding of the impacts and their probability of occurrence. This knowledge in turn refines prioritization, and as new information continues to arise, it can be assessed in the context of a more thorough understanding of existing health concerns. At each stage, information is systematically processed to deliver knowledge relevant to tactical and strategic planning. An example of a framework in current use is the Risk Management Cycle used by Defra [Reference Del Rio Vilas55]. Following the BSE and FMD crises in the UK, it was recognized that a systematic process to identify and prioritize emerging threats and EIDs was essential to underpin EID preparedness through improved surveillance and contingency planning [Reference Scudamore89]. In the resulting framework developed to achieve these aims, information from horizon scanning and surveillance is used to identify emerging threats and EIDs. Depending on priority (assessed using the prioritization tool e-THiR [Reference Del Rio Vilas55]) and the current state of knowledge, recommendations can be made regarding allocation of resources or changes to policy, and further research such as risk assessment and disease modelling can be instigated. Knowledge gained from this process is used to update D2R2 (Disease briefing, Decision support, Ranking and Risk assessment database), which is used as a resource to refine the prioritization process (Rupert Hine, personal communication). In this framework, systematic assessment of information characterizes emerging threats and EIDs in the context of existing health concerns according to the values of decision-makers who represent stakeholders. This enables response and contingency planning to be well directed, and as new information arises (from research as well as scanning and surveillance) plans can be rapidly updated. The framework is science-based and transparent; therefore, activities can be justified and are defensible.

Fig. 4. Framework for the integration of surveillance, horizon and environmental scanning, prioritization, risk assessment and disease modelling, to facilitate preparedness and response to emerging infectious disease events.

Collaboration is essential both to collect and to make maximum use of information gained from scanning and surveillance. At the level of surveillance, a ‘One Health’ approach is the minimum requirement [Reference Stärk90], but as the scope of information collection widens through horizon scanning and environmental scanning, increased cross-disciplinary collaboration is required that might include economists, social and environmental scientists, decision analysts, and experts in information technology, politics and logistics. Collection of information must also be global, both geographically and in disciplinary scope, covering everything that affects the interactions between host species, their pathogens and their environment. However, geographical, cultural and political barriers can all limit global collaboration and effective information gathering. Drawing on a greater diversity of data sources and types is one way to improve information collection and increase the likelihood that EIDs are identified and correctly assessed. It is also likely that the current trend of increasingly scarce resources for traditional surveillance will continue, placing greater dependence on novel data sources and ways of collecting information. One such new data source that has been proposed is crowdsourcing [Reference Chunara, Smolinski and Brownstein91]. Basing EID preparedness and response on a single data source is likely to result in EIDs going undetected or taking longer to be detected. Greater diversity in collection techniques and different foci will lead to greater diversity in information. This is more a policy and political issue than a technical one. As a consequence, programmes tend to focus on what emerges – not on preventing what might emerge.

Currently, preparedness focuses on horizon scanning and surveillance for rapid detection and identification of EIDs, and control measures focus on mitigating the impact of EIDs after they have emerged. Anticipation of specific EIDs is not possible; the nature of information from environmental scanning is non-specific and highly uncertain, and it relates to risk factors that drive EID events but are not sufficiently understood to permit prediction. However, in terms of surveillance, there is value in focusing on areas undergoing rapid change in animal populations, production systems or marketing systems, as well as areas undergoing rapid socio-ecological change – for example, land-use change in combination with other factors. Human population shifts – due either to civil conflict or to economic drivers rapidly attracting or dispersing human populations – are also likely to be important targets for surveillance systems. Therefore, at a minimum, the use of information from environmental scanning through foresight programmes can develop capabilities for early detection in the most likely areas. In this way, preparation for EIDs can be diversified across both anticipatory activities – such as where to focus research and surveillance – and prevention and control of known EIDs.

Preparation for pandemic influenza A provides an example: the emergence of pandemic strains of influenza virus has received much attention during the past decade. We know that more influenza viruses will emerge, and that some will be highly pathogenic in poultry (HPAI) while others will have low pathogenicity (LPAI). Both H7N9 and H5N6 influenza viruses have recently emerged from poultry production systems [Reference Qi92, Reference Lam93] that earlier led to the emergence of H5N1. Based on recent history, we also know that some of these influenza viruses will cause fatal disease in humans and might spread globally via human-to-human transmission. Within a framework for EID preparedness and response, rather than relying on rapid detection of new cases of disease in humans (humans as sentinels), we should also be allocating resources to prevention – that is, to addressing some of the drivers – in addition to the much-needed resources allocated to public health and veterinary services for disease surveillance and response activities. Knowing that the milieu that supports virus evolution and spread still exists will not, of itself, prevent new viruses from emerging – action is required. However, taking preventive action within such animal production systems still presents many challenges, spanning economic, social, technological and behavioural drivers.

Integration of the tools described in this review aims to ensure that the drivers of EIDs and EID events are recognized and reported in a timely manner, that resources are prioritized effectively, and that maximum information is gained from risk assessment and simulation modelling to direct comprehensive tactical and strategic plans. We propose that an integrated approach to EID preparedness, through the coordinated application of available tools, should provide greater overall benefit than individual tools applied in an ad hoc manner. Ultimately, the foundation of EID prevention lies in anticipating, recognizing and taking action to alter the course of the drivers of EIDs. Addressing these drivers is a global challenge required to achieve sustainable human development, of which health security is only one part. Until these drivers are addressed, the focus must remain preparedness for EID events; horizon scanning and surveillance are the foundation of such preparedness, without which tactical and strategic plans fail. Although electronic reporting and novel methods for information collection and collation are developing rapidly, reducing traditional surveillance should be questioned until the validity of these new methods can be assessed. Paradoxically, anthropogenic drivers of EIDs – for example, advances in technology and communication that have facilitated increased trade and travel – have also enabled the development of these information collection and assessment techniques. Until the drivers of EID events are addressed, will we get ahead of the curve that we create, or will we simply chase it? This intriguing question is beyond the scope of this review.

Declaration of Interest

None.

References

1. Morens, DM, et al. Pandemic influenza's 500th anniversary. Clinical Infectious Diseases 2010; 51: 1442–1444.
2. Fauci, AS, Folkers, GK. The world must build on three decades of scientific advances to enable a new generation to live free of HIV/AIDS. Health Affairs 2012; 31: 1529–1536.
3. Jones, KE, et al. Global trends in emerging infectious diseases. Nature 2008; 451: 990–994.
4. Brownlie, J, et al. Foresight. Infectious diseases: preparing for the future. Future threats. London: Office of Science and Innovation, 2006.
5. Woolhouse, MEJ, Gowtage-Sequeria, S. Host range and emerging and reemerging pathogens. Emerging Infectious Diseases 2005; 11: 1842–1847.
6. Heller, J, et al. Assessing the probability of acquisition of meticillin-resistant Staphylococcus aureus (MRSA) in a dog using a nested stochastic simulation model and logistic regression sensitivity analysis. Preventive Veterinary Medicine 2011; 99: 211–224.
7. Butaye, P, et al. Antimicrobial resistance in bacteria from animals and the environment. Preface. Veterinary Microbiology 2014; 171: 269–272.
8. World Health Organization. Antimicrobial resistance: global report on surveillance. April 2014.
9. Morse, SS. Factors in the emergence of infectious diseases. Emerging Infectious Diseases 1995; 1: 7–15.
10. Louria, DB. Emerging and re-emerging infections: the societal determinants. Futures 2000; 32: 581–594.
11. Jones, BA, et al. Zoonosis emergence linked to agricultural intensification and environmental change. Proceedings of the National Academy of Sciences USA 2013; 110: 8399–8404.
12. Black, P, Nunn, M. Impact of climate change and environmental change on emerging and re-emerging animal diseases and animal production. In: Compendium of technical items presented to the OIE World Assembly of Delegates or to OIE Regional Commissions, 2009. 2010. ISBN 978-92-9044-789-4.
13. Garmendia, AE, Van Kruiningen, HJ, French, RA. The West Nile virus: its recent emergence in North America. Microbes and Infection 2001; 3: 223–229.
14. Brown, EBE, et al. Assessing the risks of West Nile virus-infected mosquitoes from transatlantic aircraft: implications for disease emergence in the United Kingdom. Vector-Borne and Zoonotic Diseases 2012; 12: 310–320.
15. Hernández-Jover, M, Roche, S, Ward, MP. The human and animal health impacts of introduction and spread of an exotic strain of West Nile virus in Australia. Preventive Veterinary Medicine 2013; 109: 186–204.
16. Baylis, M. Research gaps in understanding how climate change will affect arboviral diseases. Animal Health Research Reviews 2013; 14: 143–146.
17. Guis, H, et al. Modelling the effects of past and future climate on the risk of bluetongue emergence in Europe. Journal of the Royal Society Interface 2012; 9: 339–350.
18. Horton, A. A simple guide to successful foresight. Foresight 1999; 1: 5–9.
19. Willis, NG, et al. Using foresight to prepare animal health today for tomorrow's challenges. Canadian Veterinary Journal 2011; 52: 614–618.
20. King, DA, Thomas, SM. Taking science out of the box – foresight recast. Science 2007; 316: 1701–1702.
21. Slaughter, RA. Futures for the Third Millennium: Enabling the Forward View. Prospect Media, 1999.
22. Frishammar, J. Characteristics in information processing approaches. International Journal of Information Management 2002; 22: 143–156.
23. Voros, J. A generic foresight process framework. Foresight 2003; 5: 10–21.
24. Mildenberg, Z, et al. Equine encephalosis virus in Israel. Transboundary and Emerging Diseases 2009; 56: 291.
25. Walsh, AL, Morgan, D. Identifying hazards, assessing the risks. Veterinary Record 2005; 157: 684–687.
26. Palmer, S, Brown, D, Morgan, D. Early qualitative risk assessment of the emerging zoonotic potential of animal diseases. British Medical Journal 2005; 331: 1256–1260.
27. Morgan, D, et al. Assessing the risk from emerging infections. Epidemiology and Infection 2009; 137: 1521–1530.
28. Hitchcock, P, et al. Challenges to global surveillance and response to infectious disease outbreaks of international importance. Biosecurity and Bioterrorism – Biodefense Strategy Practice and Science 2007; 5: 206.
29. Dufour, B, Hendrikx, P (eds). Epidemiological Surveillance in Animal Health, 2nd edn. CIRAD, FAO, OIE and AEEMA, 2009.
30. Declich, S, Carter, AO. Public-health surveillance – historical origins, methods and evaluation. Bulletin of the World Health Organization 1994; 72: 285–304.
31. German, RR, et al. Updated guidelines for evaluating public health surveillance systems: recommendations from the Guidelines Working Group. Morbidity and Mortality Weekly Report. Recommendations and Reports 2001; 50: 1–35.
32. Halliday, J, et al. Bringing together emerging and endemic zoonoses surveillance: shared challenges and a common solution. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences 2012; 367: 2872–2880.
33. Ward, MP, Kelman, M. Companion animal disease surveillance: a new solution to an old problem? Spatial and Spatio-temporal Epidemiology 2011; 2: 147–157.
34. Chan, EH, et al. Global capacity for emerging infectious disease detection. Proceedings of the National Academy of Sciences USA 2010; 107: 21701–21706.
35. Vrbova, L, et al. Systematic review of surveillance systems for emerging zoonoses. Transboundary and Emerging Diseases 2010; 57: 154–161.
36. Buehler, JW, et al. Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. Morbidity and Mortality Weekly Report. Recommendations and Reports 2004; 53: 1–11.
37. Walker, JG. New media methods for syndromic surveillance and disease modelling. CAB Reviews 2013; 8: 1–13.
38. Robertson, C, et al. Mobile phone-based infectious disease surveillance system, Sri Lanka. Emerging Infectious Diseases 2010; 16: 1524–1531.
39. Rosewell, A, et al. Mobile phone-based syndromic surveillance system, Papua New Guinea. Emerging Infectious Diseases 2013; 19: 1811–1818.
40. Syibli, M, et al. The Power of One: realising the dream of an integrated animal health information system in Indonesia. International Conference on Animal Health Surveillance 2 (ICAHS2), Havana, Cuba, 2014.
41. World Health Organization. International Health Regulations (2005). World Health Organization, 2008.
42. Olson, DR, et al. Reassessing Google Flu Trends data for detection of seasonal and pandemic influenza: a comparative epidemiological study at three geographic scales. PLoS Computational Biology 2013; 9: e1003256.
43. Wilson, SJ, Ward, MP, Garner, MG. A framework for assessing the intangible impacts of emergency animal disease. Preventive Veterinary Medicine 2013; 111: 194–199.
44. Butler, CD. Infectious disease emergence and global change: thinking systemically in a shrinking world. Infectious Diseases of Poverty 2012; 1: 55.
45. Cox, R, Sanchez, J, Revie, CW. Multi-criteria decision analysis tools for prioritising emerging or re-emerging infectious diseases associated with climate change in Canada. PLoS ONE 2013; 8: e68338.
46. Havelaar, AH, et al. Prioritizing emerging zoonoses in the Netherlands. PLoS ONE 2010; 5: e13965.
47. Ng, V, Sargeant, JM. A quantitative and novel approach to the prioritization of zoonotic diseases in North America: a public perspective. PLoS ONE 2012; 7: e48519.
48. Ng, V, Sargeant, JM. A quantitative approach to the prioritization of zoonotic diseases in North America: a health professionals' perspective. PLoS ONE 2013; 8: e72172.
49. McKenzie, J, Simpson, H, Langstaff, I. Development of methodology to prioritise wildlife pathogens for surveillance. Preventive Veterinary Medicine 2007; 81: 194–210.
50. Economopoulou, A, et al. Infectious diseases prioritisation for event-based surveillance at the European Union level for the 2012 Olympic and Paralympic Games. Eurosurveillance 2014; 19: 6–13.
51. Balabanova, Y, et al. Communicable diseases prioritized for surveillance and epidemiological research: results of a standardized prioritization procedure in Germany, 2011. PLoS ONE 2011; 6.
52. Doherty, JA. Establishing priorities for national communicable disease surveillance. Canadian Journal of Infectious Diseases 2000; 11: 21–24.
53. Carter, A, National Advisory Committee on Epidemiology Subcommittee. Establishing goals, techniques and priorities for national communicable disease surveillance. Canadian Journal of Infectious Diseases 1991; 2: 37–40.
54. Humblet, M-F, et al. Multidisciplinary and evidence-based method for prioritizing diseases of food-producing animals and zoonoses. Emerging Infectious Diseases 2012; 18.
55. Del Rio Vilas, VJ, et al. An integrated process and management tools for ranking multiple emerging threats to animal health. Preventive Veterinary Medicine 2013; 108: 94–102.
56. Brookes, VJ, et al. Building a picture: prioritisation of exotic diseases for the pig industry in Australia using multi-criteria decision analysis. Preventive Veterinary Medicine 2014; 113: 103–117.
57. Giesecke, J. Choosing diseases for surveillance. Lancet 1999; 353: 344.
58. Keeney, RL. Common mistakes in making value trade-offs. Operations Research 2002; 50: 935–945.
59. Steele, K, et al. Uses and misuses of multicriteria decision analysis (MCDA) in environmental decision making. Risk Analysis 2009; 29: 26–33.
60. Dodgson, J, et al. Multi-criteria Analysis: A Manual. London: Department for Communities and Local Government, 2009.
61. Brookes, VJ, et al. Identifying and measuring stakeholder preferences for disease prioritisation: a case study of the pig industry in Australia. Preventive Veterinary Medicine 2014; 113: 118–131.
62. Ng, V, Sargeant, JM. A stakeholder-informed approach to the identification of criteria for the prioritization of zoonoses in Canada. PLoS ONE 2012; 7: e29752.
63. Döring, M, Nerlich, B. The Social and Cultural Impact of Foot-and-Mouth Disease in the UK in 2001: Experiences and Analyses. Manchester University Press, 2009.
64. Bragge, J, et al. Bibliometric analysis of multiple criteria decision making/multiattribute utility theory. In: Ehrgott, MN, Naujoks, B, Stewart, TJ, Wallenius, J, eds. Multiple Criteria Decision Making for Sustainable Energy and Transportation Systems: Proceedings of the 19th International Conference on Multiple Criteria Decision Making, 2010, pp. 259–268.
65. Linkov, I, et al. Risk informed decision framework for integrated evaluation of countermeasures against CBRN threats. Journal of Homeland Security and Emergency Management 2012; 9.
66. Del Rio Vilas, VJ, et al. Prioritization of capacities for the elimination of dog-mediated human rabies in the Americas: building the framework. Pathogens and Global Health 2013; 107: 340–345.
67. Mintiens, K, Vose, D. Multi-criteria decision analysis for evaluating control options during FMD outbreaks. Society for Veterinary Epidemiology and Preventive Medicine, Glasgow, 2012.
68. MacDiarmid, SC, Pharo, HJ. Risk analysis: assessment, management and communication. Revue Scientifique et Technique 2003; 22: 397–408.
69. Vose, D. Risk Analysis: A Quantitative Guide, 3rd edn. Chichester: John Wiley & Sons, 2008.
70. World Trade Organization. Understanding the WTO Agreement on Sanitary and Phytosanitary Measures, 1998 (https://www.wto.org/english/tratop_e/sps_e/spsund_e.htm). Accessed 25 July 2014.
71. World Organization for Animal Health (OIE). Import risk analysis, chapter 2·1. In: Terrestrial Animal Health Code. 2009.
72. Brown, C. Emerging diseases: the global express. Veterinary Pathology 2010; 47: 9–14.
73. Murray, KO, Mertens, E, Despres, P. West Nile virus and its emergence in the United States of America. Veterinary Research 2010; 41.
74. Wilkins, PA, Del Piero, F. West Nile virus: lessons from the 21st century. Journal of Veterinary Emergency and Critical Care 2004; 14: 2–14.
75. Pollock, SL, et al. Raising chickens in city backyards: the public health role. Journal of Community Health 2012; 37: 734–742.
76. Kilpatrick, AM, et al. Predicting pathogen introduction: West Nile virus spread to Galapagos. Conservation Biology 2006; 20: 1224–1231.
77. Kilpatrick, AM, et al. Quantitative risk assessment of the pathways by which West Nile virus could reach Hawaii. Ecohealth 2004; 1: 205–209.
78. Douglas, KO, et al. A quantitative risk assessment of West Nile virus introduction into Barbados. West Indian Medical Journal 2007; 56: 394–397.
79. World Organization for Animal Health (OIE), International Union for Conservation of Nature. Guidelines for wildlife disease risk analysis. Paris: World Organization for Animal Health and International Union for Conservation of Nature, 2014.
80. Ward, MP, Hernández-Jover, M. A generic rabies risk assessment tool to support surveillance. Preventive Veterinary Medicine (in press).
81. Hagerman, AD, et al. Emergency vaccination to control foot-and-mouth disease: implications of its inclusion as a U.S. policy option. Applied Economic Perspectives and Policy 2012; 34: 119–146.
82. Ward, MP, et al. Simulation of foot-and-mouth disease spread within an integrated livestock system in Texas, USA. Preventive Veterinary Medicine 2009; 88: 286–297.
83. Cowled, BD, et al. Controlling disease outbreaks in wildlife using limited culling: modelling classical swine fever incursions in wild pigs in Australia. Veterinary Research 2012; 43.
84. Anderson, I. Return to an Address of the Honourable the House of Commons dated 22 July 2002 for the Foot and Mouth Disease 2001: Lessons to be Learned Inquiry Report. London: The Stationery Office, 2002.
85. Phillips, N, Bridgeman, J, Ferguson-Smith, M. The BSE inquiry. London: The Stationery Office, 2000.
86. Deodhar, NS, Yemul, VL, Banerjee, K. Plague that never was: a review of the alleged plague outbreaks in India in 1994. Journal of Public Health Policy 1998; 19: 184–199.
87. Check Hayden, E. World struggles to stop Ebola. Nature 2014; 512: 355–356.
88. ProMED-mail. Ebola virus disease – West Africa (90): Sierra Leone, Ghana meeting, history, 2014. Archive Number: 20140716·2615640.
89. Scudamore, J. Partnership, priorities and professionalism – a strategy for enhancing veterinary surveillance in the UK. Department for Environment, Food and Rural Affairs, 2003.
90. Stärk, KDC, et al. One Health surveillance: more than a buzz word? Preventive Veterinary Medicine (in press).
91. Chunara, R, Smolinski, MS, Brownstein, JS. Why we need crowdsourced data in infectious disease surveillance. Current Infectious Disease Reports 2013; 15: 316–319.
92. Qi, X, et al. Whole-genome sequence of a reassortant H5N6 avian influenza virus isolated from a live poultry market in China, 2013. Genome Announcements 2014; 2.
93. Lam, TT-Y, et al. The genesis and source of the H7N9 influenza viruses causing human infections in China. Nature 2013; 502: 241.