
3 - It’s Not (All) about the Information

The Role of Cognition in Creating and Sustaining False Beliefs

Published online by Cambridge University Press:  13 March 2025

Madelyn R. Sanfilippo
Affiliation:
University of Illinois School of Information Sciences
Melissa G. Ocepek
Affiliation:
University of Illinois School of Information Sciences

Summary

This chapter focuses on how it is possible to develop and retain false beliefs even when the relevant information we receive is not itself misleading or inaccurate. In common usage, the term misinformed refers to someone who holds false beliefs, and the most obvious source of false beliefs is inaccurate information. In some cases, however, false beliefs arise, not from inaccurate or misleading information, but rather from cognitive biases that influence the way that information is interpreted and recalled. Other cognitive biases limit the ability of new and accurate information to correct existing misconceptions. We begin the chapter by examining the role of cognitive biases and heuristics in creating misconceptions, taking as our context misconceptions commonly observed during the COVID-19 pandemic. We then explain why accurate information does not always or necessarily correct misconceptions, and in certain situations can even entrench false beliefs. Throughout the chapter, we outline strategies that information designers can use to reduce the possibility that false beliefs arise from, and persist in the face of, accurate information.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2025
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC-BY-NC-ND 4.0 https://creativecommons.org/cclicenses/

Introduction

There is currently a great deal of talk about “misinformation,” “disinformation,” and “fake news.” There are widespread calls to limit the promotion of false information, particularly by the press, and to correct false information when and where it appears. These discussions, and these concerns, focus on the content of communications and on the need to ensure that communications are factually correct and accurate. Both are, indisputably, critical goals in the battle against “fake news” and other types of misleading information.

In the final analysis, however, it is not the false information per se that is of concern – rather, it is the false beliefs that can arise from that information that are the problem. Christopher Fox (Reference Fox1983) makes this issue eminently clear in his careful analysis of information and misinformation. According to Fox, to “misinform” someone is intentionally to cause them to believe something that is untrue, typically by promulgating false or inaccurate information. Critical to this definition is the end state – the state of false belief – and consistent with this focus, the term “misinformed” is widely used as an adjective to describe someone who holds incorrect or false beliefs. Thus, teens are said to be misinformed about the proper use of condoms (Rosenberg Reference Rosenberg2001), patients are identified as misinformed about the causes and treatment of the common cold (Braun et al. Reference Braun, Fowles, Solberg, Kind, Healey and Anderson2000), and prior to the 2016 United States election, a surprisingly high proportion of eligible voters were said to be misinformed about Barack Obama’s birthplace (Holman and Lay Reference Holman and Lay2019).

In this chapter, we focus on the state of holding false beliefs, and examine one particular cause of this state – specifically the biased cognitive processing of objectively true information. Our selective focus adds to a rich body of literature discussing and distinguishing information, misinformation, and disinformation (e.g., Fox Reference Fox1983). We recognize that false beliefs can arise from objectively inaccurate information, especially where there is an intention to mislead. We also recognize that trust figures largely in the development of beliefs, and misplaced trust could lead one to believe false information if the source of that information has in the past been reliable. More insidious is the impact of the increasing sophistication of false information (e.g., false deepfake videos depicting, with great apparent realism, events that did not happen) on our ability and willingness to trust, and thus believe, information that is in fact accurate (see Fallis Reference Fallis2020 for a discussion of this point). The development of true, or accurate, beliefs relies at least in part on the ability of the public to discern accurate information and reject false information, and to distinguish relevant from irrelevant information, and digital literacy training to improve these skills is an important aspect of the battle against misinformation (Scherer and Pennycook Reference Scherer and Pennycook2020). Our analysis adds a new perspective on the issue of false beliefs, suggesting an additional avenue through which such beliefs can arise, and additional strategies that can be used to minimize the development of false beliefs.

Figure 3.1 Visual themes related to everyday misinformation.

Setting the Context

To begin, a clarification: we are in no way denying the value of accurate information or the cost of false information. Both tend to have the effects one would anticipate: accurate information tends to foster accurate beliefs, and false information tends to foster false ones. Efforts to reduce the spread of misinformation (Bak-Coleman et al. 2022; Kim et al. 2018), to identify and correct false information (Rubin et al. 2019; Vraga and Bode 2017), and to assist information consumers in doing the same (Sharon and Baram-Tsabari 2020; Vraga, Tully, and Bode 2020) are now and will remain tremendously important.

Nevertheless, we contend that even if these goals are perfectly achieved, misapprehensions will persist – because people are not simply passive receivers of information. The act of [mis]informing does not end with [mis]information – a great deal happens in the mind after information is encountered. When someone receives information, it is filtered through cognitive processes, including strategies and heuristics that can introduce bias, and incorporated into existing belief structures. In other words, the information is used. Our focus is on those moments, and processes, of information use – what Savolainen (Reference Savolainen2009) terms the typically “unspecified ‘appendix’ of information seeking” (116) – and the impact that these have on beliefs.

Thus, in this chapter, we examine the impact of information use in developing and sustaining misapprehensions – even when the information being processed is not itself misleading or inaccurate. Examples of this type of false belief abound. For example, many people incorrectly estimate the frequency of murder to be much higher than that of suicide (Fischhoff, Slovic, and Lichtenstein 1977), and travelers often express greater concern about flying than about driving, even though driving is the riskier mode of travel. There are, however, no false news reports or other statements that assign a higher probability to murder than to suicide, or a greater risk to flying than to driving. The misapprehensions arise instead from the ways in which decision makers cope with the limitations of what Herbert Simon termed "bounded rationality" (Wheeler 2018). These limitations lead us to rely on mental shortcuts, or heuristics, that support the best possible decisions with the limited resources – attention, memory, and information – that we have at our disposal. Although these heuristics generally work well, they can lead to severe and systematic errors in thinking, or cognitive biases (Tversky and Kahneman 1974, 1124). These biases, in turn, can lead us to develop or persist in incorrect conclusions and false beliefs.

We explore these issues in the specific context of the COVID-19 pandemic, but the arguments, and conclusions, have wide application. We structure the chapter around specific misconceptions, exploring how cognitive biases and heuristics can influence these misperceptions, and how careful information design and education can ameliorate those effects.

Misconceptions in the Context of the COVID-19 Pandemic

Since the onset of the coronavirus pandemic in 2020, much research has documented the existence and impact of COVID-related misinformation (e.g., Roozenbeek et al. Reference Roozenbeek, Schneider and Van Der Linden2020; Tasnim, Hossain, and Mazumder Reference Tasnim, Hossain and Mazumder2020). In this section, we examine how people’s biased processing of accurate COVID-related information contributed to five misconceptions observed over the course of the pandemic. We explain how the misconceptions stem from different kinds of cognitive biases, and outline information design and educational strategies that could potentially reduce the biases’ effects. We also show how the biases influenced people’s willingness to take up preventive actions, such as wearing masks, complying with social distancing guidelines, and getting vaccinated. Within the governing knowledge commons framework (Frischmann, Madison, and Strandburg Reference Frischmann, Madison, Strandburg, Frischmann, Madison and Strandburg2014), these preventive actions constitute distinct “action arenas” (Ostrom Reference Ostrom, Menard and Shirley2005) where participants make decisions with varying outcomes.

Misconception 1: “COVID-19 Isn’t Really a Problem – There Are Only a Few Cases”

Even in the earliest stages of the COVID-19 crisis, there was widespread understanding that case counts were growing – but there was also widespread and severe underestimation of the rate of growth, and thus severe underestimation of the number of future COVID-19 cases (Banerjee and Majumdar Reference Banerjee and Majumdar2020; Villanova Reference Villanova2022). Part of the explanation lies in exponential growth bias: a general tendency to underestimate the impact of exponential growth. This bias is well-demonstrated in a children’s book by Demi entitled One Grain of Rice. According to the story, an Indian peasant girl was offered a reward of her choice for service to the king. Her request, which appeared modest to the sovereign, was for a single grain of rice on the first day, and for each of the following thirty days an additional amount of double the number of grains received the day before. The greedy king readily agreed, thinking he had struck a very good bargain. On the last day of the month, however, the girl received four storehouses full of rice in addition to all that had been provided up to that time. In the story, the king has fallen prey to exponential growth bias, and the young girl has used his misconception to her advantage to feed her entire community.
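The story's arithmetic is easy to verify; a minimal sketch (the function names are ours, not from the book):

```python
# Doubling rice grains for 30 days, as in Demi's "One Grain of Rice":
# one grain on day 1, then each day double the previous day's amount.

def grains_on_day(day: int) -> int:
    """Grains received on a given day (day 1 -> 1 grain)."""
    return 2 ** (day - 1)

def total_grains(days: int) -> int:
    """Cumulative grains received over the whole period."""
    return sum(grains_on_day(d) for d in range(1, days + 1))

print(grains_on_day(30))  # 536870912 grains on the final day alone
print(total_grains(30))   # 1073741823 grains in total - over a billion
```

A reward that looks trivially small on day 1 exceeds a billion grains within a month, which is exactly the scale of error that exponential growth bias produces.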

Unfortunately, the impact of exponential growth bias is not as positive when it comes to responding to the COVID-19 crisis. Especially in the initial stages, the spread of infectious diseases follows an exponential function, with a few positive cases exploding into a widespread pandemic if the disease is, as with COVID-19, sufficiently transmittable (Banerjee, Bhattacharya, and Majumdar Reference Banerjee, Bhattacharya and Majumdar2021). Initially, the number of reported COVID-19 infections was low, and a focus on this low number, coupled with a predictable misapprehension of the impact of exponential growth, led many to underestimate the severity of the situation (Lammers, Crusius, and Gast Reference Lammers, Crusius and Gast2020). During the outbreak of March 2020, for example, the number of coronavirus patients in the United States doubled about every three days; however, a study conducted during this period showed that American participants mistakenly perceived the growth of cases of the virus as linear (Lammers, Crusius, and Gast Reference Lammers, Crusius and Gast2020). This misperception could, in turn, hinder the adoption of measures to fight and contain the pandemic. Research has shown that, in the comprehension of COVID-19 disease data, the exponential growth bias can predict noncompliance with safety measures such as handwashing, mask-wearing, and the use of sanitizers (Banerjee, Bhattacharya, and Majumdar Reference Banerjee, Bhattacharya and Majumdar2021). Moreover, interventions that correct the misperception of exponential coronavirus growth have been shown to significantly increase support for social distancing (Lammers, Crusius, and Gast Reference Lammers, Crusius and Gast2020).
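The gap between exponential reality and a linear mental model can be made concrete with a small sketch; the starting count and the simple "linear extrapolation" model below are illustrative assumptions of ours, not the study's actual data or method:

```python
# Case counts that double every three days, versus a linear mental
# model that extrapolates only the first doubling period's increase.

def exponential_cases(initial: int, doubling_days: int, day: int) -> int:
    # True exponential growth: one doubling per `doubling_days`.
    return initial * 2 ** (day // doubling_days)

def linear_cases(initial: int, doubling_days: int, day: int) -> int:
    # Linear misperception: "it added `initial` cases in the first
    # period, so it will add that same amount every period."
    return initial + initial * (day // doubling_days)

for day in (3, 15, 30):
    print(day, exponential_cases(100, 3, day), linear_cases(100, 3, day))
# By day 30, the exponential count (102,400) dwarfs the
# linear estimate (1,100) by nearly a factor of 100.
```

The two models agree for the first doubling period, which is part of why the linear intuition feels plausible early in an outbreak.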

Some instructional or educational strategies appear to reduce the effect of exponential growth bias. Lammers, Crusius, and Gast (Reference Lammers, Crusius and Gast2020) found that exponential growth bias could be reduced by asking audiences to read a few sentences that explained the bias (e.g., “keep in mind that many people forget that the speed by which the corona virus spreads, increases each day …”). Even greater reduction in the bias was observed in an experimental context where participants were encouraged to “step through” case number increases when asked to estimate a total number at a specific future date given an initial case count and doubling time (e.g., estimate cases at day fifteen given five initial cases and a doubling time of three days).

Exponential growth bias can also be reduced by careful information design. There is some evidence that presenting growth information in terms of doubling times rather than growth rates (i.e., “the number of cases is expected to double every fifteen days” as compared to “cases are expected to grow at a rate of 5 percent per day”) can reduce exponential growth bias (Schonger and Sele Reference Schonger and Sele2021). Graphical representations of exponential growth are especially difficult to grasp (Berenbaum Reference Berenbaum2021), and the dominant mode of presenting COVID-19 case numbers in the print and online media has been graphical (Banerjee, Bhattacharya, and Majumdar Reference Banerjee, Bhattacharya and Majumdar2021). Banerjee, Bhattacharya, and Majumdar (Reference Banerjee, Bhattacharya and Majumdar2021) experimentally demonstrated that the representation of past COVID-19 case numbers in numerical form (see Table 3.1) significantly decreased the exponential growth bias relative to graphical representation (see Figure 3.2). From this, they suggest that data should be shown via raw numbers (see Table 3.1) and presented “alongside familiar ‘flatten-the-curve’ style graphics” (8).

Table 3.1 Hypothetical growth of COVID-19 cases over fifteen days with a doubling time of three days, presented numerically

Day  | Number of COVID-19 cases
1    | 5
3    | 10
6    | 20
9    | 40
12   | 80
15   | 160

Figure 3.2 Hypothetical growth of COVID-19 cases over fifteen days with a doubling time of three days, presented graphically.

Misconception 2: “COVID-19 Won’t Happen To Me”

Epidemiologists raised the threat of a global pandemic as early as January 2020 and “announced that more than 40–70 % of the world population could be infected within the end of the year” (Bottemanne et al. Reference Bottemanne, Morlaàs, Fossati and Schmidt2020, 2); in the same month, the Chinese city of Wuhan, with a population of 11 million people, went into full quarantine (Cai Reference Cai2020). Despite this and other strong and mounting objective evidence of widespread infection, people underestimated their personal risk of contracting and/or transmitting COVID-19. Survey data collected in several European countries in February 2020, for example, revealed that a large majority of participants believed that their risk of contracting the coronavirus was around 1 percent (Raude et al. Reference Raude, Debin and Bonmarin2020). (By comparison, as of February 2022, approximately 58 percent of the United States population – and 75 percent of US children – had contracted COVID-19, according to clinical testing of blood samples for SARS-CoV-2 antibodies (Ducharme Reference Ducharme2022).) Many also viewed themselves as less likely than average, and less likely than others, to contract the disease. During the early phases of the outbreak in March 2020, German adults who had not yet tested positive for COVID-19 perceived family members and friends to be at a higher risk of infection than themselves (Gerhold Reference Gerhold2020), and Polish university students estimated their likelihood of contracting the coronavirus as lower than that of their peers (Dolinski et al. Reference Dolinski, Dolinska, Zmaczynska-Witek, Banach and Kulesza2020). Meanwhile, one respondent in an American poll claimed that they were “cautiously optimistic” the United States would “nip [the virus] in the bud” before it could spread as it had in Europe (Allyn and Sprunt Reference Allyn and Sprunt2020).

This unwarranted and elevated perception of personal safety is consistent with the optimism bias (Weinstein Reference Weinstein1983). Optimism bias leads us to think ourselves more likely than average to experience positive outcomes, and less likely than average to experience negative outcomes. This bias appears to arise from an inappropriate comparison group – when thinking about our risk of serious consequences of disease, for example, we implicitly compare ourselves to others who we believe to be less healthy and more vulnerable than ourselves, rather than to a more representative “average” other – and by comparison, we estimate ourselves as less likely to experience the negative effects (Weinstein Reference Weinstein1983).

The misapprehension of reduced personal risk could lead people to ignore public health recommendations and refrain from personal hygiene practices and precautions (Pascual-Leone et al. Reference Pascual‐Leone, Cattaneo, Macià, Solana, Tormos and Bartrés‐Faz2021; see Wise et al. Reference Wise, Zbozinek, Michelini, Hagan and Mobbs2020 for a discussion of the relationship between risk perception and protective behaviors). Indeed, research shows that individuals with high optimism bias engaged less in protective behavioral changes during the COVID-19 pandemic in 2020 (Fragkaki et al. Reference Fragkaki, Maciejewski, Weijman, Feltes and Cima2021).

The delivery of objectively accurate risk information, even by trusted messengers, will not correct this bias (Felgendreff et al. Reference Felgendreff, Korn, Sprengholz, Eitze, Siegers and Betsch2021); however, early research on the optimism bias indicates that personalized risk comparators on a group (Weinstein Reference Weinstein1983) or individual (Alicke et al. Reference Alicke, Klotz, Breitenbecher, Yurak and Vredenburg1995) basis can help to correct risk perceptions. This targeted communication approach, with messages designed for specific individuals or subpopulations, will be effective in reducing optimism bias when audiences can be segregated. For general (untargeted) communications, there is specific research in the COVID-19 context, consistent with prior research in other domains, showing that unrealistic optimism is reduced by communications (video or text) that emphasize the risk-reducing activities of others (e.g., compliance with medical recommendations for social distancing; Dolinski et al. Reference Dolinski, Kulesza, Muniak, Dolinska, Węgrzyn and Izydorczak2022). In general, interventions that explicitly or implicitly provide an appropriate comparison group for personal risk estimation will help to mitigate optimism bias.

Misconception 3: “Vaccines Don’t Work”

Research shows that, in real-world settings, COVID-19 vaccines offer a high degree of protection against SARS-CoV-2-related diseases (Zheng et al. Reference Zheng, Shao, Chen, Zhang, Wang and Zhang2022), and the efficacy of these vaccines has been reported extensively and positively in the mainstream media (e.g., Hayes Reference Hayes2021; Thomas and Hanna Reference Thomas and Hanna2021). Nonetheless, many remain “vaccine hesitant” (Kirzinger et al. Reference Kirzinger, Sparks and Brodie2021), choosing not to take the vaccination. One common argument among the vaccine hesitant is that “vaccines don’t work” – and the cited evidence is that the vaccinated continue to make up “most people admitted to hospital with Covid-19” (Benedictus Reference Benedictus2021),Footnote 1 and a large proportion of those dying from the disease (latest data from Canada indicates that from December 2020 to June 2022 there have been 10,385 COVID-related deaths among the unvaccinated, and 7,423 COVID-related deaths among the vaccinated: Statista 2022). If vaccines work, the reasoning goes, those who are vaccinated should not be contracting COVID-19, should not be hospitalized with COVID-19, and should not die from the disease. This is, of course, a misunderstanding of the math and science. No vaccine is 100 percent effective, but an effective vaccine reduces the likelihood of contracting a disease, and can also reduce the severity of the disease if it is contracted. According to the best available data, COVID-19 vaccines provide both types of protection (Centers for Disease Control and Prevention 2022). So, why the persistent belief that the vaccines don’t work?

This misperception arises at least in part from a failure to take into account the vaccination base rate, or the proportion of people in the entire population who are vaccinated compared to those who are unvaccinated. Imagine a virus that will, without any protection, affect 30 percent of the population. Imagine also that 95 percent of the population have received a vaccine for the virus. If the vaccine provides no protection, then 30 percent of the unvaccinated and 30 percent of the vaccinated will be infected, and if we consider the positive cases only, 95 percent of those will be individuals who have received the vaccine (because the vaccine offers no protection, and 95 percent of the population has received the vaccine: Egger and Egger Reference Egger and Egger2022). If the vaccine provides perfect protection (which never happens), then all cases will be among the unvaccinated. In general, however, reality falls somewhere in between. Say, for example, that the vaccine is 80 percent protective against disease; in other words, only 20 percent of those who are vaccinated and who would otherwise have been infected will actually get the virus. In this case, 30 percent of the unvaccinated population will still get the disease, but the rate among the vaccinated will fall to 6 percent (the 20 percent for whom the vaccine is ineffective, out of the 30 percent who would otherwise be expected to become infected).

So far, so good – but here comes the surprising part: under these circumstances, the large majority of infections will still occur in people who are vaccinated. In fact, almost 80 percent of the people who get the condition will have received the vaccination. Put another way, of 1,000 randomly selected people in this population, 15 unvaccinated people are expected to contract COVID-19 (30 percent of the 5 percent of the population who are unvaccinated), and 57 vaccinated people are expected to get the disease (20 percent of the 30 percent who would otherwise get COVID-19, among the 95 percent of the population who are vaccinated). In general, if the proportion of infected people who are vaccinated is anything less than the proportion of the population vaccinated against the disease, then the vaccine is providing protection – but that doesn't seem right when the large majority of those we see with the condition have been vaccinated against it.
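The arithmetic can be checked directly; a minimal sketch using the chapter's hypothetical numbers:

```python
# The worked example from the text: a population of 1,000 in which
# 95% are vaccinated, a virus that would infect 30% of an unprotected
# population, and a vaccine that is 80% effective.
population = 1_000
unvaccinated = 50          # 5% of the population
vaccinated = 950           # 95% of the population
attack_rate = 0.30         # infection rate with no protection
effectiveness = 0.80       # vaccine blocks 80% of would-be infections

unvax_cases = round(unvaccinated * attack_rate)                   # 15
vax_cases = round(vaccinated * attack_rate * (1 - effectiveness)) # 57

share_vaccinated = vax_cases / (vax_cases + unvax_cases)
print(unvax_cases, vax_cases, round(share_vaccinated, 2))  # 15 57 0.79
```

Even though the vaccinated account for 79 percent of infections, their individual risk (57/950, or 6 percent) is a fifth of the unvaccinated risk (15/50, or 30 percent) – the base rate of vaccination, not vaccine failure, drives the headline count.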

Base rates – in this case, the rate of vaccination in the population – have a surprisingly large and counterintuitive impact on outcomes, leading to inaccurate intuitions in a wide variety of circumstances. Thus, for example, we tend to over-interpret the results of accurate screening tests for rare conditions (Burkell 2004), incorrectly assuming that a positive screening test is a strong indicator that the condition is present. Screening tests, by definition, test for relatively rare conditions with a fast and low-cost initial test designed to capture the large majority of true positive cases. These tests are necessarily imperfect, and are designed to avoid false negative results (saying that someone does not have the condition when in fact they do) at the expense of an increased risk of false positive results (saying that someone has the condition when in fact they do not), which can only be resolved through further testing. As a consequence, and as illustrated in Table 3.2, a large proportion of positive screening test results are actually false (in the case presented, 95 of 145 positive results, or 66 percent) – that is, the individual does not in fact have the identified condition. This situation arises precisely because there are so many people without the condition – or, to put it another way, because the base rate of the condition in the population is low, with the result that even a small tendency to give a false positive result creates a large absolute number of such results.

Base rate neglect may also play a role in the alarm recently expressed by bicycling advocacy groups over an increase in bicycling deaths (Advocacy Advance 2021).Footnote 2 While there are undoubtedly many explanations for this shift, one that is not regularly acknowledged is the increase in the number of cyclists on the road (Mazerolle 2021).Footnote 3 This represents an increase in the base rate of cycling as a mode of transportation, which would increase the number of observed cycling accidents even if no other risks to cyclists were to change.

Table 3.2 Hypothetical screening test results for a population of 1,000; the condition is present in 5% of the population, and the test yields 0% false negatives (sensitivity 100%) and 10% false positives (specificity 90%)

                 | Screen negative            | Screen positive            | Total
Actual negative  | True negative results: 855 | False positive results: 95 | Actual negative cases: 950
Actual positive  | False negative results: 0  | True positive results: 50  | Actual positive cases: 50
Total            | Screen negatives: 855      | Screen positives: 145      | 1,000
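Table 3.2's cell counts follow directly from the stated prevalence, sensitivity, and specificity; a short sketch:

```python
# Reproducing Table 3.2: 1,000 people screened, condition prevalence
# 5%, sensitivity 100% (no false negatives), specificity 90%.
population = 1_000
with_condition = 50        # 5% prevalence
without_condition = 950

true_positives = 50                                 # sensitivity 100%
false_positives = round(without_condition * 0.10)   # 10% of the healthy flagged
screen_positives = true_positives + false_positives

false_share = false_positives / screen_positives
print(screen_positives, round(false_share, 2))  # 145 positives, 66% of them false
```

Because the healthy group is nineteen times larger than the affected group, even a modest 10 percent false positive rate produces nearly twice as many false positives as true ones.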

Base rate neglect may explain, at least in part, why a sizable segment of unvaccinated people perceive COVID-19 vaccines to be ineffective (Kirzinger et al. Reference Kirzinger, Sparks and Brodie2021) despite credible proof to the contrary.Footnote 4 Evidence of this misperception can be easily observed, even among healthcare professionals (Kampf Reference Kampf2021). Governmental data showing a higher proportion of vaccinated than unvaccinated people in COVID-19 deaths has been used to argue for the inefficacy of vaccines online (e.g., The Exposé 2022).Footnote 5 Base rate neglect was also at the root of the February 2022 pronouncement by a high-ranking Canadian politician that the COVID-19 vaccine was no longer effective (Quon Reference Quon2022).Footnote 6 In that case, data showing that the number of new COVID-19 cases in the province of Saskatchewan was “about the same in [sic] vaccinated and unvaccinated people” was presented as evidence of vaccine inefficacy. This conclusion ignored the critical fact that nearly 80 percent of the province’s eligible population had received at least two vaccine doses, and even a small proportion of breakthrough cases would result in a relatively large number of vaccinated individuals testing positive for COVID-19.
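A small worked example (with hypothetical attack rate and effectiveness figures of ours, not Saskatchewan's actual data) shows how roughly equal case counts can coexist with an effective vaccine:

```python
# With 80% vaccine coverage - roughly the figure cited in the text -
# a vaccine that cuts infection risk by 75% still yields equal case
# counts in the vaccinated and unvaccinated groups.
population = 1_000_000
vaccinated = 800_000       # 80% coverage
unvaccinated = 200_000
attack_rate = 0.10         # hypothetical infection risk if unprotected
effectiveness = 0.75       # hypothetical vaccine effectiveness

unvax_cases = round(unvaccinated * attack_rate)                    # 20,000
vax_cases = round(vaccinated * attack_rate * (1 - effectiveness))  # 20,000

# Equal counts - yet per-person risk is four times higher
# for the unvaccinated:
risk_ratio = (unvax_cases / unvaccinated) / (vax_cases / vaccinated)
print(unvax_cases, vax_cases, risk_ratio)
```

Comparing raw case counts without the base rate of vaccination makes a protective vaccine look useless; comparing per-person risk reveals the fourfold protection.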

At the same time, many recognize that these perceptions arise from base rate neglect, and efforts to correct this misperception have appeared in a variety of sources including news articles (e.g., Devis Reference Devis2021; Ferreira Reference Ferreira2022)Footnote 7Footnote 8 and social media posts (e.g., Rumilly Reference Rumilly2021).Footnote 9 Approaches to addressing base rate neglect include graphics that “zoom out” to show the base rate of vaccinated and unvaccinated groups in a hypothetical population (Rumilly Reference Rumilly2021), allowing viewers to visualize how the unvaccinated population is disproportionately hospitalized with COVID-19, even if more hospitalized patients are vaccinated than unvaccinated. Animations demonstrating changes in the percent of vaccinated (as opposed to unvaccinated) COVID-19 hospitalizations with changes in the base rate of vaccination in the population have also been used to achieve this outcome (Panthagani Reference Panthagani2022).Footnote 10

Many of these alternative visualizations are presented in the context of “corrective” or educational information (e.g., in a blog intended to “help you stay on the frontline of health information”: Panthagani Reference Panthagani2022). These are effective tools for motivated audiences focused on the accuracy of their own perceptions, but it is important also to ensure that information about case counts is initially presented in ways that minimize base rate neglect and thereby minimize the inaccurate interpretation that vaccines do not provide protection. Case counts presented with vaccination status but without drawing attention to the base rate of vaccination in the population invite misinterpretation arising from base rate neglect. At the very least, therefore, media and other public reports should include the relevant base rate information along with counts. A better approach is to take advantage of presentation formats that have been demonstrated to lead to better understanding. Specifically, frequency histograms (Burkell Reference Burkell2004) that show infected and uninfected members of vaccinated and unvaccinated populations have been demonstrated to reduce base rate neglect.

Misconception 4: “Vaccines Aren’t Safe”

In general, vaccines are safe and effective at reducing disease, having undergone rigorous testing for both efficacy and negative side effects before approval. Although the COVID-19 vaccines were developed in a relatively short time in response to an acute health crisis, they are no exception to these general rules, and have been widely demonstrated to reduce incidence and severity of COVID-19 infection with few if any negative health consequences (Centers for Disease Control and Prevention 2022; Henry and Glasziou Reference Henry and Glasziou2021). Nonetheless, and consistent with public response to other vaccines, there remains a portion of the population who refuse the vaccine, citing safety concerns (King et al. Reference King, Rubinstein, Reinhart and Mejia2021; Monte Reference Monte2021).Footnote 11 The issue of vaccine safety is complex, and indeed there are questions about the long-term impact, including unintended consequences, of the mRNA vaccines (Seneff and Nigh Reference Seneff and Nigh2021). In many cases, however, hesitancy is linked to an overestimation of the incidence of vaccine-related adverse effects, including among unvaccinated health professionals (Ehrenstein et al. Reference Ehrenstein, Hanses, Blaas, Mandraka, Audebert and Salzberger2010).

This inaccurate risk perception can be attributed at least in part to the availability bias, which is a tendency to base frequency or probability estimates on the ease with which specific instances can be recalled or imagined (Tversky and Kahneman Reference Tversky and Kahneman1973). When less likely outcomes are more salient – for example, by virtue of the frequency with which they are reported – the probability of these events can be overestimated. Thus, for example, we will tend to overestimate the likelihood of an automobile accident immediately after witnessing one, and we will overestimate the likelihood of an airplane crash if one has recently been covered in the news. The availability bias is stronger for negative than for positive events, perhaps because those events are more salient (Stapel, Reicher, and Spears Reference Stapel, Reicher and Spears1995). Media coverage is a significant factor in the availability bias, and has been shown to influence health concerns among the general public (Brezis, Halpern-Reichert, and Schwaber Reference Brezis, Halpern-Reichert and Schwaber2004) and even physician diagnostic decisions (Schmidt et al. Reference Schmidt, Mamede, Van Den Berge, Van Gog, Van Saase and Rikers2014).

A rare serious adverse effect of a vaccine covered in the media provides “a vivid and emotionally compelling anti-vaccination message, likely to be recalled during decision making” that could cause one to overestimate the probability of an adverse effect following immunization (Azarpanah et al. 2021, 8). In March 2021, for example, reports emerged in the mainstream media of a possible link between AstraZeneca’s COVID-19 vaccine and rare but potentially fatal blood clots (e.g., Gronholt-Pedersen 2021; Reuters staff 2021). Some of these reports included descriptions of individual cases, such as that of a sixty-year-old Danish woman who developed a blood clot and died after vaccination (Gronholt-Pedersen 2021). In the wake of this media coverage, one March 2021 poll showed that Canadians were much warier of the AstraZeneca vaccine than other vaccines approved for use in Canada (Bryden 2021). Although blood clots among those with low blood platelet counts were eventually confirmed to be a side effect of the AstraZeneca vaccine, the European Medicines Agency (2021) stressed that the complication was “very rare” and that the vaccine’s benefits in preventing COVID-19 outweighed the risks of side effects.

In a context where low-probability events, such as vaccine side effects, are more likely to be reported than higher-probability outcomes, such as vaccination without incident, the availability bias will lead audiences to overestimate the risk of the low-probability event. Counteracting this bias requires that the alternative – safe and incident-free vaccinations – be made more salient and more easily recalled. Strategies to achieve these outcomes include multiple repetitions of the items judged less frequent, and/or increasing the memorability of reports of incident-free vaccinations (Lewandowsky and Smith 1983). The latter could be supported, for example, by the intuitively appealing strategy of reporting stories of high-profile individuals who have received vaccination and not experienced side effects. Some strategies to reduce availability bias focus on instruction and training (e.g., Mamede et al. 2020). These approaches are most useful when specific individuals are engaged in repeated risk estimation tasks (e.g., physicians making a diagnosis), and often involve guided reflection on decision-making processes along with specific stepwise strategies, including the consideration of alternative conclusions (Prakash, Sladek, and Schuwirth 2019).

Misconception 5: “We’ve Had the Pandemic – So We’re Safe for a Long Time”

No one knows when the next pandemic will occur, and our expectations on this point matter. Belief that another pandemic is likely in the near future will spur prevention and resilience activities, while the assumption that it will be a long time before we have another pandemic will foster complacency and a disinterest in taking up risk mitigation activities. Like deciding whether to take actions to curb the spread of COVID-19, choosing to engage in or avoid measures that could prevent the next pandemic can be conceptualized as a distinct action arena. Although there is no “correct” prediction regarding when we will next see a pandemic, it is important that perceptions are unbiased and appropriately calibrated to objective and current information about risk. These perceptions, however, are influenced by biases and heuristics, and it is crucial that we understand what these are, how they operate, and how they will influence the perceived risk of a new pandemic.

There are numerous cognitive biases that could and likely do influence our expectations about when another pandemic will occur. One of these is the gambler’s fallacy, well demonstrated in the history of the “Lion’s Share” slot machine at the MGM Grand hotel-casino. In 2014, this slot machine paid out a progressive jackpot of $2.4 million after collecting money for twenty years from unsuccessful gamblers (CNN Wire 2014). Some, like the author of one blog post, believed that the long “dry” period meant the slot machine was “due for a win” (Best US Casinos, n.d.). Assuming a fair machine with outcomes determined randomly, this intuition is unequivocally wrong. Players at that particular machine were no more likely than those at any other machine with the same odds to have a “winning” spin – because like all independent random events, slot machine spins have no “memory.” For each new instance, the probability of winning is exactly the same. The contrary belief – essentially the belief that luck (good or bad) must change – is called the gambler’s fallacy (Tversky and Kahneman 1974).
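The “no memory” property of independent events can be checked directly with a short simulation. The sketch below is illustrative only: the per-spin win probability is an assumed value, not the actual odds of the Lion’s Share machine. It compares the win rate on spins that follow a long losing streak with the base rate; if spins are independent, the two are statistically identical.

```python
import random

# Simulate 1,000,000 independent spins of a fair machine and compare the
# win rate on spins that follow a long losing streak with the per-spin
# probability. A long dry spell tells us nothing: the machine is never "due."
rng = random.Random(42)   # fixed seed for reproducibility
p_win = 0.05              # assumed per-spin win probability (illustrative)

wins_after_streak = 0
spins_after_streak = 0
losing_streak = 0

for _ in range(1_000_000):
    won = rng.random() < p_win
    if losing_streak >= 20:          # this spin follows 20+ straight losses
        spins_after_streak += 1
        wins_after_streak += won
    losing_streak = 0 if won else losing_streak + 1

rate = wins_after_streak / spins_after_streak
print(f"win rate after 20+ losses: {rate:.3f}")   # ≈ 0.05, the base rate
```

The empirical rate matches the base rate to within sampling error, however long the preceding streak.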

The gambler’s fallacy isn’t limited to the casino. The fallacy leads people to believe that improbable events operate on an implicit “schedule,” and thus become increasingly likely as time passes without an event (in Eastern Canada, after a few years without a big snowstorm you’ll hear the refrain “we’re due for a big one”) and, once they occur, are unlikely to be repeated, at least in the near future (think about the familiar adage “lightning never strikes twice”). Soccer goalkeepers fall prey to the gambler’s fallacy when faced with penalty kicks (Wogan 2014), and asylum judges, loan officers, and baseball umpires show signs of the gambler’s fallacy when making multiple decisions (Chen, Moskowitz, and Shue 2016).

The gambler’s fallacy appears to be operating in at least some expectations regarding pandemics. COVID-19 has been described as a “once-in-a-lifetime” (e.g., Guterres, n.d.) or “once-in-a-century” (e.g., Gates 2020) event. Even World Health Organization (WHO) Director General Tedros Adhanom Ghebreyesus characterized the pandemic as a “once-in-a-century health crisis” during a 2020 meeting of the WHO’s emergency committee (Reuters staff 2020). This belief might partially arise from the fact that the Spanish flu – which has attracted “unprecedented interest” due to COVID-19 and has clear parallels to the COVID-19 pandemic itself (Simonetti, Martini, and Armocida 2021, E613) – broke out approximately 100 years prior, in 1918. According to some:

the gambler’s fallacy is almost irresistible when we think about pandemics. We think the fact that a new disease has emerged from the natural world so recently, and caused such a terrible catastrophe, means that we’re due some luck. Surely we must be due a long reprieve before the next one. Surely we will have time to prepare.

Even if pandemics were completely random events – and they are not – this reasoning would be flawed: the fact that we had a pandemic 100 years ago and another one just recently does not suggest that the next is due in another 100 years. Indeed, research that models pandemic risk “shows that the frequency and severity of spillover infectious disease – directly from wildlife host to humans – is steadily increasing” (Smitham and Glassman 2021). Contrary to the assumption that it will be a long time before another severe pandemic, one research team estimates “the annual probability of a pandemic on the scale of COVID-19 in any given year to be between 2.5 and 3.3 percent, which means a 47–57 percent chance of another global pandemic as deadly as COVID in the next 25 years” (Smitham and Glassman 2021). Moreover, and contrary to the gambler’s fallacy, barring factors that increase the annual risk (e.g., increased zoonotic spillover; see Birch 2021), the chance of a new pandemic is identical in each of the next 100 years, and a new pandemic is just as likely after one virus-free year as it is after 99 virus-free years.
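The cited range follows from the standard calculation for independent annual risks: with a constant annual probability p, the chance of at least one event in n years is 1 − (1 − p)^n, the complement of n consecutive event-free years. A minimal sketch reproducing the 47–57 percent figure from the 2.5–3.3 percent annual estimates:

```python
# With a constant annual pandemic probability p, the chance of at least one
# pandemic in n years is 1 - (1 - p)**n: the complement of n event-free years.
def prob_within(p_annual: float, years: int) -> float:
    return 1 - (1 - p_annual) ** years

# Reproduce the cited range: 2.5-3.3% annual risk over a 25-year horizon.
low = prob_within(0.025, 25)
high = prob_within(0.033, 25)
print(f"{low:.0%} to {high:.0%}")   # prints "47% to 57%"
```

Note that nothing in this calculation depends on how long ago the last pandemic occurred: the per-year probability stays p regardless of how many virus-free years have passed, which is exactly what the gambler’s fallacy denies.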

The gambler’s fallacy (also known as belief in the law of small numbers; Tversky and Kahneman 1971) is a remarkably persistent cognitive bias (see, e.g., Bishop, Thompson, and Parker 2022). Basic training regarding random events does not appear to reduce the bias (Beach and Swensson 1967). There is some evidence, however, that the perceptual grouping of events can influence the gambler’s fallacy. In particular, if an event (say, for example, the upcoming year) is seen as grouped with or a continuation of a prior series of events (e.g., the past 100 years), then the gambler’s fallacy is evident. If, however, the event is viewed as the beginning of a new series, the gambler’s fallacy is reduced or even eliminated (Roney and Trick 2003). In the COVID-19 context, this would suggest that the underestimation of the risk of a new pandemic could be reduced if risk discussions were forward looking (e.g., “over the next 100 years”), rather than backward looking (“in the past 100 years”).

Information Is Not a Panacea

By now, you should be convinced that inaccurate beliefs can and do arise in the face of accurate information – and that careful information design and/or education can reduce this possibility. At this point, we want to address a different question: specifically, is correct(ive) information a panacea for the misinformed?

The seemingly obvious approach to correcting misperceptions is to provide people with accurate information. This approach is consistent with the knowledge deficit model (or information deficit model), which suggests that public skepticism toward modern science and technology is mainly caused by insufficient information – or a “knowledge deficit” – which can be remedied by more information about these topics (Dickson 2005; Simis et al. 2016). In the case of COVID-19 vaccine hesitancy, for example, some have argued that clear and transparent communication about vaccine risks and benefits could increase vaccine uptake among the general public (Kerr et al. 2021).

The knowledge deficit model generally attributes misapprehension to a lack of appropriate information and assumes that “a thorough and accessible explanation of facts” (Ecker et al. 2022, 13) should overcome incorrect beliefs, including those arising from objectively false information and those that are the product of the types of cognitive biases outlined earlier in this chapter. Research indicates, however, that merely conveying accurate information is often insufficient to correct misperceptions, for two interrelated reasons. First, we have a tendency to selectively attend to and seek information that is consistent with preexisting beliefs, in a bias termed the confirmation bias (Jones and Sugden 2001; Wason 1960). Second, objectively false information that is encountered and accepted continues to influence our thinking, even in the face of new and accurate information, including when that corrective information has been accepted as true – a phenomenon known as the continued influence effect (CIE) (Ecker et al. 2022; Lewandowsky et al. 2012; Wittenberg and Berinsky 2020).

Confirmation Bias

Confirmation bias describes the tendency to selectively seek out and attend to information that is consistent with previously held beliefs or attitudes (Jones and Sugden 2001; Wason 1960); the associated evaluation bias is the tendency to evaluate belief-consistent information more positively. The effect of these two biases is to minimize cognitive dissonance (Festinger 1957), or the uncomfortable state of internal cognitive inconsistency that can arise when we are confronted with something inconsistent with our beliefs and/or attitudes. The effects of confirmation bias are exacerbated in the online social environment, where past information consumption choices influence future offerings, not only to the individual but also to others who are identified as “similar” based on complex algorithmic processing (Ling 2020). Confirmation bias contributes to opinion polarization through a feedback loop wherein even a slight bias or tendency in thinking (e.g., belief that vaccines are unsafe) is exaggerated by selective attention to supporting information, which in turn creates an even stronger tendency to ignore inconsistent messages (Modgil et al. 2021; Xu et al. 2022). Surprisingly, there is some evidence that the tendency to select confirmatory messages and to perceive them as more convincing is higher among individuals with higher literacy levels (in the case of the research, health literacy levels: Meppelink et al. 2019). Evaluation biases similarly lead to discounting of disconfirming evidence: information that is strongly inconsistent with previously held beliefs is evaluated as less credible, and is therefore less likely to be accepted, than information consistent with those beliefs (Christensen 2021).

Confirmation bias can be reduced (somewhat paradoxically) by making the corrective information more difficult to process – creating what some researchers have termed “disfluency” in information processing (Hernandez and Preston 2013; see also Rajsic, Wilson, and Pratt 2018). Disfluency is affected by features such as the visual clarity of text and is believed to promote a critical and analytical mindset (Hernandez and Preston 2013). Research has shown, for instance, that high school students score higher on exams when learning materials are presented in hard-to-read fonts (Diemand-Yauman, Oppenheimer, and Vaughan 2011). In two studies, Hernandez and Preston (2013) found that participants’ preexisting attitudes toward certain issues became less extreme when they read arguments on the issues in a disfluent format (e.g., light gray, bolded and italicized Haettenschweiler font vs. standard 12-point Times New Roman). The second study further determined that participants were only able to disconfirm their prior biases when they were not under “cognitive load” (i.e., distraction or time pressure) (Hernandez and Preston 2013).

In some cases, simply increasing the salience of corrective information can ameliorate false beliefs. Schwind and colleagues (Schwind et al. 2012; Schwind and Buder 2012) suggest that explicitly recommending information that is inconsistent with prior preferences or beliefs (e.g., by highlighting the results in an online search) can be effective in reducing or correcting those misapprehensions.

Confirmation bias may also play a role in situations where efforts to deliver corrective information “backfire.” Worldview backfire effects occur when corrections that challenge people’s worldviews bolster their belief in the original misinformation (Ecker et al. 2022; Wittenberg and Berinsky 2020). This phenomenon was originally identified by Nyhan and Reifler (2010), who found that corrections that contradicted individuals’ political beliefs increased their prior misperceptions.

Worldview backfire effects have been conceived of as a product of directionally motivated reasoning (Holman and Lay 2019; Wittenberg and Berinsky 2020). In short, individuals may be motivated to arrive at either an accurate conclusion or a particular, directional conclusion (Kunda 1990), and “worldview backfire effects transpire when directional motivations take precedence over accuracy goals” (Wittenberg and Berinsky 2020, 170). These effects are believed to be tied to both confirmation bias and disconfirmation bias, the latter of which involves calling to mind opposing arguments, or “counterarguing,” when faced with ideologically dissonant information (Wittenberg and Berinsky 2020). In essence, when a correction challenges false beliefs that are central to a person’s identity, that person may discount the accurate information and generate counterarguments to it (Ecker et al. 2022; Wittenberg and Berinsky 2020). For instance, someone identifying as an antivaxxer might perceive information proving that the risks of a vaccine do not outweigh the risks of a disease as an identity threat, and subsequently ignore this worldview-inconsistent evidence while selectively focusing on evidence that supports their own position (Ecker et al. 2022).

To avoid worldview backfire effects, corrections “should be tailored to their target audience: the subset of people for whom these corrections would feel the most threatening” (Wittenberg and Berinsky 2020, 171). Specifically, corrections should be framed to be consonant with their target audience’s values and worldviews (Lewandowsky et al. 2012; Wittenberg and Berinsky 2020). When attempting to convince someone with an “ecocentric” outlook that nanotechnology is safe, for example, the target may be more receptive to evidence of the technology’s safety if its use is portrayed as part of an effort to protect the environment; here, potential benefits and opportunities are highlighted rather than risks and threats (Lewandowsky et al. 2012). Corrections that threaten a target’s worldview can also be made more palatable by pairing them with a self-affirmation or identity affirmation, which involves a message or task that affirms one’s basic values (Lewandowsky et al. 2012) or highlights sources of self-worth (Ecker et al. 2022).

The Continued Influence Effect (CIE)

The fact that misinformation may continue to influence people’s thinking – even after a retraction has been acknowledged and recalled – is known as the continued influence effect (CIE) (Johnson and Seifert 1994; Seifert 2002). The CIE has been explained by dual-process theory and mental model theory (Wittenberg and Berinsky 2020). Dual-process theory assumes that memory retrieval can be automatic (fast and unconscious) or strategic (deliberate and effortful) (Wittenberg and Berinsky 2020). While automatic processing is “relatively acontextual, distilling information down to only its most essential properties,” strategic processing is needed “to retrieve specific details about a piece of information” (Wittenberg and Berinsky 2020, 174). From this perspective, it is possible that individuals could automatically retrieve a piece of misinformation and fail to recall relevant features, such as its source or perceived accuracy (Wittenberg and Berinsky 2020).

On the other hand, mental model theory suggests that people “construct models of external events in their heads, which they continuously update as new information becomes available” (Wittenberg and Berinsky 2020, 174). Retractions may create a gap in an individual’s model of an event, motivating them to continue to invoke the original misinformation (Ecker et al. 2022; Lewandowsky et al. 2012; Wittenberg and Berinsky 2020). This effect can be reduced by providing an alternative causal explanation for an event that fills the gap left behind by the retracted misinformation (Ecker et al. 2010; Ecker et al. 2022; Lewandowsky et al. 2012; Seifert 2002). For instance, to correct the misperception that a fire was caused by negligence, a causal explanation (“there is evidence for arson”) is more effective than a plain retraction (“there was no negligence”) (Ecker et al. 2022, 21). One study found that a specific warning describing the CIE reduced, but did not eliminate, participants’ continued reliance on outdated information (Ecker et al. 2010). When the warning was combined with a plausible alternative explanation for the retracted information, the CIE was further reduced – though still not fully eliminated (Ecker et al. 2010).

Moving Beyond Panaceas

At the outset of this section, we asked the question of whether information is a “panacea” for the misinformed. Not surprisingly, the answer is a resounding “no.” Complex problems such as the issue of mis/disinformation require complex, multifaceted, and multitiered solutions (Ostrom and Cox 2010), and are resistant to simple, single-strategy approaches. Addressing the pressing issue of false beliefs – and false information – will require interventions with, at the very least, information consumers, information providers and producers, information hosting platforms, and regulatory bodies including governments. Moreover, and critically, while interventions at the individual level, including training and appropriate information design, will be essential to reducing the impact of mis/disinformation, interventions with hosting platforms, including regulation by government and other bodies, will be critical to an effective and appropriate governance response.

Conclusion

To be misinformed is to hold false beliefs, and the most obvious source of false beliefs is misinformation. In some cases, however, false beliefs arise even when the information received and processed is entirely accurate – and delivering accurate information does not always or necessarily correct prior misconceptions.

In this chapter, we have demonstrated how information use – rather than the accuracy of information itself – can cause people to become and remain misinformed. In our discussion, we have focused on one specific context – that of COVID-19 – and within that context we have documented particular situations in which cognitive biases have (or could have) a meaningful effect on reasoning and decision making. Throughout the chapter, we have referred to many other circumstances in which cognitive biases can influence decisions, and information professionals should be aware of and responsive to these potential influences whenever there is a situation of judgment or decision making under uncertainty (Kahneman, Slovic, and Tversky 1982), including, for example, elections (and campaign materials), healthcare decisions, or decisions about resource allocation (e.g., to climate change initiatives). The list of cognitive biases we have highlighted in this discussion is illustrative rather than exhaustive, and there are many resources identifying additional biases and heuristics (e.g., Kahneman, Slovic, and Tversky 1982), including discussions of the use of these techniques to promote particular viewpoints or decisions (e.g., Thaler, Sunstein, and Balz 2013).

Cognitive biases and heuristics affect information processing, with impact on attitudes, beliefs, and choices. We rely on them because they work – making “good enough” reasoning possible in the context of cognitive limitations. But “good enough” is far from perfect, and from time to time – as discussed in this chapter – these biases and heuristics lead us astray. The bad news is that these biases are persistent and unconscious; the good news is that they can be to some extent mitigated by careful information design and education. Rather than provide an exhaustive review, our aim has been to highlight examples of biases in information processing that contribute to the formation and maintenance of false beliefs, and to identify possible strategies that reduce their impact. These strategies, among others, can help information professionals ensure that the accurate information they provide supports equally accurate beliefs. These strategies cannot replace other aspects of governance solutions, including platform regulation (Ostrom and Cox 2010), but they form part of an effective multisectoral response to the problem of misinformation.

References

Advocacy Advance. 2021. “Bicycle Injury and Fatality Statistics during the Pandemic.” May 27. www.advocacyadvance.org/2021/05/bicycle-injury-and-fatality-statistics-during-the-pandemic/.
Alicke, Mark, Klotz, Mary Lou, Breitenbecher, David L., Yurak, Tricia J., and Vredenburg, Debbie S. 1995. “Personal Contact, Individuation, and the Better-Than-Average Effect.” Journal of Personality and Social Psychology 68 (5): 804–825. https://doi.org/10.1037/0022-3514.68.5.804.
Allyn, Bobby, and Sprunt, Barbara. 2020. “Poll: As Coronavirus Spreads, Fewer Americans See Pandemic as a Real Threat.” NPR, March 17. www.npr.org/2020/03/17/816501871/poll-as-coronavirus-spreads-fewer-americans-see-pandemic-as-a-real-threat.
Azarpanah, Hossein, Farhadloo, Mohsen, Vahidov, Rustam M., and Pilote, Louise. 2021. “Vaccine Hesitancy: Evidence from an Adverse Events Following Immunization Database, and the Role of Cognitive Biases.” BMC Public Health 21 (1686): 1–13. https://doi.org/10.1186/s12889-021-11745-1.
Bak-Coleman, Joseph B., Kennedy, Ian, West, Jevin D., et al. 2022. “Combining Interventions to Reduce the Spread of Viral Misinformation.” Nature Human Behaviour. https://doi.org/10.1038/s41562-022-01388-6.
Banerjee, Ritwik, Bhattacharya, Joydeep, and Majumdar, Priyama. 2021. “Exponential-Growth Prediction Bias and Compliance with Safety Measures Related to COVID-19.” Social Science and Medicine 268 (January): 1–9. https://doi.org/10.1016/j.socscimed.2020.113473.
Banerjee, Ritwik, and Majumdar, Priyama. 2020. “Exponential Growth Bias in the Prediction of Covid-19 Spread and Economic Expectation.” IZA Discussion Paper No. 13664. http://dx.doi.org/10.2139/ssrn.3687141.
Beach, Lee Roy, and Swensson, Richard G. 1967. “Instructions about Randomness and Run Dependency in Two-Choice Learning.” Journal of Experimental Psychology 75 (2): 279–282. https://doi.org/10.1037/h0024979.
Benedictus, Leo. 2021. “Most People Admitted to Hospital with Covid-19 Are Vaccinated.” Full Fact, October 21. https://fullfact.org/health/economist-vaccination-status/.
Berenbaum, May R. 2021. “On COVID-19, Cognitive Bias, and Open Access.” Proceedings of the National Academy of Sciences 118 (2): 1–3. https://doi.org/10.1073/pnas.2026319118.
Best US Casinos. 2022. “Mysterious Slot Lions Share Hits for $2.4M at MGM.” Accessed October 21. www.bestuscasinos.org/vegas/lions-share-jackpot-won/.
Birch, Jonathan. 2021. “Animals, Humans and Pandemics: What Needs to Change?” The London School of Economics and Political Science, March 9. www.lse.ac.uk/philosophy/blog/2021/03/09/animals-humans-and-pandemics-what-needs-to-change/.
Bishop, D. V. M., Thompson, Jackie, and Parker, Adam J. 2022. “Can We Shift Belief in the ‘Law of Small Numbers’?” Royal Society Open Science 9 (3): 1–27. https://doi.org/10.1098/rsos.211028.
Bode, Leticia, and Vraga, Emily K. 2015. “In Related News, That Was Wrong: The Correction of Misinformation through Related Stories Functionality in Social Media.” Journal of Communication 65 (4): 619–638. https://doi.org/10.1111/jcom.12166.
Bottemanne, Hugo, Morlaàs, Orphée, Fossati, Philippe, and Schmidt, Liane. 2020. “Does the Coronavirus Epidemic Take Advantage of Human Optimism Bias?” Frontiers in Psychology 11: 1–5. https://doi.org/10.3389/fpsyg.2020.02001.
Braun, Barbara L., Fowles, Jinnet B., Solberg, Leif, Kind, Elizabeth, Healey, Margaret, and Anderson, Renner. 2000. “Patient Beliefs about the Characteristics, Causes, and Care of the Common Cold.” Journal of Family Practice 49 (2): 153–156.
Brezis, Mayer, Halpern-Reichert, Daphna, and Schwaber, Mitchell J. 2004. “Mass Media-Induced Availability Bias in the Clinical Suspicion of West Nile Fever.” Annals of Internal Medicine 140 (3): 234–235. https://doi.org/10.7326/0003-4819-140-3-200402030-00024.
Bryden, Joan. 2021. “Canadians Far More Wary of AstraZeneca than Other COVID-19 Vaccines: Poll.” CP24, March 30. www.cp24.com/news/canadians-far-more-wary-of-astrazeneca-than-other-covid-19-vaccines-poll-1.5367801?cache=yes%3FautoPlay%3Dtrue%3FclipId%3D89950.
Burkell, Jacquelyn. 2004. “What Are the Chances? Evaluating Risk and Benefit Information in Consumer Health Materials.” Journal of the Medical Library Association 92 (2): 200–208.
Cai, Jane. 2020. “Wuhan Quarantine: Shutting Down a City Five Times the Size of London.” South China Morning Post, January 23. www.scmp.com/news/china/politics/article/3047365/wuhan-quarantine-shutting-down-city-five-times-size-london.
Centers for Disease Control and Prevention. 2022. “COVID-19 Vaccines Work.” June 28. www.cdc.gov/coronavirus/2019-ncov/vaccines/effectiveness/work.html.
Chen, Daniel L., Moskowitz, Tobias J., and Shue, Kelly. 2016. “Decision Making under the Gambler’s Fallacy: Evidence from Asylum Judges, Loan Officers, and Baseball Umpires.” Quarterly Journal of Economics 131 (3): 1181–1242.
Christensen, Love. 2021. “Optimal Persuasion under Confirmation Bias: Theory and Evidence from a Registered Report.” Journal of Experimental Political Science 10 (1): 4–20. https://doi.org/10.1017/XPS.2021.21.
CNN Wire. 2014. “Couple Wins $2.4 Million Jackpot at Vegas Slot Machine that Hasn’t Paid Out in 20 Years.” Fox 8, August 25. https://myfox8.com/news/couple-wins-2-4-million-jackpot-at-vegas-slot-machine-that-hasnt-paid-out-in-20-years/.
Cook, John, Ecker, Ullrich, and Lewandowsky, Stephan. 2015. “Misinformation and How to Correct It.” In Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource, ed. Scott, Robert A., and Kosslyn, Stephen Michael. Hoboken, NJ: Wiley, 1–17.
Demi. 1997. One Grain of Rice: A Mathematical Folktale. New York: Scholastic.
Devis, Deborah. 2021. “Why Are There So Many Vaccinated People in Hospital?” Cosmos, September 20. https://cosmosmagazine.com/health/covid/why-are-there-so-many-vaccinated-people-in-hospital/.
Dickson, David. 2005. “The Case for a ‘Deficit Model’ of Science Communication.” SciDevNet 27: 1–6. www.scidev.net/global/editorials/the-case-for-a-deficit-model-of-science-communic/.
Diemand-Yauman, Connor, Oppenheimer, Daniel M., and Vaughan, Erikka B. 2011. “Fortune Favors the Bold (and the Italicized): Effects of Disfluency on Educational Outcomes.” Cognition 118 (1): 111–115. https://doi.org/10.1016/j.cognition.2010.09.012.
Dolinski, Dariusz, Dolinska, Barbara, Zmaczynska-Witek, Barbara, Banach, Maciej, and Kulesza, Wojciech. 2020. “Unrealistic Optimism in the Time of Coronavirus Pandemic: May It Help to Kill, if So—Whom: Disease or the Person?” Journal of Clinical Medicine 9 (5): 1–9. https://doi.org/10.3390/jcm9051464.
Dolinski, Dariusz, Kulesza, Wojciech, Muniak, Paweł, Dolinska, Barbara, Węgrzyn, Rafał, and Izydorczak, Kamil. 2022. “Media Intervention Program for Reducing Unrealistic Optimism Bias: The Link between Unrealistic Optimism, Well‐Being, and Health.” Applied Psychology: Health and Well‐Being 14 (2): 499–518.
Ducharme, Jamie. 2022. “Almost 60% of Americans Have Had COVID-19, CDC Says.” Time, April 26. https://time.com/6170735/how-many-people-have-had-covid-19/.
Ecker, Ullrich K. H., Lewandowsky, Stephan, Amazeen, Michelle A., et al. 2022. “The Psychological Drivers of Misinformation Belief and Its Resistance to Correction.” Nature Reviews Psychology 1 (1): 13–29. https://doi.org/10.1038/s44159-021-00006-y.
Ecker, Ullrich K. H., Lewandowsky, Stephan, Swire, Briony, and Chang, Darren. 2011. “Correcting False Information in Memory: Manipulating the Strength of Misinformation Encoding and Its Retraction.” Psychonomic Bulletin and Review 18 (3): 570–578.
Ecker, Ullrich K. H., Lewandowsky, Stephan, and Tang, David T. W.. 2010. “Explicit Warnings Reduce but Do Not Eliminate the Continued Influence of Misinformation.” Memory and Cognition 38 (8): 10871100. https://doi.org/10.3758/mc.38.8.1087.CrossRefGoogle ScholarPubMed
Egger, Sam, and Egger, Garry. 2022. “The Vaccinated Proportion of People with COVID-19 Needs Context.” The Lancet 399 (10325): 627. https://doi.org/10.1016/S0140–6736(21)02837-3.CrossRefGoogle ScholarPubMed
Ehrenstein, Boris P., Hanses, Frank, Blaas, Stefan, Mandraka, Falitsa, Audebert, Franz, and Salzberger, Bernd. 2010. “Perceived Risks of Adverse Effects and Influenza Vaccination: A Survey of Hospital Employees.” European Journal of Public Health 20 (5): 4959499.CrossRefGoogle ScholarPubMed
European Medicines Agency. 2021. “COVID-19 Vaccine AstraZeneca: Benefits Still Outweigh the Risks Despite Possible Link to Rare Blood Clots with Low Blood Platelets.” March 18. https://www.ema.europa.eu/en/news/covid-19-vaccine-astrazeneca-benefits-still-outweigh-risks-despite-possible-link-rare-blood-clots-low-blood-platelets.Google Scholar
The Exposé. 2022. “Italy’s National Institute of Health Reveals the Fully Vaccinated Now Account for 7 in Every 10 Covid-19 Deaths.” April 9. https://expose-news.com/2022/04/09/italy-7-in-10-covid-deaths-vaccinated/.Google Scholar
Fallis, Don. 2020. “The Epistemic Threat of Deepfakes.” Philosophy and Technology 34 (4): 623643.CrossRefGoogle ScholarPubMed
Felgendreff, Lisa, Korn, Lars, Sprengholz, Philipp, Eitze, Sarah, Siegers, Regina, and Betsch, Cornelia. 2021. “Risk Information Alone Is Not Sufficient to Reduce Optimistic Bias.” Research in Social and Administrative Pharmacy 17 (5): 10261027. https://doi.org/10.1016/j.sapharm.2021.01.010.CrossRefGoogle ScholarPubMed
Ferreira, Jennifer. 2022. “Making Sense of the Numbers: Greater Proportion of Unvaccinated Are Being Hospitalized.” CTV News, February 8. www.ctvnews.ca/health/coronavirus/most-covid-19-hospitalizations-in-provinces-are-among-the-vaccinated-here-s-why-1.5770226.Google Scholar
Festinger, Leon. 1957. A Theory of Cognitive Dissonance. Vol. 2. Palo Alto: Stanford University Press,CrossRefGoogle Scholar
Fischhoff, Baruch, Slovic, Paul, and Lichtenstein, Sarah. 1977. “Knowing with Certainty: The Appropriateness of Extreme Confidence.” Journal of Experimental Psychology: Human Perception and Performance 3 (4): 552564.Google Scholar
Fox, Christopher John. 1983. Information and Misinformation: An Investigation of the Notions of Information, Misinformation, Informing, and Misinforming. Westport, CT: Greenwood.CrossRefGoogle Scholar
Fragkaki, Iro, Maciejewski, Dominique F., Weijman, Esther L., Feltes, Jonas, andGoogle ScholarGoogle Scholar
Frischmann, Brett M., Madison, Michael J., and Strandburg, Katherine J.. 2014. “Governing Knowledge Commons.” In Governing Knowledge Commons, ed. Frischmann, Brett M., Madison, Michael J., and Strandburg, Katherine J., 143. Oxford University Press.CrossRefGoogle Scholar
Gates, Bill. 2020. “Responding to Covid-19: A Once-in-a-Century Pandemic?New England Journal of Medicine 382 (18): 16771679. www.nejm.org/doi/full/10.1056/nejmp2003762.CrossRefGoogle ScholarPubMed
Gerhold, Lars. 2020. “COVID-19: Risk Perception and Coping Strategies.” https://doi.org/10.31234/osf.io/xmpk4.CrossRefGoogle Scholar
Gronholt-Pedersen, Jacob. 2021. “Woman Who Died after AstraZeneca Shot Had ‘Highly Unusual’ Symptoms, Officials Say.” Global News, March 16. https://globalnews.ca/news/7696802/astrazeneca-denmark-blood-clot-suspension/.Google Scholar
Guterres, António. 2022. “All Hands On Deck to Fight a Once-in-a-Lifetime Pandemic.” United Nations. Accessed October 21. www.un.org/en/un-coronavirus-communications-team/all-hands-deck-fight-once-lifetime-pandemic.Google Scholar
Hayes, Kimani. 2021. “Gottlieb Says Vaccines Still Effective at Preventing Serious Illness from Delta Variant.” CBS News, August 2. www.cbsnews.com/news/covid-19-vaccines-delta-variant-effectiveness-gottlieb/.Google Scholar
Henry, David, and Glasziou, Paul. 2021. “How Well Do COVID Vaccines Work in the Real World?” The Conversation, July 1. https://theconversation.com/how-well-do-covid-vaccines-work-in-the-real-world-162926.Google Scholar
Hernandez, Ivan, and Preston, Jesse Lee. 2013. “Disfluency Disrupts the Confirmation Bias.” Journal of Experimental Social Psychology 49 (1): 178182.CrossRefGoogle Scholar
Holman, Mirya R., and Lay, J. Celeste. 2019. “They See Dead People (Voting): Correcting Misperceptions about Voter Fraud in the 2016 US Presidential Election.” Journal of Political Marketing 18 (1–2): 3168. https://doi.org/10.1080/15377857.2018.1478656.CrossRefGoogle Scholar
Johnson, Hollyn M., and Seifert, Colleen M.. 1994. “Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Journal of Experimental Psychology: Learning, Memory, and Cognition 20 (6): 14201436.Google Scholar
Jones, Martin, and Sugden, Robert. 2001. “Positive Confirmation Bias in the Acquisition of Information.” Theory and Decision 50 (1): 5999.CrossRefGoogle Scholar
Kahneman, Daniel, Slovic, Paul, and Tversky, Amos. 1982. Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.CrossRefGoogle Scholar
Kampf, Günter. 2021. “COVID-19: Stigmatising the Unvaccinated Is Not Justified.” The Lancet 398 (10314): 1871. https://doi.org/10.1016/S0140–6736(21)02243-1.CrossRefGoogle ScholarPubMed
Kerr, John R., Freeman, Alexandra L. J., Marteau, Theresa M., and van der Linden, Sander. 2021. “Effect of Information about COVID-19 Vaccine Effectiveness and Side Effects on Behavioural Intentions: Two Online Experiments.” Vaccines 9 (4): 122. https://doi.org/10.3390/vaccines904379.CrossRefGoogle ScholarPubMed
Kim, Jooyeon, Tabibian, Behzad, Oh, Alice, Schölkopf, Bernhard, and Gomez-Rodriguez, Manuel. 2018. “Leveraging the Crowd to Detect and Reduce the Spread of Fake News and Misinformation.” In Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining, 324–332). https://doi.org/10.1145/3159652.3159734.CrossRefGoogle Scholar
King, Wendy C., Rubinstein, Max, Reinhart, Alex, and Mejia, Robin. 2021. “COVID-19 Vaccine Hesitancy January–May 2021 Among 18–64 Year Old US Adults by Employment and Occupation.” Preventive Medicine Reports 24: 19. https://doi.org/10.1016/j.pmedr.2021.101569.CrossRefGoogle ScholarPubMed
Kirzinger, Ashley, Sparks, Grace, and Brodie, Mollyann et al. 2021. “KFF COVID-19 Vaccine Monitor: July.” KFF, August 21. www.kff.org/coronavirus-covid-19/poll-finding/kff-covid-19-vaccine-monitor-july-2021/.Google Scholar
Kunda, Ziva. 1990. “The Case for Motivated Reasoning.” Psychological Bulletin 10 (3): 480498. https://doi.org/10.1037/0033-2909.108.3.480.CrossRefGoogle Scholar
Lammers, Joris, Crusius, Jan, and Gast, Anne. 2020. “Correcting Misperceptions of Exponential Coronavirus Growth Increases Support for Social Distancing.” Proceedings of the National Academy of Sciences 117 (28): 1626416266. https://doi.org/10.1073/pnas.2006048117.CrossRefGoogle ScholarPubMed
Lewandowsky, Stephan, Ecker, Ullrich K. H., Seifert, Colleen M., Schwarz, Norbert, and Cook, John. 2012. “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest 13 (3): 106131. https://doi.org/10.1177%2F1529100612451018.CrossRefGoogle ScholarPubMed
Lewandowsky, Stephan, and Smith, Paul W.. 1983. “The Effect of Increasing the Memorability of Category Instances on Estimates of Category Size.” Memory and Cognition 11 (4): 347350.CrossRefGoogle ScholarPubMed
Ling, Rich. 2020. “Confirmation Bias in the Era of Mobile News Consumption: The Social and Psychological Dimensions.” Digital Journalism 8 (5): 596604. https://doi.org/10.1080/21670811.2020.1766987.CrossRefGoogle Scholar
Mamede, Sílvia, de Carvalho-Filho, Marco Antonio, and Schmidt, Henk G. et al. 2020. “‘Immunising’ Physicians against Availability Bias in Diagnostic Reasoning: A Randomised Controlled Experiment.” BMJ Quality and Safety 29 (7): 550559.CrossRefGoogle ScholarPubMed
Mazerolle, John. 2021. “Great COVID-19 Bicycle Boom Expected to Keep Bike Industry On Its Toes for Years To Come.” CBC News, March 21. www.cbc.ca/news/business/bicycle-boom-industry-turmoil-covid-19-1.5956400.Google Scholar
Meppelink, Corine S., Smit, Edith G., Fransen, Marieke L., and Diviani, Nicola. 2019. “‘I Was Right about Vaccination’: Confirmation Bias and Health Literacy in Online Health Information Seeking.” Journal of Health Communication 24 (2): 129140. https://doi.org/10.1080/10810730.2019.1583701.CrossRefGoogle ScholarPubMed
Modgil, Sachin, Singh, Rohit Kumar, Gupta, Shivam, and Dennehy, Denis. 2021. “A Confirmation Bias View on Social Media Induced Polarisation during Covid-19.” Information Systems Frontiers 26 (2): 417441.CrossRefGoogle Scholar
Monte, Lindsay M. 2021. “Household Pulse Survey Shows Many Don’t Trust COVID Vaccine, Worry about Side Effects.” United States Census Bureau, December 28. www.census.gov/library/stories/2021/12/who-are-the-adults-not-vaccinated-against-covid.html.Google Scholar
Nyhan, Brendan, and Reifler, Jason. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32 (2): 303330. https://doi.org/10.1007/s11109–010-9112-2.CrossRefGoogle Scholar
Ostrom, Elinor. 2005. “Doing Institutional Analysis: Digging Deeper than Markets and Hierarchies.” In Handbook of New Institutional Economics, ed. Menard, Claude and Shirley, Mary M., 819848. Boston: Springer.CrossRefGoogle Scholar
Ostrom, Elinor. 2009. Understanding Institutional Diversity. Princeton, NJ: Princeton University Press,CrossRefGoogle Scholar
Ostrom, Elinor, and Cox, Michael. 2010. “Moving Beyond Panaceas: A Multi-tiered Diagnostic Approach for Social-Ecological Analysis.” Environmental Conservation 37 (4): 451463.CrossRefGoogle Scholar
Panthagani, Kristen. 2022. “Q: If 50% of COVID Hospitalizations Are among the Vaccinated, Does that Mean the Vaccines Aren’t Working?” Dear Pandemic, February 3. https://dearpandemic.org/base-rate-fallacy/.Google Scholar
Pascual‐Leone, Alvaro, Cattaneo, Gabriele, Macià, Dídac, Solana, Javier, Tormos, José M., and Bartrés‐Faz, David. 2021. “Beware of Optimism Bias in the Context of the COVID‐19 Pandemic.” Annals of Neurology 89 (3): 423425. https://doi.org/10.1002/ana.26001.CrossRefGoogle ScholarPubMed
Prakash, Shivesh, Sladek, Ruth M., and Schuwirth, Lambert. 2019. “Interventions to Improve Diagnostic Decision Making: A Systematic Review and Meta-analysis on Reflective Strategies.” Medicsal Teacher 41 (5): 517524. https://doi.org/10.1080/0142159X.2018.1497786.CrossRefGoogle ScholarPubMed
Quon, Alexander. 2022. “Why Experts Say Premier Moe’s Claim Vaccines No Longer Protect against COVID-19 Is Based on Misunderstanding.” CBC News, February 3. www.cbc.ca/news/canada/saskatchewan/covid-19-transmission-scott-moe-1.6336479.Google Scholar
Rajsic, Jason, Wilson, Daryl E., and Pratt, Jay. 2018. “The Price of Information: Increased Inspection Costs Reduce the Confirmation Bias in Visual Search.” Quarterly Journal of Experimental Psychology 71 (4): 832849.CrossRefGoogle Scholar
Raude, Jocelyn, Debin, Marion, and Bonmarin, Isabelle et al. 2020. “Are People Excessively Pessimistic about the Risk of Coronavirus Infection?” https://psyarxiv.com/364qj/.CrossRefGoogle Scholar
Reuters staff. 2020. “Impact of Coronavirus Will be Felt for Decades to Come, WHO Says.” Reuters, July 31. www.reuters.com/article/us-health-coronavirus-who-idUSKCN24W27L.Google Scholar
Reuters staff. 2021. “Austria Suspends AstraZeneca COVID-19 Vaccine Batch after Death.” Reuters, March 7. www.reuters.com/article/health-coronavirus-austria-nurse-idUSL8N2L506P.Google Scholar
Roney, Christopher J. R., and Trick, Lana M.. 2003. “Grouping and Gambling: A Gestalt Approach to Understanding the Gambler’s Fallacy.” Canadian Journal of Experimental Psychology 57 (2): 6975.CrossRefGoogle ScholarPubMed
Roozenbeek, Jon, Schneider, Claudia R., and Van Der Linden, Sander et al. 2020. “Susceptibility to Misinformation about COVID-19 around the World.” Royal Society Open Science 7 (10): 115.CrossRefGoogle ScholarPubMed
Rosenberg, J. 2001. “Young People in the United States Are Often Misinformed about the Proper Use of Condoms.” Perspectives on Sexual and Reproductive Health 33 (5): 235.Google Scholar
Rubin, Victoria L., Brogly, Chris, Conroy, Nadia, Chen, Yimin, Cornwell, Sarah E., and Asubiaro, Toluwase V.. 2019. “A News Verification Browser for the Detection of Clickbait, Satire, and Falsified News.” Journal of Open Source Software 4 (35): 13. https://doi.org/10.21105/joss.01208.CrossRefGoogle Scholar
Rumilly, Marc (@MarcRummy). 2021. “‘X% of Hospitalized Patients Are Fully Vaccinated’ Doesn’t Always Mean What You Think It Does. The Real Meaning Depends on the Base Rates. Without Knowing That, the Quote Can Mean Anything.” Twitter post, July 23.Google Scholar
Savolainen, Reijo. 2009. “Information Use and Information Processing: Comparison of Conceptualizations.” Journal of Documentation 65 (2): 187207.CrossRefGoogle Scholar
Schlager, Tobias, and Whillans, Ashley V.. 2022. “People Underestimate the Probability of Contracting the Coronavirus from Friends.” Humanities and Social Sciences Communications 9 (1): 111. http://dx.doi.org/10.1057/s41599–022-01052-4.CrossRefGoogle Scholar
Schmidt, Henk G., Mamede, Sílvia, Van Den Berge, Kees, Van Gog, Tamara, Van Saase, Jan L. C. M., and Rikers, Remy M. J. P.. 2014. “Exposure to Media Information about a Disease Can Cause Doctors to Misdiagnose Similar-Looking Clinical Cases.” Academic Medicine 89 (2): 285291.CrossRefGoogle ScholarPubMed
Schonger, Martin, and Sele, Daniela. 2021. “Intuition and Exponential Growth: Bias and the Roles of Parameterization and Complexity.” Mathematische Semesterberichte 68 (2): 221235.CrossRefGoogle ScholarPubMed
Schwind, Christina, and Buder, Jürgen. 2012. “Reducing Confirmation Bias and Evaluation Bias: When Are Preference-Inconsistent Recommendations Effective – and When Not?Computers in Human Behavior 28 (6): 22802290.CrossRefGoogle Scholar
Schwind, Christina, Buder, Jürgen, Cress, Ulrike, and Hesse, Friedrich W.. 2012. “Preference-Inconsistent Recommendations: An Effective Approach for Reducing Confirmation Bias and Stimulating Divergent Thinking?Computers and Education 58 (2): 787796.CrossRefGoogle Scholar
Scherer, Laura D., and Pennycook, Gordon. 2020. “Who Is Susceptible to Online Health Misinformation?American Journal of Public Health 110 (S3): S276S277.CrossRefGoogle ScholarPubMed
Seifert, Colleen M. 2002. “The Continued Influence of Misinformation in Memory: What Makes a Correction Effective?” In Psychology of Learning and Motivation, Vol. 41, 265292. Cambridge, MA: Academic Press,Google Scholar
Seneff, Stephanie, and Nigh, Greg. 2021. “Worse than the Disease? Reviewing Some Possible Unintended Consequences of the mRNA Vaccines against COVID-19.” International Journal of Vaccine Theory, Practice, and Research 2 (1): 3879.CrossRefGoogle Scholar
Sharon, Aviv J., and Baram‐Tsabari, Ayelet. 2020. “Can Science Literacy Help Individuals Identify Misinformation in Everyday Life?Science Education 104 (5): 873894. https://doi.org/10.1002/sce.21581.CrossRefGoogle Scholar
Simis, Molly J., Madden, Haley, Cacciatore, Michael A., and Yeo, Sara K.. 2016. “The Lure of Rationality: Why Does the Deficit Model Persist in Science Communication?Public Understanding of Science 25 (4): 400414. https://doi.org/10.1177/0963662516629749.CrossRefGoogle ScholarPubMed
Simonetti, Omar, Martini, Mariano, and Armocida, Emanuele. 2021. “COVID-19 and Spanish Flu-18: Review of Medical and Social Parallelisms between Two Global Pandemics.” Journal of Preventive Medicine and Hygiene 62 (3): E613E620. https://doi.org/10.15167/2421-4248/jpmh2021.62.3.2124.Google ScholarPubMed
Smitham, Eleni, and Glassman, Amanda. 2021. “The Next Pandemic Could Come Soon and be Deadlier.” Center for Global Development, August 25. www.cgdev.org/blog/the-next-pandemic-could-come-soon-and-be-deadlier.Google Scholar
Stapel, Diederik A., Reicher, Stephen D., and Spears, Russell. 1995. “Contextual Determinants of Strategic Choice: Some Moderators of the Availability Bias.” European Journal of Social Psychology 25 (2): 141158.CrossRefGoogle Scholar
Statista. 2022. “Number of Confirmed COVID-19 Deaths in Canada from December 14, 2020 to August 21, 2022, by Vaccination Status.” www.statista.com/statistics/1257040/number-covid-deaths-canada-by-vaccination-status/.Google Scholar
Tasnim, Samia, Hossain, Md Mahbub, and Mazumder, Hoimonty. 2020. “Impact of Rumors and Misinformation on COVID-19 in Social Media.” Journal of Preventive Medicine and Public Health 53 (3): 171174.CrossRefGoogle ScholarPubMed
Thaler, Richard H., Sunstein, Cass R., and Balz, John P.. 2013. “Choice Architecture.” In The Behavioral Foundations of Public Policy, ed. Shafir, Eldar, Ch. 25. Princeton, NJ: Princeton University Press.Google Scholar
Thomas, Naomi, and Hanna, Jason. 2021. “Moderna’s Covid-19 Vaccine Shows 93% Efficacy through 6 Months, as Company Expects to Finish Application for Approval this Month.” CNN, August 5. www.cnn.com/2021/08/05/health/moderna-coronavirus-vaccine-efficacy/index.html.Google Scholar
Tversky, Amos, and Kahneman, Daniel. 1971. “Belief in the Law of Small Numbers.” Psychological Bulletin 76 (2): 105110.CrossRefGoogle Scholar
Tversky, Amos, and Kahneman, Daniel. 1973. “Availability: A Heuristic for Judging Frequency and Probability.” Cognitive Psychology 5 (2): 207232.CrossRefGoogle Scholar
Tversky, Amos, and Kahneman, Daniel. 1974. “Judgment under Uncertainty: Heuristics and Biases.” Science 185 (4157): 11241131.CrossRefGoogle ScholarPubMed
Villanova, Daniel. 2022. “Linear Biases and Pandemic Communications.” Medical Decision Making 42 (6): 765775.CrossRefGoogle ScholarPubMed
Vraga, Emily K., and Bode, Leticia. 2017. “Using Expert Sources to Correct Health Misinformation in Social Media.” Science Communication 39 (5): 621645. https://doi.org/10.1002/pra2.2015.145052010083.CrossRefGoogle Scholar
Vraga, Emily K., Tully, Melissa, and Bode, Leticia. 2022. “Empowering Users to Respond to Misinformation about Covid-19.” Media and Communication (Lisboa) 8 (2): 475479. https://doi.org/10.17645/mac.v8i2.3200.CrossRefGoogle Scholar
Wason, Peter C. 1960. “On the Failure to Eliminate Hypotheses in a Conceptual Task.” Quarterly Journal of Experimental Psychology 12 (3): 129140.CrossRefGoogle Scholar
Weinstein, Neil D. 1983. “Reducing Unrealistic Optimism about Illness Susceptibility.” Health Psychology 2 (1): 11.CrossRefGoogle Scholar
Wheeler, Gregory. 2018. “Bounded Rationality.” Stanford Encyclopedia of Philosophy. November 30. https://plato.stanford.edu/entries/bounded-rationality/.Google Scholar
Wise, Toby, Zbozinek, Tomislav D., Michelini, Giorgia, Hagan, Cindy C., and Mobbs, Dean. 2020. “Changes in Risk Perception and Self-Reported Protective Behaviour during the First Week of the COVID-19 Pandemic in the United States.” Royal Society Open Science 7 (9): online 200742.CrossRefGoogle ScholarPubMed
Wittenberg, , Chloe, , and Berinsky, Adam J.. 2020. “Misinformation and Its Correction.” In Social Media and Democracy: The State of the Field, Prospects for Reform, ed. Persily, Nathaniel and Tucker, Joshua A., 163198. Cambridge: Cambridge University Press.CrossRefGoogle Scholar
Wogan, Tim. 2014. “Gambler’s Fallacy Trips up Goalies.” Science, July 31. www.science.org/content/article/gamblers-fallacy-trips-goalies.Google Scholar
Xu, Shan, Coman, Ioana A., Yamamoto, Masahiro, and Jimenez Najera, Christina. 2022. “Exposure Effects or Confirmation Bias? Examining Reciprocal Dynamics of Misinformation, Misperceptions, and Attitudes toward COVID-19 Vaccines.” Health Communication 38 (10): 111.Google ScholarPubMed
Yahoo Finance. 2022. “61% of Americans Underestimate Their Odds of Contracting Long COVID-19.” May 24. https://finance.yahoo.com/news/61-americans-underestimate-odds-contracting-140100108.html.Google Scholar
Zheng, Caifang, Shao, Weihao, Chen, Xiaorui, Zhang, Bowen, Wang, Gaili, and Zhang, Weidong. 2022. “Real-World Effectiveness of COVID-19 Vaccines: A Literature Review and Meta-analysis.” International Journal of Infectious Diseases 114: 252260. https://doi.org/10.1016/j.ijid.2021.11.009.CrossRefGoogle ScholarPubMed
Figure 3.1 Visual themes related to everyday misinformation.

Table 3.1 Hypothetical growth of COVID-19 cases over fifteen days with a doubling time of three days, presented numerically.
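The doubling dynamic that Table 3.1 tabulates can be recomputed in a few lines. This is a sketch only: the starting count of 100 cases is an assumption for illustration, since the table's actual seed value is not reproduced in this excerpt.

```python
# Hypothetical COVID-19 case growth with a doubling time of three days.
# The initial count (100 cases) is illustrative, not taken from Table 3.1.
DOUBLING_TIME = 3   # days
INITIAL_CASES = 100

def cases_on_day(day: int) -> float:
    """Cases after `day` days, doubling every DOUBLING_TIME days."""
    return INITIAL_CASES * 2 ** (day / DOUBLING_TIME)

for day in range(0, 16, 3):
    print(f"Day {day:2d}: {cases_on_day(day):,.0f} cases")
```

Over fifteen days the count doubles five times, a thirty-two-fold increase — the kind of growth that readers of a linear presentation tend to underestimate.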

Figure 3.2 Hypothetical growth of COVID-19 cases over fifteen days with a doubling time of three days, presented graphically.

Table 3.2 Hypothetical screening test results for a population of 1,000; condition present in 5% of the population, test results in 0% false negative results (sensitivity 100%), 10% false positive results (specificity 90%).
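The base-rate arithmetic behind Table 3.2 can be checked directly. This sketch simply recomputes the cells the caption describes, using only the parameters stated there (1,000 people; 5% prevalence; 100% sensitivity; 90% specificity).

```python
# Reconstructing the hypothetical screening scenario of Table 3.2.
POPULATION = 1000
PREVALENCE = 0.05
SENSITIVITY = 1.0    # 0% false negatives
SPECIFICITY = 0.90   # 10% false positives

with_condition = round(POPULATION * PREVALENCE)       # 50 people
without_condition = POPULATION - with_condition       # 950 people

true_positives = round(with_condition * SENSITIVITY)            # 50
false_positives = round(without_condition * (1 - SPECIFICITY))  # 95
total_positives = true_positives + false_positives              # 145

# Positive predictive value: the chance a positive result is a true case.
ppv = true_positives / total_positives
print(f"{total_positives} positives, of which {true_positives} are true "
      f"cases (PPV = {ppv:.0%})")
```

Because healthy people vastly outnumber affected ones, the 95 false positives exceed the 50 true positives, so a positive result here indicates the condition only about a third of the time — the base-rate effect the chapter uses this table to illustrate.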
