
The calculus of ignorance

Published online by Cambridge University Press: 08 March 2022

Thomas T. Hills
Department of Psychology, University of Warwick, Coventry, UK

Book Review Essay

Copyright © The Author(s), 2022. Published by Cambridge University Press

‘Never tell me the odds.’ – Han Solo

A substantial minority of people consider it morally offensive for someone to tell them about the negative environmental impact of their own diet, especially when that diet has a greater negative impact (Bose et al., 2020). Fewer than a quarter of people at risk of Huntington's disease elect to find out whether they carry the lethal gene, even though most who learn the answer, whether positive or negative, are happier than those who remain uncertain (Wiggins et al., 1992). On these and many other topics, people often choose not to know. What motivates this willful ignorance, what are its implications, and what should we do about it, if anything?

Why we choose not to know is the topic of a new book edited by Ralph Hertwig and Christoph Engel, entitled Deliberate Ignorance: Choosing Not to Know. The question: If knowledge is so important to so many things – democracy, the environment, making choices in one's own best interest – then what is the psychological basis for deliberate ignorance? The book itself is a model of scientific crowdsourcing, bringing together more than 40 of the world's experts from fields including cognitive science, law, biology, history, bioethics, and economics, with the goal of laying a foundation for understanding why we often don't want to know.

In a leading chapter, Dagmar Ellerbrock and Ralph Hertwig set the stage with the case of the Stasi files. During the era of the German Democratic Republic, East Germany's Secret Police – the Stasi – relied on civilian informers to provide information about who was disloyal to the party. Those living in East Germany could expect that family members, friends, colleagues, even their spouses might be informers, passing information about them to the Stasi. This information, along with the identities of the informers, was collected in the Stasi files. After East Germany ceased to exist as a political entity, the Stasi files were eventually made public. A curious individual could go and look up their own file and discover who among their network of friends and family was an informant, what information was provided, and what impact it had. But would you want to know? Should you even be allowed to know? Opinions are deeply divided, and as Ellerbrock and Hertwig's chapter reveals in somewhat spine-tingling detail – imagine if you had been an informer – the history of the Stasi files is marked by controversy and incredible twists and turns.

The award-winning film Das Leben der Anderen (The Lives of Others) exposes some of these complications in telling the story of the playwright Georg Dreyman, who reads his file after the fall of the Berlin Wall and learns that his life was not what he had thought: the Stasi had been watching him all along. Had Dreyman known he was being watched, he would have led a different life. His ignorance protected his liberties even as it exposed him to persecution. There are moments when ignorance is a gift.

Before reading Deliberate Ignorance, I naively asked myself whether choosing not to know is simply the flip side of choosing to know. There are volumes of research on information sampling and search. If I can summarize the results of that work in one sentence, it is that people's choices depend on what they know, how they learn it, and the costs and benefits of finding out more. If I know what people want to know, don't I already know what they don't want to know? Well, no, not at all. As Caplan (2000) notes in his work on rational irrationality, irrationality and ignorance are goods like any other, and people are willing to trade wealth for them. Sometimes the truth is more than just inconvenient – it carries a cost bigger than its value. Or, as Upton Sinclair put it, ‘It is difficult to get a man to understand something, when his salary depends upon his not understanding it’ (Sinclair, 1934/1994, p. 109).

Deliberate Ignorance covers an expanse of such cases, ranging from Stasi files to ‘alternative facts’. This is undergirded by an attempt to set out a psychological, moral, and economic rationality for epistemic choice. It is a calculus of ignorance. As Vincent Ostrom, following Kuhn, put it, ‘Problems of epistemic choice – the choice of conceptualizations, assertions, and information to be used and acted upon in problem-solving modes – must necessarily loom large’ (Ostrom, 1993, p. 164). Knowledge is a choice among alternatives and our truths are supported as much by the facts we highlight as by those we obscure.
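To make the metaphor concrete, here is a minimal sketch of what such a calculus might look like, cast as a standard expected-value-of-information rule; the formalization is my own gloss, not one the book commits to. A decision-maker facing uncertain states $s$, actions $a$, and utilities $u(a, s)$ should seek information only when its expected value exceeds its full cost $c$, where $c$ covers not only search effort but also the hedonic and social costs catalogued in the taxonomy below (regret, spoiled suspense, lost group standing):

\[
\mathrm{EVOI} \;=\; \mathbb{E}_{s}\!\left[\max_{a} u(a, s)\right] \;-\; \max_{a}\, \mathbb{E}_{s}\!\left[u(a, s)\right],
\qquad \text{seek information iff } \mathrm{EVOI} > c .
\]

On this reading, deliberate ignorance is rational whenever $c$ outweighs EVOI – as when a test result cannot change any available action but can impose a lasting emotional cost.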

A good place to start understanding what we obscure is with Hertwig and Engel's taxonomy of deliberate ignorance, which I paraphrase here as a list:

  • Emotion-regulation and regret avoidance: what you don't know can't hurt you.

  • Suspense and surprise maximization: knowing takes the fun out of finding out.

  • Performance enhancement: knowing the odds can change the odds. Recall Han Solo.

  • Persuasion: what you don't know can make your arguments more convincing.

  • Self-discipline: not knowing about temptations can prevent temptation.

  • Eschewing responsibility and avoiding liability: ignorance of consequences reduces accountability.

  • Impartiality: not knowing things limits the biases that knowing them might engender.

  • Cognitive sustainability: knowing one thing might mean not knowing something else.

Threaded throughout the book is an additional item: that what we do and do not know is a cultural code, a form of social information that allows us to quickly signal and detect ingroup status (Hyatt & Simons, 1999). What we know about climate change, gun death statistics, country music, the significant digits of pi, or Trump's Christian values can quickly out us as ‘one of them’, and – depending on our company – have a far more immediate impact on our well-being than the content of the knowledge itself. Knowledge is not separate from the beliefs that motivate its acquisition. As a consequence, what other people know can tell us what they believe and influence whether or not we listen to them at all (e.g., Marshall, 2015).

Ignorance, therefore, has costs and benefits. Knowing about temptations drives us to temptation, as anyone who occasionally glances at the sidebar of YouTube will immediately recognize. At the same time, not knowing the consequences of our actions (or letting ourselves believe it's too complicated for anyone to really know) allows us to blindly carry on enjoying many of the liberties we so cherish. As the book describes in its chapters on norms, these costs and benefits accrue collectively and quickly scale up to populations.

One of the implications is that not knowing what drives deliberate ignorance reduces our ability to counter it; our own ignorance about deliberate ignorance means that we have no way of predicting the impacts of providing information in the first place. It is all well and good to spend generously to identify the consequences of our actions and to find technological solutions to resolve them, but people's willingness to find out about and take up these solutions will ultimately decide their success. To rephrase Ostrom, deliberate ignorance may be what stands between us and the success of democracy, public health, environmental sustainability, and many other outcomes critical to our future.

This is more than just mildly unsettling. Deliberate ignorance and the echo chambers, filter bubbles, group polarization, and self-serving confirmation biases that arise out of it are everywhere. They shape what we know and believe. Looked at in this way, the information superhighway is not the expressway to collective wisdom; it is a license for each of us to get off wherever we want.

One of the refreshing aims of Deliberate Ignorance is not simply to further document our irrationality, but rather to explore why deliberate ignorance might be rational in light of what Simon (1957) called our bounded rationality. Humans simply don't know it all, couldn't remember it if you told them, and even in the rare cases where they do know it all, they don't deal with it objectively. Indeed, the rapidly rising cost of weighing the evidence needed even to think about tomorrow has been argued to be a rate-limiting step in the evolution of human cognition (Trapp et al., 2021).

These limitations matter because the proliferation of information in various forms over the past several centuries has, in interacting with our cognitive limitations and biases, created a distorted information ecosystem (Hills, 2019). We are not disinterested parties when it comes to consuming and sharing information, and our short- and long-term interests are often unaligned. Moreover, there is a substantial cost to processing information, a cost that often goes unconsidered by those providing it. I presume the great quantities of university documents, policies, rules, and guidelines that arrive in my inbox each day do so largely because someone believes I will do my job better if I have this information, even though just reading it all – even if I could remember it – would leave me no time to do my job at all. I remain deliberately ignorant.

I suspect Simon (1971) felt the same when he wrote that ‘a wealth of information creates a poverty of attention’. And that same poverty is behind Downs' (1957) critique of the cost-benefit ratio of information necessary for a functional democracy. Taking that poverty to heart, we desperately need sludge audits (Sunstein, 2020) focused directly on reducing informational friction: the costs of finding and processing information before we forget or are interrupted. Behavioral change approaches like nudging and boosting have related aims: nudging creates information environments that guide people toward better decisions, while boosting provides information in ways that allow people to choose more accurately for themselves (Hertwig & Grüne-Yanoff, 2017).

Beyond the information that overloads, there is also the information that misleads. In Robert MacCoun's chapter on ‘Blinding to remove biases in science and society’, we learn that providing more information about job candidates often leads employers to choose less qualified people. When employers know the gender or ethnicity of a candidate – two traits that should be irrelevant – that information influences who gets hired. Asking applicants to tick a box if they have a criminal history prevents people with criminal histories from getting jobs. Orchestras hire poorer-sounding performers when they can see the performers during the audition. And similarly, knowing who the authors of scientific articles are, and what universities they call home, influences which scientific insights and discoveries we collectively acknowledge.

Finally, there is a growing kind of information that directly promotes deliberate ignorance. Consider that, despite the fact that weapons of mass destruction (WMDs) were never found in Iraq, government propaganda campaigns promoting beliefs about WMDs have left more than half of the US population with the false belief that WMDs were found. As Lewandowsky points out in his chapter on the ‘Willful construction of ignorance’, collective memory politics aimed at promoting false beliefs are evolving to instead promote the belief that there are no objective truths at all. As Lewandowsky notes for Trump, and Pomerantsev (2019) notes for many other political actors, the aim is to devalue objective truth altogether. This form of ‘anything goes’ relativism directly promotes deliberate ignorance as a badge of honor. One is reminded of the last slogan printed on the Ministry of Truth in Orwell (1949): ‘Ignorance is strength.’

Deliberate ignorance is a timely topic about which we already know a lot. Information avoidance, rational ignorance, willful blindness, heuristics, the ostrich effect, confirmation bias, and many others all mark places where people prefer (often rationally) less than all of the information available. Deliberate Ignorance provides a thoughtful and sophisticated critique of that knowledge in a way that offers guidance on how we can use it and on what remains to be known. Above all, it emphasizes that striking a balance between knowledge and ignorance is both a practical necessity and a skill – a needed cultural competency that we must develop.

References

Bose, N., Hills, T. and Sgroi, D. (2020), ‘Climate change and diet’, IZA Discussion Paper No. 13426. Bonn: Institute of Labor Economics (IZA).
Caplan, B. (2000), ‘Rational irrationality: A framework for the neoclassical-behavioural debate’, Eastern Economic Journal, 26: 191–211.
Downs, A. (1957), An economic theory of democracy. New York: Harper & Row.
Hertwig, R. and Grüne-Yanoff, T. (2017), ‘Nudging and boosting: Steering or empowering good decisions’, Perspectives on Psychological Science, 12: 973–986.
Hills, T. T. (2019), ‘The dark side of information proliferation’, Perspectives on Psychological Science, 14(3): 323–330.
Hyatt, J. and Simons, H. (1999), ‘Cultural codes – Who holds the key? The concept and conduct of evaluation in Central and Eastern Europe’, Evaluation, 5: 23–41.
Marshall, G. (2015), Don't even think about it: Why our brains are wired to ignore climate change. New York: Bloomsbury Publishing.
Orwell, G. (1949), Nineteen eighty-four. London: Secker and Warburg.
Ostrom, V. (1993), ‘Epistemic choice and public choice’, in Rowley, C. K., Schneider, F. and Tollison, R. D. (eds.), The next twenty-five years of public choice. Dordrecht: Springer, 163–176.
Pomerantsev, P. (2019), This is not propaganda: Adventures in the war against reality. London: Faber & Faber.
Simon, H. A. (1957), Models of man. New York: John Wiley & Sons.
Simon, H. A. (1971), ‘Designing organizations for an information-rich world’, in Greenberger, M. (ed.), Computers, communications, and the public interest. Baltimore, MD: Johns Hopkins Press, 37–52.
Sinclair, U. (1934/1994), I, candidate for governor. Berkeley, CA: University of California Press.
Sunstein, C. R. (2020), ‘Sludge audits’, Behavioural Public Policy, 1–20.
Trapp, S., Parr, T., Friston, K. and Schröger, E. (2021), ‘The predictive brain must have a limitation in short-term memory capacity’, Current Directions in Psychological Science, 30: 384–390.
Wiggins, S., Whyte, P., Huggins, M., Adam, S., Theilmann, J., Bloch, M., Sheps, S. B., Schechter, M. T., Hayden, M. R. and the Canadian Collaborative Study of Predictive Testing (1992), ‘The psychological consequences of predictive testing for Huntington's disease’, New England Journal of Medicine, 327: 1401–1405.