1. Introduction
Suppose you don't know much about climate change beyond the fact that almost all climate scientists think the planet is getting warmer largely thanks to human activity; and suppose you come across an article in a newspaper or magazine with a headline along the lines of “The Science of Climate Change is not Settled.” Should you read it?
We can interpret this question in purely epistemic terms. According to a standard view, in addition to our moral and prudential obligations, human beings have a distinct set of epistemic obligations.Footnote 1 Some of these obligations will be obligations to form or maintain only certain sorts of beliefs (more generally, doxastic attitudes). But, in addition, many of the actions we perform or omit will indirectly influence whether we form or maintain epistemically successful beliefs and avoid forming or maintaining epistemically unsuccessful beliefs (we can understand epistemic success to be either true or accurate belief or knowledge); and as such, some of our epistemic obligations are obligations to perform or omit certain belief-influencing actions.Footnote 2 Understood in these terms, the question at issue is: is it epistemically permissible for you to read the article?
Probably the most natural answer to this question is “of course it is.” After all, you won't be compelled to accept any of the author's claims; and perhaps it's best to be familiar with the arguments on both sides of such a controversial and important issue. However, if Levy (2006) is right, it's highly likely that reading the article will leave you worse off epistemically. For instance, because you don't know very much about the topic, you won't be able to spot all of the author's mistakes. You might end up believing at least some of the author's false claims; and even those falsehoods that you avoid believing are likely to diminish the accuracy of your beliefs on the topic. Arguably, then, we should agree with Levy that you ought not to read the article.Footnote 3
If you have an epistemic obligation not to read the article, the source of this obligation is plausibly your lack of expertise regarding climate science. For instance, a practicing climate scientist doesn't have the same reasons to avoid reading the article – she'll be able to spot the author's mistakes, and so her doxastic attitudes aren't likely to be influenced by the author's false claims. Yet, discussions of our epistemic obligations to perform or omit belief-influencing actions typically abstract away from these important differences between believers: we're told that we all have an epistemic obligation to gather evidence, or to avoid dubious sources of information, or to diversify the sources of information we rely on.Footnote 4 By contrast, cases of the sort just described suggest that experts and laypeople possess fundamentally different epistemic obligations. And if so, it is crucially important that discussions of our epistemic obligations bear the distinction between experts and laypeople in mind – after all, we all have beliefs concerning a great many topics, and yet each of us is an expert regarding very little. It may be that, with respect to topics concerning which we are experts, we ought to behave a certain way (perhaps we ought to conduct thorough investigations and reflect carefully on the evidence we collect); and it may be that, with respect to most other topics, we ought to behave very differently (perhaps our principal obligation is to allocate trust appropriately and consult the right people).
In order to establish that experts and laypeople have fundamentally different epistemic obligations, I will focus on scientific issues where a clear consensus exists amongst the relevant experts: that the climate is changing significantly largely due to human activity, that species result from evolution rather than intelligent design, that vaccines are safe and effective, that COVID-19 is more dangerous than the flu, and so on. With respect to such issues, I maintain that laypeople, but not experts, have an epistemic obligation to avoid unnecessary exposure to anti-consensus material: articles, videos, lectures, etc., the principal claims of which are inconsistent with the expert consensus.
Claiming that laypeople have an obligation to actively avoid dissenting voices in such cases might seem to endorse a dangerous form of dogmatism; however, the argument for this conclusion is grounded in epistemic humility – in a recognition of our cognitive limitations and vulnerabilities. More specifically, the argument is that each of us is particularly vulnerable to false claims when we are not experts on some topic – such falsehoods have systematic negative impacts on our doxastic attitudes that we can neither prevent nor correct. And while we can't avoid falsehoods generally, there is a particular category of falsehood that we can often avoid: falsehoods that are inconsistent with the available evidence. Since, then, we are likely to encounter false claims inconsistent with the available evidence when we consume anti-consensus material; and since the doxastic attitudes of laypeople, but not experts, are likely to be negatively influenced in systematic ways by encountering such falsehoods; laypeople, but not experts, have an epistemic obligation not to consume anti-consensus material unnecessarily.Footnote 5 The conclusion is not that you ought to believe whatever the experts say simply because they say it – the conclusion defended here says nothing about what you ought to believe, and it certainly does not imply that you shouldn't study the evidence and arguments supporting the expert consensus. Rather, the point is simply that when you encounter someone who denies the clear expert consensus on some scientific issue, you shouldn't listen to that person.
2. Clear expert consensus
Before proceeding to the argument, I should clarify some points regarding the notion of clear expert consensus. According to most philosophers’ definitions, experts are different from laypeople in two chief respects: knowledge and ability.Footnote 6 That is, an expert on some topic is someone who possesses significantly greater knowledge of the available evidence relevant to that topic than most people do, and who is significantly better able to interpret and draw conclusions from that evidence than most people are. So understood, expertise comes in degrees – for instance, you might know significantly more than most people on some topic because you took a single university course. But for present purposes, the experts who matter are those who occupy the far end of the spectrum. Consequently, I will assume that someone is an expert on some topic if and only if, first, he knows about as much as anyone concerning the available evidence relevant to that topic, and second, he is about as good as anyone at interpreting and drawing conclusions from that evidence. For instance, with respect to specific scientific topics, the experts are individuals with advanced degrees in some relevant scientific discipline, and who publish research on the topic in peer-reviewed venues.Footnote 7
Since, for a given scientific issue, the relevant experts will be scientists who publish research on related scientific topics, there is clear expert consensus on a given scientific issue just in case there is a large group of relevant scientists who overwhelmingly agree on that issue – that is, the vast majority endorse some particular verdict with respect to that issue. We can understand agreeing about or endorsing some verdict in terms of what these scientists have published, what they believe, or both. So, for instance, given that roughly 85% of the relevant scientists who express a view on the issue when surveyed maintain that anthropogenic climate change is occurring, and given that more than 95% of peer-reviewed papers that take a stand on the question support that thesis, there is clear expert consensus that the climate is changing largely due to human activity.Footnote 8
In addition to these terminological matters, there are two substantive issues concerning expert consensus that I should address. First, the ensuing argument will assume that those who reject the clear expert consensus on any given scientific issue are more likely to be mistaken than those who endorse it; but one might worry about the reliability of expert consensus. For instance, drawing on Goldman (2001), one might argue that widespread agreement amongst experts does not improve their chances of being correct when those experts have not formed their beliefs independently of one another. However, scientists typically form their beliefs at least partially independently of one another.Footnote 9 And, as Coady (2006) and Lackey (2013) show, even if they didn't, the fact that the vast majority of scientists endorse some scientific proposition would still make it more likely that that proposition is true. Accordingly, we ought to insist that someone who endorses the clear expert consensus on some scientific question is significantly more likely to be correct than someone who rejects it: an individual scientist's verdict on some scientific issue is more likely to be correct than an individual non-scientist's verdict on that issue (since the scientist's verdict is more likely to be supported by the available evidence, and claims that are supported by the available evidence are more likely to be true than claims that are not); and a large group of scientists' verdict on some scientific issue is significantly more likely to be correct than a much smaller group of scientists' verdict on that issue (since a large group of scientists' verdict is more likely to be supported by the available evidence than is a much smaller group of scientists' verdict).
Second, the ensuing argument will assume that, at least very often, laypeople are able to determine when clear expert consensus exists on some scientific issue; but one might worry that laypeople face significant difficulties determining what the vast majority of experts believe about any given scientific issue. First, it might be difficult for many laypeople to recognize who the experts are – specifically, it might be difficult to recognize that scientists are the experts on any given scientific question. For instance, evidence of bias, problematic incentives, intellectual dishonesty, or even outright corruption within the scientific community might lead many laypeople to conclude that scientists are not the most reliable interpreters of the evidence concerning issues such as climate change and evolution – but that some groups of non-scientists, such as political or religious leaders, are better able to draw conclusions from that evidence. However, while there is good evidence that scientists are sometimes biased or dishonest when they investigate scientific questions (especially politically loaded scientific questions), there is vastly more evidence of bias and dishonesty amongst political and religious leaders (and any other relevant group of non-scientists). Moreover, while there is extensive evidence that scientists are reliable interpreters of the available evidence on a wide range of scientific questions – almost all of us encounter examples of successful scientific research on a more or less constant basis – there is very little evidence that any group of non-scientists is similarly reliable. So, it's not the case that misleading evidence makes it difficult for laypeople to recognize that scientists are the individuals best able to answer even politically loaded scientific questions.
(Following Levy (2019; Forthcoming), one might object that many laypeople have reasonable grounds not to trust the scientific consensus on many scientific issues – namely, that most scientists do not belong to the religious and political groups with which these laypeople identify, and so are not likely to share their values.Footnote 10 The suggestion is that if scientists don't share my religious and political affiliations then they are less likely to be benevolent towards me, and so it's reasonable for me not to trust their judgement on politically loaded scientific issues. However, while it might be reasonable to assume that individuals who share my religious and political affiliations are more likely to be benevolent towards me, and so are less likely to deceive or exploit me, it is not reasonable to assume that such individuals are better able than most to answer scientific questions accurately.Footnote 11 So, if I were to conclude that, say, the political and religious leaders whom I trust are better able to interpret and draw conclusions from the available evidence concerning climate change and evolution, I would be weighing evidence of benevolence significantly more heavily than evidence of competence – and that's unreasonable.)
Alternatively, even if it isn't difficult for laypeople to determine that the scientists are experts, they might find it difficult to determine what the vast majority of experts believe concerning any given scientific issue. For instance, a layperson who regularly encounters scientists asserting that vaccines are safe and effective might also sometimes encounter self-professed experts asserting that vaccines cause autism – self-professed experts who, while not scientists, have credentials sufficient to make it seem as though they have a scientist's knowledge and abilities. As Ballantyne (2019: Ch. 9) and Levy (Forthcoming: Ch. 4) emphasize, it can be quite difficult for a layperson to determine whether such a self-professed expert is a genuine expert; however, at least in very many cases, encountering self-professed experts won't make it difficult for a layperson to determine what the vast majority of experts believe. If I determine that the scientists researching relevant topics are all experts on a given scientific issue (which, again, is not particularly difficult), then even if I mistakenly assume that the self-professed experts I encounter are also experts, these individuals will still constitute a tiny subset of all the experts I recognize.Footnote 12 As such, there is a simple method I can employ to determine what the vast majority of experts believe concerning some scientific issue: determine what the vast majority of scientists believe concerning that issue. And fortunately, in very many cases, it is not particularly difficult for a layperson to determine what the vast majority of scientists believe concerning some scientific issue. For instance, Anderson (2011: 149) outlines some simple methods: you can read review articles in scientific publications; you can find surveys of scientists working on the topic; or you can read reports or consensus statements (or summaries thereof) from scientific organizations, such as the National Academy of Sciences. Even more simply, you can read books and articles written by scientists and science journalists for a general audience (or watch documentaries, listen to podcasts, etc.): such material will make clear what issues the relevant scientific community considers settled, and what issues remain controversial. Of course, there will be cases where laypeople will find it difficult to identify clear expert consensus when it exists; the present point is just that there are very many important cases where it is not difficult for laypeople to identify clear expert consensus.
3. Our vulnerability to falsehoods
The suggestion that we have an epistemic obligation to avoid reading or listening to certain material will strike many as implausible. This reaction is grounded in the natural assumption that each of us has considerable control over how we respond to what we read and hear: we are free to reject claims we find implausible, to suspend belief whenever we're unsure about some claim, and when we happen to be taken in by some falsehood, we can always correct our mistake with a bit of research. However, considerable psychological research has established that this natural assumption is mistaken. This research shows, first, that the falsehoods we encounter typically have a systematic negative influence on our doxastic attitudes, particularly when we don't have much relevant background knowledge; and second, that there is nothing we can do either to eliminate a falsehood's negative impact when we first encounter it or to largely correct its negative impact after the fact.Footnote 13
When we consume written or verbal material, the principal mechanism by which we can prevent false claims from influencing our doxastic attitudes is to compare each claim to our store of existing beliefs. Researchers sometimes refer to this process as plausibility checking: each of us checks the claims we encounter against what we already believe in a spontaneous, automatic fashion; and we typically reject, just as automatically, claims that are inconsistent with what we already believe.Footnote 14 But, of course, this procedure will only enable us to reject a given falsehood if that falsehood is inconsistent with our existing beliefs.
Crucially, when we encounter claims the truth value of which we don't know (for instance, claims concerning which our background beliefs entail no verdict), each of us exhibits a truth bias – a tendency to accept such claims as true.Footnote 15 This tendency is sufficiently strong that we often accept claims of unknown truth value even when we have very good reasons not to do so. A well-known experiment – Gilbert et al. (1993) – provides particularly striking evidence of this phenomenon. Subjects were presented with a series of statements regarding a fictional crime, and then asked to rate the perpetrator's dangerousness and to recommend a prison sentence. These statements were explicitly labelled as either true or dubious – subjects were informed that the true statements were taken from the actual crime report while the dubious statements were taken from entirely unrelated crime reports.Footnote 16 Gilbert and colleagues found that, when required to perform a mildly demanding task while reading the statements, subjects tended to judge the perpetrator more severely when the dubious statements they had read increased the severity of the crime. However, a more recent version of this experiment – Pantazi et al. (2018) – found that subjects demonstrated a significant tendency to treat the dubious claims as true even when not subjected to distractions of any kind.Footnote 17
In addition, each of us is more likely to believe a given claim the more frequently we encounter it – a phenomenon known as the truth effect.Footnote 18 So, because each separate time we encounter some claim increases our confidence that that claim is true, simply repeating some falsehood often enough can sometimes cause us to believe it.Footnote 19 Again, our background beliefs can prevent us from accepting some falsehood, even if it is repeated frequently; but when the claim isn't in tension with our existing beliefs, the effect is remarkably robust. For instance, Henkel and Mattson (2011) showed that individuals who read a particular statement on three separate occasions over the course of two weeks judged that statement to be either probably or certainly true almost 70% of the time – despite the fact that they were given explicit advance warning that the source of the information was unreliable.
We are also particularly likely to believe false claims made by individuals we trust. As Levy (2019, Forthcoming) argues, because human beings are social creatures with limited resources, we have a significant tendency to defer to individuals who belong to groups we identify with, and to individuals who have acquired a certain level of prestige. For instance, a recent study by Barber and Pope (2019) found that Republicans were much more likely to endorse a stereotypically liberal policy when told that Donald Trump supports it. Such deference may sometimes be rational (Levy maintains it typically is); but a consequence of this strategy is that when individuals who belong to the relevant groups (especially prestigious individuals) make false claims that are not in tension with our existing beliefs, there's a good chance we'll end up believing them. For instance, there is considerable evidence that Americans' views about the existence (or seriousness) of climate change have been driven largely by the claims communicated by partisan elites; as a result, large numbers of Americans have acquired false beliefs on the topic.Footnote 20
It's extremely important to recognize that false claims have significant negative impacts on our doxastic attitudes even when we don't believe them. Perhaps most problematically, encountering falsehoods frequently undercuts the positive impact that encountering truths would otherwise have. Being exposed to significant quantities of dubious information can have a paralyzing effect, leading us to conclude that there's no way to tell what's true – a fact that politicians and industry groups often take advantage of. But even a single encounter with false information can have significant effects. For instance, van der Linden et al. (2017) showed that, typically, reading the statement that “97% of climate scientists have concluded that human-caused climate change is happening” significantly improved the accuracy of an individual's estimate of the current level of scientific consensus regarding climate change. However, when such an individual subsequently read the false statement that “there is no consensus on human-caused climate change,” encountering this falsehood completely eliminated the accuracy-improving impact that reading the true statement would otherwise have had.
One might think that there are strategies we can employ to protect our doxastic attitudes from the negative influence of false claims; however, when we don't have much in the way of background beliefs concerning some topic, we can at best partially mitigate, not eliminate, the negative influence of the falsehoods we encounter. For instance, you might think it would help to read slowly and carefully; or, perhaps it would be better to adopt a skeptical mindset, or to actively develop certain cognitive skills. But research has shown that reading slowly and carefully does not reduce the consequences of encountering false claims.Footnote 21 And research has also shown that individual differences in cognitive ability and personality have little or no bearing on one's susceptibility to some of the cognitive features that help make us vulnerable to falsehoods.Footnote 22 Alternatively, you might think that it would help to recognize and try to actively resist the truth bias and the truth effect; but the available evidence suggests that such attempts would produce only minor improvements.Footnote 23
Even so, one might think we can always correct some falsehood's negative influence after the fact – for instance, by fact-checking the dubious claims we encounter. However, research has shown that the negative influence of false claims cannot be largely reversed or corrected. In some cases where you encounter a falsehood and later encounter a correction, you might simply reject the correction out of hand – an outcome that is particularly likely when the correction is incompatible with your deeply held beliefs.Footnote 24 More commonly, encountering the correction will improve the accuracy of your doxastic attitudes – but only partially and temporarily. Research has shown that individuals who read a correction of some falsehood tend to have more accurate doxastic attitudes than they would have had if they hadn't read the correction, but not as accurate as they would have had if they had never encountered the falsehood in the first place.Footnote 25 Moreover, research has also shown that this beneficial effect of reading a correction tends to diminish rather quickly. For instance, Porter and Wood (2019: 42–3) found that, in a wide range of cases, the beneficial effects of reading corrections of false information “were not statistically distinguishable from zero after three days.”
In a best-case scenario you might discover a correction to some false information and find the correction fully convincing. However, even in such cases the false information that you previously encountered but now reject will continue to influence your doxastic attitudes in systematic ways – a phenomenon known as the continued influence effect.Footnote 26 For instance, Green and Donahue (2011) found that reading a story had the very same influence on the “story-relevant beliefs” of subjects who were later told it was intentionally fabricated as it did on the beliefs of subjects who hadn't been told that it was fabricated.Footnote 27 Gordon et al. (2019) recently attempted to determine the mechanism behind this phenomenon via neuroimaging. They found that when you encounter a correction, the false information, rather than simply being deleted from the brain, remains stored in memory and continues to be activated when you consider related topics.
4. Laypeople's epistemic obligation
So, when we encounter false claims concerning topics we don't know much about, they are likely to negatively influence our doxastic attitudes in ways we can't prevent or correct. Consequently, it seems plausible that, when possible, we ought to take steps to avoid being exposed to such falsehoods – for instance, plausibly you ought not to seek the testimony of a pathological liar if you can help it. Of course, we can't avoid being exposed to falsehoods generally; but there are certain categories of falsehoods that we can avoid. In particular, we can often avoid falsehoods that are inconsistent with the available evidence. As we've seen, when clear expert consensus exists on some scientific issue, anti-consensus material is significantly more likely to contain falsehoods than is pro-consensus material. As such, we can minimize the false claims concerning such issues that we encounter by avoiding anti-consensus material.
Accordingly, the conclusion defended in the present section is that, when clear expert consensus exists on some scientific issue, laypeople, but not experts, have an epistemic obligation to avoid unnecessary exposure to anti-consensus material.Footnote 28 If you want to establish that a certain action is obligatory, a natural strategy is to show that each of the most plausible theories of right action entails that that action is obligatory. For instance, a simple way to establish that we ought not to kill human beings unnecessarily is to show that the most plausible consequentialist, deontological, and virtue-based accounts of right action each entails that we ought not to do so. While theories of epistemically obligatory action are not as well developed as theories of morally obligatory action, it isn't difficult to determine what epistemological analogues of the standard moral theories will look like. Accordingly, there are good reasons to maintain that the most plausible theories of epistemically obligatory action all entail that laypeople, but not experts, have an epistemic obligation to avoid unnecessary exposure to anti-consensus material.
First, one might characterize what makes a belief-influencing action epistemically obligatory in consequentialist terms. According to this theory, some action is an epistemic obligation if and only if it leads to epistemically good outcomes – for example, forming true beliefs and avoiding false beliefs, or acquiring knowledge and avoiding ignorance. If you are a layperson concerning some scientific issue and you consume anti-consensus material, the falsehoods you encounter are likely to produce a wide range of epistemically bad outcomes. Since you don't have many background beliefs about the topic, you are likely to end up believing the falsehoods you encounter thanks to the truth bias and truth effect; and you will be particularly likely to accept these falsehoods if the author or speaker is a prestigious member of a group with which you identify. Crucially, those falsehoods that you manage to avoid believing will still counteract the beneficial impact that encountering true claims would otherwise have. Moreover, these negative consequences are not likely to be reversed: if you later encounter a correction of some falsehood (and don't automatically reject it), the correction's beneficial influence is likely to be merely partial and temporary. And, even in cases in which you fully endorse the correction, the falsehood will remain stored in memory and infect the beliefs you form on the topic. Of course, any anti-consensus material you consume might well have many true things to say; but you will be able to learn almost all of those truths by consuming pro-consensus material on the topic. And while some anti-consensus material will include truths you can't find elsewhere, the positive epistemic impact of consuming such material will be significantly outweighed by all of the negative impacts we've just reviewed.
Conversely, when an expert consumes anti-consensus material, she is likely to avoid almost all of the epistemically bad outcomes that you would likely suffer. Because the expert has extensive background knowledge on the topic, she will automatically reject most of the falsehoods she encounters via plausibility checking. And because she will be better than laypeople at determining who is competent to speak on the topic, she won't have a tendency to accept the claims of prestigious non-experts. Moreover, the expert can employ strategies to protect herself that are not available to laypeople – for instance, research has shown that individuals can largely avoid being influenced by false statements when they actively correct or explicitly rate the accuracy of those statements as they encounter them (a strategy that requires that you know in advance that the statements are false).Footnote 29 And whereas the expert isn't likely to suffer the epistemic harms that a layperson would when consuming anti-consensus material, she is likely to achieve the epistemic benefits. In particular, when anti-consensus material is produced by a genuine expert, there is a good chance that the expert who consumes it will learn something important that she wouldn't have learned otherwise. So, for an expert, consuming anti-consensus material is not likely to do much harm and will frequently produce significant epistemic benefits.
Second, one might characterize what makes a belief-influencing action epistemically obligatory in deontological terms. According to this theory, some action is an epistemic obligation if and only if it conforms to our epistemic duties. For instance, an epistemic analogue of Kant's categorical imperative might be something like: act in such a way that you treat true belief (or accurate belief, or knowledge) always as valuable for its own sake. (Another way to express the point – drawing on Sylvan (2020) – would be that our actions should always manifest respect for truth or accuracy.) The characteristic feature of a Kantian view is that it prohibits the sorts of epistemic trade-offs that a consequentialist view permits – for instance, the Kantian insists that you ought not to acquire false beliefs in order to acquire other true beliefs. So, in order to act in accordance with this Kantian theory you must do everything you can to acquire only true beliefs; and in cases where clear expert consensus exists on some scientific issue, that means, for a layperson, consuming only pro-consensus material. If you, a layperson, consume anti-consensus material because you think you might learn something important, you're making a trade-off: you're allowing yourself to form a number of false (or less accurate) beliefs because there is a chance that you might acquire some true beliefs. (Another way to express the point: given that a layperson's doxastic attitudes are likely to be negatively influenced in systematic ways when she, say, reads an article dismissing climate change, or listens to a speech denying evolution, her actions do not manifest respect for truth or accuracy when she does these things.) Conversely, because an expert isn't likely to acquire false (or less accurate) beliefs when he consumes anti-consensus material, he is not making a similar trade-off when he does so; accordingly, the expert doesn't violate the epistemic analogue of Kant's categorical imperative when he consumes anti-consensus material.
Finally, one might characterize what makes a belief-influencing action epistemically obligatory in terms of epistemic virtues. According to this theory, some action is an epistemic obligation if and only if it is an action that an epistemically virtuous person would characteristically perform. A standard assumption is that an epistemically virtuous person is one who is motivated by the desire to acquire true beliefs (or knowledge) and avoid false beliefs.Footnote 30 Such an individual will not perform actions likely to diminish the accuracy of her doxastic attitudes unnecessarily. For instance, an epistemically virtuous person is epistemically cautious and possesses epistemic humility.Footnote 31 So, she is aware that she is particularly vulnerable to falsehoods when she has little background knowledge on some topic; and consequently, she takes the only precaution that will ensure that she maintains true beliefs and avoids false beliefs – namely, she avoids anti-consensus material. For a layperson to consume anti-consensus material because she believes she isn't likely to be influenced by the falsehoods it contains is to exhibit epistemic overconfidence; and to consume such material because she finds it reassuring, or engaging, or because she simply doesn't care about the risks, is to exhibit epistemic carelessness.
One might think that an open-minded individual will seek out anti-consensus material – that to always avoid such material is to exhibit the epistemic vice of dogmatism. But since epistemic virtues are directed at truth, being open-minded only requires giving some author or speaker a fair hearing when doing so has a reasonable chance of helping one form true beliefs and avoid false beliefs.Footnote 32 Accordingly, open-mindedness would require laypeople to consume anti-consensus material only if they could do so without being negatively influenced by the falsehoods such material is likely to contain. Cassam (2019: Ch. 5) suggests that a self-confident thinker will trust himself not to be taken in by misleading evidence; but, again, a layperson who isn't worried about being influenced by the falsehoods contained in anti-consensus material is inappropriately overconfident. A layperson who avoids anti-consensus material because he recognizes that his lack of background knowledge and various cognitive limitations make him vulnerable to the falsehoods such material contains exhibits epistemic humility, not dogmatism.
Conversely, an epistemically virtuous expert will sometimes consume anti-consensus material. Because an expert isn't vulnerable to the falsehoods anti-consensus material contains, his desire to acquire true beliefs and avoid false beliefs won't motivate him to avoid such material. Neither does he exhibit epistemic overconfidence or carelessness if he assumes that he isn't likely to be influenced by the falsehoods such material contains – that assumption is entirely appropriate given his store of relevant background knowledge. In fact, it seems clear that an open-minded expert will sometimes consume anti-consensus material – in particular, when that material is produced by a genuine expert. An expert who refuses to consider expert-generated anti-consensus material, even though there is little risk he will be taken in by any false claims it contains, thereby rejects the opportunity to possibly learn something novel and important. So, an expert who maintains that no anti-consensus material is ever worth considering exhibits dogmatism.
(A potential objection to the preceding argument is that most laypeople aren't obligated to avoid unnecessary exposure to anti-consensus material because most laypeople aren't aware of the extent of their vulnerability to false information. However, this objection assumes that individuals must recognize their vulnerability to false information in order to possess the epistemic obligation at issue – and we should reject that assumption. In particular, non-culpable ignorance is more plausibly regarded as providing an excuse for violating this obligation rather than eliminating it. Alternatively, we could respond to this objection by appealing to the distinction between objective and subjective obligationsFootnote 33 – that is, we could maintain that laypeople who are aware of their vulnerability to false information have both a subjective and an objective obligation to avoid unnecessary exposure to anti-consensus material, while laypeople who are ignorant of this vulnerability have only an objective obligation to avoid unnecessary exposure to anti-consensus material. However, some philosophers will insist that there is no sense in which a layperson has an epistemic obligation to avoid anti-consensus material if she has no reason to believe (from an internalist perspective) that she is vulnerable to false information in the ways at issue. Such philosophers should interpret the present argument as establishing a conditional thesis: if a layperson is aware of her vulnerability to false claims, then she has an epistemic obligation to avoid unnecessary exposure to anti-consensus material.Footnote 34)
5. Conclusion
Consequently, each of the most plausible theories of epistemically obligatory action entails that laypeople, but not experts, have an epistemic obligation to avoid unnecessary exposure to anti-consensus material. The overarching principle here is that each of us possesses cognitive features that make us vulnerable to falsehoods – particularly falsehoods that concern topics regarding which we know very little. As such, we ought to try to avoid being exposed to such falsehoods when we can. And when we are laypeople regarding some scientific issue where clear expert consensus exists, we have an available method for minimizing the falsehoods concerning that topic that we encounter: we can refuse to consume any anti-consensus material (since it is significantly more likely than pro-consensus material to contain falsehoods). But, of course, if we happen to be experts on some scientific issue, we don't run the same risks when we consume anti-consensus material; so, while experts may, laypeople ought not to consume anti-consensus material unnecessarily. We've now seen, then, that experts and laypeople have fundamentally different epistemic obligations; accordingly, these important differences between believers ought to shape philosophical examinations of our epistemic obligations to perform or omit belief-influencing actions.
The widespread consumption of anti-consensus material has caused enormous numbers of laypeople to form incredibly dangerous false beliefs concerning topics such as climate change, evolution, vaccines, and COVID-19. Fortunately, then, the foregoing argument enables us to give everyone some rather specific advice regarding how best to conduct their inquiries.Footnote 35 If you are a layperson regarding a scientific issue concerning which there is clear expert consensus, then if you encounter an author or speaker whose principal claims are advertised to be inconsistent with that consensus, don't read that author or listen to that speaker. For instance, if a television pundit starts going on about how COVID-19 is no more dangerous than the flu, or how it can be easily treated with hydroxychloroquine or colloidal silver, change the channel. If a friend or relative has some unconventional views about the dangers of vaccines, don't let him share them with you. If a leader of your religious community is going to be speaking about the lie that is the theory of evolution, don't attend services that day. And if you come across an article or book claiming that anthropogenic climate change isn't a serious problem – regardless of whether the author is a prestigious political figure, a respected intellectual, or even a legitimate climate scientist – make sure you don't read it.Footnote 36