
Engaging with “Fringe” Beliefs: Why, When, and How

Published online by Cambridge University Press:  13 June 2023

Miriam Schleifer McCormick*
Affiliation:
School of Arts and Sciences, University of Richmond, Richmond, VA, USA

Abstract

I argue that in many cases, there are good reasons to engage with people who hold fringe beliefs such as debunked conspiracy theories. I (1) discuss reasons for engaging with fringe beliefs; (2) discuss the conditions that need to be met for engagement to be worthwhile; (3) consider the question of how to engage with such beliefs, and defend what Jeremy Fantl has called “closed-minded engagement”; and (4) address worries that such closed-minded engagement involves problematic deception or manipulation. Thinking about how we engage with irrational emotions offers a way of responding to these concerns. Reflection on engagement with fringe beliefs has wider implications for two distinct philosophical discussions. First, it can help illuminate the nature of beliefs, lending support to the view that not all states which are deeply resistant to evidence thereby fail to be beliefs. Second, an implication of the view I put forth is that it need not constitute a lack of respect to adopt what Peter Strawson called “the objective stance” in relationships.

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

The central questions in the philosophy of disagreement concern if and how one's epistemic state should shift when confronted with what has come to be called an epistemic peer. “Peer disagreement” is disagreement between two people who are in equally good epistemic positions and equally likely to be correct. Should encountering such peer disagreement lower your confidence in the proposition believed or are you permitted to remain steadfast in your original belief?Footnote 1

I am concerned with a question that focuses on disagreements of a different kind. How should you respond when you think part of the reason someone disagrees with you is that they are not your epistemic peer, when they have come to hold a clearly false belief that the evidence does not support? What are the appropriate reactions towards someone who believes, for example, that the earth is flat, that the COVID-19 pandemic is a hoax, or that there is a cabal of Satan-worshipping pedophiles controlling the deep state? I will use the term “fringe beliefs” for beliefs for which a substantial amount of evidence accessible to the believer shows them to be false, ones that you know (or are justified in believing) are false.Footnote 2

Some may argue that the moment one discovers disagreement, this discovery is enough to diminish one's capacity to know. I am not here going to take a stand on what is required for knowledge, and on certain infallibilist views of knowledge, given skeptical possibilities, knowledge may be unattainable. The kind of case I am concerned with here is one in which a subject (who I am calling “you”) does not take the other party in the disagreement to be in an equally good epistemic position, where you can see that they came to the view they hold because they have failed to attend to available evidence or have exhibited some other fairly obvious epistemic flaw. This can be contrasted with the kinds of cases Christensen, and others, discuss where, even if you are very confident in your view, it at least seems like an open question whether you should, for example, check the calculation you made to ensure you got it right. It is important to distinguish these beliefs from those that are simply non-mainstream or minority views, keeping in mind that believers in the heliocentric view of the universe and eighteenth-century abolitionists occupied such positions.

While some of the examples I consider are beliefs in conspiracy theories, the class of beliefs I am concerned with is not co-extensive with such beliefs. Some conspiratorial beliefs are well-grounded, and some “fringe” beliefs may not include a conspiratorial element.Footnote 3 Someone who believes, for example, that the COVID vaccine causes enlarged testicles may not think there is a conspiracy in place to keep this information hidden. Recently it has become harder to see these sorts of beliefs as “fringe,” as they have been propagated by politicians and media and have become more mainstream. I cannot give a strict definition of the category of beliefs I am interested in, but I hope the examples I provide are sufficient for the reader to understand the kind of problem that concerns me. A concern that arises is that what gets placed in this category may fluctuate with who holds power, especially given that part of how I have characterized these beliefs includes a subjective element, namely that you take them to be obviously false. It is important to remember that there is also an objective element; these beliefs must be false, with a substantial amount of available evidence showing they are false. If the propositions I have provided as examples turned out to be true (e.g., the Earth is flat after all), then I will have been mistaken to place them in the category.

I will argue that, in many cases, there are good reasons to engage with people who hold fringe beliefs. I will (1) discuss reasons for engaging with people who hold fringe beliefs; (2) discuss the conditions that need to be met for engagement to be worthwhile; (3) consider the question of how to engage with such beliefs, defending what Jeremy Fantl has called “closed-minded engagement”; and (4) address worries that such closed-minded engagement involves problematic deception or manipulation. Thinking about how we engage with irrational emotions offers a way of responding to these concerns.

Reflection on engagement with fringe beliefs has wider implications for two distinct philosophical discussions. First, it can help illuminate the nature of beliefs, lending support to the view that not all states which are deeply resistant to evidence thereby fail to be beliefs. Second, an implication of the view I put forth is that it need not constitute a lack of respect to adopt what Peter Strawson called “the objective stance” in relationships.

1. Why engage with fringe beliefs?

What is engagement? As academic philosophers and teachers, we recognize what it feels like to be engaged, or when our audience or students are engaged. Indeed, one of the difficulties of lockdowns during the COVID-19 pandemic was the loss of in-person contact, resulting in a diminished capacity for engagement. The kind of engagement found in philosophical discussion includes a willingness to listen and learn, a sharing of ideas and reasons, and an openness to revision. Engagement requires meaningful contact, connection, and involvement.

In thinking about why we generally approve of engaging with those with whom we disagree, we can remind ourselves of Mill's reasons in chapter two of On Liberty. To not engage with others' views, even when we are quite sure they are false, can lead to us missing out on truth; we should be aware of our fallibility. Even if we are certain that a belief is true, Mill says if we don't engage with others “it will be held as a dead dogma, not a living truth.” We come to understand our reasons for the belief better through engagement. Further, such engagement can help legitimize collective decisions.

These are good reasons, in general, to engage with others, but they may seem not to apply in cases where the beliefs are so clearly false that there is no possibility I will change my view. Indeed, the question of what effect learning about these disagreements should have on my epistemic state has a clear answer: steadfastness is the appropriate attitude, and no adjustment to my credence is needed. If epistemology is a solitary venture aimed at acquiring the most knowledge possible, then there is no reason to engage with these believers at all. But if we are attuned to the social dimension of epistemology, and we see that false and potentially pernicious beliefs are spreading, we have good reason to engage with at least some of these believers with the aim of altering their epistemic attitudes, and to allow for more collective knowledge. Many recent discussions of what responsibilities we have as believers go beyond ensuring that we have done our best, à la Descartes, to clear our individual minds of prejudices and do what we can to form mostly true or justified beliefs. We also have responsibilities to one another in our capacities as believers. We rely on one another in figuring out what to believe, and I can expect you to have reasons for believing what you do and to be able to articulate them if asked.Footnote 4

While the nature and rationality of conspiracy theories has been the topic of a lively philosophical debate for at least two decades,Footnote 5 philosophical discussion of engaging with fringe beliefs and believers is found mostly in suggestive material in the blogosphere. One recurring idea is that various kinds of community building are needed. John Greco stresses the importance of epistemic communities, and the need to think about “building better, or at least different, epistemic communities. And that involves the hard work of building relationships of trust. One reason for hope in this regard is that the boundaries of epistemic communities are neither rigid nor singular … epistemic communities intersect and overlap, and they come in and out of existence all the time.”Footnote 6

In thinking about why people are drawn to conspiracy theories, it is often pointed out that they offer a way of making sense of the complexities and contingencies that people face; they empower people with a feeling of control. Bortolotti and Ichino have shown that many irrational beliefs are attempts to protect mental health by responding to the human need for control, understanding, and belonging. Further, sometimes irrational narratives point to something that really does require a response.Footnote 7 Engagement with such believers may offer alternative ways of addressing these needs, which could help diminish the spread of these irrational beliefs and thus improve the epistemic health of our communities. It would be preferable if collective epistemic health did not come at the cost of individual mental health. The hope is that by engaging with those who are attracted to fringe beliefs, and by better understanding their attraction, such beliefs would no longer be needed to protect mental health.

A final reason to engage is that to not engage is to fail to respect the believer. Respecting another person involves respecting their right to put forward reasons. If we fail to engage, we fail to treat these believers as rational agents whose acts and attitudes are governed by reasons. To dismiss someone as a “lost cause,” to assume that they only hold the beliefs they do because they are intellectually or morally flawed, does not only risk losing potential knowledge; it is morally problematic. Indeed, it seems we are ethically required to keep an open mind when confronted with disagreement, and that failing to do so entails failing to respect the other party in the disagreement.Footnote 8 Whatever general requirement we have here, however, may often fail to hold when it comes to fringe beliefs. If someone advocates a fringe claim insincerely, or exhibits extreme irrationality such that they are no longer responding to reasons, they may well lose their right to be respected. I will now consider the conditions under which we “ought” to engage with such believers. In most cases this is not an obligation; rather, it is often permissible, and a good thing to do. Engaging exhibits care, and can also lead to a reduction of harm.

2. Limits on when to engage

Normally, the kind of engagement I have been discussing includes a degree of open-mindedness. Even if I am convinced of a certain philosophical view, when the time comes for Q&A, I am open to revising my arguments in light of discussion, and sometimes discussion leads me to change my view. It may seem that this kind of open-minded engagement is out of place when it comes to fringe beliefs. I will now consider a number of reasons to not engage with these beliefs.

One may worry that engagement makes one vulnerable to a potential loss of knowledge. Kripke (Reference Kripke and Kripke2011) presents this worry in articulating the “dogmatism paradox.” If I know something, then I also know that any evidence against it is misleading, and so I can resolve to not be swayed by it. In this way, taking a dogmatic attitude, rather than obstructing knowledge, can protect it. Jeremy Fantl presents a similar argument against open-minded engagement in certain circumstances. If you know something, “you can know that a relevant counterargument is misleading even if you have no idea what the content of the argument is (outside of its conclusion).” You should not engage open-mindedly with such arguments because “you shouldn't be willing to reduce your confidence conditional upon engaging with the argument, finding the steps compelling, and being unable to expose a flaw” (Reference Fantl2018: 132).

Defending the view against open-mindedness in such circumstances, Robin McKenna (Reference McKenna2023) suggests that those who are most vulnerable to this diminishing confidence, and hence loss of knowledge, are those who are at risk of experiencing testimonial injustice. For such agents, he argues, there cannot be an obligation to engage with challenges to their well-grounded beliefs.

Another worry is that such engagement legitimizes the viewpoints and allows them to spread. By taking the time to refute the claims of a Holocaust denier, or someone who believes that Bill Gates is out to control people through 5G technology once microchips are implanted in them through vaccines, I may misrepresent something as a legitimate disagreement that, in fact, is not.Footnote 9

Finally, an important reason not to engage is that to do so is a waste of time and cognitive resources. Cassam, who argues we have an epistemic obligation to engage with all challenges to our views, even the ones he calls “crackpot,” considers this concern, noting that “A worry about this approach is that it places excessive demands on knowers.” He asks, “How many ordinary citizens are competent to refute the claims of 9/11 conspiracy theorists?” Cassam's answer is that most are, especially if we work at educating citizens to distinguish truth from lies. When considering whether it is asking too much of an agent (S) to engage with conspiracy theorists, he says “It's certainly at odds with the notion that S can retain her right to be confident that P, and her knowledge that P, without having to lift a finger…. If asking ‘too much’ of S means requiring her to do something to protect her knowledge then it's true that too much is being asked of her” (Reference Cassam2019: 119).

Cassam's response to this worry is too quick. Even if one has the ability to engage and refute these beliefs, is it really a good investment of a limited resource? Whether engagement is too demanding likely depends on the context of one's life, and the other roles one plays besides knower. Some factors to consider in deciding when to engage include the following: the background and motives of the believer, the perniciousness of the belief's content, the potential of the belief to spread, and one's relationship to the believer. Some people may assert propositions such as “the Holocaust never occurred,” even if they do not believe it, because it is a way to express and support hatred. Recently we have seen politicians making claims about election fraud that they quite clearly did not believe but that they thought would increase their popularity. When someone argues in bad faith or simply lies, reasons to engage diminish. Indeed, some of what might be called “fringe sources,” like “news” stations whose hosts propagate claims they know are false, which then lead others to form beliefs based on their testimony, are better ridiculed, chastised, or “deplatformed” if possible.Footnote 10

By contrast, if a good friend who has generally been open to rational discussion loses their job and, stuck in their house for months and spending a lot more time on social media, begins to espouse some of the tenets of QAnon, finding a way to engage without alienating this friend is important. But rather than starting with a bombardment of facts or contemptuous dismissal, it is likely more productive to try to understand how and why they were drawn to these ideas. Engagement of this kind is not open-minded; there is an asymmetry between the parties. My mind is not open to adopting my friend's beliefs about QAnon, but I hope that their mind is open to change. I will now turn to a further discussion of closed-minded engagement and some potential problems with it.

3. Closed-minded engagement

There are good reasons to engage even when one's mind is not open, but can one still count as engaging if one's mind is closed? We frequently engage with very young children in this way. We do this because we are curious and want to understand them and their perspective, and, at times, to teach them. Similarly, when engaging with freshmen students in their first philosophy classes, I do not need to be open to changing my mind on how to best interpret a philosophical text or how to think about free will. But discussion on these topics still counts as engagement if I am interested in their ideas and thought processes.

Closed-minded engagement involves being empathetic, listening, and trying to understand the other and how and why they came to have these beliefs. An example of this kind of engagement is found in the Listening Project, a conflict resolution organization that tries to effect socially just outcomes in community disputes. The Listening Project seeks to change attitudes on topics such as race and homosexuality by “understanding where people are starting from and seeing the potential. It involves listening at a very deep level so that one builds a relationship of trust and respect.” One hears similar language from Christian Picciolini, a reformed neo-Nazi who seeks to change the attitudes of those in the grip of racist ideology.Footnote 11

The risk of losing knowledge is diminished in this way, as is the worry that the belief will gain legitimacy, since there is no moment in such engagement when the belief is treated as one that could be justified or true. The emphasis, instead, is on the believer: trying to understand why they believe what they do and seeking ways to get them to see their own beliefs as problematic or unstable. If I am engaging with the believer, does this mean I am not engaging with the fringe belief? Even if I am not focusing primarily on the rational status of the belief, the hope is that this kind of engagement will lead to a discussion of the belief. This differs from a case where one has decided that there is no longer any point in discussing a certain topic with someone but one still engages with that person in other ways and on other topics.

I have borrowed the idea of closed-minded engagement from Fantl, who thinks that engaging with fringe beliefs in this way tends to involve a problematic insincerity, deception, and potential manipulation: “If you present yourself as open or at least fail to emphasize the degree to which your mind is closed, if you are in fact closed-minded you can easily be problematically insincere” (Fantl Reference Fantl2018: 158). He argues that the only way the Listening Project can succeed in changing minds is if the engagement appears more open than it actually is. Commenting on a conversation between Walters, a member of the Listening Project, and Jeff, a person with racist beliefs, Fantl says:

There is a kind of insincerity in Walters's “nonthreatening questions and accepting manner.” First, because the questions have to be asked with an “open-mind and heart,” Walters can't represent himself as disagreeing – which he presumably does – with Jeff's initial claims about the source of difficulties in the nearby black community. Second, Walters does not give a complete representation of his purposes in asking Jeff the questions he's asking. (Reference Fantl2018: 163–64)

Why should trying to correct a false, and potentially harmful, belief be different from trying to correct some harm more generally? In general, I don't have to pretend to be open-minded when I confront someone about something that might be wrong or bad for them. I don't have to pretend, for example, that I am willing to be convinced of the benefits of heroin use when I express my deep concern about its harm.Footnote 12 The difference in the doxastic context is that in discussing this belief with someone it appears as if one is engaged in inquiry. Typically, when we ask people for reasons and respond to them, we are partners in the activity of inquiry. Indeed, the norms of shared inquiry include the standard of mutual answerability.Footnote 13 When engaging with fringe beliefs, that is not what we are doing. Yet there is much empirical evidence that being upfront about closed-mindedness is not effective. When I state explicitly at the outset that I find the other's views abhorrent and obviously false, the possibility of their openness and belief-alteration diminishes.Footnote 14

Returning to the example of conversation with a heroin user, we can imagine two scenarios: one where the heroin user sees nothing wrong with heroin use, and another where the heroin user does not disagree but is powerless to stop using heroin. The first case is more akin to conversation with a person with fringe beliefs. In both situations, we want the person to see their own behavior or beliefs differently and to recognize that they have been led down a path that has diminished their autonomy.

I noted earlier that an important reason to engage is that not to engage displays a lack of respect. The same criticism can be applied to closed-minded engagement. Am I not required to take what Peter Strawson calls the “objective attitude” with believers in conspiracy theories? Must I view someone with such fringe beliefs “as a subject of social policy; as a subject for what, in a wide range of senses, might be called treatment … to be managed or handled or cured or trained?” (Reference Strawson1974: 9), rather than taking the “participant attitude,” which is essential to “ordinary adult human relationships” and includes a range of reactions such as resentment, gratitude, and forgiveness?

My answer is a qualified “yes.” Engaging closed-mindedly does require using the objective attitude, but in a way that Strawson himself points to as not so worrisome, namely one that is compatible with viewing the person as “normal and mature.” According to Strawson (Reference Strawson1974: 9), we can use the objective attitude as a “resource” in interpersonal relationships, and this does not require taking a “wholly objective” view of the person. Strawson recognized that adoption of these attitudes is not wholly dichotomous. He says, “I must deal here in dichotomies and ignore the ever-interesting and ever-illuminating variety of case” (Reference Strawson1974: 9). In a conversation with a person who holds fringe beliefs, different attitudes may be present in the same human interaction.

4. Belief alteration modeled on emotion alteration

Fantl considers ways one might avoid deception in closed-minded engagement, but says these will not work if the aim is to convince others. For example, he considers Rosenberg's nonviolent communication (NVC) process: “Rosenberg advocates a mutual honesty of expression that the Listening Project doesn't emphasize. While the Listening Project emphasizes passive techniques for drawing the speaker out until they change their mind, Rosenberg's ‘NVC Process’ is designed to get the listener and the speaker to understand each other about their feelings and needs” (Reference Fantl2018: 172).

The immediate goal is mutual understanding and humanizing, not mind-changing, though there is hope that the process can lead to a change in perspective. Fantl is doubtful that complete honesty about one's closed-mindedness can lead to belief alteration, saying, “you can only permissibly engage closed-mindedly by being honest about your closed-mindedness. This can be a difficult trick to pull off effectively, if your goal is to convince others, since honesty about your closed-mindedness is just the sort of thing that can alienate your listener and put them on the defensive.”

But one can hope that engagement will lead to belief alteration while seeing that as different from needing to convince others through direct confrontation.Footnote 15 The route can be indirect, through mutual understanding and by helping to create an openness to take in new information. This seems to be the point of the NVC process. If we think about how we engage with someone when aiming to alter their emotions, we can see how this might work. There are different strategies we might take when someone has an irrational or problematic emotion. Reflecting on these can help us imagine ways of engagement that, though deviating from the thoroughgoing participant stance where we regard each other as equally rational, need not devolve into the wholly objective stance either. Is there really a difference between trying to directly convince someone they are wrong and engaging in a way that aims for them to change their belief? And, again, do I need to mask my intentions for the belief alteration to be successful? While being fully explicit about your intentions and hopes in such interactions might impede success, not saying everything need not entail deception. Again, thinking of how we engage with problematic emotions is helpful here. Coming right out and saying exactly how you feel about someone's misplaced anger or fear is often not helpful, but being less than fully forthcoming does not mean one is being deceptive.

Whether a particular emotion is appropriate, justified, rational, or one I “ought” to have partly depends on the evidence. Suppose I am angry at my neighbor because I think she ran into my boat on a foggy day. If the fog lifts to reveal no one is in the neighbor's boat, my anger will disappear, and if it does not, then I can be rightfully criticized.Footnote 16 How to assess the appropriateness of anger and other emotions in more murky circumstances is complicated. Even if it turns out that I was correct that my neighbor carelessly ran into my boat, other considerations can support being critical of my anger. These may include, for example, that I have a disposition to become angry too quickly, which has been detrimental to me and my relationships. If I am trying to alter this disposition, then I have a reason to not become angry despite the evidence. Though my anger may be fitting in such a case, I have a value-related reason to not have this fitting attitude.Footnote 17

So there are times when I ought not to have a particular emotion, but you telling me that I shouldn't feel this way, and offering evidence for why I shouldn't, does little to alter my emotion. To put it simply, evidence does not immediately affect feelings. We see this clearly in the case of recalcitrant emotions; I can still experience fear even if I am convinced that the spider is not at all dangerous. Fringe beliefs are similar in that the believer's conviction is very intense, so it is not surprising that such beliefs are difficult to alter. Thinking of beliefs as like emotions can explain why, if one's belief is seen as centrally connected to one's identity, even if there is overwhelming evidence against the belief, it remains resilient. The way to get people to be open to changing their beliefs is to find ways to make them more secure and less defensive, so that the presentation of facts does not feel like an assault on who they are.Footnote 18

Thinking about how we appropriately respond to each other's emotional states can help address the worry about closed-minded engagement being a failure to respect someone as a person. At times, recognizing that persons are not only rational agents, capable of choosing their own ends, but also part of the causal order as “cogs in the grand machine of nature” (Langton Reference Langton1992: 495) is what is needed for proper respect. As noted above, the objective and participant stances are not mutually exclusive. According to Mark Schroeder, the objective stance need not be diminishing or disrespectful “because persons are in fact a kind of thing, it should not be surprising if the participant stance – the perspective par excellence from which we engage with one another as persons – does not preclude prediction, interpretation under a causal lens, or even, perhaps, at least some kinds of management” (Reference Schroder2019: 100–1).

If I know you, and I know that you have a tendency to snap when you are tired and hungry, it is not disrespectful for me to explain your behavior by appealing to these “merely causal” explanations. Conversely, I can fail to respect someone by demanding rational engagement when it is not appropriate. Schroeder considers how to react when his wife snaps at him about the jasmine vines he is working on coming along too slowly, by thinking that she may have had a bad afternoon at work. While he agrees that “if every time Maria complained to me about something, I were to immediately speculate about her mood, that would be a sign of something wrong with our relationship,” he says that in certain circumstances, such as when “the vines about which she is complaining are the same ones she complimented last week, or if she easily transfers her complaints to other topics rather than engaging with me about the growth of the jasmine,” then “it is not, at the very least, a flaw in our relationship, if I resort to the causal mode of interpretation of her behavior” (Reference Schroder2019: 101). I think a stronger claim can be made: to insist that his wife respond to reasons and to react to her with disappointment or resentment would exhibit a flaw. It would be a failure to see her for who she is at this moment, and such failures, I submit, can be characterized as a kind of lack of respect.

And what applies to one's emotional state, such as irritability, and one's behavior, such as complaining or lashing out, can also sometimes apply to doxastic states. A striking recognition of this has been recently articulated by Stuart Thompson, a New York Times columnist who spent weeks listening to a QAnon audio chat channel. Hearing the voices of QAnon believers made them seem less strange to Thompson, and some of the appeal of this community, which offered clear answers and solidarity, started to make more sense:

Beneath the anger in their voices is often pain or confusion. When the chat dies down to just a few members, they'll share stories about their struggles with affording health insurance or the shame of going on government assistance. Hearing them talk with one another, I could start understanding the pull of conspiracy communities – how they exploit the vulnerable and create a worldview out of shared enemies. Then you can watch those views harden. And while none of it excuses participation in a dangerous collective delusion, it takes the complex process of radicalization and gives it a human dimension.Footnote 19

This type of objective stance with its “causal mode of interpretation” can humanize the person one is trying to understand or engage with. This is not to say that members of the group could never be reasoned with or that there is no possibility of discussing evidence for their beliefs with them. As Scott Aikin and Robert Talisse (2021) argue, evidence does matter to some degree; QAnon believers needed explanations when QAnon predictions failed to come true. In the face of evidence, some gave up their beliefs and others found their convictions weakened. But to even hope to have any kind of discussion of evidential reasons for fringe beliefs, it is important to understand the practical reasons that make them so attractive to many.Footnote 20

These believers are not unique in having beliefs that are not formed or maintained on the basis of evidential, or what some call “epistemic,” reasons. In a recent Financial Times column, John Thornhill writes: “Ultimately, we cannot reason people out of beliefs that they have not reasoned themselves into. But we can, and should, punish those who profit from harmful irrationality.”Footnote 21 Yet the vast majority of our beliefs are ones we do not reason ourselves into. Rather, we find ourselves with them, many the result of perceptual experiences we have forgotten and testimony we have not questioned. And for a great many of these beliefs, once we are presented with reasons or evidence against them, we give them up. Sometimes this is easy. Most of us grew up thinking Pluto was a planet. Yet, when scientists re-classified Pluto and explained their reasons, which many of us barely understood, most of us quite easily changed our beliefs. Sometimes, however, even quite mundane beliefs can be hard to give up, if they have hung around long enough.Footnote 22

What Thornhill must have meant is not that these beliefs are ones we did not “reason ourselves into,” but rather that they are not reasons-responsive in the way that beliefs ordinarily are; fringe beliefs are resilient beliefs that do not seem to respond to evidence. Yet, some of our most foundational beliefs are like this, such as the belief that there is an actual world of physical objects. Arguments that show that these beliefs cannot be justified by appeals to evidence do nothing to alter conviction. One may object, though, that in such cases one is not being presented with evidence against one's beliefs. Perhaps if evidence mounted in favor of our living in a computer simulation, for example, the beliefs would start to shift. Belief in a physical world of objects is such a central belief, on which so many other beliefs hinge, that change would require a lot of evidence. This is also the case with some fringe beliefs. When a whole way of seeing things needs to shift, when an edifice of beliefs is at risk of crashing down, the effect of evidence will be weaker than in other, more mundane cases. Comparison to religious or superstitious beliefs can be helpful here; religious beliefs do not exhibit certain characteristics typical of “ordinary” beliefs, in that they are highly resistant to countervailing beliefs or evidence and don't connect to action in the way many beliefs do. Many philosophers have taken this as reason to claim that religious beliefs should not be considered beliefs at all.Footnote 23

The difference between such evidence-resistant beliefs and fringe ones is, at least partly, explained by their content. One of the factors to consider when deciding whether to engage (see above) is the perniciousness of the belief's content. Most of us have superstitious beliefs or engage in superstitious behavior. Sports fans and players engage in rituals to improve their chances of winning. Many think uttering certain words out loud can “jinx” a situation, such as commenting on how smooth the traffic is or how fine the weather is. The following is a sample of the many superstitious beliefs and practices that were reported when I surveyed my friends (many of whom are academic philosophers):

Sneeze or stumble on the way out and whatever was being planned will turn out poorly.

When flipping a coin, heads: one should make the conservative choice; tails: take the risk.

It is bad luck to split a pole, to hang laundry on New Year's Day, and to fail to salute magpies.

It is good luck to say “Rabbit, rabbit, rabbit” on the first day of a month, to tap your dashboard seven times before a road trip, to make a wish when you see a shooting star, and to avoid scratching an itch on your right palm.

One may wonder if these count as real beliefs. They do not seem sensitive to the evidence, or as connected to actions as many beliefs are. People don't tend to stop performing such rituals even if they have no effect on their luck. While the metaphysics of belief is beyond the scope of this paper, there is a risk of depopulating the category of belief if we adhere too strictly to the idea that if a state does not respond to evidence or result in the kind of behavior typical or expected of belief, it is not a belief after all, but a different state.Footnote 24 I did specifically ask my friends what superstitious beliefs they have. Even if they are beliefs, given their benign content, there may be no need for engagement to change them.

It is significant that many of my friends were somewhat embarrassed by having these “crazy” beliefs.Footnote 25 The worry about many of the holders of fringe beliefs is that they do not view these beliefs as different from other, more mundane beliefs. They view them as well supported by evidence, and a few may be willing to act on them in ways that can lead to serious harm to themselves and others. I see the main goal of engaging as getting people with fringe beliefs to recognize that some of their beliefs are different and less reason-responsive than other beliefs they hold. And “we” should be willing to do the same, namely, recognize which of our beliefs are less open to question, less responsive to or grounded in evidence, and then we can ask why some beliefs have this special status. If part of the reason I believe in the ultimate goodness of humanity is that it helps me be more cheerful and charitable, I can endorse this practical reason. If part of the reason that someone believes that the Democrats are run by a Satan-worshipping cabal of pedophiles is that it allows her to be part of a community she otherwise lacks, or offers her a sense of power and control she has lost in life, this is not a reason she can endorse. A key difference is that I can be aware of this practical reason and still maintain my belief, while the “fringe” believer needs to remain unaware so as to maintain her belief. It is possible to understand ourselves better, and to see that beliefs do not only depend on evidence or on reasons that can be articulated and shared. The kind of engagement I have been defending aims to increase the fringe believer's awareness of their motivations, rather than to present them with a lot of evidence that they are wrong.

When engaging with others, there is a range of ways of relating, from fully collaborative inquiry to adopting a wholly objective stance. As we persist in finding ways to engage with those who hold fringe beliefs, I propose that we use this mixture of participant and objective attitudes, as when engaging with irrational emotions – bringing empathy and respect to the engagement while rejecting ideas that are false and dangerous. This mixture of engagement is crucial for improving the epistemic health of our community.

5. Conclusion

I have argued that one has good reasons to engage closed-mindedly with people who hold fringe beliefs when (i) the cognitive and emotional costs of doing so are not too high and (ii) such engagement has the potential to lead to a change of view. I attempted to assuage the concern that this kind of engagement must involve problematic deception or manipulation. While such engagement differs from the mere asking and giving of reasons or evidence for beliefs, it does not treat the believer as incapable of such rational discourse. Throughout my discussion I have assumed that at least many people count as fully believing these propositions even though they are highly resistant to the evidence. This seems like an uncontroversial assumption since they are widely labeled and treated as beliefs. Yet, as mentioned above, many epistemologists have suggested that attitudes that fail to respond appropriately to evidence, or to result in the kind of behavior typical or expected of belief, are not actually beliefs. I hope the foregoing discussion has helped to highlight the problematic implications of such a restriction. If epistemology focuses on some special sub-set of beliefs, to the exclusion of delusions, religious faith, and political convictions – if these are not proper subjects of epistemic evaluation – then many other intuitively irrational and unjustified beliefs, such as those that have been the focus of this paper, will also be excluded. But something has gone wrong if epistemology is no longer concerned with investigating and evaluating such beliefs.Footnote 26

Footnotes

1 See Christensen (2007: 187–89) for a clear introduction to the issue and to what is meant by an epistemic peer.

2 Neil Levy (2020) has called the kinds of belief I am interested in “bizarre,” and a recent NPR discussion calls them “bonkers.” Quassim Cassam (2019: 117, 120) uses the term “crackpot” as a descriptor. An anonymous reviewer wondered in what sense the attitudes I am concerned with are either “fringe” or “beliefs.” As I acknowledge, the more widespread such beliefs are, the less appropriate it is to call them “fringe,” and so my use of this term is perhaps aspirational. I would also like to avoid a term that is as value-laden as “crackpot.” Further, on certain views of belief, ones that take beliefs to necessarily be responsive to evidence, or to be “aiming” at truth in such a way that they cannot be based on practical reasons, the counter-evidential resilience of these attitudes would reveal that they are not properly characterized as beliefs. Such “evidentialist” and “normativist” views of belief have been widely criticized (Leary 2017; Rinard 2015, 2019; Schleifer-McCormick 2015, 2018, 2019). Nonetheless, such non-doxasticism about many “problematic states” that do not behave as certain standard views predict they should is on the rise. I will discuss these issues a little more below, and I discuss them more extensively in Schleifer-McCormick (2022), but I will borrow Levy's (2022) response to such worries and say they are “beliefy” enough for us to treat them as such. He acknowledges that “people do not (straightforwardly) believe many of the things they profess to believe,” but argues, “differences in our beliefs (in states that are beliefy enough to count for our purposes) are at the heart of many of the central political issues of our day, such as disputes over climate change and vaccination. What people say they believe tends to explain and predict how they will act” (7).

3 There is disagreement among theorists about how best to define conspiracy theories. Some argue that part of what makes a theory a conspiracy theory is that the theory is lacking in evidence, is false, or is otherwise deficient. Harris (2022) has suggested we label such definitions “pejorative” definitions. Harris, and many others, suggest that what is essential to conspiracy theories is that they allege conspiracies that are contrary to the claims of authorities, though they could turn out to be true. Coady (2007) and Dentith (2017), among others, argue for a neutral definition of conspiracy theories, on which a conspiracy theory is just a theory about two or more people working together in secret towards some end, and each theory needs to be assessed individually to determine its epistemic status.

4 For discussion of these kinds of responsibilities see Osbourne (2021), Chrisman (2020), and Goldberg (2017).

5 See footnote 3.

6 https://blogs.cardiff.ac.uk/openfordebate/2020/09/07/why-are-we-so-polarized-and-how-can-we-move-forward-a-perspective-from-social-epistemology/. In a recent article, Hannah Gunn (2020) offers a conception of what an epistemic community is and what it means for such a community to be healthy. An epistemic community “consists of a collection of individuals who commit together to some epistemic aims or ends” (572), and “Healthy epistemic communities, then, will be ones with supportive practices of communication. They will be places that are organized to promote dialogue and deliberation, and they will work to support the development of communicative competence alongside competencies aimed at belief regulation” (569).

7 See https://theconversation.com/conspiracy-theories-may-seem-irrational-but-they-fulfill-a-basic-human-need-151324. Aragor Eloff presents a similar idea: “Empathetic and nuanced engagement with conspiratorial narratives can help nurture meaningful collective responses to the problems conspiracy theories suggestively outline… That the world is increasingly complex and uncertain means that very little is, or ever could be, orchestrated in the contrived ways conspiracy theorists propose, but it also means we have to become better equipped to deal with that complexity and uncertainty. While this may seem like a relatively solitary existential pursuit, a genuine sense of security is grounded in healthy, thriving communities.” https://www.newframe.com/beneath-conspiracy-theories-the-class-war/.

8 Marušić and White (2023) make this point when discussing what attitude to take when confronted with peer disagreement: “As long as the disagreement persists, and as long as one remains engaged in reasoning with the other, one has to regard the arguments mooted so far as not yet settling the question under dispute…the significance of the disagreement is interpersonal and, in a broad sense, ethical rather than evidential.”

9 This concern is lessened if the engagement is not public. As we shall see, a certain kind of private engagement with such believers may well be worthwhile.

10 Thanks to an anonymous referee for pointing out that we may want to treat “fringe sources” quite differently than we treat “fringe believers.”

11 For example, in an interview with the Washington Post, Picciolini says, “You can scream and use facts against somebody who's not thinking rationally, and you'll never change their mind, no matter what you say. We have to understand the motivation for why people hate.”

12 Thanks to John Greco for posing this question.

13 See Marušić and White (2023).

14 For an overview of some of these studies, see http://www.newyorker.com/science/maria-konnikova/i-dont-want-to-be-right. This 2014 piece cites Brendan Nyhan on changing people's minds about vaccines. According to Nyhan, vaccines are “not inherently linked to ideology.… That means we can get to a consensus.” Seven years later, beliefs about vaccines and diseases have in fact become political, so Nyhan's optimism may have been misplaced. A recent article (on which Nyhan is a co-author) shows how difficult it can be to correct people's beliefs about epidemics, and how dangerous misperceptions can be. See Carey et al. (2021).

15 One finds a similar approach being recommended in “street epistemology,” which encourages conversations about beliefs where “The goal is not to convince people to change their minds or to prove them wrong, but to help them become more aware of their own thought processes and to encourage critical thinking.” See https://streetepistemology.com/.

16 This story about the boatman and the fog is one that Ram Dass frequently tells.

17 For a discussion of how value-related reasons and fit-related reasons can come apart, see Howard (2019a, 2019b). While most theorists admit there can be legitimate value-related reasons for emotions and other attitudes, there is still a lot of hesitancy to admit there can be such reasons for belief. I have argued elsewhere that this exception for doxastic attitudes is unmotivated.

18 One may worry that this kind of engagement is problematically paternalistic; the examples I gave of when we engage closed-mindedly involve children and students. While there may be an unevenness in these interactions, it need not be problematic. A model of engaging with the aim of changing minds through indirect means is found in the doctor–patient relationship. While this kind of paternalism can lead to an epistemic injustice in which the patient's autonomy is questioned, this need not be the case. For a discussion of this kind of epistemic paternalism, see Bandini (2020).

19 “Inside a Pro-Trump QAnon Chat Room,” New York Times, January 26, 2021.

20 Once we have a better understanding of why they hold on to these beliefs, we can consider the best ways to engage. If, for example, we discover that they hold these beliefs for reasons of social signaling within fringe communities, we might disapprove of the community and its body of beliefs, and wish to engage with individuals to try to convince them to leave the community.

21 “Conspiracy theorists destroy a rational society: resist them,” Financial Times, January 14, 2021.

22 The radio program “This American Life” featured a number of stories like this. One described a college student who, in a discussion about endangered species, asked, “are unicorns endangered or extinct?” She didn't just feel embarrassed about being wrong; letting go of a belief that one has held for a long time feels like a loss, even when its content is not normative.

Though the content in this case is not normative, the fact that it has to do with unicorns may explain the student's feelings; it is not really the loss of a belief that matters, one may think, but the loss of unicorns. Belief revision about more mundane matters perhaps does not feel like much of anything. What is central, however, to the cases discussed is not their content but their longevity. One example was the belief that the “crossing” sign, which uses an “X,” is pronounced “Zing.” When a belief hangs around long enough it becomes part of you.

23 This is, for example, Robert Audi's position. Audi (1991, 2008) argues that there is a kind of propositional faith that does not entail belief, which he calls “fiducial faith.” His two main reasons for distinguishing between these states are, first, that faith includes a positive attitudinal element that belief need not, and second, that faith is compatible with a higher degree of doubt than is belief. This attitude is not simply a cognitive or intellectual state, which is what a belief is according to Audi, who accepts the cognitive conception. For similar reasons, William Alston (1996) thinks many attitudes of religious faith should be classified as “acceptances” rather than “beliefs.” A recent proponent of this view is Neil Van Leeuwen (2014, 2017), who argues that we should view the attitudes of factual belief and religious credence as different cognitive attitudes, again because “religious cognitive attitudes lack the defining characteristics of factual beliefs” (2). One such defining characteristic is evidential vulnerability; another is the connection to action that one finds with “factual beliefs.” Recently, Michael Hannon and Jeroen de Ridder (2021) have argued that most so-called political beliefs are not genuine beliefs for similar reasons. They say, for example: “In general, deeply held political beliefs seem unresponsive to evidence, driven by affect, and formed on largely non-evidential grounds…For these reasons, political beliefs (and other identity-constitutive beliefs) seem to be a different cognitive attitude than many ordinary world-modelling beliefs. In politics, we often care more about belonging and team loyalty than truth because, for many, politics is not really about truth” (158).

24 This problem of “depopulation” has been recognized recently by Ganapini (2020) and Helton (2020). Their way of addressing the issue is to relax the revisability or rationality constraints on belief. The revisability constraint is often taken to imply that if one recognizes evidence which undermines one's belief, one will thereby lose the belief; the idea is that part of what it means to believe is that you take the belief to be evidentially supported (see Adler 2002; Gendler 2008; Shah 2003; Van Leeuwen 2017; Velleman 2000). Helton (2020) defends a weaker revisability condition on which, in order to count as believing, one only needs the capacity to revise one's belief, even if one does not actually do so. The mere capacity to revise may well exist in even the most persistent delusions. In a similar vein, Marianna Bergamaschi Ganapini (2020) argues that to be a belief, a cognitive state needs to be “minimally rational…in the sense that they respond to perceived irrationality by re-establishing coherence” (3272).

The problem with merely relaxing the revisability and rationality constraints is that it does not offer a way to make normative distinctions among evidence-resistant, or action-discordant, beliefs. Some of these “problematic states” are not obviously flawed or irrational. Indeed, an important motivation for many proponents of non-doxasticism is that they view the norms governing belief as different from the norms governing these problematic states. If one recognizes that there can be value-related reasons to trust others, or to hold certain religious or political “beliefs,” and one denies that such reasons can be genuine reasons for belief, then this motivates the idea that these are not beliefs after all. It is widely held that the only genuine reasons for believing are evidential, or alethic; if my belief is based on a non-evidential reason (if such a thing is even possible) then such a belief must be defective. And yet one may not want to treat all these problematic states as being on a par. One could accept that there are two distinct categories of attitudes here: “regular factual beliefs,” and then find a label for these evidence-resistant or action-discordant attitudes. Suggestions for what to call these problematic states include “quasi-beliefs” (Bayne and Hattiangadi 2013), “in-between beliefs” (Schwitzgebel 2013), “aliefs” (Gendler 2008), “besires” (Swartzer 2013), and “bimagination” (Egan 2009). I argue (Schleifer-McCormick 2022) that rather than endorse non-doxasticism, we should re-think the standard view of belief so that we can include these states as beliefs and make normative distinctions between them.

25 An anonymous referee suggested that on Ganapini's account, these superstitions wouldn't count as beliefs, because the agent happily lives with their apparent irrationality, and offered this quotation in support: “to know whether an attitude is a belief, scrutinize how the attitude and the system react to detected irrationality and internal incoherence. If you see discomfort and coping strategies to avoid conflict, you are bearing witness to a belief.” I think my friends' embarrassment points to this discomfort. Again, as I said in footnote 2, for the purposes of this discussion it only matters that they are “beliefy” enough.

26 Versions of this paper were presented to the Social (Distance) Epistemology Series, the Ethics Working Group at the University of Richmond, the Value of Irrationality Workshop at the University of Zürich, the GRIN (normativity research center) at the Université de Montréal, and to the “Extreme Beliefs Project” at the Vrije Universiteit Amsterdam. Thanks to all who attended these presentations for their thoughtful questions and comments. Thanks to Scott Aikin, Jeremy Fantl, Melanie Sarzano, Sebastian Schmidt, and an anonymous referee of this journal for written comments on earlier drafts.

References

Adler, J. (2002). Belief's Own Ethics. Cambridge, MA: MIT Press.
Aikin, S. and Talisse, R. (2021). ‘Deep Disagreement and the QAnon Conspiracy Theory.’ 3 Quarks Daily, January 25, 2021. https://3quarksdaily.com/3quarksdaily/2021/01/deep-disagreement-and-the-qanon-conspiracy-theory.html.
Alston, W. (1996). ‘Belief, Acceptance and Religious Faith.’ In Jordan, J. and Howard-Snyder, D. (eds), Faith, Freedom and Rationality, pp. 10–7. Lanham: Rowman & Littlefield.
Audi, R. (1991). ‘Faith, Belief, and Rationality.’ Philosophical Perspectives 5, 213–39. https://doi.org/10.2307/2214096.
Audi, R. (2008). ‘Belief, Faith, and Acceptance.’ International Journal for Philosophy of Religion 63(1/3), 87–102. https://doi.org/10.1007/s11153-007-9137-6.
Bandini, A. (2020). ‘Epistemic Paternalism in Doctor–Patient Relationships.’ In Bernal, A. and Axtell, G. (eds), Epistemic Paternalism: Conceptions, Justifications, and Implications, pp. 123–38. London: Rowman and Littlefield.
Bayne, T. and Hattiangadi, A. (2013). ‘Belief and Its Bedfellows.’ In Nottelmann, N. (ed.), New Essays on Belief, pp. 124–44. London and New York: Palgrave Macmillan. https://doi.org/10.1057/9781137026521_7.
Carey, J.M., Chi, V., Flynn, D.J., Nyhan, B. and Zeitzoff, T. (2021). ‘The Effects of Corrective Information about Disease Epidemics and Outbreaks: Evidence from Zika and Yellow Fever in Brazil.’ Science Advances 2020, 1–10.
Cassam, Q. (2019). Vices of the Mind. Oxford: Oxford University Press.
Chrisman, M. (2020). ‘Believing as We Ought and the Democratic Route to Knowledge.’ In Schmidt, S. and Ernst, G. (eds), The Ethics of Belief and Beyond: Understanding Mental Normativity, pp. 47–70. Abingdon: Routledge.
Christensen, D. (2007). ‘Epistemology of Disagreement: The Good News.’ Philosophical Review 116(2), 187–217.
Coady, D. (2007). ‘Are Conspiracy Theorists Irrational?’ Episteme 4, 193–204.
Dentith, M.R.X. (2017). ‘Conspiracy Theories on the Basis of the Evidence.’ Synthese 196, 2243–61. https://doi.org/10.1007/s11229-017-1532-7.
Egan, A. (2009). ‘Imagination, Delusion, and Self-Deception.’ In Bayne, T. and Fernandez, J. (eds), Delusions, Self-Deception, and Affective Influences on Belief-Formation, pp. 263–80. New York, NY: Psychology Press.
Fantl, J. (2018). The Limitations of the Open Mind. Oxford: Oxford University Press.
Ganapini, M.B. (2020). ‘Belief's Minimal Rationality.’ Philosophical Studies 177, 3263–82. https://doi.org/10.1007/s11098-019-01369-y.
Gendler, T. (2008). ‘Alief and Belief.’ The Journal of Philosophy 105(10), 634–63. https://doi.org/10.5840/jphil20081051025.
Goldberg, S. (2017). ‘Should Have Known.’ Synthese 194, 2863–94.
Gunn, H.K. (2020). ‘How Should We Build an Epistemic Community?’ The Journal of Speculative Philosophy 34(4), 561–81.
Hannon, M. and de Ridder, J. (2021). ‘The Point of Political Belief.’ In Hannon, M. and de Ridder, J. (eds), The Routledge Handbook of Political Epistemology, pp. 156–66. Abingdon: Routledge.
Harris, K.R. (2022). ‘Conspiracy Theories, Populism, and Epistemic Autonomy.’ Journal of the American Philosophical Association 9(1), 21–36.
Helton, G. (2020). ‘If You Can't Change What You Believe, You Don't Believe It.’ Nous 54(3), 501–26. https://doi.org/10.1111/nous.12265.
Howard, C. (2019a). ‘Fitting Love and Reasons for Loving.’ Oxford Studies in Normative Ethics 6, 116–37.
Howard, C. (2019b). ‘The Fundamentality of Fit.’ Oxford Studies in Metaethics 14, 216–36.
Kripke, S. (2011). ‘Two Paradoxes of Knowledge.’ In Kripke, S. (ed.), Philosophical Troubles: Collected Papers (Volume 1), pp. 27–51. New York: Oxford University Press.
Langton, R. (1992). ‘Duty and Desolation.’ Philosophy 67, 481–505.
Leary, S. (2017). ‘In Defense of Practical Reasons for Belief.’ Australasian Journal of Philosophy 95(3), 529–42.
Levy, N. (2020). ‘Partisan Worlds: Left and Right Don't Occupy Different Realities.’ iai news, Issue 88, May 19, 2020. https://iai.tv/articles/partisan-worlds-auid-1548.
Levy, N. (2022). Bad Beliefs: Why They Happen to Good People. Oxford: Oxford University Press.
Marušić, B. and White, S. (2023). ‘Disagreement and Alienation.’
McKenna, R. (2023). Non-Ideal Epistemology (see Chapter 6). Oxford: Oxford University Press.
Osbourne, R. (2021). ‘A Social Solution to the Puzzle of Doxastic Responsibility: A Two-Dimensional Account of Responsibility for Belief.’ Synthese 198(10), 9335–56.
Rinard, S. (2015). ‘Against the New Evidentialists.’ Philosophical Issues 25, 208–23. doi: 10.1111/phis.12061.
Rinard, S. (2019). ‘Believing for Practical Reasons.’ Nous 53(4), 763–84.
Schleifer-McCormick, M. (2015). Believing against the Evidence: Agency and the Ethics of Belief. Abingdon: Routledge.
Schleifer-McCormick, M. (2018). ‘Responding to Skepticism about Doxastic Agency.’ Erkenntnis 83, 627–45. doi: 10.1007/s10670-017-9906-2.
Schleifer-McCormick, M. (2019). ‘Can Beliefs Be Based on Practical Reasons?’ In Carter, A. and Bondy, P. (eds), Well Founded Belief: New Essays on the Epistemic Basing Relation, pp. 215–34. New York and Abingdon: Routledge.
Schleifer-McCormick, M. (2022). ‘Belief as Emotion.’ Philosophical Issues 32, 104–19. doi: 10.1111/phis.12232.
Schroeder, M. (2019). ‘Persons as Things.’ Oxford Studies in Normative Ethics 9, 95–115.
Schwitzgebel, E. (2013). ‘A Dispositional Approach to Attitudes: Thinking Outside the Belief Box.’ In Nottelmann, N. (ed.), New Essays on Belief, pp. 75–99. London and New York: Palgrave Macmillan. https://doi.org/10.1057/9781137026521_5.
Shah, N. (2003). ‘How Truth Governs Belief.’ The Philosophical Review 112(4), 447–82. https://doi.org/10.1215/00318108-112-4-447.
Strawson, P. (1974). ‘Freedom and Resentment.’ In Strawson, P. (ed.), Freedom and Resentment and Other Essays, pp. 1–28. London: Methuen & Co.
Swartzer, S. (2013). ‘Appetitive Besires and the Fuss about Fit.’ Philosophical Studies 165(3), 975–88. https://doi.org/10.1007/s11098-012-0006-5.
Van Leeuwen, N. (2014). ‘Religious Credence is Not Factual Belief.’ Cognition 133(3), 698–715. https://doi.org/10.1016/j.cognition.2014.08.015.
Van Leeuwen, N. (2017). ‘Do Religious “Beliefs” Respond to Evidence?’ Philosophical Explorations 20, 52–72. https://doi.org/10.1080/13869795.2017.1287294.
Velleman, D. (2000). The Possibility of Practical Reason. Oxford: Oxford University Press.