
Concerns about Contextual Values in Science and the Legitimate/Illegitimate Distinction

Published online by Cambridge University Press:  03 June 2024

Inmaculada de Melo-Martín*
Affiliation:
Weill Cornell Medicine, Cornell University, New York, NY, USA

Abstract

Philosophers of science have come to accept that contextual values can play unavoidable and desirable roles in science. This has raised concerns about the need to distinguish legitimate and illegitimate value influences in scientific inquiry. I discuss here four such concerns: epistemic distortion, value imposition, undermining of public trust in science, and the use of objectionable values. I contend that preserving epistemic integrity and avoiding value imposition provide good reasons to attempt to distinguish between legitimate and illegitimate influences of values in science. However, the trust and the objectionable values concerns constitute no good reason for demarcation criteria.

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of the Philosophy of Science Association

1. Introduction

Although not without critics (Betz 2013; Hudson 2016; Cassini 2022), most philosophers of science now accept that science is value-laden and that it is all the better for it. True, the value-free ideal of science never precluded all value influences in the core of science. It has never denied that epistemic values play a crucial role in scientific reasoning. But many philosophers of science have gone further and argued that contextual values can also exercise unavoidable and desirable influences at the core of science, that is, in decisions regarding experimental design, choice of methodologies, characterization of data, and interpretation of results (e.g., Longino 1990; Douglas 2000, 2009; de Melo-Martín and Intemann 2007; Elliott 2013; Biddle 2013; Brown 2014; Anderson 2004; Dupré 2007; Wylie and Nelson 2007).[1]

Accepting that contextual values influence scientific reasoning, however, raises concerns about possible negative effects on the scientific knowledge produced.[2] Such concerns have led philosophers of science to a debate now known as "the new demarcation problem" (Holman and Wilholt 2022). While the "old" demarcation problem attempted to find criteria to distinguish between science and pseudoscience (Popper 1963), the new one seeks to distinguish between legitimate and illegitimate influences of contextual values in science (Holman and Wilholt 2022).

Various demarcation criteria have thus been proposed (ibid.), with philosophers also debating whether a set of necessary and jointly sufficient criteria or an open-ended list is needed to address the problem (Koskinen and Rolin 2022; Resnik and Elliott 2023). The debate, however, would benefit from more clarity. When demarcating legitimate and illegitimate influences of values in science, philosophers of science often fail to distinguish among what are conceptually distinct concerns regarding value influences in science. Distinguishing among these concerns is important because the reasons behind them differ, and they require different strategies to address them. Moreover, some of the concerns constitute no good reason for demarcation criteria.

Here I discuss four concerns usually offered as grounds for distinguishing between legitimate and illegitimate influences of contextual values in the core of science[3]: epistemic distortion, value imposition, undermining of public trust in science, and the use of objectionable values.[4] I contend that preserving epistemic integrity and avoiding value imposition provide good reasons for distinguishing between legitimate and illegitimate influences of contextual values in science. However, the undermining of trust and the objectionable-values concerns constitute no good reason for demarcation criteria. This is so because neither of them provides additional grounds for concern. Furthermore, to the extent that the latter problem is properly understood as one involving conflicts of values, the distinction between legitimate and illegitimate values is unhelpful, as what is at stake are disagreements about which values should be influencing research.

2. Contextual Values in Science: Why Worry?

One could worry about the influence of contextual values in science for various reasons. I do not claim that the ones discussed here are either the only reasons or the only appropriate ones. My claim is that some of those reasons call for demarcation between legitimate and illegitimate value influences while others do not. Nor am I arguing that the problems discussed here are wholly unrelated. Indeed, I believe one reason why it is difficult to address these various problems is that they are connected. My argument is that the problems discussed are conceptually distinct and call for different solutions.[5] Failing to distinguish them when trying to find demarcation criteria is therefore bound to produce wrong answers, that is, inadequate criteria or ineffective solutions. Moreover, when I present some authors as concerned with one or another problem in the discussion below, I do not mean to suggest that they are concerned only with one of those problems. Indeed, my claim is that, whether implicitly or explicitly, several of the authors discussed are trying to attend to several of these problems without differentiating them.

2.1. The Epistemic Distortion Concern

Perhaps the most obvious reason for worrying about the influence of values in science is that they can distort research results.[6] At least one of the primary motivations for the value-free ideal of science is precisely to protect the epistemic integrity of science against problems such as wishful thinking or confirmation bias (Haack 1998; Betz 2013; Douglas 2009). The worry is that if contextual values are allowed to influence scientific reasoning, this could lead scientists to accept or assert hypotheses, theories, models, or interpretations of data based on how they wish the world to be or on their prior beliefs rather than on how the world really is.[7] Values might then promote our social, ethical, or political aims at the expense of our epistemic ones. The case of Trofim Lysenko is often seen as a paradigm of how political values can thwart the epistemic integrity of science (Gordin 2012). Science scholars have documented many other cases where sexist, androcentric, heterosexist, racist, and classist assumptions adversely influenced the epistemic soundness of research results (e.g., Gould 1981; Hrdy 1986; Martin 1991; Fausto-Sterling 1992; Longino 2013; Richardson 2013; Lloyd 2005).

Epistemic distortions can take forms other than those affecting the reliability of scientific results. Some philosophers are concerned about "disconnected expectations." In these cases, values can bias methodological choices in ways that lead audiences of the research to have a systematically distorted understanding of what the research has shown (Holman and Wilholt 2022). Values here influence not the conduct of research per se, but the degree to which methodological choices align with the expectations others place on them. This can be because some scientists simply flout conventions about epistemic risks without explicit indication, thus confusing other scientists (Wilholt 2009) or the public (John 2015), or because there is a mismatch between the methodological choices used and how the research is presented (Carrier 2013). Take, for instance, studies evaluating the health risks of exposure to low doses of bisphenol A, which exhibits hormone-like properties that mimic the effects of estrogen in the body (Wilholt 2009). Some industry studies used the CD-SD strain of rat, which is particularly insensitive to estrogen. Some of those studies included positive control groups exposed to the well-known estrogenic drug DES, and the positive and negative controls failed to show any differences. This should have alerted investigators to the unsuitability of the CD-SD strain, but they simply failed to mention the positive control in their publications. Industry researchers, arguably influenced by financial interests, thus flouted conventions about epistemic risks without explicit indication, leading others to believe that the risks of bisphenol A were lower than warranted (ibid.).

That contextual values can sometimes lead to biased science seems uncontroversial. Similarly, a significant amount of evidence shows that value-laden methodological choices can lead others to misunderstand what research results show (Wilholt 2009; Carrier 2013). Nonetheless, as the work of feminist scientists in various fields has shown, contextual values can also be used in ways that enrich the epistemic soundness of research (Hrdy 1986; Fausto-Sterling 1992; Wylie 2001; Haraway 1989; Anderson 2004; Wylie and Nelson 2007). Because value influences can have positive and negative effects on the epistemic integrity of research, protecting such integrity calls for demarcating between uses of value influences in science that are legitimate, that is, those that enhance or do not undermine the epistemic value of research, and those that are illegitimate, that is, those that produce epistemic distortions.

Various demarcation criteria have been proposed to address this concern. I will not assess their success here, but simply describe some of them briefly. An influential criterion has been proposed by Douglas (2000, 2009). She calls for distinguishing between the kinds of roles that values can play in a variety of decisions. For her, contextual values can legitimately play an indirect role in determining how much evidence is needed to accept a hypothesis. They ought not to play direct roles in determining whether a hypothesis is warranted by the evidence (Douglas 2000, 2009). The direct/indirect distinction would putatively protect research from the wishful-thinking or confirmation-bias problems because judgments about what the evidence is, or whether a hypothesis is warranted, are insulated from contextual values.

Philosophers of science have also offered proposals to address concerns regarding the influence of values that can distort people's expectations of research results. Such proposals defend coordinating strategies for setting methodological standards (Holman and Wilholt 2022) and amount to a call for "truth in advertisement" (Carrier 2013). Scientific communities must collectively propose appropriate methodological standards to communicate with each other (Wilholt 2009) or should be governed by fixed, high standards so that laypeople can make sense of the information provided (John 2015). Although those standards can, of course, remain open to scrutiny and change, scientists challenging the conventional standards must be explicit about the changes.

Identifying uses of values that systematically, or more likely than not, lead to epistemic problems could help prevent such uses. But whether the concern is about biased research results or disconnected expectations, the problem with contextual values in these cases is about how they are used or how they influence reasoning. It is not, I contend, a problem about which particular values are used. That is, ethically, socially, and politically unobjectionable values, for example, equality, safety, or solidarity, and objectionable ones, for example, racist or sexist values, could all be used in ways that negatively affect the epistemic integrity of research. Of course, it might be the case that some values—such as sexist and racist ones—are more likely to be used in epistemically damaging ways, but still the epistemic problem is about how they are used or influence reasoning rather than about the particular values themselves.

Addressing the epistemic distortion concern calls, then, for determining the mechanisms by which contextual values result (or are likely to result) in bias. Perhaps values are used in place of evidence (Douglas 2009), lead people to disregard contrary evidence (Anderson 2004), or direct people to use rigged methods (Elliott 2017).

This problem also calls for methods that can minimize or eliminate biasing mechanisms. These might involve, for instance, procedural strategies, such as the existence of avenues for criticism, the presence of shared standards, and requirements for uptake of criticism and for equality of intellectual authority. When followed by scientific communities, these strategies can neutralize the biasing effects some uses of contextual values can have, thus preserving the objectivity of inquiry (Longino 1990, 2002). With these mechanisms, scientific decision making negatively influenced by values or interests is likely to be caught and corrected by others in the community who have different values, interests, and perspectives. The mechanisms do not make value influences disappear, but they ensure that the use of values receives critical scrutiny and that any negative influence such values have on scientific reasoning will be identified and corrected.

2.2. The Value Imposition Concern

A second reason grounding worries about the influence of contextual values in science is political (McMullin 1983; Lacey 1999; Mitchell 2004; Betz 2013). Science informs public policy and personal decisions and thus has significant implications for everyone. Even if the epistemic integrity of science is protected, if scientists use value judgments when conducting or communicating research, this gives them power in shaping policy and influencing personal decisions (John 2019; Pielke 2007; Betz 2013). Scientists can thus impose their values on everyone else, whether or not others share those values, hence violating democratic principles and infringing on personal autonomy. It becomes a form of coercive paternalism (Alexandrova 2017). For example, when conducting research, scientists who judge public health a priority might conclude, in the face of uncertainty, that a certain substance is toxic. Such a conclusion can then lead policy makers to limit or prohibit its use. Alternatively, scientists more concerned with economic losses might conclude, when facing uncertainty, that the substance is safe. This information can encourage regulators to allow its use. In both cases, the scientists' values affect the scientific conclusions they reach and with them the policies that might be implemented, whether or not the public shares the scientists' preferences for some values over others when facing uncertainty. Because scientific conclusions are also relevant to many personal decisions, for example, whether to consume certain products or engage in particular activities, science laden with value judgments that might not be shared by, and are quite likely hidden from, individuals would also jeopardize personal autonomy. It denies people access to relevant information about the grounds for scientific conclusions, hence leading them to rely on values to which they may have good reasons to object.

Scientists having this power is problematic for several reasons.[8] First, they have no special expertise or authority in making ethical, political, and social value judgments, and thus this task should not be left to them alone. Second, scientists as a group are not representative of the values held by members of pluralistic societies. Because scientific results can affect many people in significant ways, in pluralistic societies stakeholders should have some say in determining which values to endorse when conducting research. In a context where increasing numbers of scientists have commercial interests (Benea et al. 2020), this concern is even more pressing, as such interests might directly conflict with promoting knowledge that benefits the common good or with interests held by large sections of the public. Third, there are reasonable disagreements about social, political, and ethical values. Those disagreements can be about whether certain values should be promoted or undermined when conducting research or about how to interpret the values in question. In pluralistic societies, relevant stakeholders should have an opportunity to determine which social, political, or ethical values to endorse in cases of conflict, for example, whether, when facing uncertainty, one should prioritize some risks over others, or risks affecting some groups over others. Arguably, in democratic societies, deciding collective goals and values should not be left to a handful of unelected scientists. Fourth, to the extent that scientists' values are not widely shared by the public, the fact that scientists use their own values in their reasoning can undermine warranted public trust in science (Douglas 2023; Wilholt 2013). This does not mean that publics can justifiably trust only research influenced by values they share, but it calls attention to the fact that the interests of scientists and those of at least some publics might conflict.

The value imposition concern thus expresses the worry that using values in science gives scientists disproportionate power in shaping policy and personal decisions and deprives policy makers and individuals of their right to partake in such decisions. This is inconsistent with democratic ideals and notions of personal autonomy. Thus, proposals to address this problem should primarily ensure that scientists alone are not making decisions about which values to use. This concern, then, also calls for demarcating legitimate and illegitimate value influences in science. From this perspective, if values are chosen by procedures that disregard the diversity of values espoused by relevant communities, those values are illegitimate. If the selection of values follows procedures that consider the variety of relevant value interests, they are legitimate.

Importantly, the value imposition concern pertains to whether the values influencing scientific research have been selected following procedures that attend to the values of relevant stakeholders. It is not an issue about which particular contextual values are used or imposed. That is, what the values are, for example, profit, efficiency, safety, or equity, is not what is at stake. The issue is whether the social, political, or ethical values in question have been selected by procedures that are attentive to the interests of relevant communities. Addressing the value imposition concern then calls for identifying procedures that maximize the possibility that the values shaping scientific investigations—or the communication of their results (John 2019)—are shared, agreed upon, or scrutinized by relevant communities.

A variety of proposals have been offered to address the value imposition concern. Generally, they focus on identifying democratic and deliberative processes that ensure that relevant parties, and not just scientists, have a say in what values should guide research (Intemann 2015; Schroeder 2021; Kitcher 2011; Lusk 2021; Elliott 2017) or that scientists communicate their value-influenced findings to communities that share those values (John 2019). Often the proposals call for engaging relevant stakeholders in various ways, including community-based advisory boards, citizen panels, deliberative polling of relevant communities, seeking consensus, or identifying features of epistemic practices that allow for political debate and are not easily influenced by interested parties (Intemann 2015; Douglas 2009; Schroeder 2021; John 2021; Elliott 2017). In some cases, rather than offering specific ways of engaging with relevant publics, philosophers have proposed ideal democratic procedures where value judgments are accepted under conditions of ideal endorsement (Kitcher 2011). Value judgments would thus be accepted if and only if they would be endorsed by an ideal conversation among all humans, under conditions of perfect mutual engagement, and aimed at serious equality of opportunity for all people to have a worthwhile life.

These procedures can be used in different ways. For some, they help select values with which to make or assess inductive risk decisions (Douglas 2017). For others, they can set the epistemic and social aims of research (Intemann 2015); the value judgments scientists make when conducting research would then need to promote those democratically endorsed aims. In other cases, value judgments that pass the test of ideal endorsement can be used in determining that a scientific claim is true enough and significant enough (Kitcher 2011). Usually, these proposals are underdeveloped and call for further theoretical work on the meaning of democratic practices and values.

Many of the proposals to address the value imposition concern have been criticized on various practical and theoretical grounds (Havstad and Brown 2017; Brown 2020; Le Bihan 2023). Proponents of strategies to address the imposition concern usually recognize the challenges involved in determining the relevant population whose values must be considered, deciding how to engage relevant communities, and establishing how to address the complexities of reaching agreement in contexts where stakeholders espouse a plurality of values. However, the difficulties, or even the impossibility, of developing and implementing procedures to ensure that shared values are used in conducting research have no bearing on whether the imposition problem gives us a reason for attempting to distinguish between legitimate and illegitimate value influences.

2.3. The Undermining of Public Trust Concern

Another reason given for worrying about the influence of contextual values in science appeals to the importance of public trust in science (Bright 2018; Elliott 2022; Holman and Wilholt 2022). That trust, moral and epistemic, is central to producing science is uncontroversial. Scientists must place trust in the testimony of colleagues and in their techniques, experiments, data, results, and theories to be able to carry out research (Hardwig 1985, 1991; Wilholt 2013; Frost-Arnold 2013). When scientific projects involve teams of researchers from multiple disciplines, working at various institutions and in different countries, researchers are epistemically dependent on one another. This makes trust all the more important (Andersen 2016). But epistemic trust is also significant to the interactions between science and society (Scheman 2001; Wilholt 2013; Grasswick 2010; Anderson 2011). People must trust scientific experts and rely on the information they provide to make sense of complex scientific phenomena about which they lack expertise. Likewise, because science is essential to policy decisions, the public must trust scientists to be able to participate in democratic discussions involving scientific knowledge. Fully realizing science's goal of benefiting society thus requires warranted trust on the side of the public regarding scientific testimony.

Liam Kofi Bright (2018) has recently called attention to the trust-based arguments W. E. B. Du Bois offered to defend the value-free ideal of science. The concern is that people will not trust science if they think scientists are motivated by goals other than the pursuit of truth, that is, if their research is influenced by contextual values. Given the importance of public trust in science, Du Bois concluded that scientists should avoid incorporating contextual values in their reasoning (Bright 2018).

Does the undermining of public trust concern call for demarcation criteria to distinguish between legitimate/illegitimate influences of contextual values in science? I do not think it does, for two reasons. First, as conceptualized by Du Bois, this concern simply calls for the exclusion of contextual value influences altogether, rather than for demarcation criteria between legitimate/illegitimate value influences. I will put this conceptualization aside, given that I take value influences in science to be unavoidable and often desirable. Second, understood in a way that accepts the value-laden nature of science, whatever concerns we might have about the undermining of trust in science result from the problems covered by the epistemic distortion and the value imposition concerns. Let me explain.

Normatively speaking, we ought to be concerned with preserving warranted trust in science, or avoiding warranted distrust, rather than with simply preserving public trust. Trust is a complex, multifactorial phenomenon (Jones 1996; Baier 1986; Hardin 2002; Hollis 1998; Holton 1994; O'Neill 2002; Potter 2002). In general, people do not trust others completely, but trust others to do certain things. For example, people may trust investigators to conduct research according to appropriate scientific standards, but not to take care of their children. They may trust investigators to produce reliable knowledge, but not to fix their house. In trusting, we presuppose that the person trusted is competent in some regard, such as conducting research or communicating scientific information. We also presuppose that the person trusted will be rightly motivated to do what we are entrusting them with doing. Both competency and willingness or motivation are central elements of warranted trust—though scholars disagree about the exact nature of such motivation (Baier 1986; Hardin 2002; Holton 1994; O'Neill 2002).

People could indeed fail to trust scientists and their testimony if they believed scientists' motivations were other than the pursuit of truth, that is, if they believed that the research is influenced by contextual values. Whether this would be the case is an empirical question (Hicks and Lobato 2022; Elliott et al. 2017).[9] But regardless of what the empirical evidence could show, this worry is problematic for two reasons. First, it seems to presuppose, incorrectly, that any focus on contextual values would be at the expense of truth. This constitutes a false dichotomy: science could doubtless be focused—and presumably it is—on both truth and other contextual values, for example, on truths that are of relevance to human beings, serve to advance the wellbeing of particular entities, or contribute to environmental health. Moreover, it could also be that the epistemic soundness of at least some research would be enhanced by the influence of other important values. For instance, perhaps a commitment to equality leads to research results that are more generalizable.

Second, this worry seems to conflate trust with warranted trust—or failing to trust with doing so warrantedly. But surely we can be mistaken in placing or failing to place our trust. That is, sometimes people place their trust in those who are not trustworthy and fail to trust, or even distrust, those who are in fact trustworthy. Trust and distrust are in those cases unwarranted. Hence, concerns about whether the public trusts science that is influenced by contextual values must consider whether the public's trust, or its failure to trust, is warranted. However, if contextual values are unavoidable and are—at least sometimes—also desirable, then people's failure to trust scientists who use them would be unwarranted.

What could justifiably damage public trust in this context is the use of contextual values in ways that undermine the epistemic integrity of the research, or the use of values selected in ways that fail to be representative of, or are unconcerned with, the various publics' interests. After all, people entrust researchers with the production of reliable knowledge and with doing so in ways that consider the interests of relevant stakeholders. If values are used in ways that disregard such goals, then people would be warranted in not trusting scientists. But this is precisely what grounds the epistemic and the value imposition concerns, respectively. Hence, the undermining of trust concern provides no additional reasons to find demarcation criteria.

Could warranted public trust not be damaged on the grounds that scientists use values that some publics find ethically or politically objectionable? Provided that the values in question are not used in ways that undermine the epistemic integrity of the research and that they have been selected following procedures that attend to the interests of relevant stakeholders, it is not clear on what justifiable grounds it would be. After all, the particular values that some publics might find ethically or politically objectionable might be welcomed by other publics. Certainly, in pluralistic societies people might ultimately disagree with research results that have been shaped by legitimate value influences, that is, ones that attend to the epistemic and the value imposition concerns. In these cases, people would be justified in disagreeing with the results and in calling for research that uses other values they share.[10]

The discussion in the literature regarding this worry also evinces that epistemic and political concerns ground the worry about undermining trust. Indeed, those who have called attention to the problem (Bright 2018) and those who have explicitly attempted to address it (Elliott 2022; Schroeder 2021; John 2021; Boulicault and Schroeder 2021) have placed the discussion squarely in the context of ensuring that science production includes values selected in ways that attend to the interests of relevant parties and that those values do not undermine the epistemic integrity of the research. Hence, some (John 2015, 2021), for instance, focus on the epistemic aspect of the problem and argue that addressing the trust concern requires that scientists employ high epistemic standards. Others (Schroeder 2021; Boulicault and Schroeder 2021) call attention to idiosyncratic values as the source of the trust concern and contend that addressing this worry requires that scientists appeal to the values of the public or its representatives.

It seems, then, that concerns about undermining public trust in science provide no additional reasons to find demarcation criteria between legitimate/illegitimate influences of values in science. Solutions to the epistemic distortion and the value imposition problems would also address the undermining of trust concern.

2.4. The Objectionable Value Concern

Another reason for concern regarding the influence of values in science focuses on the particular values utilized. The worry is that some contextual values are simply the ethically or politically wrong—or right—values to use when conducting research (Clough and Loges 2008; Kourany 2010; Goldenberg 2015; Leuschner and Fernandez Pinto 2021, 2022; Brown 2020). This concern is based on the recognition that contextual values can have significant effects on what knowledge is produced. For instance, values can influence the framing of research questions and thus lead to different and even incompatible results. Contextual values can influence ways of weighing the consequences of error or of determining what impacts to assess and which are (most) important.

Contrary to the epistemic and value imposition worries, what I call the objectionable value concern is precisely about which particular values influence research. Rather than an issue about how values are used, or about whether they have been chosen by procedures that attend to relevant interests, this concern calls for identifying some values as illegitimate because they are ethically or politically objectionable, and others as ethically or politically good, and thus legitimate values to influence science.

Importantly, many of those worried about the objectionable value concern frame it in terms of epistemic failures of the research using such values (e.g., Biddle and Leuschner 2015; Leuschner and Fernandez Pinto 2021, 2022). Their worry, however, is with the particular values that influence the research, that is, with which values are used rather than with how they are used. They fail to separate, I contend, what are relevantly different problems that are grounded in distinct worries and that call for different solutions.

Proposals attending to this concern seek to demarcate legitimate from illegitimate values on the basis of the values themselves. Hence, values such as racist and sexist ones, which are widely recognized as unethical (Leuschner and Fernandez Pinto 2021, 2022; Kourany 2016), are illegitimate and should be excluded. Values that are widely shared or are consistent with certain conceptions of the good are legitimate and should be allowed to influence scientific reasoning. For instance, some have argued that only values that contribute to human flourishing, that meet the needs of society, or that promote public rather than private interests are legitimate values that should be allowed to influence scientific reasoning (Kitcher 2011; Kourany 2010). For others, the values guiding research should be those likely to meet the needs of marginalized communities (Harding 2008), or should be specific values such as equality (Kourany 2010). Others have proposed a Rawlsian solution to ascertain which values must influence scientific inquiry (Cabrera 2023). Still others defend the need for a plurality of mandates in science so as to allow for partisan science (Hilligardt 2023).

Is the objectionable value concern one that calls for demarcation criteria to distinguish between legitimate/illegitimate value influences? I doubt it. Under some conceptualizations, this concern, like the trust-undermining one, provides no additional grounds for worry. Under others, what is at stake are value conflicts, and demarcation criteria between legitimate/illegitimate value influences in science are misplaced because what is in question is precisely which values should be influencing science.

In some instances, the objectionable value concern focuses on the influence in scientific reasoning of what are thought to be uncontroversially ethically or politically objectionable values, such as racist and sexist ones (Kourany 2010; Leuschner and Fernandez Pinto 2021, 2022). Clearly, there are good reasons to worry about such influences, but they are covered by the epistemic and the value imposition concerns. First, although some have argued that these types of values can produce epistemically sound science (Kourany 2010), this is not obvious. At least some evidence suggests that such ethically or politically objectionable values can lead to biased research or to research that is empirically weaker (Clough 2003; Anderson 1995). This might be because the objectionable values are unsupported by empirical evidence (Clough and Loges 2008), because they are less fruitful, or because they work as heuristics that lead researchers to ignore relevant data, disregard particular phenomena, or attend to irrelevant evidence (e.g., Anderson 2004; Douglas 2009; Elliott 2017). The many instances documented by feminist and race theory scholars in which these values have in fact resulted in biased science corroborate this. But insofar as this is the case, concerns about epistemic integrity rather than about the values themselves serve as reasons to distinguish between the legitimacy or illegitimacy of these value influences.

Moreover, to the extent that sexist and racist values can be rejected on democratic grounds, such ethically or politically objectionable values would be excluded as illegitimate by appropriate value selection procedures.[11] Indeed, some of the proposals to address the imposition concern explicitly call for strategies that put constraints on democratic procedures to ensure that those types of values are not chosen. For some, the constraints are provided by commitments to feminist principles, which would preclude the influence of sexist or racist values in the conduct of research (Intemann 2011). Others have proposed mechanisms such as filtering, to remove objectionable values, or laundering, to clean up values based on false assumptions and prevent their use in science (Schroeder 2021).[12] In any case, the political concern about the imposition of values grounds the exclusion of uncontroversially ethically or politically objectionable values as illegitimate. If so, worries about such values provide no additional reasons to demarcate between legitimate and illegitimate value influences in research.

In other instances, discussions regarding the objectionable value concern involve value conflicts; that is, the problem is not about uncontroversially unethical values influencing science but about the use of values that are arguably contested.[13] Take, for instance, the Klamath Project (KP) case used by Holman and Wilholt (2022) in their discussion of the new demarcation problem. The KP is a federal water-management project that supplies water to thousands of acres of farmland in the upper Klamath Basin. The Upper Klamath Lake (UKL) is the primary source of KP irrigation water, and it is also home to two species of federally endangered sucker fishes. In 2001, fears that the KP threatened the survival of the suckers led to two conflicting scientific assessments. One, by the U.S. Fish and Wildlife Service (FWS), called for a complete suspension of KP water deliveries from UKL and thus would have resulted in significant financial losses to farmers. Another, by the National Research Council (NRC) and commissioned by the U.S. Department of the Interior (DOI), concluded that there was no sound scientific basis for accepting the FWS 2001 report's recommendation for UKL water levels and thus provided justification for a continuation of irrigation. Values clearly played a role in both scientific assessments. The FWS report was guided by the Endangered Species Act requirement not to take actions that could jeopardize the continued existence of the endangered species of sucker or result in the destruction or adverse alteration of their habitat. The NRC assessment, however, did not prioritize protection of the suckers. Values—the public interest in biodiversity or the private interests of farmers—led the reports to weigh the negative impacts of the KP and the risks of error in conflicting ways.

Now, assume that both the FWS and the NRC studies are epistemically sound and that the values they espouse are consistent with appropriate attention to the values of relevant stakeholders—those of the farmers, those of protecting endangered species, and those of the Klamath Tribes, for whom the fish was a tribal trust species—arrived at by politically legitimate procedures.[14] If so, it would not be surprising that none of the demarcation strategies considered by Holman and Wilholt (2022) provides an answer about whether one of these studies involved illegitimate value influences.[15] The case simply involves a conflict of values.

In pluralistic societies, value conflicts are not only common but inevitable, given people's different conceptions of the good or different understandings of what constitutes a just society. This is not to say, of course, that we cannot offer reasons to accept one of those studies rather than the other. What I am saying is that those reasons would be grounded in weighing various competing goods, values, and interests rather than in declaring some of them simply illegitimate and thus in need of exclusion.

If I am correct, then rather than a demarcation between legitimate/illegitimate values, this version of the objectionable value concern calls for transparency and openness regarding the values influencing scientific inquiry (Douglas 2009; Elliott 2013; Intemann and de Melo-Martín 2023). Clearly, these do not constitute ways to solve value conflicts, but they would allow for careful consideration of those values and for their critical evaluation. The problem also calls for conflict resolution strategies (Laursen et al. 2021). I am not claiming that such strategies are easy to develop or—much less—easy to put into practice. But we would do better to develop conflict resolution strategies than to seek demarcation criteria that declare particular values illegitimate on the grounds that they are ethically or politically objectionable.

Further support for my claim that the objectionable value problem often involves value conflicts comes from evaluations of cases dealing with the different concerns discussed here. Few disagree about instances where the influence of values is illegitimate on epistemic or value imposition grounds. I am unaware, for example, of anyone who has argued that Lysenko's is a case in which values exercised legitimate influences. Likewise, much of the work feminists have done calling attention to illegitimate value influences in many areas of science has not been contested. However, many of the cases used to discuss the objectionable value problem, that is, a problem regarding which values should influence science, are contested. Consider, for instance, the heated debates about many instances of scientific dissent (de Melo-Martín and Intemann 2018), the various reasonable interpretations of cases involving what some consider objectionable values, such as financial profit (Hicks 2014; Cabrera 2023), or the difficulty of determining what exactly is wrong with cases in which presumably objectionable values are at stake (Holman and Wilholt 2022).

3. Conclusion

The values and science literature is often described as needing more clarity and precision on various issues: what it means for science to be value free (de Melo-Martín and Intemann 2016; Elliott 2022), what kinds of things values are (Brown 2020; Rooney 2017), what they do or how they influence judgment (Ward 2021), or what constitutes socially responsible science (de Melo-Martín and Intemann 2023). Debates in science and values would also benefit from more clarity on what constitutes compelling reasons to find criteria to distinguish between legitimate and illegitimate influences of contextual values in science.

I have argued here that concerns about the distorting effect of contextual values on knowledge and about undermining democratic ideals and personal autonomy are appropriate worries that call for ways to distinguish between legitimate and illegitimate contextual value influences. Worries about these two problems might go hand in hand, but the problems call for different strategies. Safeguarding epistemic integrity requires strategies to identify how values are used in biasing ways or how they negatively influence scientific reasoning. The value imposition concern, however, requires finding procedures that ensure that whatever values are influencing science are appropriately representative or can be scrutinized. Given the different types of solutions required, discussions of the new demarcation problem would do well to distinguish on what grounds the legitimacy/illegitimacy distinction is needed. Importantly, neither of these problems is about which particular values are used. Ethically or politically objectionable and unobjectionable values could, in principle at least, be considered legitimate or illegitimate on either of these grounds.

Two other worries about values in science, often mixed with the previous two—the public trust and the objectionable values concerns—are not appropriate grounds for demarcation criteria between legitimate/illegitimate influences of values in science. On some conceptualizations, they provide no additional grounds for concern; the worries are already covered by the epistemic and the value imposition concerns. Additionally, insofar as the objectionable value problem involves conflicts of values, finding demarcation criteria between legitimate/illegitimate value influences is misguided, as what is at stake are disagreements about which values should be shaping research. Discussions of the new demarcation problem would do well to clearly separate this problem from the others. This concern calls for transparency and openness regarding the values guiding research, not as a solution to disputes, but as a way to ensure that values are appropriately identified and critically assessed. It also requires the development of strategies for value conflict resolution.

As important a role as science has in modern societies, I fear that the debate on value influences in science grants it excessive power and seems to presuppose that we have no mechanisms other than science to ensure (more) just societies. But we have other ways to influence what science is produced. We also have other social and political mechanisms to improve our societies. To accomplish this goal, we do not need to—and, if I am right, we cannot—rely on finding criteria to eliminate some ethically or politically objectionable values from the core of science.

Acknowledgments

I am grateful to Kristen Intemann and David Teira for helpful comments on prior versions of this manuscript.

Footnotes

1 Although everyone accepts that contextual values exercise legitimate influences in the so-called contexts of discovery and application, there are also reasons for concern regarding what such influences are. Nonetheless, the legitimate/illegitimate debate has focused on the influence of contextual values in the core of science, or the so-called context of justification. That is my focus here. This does not mean that these different contexts or phases are linear or unrelated. Indeed, at times the legitimate/illegitimate debate mixes concerns about all these different levels, which contributes to the problems pointed out here.

2 My discussion centers on contextual values because those have been the terms of the demarcation debate. Other types of contextual influences might be relevant to scientific research (see Hilligardt 2022).

3 Those who want to return to a value-free ideal of science and those who believe that such an ideal is mistaken can share the concerns discussed here. However, those who defend the value-free ideal use the concerns as reasons to keep values out of science, while for those who believe that science is unavoidably value-laden, the concerns provide reasons to appropriately manage the values used. I thank an anonymous reviewer for pointing this out.

4 In their recent paper, Holman and Wilholt (2022) also use some of these worries as reasons for demarcation criteria. They put the worries in terms of desiderata that must be addressed for a demarcation criterion to be satisfactory. Their "veracity" desideratum expresses the "epistemic distortion" worry that I discuss; their "universality" could be understood as my "value imposition" concern; and their "authority" is comparable to my "undermining of public trust" worry. Contrary to Holman and Wilholt, my argument is that this last worry does not call for demarcation criteria. Holman and Wilholt do not directly discuss what I call the "objectionable value" concern, and their discussion seems to suffer from the problems I point out here regarding that problem.

5 See Wagner (2022), who also argues that concerns about value influences in science call for distinct solutions. I thank an anonymous reviewer for bringing this article to my attention.

6 This is a reasonable concern. However, the excessive worry the science and values literature expresses about the potential biasing effects of contextual values betrays an unwarranted view of such values as inherently problematic. See Brown (2020) for a detailed discussion of this.

7 Some have challenged the claim that wishful thinking or confirmation bias are always epistemically detrimental and argued that confirmatory values and dogmatism can have epistemic benefits (Peters 2021).

8 Although I refer to scientists as the ones imposing their values on others who might not share them, this does not mean that the problem is of concern only when the values at stake are those of scientists. Scientists are simply the ones doing the valuing when conducting science, but the values in question could be those of others, e.g., funders. The value imposition concern is about whose values are shaping scientific inquiry and whether the values used have been chosen in ways that attend to the interests of relevant stakeholders.

9 Note that were this the case, and assuming that contextual values are unavoidable, people's failure to trust could provide reasons to ensure that people do not know that non-epistemic values play a role in science. See John (2018).

10 Thanks to an anonymous reviewer for bringing up this point. See also Hilligardt (2023) and John (2019) for proposals about conducting or communicating science influenced by values shared by particular communities.

11 I am not suggesting that uncontroversially unethical values are easily recognized as such by everyone. For the most part, scientists—even if they espouse racist and sexist values—do not openly claim that their research is guided by such objectionable values. The disagreements are often about whether the particular values influencing their research, e.g., about gender or racial differences, or the values they want to advance are in fact racist and sexist.

12 Democratic procedures are imperfect; hence, knowing this, one can put constraints—that have been democratically agreed upon—on the procedures used for value selection to ensure that some values that are undemocratic cannot be selected. This is not a way to bake the objectionable value problem into the value imposition concern, as what those ethically or politically objectionable values might be is not stipulated—it would be decided by democratic procedures. I thank two anonymous reviewers for forcing me to clarify this point.

13 Again, those worrying about this problem present it as one involving epistemic worries, but their concern is with which values are used rather than how they are used, and thus they seek to exclude particular contested values as illegitimate. Much of the discussion on industry bias instantiates this problem. See, for instance, de Melo-Martín (2019) for a discussion of this issue. I take it that commercial values are not uncontroversially unethical ones.

14 Both the NRC Committee and the FWS acted in accordance with federal mandates.

15 Holman and Wilholt are unclear about what their evaluation of the demarcation strategies they discuss implies regarding the Klamath dispute. They explicitly say that they are not claiming that one of the two groups has "crossed a line between epistemically admissible and inadmissible forms of value influences—or that a 'good' demarcation criterion should bear this out" (216). However, they also explicitly say that it is precisely in cases like the Klamath dispute—where private and public interests conflict—"that a demarcation criterion would be especially helpful." What I am arguing is that, provided the use of those values avoids the epistemic and the value imposition concerns, there are no grounds for a demarcation criterion to be helpful in resolving the dispute.

References

Alexandrova, Anna. 2017. A Philosophy for the Science of Well-Being. New York: Oxford University Press. https://doi.org/10.1093/oso/9780199300518.001.0001.
Andersen, Hanne. 2016. “Collaboration, Interdisciplinarity, and the Epistemology of Contemporary Science.” Studies in History and Philosophy of Science 56:1–10. https://doi.org/10.1016/j.shpsa.2015.10.006.
Anderson, Elizabeth. 1995. “Knowledge, Human Interests, and Objectivity in Feminist Epistemology.” Philosophical Topics 23 (2):27–58. https://doi.org/10.5840/philtopics199523213.
Anderson, Elizabeth. 2004. “Uses of Value Judgments in Science: A General Argument, with Lessons from a Case Study of Feminist Research on Divorce.” Hypatia 19 (1):1–24. https://doi.org/10.1111/j.1527-2001.2004.tb01266.x.
Anderson, Elizabeth. 2011. “Democracy, Public Policy, and Lay Assessments of Scientific Testimony.” Episteme 8 (2):144–64. https://doi.org/10.3366/epi.2011.0013.
Baier, Annette. 1986. “Trust and Antitrust.” Ethics 96 (2):231–60. https://doi.org/10.1086/292745.
Benea, Carla, Turner, Kimberly A., Roseman, Michelle, Bero, Lisa A., Lexchin, Joel, Turner, Erick H., and Thombs, Brett D. 2020. “Reporting of Financial Conflicts of Interest in Meta-Analyses of Drug Trials Published in High-Impact Medical Journals: Comparison of Results from 2017 to 2018 and 2009.” Systematic Reviews 9 (1):77. https://doi.org/10.1186/s13643-020-01318-5.
Betz, Gregor. 2013. “In Defense of the Value Free Ideal.” European Journal for Philosophy of Science 3 (2):207–20. https://doi.org/10.1007/s13194-012-0062-x.
Biddle, Justin. 2013. “State of the Field: Transient Underdetermination and Values in Science.” Studies in History and Philosophy of Science 44 (1):124–33. https://doi.org/10.1016/j.shpsa.2012.09.003.
Biddle, Justin, and Leuschner, Anna. 2015. “Climate Skepticism and the Manufacture of Doubt: Can Dissent in Science Be Epistemically Detrimental?” European Journal for Philosophy of Science 5 (3):261–78. https://doi.org/10.1007/s13194-014-0101-x.
Boulicault, Marion, and Schroeder, Andrew. 2021. “Public Trust in Science: Exploring the Idiosyncrasy-Free Ideal.” In Social Trust: Foundational and Philosophical Issues, edited by Vallier, Kevin and Weber, Michael, 102–21. New York: Routledge. https://doi.org/10.4324/9781003029786.
Bright, Liam Kofi. 2018. “Du Bois’ Democratic Defence of the Value Free Ideal.” Synthese 195 (5):2227–45. https://doi.org/10.1007/s11229-017-1333-z.
Brown, Matthew J. 2014. “Values in Science beyond Underdetermination and Inductive Risk.” Philosophy of Science 80 (5):829–39. https://doi.org/10.1086/673720.
Brown, Matthew J. 2020. Science and Moral Imagination: A New Ideal for Values in Science. Pittsburgh: University of Pittsburgh Press. https://doi.org/10.2307/j.ctv18b5d19.
Cabrera, Frank. 2023. “A Rawlsian Solution to the New Demarcation Problem.” Canadian Journal of Philosophy 52 (8):810–27. https://doi.org/10.1017/can.2023.19.
Carrier, Martin. 2013. “Values and Objectivity in Science: Value-Ladenness, Pluralism and the Epistemic Attitude.” Science & Education 22 (10):2547–68. https://doi.org/10.1007/s11191-012-9481-5.
Cassini, Alejandro. 2022. “Simulation Models and Probabilities: A Bayesian Defense of the Value-Free Ideal.” Simulation: Transactions of the Society for Modeling and Simulation International 98 (2):113–25. https://doi.org/10.1177/00375497211028815.
Clough, Sharyn. 2003. Beyond Epistemology: A Pragmatist Approach to Feminist Science Studies. Lanham, MD: Rowman & Littlefield.
Clough, Sharyn, and Loges, William E. 2008. “Racist Value Judgments as Objectively False Beliefs: A Philosophical and Social-Psychological Analysis.” Journal of Social Philosophy 39 (1):77–95. https://doi.org/10.1111/j.1467-9833.2007.00412.x.
de Melo-Martín, Inmaculada. 2019. “The Commercialization of the Biomedical Sciences: (Mis)Understanding Bias.” History and Philosophy of the Life Sciences 41 (34). https://doi.org/10.1007/s40656-019-0274-x.
de Melo-Martín, Inmaculada, and Intemann, Kristen. 2007. “Can Ethical Reasoning Contribute to Better Epidemiology? A Case Study in Research on Racial Health Disparities.” European Journal of Epidemiology 22 (4):215–21. https://doi.org/10.1007/s10654-007-9108-3.
de Melo-Martín, Inmaculada, and Intemann, Kristen. 2016. “The Risk of Using Inductive Risk to Challenge the Value-Free Ideal.” Philosophy of Science 83 (4):500–20. https://doi.org/10.1086/687259.
de Melo-Martín, Inmaculada, and Intemann, Kristen. 2018. The Fight against Doubt: How to Bridge the Gap between Scientists and the Public. New York: Oxford University Press. https://doi.org/10.1093/oso/9780190869229.001.0001.
de Melo-Martín, Inmaculada, and Intemann, Kristen. 2023. “Socially Responsible Science: Exploring the Complexities.” European Journal for Philosophy of Science 13 (3). https://doi.org/10.1007/s13194-023-00537-6.
Douglas, Heather. 2000. “Inductive Risk and Values in Science.” Philosophy of Science 67 (4):559–79. https://doi.org/10.1086/392855.
Douglas, Heather. 2009. Science, Policy, and the Value-Free Ideal. Pittsburgh: University of Pittsburgh Press. https://doi.org/10.2307/j.ctt6wrc78.
Douglas, Heather. 2017. “Science, Values, and Citizens.” In Eppur Si Muove: Doing History and Philosophy of Science with Peter Machamer: A Collection of Essays in Honor of Peter Machamer 81:83–96. https://doi.org/10.1007/978-3-319-52768-0_6.
Douglas, Heather. 2023. “The Importance of Values for Science.” Interdisciplinary Science Reviews 48 (2):251–63. https://doi.org/10.1080/03080188.2023.2191559.
Dupré, John. 2007. “Fact and Value.” In Value-Free Science? Ideals and Illusions, edited by Kincaid, Harold, Dupre, John, and Wylie, Alison, 27–41. Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780195308969.003.0003.
Elliott, Kevin. 2013. “Douglas on Values: From Indirect Roles to Multiple Goals.” Studies in History and Philosophy of Science 44 (3):375–83. https://doi.org/10.1016/j.shpsa.2013.06.003.
Elliott, Kevin. 2017. A Tapestry of Values: An Introduction to Values in Science. New York: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780190260804.001.0001.
Elliott, Kevin. 2022. Values in Science. Cambridge: Cambridge University Press.
Elliott, Kevin, McCright, Aaron M., Allen, Summer, and Dietz, Thomas. 2017. “Values in Environmental Research: Citizens’ Views of Scientists Who Acknowledge Values.” PLOS ONE 12 (10):e0186049. https://doi.org/10.1371/journal.pone.0186049.
Fausto-Sterling, Anne. 1992. Myths of Gender: Biological Theories about Women and Men. 2nd ed. New York: BasicBooks.
Frost-Arnold, Karen. 2013. “Moral Trust and Scientific Collaboration.” Studies in History and Philosophy of Science 44 (3):301–10. https://doi.org/10.1016/j.shpsa.2013.04.002.
Goldenberg, Maya J. 2015. “How Can Feminist Theories of Evidence Assist Clinical Reasoning and Decision-Making?” Social Epistemology 29 (1):3–30. https://doi.org/10.1080/02691728.2013.794871.
Gordin, Michael D. 2012. “How Lysenkoism Became Pseudoscience: Dobzhansky to Velikovsky.” Journal of the History of Biology 45 (3):443–68. https://doi.org/10.1007/s10739-011-9287-3.
Gould, Stephen Jay. 1981. The Mismeasure of Man. New York: Norton.
Grasswick, Heidi E. 2010. “Scientific and Lay Communities: Earning Epistemic Trust Through Knowledge Sharing.” Synthese 177 (3):387–409. https://doi.org/10.1007/s11229-010-9789-0.
Haack, Susan. 1998. Manifesto of a Passionate Moderate: Unfashionable Essays. Chicago: University of Chicago Press.
Haraway, Donna. 1989. Primate Visions: Gender, Race, and Nature in the World of Modern Science. New York: Routledge.
Hardin, Russell. 2002. Trust and Trustworthiness. New York: Russell Sage Foundation.
Harding, Sandra. 2008. Sciences from Below: Feminisms, Postcolonialities, and Modernities. Durham: Duke University Press. https://doi.org/10.2307/j.ctv11smmtn.
Hardwig, John. 1985. “Epistemic Dependence.” Journal of Philosophy 82 (7):335–49. https://doi.org/10.2307/2026523.
Hardwig, John. 1991. “The Role of Trust in Knowledge.” Journal of Philosophy 88 (12):693–708. https://doi.org/10.2307/2027007.
Havstad, Joyce, and Brown, Matthew. 2017. “Inductive Risk, Deferred Decisions, and Climate Science Advising.” In Exploring Inductive Risk, edited by Elliott, K. C. and Richards, T., 101–23. Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780190467715.003.0006.
Hicks, Daniel J. 2014. “A New Direction for Science and Values.” Synthese 191 (14):3271–95. https://doi.org/10.1007/s11229-014-0447-9.
Hicks, Daniel J., and Lobato, Emilio J. C. 2022. “Values Disclosures and Trust in Science: A Replication Study.” Frontiers in Communication 7:1017362. https://doi.org/10.3389/fcomm.2022.1017362.
Hilligardt, Hannah. 2022. “Looking beyond Values: The Legitimacy of Social Perspectives, Opinions and Interests in Science.” European Journal for Philosophy of Science 12 (58). https://doi.org/10.1007/s13194-022-00490-w.
Hilligardt, Hannah. 2023. “Partisan Science and the Democratic Legitimacy Ideal.” Synthese 202:135. https://doi.org/10.1007/s11229-023-04370-5.
Hollis, Martin. 1998. Trust within Reason. New York: Cambridge University Press.
Holman, Bennett, and Wilholt, Torsten. 2022. “The New Demarcation Problem.” Studies in History and Philosophy of Science 91:211–20. https://doi.org/10.1016/j.shpsa.2021.11.011.
Holton, Richard. 1994. “Deciding to Trust, Coming to Believe.” Australasian Journal of Philosophy 72:63–76. https://doi.org/10.1080/00048409412345881.
Hrdy, Sarah B. 1986. “Empathy, Polyandry and the Myth of the Coy Female.” In Feminist Approaches to Science, edited by Bleier, R., 119–46. New York: Pergamon.
Hudson, Robert. 2016. “Why We Should Not Reject the Value-Free Ideal of Science.” Perspectives on Science 24 (2):167–91. https://doi.org/10.1162/posc_a_00199.
Intemann, Kristen. 2011. “Diversity and Dissent in Science: Does Democracy Always Serve Feminist Aims?” In Feminist Epistemology and Philosophy of Science: Power in Knowledge, edited by Grasswick, Heidi, 111–32. Dordrecht, Netherlands: Springer. https://doi.org/10.1007/978-1-4020-6835-5_6.
Intemann, Kristen. 2015. “Distinguishing between Legitimate and Illegitimate Values in Climate Modeling.” European Journal for Philosophy of Science 5:217–32. https://doi.org/10.1007/s13194-014-0105-6.
Intemann, Kristen, and de Melo-Martín, Inmaculada. 2023. “On Masks and Masking: Epistemic Harms and Science Communication.” Synthese 202 (93). https://doi.org/10.1007/s11229-023-04322-z.
John, Stephen. 2015. “Inductive Risk and the Contexts of Communication.” Synthese 192 (1):79–96. https://doi.org/10.1007/s11229-014-0554-7.
John, Stephen. 2018. “Epistemic Trust and the Ethics of Science Communication: Against Transparency, Openness, Sincerity and Honesty.” Social Epistemology 32 (2):75–87. https://doi.org/10.1080/02691728.2017.1410864.
John, Stephen. 2019. “Science, Truth and Dictatorship: Wishful Thinking or Wishful Speaking?” Studies in History and Philosophy of Science 78:64–72. https://doi.org/10.1016/j.shpsa.2018.12.003.
John, Stephen. 2021. “Science, Politics and Regulation: The Trust-Based Approach to the Demarcation Problem.” Studies in History and Philosophy of Science 90:1–9. https://doi.org/10.1016/j.shpsa.2021.08.006.
Jones, Karen. 1996. “Trust as an Affective Attitude.” Ethics 107 (1):4–25. https://doi.org/10.1086/233694.
Kitcher, Philip. 2011. Science in a Democratic Society. Amherst, NY: Prometheus Books.
Koskinen, Inkeri, and Rolin, Kristina. 2022. “Distinguishing between Legitimate and Illegitimate Roles for Values in Transdisciplinary Research.” Studies in History and Philosophy of Science 91:191–98. https://doi.org/10.1016/j.shpsa.2021.12.001.
Kourany, Janet A. 2010. Philosophy of Science after Feminism. New York: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199732623.001.0001.
Kourany, Janet A. 2016. “Should Some Knowledge Be Forbidden? The Case of Cognitive Differences Research.” Philosophy of Science 83 (5):779–90. https://doi.org/10.1086/687863.
Lacey, Hugh. 1999. Is Science Value Free? Values and Scientific Understanding. New York: Routledge. https://doi.org/10.4324/9780203983195.
Laursen, Bethany K., Gonnerman, Chad, and Crowley, Stephen J. 2021. “Improving Philosophical Dialogue Interventions to Better Resolve Problematic Value Pluralism in Collaborative Environmental Science.” Studies in History and Philosophy of Science 87:54–71. https://doi.org/10.1016/j.shpsa.2021.02.004.
Le Bihan, Soazig. 2023. “How to Not Secure Public Trust in Science: Representative Values v. Polarization and Marginalization.” Philosophy of Science, 1–11. https://doi.org/10.1017/psa.2023.119.
Leuschner, Anna, and Fernandez Pinto, Manuela. 2021. “How Dissent on Gender Bias in Academia Affects Science and Society: Learning from the Case of Climate Change Denial.” Philosophy of Science 88 (4):573–93. https://doi.org/10.1086/713903.
Leuschner, Anna, and Fernandez Pinto, Manuela. 2022. “Exploring the Limits of Dissent: The Case of Shooting Bias.” Synthese 200:326. https://doi.org/10.1007/s11229-022-03783-y.
Lloyd, Elisabeth Anne. 2005. The Case of the Female Orgasm: Bias in the Science of Evolution. Cambridge, MA: Harvard University Press. https://doi.org/10.4159/9780674040304.
Longino, Helen E. 1990. Science as Social Knowledge: Values and Objectivity in Scientific Inquiry. Princeton, NJ: Princeton University Press. https://doi.org/10.1515/9780691209753.
Longino, Helen E. 2002. The Fate of Knowledge. Princeton, NJ: Princeton University Press. https://doi.org/10.1515/9780691187013.
Longino, Helen E. 2013. Studying Human Behavior: How Scientists Investigate Aggression and Sexuality. Chicago: University of Chicago Press. https://doi.org/10.7208/chicago/9780226921822.001.0001.
Lusk, Greg. 2021. “Does Democracy Require Value-Neutral Science? Analyzing the Legitimacy of Scientific Information in the Political Sphere.” Studies in History and Philosophy of Science 90:102–10. https://doi.org/10.1016/j.shpsa.2021.08.009.
Martin, Emily. 1991. “The Egg and the Sperm: How Science Has Constructed a Romance Based on Stereotypical Male-Female Roles.” Signs 16 (3):485–501. https://doi.org/10.1086/494680.
McMullin, Ernan. 1983. “Values in Science.” In PSA 1982: The Proceedings of the 1982 Biennial Meeting of the Philosophy of Science Association, edited by Asquith, Peter and Nickles, Thomas, 3–28. East Lansing, MI: Philosophy of Science Association.
Mitchell, Sandra. 2004. “The Prescribed and Proscribed Values in Science Policy.” In Science, Values and Objectivity, edited by Machamer, Peter and Wolters, Gereon, 245–55. Pittsburgh: University of Pittsburgh Press. https://doi.org/10.2307/j.ctt5vkg7t.16.
O’Neill, Onora. 2002. A Question of Trust. Cambridge: Cambridge University Press.
Peters, Uwe. 2021. “Illegitimate Values, Confirmation Bias, and Mandevillian Cognition in Science.” British Journal for the Philosophy of Science 72 (4):1061–81. https://doi.org/10.1093/bjps/axy079.
Pielke, Roger A. 2007. The Honest Broker. Cambridge and New York: Cambridge University Press.
Popper, Karl. 1963. Conjectures and Refutations: The Growth of Scientific Knowledge. London: Routledge.
Potter, Nancy Nyquist. 2002. How Can I Be Trusted? Lanham, MD: Rowman & Littlefield.
Resnik, David B., and Elliott, Kevin C. 2023. “Science, Values, and the New Demarcation Problem.” Journal for General Philosophy of Science 54 (2):259–86. https://doi.org/10.1007/s10838-022-09633-2.
Richardson, Sarah S. 2013. Sex Itself: The Search for Male and Female in the Human Genome. Chicago: University of Chicago Press. https://doi.org/10.7208/chicago/9780226084718.003.0001.
Rooney, Phyllis. 2017. “The Borderlands between Epistemic and Non-Epistemic Values.” In Current Controversies in Values and Science, edited by K. C. Elliott and D. Steel, 31–45. New York: Routledge. https://doi.org/10.4324/9781315639420-3.
Scheman, Naomi. 2001. “Epistemology Resuscitated: Objectivity and Trustworthiness.” In Engendering Rationalities, edited by Tuana, N. and Morgen, S., 23–52. Albany, NY: SUNY Press.
Schroeder, Andrew. 2021. “Democratic Values: A Better Foundation for Public Trust in Science.” British Journal for the Philosophy of Science 72 (2):545–62. https://doi.org/10.1093/bjps/axz023.
Wagner, Wendy E. 2022. “No One Solution to the ‘New Demarcation Problem’? A View from the Trenches.” Studies in History and Philosophy of Science 92:177–85. https://doi.org/10.1016/j.shpsa.2022.02.001.
Ward, Zina B. 2021. “On Value-Laden Science.” Studies in History and Philosophy of Science 85:54–62. https://doi.org/10.1016/j.shpsa.2020.09.006.
Wilholt, Torsten. 2009. “Bias and Values in Scientific Research.” Studies in History and Philosophy of Science 40 (1):92–101. https://doi.org/10.1016/j.shpsa.2008.12.005.
Wilholt, Torsten. 2013. “Epistemic Trust in Science.” British Journal for the Philosophy of Science 64 (2):233–53. https://doi.org/10.1093/bjps/axs007.
Wylie, Alison. 2001. “Doing Social Science as a Feminist: The Engendering of Archaeology.” In Feminism in Twentieth-Century Science, Technology, and Medicine, edited by Creager, Angela N. H., Lunbeck, Elizabeth, and Schiebinger, Londa, 23–45. Chicago: University of Chicago Press.
Wylie, Alison, and Hankinson Nelson, Lynn. 2007. “Coming to Terms with the Values of Science: Insights from Feminist Science Studies Scholarship.” In Value-Free Science? Ideals and Illusions, edited by Kincaid, Harold, Dupre, John, and Wylie, Alison, 58–86. Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780195308969.003.0005.