The Social Dimensions of Scientific Knowledge

First published Fri Apr 12, 2002; substantive revision Thu Mar 7, 2013

Study of the social dimensions of scientific knowledge encompasses the effects of scientific research on human life and social relations, the effects of social relations and values on scientific research, and the social aspects of inquiry itself. Several factors have combined to make these questions salient to contemporary philosophy of science. These factors include the emergence of social movements, like environmentalism and feminism, critical of mainstream science; concerns about the social effects of science-based technologies; epistemological questions made salient by big science; new trends in the history of science, especially the move away from internalist historiography; anti-normative approaches in the sociology of science; and turns in philosophy to naturalism and pragmatism. This entry reviews the historical background to current research in this area, features of contemporary science which invite philosophical attention, the challenge to normative philosophy from social, cultural, and feminist studies of science, and the principal philosophical models of the social character of scientific knowledge. A concluding postscript previews developments to be covered in the next major update.


1. Historical Background

Philosophers who study the social character of scientific knowledge can trace their lineage at least as far as John Stuart Mill. Mill, Charles Sanders Peirce, and Karl Popper all took some type of critical interaction as central to the validation of knowledge claims.

Mill's arguments occur in his well-known political essay On Liberty (Mill 1859), rather than in the context of his logical and methodological writings, but he makes it clear that they are to apply to any kind of knowledge or truth claim. Mill argues from the fallibility of human knowers to the necessity of unobstructed opportunity for, and practice of, the critical discussion of ideas. Only such critical discussion can assure us of the justifiability of the (true) beliefs we do have, and only it can help us avoid the falsity or partiality of belief or opinion framed in the context of just one point of view. The achievement of knowledge, then, is a social or collective, not an individual, matter.

Peirce's contribution to the social epistemology of science is commonly taken to be his consensual theory of truth: “The opinion which is fated to be ultimately agreed to by all who investigate is what we mean by truth, and the object represented is the real.” (Peirce 1878, 133) While often read as meaning that the truth is whatever the community of inquirers converges on in the long run, the statement is more precisely interpretable as meaning either that truth (and “the real”) depends on the agreement of the community of inquirers or that it is an effect of the real that it will in the end produce agreement among inquirers. Whatever the correct reading of this particular statement, Peirce elsewhere makes it clear that, in his view, truth is both attainable and beyond the reach of any individual. “We individually cannot hope to attain the ultimate philosophy which we pursue; we can only seek it for the community of philosophers.” (Peirce 1868, 40) Peirce puts great stock in instigating doubt and critical interaction as means to knowledge. Thus, whether his theory of truth is consensualist or realist, his view of the practices by which we attain it grants a central place to dialogue and social interaction.

Popper is often treated as a precursor of social epistemology because of his emphasis on the importance of criticism in the development of scientific knowledge. Two concepts of criticism are found in his works (Popper 1963, 1972), and these can be related to logical and practical senses of falsification. The logical sense of falsification is just the structure of a modus tollens argument, in which a hypothesis is falsified by the demonstration that one of its logical consequences is false. This is one notion of criticism, but it is a matter of formal relations between statements. The practical sense of falsification refers to the efforts of scientists to demonstrate the inadequacies of one another's theories by demonstrating observational shortcomings or conceptual inconsistencies. This is a social activity. For Popper the methodology of science is falsificationist, and science progresses through the demonstration by falsification of the untenability of theories and hypotheses. Popper's falsificationism is part of an effort to demarcate genuine science from pseudoscience, and it has lost its plausibility as a description of scientific methodology as the demarcation project has come under challenge from naturalist and historicist approaches in philosophy of science. While criticism does play an important role in some current approaches in social epistemology, Popper's own views are more closely approximated by evolutionary epistemology, especially the version that treats cognitive progress as the effect of selection against incorrect theories and hypotheses.

The work of Mill, Peirce, and Popper is a resource for philosophers presently exploring the social dimensions of scientific knowledge. However, the current debates are framed in the context of developments both in philosophy of science and in history and social studies of science following the collapse of the logical empiricist consensus. The philosophers of the Vienna Circle are conventionally associated with an uncritical form of positivism and with the logical empiricism that replaced American pragmatism in the 1940s and 1950s. According to some recent scholars, however, they saw natural science as a potent force for progressive social change (Cartwright, Cat, and Chang 1996; Giere and Richardson, eds., 1996; Uebel 2005). With its grounding in observation and public forms of verification, science for them constituted a superior alternative to what they saw as metaphysical obscurantism, an obscurantism that led not only to bad thinking but to bad politics. One development of this point of view leads to scientism, the view that any meaningful question can be answered by the methods of science; another leads to inquiry into what social conditions promote the growth of scientific knowledge. Logical empiricism, the version of Vienna Circle philosophy that developed in the United States, focused on logical, internal aspects of scientific knowledge and discouraged philosophical inquiry into the social dimensions of science. These came into prominence again after the publication of Thomas Kuhn's Structure of Scientific Revolutions (Kuhn 1962). A new generation of sociologists of science took Kuhn's emphasis on the role of non-evidential community factors in scientific change even further than he had and argued that scientific judgment was determined by social factors, such as professional interests and political ideologies. This family of positions has provoked a counter-response among philosophers.
These responses are marked by an effort to grant some social character to scientific knowledge while at the same time maintaining its epistemological legitimacy, which they take to be undermined by the new sociology. At the same time, features of the organization of scientific inquiry compel philosophers to consider their implications for the normative analysis of scientific practices.

2. Big Science, Trust, and Authority

The second half of the twentieth century saw the emergence of what has come to be known as Big Science: the organization of large numbers of scientists bringing different bodies of expertise to a common research project. The original model was the Manhattan Project, undertaken during the Second World War to develop an atomic weapon. Theoretical and experimental physicists located at various sites across the country, though principally at Los Alamos, New Mexico, worked on sub-problems of the project under the overall direction of J. Robert Oppenheimer. While academic and military research have since been to some degree separated, much experimental research in physics, especially high energy particle physics, continues to be pursued by large teams of researchers. Research in other areas of science as well, for example the work comprehended under the umbrella of the Human Genome Project, has taken on some of the properties of Big Science, requiring multiple forms of expertise. In addition, the dependence of research on central funding bodies prompts questions about the degree of independence of contemporary scientific knowledge from its social and economic context.

John Hardwig (1985) articulated one philosophical dilemma posed by such large teams of researchers. Each member or subgroup participating in such a project is required because each has a crucial bit of expertise not possessed by any other member or subgroup. This may be knowledge of a part of the instrumentation, the ability to perform a certain kind of calculation, or the ability to make a certain kind of measurement or observation. The members are not in a position to evaluate one another's work, and hence all must take one another's results on trust. The consequence is an experimental result (for example, the measurement of a property such as the decay rate or spin of a given particle) the evidence for which is not fully understood by any single participant in the experiment. This leads Hardwig to ask two questions, one about the evidential status of testimony, and one about the nature of the knowing subject in these cases. With respect to the latter, Hardwig says that either the group as a whole, but no single member, knows, or it is possible to know vicariously. Neither of these is palatable to him. Talking about the group or the community knowing smacks of superorganisms and transcendent entities, and Hardwig shrinks from that solution. Vicarious knowledge, knowing without oneself possessing the evidence for the truth of what one knows, requires, according to Hardwig, too much of a departure from our ordinary concepts of knowledge.

The first question is, as Hardwig notes, part of a more general discussion about the epistemic value of testimony. Much of what passes for common knowledge is acquired from others. We depend on experts to tell us what is wrong with our appliances, our cars, our bodies. Indeed, much of what we later come to know depends on what we previously learned as children from our parents. We acquire knowledge of the world through the institutions of education, journalism, and scientific inquiry. Philosophers disagree about the status of beliefs acquired in this way. Here is the question: if A knows that p on the basis of evidence e, B has reason to think A trustworthy, and B believes that p on the basis of A's testimony that p, does B also know that p? Some philosophers, like Locke and Hume, argued that only what one has observed oneself could count as a good reason for belief, and that the testimony of another is, therefore, never sufficient warrant for belief. Thus, B does not know simply on the basis of A's testimony. While this result is consistent with traditional philosophical empiricism and rationalism, which emphasized the individual's sense experience or rational apprehension as foundations of knowledge, it does have the consequence that we do not know most of what we think we know.

A number of philosophers have recently offered alternative analyses focusing on one or another element in the problem. Some argue that testimony by a qualified expert is itself evidential (Schmitt 1988); others that the expert's evidence constitutes good reason for, but is not itself evidential for, the recipient of testimony (Hardwig 1985, 1988); still others that what is transmitted in testimony is knowledge, and not just propositional content, so that the question of what kind of reason a recipient of testimony has is not to the point (Welbourne 1981).

However this dispute is resolved, questions of trust and authority arise in a particularly pointed way in the sciences, and Hardwig's dilemma for the physics experiment is also a specific version of a more general phenomenon. A popular conception of science, fed partly by Popper's falsificationism, is that it is epistemically reliable because the results of experiments and studies are checked by independent repetition. In practice, however, only some results are so checked and many are simply accepted on trust. Thus, just as in the non-scientific world information is accepted on trust, so in science, knowledge grows by depending on the testimony of others. What are the implications of accepting this fact for our conceptions of the reliability of scientific knowledge?

David Hull, in his (1988), argues that because the overall structure of reward and punishment in the sciences is a powerful incentive not to cheat, further epistemological analysis of the sciences is unnecessary. But some celebrated recent episodes, such as the purported production of “cold fusion,” were characterized by the failure of attempts to replicate the phenomenon. And, while the advocates of cold fusion were convinced that their experiments had produced the phenomenon, there have also been cases of outright fraud. Thus, even if the structure of reward and punishment is an incentive not to cheat, it does not guarantee the veridicality of every research report.

The reward individual scientists seek is credit. That is, they seek recognition, to have their work cited as important and as necessary to further scientific progress. The scientific community seeks true theories or adequate models. Credit, or recognition, accrues to individuals to the extent they are perceived as having contributed to that community goal. Without strong community policing structures, there is a strong incentive to cheat, to try to obtain credit without necessarily having done the work. Communities and individuals are then faced with the question: when is it appropriate to trust and when not?

Both Alvin Goldman (Goldman and Cox 1994) and Philip Kitcher (1993) treat this as a question to be answered by means of decision theoretic models. The decision theoretic approach to problems of trust and authority treats both credit and truth as utilities. The challenge then is to devise formulas that show that actions designed to maximize credit also maximize truth. Kitcher, in particular, develops formulas intended to show that even in situations peopled by non-epistemically motivated individuals (that is, individuals motivated more by a desire for credit than by a desire for truth), the reward structure of the community can be organized in such a way as to maximize truth and foster scientific progress. Kitcher also applies this approach to problems in the division of cognitive labor, i.e. to the questions whether (and when) to pursue research that calls a community consensus into question or to pursue research that extends the models and theories upon which a community agrees.
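The flavor of these models can be suggested with a toy simulation. The sketch below is my own illustration of the general decision-theoretic idea, not Goldman's or Kitcher's actual formulas: scientists seek credit, credit for a successful program is shared among its pursuers, and that sharing alone distributes a community of pure credit-seekers across rival research programs. All names and numbers are invented for illustration.

```python
def expected_credit(prob_success, current_members, total_reward=1.0):
    # Expected credit from joining a program: it succeeds with probability
    # prob_success, and the reward is split among all who joined.
    return prob_success * total_reward / (current_members + 1)

def assign_community(n_scientists, prob_a, prob_b):
    # Each scientist in turn joins whichever rival program offers the
    # greater expected credit at the moment of choice.
    on_a = on_b = 0
    for _ in range(n_scientists):
        if expected_credit(prob_a, on_a) >= expected_credit(prob_b, on_b):
            on_a += 1
        else:
            on_b += 1
    return on_a, on_b

# Even though program A looks far more promising (0.8 vs. 0.2), credit-
# splitting pushes some credit-seekers onto rival program B, so the
# community as a whole hedges its bets.
print(assign_community(10, 0.8, 0.2))  # → (8, 2)
```

On this kind of model, a well-designed reward structure makes individually credit-maximizing choices add up to a community-level division of cognitive labor that serves the search for truth.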

Steve Fuller and Joseph Rouse are both concerned with the political dimensions of cognitive authority. Rouse, in his (1987), integrated analytic and continental philosophy of science and technology to develop what might be called a critical pragmatism. This perspective facilitated an analysis of the transformative impact of science on human life and social relations. Fuller (1988) partially accepts the empirical sociologists' claim that traditional normative accounts of scientific knowledge fail to get a purchase on actual scientific practices, but takes this as a challenge to relocate the normative concerns of philosophers. These should include the distribution and circulation of knowledge claims. The task of a social epistemology of science is to regulate the production of knowledge by regulating the rhetorical, technological, and administrative means of its communication.

Big science is typically interdisciplinary. A special case of interdisciplinary science is presented by risk assessment, which involves both research on the effects of various substances or practices and the evaluation of those effects once identified. The idea is to gain an understanding of both positive effects and of negative effects and a method of evaluating these. In this case, we are dealing not only with the problems of trust and authority among specialists from different disciplines, but also with the effects of new technologies in the social world. Typically, such assessment is prompted by the prospects of deploying science-based technologies. The risks studied are generally of harm to human health or to the environment. Interest in applying philosophical analysis to risk assessment originated in response to debates about the development and expansion of nuclear power-generating technologies. In addition, the application of cost-benefit analysis and attempts to understand decision-making under conditions of uncertainty became topics of interest as extensions of formal modeling techniques (Giere 1991). These discussions intersect with debates about the scope of rational decision theory and have expanded to include other technologies as well as applications of scientific research in agriculture and in the myriad forms of biological engineering. Essays on the relation between science and social values in risk research collected in the volume edited by Deborah Mayo and Rachelle Hollander (1991) attempt to steer a course between uncritical reliance on cost-benefit models and their absolute rejection. Coming from a slightly different angle, the precautionary principle represents an approach shifting the burden of proof in regulatory decisions from demonstration of harm to demonstration of safety of substances and practices. Carl Cranor (2004) explores versions of the principle and defends its use in certain decision contexts. 
Shrader-Frechette (2002) has advocated models of ethically weighted cost-benefit analysis and greater public involvement in risk assessment. Philosophers of science have also worked to make visible the ways in which values play a role in the research assessing the effects of technoscientifically produced substances and practices themselves, as distinct from the challenges of assigning values to identified risks and benefits. See Douglas 2000, Lacey 2005, Shrader-Frechette 1994.

3. Social, Cultural, and Feminist Studies of Science

Kuhn's critique of logical empiricism included a strong naturalism. Scientific rationality was to be understood by studying actual episodes in the history of science, not by formal analyses developed from a priori concepts of knowledge and reason (Kuhn 1962, 1977). Sociologists and sociologically inclined historians of science took this as a mandate for the examination of the full spectrum of scientists' practices without any prior prejudice as to which were epistemically legitimate and which not. That very distinction came under suspicion from the new social scholars, often labeled “social constructivists.” They urged that understanding the production of scientific knowledge required looking at all the factors causally relevant to the acceptance of a scientific idea, not just at those the researcher thinks should be relevant.

A wide range of approaches in social and cultural studies of science has come under the umbrella label of “social constructivism.” Both terms in the label are understood differently in different programs of research. While constructivists agree in holding that those factors treated as evidential, or as rationally justifying acceptance, should not be privileged at the expense of other causally relevant factors, they differ in their view of which factors are causal or worth examination. Macro-analytic approaches, such as those associated with the so-called Strong Programme in the Sociology of Scientific Knowledge, treat social relations as an external, independent variable and scientific judgment and content as a dependent variable. Micro-analyses or laboratory studies, on the other hand, abjure the implied separation of social context and scientific practice and focus on the social relations within scientific research programs and communities and on those that bind research-productive and research-receptive communities together.

Researchers also differ in the degree to which they treat the social and the cognitive dimensions of inquiry as independent or interactive. The researchers associated with the macro-analytic Strong Programme in the Sociology of Scientific Knowledge (Barry Barnes, David Bloor, Harry Collins, Donald MacKenzie, Andrew Pickering, Steven Shapin) were particularly interested in the role of large-scale social phenomena, whether widely held social/political ideologies or group professional interests, in the settlement of scientific controversies. Some landmark studies in this genre include Andrew Pickering's (1984) study of competing professional interests in the interpretation of high energy particle physics experiments, and Steven Shapin and Simon Schaffer's (1985) study of the controversy between Robert Boyle and Thomas Hobbes about the proper interpretation of experiments with vacuum pumps.

The micro-sociological or laboratory-studies approach features ethnographic study of particular research groups, tracing the myriad activities and interactions that eventuate in the production and acceptance of a scientific fact or datum. Karin Knorr Cetina (1981) reports her year-long study of a plant science laboratory at UC Berkeley. Bruno Latour and Steve Woolgar's (1986) study of Roger Guillemin's neuroendocrinology laboratory at the Salk Institute is another classic in this genre. These scholars argued in subsequent work that their form of study showed that philosophical analyses of rationality, of evidence, of truth and knowledge, were irrelevant to understanding scientific knowledge. Sharon Traweek's (1988) comparative study of the cultures of Japanese and North American high energy physics communities pointed to the parallels between cosmology and social organization without making such extravagant and provocative epistemological claims. The efforts of philosophers of science to articulate norms of scientific reasoning and judgment were, to all these scholars, misdirected, because actual scientists relied on quite different kinds of considerations in the practice of science.

Until recently, apart from a few anomalous figures like Caroline Herschel, Barbara McClintock, and Marie Curie, the sciences were a male preserve. Feminist scholars have asked what bearing the masculinity of the scientific profession has had on the content of science and on conceptions of scientific knowledge and practice. Drawing on work by feminist scientists, exposing and critiquing gender biased science, and on theories of gender, feminist historians and philosophers of science have offered a variety of models of scientific knowledge and reasoning intended to accommodate the critique of accepted science and the concomitant proposal and advocacy of alternatives. Evelyn Keller (1985) proposed a psycho-dynamic model of knowledge and objectivity, arguing that a certain psychological profile, facilitated by typical patterns of masculine psychological development, associated knowledge and objectivity with domination. The association of knowledge and control continues to be a topic of concern for feminist thinkers as it is also for environmentally concerned critics of the sciences. In this connection, see especially Lacey's (2005) study of the controversy concerning transgenic crops. Other feminists turned to Marxist models of social relations and developed versions of standpoint theory, which holds that the beliefs held by a group reflect the social interests of that group. As a consequence, the scientific theories accepted in a context marked by divisions of power such as gender will reflect the interests of those in power. Alternative theoretical perspectives can be expected from those systematically excluded from power. (Rose 1983; Haraway 1978).

Still other feminists have argued that some standard philosophical approaches to the sciences can be used to express feminist concerns. Nelson (1990) adopts Quine's holism and naturalism to analyze debates in recent biology. Elizabeth Potter (2001) adapts Mary Hesse's network theory of scientific inference to analyze gendered aspects of 17th century physics. Helen Longino (1990) develops a contextual empiricism to analyze research in human evolution and in neuroendocrinology. In addition to the direct role played by gender bias, scholars have attended to the ways shared values in the context of reception can confer an a priori implausibility on certain ideas. Keller (1983) argued that this was the fate of Barbara McClintock's unorthodox proposals of genetic transposition. Stephen Kellert (1993) makes a similar suggestion regarding the resistance to so-called chaos theory.

What the feminist and empirical sociological analyses have in common is the view that the social organization of the scientific community has a bearing on the knowledge produced by that community. There are deep differences, however, in their views as to which features of that social organization are deemed relevant and how they are expressed in the theories and models accepted by a given community. The gender relations focused on by feminists went unrecognized by sociologists pursuing macro- or micro-sociological research programs. The feminist scientists and scholars further differ from the scholars in empirical social and cultural studies of science in their call for alternative theories and approaches in the sciences. These calls imply that philosophical concerns with truth and justification are not only legitimate but useful tools in advancing feminist transformative goals for the sciences. As can be seen in their varying treatments of objectivity, however, philosophical concepts are often reworked in order to be made applicable to the content or episodes of interest (see Anderson 2004, Haraway 1988, Harding 1993, Keller 1985, Longino 1990, Nelson 1990, Wylie 2005).

4. Models of the Social Character of Knowledge

Since 1980, interest in developing philosophical accounts of scientific knowledge that incorporate the social dimensions of scientific practice has been on the increase. Some philosophers see attention to the social as a straightforward extension of already developed approaches in epistemology. Others, inclined toward some form of naturalism, have taken the work in empirical social studies of science discussed above seriously. They have, however, diverged quite considerably in their treatment of the social. Some understand the social as biasing or distorting, and hence see the social as opposed to or competing with the cognitive or epistemic. These philosophers see the sociologists' disdain for normative philosophical concerns as part of a general debunking of science that demands a response. They attempt either to rebut the claims of the sociologists or to reconcile the demonstration of the role of interests in science with its ultimate rationality. Others treat the social as instead constitutive of rationality. This division parallels to some degree the division between macro-analyses and micro-analyses in the sociology of science described above.

Philosophers who treat the social as biasing or distorting tend to focus on the constructivists' view that there are no universal principles of rationality or principles of evidence that can be used to identify in any context-independent way which factors are evidential and which not. They can be divided into roughly two camps: defenders of rationality and reconciliationists who seek to disarm the sociologists' analyses by incorporating them into a broader rational framework. 

Philosophers concerned to defend the rationality of science against sociological misrepresentations include Larry Laudan (1984), James Brown (1989, 1994), Alvin Goldman (1987, 1995), and Susan Haack (1996). The details of these philosophers' approaches differ, but they agree in holding that scientists are persuaded by what they regard as the best evidence or argument, the evidence most indicative of the truth by their lights, and in holding that arguments and evidence are the appropriate focus of attention for understanding the production of scientific knowledge. When evidential considerations have not trumped non-evidential considerations, we have an instance of bad science. They read the sociologists as arguing that a principled distinction between evidential and non-evidential considerations cannot be drawn, and they devote their efforts to refuting those arguments. The social character of science is understood as a matter of the aggregation of individuals, not their interactions, and public knowledge as simply the additive outcome of many individuals making sound epistemic judgments. Individual rationality and individual knowledge are thus the proper focus of philosophers of science. Exhibiting principles of rationality applicable to individual reasoning is sufficient to demonstrate the rationality of science, at least in its ideal form.

Reconciliationists include Ronald Giere, Mary Hesse, and Philip Kitcher. Giere (1988) models scientific judgment using decision theory. This permits incorporating scientists' interests as one of the parameters of the decision matrix. Mary Hesse (1980) employs a network model of scientific inference that resembles W.V.O. Quine's web of belief in that its constituents are heterogeneous in character, but all subject to revision in relation to changes elsewhere in the network. She understands the social factors as coherence conditions operating in tandem with logical constraints to determine the relative plausibility of beliefs in the network.

The most elaborate reconciliationist position is that developed in Philip Kitcher's (1993). In addition to modeling relations of authority and the division of cognitive labor as described above, he offers what he terms a compromise between extreme rationalists and sociological debunkers. The compromise model appeals to a principle of rationality, which Kitcher calls the External Standard. It is deemed external because it is proposed as holding independently of any particular historical, cultural, or social context. Thus, not only is it external, but it is also universal. The principle applies to change of belief (or shift from one practice to another, in Kitcher's broader locution), not to belief itself. It treats a shift (in practice or belief) as rational if and only if “the process through which the shift was made has a success ratio at least as high as that of any other process used by human beings (ever) ...” (Kitcher 1993, 303). Kitcher's compromise proposes that scientific ideas develop over time and benefit from the contributions of many differently motivated researchers. This is the concession to the sociologically oriented scholars. In the end, however, those theories that get accepted are those that satisfy Kitcher's External Standard. Kitcher thus joins Goldman, Haack, and Laudan in the view that it is possible to articulate a priori conditions of rationality or of epistemic warrant that operate independently of, or, perhaps one might say, orthogonally to, the social relations of science.
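Read this way, the External Standard amounts to a comparison of success ratios across belief-forming processes. The following sketch is a hedged gloss of that comparison, not Kitcher's own formalism; the processes and their track records are invented for illustration.

```python
def success_ratio(record):
    # record is a (successes, trials) pair for a belief-forming process.
    successes, trials = record
    return successes / trials

def meets_external_standard(process, all_processes):
    # A shift produced by `process` counts as rational iff its success
    # ratio is at least as high as that of every process ever used.
    return all(success_ratio(process) >= success_ratio(p) for p in all_processes)

track_records = {
    "controlled experiment": (45, 50),
    "expert testimony": (30, 50),
    "guesswork": (5, 50),
}
print(meets_external_standard(track_records["controlled experiment"],
                              track_records.values()))  # → True
print(meets_external_standard(track_records["expert testimony"],
                              track_records.values()))  # → False
```

The gloss makes the universality of the standard vivid: the quantifier ranges over every process "used by human beings (ever)," not just those current in a given community.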

A third set of models is integrationist in character. Nelson (1990) uses Quine's arguments against the independently foundational status of observation statements as the basis for what she calls a feminist empiricism. According to Nelson, no principled distinction can be made between the theories, observations, or values of a community. What counts as evidence, in her view, is fixed by the entire complex of a community's theories, value commitments, and observations. There is neither knowledge nor evidence apart from such a shared complex. The community is the primary knower on this view and individual knowledge is dependent on the knowledge and values of the community.

Miriam Solomon's social empiricism is focused on scientific rationality (Solomon 1992, 1994a, 1994b). It, too, involves denying a universal principled distinction among the causes of belief. Solomon draws on contemporary cognitive science literature to argue that biases are simply any factors that influence belief. They are not necessarily distorting, and can be productive of insight and rational belief. Salience and availability (of data, of measurement technologies) are biases as much as social ideologies. The distinctive feature of Solomon's social empiricism is her contrast between individual and community rationality. The theory or belief that it is rational to accept is that which has the greatest amount of empirical success. On this view, individuals can persist in beliefs that are less rational than others if the totality of available evidence (or empirical data) is not available to them. What matters to science, however, is that community judgments be rational. A community is rational when the theories it accepts are those with all or the most empirical successes. Thus, the community can be rational even when its members are irrational. Indeed, individual irrationality can contribute to community rationality in that individuals committed to a theory that accounts for their data keep that data in the range of phenomena any theory accepted by the entire community must eventually explain. In order that the totality of relevant constraints on theory acceptance remain available to the entire community, biases must be appropriately distributed. Thus Solomon proposes appropriate distribution of biases as a normative condition on the structure of scientific communities.

Finally, in Longino's critical contextual empiricism, the cognitive processes that eventuate in scientific knowledge are themselves social (Longino 1990). Longino's starting point is a version of the underdetermination argument: the semantic gap between statements describing data and statements expressing hypotheses or theories to be confirmed or disconfirmed by that data means that evidential relations cannot be formally specified and that data cannot support one theory or hypothesis to the exclusion of all alternatives. Instead, such relations are mediated by background assumptions. Eventually, in the chain of justification, one reaches assumptions for which no evidence is available. If these are the context in which evidential relations are constituted, questions arise concerning how the acceptance of such assumptions can be legitimated. According to Longino, the only check against the arbitrary dominance of subjective (metaphysical, political, aesthetic) preference in such cases is critical interaction among the members of the scientific community or among members of different communities. Longino takes the underdetermination argument to express in logical terms the point made by the sociologically oriented researchers: the individuals participating in the production of scientific knowledge are historically, geographically, and socially situated and their observations and reasoning reflect their situations. This fact does not undermine the normative enterprise of philosophy, but requires its expansion to include within its scope the social interactions within and between scientific communities. What counts as knowledge is determined by such interactions. Longino claims that scientific communities do institutionalize some critical practices (for example, peer review), but argues that such practices and institutions must satisfy conditions of effectiveness in order to qualify as objective.

5. Social Direction of Science

Modern science has been regarded as both a model of democratic self-governance and an activity requiring and facilitating democratic practices in its supporting social context (Popper 1950, Bronowski 1956). In this perspective, science is seen as embedded in and dependent on its supporting social context, but insulated in its practices from the influence of that context. As the reach of science and science-based technologies has extended further and further into the economy and daily life of industrialized societies, new attention is paid to the governance of science. Regardless of one's views about the social character of knowledge, there are further questions concerning what research to pursue, what social resources to devote to it, who should make such decisions, and how they should be made.

Philip Kitcher, in the concluding chapters of Science, Truth, and Democracy (2001), has opened these questions to philosophical scrutiny. Kitcher largely endorses the epistemological views of his (1993). In this new work, however, he argues that there is no absolute standard of the significance (practical or epistemic) of research projects, nor any standard of the good apart from subjective preferences. The only non-arbitrary way to defend judgments concerning research agendas in the absence of absolute standards is through democratic means of establishing collective preferences. Kitcher, thus, attempts to spell out procedures by which decisions concerning what research directions to pursue can be made in a democratic manner. The result, which he calls well-ordered science, is a system in which the decisions actually made track the decisions that would be made by a suitably constituted representative body collectively deliberating with the assistance of relevant information (concerning, e.g., cost and feasibility) supplied by experts.

Kitcher's “well-ordered science” has attracted attention from other philosophers, from scientists, and from scholars of public policy. Winning praise as a first step, it has also elicited a variety of criticisms and further questions. The criticisms of his proposal range from worries about the excessive idealism of the conception to worries that it will enshrine the preferences of a much smaller group than those who will be affected by research decisions. Kitcher's proposal at best works for a system in which all or most scientific research is publicly funded. But the proportion of private, corporate, funding of science compared to that of public funding has been increasing, thus calling into question the effectiveness of a model that presupposes largely public control (Mirowski and Sent 2002, Krimsky 2003). Kitcher's model, it should be noted, still effects a significant separation between the actual conduct of research and decisions concerning the direction of research, and scholars who see a more intimate relation between the social processes and values in the context of research and those in its conduct will be dissatisfied with it.

The counterfactual character of the proposal raises questions about the extent to which well-ordered science really is democratic. If the actual decisions do not need to be the result of democratic procedures but only to be the same as those that would result from such procedures, how do we know which decisions those are without actually going through the deliberative exercise? Even if the process is actually carried out, there are places, e.g. in the choice of experts whose advice is sought, which permit individual preferences to subvert or bias the preferences of the whole (Roth 2003). Furthermore, given that the effects of scientific research are potentially global, while democratic decisions are at best national, national decisions will have an effect well beyond the population represented by the decision makers. Sheila Jasanoff has also commented that even in contemporary industrialized democracies there are quite different science governance regimes. There is not one model of democratic decision making, but many, and the differences translate into quite different policies (Jasanoff 2005).

6. Conclusion

Philosophical study of the social dimensions of scientific knowledge has been intensifying in the decades since 1970. Social controversies about the sciences and science-based technologies as well as developments in philosophical naturalism and social epistemology combine to drive thinking in this area forward. Scholars in a number of cognate disciplines continue to investigate the myriad social relations within scientific communities and between them and their social, economic, and institutional contexts.

In the interval since the last major update of this entry, there has been increased interest in several of the topics falling under its purview. Philosophers have attempted to model the distribution of cognitive labor in ways that take into account more realistic features of scientific communities. Strevens (2003) attends to the priority rule in science and argues that it offers a rationale for allocating resources, and hence explains the distribution among leaders and followers. Muldoon and Weisberg (2011) take exception to accounts like Strevens's and Kitcher's before him as relying on unrealistically uniform and ideal agents. Instead they focus on modeling situations in which agents have imperfect, partial, and different information. In a slightly different twist on the theme of distribution, Giere (2002) takes a naturalist approach to modeling, not so much the distribution of cognitive labor, but the distribution of cognition. This approach takes a system or interactive community as the locus of cognition, rather than the individual agent. Nersessian (2006) extends distributed cognition to model-based reasoning in the sciences. Paul Thagard's emphasis on the interdisciplinary (and hence social) nature of cognitive science itself introduces a distinctive social perspective to the cognitive science of science (Thagard 2012).

Models of the social character of scientific knowledge will have to take account of Daston and Galison (2010) as well as Hacking (2004), which approach the epistemology of science historically, seeking to identify changes in the salience of epistemological desiderata and to situate those changes in the changing circumstances of research and its context. Arthur Fine (2007) defends what he calls non-idiot relativism, by articulating the relativism of social constructivists in a pragmatist idiom. Giere (2006) introduces perspectivalism as a new entrant in the effort to model the character of diversity in the sciences. The essays collected in Kellert, Longino, and Waters (2006) argue for a pluralist attitude towards the theoretical diversity identified in a variety of scientific subfields. Van Fraassen's (2008) emphasis on the partiality and perspective dependence of measurement provides another point of entry to such diversity. These investigations provide both source material for philosophical analysis and challenges to conventional approaches to understanding scientific knowledge.

While philosophers were initially focused on what might be termed narrowly epistemic concerns in their response to this work, they are expanding that focus to include attention to the ethical and political questions its analyses make salient. One issue that has drawn attention is the apparent disconnect between the scientific consensus regarding climate change and public attitudes. There are standard issues, such as what we owe future generations, not to mention those communities presently affected by such phenomena as rising sea levels, that are in the domain of ethics. However, the resistance to accepting the apparent scientific consensus has led philosophers to think about what models of knowledge would facilitate overcoming such resistance (Anderson 2011; Kitcher 2011). As the old distinction between basic and applied research fades in favor of more complex understandings of the relation of research and the hopes of using it in real world settings, philosophers have considered what features actually make for reliability of research (Cartwright and Hardie 2012; Douglas 2009). Finally, publication practices and uptake are attracting renewed attention. Here the work of Carole Lee (2013) on bias in peer review is of particular interest. And the work of John Ioannidis (Tatsioni, Bonitsis, and Ioannidis 2007; Young, Ioannidis, and Al-Ubaydli 2008) raises questions about publication practices in assuring the reliability of research. It also reflects back on the priority rule as possibly having a contaminating effect on the quality of knowledge, even as it drives the distribution of cognitive labor.

Bibliography

Works Cited

  • Anderson, Elizabeth, 2004. “Uses of Value Judgements in Science” Hypatia, 19, 1–24.
  • –––, 2011. “Democracy, Public Policy, and Lay Assessments of Scientific Testimony,” Episteme, 8(2): 144–164.
  • Barnes, Barry and David Bloor, 1982. “Relativism, Rationalism, and the Sociology of Knowledge,” in Rationality and Relativism, eds. Martin Hollis and Steven Lukes, pp. 21–47, Oxford: B. Blackwell.
  • Bronowski, Jacob, 1956. Science and Human Values, New York: Harper and Bros.
  • Brown, James, 1989. The Rational and the Social, London: Routledge.
  • –––, 1994. Smoke and Mirrors: How Science Reflects Reality, New York: Routledge.
  • Cartwright, Nancy, with Jordi Cat, Lola Fleck, and Hasok Chang, 1996. Otto Neurath: Philosophy Between Science and Politics, New York, NY: Cambridge University Press.
  • Cartwright, Nancy, and J. Hardie, 2012. Evidence-Based Policy: A Practical Guide to Doing It Better, New York, NY: Oxford University Press.
  • Cranor, Carl F., 2004. “Toward Understanding Aspects of the Precautionary Principle,” Journal of Medicine and Philosophy, 29(3): 259–79.
  • Daston, Lorraine, and Peter Galison, 2010. Objectivity, Cambridge, MA: MIT Press.
  • Douglas, Heather, 2000. “Inductive Risk and Values in Science,” Philosophy of Science, 67(4): 559–579.
  • –––, 2009. Science, Policy, and the Value-Free Ideal, Pittsburgh, PA: University of Pittsburgh Press.
  • Fine, Arthur, 2007. “Relativism, Pragmatism, and the Practice of Science,” in New Pragmatists, Cheryl Misak (ed.), pp. 50–67, Oxford: Oxford University Press.
  • Fuller, Steve, 1988. Social Epistemology, Bloomington, IN: Indiana University Press.
  • Giere, Ronald, 1988. Explaining Science: A Cognitive Approach, Chicago: University of Chicago Press.
  • –––, 1991. “Knowledge, Values, and Technological Decisions: A Decision Theoretical Approach,” in Acceptable Evidence: Science and Values in Risk Management, Deborah Mayo and Rachelle Hollander (eds.), pp. 183–203, New York: Oxford University Press.
  • –––, 2002. “Scientific Cognition as Distributed Cognition,” in The Cognitive Basis of Science, Peter Carruthers, Stephen Stich, and Michael Siegal (eds.), Cambridge, UK: Cambridge University Press.
  • –––, 2006. Scientific Perspectivism, Chicago, IL: University of Chicago Press.
  • Giere, Ronald, and Alan Richardson (eds.), 1996. Origins of Logical Empiricism (Minnesota Studies in the Philosophy of Science, Vol. XVI), Minneapolis, MN: University of Minnesota Press.
  • Goldman, Alvin, 1987. “The Foundations of Social Epistemics,” Synthese, 73(1): 109–144.
  • –––, 1995. “Psychological, Social and Epistemic Factors in the Theory of Science,” in PSA 1994: Proceedings of the 1994 Biennial Meeting of the Philosophy of Science Association, Richard Burian, Mickey Forbes, and David Hull (eds.), pp. 277–286, East Lansing, MI: Philosophy of Science Association.
  • Haack, Susan, 1996. “Science as Social: Yes and No,” in Feminism, Science, and the Philosophy of Science, Lynn Hankinson Nelson and Jack Nelson (eds.), pp. 79–94, Dordrecht: Kluwer Academic Publishers.
  • Hacking, Ian, 2004. Historical Ontology, Cambridge, MA: Harvard University Press.
  • Haraway, Donna, 1978. “Animal Sociology and a Natural Economy of the Body Politic (Part II),” Signs, 4(1): 37–60.
  • –––, 1988. “Situated Knowledges,” Feminist Studies, 14(3): 575–600.
  • Harding, Sandra, 1993. “Rethinking Standpoint Epistemology,” in Feminist Epistemologies, Linda Alcoff and Elizabeth Potter (eds.), pp. 49–82, New York: Routledge.
  • Hardwig, John, 1985. “Epistemic Dependence,” Journal of Philosophy, 82(7): 335–349.
  • –––, 1988. “Evidence, Testimony, and the Problem of Individualism,” Social Epistemology, 2(4): 309–21.
  • Hesse, Mary, 1980. Revolutions and Reconstructions in the Philosophy of Science, Bloomington, IN: Indiana University Press.
  • Hull, David, 1988. Science As a Process: An Evolutionary Account of the Social and Conceptual Development of Science, Chicago: University of Chicago Press.
  • Jasanoff, Sheila, 2005. Designs on Nature: Science and Democracy in Europe and the United States, Princeton: Princeton University Press.
  • Keller, Evelyn Fox, 1983. A Feeling for the Organism: The Life and Work of Barbara McClintock, San Francisco: W.H. Freeman.
  • –––, 1985. Reflections on Gender and Science, New Haven: Yale University Press.
  • Kellert, Stephen, 1993. In the Wake of Chaos, Chicago: University of Chicago Press.
  • Kellert, Stephen, with Helen Longino, and C. Kenneth Waters (eds.), 2006. Scientific Pluralism (Minnesota Studies in the Philosophy of Science, Vol. XIX), Minneapolis: University of Minnesota Press.
  • Kitcher, Philip, 1993. The Advancement of Science: Science Without Legend, Objectivity Without Illusions, Oxford: Oxford University Press.
  • –––, 2001. Science, Truth, and Democracy, New York: Oxford University Press.
  • –––, 2011. Science in a Democratic Society, Amherst, NY: Prometheus Books.
  • Knorr-Cetina, Karin, 1981. The Manufacture of Knowledge, Oxford: Pergamon Press.
  • Krimsky, Sheldon, 2003. Science in the Private Interest, Lanham: Rowman and Littlefield.
  • Kuhn, Thomas, 1962. The Structure of Scientific Revolutions, Chicago: University of Chicago Press.
  • –––, 1977. The Essential Tension: Selected Studies in Scientific Tradition and Change, Chicago: University of Chicago Press.
  • Lacey, Hugh, 2005. Values and Objectivity: The Controversy over Transgenic Crops, Lanham: Rowman and Littlefield.
  • Latour, Bruno and Steve Woolgar, 1986. Laboratory Life: The Construction of Scientific Facts, 2nd ed., Princeton: Princeton University Press.
  • Laudan, Larry, 1984a. “The Pseudo-Science of Science?” in Scientific Rationality: The Sociological Turn, James Brown (ed.), pp. 41–74, Dordrecht: D. Reidel.
  • Lee, Carole J., 2013. “Bias in Peer Review,” Journal of the American Society for Information Science and Technology, 64(1): 2–17.
  • Longino, Helen, 1990. Science as Social Knowledge: Values and Objectivity in Scientific Inquiry, Princeton: Princeton University Press.
  • Mayo, Deborah, and Rachelle Hollander (eds.), 1991. Acceptable Evidence: Science and Values in Risk Management, New York: Oxford University Press.
  • Mill, John Stuart, 1859. On Liberty, London: John W. Parker and Son; reprinted 1974, 1982, Gertrude Himmelfarb (ed.), Harmondsworth: Penguin.
  • Mirowski, Philip, and Esther-Mirjam Sent (eds.), 2002. Science Bought and Sold, Chicago: University of Chicago Press.
  • Muldoon, Ryan, and Michael Weisberg, 2011. “Robustness and Idealization in Models of Cognitive Labor,” Synthese, 183: 161–174.
  • Nersessian, Nancy J., 2006. “Model-Based Reasoning in Distributed Cognitive Systems,” Philosophy of Science, 73(5): 699–709.
  • Nelson, Lynn Hankinson, 1990. Who Knows: From Quine to Feminist Empiricism, Philadelphia: Temple University Press.
  • Peirce, Charles S., 1868. “Some Consequences of Four Incapacities,” Journal of Speculative Philosophy, 2: 140–157; reprinted in C.S. Peirce, Selected Writings, Philip Wiener (ed.), New York: Dover Publications, 1958, pp. 39–72.
  • –––, 1878. “How to Make Our Ideas Clear,” Popular Science Monthly, 12: 286–302; reprinted in C.S. Peirce, Selected Writings, Philip Wiener (ed.), New York: Dover Publications, 1958, pp. 114–136.
  • Pickering, Andrew, 1984. Constructing Quarks: A Sociological History of Particle Physics, Edinburgh: Edinburgh University Press.
  • Popper, Karl, 1950. The Open Society and its Enemies, Princeton: Princeton University Press.
  • –––, 1963. Conjectures and Refutations, London: Routledge and Kegan Paul.
  • –––, 1972. Objective Knowledge, Oxford: Oxford University Press.
  • Potter, Elizabeth, 2001. Gender and Boyle's Law of Gases, Bloomington: Indiana University Press.
  • Rose, Hilary, 1983. “Hand, Brain, and Heart,” Signs, 9(1): 73–96.
  • Roth, Paul, 2003. “Kitcher's Two Cultures,” Philosophy of the Social Sciences, 33(3): 386–405.
  • Rouse, Joseph, 1987. Knowledge and Power: Toward a Political Philosophy of Science, Ithaca: Cornell University Press.
  • Schmitt, Frederick, 1988. “On the Road to Social Epistemic Interdependence,” Social Epistemology, 2: 297–307.
  • Shapin, Steven and Simon Schaffer, 1985. Leviathan and the Air Pump, Princeton: Princeton University Press.
  • Shrader-Frechette, Kristin, 1994. “Expert Judgment and Nuclear Risks: The Case for More Populist Policy,” Journal of Social Philosophy, 25: 45–70.
  • –––, 2002. Environmental Justice: Creating Equality; Reclaiming Democracy, New York: Oxford University Press.
  • Solomon, Miriam, 1992. “Scientific Rationality and Human Reasoning,” Philosophy of Science, 59(3): 439–54.
  • –––, 1994a. “Social Empiricism,” Noûs, 28(3): 323–343.
  • –––, 1994b. “A More Social Epistemology,” in Socializing Epistemology: The Social Dimensions of Knowledge, Frederick Schmitt (ed.), pp. 217–233, Lanham: Rowman and Littlefield Publishers.
  • Strevens, Michael, 2003. “The Role of the Priority Rule in Science,” Journal of Philosophy, 100: 55–79.
  • Tatsioni, Athina, with Nikolaos Bonitsis, and John Ioannidis, 2007. “The Persistence of Contradicted Claims in the Literature,” Journal of the American Medical Association, 298(21): 2517–26.
  • Thagard, Paul, 2012. The Cognitive Science of Science: Explanation, Discovery, and Conceptual Change, Cambridge, MA: MIT Press.
  • Traweek, Sharon, 1988. Beamtimes and Lifetimes: The World of High Energy Physicists, Cambridge, MA: Harvard University Press.
  • Uebel, Thomas, 2004. “Political Philosophy of Science in Logical Empiricism: The Left Vienna Circle,” Studies in History and Philosophy of Science, 36: 754–773.
  • van Fraassen, Bas, 2008. Scientific Representation, New York: Oxford University Press.
  • Welbourne, Michael, 1981. “The Community of Knowledge,” Philosophical Quarterly, 31(125): 302–314.
  • Wylie, Alison, 2002. Thinking from Things, Los Angeles: University of California Press.
  • Young, N.S., with John Ioannidis, and O. Al-Ubaydli, 2008. “Why Current Publication Practices May Harm Science,” Public Library of Science Medicine, 5(10): e201, doi: 10.1371/journal.pmed.0050201.

Further Reading

  • Fleck, Ludwig, 1973. The Genesis and Development of a Scientific Fact, Chicago: University of Chicago Press.
  • Goldman, Alvin, 1999. Knowledge in a Social World, New York: Oxford University Press.
  • Hacking, Ian, 1999. The Social Construction of What?, Cambridge, MA: Harvard University Press.
  • Latour, Bruno, 2004. Politics of Nature: How to Bring the Sciences into Democracy, Cambridge, MA: Harvard University Press.
  • Levi, Isaac, 1980. The Enterprise of Knowledge, Cambridge, MA: MIT Press.
  • Longino, Helen E., 2002. The Fate of Knowledge, Princeton: Princeton University Press.
  • McMullin, Ernan (ed.), 1992. Social Dimensions of Scientific Knowledge, South Bend: Notre Dame University Press.
  • Sismondo, Sergio, 1996. Science Without Myth, Albany: State University of New York Press.
  • Solomon, Miriam, 2001. Social Empiricism, Cambridge, MA: MIT Press.

Academic Tools

How to cite this entry.
Preview the PDF version of this entry at the Friends of the SEP Society.
Look up this entry topic at the Indiana Philosophy Ontology Project (InPhO).
Enhanced bibliography for this entry at PhilPapers, with links to its database.

Other Internet Resources

[Please contact the author with suggestions.]

Related Entries

epistemology: evolutionary | epistemology: social | ethics: environmental | feminist (interventions): epistemology and philosophy of science | Kuhn, Thomas | Mill, John Stuart | naturalism | Peirce, Charles Sanders | Popper, Karl | pragmatism | rationality: historicist theories of

Copyright © 2013 by
Helen Longino <hlongino@stanford.edu>
