Social epistemology is the study of the social dimensions of knowledge or information. There is little consensus, however, on what the term "knowledge" comprehends, what is the scope of the "social", or what the style or purpose of the study should be. According to some writers, social epistemology should retain the same general mission as classical epistemology, revamped in the recognition that classical epistemology was too individualistic. According to other writers, social epistemology should be a more radical departure from classical epistemology, a successor discipline that would replace epistemology as traditionally conceived. The classical approach could be realized in at least two forms. One would emphasize the traditional epistemic goal of acquiring true beliefs. It would study social practices in terms of their impact on the truth-values of agents' beliefs. A second version of the classical approach would focus on the epistemic goal of having justified or rational beliefs. Applied to the social realm, it might concentrate, for example, on when a cognitive agent is justified or warranted in accepting the statements and opinions of others. Proponents of the anti-classical approach have little or no use for concepts like truth and justification. In addressing the social dimensions of knowledge, they understand "knowledge" as simply what is believed, or what beliefs are "institutionalized" in this or that community, culture, or context. They seek to identify the social forces and influences responsible for knowledge production so conceived. Social epistemology is theoretically significant because of the central role of society in the knowledge-forming process. It also has practical importance because of its possible role in the redesign of information-related social institutions.
- 1. History of Social Epistemology
- 2. Classical Approaches
- 3. Anti-Classical Approaches
- 4. Conceptions of the Social
- 5. Theoretical Questions for Social Epistemology
- 6. Questions of Institutional Design in Social Epistemology
- 7. Conclusion
- Other Internet Resources
- Related Entries
The phrase "social epistemology" does not have a long history of systematic use. It is not difficult, however, to find historical philosophers who made at least brief forays into the social dimensions of knowledge or rational belief. In his dialogue Charmides, Plato posed the question of how a layperson can determine whether someone who purports to be an expert in an area really is one. Since dependence on experts or authorities is a problem within the scope of social epistemology, this was a mini-exploration of the subject. The seventeenth and eighteenth century British philosophers John Locke, David Hume, and Thomas Reid devoted portions of their epistemologies — often just scattered remarks — to the problem of "testimony": When should cognitive agents rely on the opinions and reports of others? What must a hearer know about a speaker to be entitled to trust his assertions? Locke so emphasized the importance of intellectual self-reliance that he expressed strong doubts about giving authority to the opinions of others (1959, I. iii. 23). Hume took it for granted that we regularly rely on the factual statements of others, but insisted that it is reasonable to do so only to the extent that we have adequate reasons for thinking that these sources are reliable. Hume's empiricism led him to require that these reasons be based on personal observations that establish the veracity of human testimony (Hume 1975, X, 111). Reid, by contrast, claimed that our natural attitude of trusting others is reasonable even if we know little if anything about their reliability. Testimony, at least sincere testimony, is always prima facie credible (Reid 1975, VI, xxiv). All of these positions, of course, are epistemological positions. However, they were generally part of an epistemological enterprise that was basically egocentric in orientation, so they are perhaps not ideal or pure paradigms of social epistemology. 
Nonetheless, they are clear examples of early epistemologies that examined social dimensions of epistemic justification.
A different tradition focused on aspects of knowledge that are "social" in a more sociological or political sense, though members of this tradition less frequently aligned their work to core issues in epistemology. Karl Marx's theory of ideology could well be considered a type of social epistemology. On one interpretation of Marx's conception of "ideology", an ideology is a set of beliefs, a world-view, or a form of consciousness that is in some fashion false or delusive. The cause of these beliefs, and perhaps of their delusiveness, is the social situation and interests of the believers. Since the theory of ideology, so described, is concerned with the truth and falsity of beliefs, it might even be considered a form of classical social epistemology.
Karl Mannheim (1936) extended Marx's theory of ideology into a sociology of knowledge. He classed forms of consciousness as ideological when the thoughts of a social group can be traced to the group's social situation or "life conditions" (1936: 78). The descriptive enterprise of tracing these thoughts to the social situation might be construed as social epistemology. The further enterprise of critiquing and dissolving ideological delusions — "Ideologiekritik" — is surely a form of social epistemology. The critical theory of the Frankfurt School was one attempt, or a family of attempts, to develop this idea. Critical theory aims at emancipation and enlightenment by making agents aware of hidden coercion in their environment, enabling them to determine where their true interests lie (Geuss 1981: 54). In a variant of critical theory, Jürgen Habermas introduced the idea of an "ideal speech situation", a hypothetical situation of absolutely uncoerced and unlimited discussion between completely free and equal human agents (Habermas 1973; Geuss 1981: 65). In some writings Habermas uses the ideal speech situation as a transcendental criterion of truth. Beliefs that agents would agree upon in the ideal speech situation are ipso facto true beliefs (Habermas and Luhmann 1971: 139, 224). Here a social communicational device is treated as a type of epistemic standard.
Subsequent developments in the sociology of knowledge, and especially in the sociology of science, can also be considered forms of social epistemology. Since science is widely considered the paradigmatic knowledge-producing enterprise, and since epistemology is centrally concerned with knowledge, any endeavor that seeks to identify social determinants of science might plausibly be categorized as a form of social epistemology. Both Mannheim and the sociologist of science Robert Merton (1973) exempted (natural) science from the influence of societal or "existential" factors of the types that influence other categories of beliefs. Science was viewed as a society unto itself, largely autonomous from the rest of society. But later sociologists of science have declined to offer the same exemption. The Edinburgh School contends that all scientific beliefs are on a par with other beliefs in terms of their causes. Barry Barnes and David Bloor formulated a "symmetry" or "equivalence" postulate, according to which all beliefs are on a par with respect to the causes of their credibility (1982). Many historical case studies conducted in this tradition have tried to show how scientists too are swayed by class interests, political interests, and other factors usually considered "external" to pure science (Forman 1971; Shapin 1975; Mackenzie 1981). Thomas Kuhn (1962/1970) is thought to have shown that purely objective considerations can never settle disputes between competing scientific theories or paradigms, and hence scientific beliefs must be influenced by "social factors". Kuhn's descriptions of the practices of scientific research communities, especially descriptions of the inculcation and preservation of paradigms during periods of "normal" science, were clear and influential examples of a social analysis of science, especially when contrasted with the positivist tradition of analysis. 
Michel Foucault developed a radically political view of knowledge and science, arguing that practices of so-called knowledge-seeking, especially in the modern world, really serve the aims of power and social domination (1977, 1980). All of these writers may be considered "social epistemologists", although they themselves do not employ this phrase.
Perhaps the first use of the phrase "social epistemology" appears in the writings of a library scientist, Jesse Shera, who in turn credits his associate Margaret Egan. "[S]ocial epistemology," says Shera, "is the study of knowledge in society…. The focus of this discipline should be upon the production, flow, integration, and consumption of all forms of communicated thought throughout the entire social fabric" (1970: 86). Shera was particularly interested in the affinity between social epistemology and librarianship. He did not, however, construct a conception of social epistemology with very definite philosophical or social-scientific contours. What might such contours be?
Classical epistemology has been concerned with the pursuit of truth. How can an individual engage in cognitive activity so as to arrive at true belief and avoid false belief? This was the task René Descartes set for himself in his Discourse on the Method of Rightly Conducting the Reason and Seeking for Truth in the Sciences (1637/1955) and in his Meditations on First Philosophy (1641/1955). Classical epistemology has equally been concerned with rationality or epistemic justification, as suggested by part of the title of the Discourse. A person might rightly conduct her reason in the search for truth but not succeed in getting the truth. However, as long as she forms a belief by a proper use of reason — and perhaps by proper use of other faculties like perception and memory — then her belief is rationally warranted or justified. Classical epistemologists all regard this as one sort of epistemic desideratum. Furthermore, according to the standard account of knowledge in classical epistemology, for a person to know a proposition, she must believe it, it must be true, and the belief in it must be justified or rationally warranted. So if epistemology is the study of knowledge, and more specifically the study of how knowledge can be attained, it must also be the study of how true and justified belief can be attained. Epistemological projects restricted to just one of these dimensions — truth or justification — would also fit the classical mold.
The foregoing remarks apply to classical epistemology in its "individualist" guise. What type of epistemology does one get if one tries to "socialize" classical epistemology? One gets some sort of social angle on the pursuit of true belief and/or the pursuit of justified belief. Some projects in social epistemology have adopted precisely these themes.
Perhaps the first formulation of a truth-oriented social epistemology is found in writings by Alvin Goldman from the late 1970s through the mid-1980s (Goldman 1978, 1986, 1987). Goldman there proposes to divide epistemology into two branches: individual epistemology and social epistemology (or "epistemics"). Both branches would seek to identify and assess processes, methods or practices in terms of their contributions — positive or negative — to the production of true belief. Individual epistemology would identify and evaluate psychological processes that occur within the epistemic subject. Social epistemology would identify and evaluate social processes by which epistemic subjects interact with other agents who exert causal influence on their beliefs. The communicational acts of other agents and the institutional structures that guide or frame such communicational acts would be prime examples of social-epistemic practices that would be studied within social epistemology. In Goldman's subsequent book, Knowledge in a Social World (1999), this conception of social epistemology is developed in detail. It is argued that, both in everyday life and in specialized arenas such as science, law, and education, a certain value is placed on having true beliefs rather than false beliefs or no opinion (uncertainty). This type of value is called "veritistic value", and a measure of veritistic value is proposed. The rest of the book examines types of social practices that make positive or negative contributions toward increasing veritistic value. Types of practices examined include speech practices of reporting and arguing, market and non-market mechanisms that regulate the flow of speech, types of information technologies, assigning scientific credit and guiding scientific inquiries with an eye to credit, trial procedures or legal adjudication systems, and systems that disseminate political information about electoral candidates.
The veritistic approach to social epistemology aims to be evaluative or normative rather than purely descriptive or explanatory. It seeks to evaluate actual and prospective practices in terms of their impacts on true versus false beliefs. Although truth may have no explanatory role to play in the social studies of knowledge, it can play a regulative role. How can truth play a regulative role, it may be asked, unless we already have ways of deciding what is true? How can the social epistemologist assess the truth-propensity of a practice unless she already has a method of determining whether the beliefs caused by the practice are true or false? But if she has such a method of determination, why bother with social epistemology? In answer to these questions, it is sometimes possible to demonstrate mathematically that a certain practice would have certain veritistic properties. For example, Goldman indicates that a particular (difficult to instantiate) practice of Bayesian inference has a general propensity, on average, to increase the veritistic properties of one's beliefs (Goldman 1999: 115–123). Similarly, it can be shown mathematically that a certain mode of amalgamating expert opinions in a group yields greater group accuracy than other modes of amalgamation (Shapley and Grofman 1984; Goldman 1999: 81–82). Finally, a practice can sometimes be judged veritistically unsatisfactory when later and better evidence shows that many judgments issued under its aegis were false. The medieval practice of trial by ordeal was abandoned in part because it was shown that the ordeal had produced numerous erroneous judgments of guilt. This emerged when voluntary confessions were later obtained from different people, or new eye-witnesses came forward.
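The Shapley–Grofman result mentioned above can be illustrated with a small Monte Carlo sketch. The competence values below are hypothetical, chosen purely for illustration; the amalgamation rule shown (weighting each expert's vote by the log-odds of her competence) is the one Shapley and Grofman identify as optimal for group accuracy, here compared against simple one-person-one-vote majority rule.

```python
import math
import random

random.seed(42)

# Hypothetical competences: the probability that each expert independently
# judges a yes/no question correctly. Illustrative values only.
competences = [0.85, 0.7, 0.6, 0.6, 0.55]

# Shapley-Grofman weights: each expert's vote is weighted by the log-odds
# of her competence, log(p / (1 - p)).
weights = [math.log(p / (1 - p)) for p in competences]

def group_verdict_correct(use_weights):
    """Simulate one question; return True if the group's verdict is true."""
    correct = [random.random() < p for p in competences]
    if use_weights:
        score = sum(w for c, w in zip(correct, weights) if c)
        return score > sum(weights) / 2
    return sum(correct) > len(correct) / 2  # simple majority rule

N = 100_000
weighted_acc = sum(group_verdict_correct(True) for _ in range(N)) / N
simple_acc = sum(group_verdict_correct(False) for _ in range(N)) / N
print(f"simple majority accuracy:   {simple_acc:.3f}")
print(f"log-odds weighted accuracy: {weighted_acc:.3f}")
```

Run with these competences, the weighted rule comes out measurably more accurate than simple majority voting, which is the kind of mathematically demonstrable veritistic comparison the passage describes: the epistemologist need not check individual beliefs for truth in order to rank the two amalgamation practices.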
Philip Kitcher has also developed the social epistemology of science from a truth-oriented perspective. One of his chief concerns has been the division of cognitive labor (Kitcher 1990, 1993: chap. 8). The progress of science will be optimized, says Kitcher, when there is an optimal distribution of effort within the scientific community. It may be better for a scientific community to attack a given problem by encouraging some members to pursue one strategy and others to pursue another, rather than all pursue the single most promising strategy. In saying that progress will be "optimized", it is meant that it will be optimized in terms of getting true answers to significant scientific questions. In The Advancement of Science (1993) Kitcher constructs the notion of a "consensus practice", a social practice built up from individual practices consisting of an individual's beliefs, the informants he regards as credible, the methodology of scientific reasoning he accepts, and so forth. A "core" consensus practice consists of the elements of individual practices common to all members of the community. A "virtual" consensus practice is a practice generated by taking into account the statements, methodologies, etc. that members accept "indirectly" by deferring to other scientists as authorities. Kitcher then constructs a family of notions of scientific "progress" and characterizes progress in terms of improvements of consensus practices in getting significant truth and achieving explanatory success.
Feminist epistemologists often embrace the idea of social epistemology. However, many of them strongly criticize traditional epistemology and view it as a poor model for feminist epistemology. At least a few feminist epistemologists, however, take a fundamentally truth-oriented position. Elizabeth Anderson explicitly views feminist epistemology as a branch of social epistemology (1995: 54). Furthermore, when she proceeds to explain the aim of social epistemology, she identifies it as the aim of promoting our reliable, i.e., truth-conducive, processes of belief formation and checking or canceling out our unreliable belief-forming processes (1995: 55). Thus, the fundamental aim is the classical one of seeking true beliefs and avoiding false ones. Miranda Fricker (1998) also adopts an approach to social epistemology with classical roots. She takes her lead from Edward Craig (1990), who stresses the fact that human beings have a fundamental need to acquire true beliefs and hence a derived need to seek out "good informants," people who will tell us the truth as to whether p. Fricker then points out that norms of credibility arise in society to pick out the class of good informants, people alleged to be competent about the truth as well as sincere. Unfortunately, societal norms of credibility tend to assign more credibility to the powerful than they deserve and to deny credibility to the powerless. The latter is a phenomenon of epistemic injustice. This phenomenon is one that social epistemology should be concerned with, which has "politicizing" implications for the field. Such politicizing implications may be foreign to epistemology in the classical mold, but Fricker derives them from a classical epistemological perspective in which truth-seeking is the basic epistemic activity.
Thus far our examples of classically-oriented social epistemology center on the truth aim. What about the aim of epistemic justification or rationality? As indicated earlier, the problem of testimony is a problem about justification: What makes a hearer justified in accepting a report or other factual statement by a speaker? In the last two decades, testimony has become an active area of epistemological investigation. Although testimony theorists do not generally use the phrase "social epistemology" to describe their inquiry, that seems to be an appropriate label (see Schmitt 1994a).
According to reductionism about testimony, a hearer is justified or warranted in accepting a speaker's report or factual statement only if she is justified in believing that the speaker is reliable and sincere, and the justification for these kinds of belief rests on sources other than testimony itself. Thus, testimony is only a derivative source of epistemic warrant, not a "basic" source like perception, memory, or inductive inference. A hearer must use sources like perception, memory, and inductive inference to arrive at the belief that speakers in general, or the present speaker in particular, are reliable and sincere. Only when the hearer has such justified beliefs, derived from non-testimonial sources, can she be justified in believing what any given speaker reports or asserts. Reductionism was endorsed by David Hume.
In opposition to reductionism stands the doctrine of anti-reductionism about testimonial justification. Anti-reductionism holds that testimony is itself a basic source of evidence or warrant. No matter how little positive evidence a hearer has about the reliability and sincerity of a given speaker, or of speakers in general, she has default or prima facie warrant in believing what the speaker says. Of course, evidence of the speaker's unreliability or insincerity may defeat or override her prima facie warrant for acceptance. But this doesn't undercut the anti-reductionist claim that testimony is a basic source of evidence for the truth of what the speaker asserts. Anti-reductionism in various strengths has been endorsed by C. A. J. Coady (1992), Elizabeth Fricker (1995), Tyler Burge (1993), and Richard Foley (1994).
Perhaps the most natural version of reductionism is global reductionism, which holds that justifiable acceptance of a speaker's report requires non-testimonially based positive reasons for believing that testimony is generally reliable. Here are two initial difficulties with global reductionism. First, in order to have justified beliefs based on testimony, including testimony from one's own parents, very young children would have to wait until they have checked the accuracy of enough different kinds of reports from enough different speakers to infer that testimony is generally reliable. But surely young children aren't capable of this. Indeed, how could they acquire even the conceptual and linguistic tools needed for an induction to the general reliability of testimony without accepting some testimony in the first place? Second, a person would have to be exposed to a wide sample of reports and corresponding facts in order to infer the general reliability of testimony. But the observational base of ordinary epistemic agents is too narrow to allow this. As C. A. J. Coady points out, few of us have done anything like the field-work that global reductionism requires (1992: 82). So, for most epistemic agents, global reductionism leads to skepticism.
According to anti-reductionism, one doesn't need positive reasons to support the general reliability of testimony, or even reasons for trusting a target speaker's reliability and sincerity. As far as the hearer's reasons are concerned, they need only satisfy the much weaker condition of not including evidence that defeats the speaker's reliability and sincerity. Since this negative requirement is extremely weak, most anti-reductionists add an additional requirement. In particular, they add the requirement that the speaker must actually be competent and sincere. However, Jennifer Lackey (2006) argues that these two conditions don't suffice for hearer justifiedness, because of the weakness of the negative reasons requirement. Suppose Sam sees an alien creature in the woods who drops something that, on examination, seems to be a diary, written in a language that appears to be English. Sam has no evidence either for or against the sincerity and reliability of aliens as testifiers, so he lacks both positive reasons for trusting the diary's contents and negative reasons against trusting them. If the alien is both reliable and sincere, anti-reductionism implies that Sam is justified in believing the diary's contents. Intuitively, however, he isn't so justified, says Lackey. Thus, we need a third kind of theory, she says, that combines reductionism's positive-reasons requirement for hearers with anti-reductionism's actual-reliability requirement for speakers.
Many researchers in the social studies of knowledge reject or ignore such classical concerns of epistemology as truth, justification, and rationality. It is acknowledged, of course, that various communities and cultures speak the language of truth, justification, or rationality, but the researchers in question do not find such concepts legitimate or useful for their own purposes. They seek to describe and understand a selected community's norms of rationality, like anthropologists describing the norms or mores of an alien culture. But they reject the notion that there are any universal or "objective" norms of rationality, or criteria of truth, that they themselves could appropriately invoke. As Barry Barnes and David Bloor put it, "there are no context-free or super-cultural norms of rationality" (1982: 27). So they are not prepared to decree that certain practices are more rational or more truth-conducive than others. In other words, they officially decline to make any judgments about the epistemic properties of various belief-forming practices (though the debunking connotations of their work, discussed below, may belie this stance). They indicate that such judgments would have no culture-free basis or foundation.
They are, nonetheless, clearly interested in belief-forming practices. If we use the term "knowledge" for any sort of belief (or at least for "institutionalized" belief), whether true or false, justified or unjustified, then they can be said to be investigators of knowledge. Since they are specifically interested in social influences on knowledge (so understood), they plausibly qualify as social epistemologists. They do not typically apply this label to themselves, perhaps in recognition that what was traditionally called "epistemology" had different purposes or aspirations. But if the old aspirations must be abandoned — as Richard Rorty (1979) explicitly argued — why not use the old label for the new type of project? For this reason, researchers in the social studies of science, or science and technology studies, will here be considered social epistemologists. There is, however, an additional reason why some of these writers might be called social epistemologists. Some claim to derive epistemologically significant conclusions (in the classical sense of "epistemology") from their sociological or anthropological investigations. Two examples are cases in point. First, as indicated earlier, historical case studies undertaken by members of the Edinburgh School attempt to show that scientists are heavily influenced by social factors "external" to the proper business of science. Other social analyses of science try to show how the game of scientific persuasion is essentially a battle for political power, where the outcome depends on the number or strength of one's allies as contrasted with, say, genuine epistemic worth. If either of these claims were right, the epistemic status of science as an objective and authoritative source of information would be greatly reduced. This claim, if true, seems to have genuine epistemological significance. 
Second, some sociologists of science claim to show that scientific "facts" are not "out-there" entities, which obtain independently of human social interactions, but are mere "fabrications" resulting from those social interactions. This is an epistemological thesis, or at least a metaphysical thesis, of some philosophical significance. So some of these writers seem to have philosophical aspirations, not merely social science aspirations.
Let us begin with the first type of thrust, i.e., attempts to debunk the epistemic authority of science. The debunking of science's epistemic authority, at least by sociologists or historians of science, would have to be accomplished by empirical means, for example, by showing how scientific beliefs were actually produced in this or that socio-historical episode. This is precisely what various historians and sociologists of science purport to accomplish. One challenge to this would be a straightforwardly empirical challenge: Do these historical accounts get matters right? Many debunking efforts by members of the "Strong Programme" in the sociology of science have been disputed by others. In addition, there is an obvious, theoretically more interesting, response. How can these studies establish the debunking conclusions unless the studies themselves have epistemic authority? Yet the studies themselves use some of the very empirical, scientific procedures they purport to debunk. If such procedures are epistemically questionable, the studies' own results should be in question. There is, in other words, a problem of "reflexivity" facing this type of debunking challenge. Members of the Edinburgh School sometimes deny that they are trying to debunk or undermine science. Bloor, Barnes and Henry (1996), for example, say that they cheerfully embrace the methods of science, that they "honour science by imitation" (1996: viii). However, as James Robert Brown (2001) points out, this claim is disingenuous. The logical implication of their descriptions of science is to undercut the objectivity and authority of science. They cannot intelligibly propose a revolution and then deny that it would change anything (2001: 143).
Not all sociological approaches are linked to historical case studies. Some offer a more theoretical analysis of how scientists persuade one another of this or that conclusion. For example, Bruno Latour sketches an account of how persuasion is effected in science by marshalling "allies" of substantial reputation on one's own side of a controversy (1987: chap. 1). Can this ostensibly non-epistemic account of science support a successful debunking of its epistemic pretensions? A first point to notice is that any successful debunking of epistemic authority, if explicitly spelled out, must address epistemic issues. It must be shown that the procedures used by scientists have poor epistemic qualities. But this presupposes that there are objective, bona fide epistemic categories, which sociologists of science of Latour's persuasion tend to doubt or deny. If such categories are admitted, the further question arises as to whether persuasion by reference to the numbers of concurring "allies" is really an epistemically bad procedure. Although Latour's military/political vocabulary provides an amusing contrast with conventional characterizations of science, it isn't clear that the practices described are epistemically bad, or sub-rational, practices.
Let us turn now to the social construction of scientific facts. Again there is a question of how this sort of thesis could be established by sociologists. How could any scrutinizing of the activities of human scientists have determinate implications as to whether certain chemical substances, for example, exist independently of interactions among such scientists? Yet this is exactly what Latour and Steve Woolgar imply in their book Laboratory Life: The [Social] Construction of Scientific Facts (1979/1986). Latour and Woolgar claim that the "reality [of a scientific entity or fact] is formed as a consequence of [the] stabilization [of a controversy]" (1986: 180). In other words, the reality does not exist prior to the social event of stabilization, but is the result of such stabilization. How can they ascertain this without being trained, qualified biochemists as opposed to sociologists? How can the study of macro-events of a social nature establish that certain alleged biochemical substances do or do not exist independently of those macro-events?
In discussing social constructivism, it is essential to distinguish between weak and strong versions. Weak social constructivism is the view that human representations of reality — either linguistic or mental representations — are social constructs. For example, to say that gender is socially constructed, in this weak version of social constructivism, is to say that people's representations or conceptions of gender are socially constructed. Strong social constructivism claims not only that representations are socially constructed, but that the entities themselves to which these representations refer are socially constructed. In other words, not only are scientific representations of certain biochemical substances socially constructed, but the substances themselves are socially constructed. The weak version of social constructivism is quite innocuous, at least in the present context. Only the thesis of strong social constructivism is metaphysically (and, by implication, epistemologically) interesting. It is this sort of metaphysical thesis that Latour and Woolgar seem to endorse.
But there are many problems with this metaphysical thesis. One question is whether social constructivists like Latour and Woolgar mean to be "causal" constructivists or "constitutive" constructivists, in the terminology of Andre Kukla (2000). Causal constructivism is the view that human activity causes and sustains facts about the world, including scientific facts, whereas constitutive constructivism is the view that what we call "facts about the world" are really just facts about human activity (Kukla 2000: 21). Although Latour and Woolgar use the language of causal constructivism, it seems more likely that the doctrine they intend is constitutive constructivism. There are, however, severe philosophical difficulties for constitutive social constructivism as a general metaphysical doctrine, as Kukla explains.
Not all researchers within the social studies of science think of social epistemology as confined to the description and explanation of science. Steve Fuller (1987, 1988, 1999), who champions social epistemology in that very phrase, sees the enterprise as normative: How should the institution of science be organized and run? What is the best (scientific) means to knowledge production? However, Fuller does not construe "knowledge" in a truth-entailing fashion, and so parts company with classical epistemology. What does he take the end of knowledge production to be? In one place he says that it's a matter of empirical determination what that end is (1987: 177). But if we don't now know the end, how can we try to direct science toward it? And how can one determine science's end empirically? Science might be found to have many different results. Which of them is its "end"? Helen Longino (1990, 2002) is another contributor to the social studies of science who emphasizes the normative. The social, says Longino, does not contaminate the normative, or justificatory, dimension of science. To the contrary, she views justificatory reasoning as part of a social practice — a practice of challenge and response (2002: chap. 5).
In what sense is social epistemology "social"? Different writers have different conceptions of the social, and this inevitably leads to different conceptions of social epistemology. In the Marxian tradition and in early forms of the sociology of knowledge, "social factors" referred primarily to various types of "interests": class interests, political interests, or anything else pertaining to the "existential" world of power and politics. Under this conception of the social, it is natural to see social factors as antithetical to "reason". If science is infiltrated by social factors, in this sense, how can it be a successful instrument for getting at truth? Thinking of the relationship between the rational and the social as one of opposition, it is not surprising to find Larry Laudan proposing an "arationality principle": "[T]he sociology of knowledge may step in to explain beliefs if and only if those beliefs cannot be explained in terms of their rational merits" (Laudan 1977: 202).
Can the opposition between the rational and the social be eliminated, or at least relaxed? A first possible move is to allow "interests" to include the private or professional interests of scientists. It seems undeniable that scientists are at least partly driven by a desire for "credit" from their peers (Hull 1988). But won't private and professional interests deflect scientists from reason and truth as much as class or political interests? Several writers argue to the contrary. There is no necessary conflict between professional interest and successful pursuit of truth. Kitcher (1990) argues that the optimal division of labor in scientific research may be attained not by "pure", altruistic scientists but by scientists with "grubby" and epistemically "sullied" motives. Similarly, Goldman and Shaked (1991) show that, given certain assumptions about credit-giving practices and experimental choices, there will be little difference between choices of truth-motivated scientists and choices of credit-motivated scientists. Hence, there will be little difference in expected success of moving the community toward truth. Credit-driven interests need not be inimical to truth-promotion.
A further proposal is to expand the "social" beyond politics and interests altogether. The most inclusive sense of the social is simply any relationship among two or more individuals. There is no reason why social epistemology cannot be social in this broad sense. Any interaction among individuals affecting the credal states of some of them might be considered a social-epistemic relationship. So understood, a wide range of communicative interactions would be fit subjects for social epistemology. For example, many knowledge-seeking enterprises are collaborative in nature, including scientific enterprises involving research teams. An interesting task for social epistemology is to identify the types of collaboration that would be optimal in terms of some epistemically relevant measure (Thagard 1997).
Can the "social" be fully captured by inter-individual relationships? Some theorists would argue in the negative, pointing specifically to collective entities such as corporations, committees, juries, and teams. We often attribute mental or mental-like states, including beliefs, to such collective entities (Gilbert 1989, 1994; Bratman 1999; Tuomela 1995; Searle 1995). We might say, for example, that a jury was convinced that the defendant intended such-and-such, or that the jury doubted that a certain alleged conversation really took place. Collective entities are obviously "social" in an important way; and if it is granted that such entities are bearers of beliefs and other doxastic states, shouldn't these collective states be an important target for social epistemology? Precisely this is suggested by Lynn Hankinson Nelson (1993), who goes even further in proposing that the only real knowers are communities.
Should social epistemology pursue its agenda by focusing, in whole or in part, on group knowledge? Obviously, this depends on whether groups or collectivities are legitimate bearers of epistemic states like knowledge or justified belief. Most philosophers who address this issue agree that a group can be described as believing in some proposition p in the minimal sense that all or most members of the group believe p. This is what Anthony Quinton (1975/1976) calls the "summative" conception of group belief. But if this is the only legitimate sense in which groups can be said to believe something, many "socializing" philosophers will be disappointed. They want to hold the stronger view that groups or collectivities can be the subjects of beliefs and other attitudes that diverge from the attitudes of their members. Is it legitimate to speak of group beliefs in this more challenging, non-summative conception?
Philip Pettit (2003) defends the view that groups are subjects of propositional attitudes in the non-summative sense. One key to his position is the idea, popular among many philosophers of mind, that a system is properly viewed as an intentional subject just in case it displays a certain kind of rational unity. It must preserve intentional attitudes over time and it must form, unform, and act on those attitudes — at least under favorable conditions — so as to maintain a pattern of rational unity. Pettit then argues that certain types of groups, which he calls "social integrates," display exactly this kind of rational unity. Although these groups are not distinct from their individual members in the sense of being capable of existing in their absence, they are distinct from their members in the sense of being centers for the formation of attitudes that can be quite discontinuous from those of their members (2003: 183). According to Pettit, collective judgments and intentions do not constitute an ontologically emergent realm, because these judgments and intentions may always supervene on the attitudes and relations among their members. Still, the judgments and intentions can diverge from those of their several members.
Even if the existence of non-summative group beliefs is granted, this doesn't concede everything a group-oriented social epistemologist might want (or need). As indicated earlier, group-oriented social epistemologists should also want to hold that (positive) epistemic properties, like knowledge or justifiedness, are properly ascribed to groups, and this conclusion hasn't yet been fully defended. One basis for rejecting this conclusion is that non-summative groups choose their beliefs voluntarily, and doxastic voluntarism is incompatible with positive epistemic properties like knowledge or justifiedness. The chief sticking-point here is that groups may adopt views for non-epistemic reasons, not because they aim at truth. K. Brad Wray (2003) argues that groups, unlike individual agents, always choose to believe based on their goals. Similarly, Christopher McMahon (2003) points out that groups undertake to defend as true positions they adopt for purely instrumental reasons. Notoriously, the tobacco companies took the position that smoking does not cause cancer, although it is questionable whether any tobacco executives actually believed this. If we assume with many authors that the goal of truth is the hallmark of the epistemic, rampant doxastic voluntarism on the part of groups would be a stumbling-block for the attainment of (positive) epistemic properties. However, as Kay Mathiesen (2006) argues, it seems unlikely that all group beliefs are deliberately chosen. Moreover, what precludes the possibility that some group beliefs are chosen with the aim of truth, or accuracy? So positive epistemic status for collective beliefs still has legs to stand on, and it is open to social epistemologists to select collective belief as the keystone of their conception of what is distinctively social in social epistemology (see also Schmitt 1994b).
Suppose, then, that the door is left open to employ group belief as the keystone of social epistemology. Which notion of group belief should be chosen? There is more than one legitimate notion of group belief. That this is so is implied in a pithy summary of what led to disaster on 9/11, as described by Sandy Berger, national security advisor in the Clinton Administration. Berger said: "We've learned since 9/11 that not only did we not know what we didn't know, but the F.B.I. didn't know what it did know." Focus on the second half of Berger's dictum, "the F.B.I. didn't know what it did know." Let us analyze this carefully (Goldman 2004). Berger implies, with a strong ring of truth, that under one conception of ‘the F.B.I.’ the F.B.I. did have knowledge highly pertinent to 9/11 and under a different conception of ‘the F.B.I.’ it lacked this knowledge. Now, an entity that lacked a certain piece of knowledge cannot be the same entity that possessed the very same knowledge at the same time. So ‘the F.B.I.’ must have more than one referent. What Berger meant, clearly, is that knowledge was possessed in a distributed fashion by the collection of agents in the field (e.g., in Minneapolis and Phoenix), in particular, by those agents who were each aware that a certain suspicious alien was engaged in flight training. In this distributed way, the F.B.I. had knowledge of the flight-training pattern of several of the future hijackers.
What is the second conception of ‘the F.B.I.’ under which the agency didn't possess the same knowledge? It must be a non-distributive conception; but there are different non-distributive candidates here. Pettit, as we have seen, has developed the notion of a "social integrate," or an "integrated collectivity." These are groups united by a common purpose. As Pettit develops the notion of rational collective judgment, it involves an important assumption of equality of weight among group members. This assumption seems to be embedded in his notion of an integrated collectivity. This isn't an apt characterization, however, of all collective epistemic subjects worthy of social epistemology's attention. Continuing with the 9/11 example, the F.B.I. clearly isn't that kind of collectivity. Rather, like many other organizations, it is what might be called a hierarchical collectivity. Decision-making power is vested in a single individual or directorship; to a first approximation, whatever this individual or directorship decides is the decision of the organization. And what this individual or directorship knows or doesn't know is naturally construed as what is known or isn't known by the organization. Where the F.B.I. was deficient, in the case of the 9/11 hijackers, was in its failure to transmit and pool communications from agents in the field to high-level analysts in Washington. The hierarchical construal of (one occurrence of) ‘the F.B.I.’ makes best sense of Berger's quip. Qua hierarchical collectivity, the F.B.I. didn't know what the F.B.I. qua distributive collectivity did know (Goldman 2004).
It seems clear that if social epistemology is to invoke group belief and group knowledge, it should be prepared to deal with many types of groups or collectivities and many conceptions of group belief and knowledge. One size will not fit all.
In the next two sections of this article, a sample agenda for social epistemology is sketched. This sample, of course, doesn't purport to be exhaustive. It merely partitions the enterprise into two natural divisions and describes selected projects that belong to each division.
The two-fold partition divides the territory into theoretical issues and applied issues. Theoretical issues are illustrated in this section and applied issues in section 6. In the theoretical division there is substantial continuity between the individual and social branches of epistemology. Certain theoretical issues of social epistemology could also be posed in the context of individual epistemology. At a minimum, there are counterpart issues in both branches. The applied issues are more distinctive to the social branch, however. In particular, applied issues in social epistemology commonly involve matters of institutional design, where the problem is to configure or reconfigure social institutions so as to promote truth acquisition or error avoidance. Problems of institutional design typically demand inputs from empirical and formal disciplines outside philosophy. It is expected, therefore, that social epistemology will be an interdisciplinary enterprise, not one of pure, a priori philosophy. Interdisciplinarity per se doesn't separate the two branches, because individual epistemology can also be approached in an interdisciplinary spirit (Goldman 1986). But individual epistemology would not be concerned with social systems, or with disciplines like economics, social choice theory, or formal political theory.
The first topic under the "theoretical" heading is an extension of the problem of testimonial justification. The central problem in testimonial justification is to specify the conditions under which a hearer is justified in trusting what a single speaker reports. Our present question concerns two speakers who deliver conflicting reports or assertions. Specifically, let the two speakers be putative experts in a given field, experts who nonetheless disagree with one another on a particular point. Let the hearer be a self-acknowledged novice, with no prior opinion on the matter. How can such a novice decide, justifiably, which of the two conflicting assertions merits greater trust? Goldman (2001) calls this the "novice/two-experts" problem. It is a recurring problem in practical life, but here we examine it in abstraction from particular instances.
What is special to the novice/two-experts problem is that the hearer has no opinions of his own. At any rate, he prefers to defer to someone more authoritative. Is there any way for him to choose (justifiably) between two putative experts? If he could decide who is the greater authority, he could use this information to decide whom to trust. But how can someone who lacks knowledge about the domain justifiably choose between two self-proclaimed experts? Kitcher (1993: 314, 316) says that we sometimes "directly calibrate" a putative authority by comparing the output of that authority with our own opinions on questions where our judgments overlap. If X wishes to decide how much authority to ascribe to Y with respect to domain D, X should ascertain what opinions Y has expressed about D on which X has independent opinions. Then X should assign Y a degree of authority proportional to the truth-ratio of Y's D-related statements as judged by X's own opinions. In the novice/two-experts problem, however, X doesn't have any opinions in domain D, at least none he feels confident in deploying. So how can X make a justified determination of degree of authority or expertise?
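Kitcher's direct-calibration procedure can be given a minimal formal sketch. The function below is an illustrative reconstruction, not Kitcher's own formalism; the question labels and answer values are hypothetical. It scores a putative authority Y by the agreement ratio between Y's statements in the domain and the evaluator X's own independent opinions on overlapping questions.

```python
def direct_calibration(authority_opinions, my_opinions):
    """Score a putative authority by the truth-ratio of their
    domain statements, as judged by my own independent opinions.

    Both arguments map questions to answers; only questions on
    which I have an opinion count toward the calibration.
    """
    overlap = [q for q in authority_opinions if q in my_opinions]
    if not overlap:
        return None  # no overlapping opinions: calibration is undefined
    agreements = sum(
        1 for q in overlap if authority_opinions[q] == my_opinions[q]
    )
    return agreements / len(overlap)

# Hypothetical opinion sets for illustration.
expert = {"q1": "yes", "q2": "no", "q3": "yes"}
evaluator = {"q1": "yes", "q2": "yes"}
print(direct_calibration(expert, evaluator))  # 0.5

# A novice with no opinions in the domain gets no calibration score,
# which is exactly the novice/two-experts predicament.
novice = {}
print(direct_calibration(expert, novice))  # None
```

The undefined case makes the difficulty vivid: direct calibration presupposes exactly the independent domain opinions that the novice, by hypothesis, lacks.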
Goldman (2001) considers several methods the novice might try to use. One is to listen to a debate between the contending experts. Another is to solicit judgments from other (meta-)experts about the comparative expertise of the two contenders. A third is to investigate the opinions of additional experts, to see which position has more adherents. There are tricky theoretical questions in each case, however, about the quality of evidence that a novice could obtain via these methods. How much can the novice be illuminated by hearing a debate on a topic on which his own ignorance precludes him from judging the correctness of the various premises? How can the novice assess the relative trustworthiness of the third parties who assess the original experts? Their trustworthiness may be as problematic as that of the initial experts. Finally, does a viewpoint with more adherents always deserve greater credence than its negation? Agreement can arise from many factors, not all of which warrant increases in credence. Maybe the people who adhere to a certain view are just slavish followers of a charismatic but fundamentally confused or misguided leader.
Another intriguing theoretical issue — not unrelated to the assessment of rival experts — is the possibility of reasonable disagreement among people with shared evidence. Suppose two people begin with conflicting beliefs about a given question: one believes P and the other believes not-P. Suppose they proceed to share all of their evidence that bears on the question, including what each seems to have observed. Finally, suppose each forms the opinion that they have equally good eyesight, equally good inferential skills, and so forth. Can they reasonably persist in holding their respective, incompatible opinions? Clearly, they cannot both be right if they persist in these beliefs. But can they be rational in continuing to disagree, despite the same evidence?
Richard Feldman (2006) encapsulates the matter by formulating two questions:
Q1: Can epistemic peers who have shared their evidence have reasonable disagreements?
Q2: Can epistemic peers who have shared their evidence reasonably maintain their own belief yet also think that the other party to the disagreement is also reasonable?
Feldman argues for negative answers to both questions. Suppose that a detective has strong evidence that incriminates suspect Lefty of a certain crime, and another detective has equally strong evidence incriminating suspect Righty of the same crime. They also have decisive evidence that there was only one culprit. Once the two detectives share all their evidence, is it reasonable for the first to continue to believe in Lefty's guilt and for the second to continue to believe in Righty's guilt? Surely not. Each should suspend judgment. This leads Feldman to what he calls the "uniqueness thesis." This thesis says that a body of evidence justifies at most one propositional attitude toward any particular proposition, where possible attitudes include believing, disbelieving, and suspending judgment. In the two detectives case, the uniquely proper attitude for each detective, given that they have the same body of evidence, is suspension of judgment (Elga 2007).
Not all theorists agree with this conclusion. Gideon Rosen (2001: 71) writes: "It should be obvious that reasonable people can disagree, even when confronted with a single body of evidence. When a jury or a court is divided in a difficult case, the mere fact of disagreement does not mean that someone is being unreasonable." Rosen expands on this view by arguing that epistemic norms are permissive norms, not obligating or coercive norms. Thus, even when two people share the same evidence, it is permissible for one to adopt one doxastic attitude toward a proposition and for the other to adopt a different attitude (also see Pettit 2006).
The problem of rational disagreement might be viewed as a special case of the problem of epistemic relativism versus objectivism (or absolutism). The denial of Feldman's uniqueness thesis, for example, might be viewed as an endorsement of relativism (Rosen uses this language). But most epistemologists intend the term ‘relativism’ in a different way. In conformity with meta-ethics, epistemic relativism is more likely to be understood as the view that all epistemic norms are relative to a community, or the view that there are no objectively right epistemic norms. Thus, Paul Boghossian (2006: 73) formulates epistemic relativism as a composite of three theses: (1) There are no absolute facts about what justifies what (epistemic non-absolutism); (2) Epistemic judgments should be construed as having a relational form, "E justifies B according to epistemic system C" (epistemic relationism); and (3) There are many alternative epistemic systems, but no facts that make any one of these systems more correct than any of the others (epistemic pluralism).
Disputes between epistemic relativism versus objectivism (absolutism) certainly belong on the list of theoretical problems of epistemology. Such disputes aren't confined to social as opposed to individual epistemology, but they arise with special force in the context of social epistemology, where diversity of epistemic systems is often highlighted. This leads to relativism being an oft-encountered component of social epistemologies. As noted earlier, Barry Barnes and David Bloor, in their paper entitled "Relativism, Rationalism, and the Sociology of Knowledge," endorse epistemic relativism with the statement that "there are no context-free or super-cultural norms of rationality" (1982: 27). Similarly, Martin Kusch's (2002) brand of social epistemology includes a defense of relativism.
On the other hand, social epistemology per se is hardly committed to epistemic relativism. Boghossian launches a multi-pronged critique of it. First he casts doubt on the possibility of giving a coherent interpretation to the relationist strand of relativism. If ordinary singular epistemic judgments are said to be unacceptable because they express incomplete propositions, won't the same hold of the contents of epistemic systems? So, says Boghossian, there is no stable rendering of a relativistic (system-relative) conception of epistemic justification. Analogous maneuvers, he claims, provide adequate responses to relativistic challenges from norm-circularity.
A fourth and final example of theoretically-oriented social epistemology concerns the rational aggregation of factual judgments. As we have seen, groups often adopt "beliefs" by aggregating their members' individual judgments. For example, suppose that a three-judge court has to decide a tort case, and under relevant legal doctrine it must judge the defendant liable if and only if it finds, first, that the defendant's negligence was causally responsible for the injury to the plaintiff, and, second, that the defendant had a duty of care toward the plaintiff. Suppose that the three judges, A, B, and C, vote as shown below on the following two "premise" propositions and the "conclusion" proposition concerning a certain defendant, where the first premise is that the defendant caused harm, the second premise is that the defendant had a duty of care, and the conclusion is that the defendant is liable.
              Cause of harm?   Duty of care?   Liable?
  A           Yes              No              No
  B           No               Yes             No
  C           Yes              Yes             Yes
  Majority    Yes              Yes             No
Assume further that the court aggregates the several judges' individual votes by a majority decision rule. The result in this case is anomalous. The group's aggregate judgment endorses both premises but rejects the conclusion. This set of collective judgments is inconsistent, because, given the indicated legal doctrine, the conclusion logically follows from the premises. Yet the group endorses the premises but rejects the conclusion.
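The anomaly can be reproduced mechanically. The sketch below (an illustration of the tort example above, with proposition names of my own choosing) aggregates each proposition by majority vote and checks the resulting collective judgment set against the doctrine "liable if and only if cause of harm and duty of care."

```python
# Each judge's votes on the two premises; under the legal doctrine,
# the conclusion is: liable iff (cause of harm and duty of care).
judges = {
    "A": {"cause": True,  "duty": False},
    "B": {"cause": False, "duty": True},
    "C": {"cause": True,  "duty": True},
}

def individual_liable(votes):
    return votes["cause"] and votes["duty"]

def majority(values):
    return sum(values) > len(values) / 2

# Proposition-wise majority voting on both premises and the conclusion.
group_cause = majority([j["cause"] for j in judges.values()])
group_duty = majority([j["duty"] for j in judges.values()])
group_liable = majority([individual_liable(j) for j in judges.values()])

print(group_cause, group_duty, group_liable)  # True True False
# The collective judgment set is inconsistent: the group endorses
# both premises yet rejects the conclusion the doctrine entails.
print(group_liable == (group_cause and group_duty))  # False
```

Each individual judge is perfectly consistent; the inconsistency arises only at the collective level, from the aggregation procedure itself.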
This kind of outcome has led theorists to reflect on the range of possible aggregation procedures, where an aggregation procedure is a rule by which a group generates collectively endorsed beliefs or judgments on the basis of its members' individual beliefs or judgments. Various questions can be asked about judgment aggregation procedures, questions of interest to social epistemology. One question is whether each possible procedure preserves rationality at the group level. A different question concerns the truth-conducive properties of each procedure. These types of questions are currently under intensive investigation.
Christian List and Philip Pettit (2002, 2004) have proved some interesting impossibility theorems, analogous to Kenneth Arrow's (1963) impossibility theorem that launched social choice theory. Here is an example of such an impossibility result pertaining to group rationality (List 2005). Consider any group of two or more individuals that must make judgments on a set of non-trivially interconnected propositions, as in the tort liability example. A set of judgments is called rational if and only if it is consistent and complete (in specified senses of these terms). Assume that each individual makes a rational set of judgments on these propositions. Then the following impossibility theorem holds for the collective judgments: No aggregation procedure exists that generates collective judgments from individual judgments and satisfies both the constraint of rationality and the following three conditions: (a) universal domain, (b) anonymity, and (c) systematicity (List and Pettit 2002). Universal domain is the condition that a procedure accepts as admissible input any possible combination of complete and consistent individual judgments on the propositions. Anonymity is the condition that the judgments of all individuals have equal weight in determining the collective judgments. Systematicity is the condition that the collective judgment on each proposition depends only on the individual judgments on that proposition, and the same pattern of dependence holds for all propositions. The theorem implies that majority voting does not satisfy these conditions, and no other procedure does so either. This has the flavor of a paradox, because it ostensibly demonstrates an inherent but surprising difficulty in generating rational judgments at the collective level. Of course, the impossibility result can be avoided if some of these conditions are relaxed.
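One well-known way of relaxing the conditions is the premise-based procedure, which gives up systematicity: the group votes by majority on the premises only, and the collective conclusion is then derived from the doctrine applied to the collective premise judgments. A minimal sketch, continuing the tort example (the proposition names are my own labels):

```python
# Judges' votes on the two premises in the tort example.
judges = {
    "A": {"cause": True,  "duty": False},
    "B": {"cause": False, "duty": True},
    "C": {"cause": True,  "duty": True},
}

def majority(values):
    return sum(values) > len(values) / 2

# Premise-based procedure: aggregate only the premises, then derive
# the conclusion from the doctrine (liable iff cause and duty).
group_cause = majority([j["cause"] for j in judges.values()])
group_duty = majority([j["duty"] for j in judges.values()])
group_liable = group_cause and group_duty

print(group_cause, group_duty, group_liable)  # True True True
# Collective rationality is restored, at the price of overriding the
# judges' proposition-wise majority on the conclusion itself, which
# was "not liable" (only judge C individually finds liability).
```

The procedure thus trades one desideratum for another, which is precisely why the choice among aggregation procedures is a substantive question rather than a technicality.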
But related impossibility theorems have also been proved (Dietrich 2006), and it's of theoretical interest to see which sets of conditions are or are not jointly satisfiable. In the next section we note that these questions might also have a bearing on matters of institutional design.
The proper function of forensic science is to extract the truth. This function, unfortunately, is not well served by current practice. Saks et al. (2001: 28) write: "As it is practiced today, forensic science does not extract the truth reliably. Forensic science expert evidence that is erroneous (that is, honest mistakes) and fraudulent (deliberate misrepresentation) has been found to be one of the major causes, and perhaps the leading cause, of erroneous convictions of innocent persons." One rogue scientist engaged in rampant falsification for 15 years, and another faked more than 100 autopsies on unexamined bodies and falsified dozens of toxicology and blood reports (Kelly and Wearne 1998; Koppl 2006, Other Internet Resources). Shocking cases are found in more than one country.
Can the error rate from forensic laboratory reports be reduced? This is a problem in applied social epistemology. Roger Koppl (2005, 2006) offers a theoretical analysis, an experimental finding that supports this analysis, and a particular suggestion for redesigning the current system. This combination of analysis and policy recommendation constitutes a perspicuous specimen of applied social epistemology.
Koppl, an economist, pinpoints the problem as the monopoly position enjoyed by most forensic laboratories vis-a-vis the legal jurisdictions for whom they work. Each legal jurisdiction is served by one lab, and only that lab delivers reports about crime scene evidence. A typical report says whether or not there is a "match" between an evidentiary item obtained at the crime scene and a trait of the defendant, e.g., a match between a DNA sample from the crime scene and the DNA profile of the defendant. Forensic workers know that prosecutors prefer messages that report matches, and this generates a bias toward reporting of matches. Koppl (2005) analyzes the situation by means of game-theoretic models of epistemic systems. All such models have one or more "senders" who search a "message space" and deliver a message to one or more "receivers." In forensic science the receivers are jurors who hear the forensic message delivered via testimony in open court. The jury then decides whether a fingerprint or some DNA left at the crime scene belongs to the defendant. This judgment is just one input into the jury's deliberation that typically culminates in a judgment of guilt or innocence. But the specific target of analysis is the jury's judgment of whether the print or the sample came from the defendant. Some institutional arrangements, Koppl argues, will induce patterns of forensic reports that are less accurate, on average, than those of other institutional arrangements, and will thereby induce a pattern of jury judgments that are less reliable than those under other possible institutional arrangements.
Koppl argues on the basis of game-theoretic analysis that in the absence of competition with any other forensic lab (another potential "sender"), the bias toward reporting matches will produce a high incidence of false information. On the other hand, suppose that competition is introduced into the institutional arrangement, by having (say) three forensic labs all produce reports, where each lab knows that two other labs may also be doing a report. The incentives arising from the new pattern of strategic interaction will be different, and more unfavorable to the transmission of false information. Koppl performed a gaming experiment that reproduces the strategic structure of the monopolistic and competitive scenarios described for forensic laboratories. The experimental findings confirm a change in behavior in the predicted direction (Koppl 2006). The three-sender situation reduced the systemic error rate by two-thirds (as compared with the one-sender situation). This is a fine example of a field that Koppl calls "epistemic systems design," where we study the impact of institutional system design on matters of veracity. This contrasts with the standard technique in economics of evaluating institutional systems in terms of efficiency.
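The redundancy effect of adding independent senders can be illustrated with a toy calculation. This is a hedged sketch of my own, not Koppl's actual game-theoretic model or his experimental design: suppose each lab, when the evidence does not in fact match, falsely reports a match with some fixed probability b, and suppose a jury follows the majority of three independent lab reports.

```python
def monopoly_error(b):
    """Systemic false-match rate with a single lab that falsely
    reports a match with probability b."""
    return b

def three_lab_error(b):
    """Systemic false-match rate when the jury follows the majority
    of three independent labs, each erring with probability b:
    at least two of the three must report falsely."""
    return 3 * b**2 * (1 - b) + b**3

b = 0.3  # hypothetical per-lab false-report probability
print(monopoly_error(b))              # 0.3
print(round(three_lab_error(b), 3))   # 0.216
# Redundant, independent reporting lowers the systemic error rate
# even before modeling the incentive effects of competition, which
# Koppl's analysis predicts would lower each lab's b as well.
```

The toy model deliberately holds b fixed across arrangements; Koppl's point is stronger, since competition is predicted to change the senders' incentives and hence b itself.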
The institution relating forensic laboratories and courts is a minor institution in the larger scheme of things. But epistemic systems design can be applied to systems of any scale, macro or micro. Consider the overarching legal structure that governs speech and the press within a country. This is a legal framework with a powerful bearing on the informational state of a society, and hence can be analyzed in terms of its epistemic consequences. Many historical writers have held that the best rationale for freedom of speech and the press is its truth-promoting consequences. In the words of John Milton (1644/1959): "Let [Truth] and Falshood grapple; who ever knew Truth put to the wors, in a free and open encounter" (561). In the twentieth century, the truth rationale was defended in specifically economic terms, i.e., in terms of efficiency of free trade or market mechanisms. As Frederick Schauer expressed the idea, "Just as Adam Smith's ‘invisible hand’ will ensure that the best products emerge from free competition, so too will an invisible hand ensure that the best ideas emerge when all opinions are permitted freely to compete" (1982: 16).
It is debatable, however, whether pure competition, unfettered by legal interference, would optimize knowledge in society. Contrary to what some of its proponents claim, this is not a consequence of pure economic theory (Goldman and Cox 1996). Moreover, many legal institutions are created with an eye toward (1) deterring the transmission of falsehoods, (2) encouraging the transmission of newsworthy truths, and (3) promoting the creation of new knowledge. Laws against libel and fraud are examples of (1). Shield laws enabling journalists to protect the confidentiality of their sources (thereby promoting public disclosure of consequential truths) are an example of (2). And patent and copyright laws are examples of (3). The precise epistemic impact of all such laws is open to debate. It is hard to deny, however, that constitutional provisions and statutes in general have profound epistemic consequences.
There are many routes by which institutions come into being and change. Legislation is not the only route, so it shouldn't be assumed that applied social epistemology is exclusively focused on legal policies. It should be equally interested in policies adopted by voluntary associations and organizations, and in behavioral patterns that emerge in various economic, technological, and historical circumstances. New forms of communication, for example, arise and displace older forms as a result of new technologies. In our own era, the Internet has become a major source of communication that threatens to displace the mainstream press, whose audiences and advertising revenues are declining. In some quarters Weblogs are more trusted than the mainstream press. One result is that professional journalists, with their distinctive ethos, may be supplanted by non-professionals. Whether this is good or bad in epistemic, or veritistic, terms is a serious question for social epistemology. Richard Posner, a free-enterprise-promoting judge who also happens to be a blogger, contends that the blogosphere does at least as good a job as the traditional press at disseminating (and analyzing) the news (Posner 2005). Whether this is correct is another important "applied" question in social epistemology (Goldman, in press).
Finally, return to the topic of judgment aggregation and the different prospects for a group to get the truth under different aggregation procedures. List (2005) discusses several ways in which differences in aggregation procedures can have an impact on the amount of knowledge a group will tend to obtain. Let an agent's "positive reliability" on proposition p be the probability that he will believe p given that p is true and let his "negative reliability" on p be the probability that he does not believe p given that p is false. By considering a group's positive and negative reliability on various propositions under different aggregation procedures and scenarios, one can see how an aggregation procedure (a particular institution) makes a difference to the group's prospects for veritistic success.
First, one can compare three procedures: majority voting, dictatorial rule (in which the collective judgment is always fully determined by the same fixed group member), and the unanimity procedure (where agreement among all members is necessary to reach a collective judgment). The last procedure permits incomplete collective judgments. It is assumed that each group member has a positive and negative reliability r on proposition p, where 1 > r > 1/2 (the competence condition). Under the dictatorial procedure, the group's positive and negative reliability on p equals that of the dictator, which by assumption is r. Under the unanimity procedure, the group's negative reliability approaches 1 as the group size increases, but its positive reliability approaches 0 as the group size increases. So the unanimity procedure is good at avoiding false judgments but bad at reaching true ones. This is because under unanimity a determinate collective judgment is reached only if all members agree; if not, no collective judgment is made. Under majority voting, by contrast, both the positive and the negative reliability of the group approach 1 as the group size increases, as shown in the famous "Condorcet jury theorem." Generalizing a bit, if individuals are independent, fallible, but biased toward the truth, majority voting outperforms both unanimity and dictatorial procedures in terms of maximizing the group's positive and negative reliability on p. Hence, for purposes of attaining "knowledge" (especially under Nozick's 1981 definition of "knowledge"), the best of the three aggregation procedures is majority voting.
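The behavior of the three procedures can be sketched with a short Monte Carlo simulation. This is an illustrative reconstruction, not List's formal analysis: each member independently believes p (which is true) with probability r, and we estimate the group's positive reliability under each procedure; odd group sizes are used to avoid ties.

```python
import random

def group_reliability(n, r, procedure, trials=20_000, seed=1):
    """Estimate the group's positive reliability on p: the probability
    that the group judges p true, given that p is in fact true and each
    of the n members independently believes p with probability r."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        votes = sum(rng.random() < r for _ in range(n))
        if procedure == "majority":
            group_says_p = votes > n / 2
        elif procedure == "dictator":
            group_says_p = rng.random() < r  # one fixed member decides
        elif procedure == "unanimity":
            group_says_p = votes == n        # all must agree, else no judgment
        else:
            raise ValueError(procedure)
        correct += group_says_p
    return correct / trials

# With r = 0.6: majority voting improves with group size, the dictator
# stays at r, and unanimity collapses toward 0.
for n in (1, 11, 51):
    print(n,
          round(group_reliability(n, 0.6, "majority"), 3),
          round(group_reliability(n, 0.6, "dictator"), 3),
          round(group_reliability(n, 0.6, "unanimity"), 3))
```

A parallel run with p false (members erring with probability 1 − r) would show unanimity's strength: its negative reliability rises toward 1, at the cost of the positive reliability shown here.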
Another lesson that List (2005) derives from the formal analysis of aggregation procedures concerns prospective veritistic gains from "distribution." When an epistemic task is complex in that it requires judgments on several propositions, different individuals within the group may have different levels of expertise on different propositions. Suppose a system allows the group to be partitioned into subgroups, where members of each subgroup specialize on one premise. Each subgroup makes collective judgments on its designated premise and then a collective judgment is derived on the conclusion from the subgroup judgments on the premises. There are scenarios under which such a "distributed" procedure outperforms the regular, non-distributed (premise-based) procedure.
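One such scenario can be sketched as follows. This is a hypothetical illustration, not List's own model: the conclusion is the conjunction of two premises (both true), and the key assumption is a specialization gain, i.e., members who focus on a single premise judge it with higher reliability (`r_specialist > r_general`, both names invented here).

```python
import random

def majority_correct(n, r, rng):
    """True iff a strict majority of n members, each independently
    correct with probability r, judge the premise correctly."""
    return sum(rng.random() < r for _ in range(n)) > n / 2

def premise_based(n, r_general, r_specialist, distributed,
                  trials=20_000, seed=2):
    """Probability that the group correctly judges the conclusion
    (premise 1 AND premise 2, both true) under a premise-based
    procedure, with or without distribution into specialist subgroups."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        if distributed:
            # Each half of the group specializes on one premise,
            # gaining reliability r_specialist on it.
            j1 = majority_correct(n // 2, r_specialist, rng)
            j2 = majority_correct(n // 2, r_specialist, rng)
        else:
            # Everyone judges both premises at the lower reliability
            # (ties count as failure to reach a correct judgment).
            j1 = majority_correct(n, r_general, rng)
            j2 = majority_correct(n, r_general, rng)
        correct += (j1 and j2)
    return correct / trials

plain = premise_based(50, 0.55, 0.55, distributed=False)
dist = premise_based(50, 0.55, 0.65, distributed=True)
print(plain, dist)  # the distributed procedure does better here
```

With these numbers the specialization gain more than compensates for the smaller subgroup sizes; with no gain (`r_specialist == r_general`), the smaller subgroups would make distribution a pure loss, which is why the result is scenario-dependent.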
Reflecting on our portrait of applied social epistemology, a reader may wonder about our earlier claim that this project is continuous with classical individual epistemology. How can this be? Descartes's epistemological enterprise was targeted at episodes within the mind of the subject. What connection does that highly "internalist" enterprise have with the design of social systems or institutions?
True, Cartesianism focused exclusively on introspectible mental contents, and this is dramatically different from social epistemology, especially in its institutional-design dimension. But contemporary epistemology no longer hews to Descartes's rigid introspectionism. If we emphasize a different aspect of Descartes's enterprise, we find a feature entirely congenial to social epistemology, namely, the pursuit of truth. Whereas Descartes thought that truth should be pursued only by the proper conduct of "reason," specifically, the doxastic agent's own reason, social epistemology acknowledges what everyone except a radical skeptic will admit, namely, that quests for truth are commonly influenced, for better or for worse, by institutional arrangements that massively affect what doxastic agents hear (or fail to hear) from others. To maximize prospects for successful pursuits of truth, this variable cannot sensibly be neglected.
- Anderson, Elizabeth (1995), "Feminist Epistemology: An Interpretation and a Defense," Hypatia, 10 (3): 50–84.
- Arrow, Kenneth (1963), Social Choice and Individual Values, New York: Wiley.
- Barnes, Barry and Bloor, David (1982), "Relativism, Rationalism, and the Sociology of Knowledge," in Rationality and Relativism, M. Hollis and S. Lukes (eds.), Cambridge, MA: MIT Press.
- Bloor, David, Barnes, Barry and Henry, J. (1996), Scientific Knowledge: A Sociological Analysis, Chicago: University of Chicago Press.
- Boghossian, Paul (2006), Fear of Knowledge, Oxford: Clarendon Press.
- Bratman, Michael (1999), Faces of Intention, Cambridge: Cambridge University Press.
- Brown, James Robert (2001), Who Rules in Science? An Opinionated Guide to the Wars, Cambridge, MA: Harvard University Press.
- Burge, Tyler (1993), "Content Preservation," The Philosophical Review, 102: 457–488.
- Coady, C. A. J. (1992), Testimony, Oxford: Oxford University Press.
- Craig, Edward (1990), Knowledge and the State of Nature, Oxford: Clarendon Press.
- Descartes, René (1637/1955). Discourse on the Method of Rightly Conducting the Reason and Seeking for Truth in the Sciences, trans. E. Haldane and G. Ross, The Philosophical Works of Descartes, vol. 1, New York: Dover.
- ––– (1641/1955). Meditations on First Philosophy, trans. E. Haldane and G. Ross, The Philosophical Works of Descartes, vol. 1, New York: Dover.
- Dietrich, Franz (2006), "Judgment Aggregation: (Im)possibility Theorems," Journal of Economic Theory, 126 (1): 286–298.
- Elga, Adam (2007), "Reflection and Disagreement," Noûs, 41 (3): 478–502.
- Feldman, Richard (2006), "Reasonable Religious Disagreements", in Louise Antony (ed.), Philosophers without God, Oxford: Oxford University Press, forthcoming.
- Foley, Richard (1994), "Egoism in Epistemology," in Socializing Epistemology, F. Schmitt (ed.), Lanham, MD: Rowman and Littlefield.
- Forman, Paul (1971), "Weimar Culture, Causality and Quantum Theory, 1918–1927: Adaptation by German Physicists and Mathematicians to a Hostile Intellectual Environment," in Historical Studies in the Physical Sciences 3, R. McCormmach (ed.), Philadelphia: University of Pennsylvania Press.
- Foucault, Michel (1977), Discipline and Punish, trans. A. Sheridan, New York: Random House.
- ––– (1980), Power/Knowledge, New York: Pantheon.
- Fricker, Elizabeth (1995), "Telling and Trusting: Reductionism and Anti-Reductionism in the Epistemology of Testimony," Mind, 104: 393–411.
- Fricker, Miranda (1998), "Rational Authority and Social Power: Towards a Truly Social Epistemology," Proceedings of the Aristotelian Society, 98: 159–177.
- Fuller, Steve (1987), "On Regulating What is Known: A Way to Social Epistemology," Synthese, 73: 145–183.
- ––– (1988), Social Epistemology, Bloomington: Indiana University Press.
- ––– (1993), Philosophy, Rhetoric, and the End of Knowledge, Madison: University of Wisconsin Press.
- ––– (1999), The Governance of Science: Ideology and the Future of the Open Society, London: Open University Press.
- Geuss, Raymond (1981), The Idea of a Critical Theory: Habermas and the Frankfurt School, Cambridge: Cambridge University Press.
- Gilbert, Margaret (1989), On Social Facts, London: Routledge.
- ––– (1994), "Remarks on Collective Belief," in Socializing Epistemology, F. Schmitt (ed.), Lanham, MD: Rowman and Littlefield.
- Goldman, Alvin (1978), "Epistemics: The Regulative Theory of Cognition," The Journal of Philosophy, 75: 509–523.
- ––– (1986), Epistemology and Cognition, Cambridge, MA: Harvard University Press.
- ––– (1987), "Foundations of Social Epistemics," Synthese, 73: 109–144.
- ––– (1999), Knowledge in a Social World, Oxford: Oxford University Press.
- ––– (2001), "Experts: Which Ones Should You Trust?" Philosophy and Phenomenological Research, 63: 85–110.
- ––– (2004), "Group Knowledge versus Group Rationality: Two Approaches to Social Epistemology," Episteme, A Journal of Social Epistemology, 1 (1): 11–22.
- ––– (2006, in press), "The Social Epistemology of Blogging," in Information Technology and Moral Philosophy, eds. J. van den Hoven and J. Weckert, Cambridge: Cambridge University Press.
- Goldman, Alvin and Cox, James (1996), "Speech, Truth, and the Free Market for Ideas," Legal Theory, 2: 1–32.
- Goldman, Alvin and Shaked, Moshe (1991), "An Economic Model of Scientific Activity and Truth Acquisition," Philosophical Studies, 63: 31–55.
- Habermas, Jürgen (1973), "Wahrheitstheorien," in Wirklichkeit und Reflexion: Festschrift für Walter Schulz, Pfullingen: Neske.
- Habermas, Jürgen and Luhmann, Niklas (1971), Theorie der Gesellschaft oder Sozialtechnologie – Was leistet die Systemforschung?, Frankfurt: Suhrkamp.
- Hull, David (1988), Science as a Process, Chicago: University of Chicago Press.
- Hume, David (1975), An Enquiry Concerning Human Understanding, in Hume's Enquiries, P. H. Nidditch and L. A. Selby-Bigge (eds.), Oxford: Oxford University Press.
- Kelly, J. F. and Wearne, P. (1998), Tainting Evidence: Inside the Scandals at the FBI Crime Lab, New York: The Free Press.
- Kitcher, Philip (1990), "The Division of Cognitive Labor," The Journal of Philosophy, 87: 5–22.
- ––– (1993), The Advancement of Science, New York: Oxford University Press.
- Koppl, Roger (2005), "Epistemic Systems," Episteme: A Journal of Social Epistemology, 2 (2): 91–106.
- Kuhn, Thomas (1962/1970), The Structure of Scientific Revolutions, 2nd ed., Chicago: University of Chicago Press.
- Kukla, Andre (2000), Social Construction and the Philosophy of Science, London: Routledge.
- Kusch, Martin (2002), Knowledge by Agreement, Oxford: Clarendon Press.
- Lackey, Jennifer (2006), "It Takes Two to Tango: Beyond Reductionism and Non-Reductionism in the Epistemology of Testimony," in The Epistemology of Testimony, J. Lackey and E. Sosa (eds.), New York: Oxford University Press.
- Latour, Bruno (1987), Science in Action, Cambridge, MA: Harvard University Press.
- Latour, Bruno and Woolgar, Steve (1979/1986), Laboratory Life: The [Social] Construction of Scientific Facts, Princeton: Princeton University Press.
- Laudan, Larry (1977), Progress and Its Problems, Berkeley: University of California Press.
- List, Christian (2005), "Group Knowledge and Group Rationality: A Judgment Aggregation Perspective," Episteme: A Journal of Social Epistemology, 2 (1): 25–38.
- List, Christian and Pettit, Philip (2002), "Aggregating Sets of Judgments: An Impossibility Result," Economics and Philosophy, 18: 89–110.
- ––– (2004), "Aggregating Sets of Judgments: Two Impossibility Results Compared," Synthese, 140 (1–2): 207–235.
- Locke, John (1959), An Essay Concerning Human Understanding, 2 volumes, A.C. Fraser (ed.), New York: Dover.
- Longino, Helen (1990), Science as Social Knowledge, Princeton: Princeton University Press.
- ––– (2002), The Fate of Knowledge, Princeton: Princeton University Press.
- Mackenzie, Donald (1981), Statistics in Britain: 1865–1930, The Social Construction of Scientific Knowledge, Edinburgh: Edinburgh University Press.
- Mannheim, Karl (1936), Ideology and Utopia, trans. L. Wirth and E. Shils, New York: Harcourt, Brace and World.
- Mathiesen, Kay (2006), "The Epistemic Features of Group Belief," Episteme, A Journal of Social Epistemology, 2 (3): 161–175.
- McMahon, Christopher (2003), "Two Modes of Collective Belief," Protosociology, 18/19: 347–362.
- Merton, Robert (1973), The Sociology of Science, Chicago: University of Chicago Press.
- Milton, John (1644/1959), "Areopagitica, A Speech for the Liberty of Unlicensed Printing," in Complete Prose Works of John Milton, E. Sirluck (ed.), New Haven: Yale University Press.
- Nelson, Lynn Hankinson (1993), "Epistemological Communities," in Feminist Epistemologies, L. Alcoff and E. Potter (eds.), New York: Routledge.
- Nozick, Robert (1981), Philosophical Explanations, Cambridge, MA: Harvard University Press.
- Pettit, Philip (2003), "Groups with Minds of Their Own," in Socializing Metaphysics, F. Schmitt (ed.), Lanham, MD: Rowman and Littlefield.
- ––– (2006), "When to Defer to Majority Testimony — and When Not," Analysis, 66 (3): 179–187.
- Posner, Richard (2005), "Bad News," New York Times Book Review, July 31, 2005, pp. 1, 8–11.
- Quinton, Anthony (1975/1976), "Social Objects," Proceedings of the Aristotelian Society, 75: 1–27.
- Reid, Thomas (1975), An Inquiry into the Human Mind on the Principles of Common Sense, in Thomas Reid's Inquiry and Essays, R. Beanblossom and K. Lehrer (eds.), Indianapolis: Bobbs-Merrill.
- Rorty, Richard (1979), Philosophy and the Mirror of Nature, Princeton: Princeton University Press.
- Rosen, Gideon (2001), "Nominalism, Naturalism, Philosophical Relativism," Philosophical Perspectives, 15: 69–91.
- Saks, Michael et al. (2001), "Model Prevention and Remedy of Erroneous Convictions Act," Arizona State Law Journal, 33: 665–718.
- Schauer, Frederick (1982), Free Speech: A Philosophical Enquiry, New York: Cambridge University Press.
- Schmitt, Frederick (1994a), "Socializing Epistemology: An Introduction through Two Sample Issues," in Socializing Epistemology, F. Schmitt (ed.), Lanham, MD: Rowman and Littlefield.
- ––– (1994b), "The Justification of Group Beliefs," in Socializing Epistemology, F. Schmitt (ed.), Lanham, MD: Rowman and Littlefield.
- Searle, John (1995), The Construction of Social Reality, New York: Free Press.
- Shapin, Steven (1975), "Phrenological Knowledge and the Social Structure of Early Nineteenth-Century Edinburgh," Annals of Science, 32: 219–243.
- Shapley, Lloyd and Grofman, Bernard (1984), "Optimizing Group Judgmental Accuracy in the Presence of Interdependence," Public Choice, 43: 329–343.
- Shera, Jesse (1970), Sociological Foundations of Librarianship, New York: Asia Publishing House.
- Thagard, Paul (1997), "Collaborative Knowledge," Noûs, 31: 242–261.
- Tuomela, Raimo (1995), The Importance of Us: A Philosophical Study of Basic Social Notions, Stanford: Stanford University Press.
- Wray, K. Brad (2003), "What Really Divides Gilbert and the Rejectionists," Protosociology, 18/19: 363–376.
- Koppl, Roger G. (2006), "Democratic Epistemics: An Experiment on How to Improve Forensic Science," available online.