This may not seem like much of a puzzle. That everyone here is seated is spatially restricted in that it is about a specific place; the principle of relativity is not similarly restricted. So, it is easy to think that, unlike laws, accidentally true generalizations are about specific places. But that's not what makes the difference. There are true nonlaws that are not spatially restricted. Consider the unrestricted generalization that all gold spheres are less than one mile in diameter. There are no gold spheres that size and in all likelihood there never will be, but this is still not a law. There also appear to be generalizations that could express laws that are restricted. Galileo's law of free fall is the generalization that, on Earth, free-falling bodies accelerate at a rate of 9.8 meters per second squared.
The perplexing nature of the puzzle is clearly revealed when the gold-sphere generalization is paired with a remarkably similar generalization about uranium spheres:
All gold spheres are less than a mile in diameter.
All uranium spheres are less than a mile in diameter.

Though the former is not a law, the latter arguably is. The latter is not nearly so accidental as the first, since uranium's critical mass is such as to guarantee that such a large sphere will never exist (van Fraassen 1989, 27). What makes the difference? What makes the former an accidental generalization and the latter a law?
Many features of the systems approach are appealing. For one thing, it appears to deal with a challenge posed by vacuous laws. Some laws are vacuously true: Newton's first law of motion -- that all inertial bodies have no acceleration -- is a law, even though there are no inertial bodies. But there are also lots of vacuously true nonlaws: all plaid pandas weigh 5 lbs., all unicorns are unmarried, etc. With the systems approach, there is no exclusion of vacuous generalizations from the realm of laws, and yet only those vacuous generalizations that belong to the best systems qualify (cf., Earman 1978, 180; Lewis 1986, 123). Furthermore, it is reasonable to think that one goal of scientific theorizing is the formulation of true theories that are well balanced in terms of their simplicity and strength. So, the systems approach seems to underwrite the truism that an aim of science is the discovery of laws of nature (Earman 1978, 197; Loewer 1996, 112). One last aspect of the systems view that is appealing to many (though not all) is that it is in keeping with broadly Humean constraints on an account of lawhood. There is no overt appeal to closely related modal concepts (e.g., the counterfactual conditional) and no overt appeal to modality-supplying entities (e.g., possible worlds or universals). Indeed, the systems approach was the centerpiece of Lewis's defense of the principle he called Humean supervenience, "the doctrine that all there is in the world is a vast mosaic of local matters of particular fact, just one little thing and then another" (1986, ix).
Other features of the systems approach have made philosophers wary. (See, especially, Armstrong 1983, 66-73; van Fraassen 1989, 40-64; Carroll 1990, 197-206.) Some argue that this approach will have the untoward consequence that laws are inappropriately mind-dependent in virtue of the account's appeal to the concepts of simplicity, strength and best-balance, concepts whose application seems to depend on cognitive abilities, interests, and purposes. The appeal to simplicity raises further questions stemming from the apparent need for a regimented language to permit reasonable comparisons of the systems. Interestingly, sometimes the view is abandoned because it satisfies the broadly Humean constraints on an account of laws of nature; some argue that what generalizations are laws is not determined by local matters of particular fact.
Focusing on Armstrong's development of the view, here is one of his concise statements of the framework characteristic of the universals approach:
Suppose it to be a law that Fs are Gs. F-ness and G-ness are taken to be universals. A certain relation, a relation of non-logical or contingent necessitation, holds between F-ness and G-ness. This state of affairs may be symbolized as ‘N(F,G)’ (1983, 85).

This framework promises to address familiar puzzles and problems: Maybe the difference between the uranium-spheres generalization and the gold-spheres generalization is that being uranium does necessitate being less than one mile in diameter, but being gold does not. Worries about the subjective nature of simplicity, strength and best balance do not emerge; there is no threat of lawhood being mind-dependent so long as necessitation is not mind-dependent. Some (Armstrong 1991, Dretske 1977) think that the framework supports the idea that laws play a special explanatory role in inductive inferences, since a law is not just a universal generalization, but is an entirely different creature -- a relation holding between two other universals. The framework is also consistent with lawhood not supervening on local matters of particular fact; adoption of some nonsupervenience thesis often accompanies acceptance of the universals approach.
For there truly to be this payoff, however, more has to be said about what N is. This is the first of the two problems Bas van Fraassen calls the identification problem and the inference problem (1989, 96). The essence of these two problems was captured early on by David Lewis with his usual flair:
Whatever N may be, I cannot see how it could be absolutely impossible to have N(F,G) and Fa without Ga. (Unless N just is constant conjunction, or constant conjunction plus something else, in which case Armstrong's theory turns into a form of the regularity theory he rejects.) The mystery is somewhat hidden by Armstrong's terminology. He uses ‘necessitates’ as a name for the lawmaking universal N; and who would be surprised to hear that if F ‘necessitates’ G and a has F, then a must have G? But I say that N deserves the name of ‘necessitation’ only if, somehow, it really can enter into the requisite necessary connections. It can't enter into them just by bearing a name, any more than one can have mighty biceps just by being called ‘Armstrong’ (1983, 366).

Basically, there needs to be a specification of what the lawmaking relation is (the identification problem). Then, there needs to be a determination of whether it is suited to the task (the inference problem): Does N's holding between F and G entail that Fs are Gs? Does its holding support corresponding counterfactuals? Do laws really turn out not to supervene, to be mind-independent, to be explanatory?
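Put schematically, and only as a gloss on the passage above (the symbolization is not Armstrong's or Lewis's own), the inference problem asks whether anything about the nature of N guarantees entailments along the following lines:

\[
N(F,G) \wedge Fa \;\stackrel{?}{\Rightarrow}\; Ga
\qquad\qquad
N(F,G) \;\stackrel{?}{\Rightarrow}\; \forall x\,(Fx \rightarrow Gx)
\]

The identification problem asks, prior to this, what relation N is supposed to be in the first place.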
Armstrong does say more about what his lawmaking relation is. He states in reply to van Fraassen:
It is at this point that, I claim, the Identification problem has been solved. The required relation is the causal relation, ... now hypothesized to relate types not tokens (1993, 422).

Yet, questions remain about the nature of this causal relation understood as a relation that relates both token events and universals. (See van Fraassen 1993, 435-437, and Carroll 1994, 170-174.) Others resist Armstrong's approach because of its ontological commitment to universals and the semantic and epistemological problems some associate with a rejection of supervenience. (See Loewer 1996, Beebee 2000.)
Only a statement that is lawlike -- regardless of its truth or falsity or its scientific importance -- is capable of receiving confirmation from an instance of it; accidental statements are not.

(Terminology: P is lawlike if and only if P is a law if true.) Goodman claims that, if a generalization is accidental (and so not lawlike), then it is not capable of receiving confirmation from one of its instances.
This has prompted much discussion, including some challenges. For example, suppose there are ten flips of a fair coin, and that the first nine land heads (Dretske 1977, 256-257). The first nine instances -- at least in a sense -- confirm the generalization that all the flips will land heads; the probability of that generalization is raised from (.5)^10 up to .5. But, this generalization is not lawlike; if true, it is not a law. It is standard to respond to such an example by arguing that this is not the pertinent notion of confirmation (that it is mere "content-cutting") and by suggesting that what does require lawlikeness is confirmation of the generalization's unexamined instances. Notice that, in the coin case, the probability that the tenth flip will land heads does not change after the first nine flips land heads. There are, however, examples that generate problems for this idea too.
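The arithmetic behind the example, assuming (as the example stipulates) a fair coin and independent flips, can be set out as follows:

\[
P(\text{all ten flips land heads}) = (1/2)^{10} \approx .001
\]
\[
P(\text{all ten land heads} \mid \text{first nine land heads}) = P(\text{tenth lands heads} \mid \text{first nine land heads}) = 1/2
\]

So the probability of the generalization rises from about .001 to .5, while the probability of the unexamined tenth instance stays fixed at .5, which is exactly the asymmetry the standard response exploits.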
Suppose the room contains one hundred men and suppose you ask fifty of them whether they are third sons and they reply that they are; surely it would be reasonable to at least increase somewhat your expectation that the next one you ask will also be a third son (Jackson and Pargetter 1980, 423).

It does no good to revise the claim to say that no generalization believed to be accidental is capable of confirmation. About the third-son case, one would know that the generalization, even if true, would not be a law.
The discussion continues. Frank Jackson and Robert Pargetter have proposed an alternative connection between confirmation and laws on which certain counterfactual truths must hold: my observations of these As that are F and B confirm that all non-F As are Bs only if the As would still have been both A and B if they had not been F. This suggestion is criticized by Elliott Sober. See his (1988, 97-98). Marc Lange (2000, 111-142) uses a different strategy. He tries to refine further the relevant notion of confirmation, characterizing what he takes to be an intuitive notion of inductive confirmation, and then contends that only generalizations that are not believed not to be lawlike can be (in his sense) inductively confirmed.
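Semi-formally, and only as a gloss on the sentence above (the symbolization, with ‘□→’ for the counterfactual conditional, is not Jackson and Pargetter's own), the proposed condition is that the observed As that are F and B confirm

\[
\forall x\,((Ax \wedge \neg Fx) \rightarrow Bx)
\]

only if, for each observed a,

\[
\neg Fa \;\Box\!\!\rightarrow\; (Aa \wedge Ba).
\]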
Sometimes the idea that laws have a special role to play in induction serves as the starting point for a criticism of Humean analyses. Fred Dretske (1977, 261-262) and David Armstrong (1983, 52-59, and 1991) adopt a model of inductive inference on which it involves an inference to the best explanation. (Also see Foster 1983.) On its simplest construal, the model describes a pattern that begins with an observation of instances of a generalization, includes an inference to the corresponding law (this is the inference to the best explanation), and concludes with an inference to the generalization itself or to its unobserved instances. The complaint lodged against Humeans is that, on their view of what laws are, laws are not suited to explain their instances and so cannot sustain the required inference to the best explanation.
This is an area where work on laws needs to be done. Armstrong and Dretske make substantive claims about what can and cannot be instance-confirmed: roughly, Humean laws can't, laws-as-universals can. But, at the very least, these claims cannot be quite right. Humean laws can't? As the discussion above illustrates, Sober, Lange and others have argued that even generalizations known to be accidental can be confirmed by their instances. Dretske and Armstrong need some plausible and suitably strong premise connecting lawhood to confirmability, and it is not clear that there is one to be had. Here is the basic problem: as many authors have noticed (e.g., Sober 1988, 98; van Fraassen 1987, 255), the confirmation of a hypothesis or its unexamined instances will always be sensitive to what background beliefs are in place. So much so that, with background beliefs of the right sort, just about anything can be confirmed, irrespective of its status as a law or of whether it is lawlike. Thus, stating a plausible principle describing the connection between laws of nature and the problem of induction will be difficult. In order to uncover a nomological constraint on induction, something needs to be said about the role of background beliefs.
Two reasons are traditionally given for believing that being a law does not depend on any necessary connection between properties. The first is the conceivability of it being a law in one possible world that all Fs are Gs even though there is another world with an F that is not G. The second is that there are laws of the form that all Fs are Gs that can only be discovered in an a posteriori manner. If necessity is always associated with laws of nature, then it is not clear why scientists can't always get by with a priori methods. Naturally, these two traditional reasons are often challenged. The necessitarians argue that conceivability is not a guide to possibility. They also appeal to Saul Kripke's (1972) arguments meant to reveal certain a posteriori necessary truths, suggesting that the a posteriori nature of some laws does not prevent their lawhood from depending on a necessary connection. In further support of their own view, the necessitarians often argue that their position is a consequence of the correct theory of the individuation of properties. Roughly, mass just would not be the property it is unless it had the causal powers it does, and hence obeyed the laws that it does. As they see it, it is also a virtue of their position that they can explain why laws of nature are counterfactual supporting: They support counterfactuals in the same way that the truths of logic and mathematics do (Swoyer 1982, 209; Fales 1990, 85-87).
A few philosophers, however, are doubtful that there are exceptionless regularities even at the level of fundamental physics. For example, Nancy Cartwright has argued that the descriptive and the explanatory aspects of laws of nature conflict. "Rendered as descriptions of fact, they are false; amended to be true, they lose their fundamental explanatory force" (1980, 75). Consider Newton's gravitational principle, F = Gm₁m₂/r². Properly understood, according to Cartwright, it says that for any two bodies the force between them is Gm₁m₂/r². But if that is what the law says then the law is not an exceptionless regularity. This is because the force between two bodies is influenced by other things than just their mass and the distance between them, e.g. the charge of the two bodies as described by Coulomb's law. The statement of the gravitational principle can be amended to make it true, but that, according to Cartwright, at least on certain standard ways of doing so, would strip it of its explanatory power. For example, if the principle is taken to hold only that F = Gm₁m₂/r² if there are no forces other than gravitational forces at work, then though it would be true it would not apply except in idealized circumstances. Lange (1993) uses a different example to make a similar point. Consider a standard expression of the law of thermal expansion: “Whenever the temperature of a metal bar of length L₀ changes by ΔT, the length of the bar changes by ΔL = kL₀ΔT”, where k is a constant, the thermal expansion coefficient of the metal. If this expression were used to express the strict generalization straightforwardly suggested by its grammar, then such an utterance would be false since the length of a bar does not change in the way described in cases where someone is hammering on its ends. It looks like the law will require provisos, but so many that the only apparent way of taking into consideration all the required provisos would be with something like a ceteris-paribus clause. Then, the concern becomes that the statement would be empty. Because of the difficulty of stating plausible truth conditions for ceteris-paribus sentences, it is feared that “Other things being equal, ΔL = kL₀ΔT” could only mean “ΔL = kL₀ΔT provided that ΔL = kL₀ΔT”.
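To see the interference point in the standard textbook way (Cartwright herself disputes this component-force picture, as noted below, and the Coulomb symbols k_e, q₁ and q₂ are not used in the text itself), the force on the first of two charged, massive bodies is usually written as the sum of a gravitational and an electrostatic contribution, with r̂ the unit vector pointing from the second body to the first:

\[
\mathbf{F} \;=\; -\,\frac{G m_1 m_2}{r^2}\,\hat{\mathbf{r}} \;+\; \frac{k_e q_1 q_2}{r^2}\,\hat{\mathbf{r}}.
\]

Whenever q₁q₂ ≠ 0, the actual force differs from Gm₁m₂/r², which is why the unamended gravitational principle, read as a description of the total force between the bodies, is not exceptionless.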
Even those who agree with the arguments of Lange and Cartwright sometimes disagree about what ultimately the arguments say about laws of nature. Cartwright believes that the true laws of nature are not exceptionless regularities, but instead are statements that describe causal powers. So construed, they turn out to be both true and explanatory. Lange ends up holding that there are propositions properly adopted as laws, though in doing so one need not also believe any exceptionless regularity; there need not be one. Ronald Giere (1999) can usefully be interpreted as agreeing with Cartwright's and Lange's basic arguments but insisting that law-statements don't have implicit provisos or implicit ceteris-paribus clauses. So, he concludes that there are no laws.
John Earman and John Roberts hold that there are exceptionless and lawful regularities. More precisely, they argue that scientists doing fundamental physics do attempt to state strict generalizations that are such that they would be strict laws if they were true:
Our claim is only that ... typical theories from fundamental physics are such that if they were true, there would be precise proviso free laws. For example, Einstein's gravitational field law asserts -- without equivocation, qualification, proviso, ceteris paribus clause -- that the Ricci curvature tensor of spacetime is proportional to the total stress-energy tensor for matter-energy; the relativistic version of Maxwell's laws of electromagnetism for charge-free flat spacetime asserts -- without qualification or proviso -- that the curl of the E field is proportional to the partial time derivative, etc. (1999, 446).

About Cartwright's gravitational example, they think (473, fn. 14) that a plausible understanding of the gravitational principle is as describing only the gravitational force between the two massive bodies. (Cartwright argues that there is no such component force and so thinks such an interpretation would be false. Earman and Roberts disagree.) About Lange's example, they think the law should be understood as having the single proviso that there be no external stresses on the metal bar (461). In any case, much more would need to be said to establish that all the apparently strict and explanatory generalizations that have been or will be stated by physicists have turned or will turn out to be false.
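For reference, the two laws Earman and Roberts gesture at can be given in their usual textbook forms rather than as they paraphrase them (strictly, the field equations relate the stress-energy tensor to the Einstein tensor, which is built from the Ricci tensor and the scalar curvature):

\[
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},
\qquad\qquad
\nabla \times \mathbf{E} = -\,\frac{\partial \mathbf{B}}{\partial t}.
\]

Each is stated without a proviso or ceteris-paribus clause, which is the feature Earman and Roberts emphasize.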