Chance in Boltzmannian Statistical Mechanics

Roman Frigg
Department of Philosophy, Logic and Scientific Method, London School of Economics, Houghton Street, London WC2A 2AE, England, r.p.frigg@lse.ac.uk

Philosophy of Science 75, 2008, 670-681.

Abstract

In two recent papers Barry Loewer (2001, 2004) has suggested interpreting probabilities in statistical mechanics as Humean chances in David Lewis' (1994) sense. I first give a precise formulation of this proposal, then raise two fundamental objections, and finally conclude that these can be overcome only at the price of interpreting these probabilities epistemically.

1 Introduction

Consider a gas that is adiabatically isolated from its environment and confined to the left half of a container. Then remove the wall separating the two parts. The gas will immediately start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Thermodynamics (TD) characterises this process in terms of an increase of thermodynamic entropy, which attains its maximum value at equilibrium. The Second Law of thermodynamics captures the irreversibility of this process by positing that in an isolated system such as the gas entropy cannot decrease. The aim of statistical mechanics (SM) is to explain the behaviour of the gas, and in particular its conformity with the Second Law, in terms of the dynamical laws governing the individual molecules of which the gas is made up. In what follows these laws are assumed to be those of Hamiltonian classical mechanics.

We should not, however, ask for an explanation of the Second Law literally construed. This law is a universal law and as such cannot be explained by a statistical theory. But this is not a problem, because we can rest content if we explain the 'Boltzmannian version' of the Second Law (Callender 1999), which I call 'Boltzmann's Law' (BL):

Consider an arbitrary instant of time t and assume that at that time the Boltzmann entropy S_B(t) of the system is low. It is then highly probable that at any later time t' > t we have S_B(t') > S_B(t).

What notion of probability is invoked in BL, and what reasons do we have to believe that the claim it makes is true? The orthodox answer is that probabilities are time averages and that entropy is likely to increase because, assuming that the system is ergodic, the system is in equilibrium most of the time. This view is now widely believed to be untenable, due both to conceptual problems and to its invocation of ergodicity (see Earman and Rédei (1996) and van Lith (2001) for discussions). A propensity interpretation of SM probabilities is ruled out by the fact that the underlying micro-theory, Hamiltonian mechanics, is deterministic: Popper's and Miller's hand-waving notwithstanding, this is incompatible with there being propensities (Clark 2001). Frequentism has never been seriously put forward as an interpretation of SM probabilities because mechanical systems do not satisfy von Mises' independence requirement. Finally, so-called 'no-theory theories' do not improve the situation because, at least in the context of physical theories, they do not provide an independent alternative to other accounts (Frigg and Hoefer 2007). In two recent papers Loewer (2001, 2004) has suggested that the way out of this deadlock is to build on David Lewis' (1986, 1994) approach and interpret SM probabilities as Humean chances.
In this paper I first give a precise formulation of Loewer's proposal, then raise two fundamental objections, and finally conclude that these can be overcome only at the price of interpreting SM probabilities epistemically.

2 Boltzmannian Statistical Mechanics

The microstate of a system consisting of n particles is specified by a point in its 6n-dimensional phase space Γ, which is endowed with the Lebesgue measure µ_L, its 'natural' measure. (For a short introduction to Boltzmannian SM see Lebowitz (1993).) The dynamics of the system is governed by Hamilton's equations of motion, which define a measure-preserving phase flow φ_t on Γ; that is, φ_t: Γ → Γ is a one-to-one mapping for every real number t, and µ_L(φ_t(B)) = µ_L(B) for every measurable set B ⊆ Γ. In what follows we assume that the relevant physical process begins at a particular instant t_0, and I adopt the convention that 'φ_t(x)' denotes the state of the system at time t_0 + t if it was in state x at t_0, and likewise for 'φ_t(B)'. Similarly, 'φ_{-t}(x)' denotes the state at time t_0 which gets mapped onto x at time t_0 + t under the dynamics of the system, and likewise for 'φ_{-t}(B)'. In a Hamiltonian system energy is conserved, and hence the motion of the system is confined to the (6n−1)-dimensional energy hypersurface Γ_E. The measure µ_L can be restricted to Γ_E, which induces a natural and invariant measure µ on Γ_E.

The macrostates M_k, k = 1, ..., m, of the system (where m < ∞), characterised by the values of macroscopic parameters, are assumed to supervene on the system's microstates. Therefore each M_k is associated with a region Γ_Mk ⊆ Γ_E so that the system is in macrostate M_k at t iff its microstate x at t lies within Γ_Mk. The Γ_Mk form a partition of Γ_E, meaning that they do not overlap and jointly cover Γ_E.

By definition, S_B(M_k) := k_B log[µ(Γ_Mk)] is the Boltzmann entropy of macrostate M_k (where k_B is the Boltzmann constant). Because the Γ_Mk form a partition, a system is in exactly one macrostate at any given time t, and for this reason it makes sense to talk about the Boltzmann entropy S_B(t) of a system at time t: S_B(t) := k_B log[µ(Γ_Mt)], where M_t is the system's macrostate at time t (i.e. M_t is the M_k for which it is the case that x ∈ Γ_Mk, where x is the system's microstate at t). The Boltzmann entropy assumes its maximum for the equilibrium state.

To rationalise Boltzmann's Law we need to introduce probabilities. The standard way to do this is by appeal to the so-called 'Statistical Postulate' (SP):

Let M_t be the system's macrostate at time t. Then the probability at time t that the system's microstate lies in B ⊆ Γ_Mt is p_t(B) = µ(B)/µ(Γ_Mt).

Now consider the set F of all microstates in Γ_Mt which in the near future evolve towards macrostates M' that have higher Boltzmann entropy than M_t. With the assumption that µ(F)/µ(Γ_Mk) ≈ 1 for all k (F being defined relative to each M_k in turn), it follows from SP that for all t the system is highly likely to evolve towards a state of higher entropy, which is exactly what BL asserts. Whether or not this assumption holds true in a particular system is a substantial question. However, even if it does there is a problem. It follows from the time reversal invariance of Hamilton's equations of motion that if it is true that the system is overwhelmingly likely to evolve towards a macrostate of higher entropy in the future, it is also overwhelmingly likely to have evolved into the current macrostate from a past macrostate M'' which also has higher entropy.
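To see both the statistical postulate and this time-reversal problem at work in miniature, here is a toy sketch of my own, not part of the paper's formal apparatus: a finite 'phase space' with counting measure standing in for µ, and an invertible permutation standing in for the Hamiltonian flow. All sizes, names and the toy dynamics are illustrative assumptions.

```python
# A toy caricature of SP and the time-reversal problem (hedged: a finite
# stand-in, not the continuous Lebesgue-measure setting of the text).
import math
import random

GAMMA = list(range(1000))                 # 1000 toy microstates

def macrostate(x):                        # a crude coarse-graining into 4 cells
    if x < 10:  return "M1"
    if x < 60:  return "M2"
    if x < 300: return "M3"
    return "M4"                           # the huge, equilibrium-like cell

regions = {}
for x in GAMMA:
    regions.setdefault(macrostate(x), []).append(x)

# Boltzmann entropy with k_B = 1: S_B(M) = log of the cell's size
S_B = {M: math.log(len(R)) for M, R in regions.items()}

# SP: given macrostate M, p(B) = |B ∩ Γ_M| / |Γ_M|
def sp(M, B):
    return len(set(regions[M]) & set(B)) / len(regions[M])

random.seed(0)
phi = dict(zip(GAMMA, random.sample(GAMMA, len(GAMMA))))  # invertible "dynamics"
phi_inv = {y: x for x, y in phi.items()}

# Microstates in M2 that evolve to higher entropy one step ahead ...
F_fwd = [x for x in regions["M2"] if S_B[macrostate(phi[x])] > S_B["M2"]]
# ... and those that came FROM higher entropy one step before
F_bwd = [x for x in regions["M2"] if S_B[macrostate(phi_inv[x])] > S_B["M2"]]

print(sp("M2", F_fwd), sp("M2", F_bwd))   # both close to 1
```

Because the permutation is invertible, the counting argument is blind to the direction of time: in this toy, almost all microstates in the small macrostate M2 both evolve to and descend from larger, higher-entropy macrostates.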
This time-reversal consequence flies in the face of everyday experience and leads to wrong retrodictions.

Albert (2000, 71-96) suggests fixing this problem by first taking the system under investigation to be the entire universe and then adopting the so-called Past Hypothesis (PH), the postulate that the universe came into being in a low-entropy macrostate, the Past State, which is provided to us by modern Big Bang cosmology. (I assume that one can make sense of PH in the current context, an assumption that has been questioned by Earman (2006).) The problems with flawed retrodictions can then be avoided by conditionalising on the Past State. From a technical point of view, this amounts to replacing SP with what I call the 'Past Hypothesis Statistical Postulate' (PHSP):

Let M_t be the system's macrostate at time t. SP is valid for the Past State M_p, which obtains at time t_0. For all times t > t_0 the probability at time t that the system's microstate lies in B is p_t(B|R_t) = µ(B ∩ R_t)/µ(R_t), where R_t := Γ_Mt ∩ φ_t(Γ_Mp).

In what follows I refer to these probabilities as 'PHSP probabilities'. This principle is used to make predictions about the system's future by choosing B to be the set of those microstates that behave in the desired way. For instance, if we choose B to be the set F (as defined above), p_t(F|R_t) is the probability that the system's entropy will increase in the near future, given PH. If this probability comes out high for all M_k we have explained BL. Again, whether or not this is the case is a substantive question, having to do both with the construction of the macrostates and with the dynamics of the system. However, the problems I discuss in this paper are orthogonal to this issue, and so I assume for the sake of argument that this assumption is borne out in the systems of interest.

3 Humean Chance

The basis for Lewis' theory of probability is the so-called Humean mosaic, the collection of all non-modal and non-probabilistic actual events making up the world's entire history (from the very beginning to the very end), upon which all other facts supervene. Lewis himself suggested that the mosaic consists of space-time points plus local field quantities representing material stuff. In a classical mechanical system the Humean mosaic simply consists of the trajectory of the system's microstate in phase space, on which the system's macrostates supervene.

The next element of Lewis' theory is a thought experiment. To make this explicit – more explicit than it is in Lewis' own presentation – I introduce a fictitious creature, Lewis' Demon. In contrast to human beings, who can only know a small part of the Humean mosaic, Lewis' Demon knows the entire mosaic. The demon now formulates various deductive systems which make true assertions about what is the case and, perhaps, also about what the probabilities for certain events are. Then the demon is asked to choose the best among these systems. The laws of nature are the true theorems of this system, and the chances for certain events to occur are what the probabilistic laws of the best system say they are (Lewis 1994, 480). Following Loewer, I call probabilities thus defined L-chances. The best system is the one that strikes the best balance between strength, simplicity and fit. The notions of strength and simplicity are given to the demon and are taken for granted in this context, but the notion of fit needs explicit definition.
Every system assigns probabilities to certain courses of history, among them the actual course; the fit of the system is measured by the probability that it assigns to the actual course of history, i.e. by how likely it regards things that actually happen. By definition, systems that do not involve probabilistic laws have perfect fit. As an illustration, consider a Humean mosaic that consists of just ten outcomes of a coin flip: HHTHTTHHTT. Theory T1 posits that all events are independent and sets p(H) = p(T) = 0.5; theory T2 shares the independence assumption but posits p(H) = 0.9 and p(T) = 0.1. It follows that T1 has better fit than T2 because (0.5)^10 ≈ 9.8 × 10^-4 while (0.1)^5 (0.9)^5 ≈ 5.9 × 10^-6.

Loewer's suggestion is that Boltzmannian SM as introduced above – the package of Hamiltonian mechanics, PH and PHSP – is a putative best system of the sort just described (2001, 618; 2004, 1124), and that PHSP probabilities can therefore be regarded as Humean chances. But there is an obvious problem, namely reconciling determinism and the existence of probabilistic laws, which Lewis himself thought was impossible (1986, 118). Loewer claims that Lewis was wrong about this and suggests that introducing probabilities via initial conditions solves the problem:

'[...] while there are chances different from 0 and 1 for possible initial conditions the chances of any event A after the initial time will be either 1 or 0 since A's occurrence or non-occurrence will be entailed by the initial state and the deterministic laws. However, we can define a kind of dynamical chance which I call "macroscopic chance". The macroscopic chance at t of event A is the probability given by starting with the micro-canonical distribution over the initial conditions and then conditionalising on the entire macroscopic history of the world (including the low entropy postulate) up until t. [...] this probability distribution is completely compatible with deterministic laws since it concerns only the initial conditions of the universe.' (Loewer 2001, 618-19; the same idea is described in Loewer 2004, 1124)

Loewer does not tell us what exactly he means by 'a kind of dynamical chance', in what sense this chance is macroscopic, how its values are calculated, and how it connects to the technical apparatus of SM. I will now present how I think this proposal is best understood and show that, on this reading, Loewer's 'macroscopic chances' coincide with PHSP probabilities as formulated above.

As in Section 2, I take the system's macrostate at t > t_0 to be M_t. We now need to determine the probability of the event 'being in set B ⊆ Γ_Mt at time t'. As I understand it, Loewer's proposal falls into two parts. The first is that the probability of an event at a time t is 'completely determined' by the probability of the corresponding event at time t_0; that is, the probability of the event 'being in set B at time t', p_t(B), is equal to the probability of 'being in set B' at time t_0', where B' is, by definition, the set that evolves into B under the dynamics of the system after time t has elapsed. Formally, p_t(B) = µ_0(B') = µ_0(φ_{-t}(B)), where µ_0 is the microcanonical distribution over the Past State, i.e. µ_0( · ) = µ( · ∩ Γ_Mp)/µ(Γ_Mp). The second part is conditionalising on the entire macro history up to time t, i.e. a specification of the system's macrostate at each instant of time between t_0 and t.
A possible macro history, for instance, is that the system is in macrostate M_1 during the interval [t_0, t_1], in M_5 during (t_1, t_2], in M_7 during (t_2, t_3], etc., where t_1, t_2, t_3, ... are the instants of time at which the system changes from one macrostate into another. What we are now expected to calculate is the probability of 'being in set B at time t' given the system's macro history. Let Q_t be the set of all microstates in Γ_Mt that are compatible with the entire past history of the system; i.e. it is the set of all x ∈ Γ_Mt that lie on trajectories which at every past time were in the Γ_Mk corresponding to the actual macrostate of the system at that time. The sought-after conditional probability then is p_t(B|Q_t) = p_t(B ∩ Q_t)/p_t(Q_t), provided that p_t(Q_t) ≠ 0, which, as we shall see, is the problematic condition.

Putting these two parts together, we obtain the fundamental equation defining L-chances for deterministic systems:

p_t(B|Q_t) = µ_0(φ_{-t}(B ∩ Q_t)) / µ_0(φ_{-t}(Q_t)),    (1)

where, again, µ_0( · ) = µ( · ∩ Γ_Mp)/µ(Γ_Mp).

The crucial thing to realise now is that, due to the conservation of the measure, the expression for the conditional probability in PHSP can be rewritten as p_t(B|R_t) = µ(φ_{-t}(B ∩ R_t))/µ(φ_{-t}(R_t)). Trivially, we can substitute µ_0 for µ in this expression, which makes it equivalent to Equation (1) if we treat Q_t and R_t as equals. (In fact there is a difference between them, in that R_t only involves a conditionalisation on PH while Q_t contains the entire past history. However, nothing in PHSP depends on this, and one could just as well include the entire history in R_t.) Hence PHSP can be interpreted as attributing probabilities to events at t > t_0 solely on the basis of the microcanonical measure over the initial conditions, which is precisely what Loewer needs.

4 Problems with Fit

Loewer claims that SM as introduced above is the system that strikes the best balance between simplicity, strength and fit. Trivially, this implies that it can be ranked along these three dimensions. Simplicity and strength are no more problematic in SM than they are in any other context, and I shall therefore not discuss them further here. The problematic concept is fit.

The fit of a theory is measured in terms of the probability that it assigns to the actual course of history. But what history? Given that L-chances are calculated using the Lebesgue measure, which assigns measure zero to any trajectory, they do not lead to a non-trivial ranking of micro histories (trajectories in Γ). The right choice seems to be to judge the fit of a theory with respect to the system's macro history. What is the probability of a macro history? A first answer to this question would be simply to use Equation (1) to calculate the probability of a macrostate at each instant of time and then multiply them all, just as we did in the above example with the coins (with the only difference being that the probabilities are now no longer independent, which is accounted for in Equation (1)). This is plain nonsense. There is an uncountable infinity of such probabilities, and multiplying an uncountable infinity of numbers is an ill-defined operation. Determining the probability of a history by multiplying probabilities for individual events in the history works fine as long as the events are discrete (like coin flips), but it fails when we have a continuum. Maybe this was too crude a stab at the problem, and things work out fine once we take the right sorts of limits.
Let us discretise time by dividing the real axis into small intervals of length δ, calculate the probabilities at the instants t_0, t_0 + δ, t_0 + 2δ, etc., multiply them (there are only countably many now), and then take the limit δ → 0. This would work if the p_t(B|Q_t) depended on δ in a way that ensured that the limit exists. This is not the case. In fact, for all t > t_1 (i.e. after the first change of macrostate), the p_t(B|Q_t) do not even exist, because Q_t has measure zero, and this irrespective of δ. This can be seen as follows. Take the above example of a macro history and consider an instant t ∈ (t_1, t_2], when the system is in macrostate M_5. To calculate the probability of the system being in M_5 at t we need to determine Q_t, the set of all microstates in Γ_M5 compatible with the past macro history. Now, these points must be such that they were in M_1 at t_1 and in M_5 just an instant later (i.e. for any ε > 0, at t_1 + ε the system's state is in Γ_M5). The mechanical systems we are considering have global solutions (or at least solutions for the entire interval [t_0, t_f], where t_f is the time when the system ceases to exist), and trajectories in such systems have finite phase velocity; that is, a phase point x in Γ cannot cross a finite distance in no time. From this it follows that the only points satisfying the condition of being in M_1 at t_1 and in M_5 just an instant later are the ones that at t_1 lie exactly on the boundary between M_1 and M_5. But the boundary of a (6n−1)-dimensional region is (6n−2)-dimensional and therefore has measure zero. Therefore Q_t has measure zero for all t > t_1, and accordingly p_t(B|Q_t) does not exist for t > t_1, no matter what B is. Needless to say, this renders the limit δ → 0 moot.

The source of the problem is the conjunction of three elements: (1) the posit that time is continuous, (2) the assumption that the transition from one macrostate to another takes place at a precise instant, and (3) the posit that we conditionalise on the entire macro history of the system. We have to give up at least one of these to obtain non-zero p_t(B|Q_t). The problem is that all three elements either seem reasonable or are deeply entrenched in the theory and cannot be renounced without far-reaching consequences.

The first option, discretising time, would solve the problem because if we assume that time is discrete the macro history is discrete too. If we only consider, say, the events 'being in Γ_M1 at instant τ_1' and 'being in Γ_M5 at instant τ_2', where τ_1 ∈ [t_0, t_1] and τ_2 ∈ (t_1, t_2], sets of finite measure can move from Γ_M1 to Γ_M5, and Q_t no longer needs to have measure zero. The problem with this suggestion is that it is ad hoc and defeats the purpose of SM. If we believe that classical mechanics is the fundamental theory governing the micro constituents of the universe and set out to explain the behaviour of the universe in terms of its laws, not much seems to be gained if such an explanation can only be had at the expense of profoundly modifying these laws.

The second suggestion would be to allow for finite transition times between macrostates, i.e. to allow for there to be periods during which it is indeterminate in which macrostate the system is. Given that states move with finite phase velocity, this would amount to introducing 'transition zones', i.e. 'belts' between the different Γ_Mk consisting of microstates which belong neither to the one nor to the other macrostate.
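The geometric point driving both the measure-zero result and the transition-zone remedy can be checked numerically in a toy setting of my own devising (the two-dimensional 'phase space', the boundary at x = 0.5 and the velocity bound are all illustrative assumptions): the set of points within distance v·ε of a macrostate boundary has measure of order ε, positive for every finite ε but vanishing as ε → 0.

```python
# Hedged toy: Monte Carlo estimate of the measure of the set of points within
# distance V*eps of a macrostate boundary, in a 2-D stand-in for phase space.
# The unit square, the boundary x = 0.5 and the phase-speed bound V are
# illustrative assumptions, not part of Frigg's construction.
import random

random.seed(1)
V = 1.0                      # bound on the phase velocity
N = 100_000                  # sample points in the unit square

def measure_near_boundary(eps):
    # fraction of the square within distance V*eps of the line x = 0.5;
    # only such points can be in M1 at t1 and in M5 at t1 + eps
    # (the second coordinate is irrelevant to the distance, so we sample x only)
    hits = sum(abs(random.random() - 0.5) <= V * eps for _ in range(N))
    return hits / N

for eps in [0.1, 0.01, 0.001]:
    print(eps, measure_near_boundary(eps))   # shrinks roughly like 2*V*eps
```

With ε = 0 only the boundary itself remains, which is the measure-zero set the argument above turns on; with ε > 0 we get exactly the positive-measure 'belts' of the second suggestion.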
The transition-zone suggestion is not without merit, as one could argue that sharp boundaries between macrostates are a mathematical idealisation that is ultimately unjustifiable from a physics perspective. However, giving up the assumption that the Γ_Mk have sharp boundaries and together form a partition of Γ_E would amount to a serious change in SM itself, and it would remain to be seen whether a theory based on this assumption turns out to be workable.

The third option denies that we should conditionalise on the complete macro history. The idea is that even though time at bottom is continuous, the macro history takes record of the system's macrostate only at discrete instants and is oblivious to what happens between them. That is, what we should conditionalise on is the discrete macro history (DMH): M_p at τ_0, M_τ1 at τ_1, M_τ2 at τ_2, ..., M_τ{j-1} at τ_{j-1} and M_τj at τ_j, where t_0 =: τ_0 ≤ τ_1 ≤ ... ≤ τ_{j-1} ≤ τ_j := t_f are finitely many instants of time and M_τi is the system's macrostate at time τ_i. (I assume j to be finite. There is a further problem with infinite sequences (Elga 2004); the difficulties I discuss in this section and the next are independent of that problem, and Elga's solution is available also in the present context.) This solves the problem because, when we conditionalise on a discrete macro history, Q_t no longer necessarily has measure zero.

This is at once the most feasible and the most problematic suggestion. It is feasible because it does not require revisions in the structure of the theory. It is problematic because we have given up the notion that the fit of a theory has to be best with respect to the complete history of the world, and replaced it with the weaker requirement that fit be best for a partial history. (And mind you, the point is not that the fit of the full history is in practice too complicated to calculate and we therefore settle for a more tractable notion; the point is that the fit of a complete macro history is simply not defined, because the relevant conditional probabilities do not exist.) From the point of view of Lewis' theory this seems unmotivated. Fit, like truth, is a semantic concept characterising the relation between the theory and the world, and if the Humean mosaic has continuous events in it there should still be a matter of fact about what the fit of the theory is. Moreover, even if one were willing to believe that a discrete version of fit was satisfactory, it is not clear whether this leads to useful results. Depending on which particular instants of time one chooses to measure fit, one can get widely different results. Conditionalising on a DMH would be useful only if the fit rankings came out the same no matter what choice of instants we make. There is at least a question whether this is the case.

5 The Putative Best System Is Not the Best System

I now assume that a DMH notion of fit can be defended in one way or another. (I make this choice for convenience; the problem that I describe in this section also arises for the other two options.) Then a further problem emerges: the package consisting of Hamiltonian mechanics, PH and PHSP in fact is not the best system. The reason for this is that we can always improve the fit of a system if we choose a measure that, rather than being uniform over Γ_Mp, is somehow peaked over those initial conditions that are compatible with the entire DMH.

Let us make this more precise. The probability of the DMH is p(DMH) = p_τ0(B_τ0|Q_τ0) · ... · p_τ{j-1}(B_τ{j-1}|Q_τ{j-1}), where the B_τi are those subsets of Γ_Mτi that evolve into Γ_Mτ{i+1} under the evolution of the system.
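The original moves directly from this product to the closed-form expression (2) below; the following is my own hedged reconstruction of the intermediate step (the abbreviations A_i and the pull-back bookkeeping are mine), using the measure-preservation argument of Section 3.

```latex
% Hedged reconstruction of the step from the product to Equation (2).
% Pull everything back to t_0 and set
\[
  A_i := \Gamma_p \cap \phi_{-\tau_1}(\Gamma_1) \cap \dots \cap \phi_{-\tau_i}(\Gamma_i),
  \qquad A_0 := \Gamma_p ,
\]
% where \Gamma_p := \Gamma_{M_p} and \Gamma_i := \Gamma_{M_{\tau_i}}. The
% microstates compatible with the DMH up to \tau_i pull back to A_i, and those
% that in addition evolve into \Gamma_{i+1} pull back to A_{i+1}. Since the
% flow preserves the measure, each factor of the product is
\[
  p_{\tau_i}(B_{\tau_i} \mid Q_{\tau_i}) \;=\; \frac{\mu_0(A_{i+1})}{\mu_0(A_i)} ,
\]
% and the product telescopes:
\[
  p(\mathrm{DMH}) \;=\; \prod_{i=0}^{j-1} \frac{\mu_0(A_{i+1})}{\mu_0(A_i)}
  \;=\; \frac{\mu_0(A_j)}{\mu_0(A_0)} \;=\; \mu_0(A_j) ,
\]
% the last step using the normalisation \mu_0(\Gamma_p) = 1.
```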
One thus obtains

p(DMH) = µ_0[Γ_p ∩ φ_{-τ_1}(Γ_1) ∩ ... ∩ φ_{-τ_j}(Γ_j)],    (2)

where Γ_p := Γ_Mp and Γ_i := Γ_Mτi for i = 1, ..., j. Now define N := Γ_p ∩ φ_{-τ_1}(Γ_1) ∩ ... ∩ φ_{-τ_j}(Γ_j). The fit of a system is measured by the probability that it assigns to the actual DMH, which is given by Equation (2). It is a straightforward consequence of this equation that the fit of a system can be improved by replacing µ_0 with a measure µ_P that is peaked over N, i.e. one with µ_P(N) > µ_0(N) and µ_P(Γ_p ∖ N) < µ_0(Γ_p ∖ N) while µ_P(Γ_p) = µ_0(Γ_p). Fit becomes maximal (i.e. p(DMH) = 1) if, for instance, we choose the measure µ_N that assigns all the weight to N and none to Γ_p ∖ N: for any set B ⊆ Γ_p we have µ_N(B) := k µ_0(B ∩ N), where k = 1/µ_0(N) (provided that µ_0(N) ≠ 0). Trivially, N contains the actual initial condition of the universe. A simpler and more convenient distribution that yields maximal fit is a Dirac delta function over the actual initial condition.

If there is such a simple way to improve (and even maximise) fit, why does the demon not provide us with a system comprising µ_N or a delta function? Coming up with such a system is not a problem for the demon, as, by assumption, he knows the entire Humean mosaic, which contains the exact initial condition.

A reason to prefer µ_0 to other measures might be that these make the system less simple and that this loss in simplicity is not compensated by a corresponding gain in fit and strength. This seems implausible. Handling a Dirac delta function rather than µ_0 does not render the system more complicated, while the gain in fit is considerable. Hence simplicity does not seem to provide a reason to prefer µ_0 to other measures that have better fit.

6 Outlook: Epistemic Probabilities After All

The system consisting of Hamiltonian mechanics, PH and PHSP is not the best system, and therefore PHSP probabilities cannot be interpreted as Humean chances. In this section I first want to suggest that the probabilities in this system are best understood as epistemic probabilities of sorts, and then indicate how this view can be defended against some common objections.

Every theory involving probabilities must answer the question of what these probabilities are probabilities for. The initial conditions approach to chance does not seem to have an answer to this question. The universe has exactly one initial micro condition, and there is nothing chancy about this condition. How, then, can we understand a probability distribution over initial conditions? The only answer seems to be that this distribution reflects our ignorance about the system's initial micro condition; all we know is the system's initial macrostate, and so we put a probability distribution over the micro conditions compatible with that macrostate which reflects our lack of knowledge.

How these epistemic probabilities should be understood is a question that I cannot discuss here. Let me just indicate that there are at least two options. The first, a version of objective Bayesianism, appeals to Jaynes' maximum entropy principle, which indeed instructs us to prefer µ_0 to alternative measures because, given the information about the system's macrostate, µ_0 maximises the (continuous) Shannon entropy (see the sketch below). The other alternative is to revise Lewis' account in a way that builds the epistemic limitations of the users of theories into the selection criteria for systems.
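For the first option, here is a hedged sketch of my own (not Frigg's or Jaynes' presentation) of why the uniform measure is the maximum entropy choice, given in the discrete case for simplicity:

```latex
% Hedged sketch (not in the original): among probability assignments
% p_1, ..., p_n to the n microstates compatible with the known macrostate,
% the uniform assignment maximises the Shannon entropy
% H(p) = -\sum_i p_i \log p_i. Introduce a Lagrange multiplier \lambda for
% the constraint \sum_i p_i = 1 and set the derivative to zero:
\[
  \frac{\partial}{\partial p_i}
  \Big( -\sum_k p_k \log p_k - \lambda \big( \sum_k p_k - 1 \big) \Big)
  \;=\; -\log p_i - 1 - \lambda \;=\; 0 ,
\]
% so \log p_i is the same constant for every i, i.e. p_i = 1/n. The continuous
% analogue, with sums replaced by integrals relative to \mu over \Gamma_{M_p},
% singles out the uniform (microcanonical) measure \mu_0 in the same way.
```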
Hoefer's (2007) theory of Humean chance makes room for this second possibility.

There are two main complaints about an epistemic interpretation of SM probabilities. The first points out that the thermodynamic entropy is a property of a physical system and that S_B coincides with it up to a constant; this, so the argument goes, is inexplicable on an epistemic approach to probabilities. (This point is often made in conversation, but I have been unable to locate it in print.) The complaint is mistaken, because S_B is defined in terms of the measure of certain chunks of phase space, and probabilities (no matter how we interpret them) have simply nothing to do with it.

The second complaint concerns the alleged causal efficacy of human knowledge. The point becomes clear in the following – rhetorical – questions by Albert (similar points are made by Redhead (1995, 27-28, 32-33), Loewer (2001, 611), Goldstein (2001, 48), and Meacham (2005, 287-288)):

'Can anybody seriously think that it is somehow necessary, that it is somehow a priori, that the particles that make up the material world must arrange themselves in accord with what we know, with what we happen to have looked into? Can anybody seriously think that our merely being ignorant of the exact microconditions of thermodynamic systems plays some part in bringing it about, in making it the case, that (say) milk dissolves in coffee? How could that be?' (Albert 2000, 64, original emphasis)

It can't be, and no one should think that it could. Proponents of epistemic probabilities need not believe in parapsychology. What underlies this objection is the mistaken view that PHSP probabilities play a part in bringing about things in the world. Of course the cooling down of drinks and the boiling of kettles have nothing to do with what anybody thinks or knows about them; but they have nothing to do with the probabilities attached to these events either. Drinks cool down and kettles boil because the universe's initial condition is such that under the dynamics of the system it evolves into a state in which this happens. All we need to explain why things happen is the initial condition and the dynamics.

Last but not least, the decision to conditionalise on the DMH rather than the full macro history seems to square better with an epistemic approach to probabilities. For these reasons I suggest that we take seriously the option of interpreting SM probabilities epistemically.

Acknowledgements

Thanks to Nancy Cartwright, David Lavis, Carl Hoefer, Barry Loewer, and Charlotte Werndl for helpful discussion and/or comments on earlier drafts. Thanks also to the audiences of the conference 'Time, Chance and Reduction' in Munich and the PSA Meeting in Vancouver for feedback and suggestions.

Bibliography

Albert, David (2000), Time and Chance. Cambridge, MA and London: Harvard University Press.

Callender, Craig (1999), "Reducing Thermodynamics to Statistical Mechanics: The Case of Entropy", Journal of Philosophy 96: 348-373.

Earman, John (2006), "The 'Past Hypothesis': Not Even False", Studies in History and Philosophy of Modern Physics 37: 399-430.

Earman, John and Miklós Rédei (1996), "Why Ergodic Theory Does Not Explain the Success of Equilibrium Statistical Mechanics", British Journal for the Philosophy of Science 47: 63-78.

Elga, Adam (2004), "Infinitesimal Chances and the Laws of Nature", Australasian Journal of Philosophy 82: 67-76.

Frigg, Roman and Carl Hoefer (2007), "Probability in GRW Theory", Studies in History and Philosophy of Science 38: 371-389.
Goldstein, Sheldon (2001), "Boltzmann's Approach to Statistical Mechanics", in Jean Bricmont et al. (eds.), Chance in Physics: Foundations and Perspectives. Berlin and New York: Springer, 39-54.

Hoefer, Carl (2007), "The Third Way on Objective Probability: A Sceptic's Guide to Objective Chance", Mind 116: 549-596.

Lebowitz, Joel (1993), "Macroscopic Laws, Microscopic Dynamics, Time's Arrow and Boltzmann's Entropy", Physica A 194: 1-27.

Lewis, David (1986), "A Subjectivist's Guide to Objective Chance" and "Postscripts to 'A Subjectivist's Guide to Objective Chance'", in David Lewis, Philosophical Papers, Vol. 2. Oxford: Oxford University Press, 83-132.

Lewis, David (1994), "Humean Supervenience Debugged", Mind 103: 473-490.

Loewer, Barry (2001), "Determinism and Chance", Studies in History and Philosophy of Modern Physics 32: 609-629.

Loewer, Barry (2004), "David Lewis' Humean Theory of Objective Chance", Philosophy of Science 71: 1115-1125.

Meacham, Christopher (2005), "Three Proposals Regarding a Theory of Chance", Philosophical Perspectives 19: 281-307.

Redhead, Michael (1995), From Physics to Metaphysics. Cambridge: Cambridge University Press.

van Lith, Janneke (2001), "Ergodic Theory, Interpretations of Probability and the Foundations of Statistical Mechanics", Studies in History and Philosophy of Modern Physics 32: 581-594.