
Digitized by the Internet Archive in 2019 with funding from Kahle/Austin Foundation

https://archive.org/details/equilibriumstatiOOOOandr

EQUILIBRIUM STATISTICAL MECHANICS
Second Edition

FRANK C. ANDREWS
Merrill College, University of California, Santa Cruz

A WILEY-INTERSCIENCE PUBLICATION

John Wiley & Sons
New York · London · Sydney · Toronto


Copyright © 1963, 1975, by John Wiley & Sons, Inc.
All rights reserved. Published simultaneously in Canada.
No part of this book may be reproduced by any means, nor transmitted, nor translated into a machine language without the written permission of the publisher.

Library of Congress Cataloging in Publication Data

Andrews, Frank C.
Equilibrium statistical mechanics.
"A Wiley-Interscience publication."
Companion vol. to the author's Thermodynamics: principles and applications.
Bibliography: p. 000
1. Statistical mechanics. I. Title.
QC174.8.A534 1974    530.1'32    74-17197
ISBN 0-471-03123-2

Printed in the United States of America
10 9 8 7 6 5 4 3 2

ERRATA

Page: 15; 15; 78; 98; 108; 120; 130; 132; 139; 141; 161; 161; 161; 161; 193; 241; 243; 243; 244; 245; 245; 245

Line: 3; 4; Prob. 8-2; Eq. 10-18; 3rd from bottom; Table 11-1; Eq. 11-10; ^-vibration; 8th from bottom; Eq. 12-16; 6; Prob. 12-16; 1 of Prob. 14-3; 4 of Prob. 14-3; Prob. 14-3; Prob. 14-5; Prob. 17-1; 2-4; 9-1, line 1; add 9-7; 11-4; 11-25; 12-4, line 1; 12-4, line 3

Should read: .1 + .8 + 2.7 = 12.9; P2 = of − D0; from singly degenerate; Cl atoms are doubly degenerate; At what high temperature...; contribution is less than the; Also prove the low-temperature crossing of these heat capacities to be at 0.952 K; density 6 g/ml; Normalize with all particles in the system; Note: Sometimes this average is calculated using only the particles moving from left to right, which gives twice this result. (Prove that.); insert π after 4; v = ∞


The defining equation 2-2 becomes Eq. 2-44, which in terms of the ensemble is

$$f(x)\,dx = (\text{fraction of members of the ensemble whose value lies between } x \text{ and } x + dx). \tag{2-45}$$

Since each member of the ensemble has only a single value of the property under consideration, the analog of Eq. 2-5 is

$$\text{Probability that the property lies between } x_1 \text{ and } x_2 = \int_{x_1}^{x_2} f(x)\,dx. \tag{2-46}$$

In terms of Fig. 2-1d, this means that the fraction of the cats with ages between n₁ and n₂ is just the area under the curve between those two ages. That is simply the sum (integral) of the areas f(x)dx of all the infinitesimally thin slices between n₁ and n₂. The normalization condition, Eq. 2-7, becomes

$$\int_{-\infty}^{\infty} f(x)\,dx = 1, \tag{2-47}$$

where the integration limits −∞ to ∞ indicate that the integral is over the entire range of values of the property, whatever that range may be. This means that the total area under the curve in Fig. 2-1d (and obviously under Figs. 2-1c and 2-1b as well) must be unity, the probability of a certainty.

It is sometimes convenient to normalize f(x) to something other than unity. If it were properly called a "probability density," the normalization would have to be to unity. Thus the expression "distribution function" is more commonly used with other normalizations. An example could be constructed as follows: out of 1000 cats in the local animal shelter, the number whose age lies between n and n + dn is given by F(n)dn. In this case the value of F(n) is just 1000 times the value of f, the probability density for age, and F is normalized to 1000, which is the total number of cats. If one knows, in analogy with Eq. 2-8, that f(x) is proportional to F(x), the normalized value of f(x) is

$$f(x) = \frac{F(x)}{\displaystyle\int_{-\infty}^{\infty} F(x)\,dx}, \tag{2-48}$$

in analogy with Eq. 2-11.
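As a concrete, purely illustrative case of Eq. 2-48 (not one worked in the text), suppose the unnormalized age distribution at the shelter were F(n) = A e^{−n/τ} for n ≥ 0, with A and τ constants chosen to fit the data. Then

$$\int_0^{\infty} A\,e^{-n/\tau}\,dn = A\tau, \qquad\text{so}\qquad f(n) = \frac{A\,e^{-n/\tau}}{A\tau} = \frac{1}{\tau}\,e^{-n/\tau},$$

and the constant A cancels; whatever F happens to be normalized to, the corresponding f integrates to unity.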


If there are s different properties with continuous values, x, y, z, …, s, described by the ensemble, the ensemble defines not only the simple probability densities f(x), f(y), f(z), …, for each property, it also defines joint and multiple probability densities in analogy with Eq. 2-13. The complete distribution function fₛ(x, y, z, …, s) for the ensemble gives full information about the ensemble. It is defined as follows: fₛ(x, y, z, …, s) dx dy dz ⋯ ds is the fraction of members of the ensemble with the value of the first property between x and x + dx, and the value of the second property between y and y + dy, and so on. This is analogous to Eq. 2-17. The complete distribution function can be reduced to lower-order distribution functions by integrating over all values of the properties in which one is not interested, in analogy to Eqs. 2-18:*

$$f(x) = \int_{-\infty}^{\infty} dy\,dz \cdots ds\; f_s(x, y, z, \ldots, s), \tag{2-49}$$

$$f_2(x, y) = \int_{-\infty}^{\infty} dz \cdots ds\; f_s(x, y, z, \ldots, s), \tag{2-50}$$

$$f(x) = \int_{-\infty}^{\infty} dy\, f_2(x, y). \tag{2-51}$$

*In this book it is common to find each differential immediately behind its integral sign instead of after the integrand. This is useful in multiple integrals in which the different variables have different limits. It emphasizes the variable over which the integration occurs. Also, in multiple integrals, one integral sign is often made to serve for all the differentials that follow it if there is no question of the limits of integration.

If the properties given by x and y are uncorrelated, then the joint probability factors, in analogy with Eq. 2-23:

$$f_2(x, y) = f(x)\,f(y); \qquad \text{(uncorrelated properties)} \tag{2-52}$$

$$f_s(x, y, z, \ldots, s) = f(x)\,f(y)\,f(z) \cdots f(s). \qquad \text{(uncorrelated properties)} \tag{2-53}$$
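A hypothetical two-property case (with illustrative constants τ and σ) ties Eqs. 2-51 and 2-52 together. Suppose the joint density were f₂(x, y) = (1/τσ) e^{−x/τ} e^{−y/σ} for x, y ≥ 0. Integrating over y as in Eq. 2-51 gives

$$f(x) = \int_0^{\infty} dy\, f_2(x, y) = \frac{1}{\tau}\,e^{-x/\tau},$$

and likewise f(y) = (1/σ) e^{−y/σ}, so f₂(x, y) = f(x) f(y); the two properties are uncorrelated, exactly the situation described by Eq. 2-52.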

The ensemble average of a function g of the value x of the property is given by

$$\bar{g} = \int_{-\infty}^{\infty} dx\, f(x)\, g(x), \tag{2-54}$$

in analogy with Eq. 2-28. If g is a function of several properties, its average is given by

$$\bar{g} = \int_{-\infty}^{\infty} dx\,dy\,dz \cdots ds\; f_s(x, y, z, \ldots, s)\, g(x, y, z, \ldots, s), \tag{2-55}$$

in analogy with Eq. 2-29. Expressions for mean deviation, mean square deviation (in particular, Eq. 2-40), root mean square deviation (Eq. 2-41), and various moments about the mean are completely analogous for continuous variables to those found for discrete ones. The median m of a distribution is easily defined with a continuous property to be the solution of either equation

$$\tfrac{1}{2} = \int_{-\infty}^{m} f(x)\,dx; \qquad \tfrac{1}{2} = \int_{m}^{\infty} f(x)\,dx. \tag{2-56}$$
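For the illustrative exponential density f(x) = (1/τ) e^{−x/τ} (x ≥ 0) used above, Eqs. 2-54 and 2-56 give

$$\bar{x} = \int_0^{\infty} dx\,\frac{x}{\tau}\,e^{-x/\tau} = \tau, \qquad \tfrac{1}{2} = \int_0^{m}\frac{1}{\tau}\,e^{-x/\tau}\,dx = 1 - e^{-m/\tau} \;\Rightarrow\; m = \tau\ln 2 \approx 0.69\,\tau.$$

The median lies below the mean here because this particular density has a long tail toward large x.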

CHANGE OF VARIABLE

Suppose we know how f(x) depends on x; thus we know the probability that the value of the property lies between x and x + dx as a function of x. Perhaps, however, we are really interested in the probability g(y)dy that the value of some function y(x) of x lies between y and y + dy. The problem is to relate g(y) to f(x). This is done immediately when we note that

$$f(x)\,dx = g(y)\,dy, \tag{2-57}$$

so long as the interval dy is not independent but is that which accompanies the change dx in x. In other words, Eq. 2-57 is valid so long as

$$dy = \frac{dy}{dx}\,dx, \tag{2-58}$$

which means Eq. 2-57 becomes

$$f(x)\,dx = g(y)\,\frac{dy}{dx}\,dx, \tag{2-59}$$

which must be valid for all infinitesimal dx's. Thus the desired relationship between f(x) and g(y) is

$$f(x) = g[y(x)]\,\frac{dy}{dx} \qquad\text{or}\qquad g(y) = f[x(y)]\,\frac{dx}{dy}. \tag{2-60}$$

The notation g[y(x)] means the function g(y) with y replaced by its value y(x) in terms of x.
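A quick hypothetical application of Eq. 2-60 (not one of the text's numbered examples): let f(x) = (1/τ) e^{−x/τ} for x ≥ 0, and let the quantity of interest be y = x², so that x(y) = √y and dx/dy = 1/(2√y). Then

$$g(y) = f[x(y)]\,\frac{dx}{dy} = \frac{1}{2\tau\sqrt{y}}\,e^{-\sqrt{y}/\tau}, \qquad y \ge 0,$$

and substituting y = x² back into ∫₀^∞ g(y) dy recovers ∫₀^∞ f(x) dx = 1, as it must.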


Example 2-1. Suppose the probability amplitude for values of the x-component of velocity of a molecule, vₓ, is given by

$$f(v_x) = \left(\frac{m}{2\pi kT}\right)^{1/2} e^{-mv_x^2/2kT}. \tag{2-61}$$

What is the definition of f(vₓ)? What is the most probable value of vₓ? What is the value of g(pₓ), the probability amplitude for the x-component of momentum (pₓ = mvₓ)? What is the definition of g(pₓ)? What is the value of h(Kₓ), the probability amplitude for kinetic energy due to motion in the x-direction (Kₓ = ½mvₓ²)? What is the definition of h(Kₓ)? The quantity