The constant $k$ is known as the Boltzmann constant,

$$k = 1.380 \times 10^{-23}\ \mathrm{J/K}. \qquad \text{(D.3.9)}$$

The value of $k$ is (another wonderful result!) given by $k = R / N_{\text{Avogadro}}$, where $R$ is the universal gas constant, 8.3143 J/(mol-K), and $N_{\text{Avogadro}}$ is Avogadro's number, $6.02 \times 10^{23}$ molecules per mol. Sometimes $k$ is called the gas constant per molecule. With this value for $k$, the statistical definition of entropy is identical with the macroscopic definition of entropy.

1.D.4 Connection between the Statistical Definition of Entropy and Randomness

We need now to examine the behavior of the statistical definition of entropy as regards randomness. Because a uniform probability distribution reflects the largest randomness, a system with $n$ allowed states will have the greatest entropy when each state is equally likely. In this situation, the probabilities become

$$p_i = p = \frac{1}{\Omega}, \qquad \text{(D.4.1)}$$

where $\Omega$ is the total number of microstates. The entropy is thus

$$S = -k \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln\!\left(\frac{1}{\Omega}\right) = -k\, \Omega \left(\frac{1}{\Omega}\right) \ln\!\left(\frac{1}{\Omega}\right) = -k \ln\!\left(\frac{1}{\Omega}\right),$$

$$S = k \ln \Omega. \qquad \text{(D.4.2)}$$

Equation (D.4.2) states that the larger the number of possible states, the larger the entropy. The behavior of the entropy stated in Equation (D.4.2) can be summarized as follows:

a) $S$ is maximum when $\Omega$ is maximum, which means many permitted quantum states, hence much randomness;
b) $S$ is minimum when $\Omega$ is minimum. In particular, for $\Omega = 1$, there is no randomness and $S = 0$.

These trends are in accord with our qualitative ideas concerning randomness. Equation (D.4.2) is carved on Boltzmann's tombstone (he died about a hundred years ago) in Vienna.

We can also examine the additive property of entropy with respect to probabilities. If we have two systems, A and B, which are viewed as a combined system, C, the quantum states for the combined system are the combinations of the quantum states from A and B. The quantum state where A is in its state $x$ and B is in its state $y$ would have a probability $p_{A_x} \cdot p_{B_y}$ because the two probabilities are independent. The number of possible states for the combined system, $\Omega_C$, is thus given by $\Omega_C = \Omega_A \cdot \Omega_B$. The entropy of the combined system is
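The relations above are easy to check numerically. The following sketch (the function name and the particular values of $\Omega$ are illustrative, not from the notes) evaluates the statistical entropy $S = -k \sum_i p_i \ln p_i$ for a uniform distribution and confirms that it reduces to $k \ln \Omega$ as in Equation (D.4.2), and that $\Omega_C = \Omega_A \Omega_B$ makes the entropy additive:

```python
import math

K_B = 1.380e-23  # Boltzmann constant, J/K (value quoted in the text)

def gibbs_entropy(probs):
    """Statistical entropy S = -k * sum(p_i * ln p_i) over a distribution."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over Omega microstates: p_i = 1/Omega  (D.4.1)
omega = 8
uniform = [1.0 / omega] * omega

s_uniform = gibbs_entropy(uniform)
s_boltzmann = K_B * math.log(omega)  # Equation (D.4.2): S = k ln(Omega)
assert math.isclose(s_uniform, s_boltzmann)

# Additivity: Omega_C = Omega_A * Omega_B, so S_C = S_A + S_B
omega_a, omega_b = 4, 6
s_a = K_B * math.log(omega_a)
s_b = K_B * math.log(omega_b)
s_c = K_B * math.log(omega_a * omega_b)
assert math.isclose(s_c, s_a + s_b)
```

The additivity check is just the logarithm identity $\ln(\Omega_A \Omega_B) = \ln \Omega_A + \ln \Omega_B$, which is exactly why the logarithmic form of the entropy makes it an extensive quantity.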