$$S_A = \sum_{i=1}^{n} p_i\, f(p_i) \qquad \text{(D.3.3a)}$$

$$S_B = \sum_{j=1}^{m} p_j\, f(p_j) \qquad \text{(D.3.3b)}$$

$$S_C = \sum_{i=1}^{n} \sum_{j=1}^{m} p_{ij}\, f(p_{ij}) \qquad \text{(D.3.3c)}$$

In Equations (D.3.2) and (D.3.3), the term $p_{ij}$ means the probability of a microstate in which system A is in state $i$ and system B is in state $j$. For Equation (D.3.1) to hold,

$$S_C = \sum_{i=1}^{n} \sum_{j=1}^{m} p_i p_j\, f(p_i p_j) = \sum_{i=1}^{n} p_i\, f(p_i) + \sum_{j=1}^{m} p_j\, f(p_j) = S_A + S_B. \qquad \text{(D.3.4)}$$

The function $f$ must be such that this is true regardless of the values of the probabilities $p_i$ and $p_j$. This will occur if $f(\ ) = \ln(\ )$, because $\ln(a \cdot b) = \ln(a) + \ln(b)$. Making this substitution, the expression for $S_C$ in Equation (D.3.4) becomes

$$\sum_{i=1}^{n} \sum_{j=1}^{m} p_i p_j \ln(p_i) + \sum_{i=1}^{n} \sum_{j=1}^{m} p_i p_j \ln(p_j). \qquad \text{(D.3.5a)}$$

Rearranging the sums, (D.3.5a) becomes

$$\sum_{i=1}^{n} p_i \ln(p_i) \left( \sum_{j=1}^{m} p_j \right) + \sum_{j=1}^{m} p_j \ln(p_j) \left( \sum_{i=1}^{n} p_i \right). \qquad \text{(D.3.5b)}$$

Because

$$\sum_{i=1}^{n} p_i = \sum_{j=1}^{m} p_j = 1, \qquad \text{(D.3.6)}$$

the left-hand side of Equation (D.3.4) can be written as

$$S_C = \sum_{i=1}^{n} p_i \ln(p_i) + \sum_{j=1}^{m} p_j \ln(p_j). \qquad \text{(D.3.7)}$$

This means that Equation (D.3.4) is satisfied for any $p_i$, $p_j$, $n$, $m$. Reynolds and Perkins show that the most general $f(p_i)$ is $f = C \ln(p_i)$, where $C$ is an arbitrary constant. Because the $p_i$ are less than unity, the constant is chosen to be negative to make the entropy positive. Based on the above, the statistical definition of entropy can be given as

$$S = -k \sum_i p_i \ln(p_i). \qquad \text{(D.3.8)}$$
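The additivity property that drives this derivation can be checked numerically: with $f = \ln$, the entropy of the combined system, built from the joint probabilities $p_{ij} = p_i p_j$ of two independent systems, equals $S_A + S_B$. A minimal sketch in Python (the distributions are illustrative, and the constant $k$ is taken as 1 for the check):

```python
import math

def entropy(probs, k=1.0):
    """Statistical entropy S = -k * sum(p * ln p), as in Eq. (D.3.8)."""
    return -k * sum(p * math.log(p) for p in probs)

# Illustrative distributions for systems A and B (each must sum to 1).
p_A = [0.5, 0.3, 0.2]
p_B = [0.6, 0.4]

# Joint probabilities for independent systems: p_ij = p_i * p_j.
p_C = [pi * pj for pi in p_A for pj in p_B]

S_A, S_B, S_C = entropy(p_A), entropy(p_B), entropy(p_C)

# Additivity, Eq. (D.3.4): S_C = S_A + S_B (up to floating-point error).
assert abs(S_C - (S_A + S_B)) < 1e-12
```

The assertion holds for any valid choice of `p_A` and `p_B` because $\ln(p_i p_j) = \ln p_i + \ln p_j$; replacing `math.log` with any function lacking this property breaks the additivity check, which is exactly why $f$ must be logarithmic.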