The probability distribution provides information on the randomness of the equilibrium quantum states. For example, suppose the system can only exist in three states (1, 2, and 3). If the distribution is $p_1 = 1$, $p_2 = 0$, $p_3 = 0$, the system is in quantum state 1 and there is no randomness. If we were asked what quantum state the system is in, we could say it is always in state 1. If the distribution were $p_1 = 0.3$, $p_2 = 0.2$, $p_3 = 0.5$ or $p_1 = 0.5$, $p_2 = 0.2$, $p_3 = 0.3$, the randomness would not be zero and would be equal in both cases. We would be more uncertain about the instantaneous quantum state than in the first situation. Maximum randomness corresponds to the case where the three states are equally probable: $p_1 = p_2 = p_3 = 1/3$. In this case, we can only guess the instantaneous state with 33 per cent probability.

1.D.3 A Statistical Definition of Entropy

The list of the $p_i$ is a precise description of the randomness in the system, but the number of quantum states in almost any industrial system is so high that this list is not usable. We thus look for a single quantity, a function of the $p_i$, that gives an appropriate measure of the randomness of a system. As shown below, the entropy provides this measure.

There are several attributes that the sought-for function should have. The first is that the average of the function over all of the microstates should be extensive; in other words, the microscopic description of the entropy of a system C, composed of parts A and B, is given by

$$S_C = S_A + S_B. \qquad (D.3.1)$$

Second, the entropy should increase with randomness and should be largest, for a given energy, when all the quantum states are equiprobable.

The average of the function over all the microstates is defined by

$$S = \langle f \rangle = \sum_i p_i \, f(p_i), \qquad (D.3.2)$$

where the function $f(p_i)$ is to be found. Suppose that system A has $n$ microstates and system B has $m$ microstates. The entropies of systems A, B, and C are defined by
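The orderings claimed for the three-state example can be checked numerically. The sketch below is a minimal illustration, assuming the standard choice $f(p_i) = -\ln p_i$ (so that $S = -\sum_i p_i \ln p_i$, in units where the constant multiplying the sum is 1); the section's derivation of $f$ is not completed in this excerpt, so that form is an assumption here, not a result stated above.

```python
import math

def randomness(probs):
    """Measure S = -sum(p_i * ln p_i), assuming f(p) = -ln p.
    Terms with p = 0 contribute zero (the p ln p limit)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# The four distributions discussed in the text:
certain = (1.0, 0.0, 0.0)        # always state 1: no randomness
mixed_a = (0.3, 0.2, 0.5)
mixed_b = (0.5, 0.2, 0.3)        # same probabilities, reassigned to states
uniform = (1/3, 1/3, 1/3)        # equiprobable: maximum randomness

for name, p in [("certain", certain), ("mixed_a", mixed_a),
                ("mixed_b", mixed_b), ("uniform", uniform)]:
    print(f"{name}: S = {randomness(p):.4f}")
```

As the text asserts: the certain distribution gives $S = 0$; the two mixed distributions give equal, nonzero values (relabeling states does not change the randomness); and the uniform distribution gives the largest value, $\ln 3 \approx 1.099$.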