
Massachusetts Institute of Technology: Thermal Energy, 06, Part 1D



1.D: Interpretation of Entropy on the Microscopic Scale - The Connection between Randomness and Entropy

1.D.1 Entropy Change in Mixing of Two Ideal Gases

Consider an insulated rigid container of gas separated into two halves by a heat-conducting partition, so the temperature of the gas in each part is the same. One side contains air, the other side another gas, say argon, both regarded as ideal gases. The mass of gas in each side is such that the pressure is also the same. The entropy of this system is the sum of the entropies of the two parts:

S_system = S_air + S_argon.

Suppose the partition is taken away so the gases are free to diffuse throughout the volume. For an ideal gas, the energy is not a function of volume, and, for each gas, there is no change in temperature. (The energy of the overall system is unchanged, the two gases were at the same temperature initially, so the final temperature is the same as the initial temperature.) The entropy change of each gas is thus the same as that for a reversible isothermal expansion from the initial specific volume v_i to the final specific volume v_f. For a mass m of ideal gas, the entropy change is ΔS = m R ln(v_f / v_i). The entropy change of the system is

ΔS_system = ΔS_air + ΔS_argon = m_air R_air ln[(v_f)_air / (v_i)_air] + m_argon R_argon ln[(v_f)_argon / (v_i)_argon].   (D.1.1)

Equation (D.1.1) states that there is an entropy increase due to the increased volume that each gas is able to access.

Examining the mixing process on a molecular level gives additional insight. Suppose we were able to see the gas molecules in different colors, say the air molecules as white and the argon molecules as red. After we took the partition away, we would see white molecules start to move into the red region and, similarly, red molecules start to come into the white volume. As we watched, as the gases mixed, there would be more and more of the different color molecules in the regions that were initially all white and all red. If we moved further away so we could no longer pick out individual molecules, we would see the growth of pink regions spreading into the initially red and white areas. In the final state, we would expect a uniform pink gas to exist throughout the volume. There might be occasional small regions which were slightly more red or slightly more white, but these fluctuations would only last for a time on the order of several molecular collisions.

In terms of the overall spatial distribution of the molecules, we would say this final state was more random, more mixed, than the initial state in which the red and white molecules were confined to specific regions. Another way to say this is in terms of “disorder”; there is more disorder in the final state than in the initial state. One view of entropy is thus that increases in entropy are connected with increases in randomness or disorder. This link can be made rigorous and is extremely useful in describing systems on a microscopic basis. While we do not have scope to examine this topic in depth, the purpose of Section 1.D is to make plausible the link between disorder and entropy through a statistical definition of entropy.
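As a numerical check on Equation (D.1.1), here is a minimal Python sketch of the mixing calculation. The masses, molar masses, and the factor-of-two volume ratio are assumptions made for illustration (equal half-volumes at the same initial pressure and temperature), not values taken from the notes.

```python
import math

# Minimal sketch of Equation (D.1.1). All numbers below are assumed for
# illustration: equal half-volumes at the same initial pressure and
# temperature, so each gas doubles its specific volume when the partition
# is removed (v_f / v_i = 2 for both gases).
R_universal = 8.3143                 # J/(mol K)
R_air = R_universal / 0.02897        # specific gas constant of air, ~287 J/(kg K)
R_argon = R_universal / 0.03995      # specific gas constant of argon, ~208 J/(kg K)

m_air = 0.5     # kg, assumed mass of air in one half
m_argon = 0.69  # kg, chosen so the initial pressure and temperature match

volume_ratio = 2.0                   # v_f / v_i for each gas

dS_air = m_air * R_air * math.log(volume_ratio)
dS_argon = m_argon * R_argon * math.log(volume_ratio)
dS_system = dS_air + dS_argon        # Equation (D.1.1)
print(f"Entropy increase on mixing: {dS_system:.1f} J/K")  # positive, ~199 J/K
```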


1.D.2 Microscopic and Macroscopic Descriptions of a System

The microscopic description of a system is the complete description of each particle in the system. In the above example, the microscopic description of the gas would be the list of the state of each molecule: position and velocity in this problem. It would require a great deal of data for this description; there are roughly 10^19 molecules in a cube of air one centimeter on a side at room temperature and pressure. The macroscopic description, which is in terms of a few (two!) properties, is thus far more accessible and usable for engineering applications, although it is restricted to equilibrium states.

To address the description of entropy on a microscopic level, we need to state some results concerning microscopic systems. These, and the computations and arguments below, are taken almost entirely from the excellent discussion in Chapter 6 of Engineering Thermodynamics by Reynolds and Perkins (1977)*.

* Reynolds, W.C., and Perkins, H.C., Engineering Thermodynamics, McGraw-Hill Book Co., 1977.

For a given macroscopic system, there are many microscopic states. A key idea from quantum mechanics is that the states of atoms, molecules, and entire systems are discretely quantized. This means that a system of particles under certain constraints, like being in a box of a specified size, or having a fixed total energy, can exist in a finite number of allowed microscopic states. This number can be very big, but it is finite.

The microstates of the system keep changing with time from one quantum state to another as molecules move and collide with one another. The probability for the system to be in a particular quantum state is defined by its quantum-state probability p_i. The set of the p_i is called the distribution of probability. The sum of the probabilities of all the allowed quantum states must be unity, hence for any time t,

Σ_i p_i = 1.   (D.2.1)

When the system reaches equilibrium, the individual molecules still change from one quantum state to another. In equilibrium, however, the system state does not change with time, so the probabilities for the different quantum states are independent of time. This distribution is then called the equilibrium distribution, and the probability p_i can be viewed as the fraction of time a system spends in the i-th quantum state. In what follows, we limit consideration to equilibrium states.

We can get back to macroscopic quantities from the microscopic description using the probability distribution. For instance, the macroscopic energy of the system would be the weighted average of the successive energies of the system (the energies of the quantum states), weighted by the relative time the system spends in the corresponding microstates. In terms of probabilities, the average energy, ⟨E⟩, is

⟨E⟩ = Σ_i p_i ε_i,  where ε_i is the energy of quantum state i.   (D.2.2)
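To make Equations (D.2.1) and (D.2.2) concrete, the sketch below evaluates the average energy for a hypothetical three-state system; the probabilities and energy levels are invented purely for illustration.

```python
import numpy as np

# Hypothetical three-state system; probabilities and energies are assumed
# purely for illustration.
p = np.array([0.5, 0.3, 0.2])    # quantum-state probabilities p_i
eps = np.array([1.0, 2.0, 5.0])  # state energies epsilon_i (arbitrary units)

assert abs(p.sum() - 1.0) < 1e-9  # Equation (D.2.1): probabilities sum to unity

E_avg = np.sum(p * eps)           # Equation (D.2.2): <E> = sum_i p_i * eps_i
print(E_avg)                      # ~2.1
```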


The probability distribution provides information on the randomness of the equilibrium quantum states. For example, suppose the system can only exist in three states (1, 2, and 3). If the distribution probability is

p_1 = 1, p_2 = 0, p_3 = 0,

the system is in quantum state 1 and there is no randomness. If we were asked what quantum state the system is in, we would be able to say it is always in state 1. If the distribution were

p_1 = 0.3, p_2 = 0.2, p_3 = 0.5  or  p_1 = 0.5, p_2 = 0.2, p_3 = 0.3,

the randomness would not be zero and would be equal in both cases. We would be more uncertain about the instantaneous quantum state than in the first situation. Maximum randomness corresponds to the case where the three states are equally probable:

p_1 = 1/3, p_2 = 1/3, p_3 = 1/3.

In this case, we can only guess the instantaneous state with 33 per cent probability.

1.D.3 A Statistical Definition of Entropy

The list of the p_i is a precise description of the randomness in the system, but the number of quantum states in almost any industrial system is so high that this list is not usable. We thus look for a single quantity, which is a function of the p_i, that gives an appropriate measure of the randomness of a system. As shown below, the entropy provides this measure.

There are several attributes that the sought-for function should have. The first is that the average of the function over all of the microstates should have an extensive behavior; in other words, the microscopic description of the entropy of a system C, composed of parts A and B, is given by

S_C = S_A + S_B.   (D.3.1)

Second is that entropy should increase with randomness and should be largest, for a given energy, when all the quantum states are equiprobable.

The average of the function over all the microstates is defined by

S = ⟨f⟩ = Σ_i p_i f(p_i),   (D.3.2)

where the function f(p_i) is to be found. Suppose that system A has n microstates and system B has m microstates. The entropies of systems A, B, and C are defined by


S_A = Σ_{i=1}^{n} p_i f(p_i)   (D.3.3a)

S_B = Σ_{j=1}^{m} p_j f(p_j)   (D.3.3b)

S_C = Σ_{i=1}^{n} Σ_{j=1}^{m} p_ij f(p_ij)   (D.3.3c)

In Equations (D.3.2) and (D.3.3), the term p_ij means the probability of a microstate in which system A is in state i and system B is in state j. Since the two subsystems are independent, p_ij = p_i p_j, so for Equation (D.3.1) to hold,

S_C = Σ_{i=1}^{n} Σ_{j=1}^{m} p_i p_j f(p_i p_j) = Σ_{i=1}^{n} p_i f(p_i) + Σ_{j=1}^{m} p_j f(p_j) = S_A + S_B.   (D.3.4)

The function f must be such that this is true regardless of the values of the probabilities p_i and p_j. This will occur if f( ) = ln( ) because ln(a·b) = ln(a) + ln(b). Making this substitution, the expression for S_C in Equation (D.3.4) becomes

Σ_{i=1}^{n} Σ_{j=1}^{m} p_i p_j ln(p_i) + Σ_{i=1}^{n} Σ_{j=1}^{m} p_i p_j ln(p_j).   (D.3.5a)

Rearranging the sums, (D.3.5a) becomes

[Σ_{i=1}^{n} p_i ln(p_i)] [Σ_{j=1}^{m} p_j] + [Σ_{j=1}^{m} p_j ln(p_j)] [Σ_{i=1}^{n} p_i].   (D.3.5b)

Because

Σ_{i=1}^{n} p_i = Σ_{j=1}^{m} p_j = 1,   (D.3.6)

the left hand side of Equation (D.3.4) can be written as

S_C = Σ_{i=1}^{n} p_i ln(p_i) + Σ_{j=1}^{m} p_j ln(p_j).   (D.3.7)

This means that Equation (D.3.4) is satisfied for any p_i, p_j, n, m. Reynolds and Perkins show that the most general f(p_i) is f = C ln(p_i), where C is an arbitrary constant. Because the p_i are less than unity, the constant is chosen to be negative to make the entropy positive. Based on the above, the statistical definition of entropy can be given as

S = -k Σ_i p_i ln p_i.   (D.3.8)
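The following sketch implements Equation (D.3.8) and verifies numerically that the logarithmic choice of f makes the entropy additive over independent subsystems, as required by Equation (D.3.1). The two small probability distributions are assumed for illustration.

```python
import numpy as np

k = 1.380e-23  # Boltzmann constant, J/K

def entropy(p):
    """Statistical entropy S = -k * sum_i p_i ln p_i (Equation D.3.8).
    States with zero probability contribute nothing (x ln x -> 0 as x -> 0)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -k * np.sum(p[nz] * np.log(p[nz]))

# Assumed probability distributions for two independent subsystems A and B.
p_A = np.array([0.7, 0.3])
p_B = np.array([0.5, 0.25, 0.25])

# Combined system C: p_ij = p_i * p_j because the subsystems are independent.
p_C = np.outer(p_A, p_B).ravel()

S_A, S_B, S_C = entropy(p_A), entropy(p_B), entropy(p_C)
# atol=0 because the entropies are of order 1e-23 J/K; compare relatively.
print(np.isclose(S_C, S_A + S_B, rtol=1e-12, atol=0.0))  # True: Equation (D.3.1)
```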


The constant k is known as the Boltzmann constant,

k = 1.380 × 10^-23 J/K.   (D.3.9)

The value of k is (another wonderful result!) given by k = R / N_Avogadro, where R is the universal gas constant, 8.3143 J/(mol-K), and N_Avogadro is Avogadro's number, 6.02 × 10^23 molecules per mol. Sometimes k is called the gas constant per molecule. With this value for k, the statistical definition of entropy is identical with the macroscopic definition of entropy.

1.D.4 Connection between the Statistical Definition of Entropy and Randomness

We need now to examine the behavior of the statistical definition of entropy as regards randomness. Because a uniform probability distribution reflects the largest randomness, a system with Ω allowed states will have the greatest entropy when each state is equally likely. In this situation, the probabilities become

p_i = p = 1/Ω,   (D.4.1)

where Ω is the total number of microstates. The entropy is thus

S = -k Σ_{i=1}^{Ω} (1/Ω) ln(1/Ω) = -k Ω (1/Ω) ln(1/Ω) = -k ln(1/Ω), that is,

S = k ln Ω.   (D.4.2)

Equation (D.4.2) states that the larger the number of possible states, the larger the entropy.

The behavior of the entropy stated in Equation (D.4.2) can be summarized as follows:
a) S is maximum when Ω is maximum, which means many permitted quantum states, hence much randomness;
b) S is minimum when Ω is minimum. In particular, for Ω = 1, there is no randomness and S = 0.

These trends are in accord with our qualitative ideas concerning randomness. Equation (D.4.2) is carved on Boltzmann's tombstone (he died about a hundred years ago) in Vienna.
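As a quick consistency check, for an assumed number of equally likely microstates the general formula (D.3.8) reduces, to within round-off, to S = k ln Ω of Equation (D.4.2):

```python
import numpy as np

k = 1.380e-23  # Boltzmann constant, J/K

Omega = 10**6                      # assumed number of equally likely microstates
p = np.full(Omega, 1.0 / Omega)    # p_i = 1/Omega, Equation (D.4.1)

S_general = -k * np.sum(p * np.log(p))  # Equation (D.3.8)
S_boltzmann = k * np.log(Omega)         # Equation (D.4.2)
# atol=0 because both values are of order 1e-22 J/K; compare relatively.
print(np.isclose(S_general, S_boltzmann, rtol=1e-9, atol=0.0))  # True
```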


We can also examine the additive property of entropy with respect to probabilities. If we have two systems, A and B, which are viewed as a combined system, C, the quantum states for the combined system are the combinations of the quantum states from A and B. The quantum state where A is in its state x and B in its state y would have a probability p_Ax · p_By, because the two probabilities are independent. The number of quantum states for the combined system, Ω_C, is thus given by Ω_C = Ω_A · Ω_B. The entropy of the combined system is

S_C = k ln(Ω_A · Ω_B) = k ln Ω_A + k ln Ω_B = S_A + S_B.   (D.4.3)

Equation (D.4.2) is sometimes taken as the basic definition of entropy, but it should be remembered that it is only appropriate when each quantum state is equally likely. Equation (D.3.8) is more general and applies equally for equilibrium and non-equilibrium situations.

A simple numerical example shows trends in entropy changes and randomness for a system which can exist in three states. Consider the five probability distributions:

i)   p_1 = 1.0, p_2 = 0,   p_3 = 0;    S = -k [1 ln(1) + 0 ln(0) + 0 ln(0)] = 0
ii)  p_1 = 0.8, p_2 = 0.2, p_3 = 0;    S = -k [0.8 ln(0.8) + 0.2 ln(0.2) + 0 ln(0)] = 0.500 k
iii) p_1 = 0.8, p_2 = 0.1, p_3 = 0.1;  S = -k [0.8 ln(0.8) + 0.1 ln(0.1) + 0.1 ln(0.1)] = 0.639 k
iv)  p_1 = 0.5, p_2 = 0.3, p_3 = 0.2;  S = -k [0.5 ln(0.5) + 0.3 ln(0.3) + 0.2 ln(0.2)] = 1.03 k
v)   p_1 = 1/3, p_2 = 1/3, p_3 = 1/3;  S = -3k [(1/3) ln(1/3)] = 1.099 k

(In evaluating the terms with zero probability, the limit x ln x → 0 as x → 0 is used.)

The first distribution has no randomness. For the second, we know that state 3 is never found. States (iii) and (iv) have progressively greater uncertainty about the distribution of states and thus higher randomness. State (v) has the greatest randomness and uncertainty and also the largest entropy.
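The five entropies tabulated above follow directly from Equation (D.3.8); this short sketch reproduces them (in units of k) and confirms that the entropy grows with the randomness of the distribution.

```python
import numpy as np

def entropy_over_k(p):
    """S/k = -sum_i p_i ln p_i, taking the 0 ln 0 terms as zero."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz])) + 0.0  # +0.0 normalizes -0.0 to 0.0

distributions = {
    "i":   [1.0, 0.0, 0.0],
    "ii":  [0.8, 0.2, 0.0],
    "iii": [0.8, 0.1, 0.1],
    "iv":  [0.5, 0.3, 0.2],
    "v":   [1/3, 1/3, 1/3],
}

for name, p in distributions.items():
    print(f"{name:>3}: S = {entropy_over_k(p):.3f} k")
# Prints 0.000, 0.500, 0.639, 1.030, 1.099 -- entropy rises with randomness.
```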


1.D.5 Numerical Example of the Approach to the Equilibrium Distribution

Reynolds and Perkins give a numerical example which illustrates the above concepts and also the tendency of a closed, isolated system to tend to equilibrium.

Reynolds and Perkins, Engineering Thermodynamics, McGraw-Hill, 1977, Sec. 6.7, pp. 177-183.

1.D.6 Summary and Conclusions

a) Entropy as defined from a microscopic point of view is a measure of randomness in a system.

b) The entropy is related to the probabilities p_i of the individual quantum states of the system by

S = -k Σ_i p_i ln p_i,

where k, the Boltzmann constant, is given by k = R / N_Avogadro.

c) For a system in which there are Ω quantum states, all of which are equally probable (for which the probability is p_i = 1/Ω), the entropy is given by

S = k ln Ω.

The more quantum states, the more the randomness and uncertainty that a system is in a particular quantum state.

d) From the statistical point of view there is a finite, but exceedingly small, possibility that a system that is well mixed could suddenly "unmix" and that all the air molecules in the room could suddenly come to the front half of the room. The unlikelihood of this is well described by Denbigh [Principles of Chemical Equilibrium, 1981] in a discussion of the behavior of an isolated system:

"In the case of systems containing an appreciable number of atoms, it becomes increasingly improbable that we shall ever observe the system in a non-uniform condition. For example, it is calculated that the probability of a relative change of density, Δρ/ρ, of only 0.001% in 1 cm^3 of air is smaller than 10^(-10^8) and would not be observed in trillions of years. Thus, according to the statistical interpretation, the discovery of an appreciable and spontaneous decrease in the entropy of an isolated system, if it is separated into two parts, is not impossible, but exceedingly improbable. We repeat, however, that it is an absolute impossibility to know when it will take place."


e) The definition of entropy in the form S = -k Σ_i p_i ln p_i arises in other aerospace fields, notably that of information theory. In this context, the constant k is taken as unity and the entropy becomes a dimensionless measure of the uncertainty represented by a particular message (a short sketch follows at the end of this section). There is no underlying physical connection with thermodynamic entropy, but the underlying uncertainty concepts are the same.

f) The presentation of entropy in this subject is focused on the connection to macroscopic variables and behavior. These involve the definition of entropy given in Section 1.B of the notes and the physical link with lost work, neither of which makes any mention of molecular (microscopic) behavior. The approach in other sections of the notes is only connected to these macroscopic processes and does not rely at all upon the microscopic viewpoint. Exposure to the statistical definition of entropy, however, is helpful as another way not only to answer the question of "What is entropy?" but also to see the depth of this fundamental concept and the connection with other areas of technology.
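To illustrate point (e), here is a hypothetical example with k taken as unity and, as is conventional in information theory, a base-2 logarithm; the symbol probabilities are assumed purely for illustration.

```python
import math

def message_entropy(p, base=2):
    """Dimensionless entropy -sum_i p_i log(p_i), with k taken as unity."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Hypothetical four-symbol alphabet with assumed symbol frequencies.
p = [0.5, 0.25, 0.125, 0.125]
print(message_entropy(p))  # 1.75 bits per symbol (base-2 logarithm)
```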
