Figure 2.13 The equivalent discrete channel: BSC

Note that for the BSC,
$$
I(X;Y) = H(Y) - H(Y|X) = H(Y) + \sum_x P(x) \sum_y P(y|x)\log P(y|x) = H(Y) - H(p) \le 1 - H(p),
$$
where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ is the binary entropy function. Equality is achieved when $P(Y=0) = P(Y=1) = 1/2$, which can only occur when $P(X=0) = P(X=1) = 1/2$, so the optimum input distribution is uniform. Hence the capacity of a BSC with crossover probability $p$ is
$$
C_{\mathrm{BSC}} = 1 + p\log_2 p + (1-p)\log_2(1-p) \qquad (2.22)
$$

If we use 2-PAM signaling over the AWGN channel, then $p = Q\!\left(\sqrt{2 E_b R_c / N_0}\right)$ (see Section 2.7.2), where $R_c$ is the code rate. Let $R_c = C_{\mathrm{BSC}}$. Substituting $p$ into (2.22), we can obtain the performance curves of hard-decision decoding shown in Fig. 2.15.

For a general discrete memoryless channel (DMC) with $M$ input letters and $L$ output letters, i.e., $\mathcal{A}_X = \{a_1, a_2, \ldots, a_M\}$ and $\mathcal{A}_Y = \{b_1, b_2, \ldots, b_L\}$, its capacity is
$$
C = \max_{P_X} \sum_{j=1}^{M} \sum_{l=1}^{L} P(a_j)\, P(b_l \mid a_j) \log_2 \frac{P(b_l \mid a_j)}{\sum_{i=1}^{M} P(a_i)\, P(b_l \mid a_i)} \quad \text{bits/symbol}
$$

2.6.2 Discrete Inputs and Continuous Outputs

◼ One-Dimensional Constellations

Consider the following AWGN channel
$$
Y = X + N, \qquad N \sim \mathcal{N}(0, \sigma^2) \qquad (2.23)
$$
where the channel input $X$ is restricted to take on values in an alphabet $\mathcal{A}_X = \{a_j \in \mathbb{R},\ 1 \le j \le M\}$, but no restriction is imposed on the channel output, i.e., $\mathcal{A}_Y = (-\infty, +\infty)$. The channel transition probability density function (p.d.f.) is
$$
p_{Y|X}(y \mid x) = p(Y = y \mid X = x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(y-x)^2}{2\sigma^2}\right) \qquad (2.24)
$$
The capacity of this channel is given by
$$
C = \max_{P_X} \sum_{j=1}^{M} P(a_j) \int_{-\infty}^{+\infty} p(y \mid a_j) \log_2 \frac{p(y \mid a_j)}{p(y)} \, dy \qquad (2.25)
$$
and the average signal energy per symbol is $E_s = \sum_{j=1}^{M} P(a_j)\, a_j^2$.
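As a concrete illustration of (2.22) and (2.25), the following minimal Python sketch evaluates the BSC capacity induced by hard-decision 2-PAM and, for comparison, the constellation-constrained capacity of 2-PAM over the unquantized AWGN channel by numerical integration. The function names (binary_entropy, bsc_capacity, pam_awgn_capacity), the unit-energy alphabet {-1, +1}, the equiprobable input distribution, and the sample Es/N0 value are illustrative assumptions, not part of the notes; in particular, the maximization over P_X in (2.25) is skipped and a uniform input is used instead.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm


def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)


def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p, Eq. (2.22)."""
    return 1.0 - binary_entropy(p)


def pam_awgn_capacity(levels, probs, sigma):
    """Constellation-constrained capacity, Eq. (2.25), for a fixed
    one-dimensional alphabet {a_j} and input distribution P(a_j),
    evaluated by numerical integration over the continuous output y."""
    levels = np.asarray(levels, dtype=float)
    probs = np.asarray(probs, dtype=float)

    def p_y(y):
        # p(y) = sum_j P(a_j) * p(y | a_j)
        return np.sum(probs * norm.pdf(y, loc=levels, scale=sigma))

    capacity = 0.0
    for a_j, P_j in zip(levels, probs):
        def integrand(y, a_j=a_j):
            p_cond = norm.pdf(y, loc=a_j, scale=sigma)
            if p_cond <= 0.0:
                return 0.0
            return p_cond * np.log2(p_cond / p_y(y))
        # integral over (-inf, +inf); quad handles infinite limits
        I_j, _ = quad(integrand, -np.inf, np.inf)
        capacity += P_j * I_j
    return capacity


if __name__ == "__main__":
    # Hard decisions on 2-PAM give a BSC with p = Q(sqrt(2*Es/N0));
    # with Es = Rc*Eb this matches p = Q(sqrt(2*Eb*Rc/N0)) in the text.
    EsN0_dB = 3.0                               # example value (assumption)
    EsN0 = 10 ** (EsN0_dB / 10)
    p = norm.sf(np.sqrt(2 * EsN0))              # Q-function
    print(f"p = {p:.4f}, C_BSC = {bsc_capacity(p):.4f} bit/symbol")

    # Equiprobable unit-energy 2-PAM over the unquantized AWGN channel
    sigma = np.sqrt(1.0 / (2.0 * EsN0))         # N0/2 per real dimension
    C = pam_awgn_capacity([-1.0, +1.0], [0.5, 0.5], sigma)
    print(f"Soft-output 2-PAM capacity = {C:.4f} bit/symbol")
```

Comparing the two printed values at the same Es/N0 shows the rate loss incurred by hard-decision (two-level) quantization of the channel output, which is what the curves in Fig. 2.15 illustrate.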