It can be shown that the following properties are true for all $L_A$ above:

1) They have a symmetric p.d.f.: $p(L_A \mid x = -1) = p(-L_A \mid x = +1)$;
2) They satisfy the consistency condition $p(L_A \mid x) = e^{x L_A}\, p(-L_A \mid x)$.

With (4.74), the conditional p.d.f. of $L_A$ can be expressed as
$$
p(L_A \mid x) = \frac{1}{\sqrt{2\pi}\,\sigma_A} \exp\!\left( -\frac{\bigl(L_A - (\sigma_A^2/2)\,x\bigr)^2}{2\sigma_A^2} \right).
$$
To measure the reliability of $L_A$, the mutual information $I_A = I(X; L_A)$ between $X$ and $L_A$ is used, which has proven to be the most accurate and convenient measure.
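The Gaussian model above can be exercised numerically. The following sketch (our illustration; the variable names and sample sizes are our own choices, not from the text) draws a-priori LLRs according to $p(L_A \mid x)$, i.e. $L_A = (\sigma_A^2/2)\,x + n$ with $n \sim \mathcal{N}(0, \sigma_A^2)$, and checks the characteristic mean-variance relation of consistent Gaussian LLRs empirically.

```python
import math
import random

# Draw Gaussian-modelled a-priori LLRs L_A for a transmitted bit x:
#   L_A = (sigma_A^2 / 2) * x + n,   n ~ N(0, sigma_A^2),
# matching the conditional p.d.f. p(L_A | x) given above.
random.seed(0)
sigma_a = 2.0          # assumed a-priori standard deviation (illustrative value)
x = +1                 # transmitted information bit (BPSK mapping: +1 / -1)
n_samples = 200_000

samples = [0.5 * sigma_a**2 * x + random.gauss(0.0, sigma_a)
           for _ in range(n_samples)]

mean = sum(samples) / n_samples
var = sum((s - mean) ** 2 for s in samples) / n_samples

# The sample mean should be near x * sigma_A^2 / 2 = 2.0,
# and the sample variance near sigma_A^2 = 4.0.
print(round(mean, 2), round(var, 2))
```

The fixed mean-to-variance ratio (mean equal to half the variance, up to the sign of $x$) is exactly what the consistency condition enforces for Gaussian LLR densities.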
In the case of Gaussian modelled input LLR values, the mutual information measure, in bits, can be calculated as
$$
\begin{aligned}
I_A &= \sum_{x} P(x) \int_{-\infty}^{\infty} p(L_A \mid x)\, \log_2 \frac{p(L_A \mid x)}{\sum_{x'} P(x')\, p(L_A \mid x')}\, dL_A \\
&= \frac{1}{2} \sum_{x \in \{\pm 1\}} \int_{-\infty}^{\infty} p(L_A \mid X = x)\, \log_2 \frac{2\, p(L_A \mid X = x)}{p(L_A \mid X = +1) + p(L_A \mid X = -1)}\, dL_A \\
&= 1 - \frac{1}{2} \sum_{x \in \{\pm 1\}} \int_{-\infty}^{\infty} p(L_A \mid X = x)\, \log_2 \left( 1 + \frac{p(L_A \mid X = -x)}{p(L_A \mid X = x)} \right) dL_A \\
&= 1 - \int_{-\infty}^{\infty} p(L_A \mid X = +1)\, \log_2 \left( 1 + \frac{p(L_A \mid X = -1)}{p(L_A \mid X = +1)} \right) dL_A \\
&= 1 - \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma_A} \exp\!\left( -\frac{(L_A - \sigma_A^2/2)^2}{2\sigma_A^2} \right) \log_2 \left( 1 + e^{-L_A} \right) dL_A,
\end{aligned}
\tag{4.76}
$$
where the last line follows from the consistency condition and because $p(L_A \mid X = -1) = p(-L_A \mid X = +1)$ for Gaussian densities.

The mutual information $I_E = I(X; L_E)$ between the output extrinsic LLR and the information bit $x$ is more complex to evaluate, since $L_E$ is not exactly Gaussian. The values of $L_E$ corresponding to $L_A$ have to be found by simulating the component decoder. Other than in very simple cases, no closed-form or even analytical formulas for the output extrinsic mutual information exist to date. The output extrinsic information $I_E$ is therefore an empirical function of the component decoder, the input $I_A$, and the operating signal-to-noise ratio, formally given by the EXIT function
$$
I_E = T\!\left(I_A,\; E_b/N_0\right).
$$
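Equation (4.76) has no closed form, but it is straightforward to evaluate numerically; in the EXIT-chart literature the resulting map $\sigma_A \mapsto I_A$ is commonly written $J(\sigma_A)$. The sketch below (our implementation; the midpoint-rule integration and the function name are our choices, not the book's) evaluates the integral and confirms the expected behaviour: $I_A$ grows monotonically from 0 towards 1 as $\sigma_A$ increases.

```python
import math

def j_function(sigma_a, steps=20000, width=10.0):
    """Evaluate I_A = J(sigma_A) from Eq. (4.76) by midpoint-rule integration:
    I_A = 1 - E[log2(1 + exp(-L_A))] with L_A ~ N(sigma_A^2 / 2, sigma_A^2)."""
    if sigma_a == 0.0:
        return 0.0                     # no a-priori information at all
    mu = sigma_a**2 / 2.0
    lo, hi = mu - width * sigma_a, mu + width * sigma_a
    dl = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        l = lo + (i + 0.5) * dl
        pdf = (math.exp(-(l - mu)**2 / (2.0 * sigma_a**2))
               / (math.sqrt(2.0 * math.pi) * sigma_a))
        # log2(1 + e^{-l}) rewritten as max(-l, 0) + log2(1 + e^{-|l|})
        # so it stays numerically stable for large |l|
        integrand = max(-l, 0.0) + math.log1p(math.exp(-abs(l))) / math.log(2.0)
        total += pdf * integrand
    return 1.0 - total * dl

values = [j_function(s) for s in (0.1, 0.5, 1.0, 2.0, 5.0)]
print([round(v, 3) for v in values])   # monotonically increasing, all in (0, 1)
```

In practice such a routine is inverted numerically: a target mutual information $I_A$ is mapped back to a noise standard deviation $\sigma_A$, which is then used to generate the simulated a-priori input $L_A$ fed to the component decoder when measuring its EXIT function $T$.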