as the value of mutual information between the input and output when the input distribution is Q.

We will now show that for every rate R < I(Q), the exponent E_r(R, Q) > 0, and thus for every rate R < I(Q) we can find codes with arbitrarily small probability of error (by taking n large enough). Since this statement holds for every input distribution Q, it also follows that for every rate R < max_Q I(Q) = max_{p_X} I(X; Y) there exist codes of arbitrarily small probability of error. This proves that any rate R below capacity is achievable (the achievability part of the coding theorem).

A tedious but straightforward calculation also yields that

    ∂E_0(ρ, Q)/∂ρ |_{ρ=0} = I(Q).

Using the fact that whenever Q and A are nonnegative, the function t ∈ [0, ∞) ↦ (Σ_x Q(x) A(x)^{1/t})^t is nonincreasing, it is easy to show that E_0(ρ, Q) is nondecreasing in ρ. We do not need it here, but it is also possible to show that E_0(ρ, Q) is a concave function of ρ. Thus, Figure 1 shows the typical behavior of E_0 as a function of ρ.
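These properties are easy to check numerically. Below is a minimal Python sketch (not part of the notes) that assumes the standard Gallager form E_0(ρ, Q) = −ln Σ_y (Σ_x Q(x) P(y|x)^{1/(1+ρ)})^{1+ρ}, which is consistent with E_0(0, Q) = 0 used later in this section; the binary symmetric channel with crossover 0.1, the uniform input Q, and all function names are illustrative assumptions.

    import numpy as np

    # Illustrative check, assuming the standard Gallager form
    # E_0(rho, Q) = -ln sum_y [sum_x Q(x) P(y|x)^(1/(1+rho))]^(1+rho).
    # Example channel: BSC with crossover 0.1, uniform input Q.
    P = np.array([[0.9, 0.1],
                  [0.1, 0.9]])          # P[x, y] = P(y|x)
    Q = np.array([0.5, 0.5])            # input distribution Q(x)

    def E0(rho):
        inner = (Q[:, None] * P ** (1.0 / (1.0 + rho))).sum(axis=0)
        return -np.log((inner ** (1.0 + rho)).sum())

    def mutual_information():
        Py = Q @ P                      # output distribution
        return sum(Q[x] * P[x, y] * np.log(P[x, y] / Py[y])
                   for x in range(2) for y in range(2))   # I(Q) in nats

    rhos = np.linspace(0.0, 1.0, 101)
    vals = np.array([E0(r) for r in rhos])
    print(np.all(np.diff(vals) >= -1e-12))        # E_0 nondecreasing in rho
    print(E0(1e-6) / 1e-6, mutual_information())  # slope at rho = 0 vs I(Q)

The two numbers printed on the last line should agree to several decimals (about 0.368 nats for this channel), illustrating ∂E_0(ρ, Q)/∂ρ |_{ρ=0} = I(Q).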
Observe now that since the function R ↦ E_r(R, Q) is a maximum of the linear functions R ↦ E_0(ρ, Q) − ρR (see Figure 2), we see that E_r(R, Q) > 0 whenever E_0(ρ, Q) − ρR > 0 is satisfied for some ρ ∈ [0, 1].

[Figure 2: E_r(R) as a maximum of linear functions. The lines E_0(ρ, Q) − ρR are sketched for ρ = 1, 1/2, 1/4; their upper envelope is E_r(R, Q), which reaches zero at R = I(Q).]

It now follows that if R < E_0(ρ, Q)/ρ for some ρ ∈ (0, 1] and Q, then E_0(ρ, Q) − ρR > 0 and thus E_r(R, Q) > 0. But

    I(Q) = ∂E_0(ρ, Q)/∂ρ |_{ρ=0} = lim_{ρ→0} (E_0(ρ, Q) − E_0(0, Q))/ρ = lim_{ρ→0} E_0(ρ, Q)/ρ,

where the last equality follows from the fact that E_0(0, Q) = −ln Σ_{x,y} Q(x)P(y|x) = −ln 1 = 0. Thus, by the definition of the limit above, for every R < I(Q) we can find a ρ such that E_0(ρ, Q)/ρ > R. Hence for every R < I(Q) we have E_r(R, Q) > 0. Since the above argument holds for any input distribution Q, it holds in particular for the Q that maximizes I(Q). This proves the existence of codes with arbitrarily small probability of error for every rate less than the capacity max_Q I(Q).

The approach we used here yields the function E_r(R, Q), called the random coding error exponent. To prove the achievability part of the coding theorem it would suffice to show that for any rate R below capacity there exists a sequence of rate-R codes of length n whose error probability tends to zero as n tends to infinity. The random coding error exponent argument not only proves this, but also characterizes the speed of the decay: the average error probability of the random code is at most e^{−n E_r(R, Q)}, which vanishes exponentially as n tends to infinity.
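Continuing the same illustrative sketch (reusing np, P, Q, and E0 from above), one can evaluate the exponent itself, taking E_r(R, Q) = max_{0 ≤ ρ ≤ 1} (E_0(ρ, Q) − ρR) as in Figure 2, and watch the resulting bound e^{−n E_r(R, Q)} decay; the rate R = 0.25 nats is an arbitrary choice below I(Q) ≈ 0.368 nats for this channel.

    def Er(R):
        # E_r(R, Q) as the maximum of the lines E_0(rho, Q) - rho*R,
        # evaluated on a grid over rho in [0, 1] (cf. Figure 2)
        return max(E0(rho) - rho * R for rho in np.linspace(0.0, 1.0, 1001))

    R = 0.25                             # any rate below I(Q) works
    print(Er(R))                         # positive exponent, since R < I(Q)
    for n in (100, 500, 1000):
        print(n, np.exp(-n * Er(R)))     # e^{-n E_r}: exponential decay in n

As R approaches I(Q) the exponent E_r(R, Q) tends to zero (the envelope in Figure 2 meets the axis at I(Q)), so the guaranteed decay, while exponential at every rate below capacity, becomes slower near capacity.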