Some Definitions

◼ Codeword: $\mathbf{x}_i = (x_{i1}, \ldots, x_{iN})$, $i \in \{1, 2, \ldots, C\}$
◼ Codebook: $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_C\}$
◼ Error probability: $p_e = \Pr(\hat{i} \neq i)$
◼ Reliable communication

Noisy Channel Coding Theorem

◼ Channel capacity of a discrete memoryless channel
◼ Consider a discrete memoryless channel with input symbol X and output symbol Y. The capacity of the channel is
$$C = \max_{p(x)} I(X; Y)$$

Channel Capacity of a Continuous-Valued Channel

◼ Differential entropy: $h(X) = -\int f(x) \log f(x)\, dx$
◼ Conditional entropy: $h(X \mid Y) = -\iint f_{X,Y}(x, y) \log f_{X \mid Y}(x \mid y)\, dx\, dy$
◼ Mutual information: $I(X; Y) := h(X) - h(X \mid Y)$
◼ Capacity: $C = \max_{f_X:\, \mathbb{E}[X^2] \le P} I(X; Y)$

AWGN Channel

[Figure: block diagram of the passband AWGN channel. Baseband samples $x_I[m]$, $x_Q[m]$ are pulse-shaped with $W \operatorname{sinc}(Wt - n)$ and up-converted by $\sqrt{2}\cos\omega_c t$ and $-\sqrt{2}\sin\omega_c t$ to form $x(t)$; after additive noise $n(t)$, the received $y(t)$ is down-converted, low-pass filtered by $H_L(f)$, and sampled at rate $1/W$ to give $y_I[m]$, $y_Q[m]$.]
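The discrete-memoryless capacity $C = \max_{p(x)} I(X;Y)$ can be computed numerically. As a minimal sketch (not part of the slides), the Blahut–Arimoto iteration below finds the maximizing input distribution for a binary symmetric channel; the function names and the example crossover probability $\varepsilon = 0.1$ are illustrative choices, and the result can be checked against the closed form $C = 1 - H_2(\varepsilon)$.

```python
import numpy as np

def mutual_information_bits(p_x, P):
    """I(X;Y) in bits for input distribution p_x and channel matrix P[x, y] = p(y|x)."""
    p_xy = p_x[:, None] * P                  # joint p(x, y)
    p_y = p_xy.sum(axis=0)                   # output marginal p(y)
    denom = p_x[:, None] * p_y[None, :]      # p(x) p(y)
    mask = p_xy > 0                          # p_xy > 0 implies denom > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / denom[mask])))

def blahut_arimoto_capacity(P, tol=1e-12, max_iter=10_000):
    """Approximate C = max_{p(x)} I(X;Y) by the Blahut-Arimoto iteration."""
    n_x = P.shape[0]
    p_x = np.full(n_x, 1.0 / n_x)            # start from the uniform input
    for _ in range(max_iter):
        joint = p_x[:, None] * P
        q = joint / joint.sum(axis=0, keepdims=True)   # posterior q(x|y)
        # Update rule: p(x) proportional to exp( sum_y p(y|x) ln q(x|y) )
        log_q = np.log(q, where=P > 0, out=np.zeros_like(q))
        r = np.exp((P * log_q).sum(axis=1))
        p_next = r / r.sum()
        if np.max(np.abs(p_next - p_x)) < tol:
            p_x = p_next
            break
        p_x = p_next
    return mutual_information_bits(p_x, P), p_x

# Binary symmetric channel with crossover probability eps
eps = 0.1
P = np.array([[1 - eps, eps],
              [eps, 1 - eps]])
C, p_opt = blahut_arimoto_capacity(P)
# For the BSC the closed form is C = 1 - H2(eps), about 0.531 bits here,
# achieved by the uniform input distribution.
```

For a symmetric channel the uniform input is already optimal, so the iteration converges immediately; for asymmetric channels (e.g. the Z-channel) the same routine finds the skewed optimal input.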
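The continuous-valued definitions can also be sanity-checked numerically. The sketch below (illustrative, not from the slides) integrates $-f \log_2 f$ on a grid for a Gaussian density and compares it with the closed form $h(X) = \tfrac{1}{2}\log_2(2\pi e \sigma^2)$, then evaluates the AWGN capacity per channel use, $C = \tfrac{1}{2}\log_2(1 + P/N)$, which is attained by a Gaussian input under the power constraint $\mathbb{E}[X^2] \le P$; the variance and power values are arbitrary examples.

```python
import numpy as np

def gaussian_pdf(x, var):
    """Zero-mean Gaussian density with variance var."""
    return np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def differential_entropy_bits(pdf, grid):
    """h(X) = -integral of f(x) log2 f(x) dx, approximated on a uniform grid."""
    fx = pdf(grid)
    dx = grid[1] - grid[0]
    # f log f -> 0 as f -> 0, so zero-density points contribute nothing
    integrand = np.where(fx > 0, -fx * np.log2(np.where(fx > 0, fx, 1.0)), 0.0)
    return float(integrand.sum() * dx)

var = 2.0
grid = np.linspace(-40.0, 40.0, 400_001)      # wide enough that the tails are negligible
h_numeric = differential_entropy_bits(lambda x: gaussian_pdf(x, var), var=None) if False else \
            differential_entropy_bits(lambda x: gaussian_pdf(x, var), grid)
h_closed = 0.5 * np.log2(2.0 * np.pi * np.e * var)   # (1/2) log2(2*pi*e*sigma^2)

# AWGN capacity under E[X^2] <= P with noise variance N, in bits per channel use
P_sig, N_var = 4.0, 1.0
C_awgn = 0.5 * np.log2(1.0 + P_sig / N_var)
```

The numerical entropy agrees with the closed form to high precision, which is a useful check that the sign conventions in the definition are being applied correctly.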