Given that $\mathbf{y}$ is received, the conditional error probability of decoding is defined as

    $P(E \mid \mathbf{y}) = P(\hat{\mathbf{c}} \neq \mathbf{c} \mid \mathbf{y})$

Then the error probability is

    $P(E) = \sum_{\mathbf{y}} P(E \mid \mathbf{y}) P(\mathbf{y})$

A decoding rule that minimizes $P(E)$ is referred to as an optimal decoding rule. Since minimizing $P(\hat{\mathbf{c}} \neq \mathbf{c} \mid \mathbf{y})$ is equivalent to maximizing $P(\hat{\mathbf{c}} = \mathbf{c} \mid \mathbf{y})$, we have the

MAP rule: $\hat{\mathbf{c}} = \arg\max_{\mathbf{c}} P(\mathbf{c} \mid \mathbf{y})$

Maximum-likelihood decoding (MLD): Note that $P(\mathbf{c} \mid \mathbf{y}) = \dfrac{P(\mathbf{y} \mid \mathbf{c}) P(\mathbf{c})}{P(\mathbf{y})}$, so we have the

ML rule: $\hat{\mathbf{c}} = \arg\max_{\mathbf{c}} P(\mathbf{y} \mid \mathbf{c})$ (suppose all the messages are equally likely)

6. MLD for a BSC

In coding for a BSC, every codeword and every received word are binary sequences.
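The ML rule above can be sketched as a brute-force search over the codebook. This is a minimal illustration, not from the notes: the names `ml_decode` and `bsc_likelihood`, the crossover probability p = 0.1, and the length-3 repetition code are all assumptions chosen for the example.

```python
def bsc_likelihood(y, c, p=0.1):
    """P(y|c) for a BSC with crossover probability p (assumed value)."""
    d = sum(yi != ci for yi, ci in zip(y, c))  # Hamming distance d_H(y, c)
    n = len(y)
    return (p ** d) * ((1 - p) ** (n - d))

def ml_decode(y, codebook, likelihood):
    """ML rule: return the codeword c maximizing P(y|c).

    Assumes all messages are equally likely, so MAP reduces to ML.
    """
    return max(codebook, key=lambda c: likelihood(y, c))

# Length-3 repetition code; the received word has one bit flipped.
codebook = [(0, 0, 0), (1, 1, 1)]
y = (0, 1, 0)
print(ml_decode(y, codebook, bsc_likelihood))  # -> (0, 0, 0)
```

Because p < 1/2, the codeword with the smaller Hamming distance to y always has the larger likelihood, which is exactly the equivalence developed in the BSC section.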
Suppose some codeword is transmitted and the received word is $\mathbf{y} = (y_1, \dots, y_n)$. For a codeword $\mathbf{c}_i$, the conditional probability $P(\mathbf{y} \mid \mathbf{c}_i)$ is

    $P(\mathbf{y} \mid \mathbf{c}_i) = p^{\,d_H(\mathbf{y}, \mathbf{c}_i)} (1-p)^{\,n - d_H(\mathbf{y}, \mathbf{c}_i)}$

For $p < 1/2$, $P(\mathbf{y} \mid \mathbf{c}_i)$ is a monotonically decreasing function of $d_H(\mathbf{y}, \mathbf{c}_i)$. Then $P(\mathbf{y} \mid \mathbf{c}_i) > P(\mathbf{y} \mid \mathbf{c}_j)$ iff $d_H(\mathbf{y}, \mathbf{c}_i) < d_H(\mathbf{y}, \mathbf{c}_j)$.

MLD:
1) Compute $d_H(\mathbf{y}, \mathbf{c}_i)$ for all $\mathbf{c}_i \in C$.
2) $\mathbf{c}_i$ is taken as the transmitted codeword if $d_H(\mathbf{y}, \mathbf{c}_i) < d_H(\mathbf{y}, \mathbf{c}_j)$ for all $j \neq i$.
3) Decode $\mathbf{c}_i$ into the message $\mathbf{u}_i$.
This is called minimum distance (nearest neighbor) decoding.

7. Performance measure and coding gain

Block-error probability: the probability that a decoded word is in error.
Bit-error probability: the probability that a decoded bit is in error.
The usual figure of merit for a communication system is the ratio of energy per information bit to noise power spectral density, $E_b/N_0$, that is required to achieve an error probability of $10^{-10}$.
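Steps 1)-3) can be sketched directly. The function names and the small [4,2] binary code below are illustrative assumptions, not part of the notes; ties (equal distances to two codewords) are broken arbitrarily here, whereas step 2) strictly requires a unique closest codeword.

```python
def hamming_distance(y, c):
    """d_H(y, c): number of positions where the two words differ."""
    return sum(yi != ci for yi, ci in zip(y, c))

def min_distance_decode(y, codebook):
    """Minimum distance (nearest neighbor) decoding:
    1) compute d_H(y, c_i) for every c_i in C,
    2) pick the codeword closest to y.
    """
    return min(codebook, key=lambda c: hamming_distance(y, c))

# A small [4,2] linear code (illustrative); y is (0,1,1,1) with one bit flipped.
codebook = [(0, 0, 0, 0), (0, 1, 1, 1), (1, 0, 1, 0), (1, 1, 0, 1)]
y = (0, 1, 1, 0)
print(min_distance_decode(y, codebook))  # -> (0, 1, 1, 1)
```

For a BSC with p < 1/2 this returns the same answer as maximizing $P(\mathbf{y} \mid \mathbf{c}_i)$, since the likelihood decreases monotonically in the Hamming distance.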