2.4.2 Mutual information

Mutual information is a measure of the amount of information that one r.v. contains about another r.v. It is the reduction in the uncertainty of one r.v. due to the knowledge of the other.

Definition 2.4.2. Consider two random variables X and Y with joint pmf P(x, y) and marginal pmfs P(x) and P(y). The (average) mutual information I(X; Y) between X and Y is the relative entropy between P(x, y) and P(x)P(y), i.e.,

\begin{align}
I(X;Y) &= D(P(x,y) \,\|\, P(x)P(y)) \nonumber\\
       &= \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x,y) \log \frac{P(x,y)}{P(x)P(y)} \nonumber\\
       &= E_{P(x,y)}\!\left[ \log \frac{P(x,y)}{P(x)P(y)} \right] \nonumber\\
       &= \sum_{x} \sum_{y} P(x,y) \log \frac{P(x|y)}{P(x)} \tag{2.16}
\end{align}

Properties of mutual information:

1. Non-negativity of mutual information: I(X;Y) \ge 0, with equality iff X and Y are independent.

   Proof. I(X;Y) = D(P(x,y) \,\|\, P(x)P(y)) \ge 0, with equality iff P(x,y) = P(x)P(y).

2. Symmetry of mutual information: I(X;Y) = I(Y;X).

3. \begin{align}
   I(X;Y) &= \sum_{x} \sum_{y} P(x,y) \log \frac{P(x|y)}{P(x)} \nonumber\\
          &= -\sum_{x,y} P(x,y) \log P(x) - \left( -\sum_{x,y} P(x,y) \log P(x|y) \right) \nonumber\\
          &= H(X) - H(X|Y) \tag{2.17}
   \end{align}

   By symmetry, it also follows that I(X;Y) = H(Y) - H(Y|X). Since H(XY) = H(X) + H(Y|X), we have I(X;Y) = H(X) + H(Y) - H(XY).

4. I(X;X) = H(X) - H(X|X) = H(X). Hence, entropy is sometimes referred to as average self-information.
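As a small worked check of (2.16) and (2.17), consider a joint pmf assumed for illustration only (it is not an example from these notes): let X, Y \in \{0,1\} with P(0,0) = P(1,1) = 3/8 and P(0,1) = P(1,0) = 1/8, so that P(x) = P(y) = 1/2 for both values. Applying (2.16) with base-2 logarithms,

\begin{align*}
I(X;Y) &= 2 \cdot \tfrac{3}{8} \log_2 \frac{3/8}{1/4} + 2 \cdot \tfrac{1}{8} \log_2 \frac{1/8}{1/4}
        = \tfrac{3}{4} \log_2 \tfrac{3}{2} - \tfrac{1}{4} \approx 0.189 \text{ bits}.
\end{align*}

Equivalently, by (2.17), H(X) = 1 bit and H(X|Y) = H(3/4, 1/4) \approx 0.811 bits, since P(X|Y=y) = (3/4, 1/4) up to relabeling for either value of y; hence I(X;Y) = H(X) - H(X|Y) \approx 0.189 bits, consistent with the direct computation.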