$R_R(\tau) = R_I(\tau) = \tfrac{1}{2}\,\mathrm{Re}\{R_Z(\tau)\}, \qquad R_{RI}(\tau) = -\tfrac{1}{2}\,\mathrm{Im}\{R_Z(\tau)\}$

- Circularly symmetric processes with a real-valued $R_Z(\tau)$ have real and imaginary parts that are independent at all times, since $R_{RI}(\tau) = 0$.
- Circular symmetry is preserved by linear (time-invariant or time-varying) systems.
- A white complex-valued Gaussian process has autocorrelation function $R_Z(\tau) = N_0\,\delta(\tau)$ in continuous time and $R_Z(k) = 2\sigma^2\,\delta_k$ in discrete time.
- For a circularly symmetric white Gaussian process, the real and imaginary parts are identically distributed and independent of each other (see the numerical sketch at the end of this section).

D. Basics of information theory

Entropy and mutual information

- For a discrete random variable $X$ with sample space $\Omega_X$, its entropy is defined as
  $H(X) = \mathbb{E}[-\log P_X(X)] = -\sum_{x \in \Omega_X} P_X(x) \log P_X(x)$
- The mutual information between two random variables $X$ and $Y$ is given by
  $I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$
  Explicitly, for discrete $X$ and $Y$,
  $I(X;Y) = \sum_x \sum_y p(x)\,p(y|x) \log \dfrac{p(y|x)}{p(y)}$;
  for discrete $X$ and continuous $Y$,
  $I(X;Y) = \sum_x \int_{-\infty}^{\infty} p(x)\,p(y|x) \log \dfrac{p(y|x)}{p(y)}\,dy$;
  and for continuous $X$ and $Y$,
  $I(X;Y) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} p(x)\,p(y|x) \log \dfrac{p(y|x)}{p(y)}\,dx\,dy$.

Channel capacity and the coding theorem

- (Operational) channel capacity: the maximum rate $R$ at which reliable communication can be achieved.
- Information channel capacity: the maximum of the mutual information over all possible input statistics $P_X$,
  $C \equiv \max_{P_X} I(X;Y) = \max_{P_X}\,[H(Y) - H(Y|X)]$
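As an illustrative sketch (not part of the original notes), the snippet below generates the discrete-time circularly symmetric white Gaussian noise described above, with $R_Z(k) = 2\sigma^2\,\delta_k$, and empirically checks that its real and imaginary parts are identically distributed and uncorrelated; the variance `sigma2`, the seed, and the sample size `n` are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0      # assumed per-component variance
n = 200_000       # sample size for the empirical checks

# Discrete-time circularly symmetric white Gaussian noise with
# R_Z(k) = 2*sigma2*delta_k: i.i.d. real and imaginary parts, each N(0, sigma2).
z = rng.normal(0.0, np.sqrt(sigma2), n) + 1j * rng.normal(0.0, np.sqrt(sigma2), n)

print(np.var(z.real), np.var(z.imag))  # both close to sigma2 (identically distributed)
print(np.mean(z.real * z.imag))        # close to 0: uncorrelated, hence independent (Gaussian)
print(np.mean(np.abs(z) ** 2))         # close to 2*sigma2 = R_Z(0)
```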
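The entropy and mutual-information definitions above can also be checked numerically. The sketch below uses a hypothetical 2×2 joint pmf `pxy`, chosen only for illustration, and computes $H(X)$ and $I(X;Y)$ via the identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$, which is equivalent to the $H(Y) - H(Y|X)$ form given above.

```python
import numpy as np

def entropy(p):
    """Entropy in bits, H = -sum p*log2(p), of a pmf given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint pmf P(x, y); rows index x, columns index y.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

px = pxy.sum(axis=1)        # marginal P_X
py = pxy.sum(axis=0)        # marginal P_Y

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I = entropy(px) + entropy(py) - entropy(pxy.ravel())
print(f"H(X) = {entropy(px):.4f} bits, I(X;Y) = {I:.4f} bits")
```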
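Likewise, the information channel capacity $C = \max_{P_X} I(X;Y)$ can be evaluated by brute force for a simple channel. The sketch below assumes a binary symmetric channel with crossover probability `eps` (an illustrative choice, not from the notes) and compares a grid-search maximum over binary input pmfs with the known closed form $C = 1 - H_b(\varepsilon)$.

```python
import numpy as np

def mutual_information(px, P):
    """I(X;Y) in bits for input pmf px and transition matrix P[x, y] = p(y|x)."""
    pxy = px[:, None] * P          # joint pmf p(x)p(y|x)
    py = pxy.sum(axis=0)           # output marginal p(y)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2((P / py)[mask]))

eps = 0.1                          # assumed crossover probability
P = np.array([[1 - eps, eps],
              [eps, 1 - eps]])     # binary symmetric channel p(y|x)

# Brute-force the maximization over binary input pmfs (q, 1-q).
qs = np.linspace(1e-3, 1 - 1e-3, 999)
C = max(mutual_information(np.array([q, 1 - q]), P) for q in qs)

# Closed form for the BSC, attained by the uniform input: C = 1 - H_b(eps).
hb = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)
print(f"numerical C = {C:.4f} bits/use, closed form = {1 - hb:.4f}")
```

For larger input and output alphabets, the standard replacement for this grid search is the Blahut-Arimoto algorithm, which iterates on the input distribution directly.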
Suggested Reading

[1] R. G. Gallager, Principles of Digital Communication. Cambridge University Press, 2009. (Chinese photocopy edition: Posts & Telecom Press, 2010.)