IEEE: Principles of Information Science (English edition), Chapter 5: Principles of Information Transferring (Communication Theory)

Resource type: document library. Format: PPT, 23 pages, 171.5 KB.

Principles of Information Science Chapter 5 Principles of Information Transferring: Communication Theory

1. Model of Communication System

    Source --X--> [T] --> Channel --> [T^-1] --Y--> Sink
                             ^
                           Noise

T: transformation (T^-1: its inverse). Source, sink, and channel are given beforehand; the transformation is designed to match the source with the channel.

The Functions of Transformation
- Modulation: spectrum matching; seeking better performance/cost
- Amplification: improving the signal-to-noise ratio
- Equalization: adjusting channel characteristics
- Source coding: improving transmission efficiency
- Channel coding: noise immunity
- Cryptographic coding: security protection

2. Model Analysis

A radical feature of communication: the sent waveform must be recovered at the receiving end with a certain fidelity in the presence of noise. Ignoring the content and utility factors, the communication model exhibits statistical properties.

Source entropy: H(X), H(X, Y)
Mutual information: I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
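As an illustrative sketch (not part of the slides), the discrete forms of these quantities can be computed directly from a joint distribution, using the identity I(X; Y) = H(X) + H(Y) - H(X, Y), which is equivalent to the forms above; the function names here are our own.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]            # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (columns)
    pxy = [p for row in joint for p in row]     # flattened joint pmf
    return entropy(px) + entropy(py) - entropy(pxy)

# Noiseless binary channel: Y = X, so I(X;Y) = H(X) = 1 bit
joint = [[0.5, 0.0],
         [0.0, 0.5]]
print(mutual_information(joint))  # -> 1.0
```

With an independent joint distribution (all entries 0.25) the same function returns 0, as expected when the channel output carries no information about the input.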

Channel Capacity

Define C = max over {p(x)} of I(X; Y).

Example (AWGN channel):

p(y|x) = (2πσ^2)^(-1/2) exp[-(y - x)^2 / (2σ^2)]

I(X; Y) = H(Y) - H(Y|X)
        = H(Y) + ∫∫ p(x) p(y|x) log p(y|x) dy dx
        = H(Y) - (1/2) log(2πσ^2 e)

The only way to maximize I(X; Y) is to maximize H(Y). This requires Y to be a normal variable with zero mean; since Y = X + N, X must also be a normal variable with zero mean.

Let P_Y = P + σ^2 and N = σ^2. Then

C = (1/2) log(2πe P_Y) - (1/2) log(2πe σ^2)
  = (1/2) log(P_Y / σ^2)
  = (1/2) log(1 + P/N)   (bit/symbol)

If X is a white Gaussian signal with bandwidth F and duration T, then 2FT symbols are transmitted in time T. Therefore, we have

C = FT log(1 + P/N)  bits,

the famous capacity formula.
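A minimal numerical sketch of this formula (function names are ours, not the slides'): the per-symbol capacity is (1/2) log2(1 + P/N), and multiplying by the 2FT symbols in duration T gives the total.

```python
import math

def awgn_capacity_per_symbol(snr):
    """C = (1/2) * log2(1 + P/N) bits per symbol for an AWGN channel."""
    return 0.5 * math.log2(1.0 + snr)

def awgn_capacity_total(F, T, snr):
    """C = F*T*log2(1 + P/N) bits: 2*F*T symbols transmitted in duration T."""
    return 2 * F * T * awgn_capacity_per_symbol(snr)

# Example: F = 3000 Hz, T = 1 s, P/N = 15 (about 11.8 dB)
print(awgn_capacity_total(3000, 1.0, 15))  # -> 12000.0 bits, since log2(16) = 4
```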

From C = FT log(1 + P/N):

A) F, T, P are the basic parameters of a channel. The signal volume to be transmitted through a channel must be smaller than the channel capacity provided.

B) For a given C, the parameters are exchangeable. This provides great flexibility in communication system design.

C) Signal division:
- Frequency division -- FDMA
- Time division -- TDMA
- Time-frequency division -- frequency hopping

D) It is the origin of CDMA. For a channel with additive white Gaussian noise, the maximum capacity is achieved when the signal is also white and Gaussian in nature, i.e., a noise-like signal. This led first to noise communication and then to pseudo-noise coded communication, or code-division multiple access (CDMA). In CDMA, each user is assigned a different, mutually orthogonal, code as its address.
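As a hypothetical sketch of the orthogonal-address idea (the 4-chip Walsh codes and user names below are our own illustration, not from the slides): each user spreads its bit over its code, the chips add on the shared channel, and correlating with a user's own code recovers that user's bit while the orthogonal codes cancel.

```python
# Hypothetical 4-chip Walsh codes: any two distinct rows are orthogonal
codes = {
    "user_a": [1,  1,  1,  1],
    "user_b": [1, -1,  1, -1],
    "user_c": [1,  1, -1, -1],
}

def spread(bit, code):
    """Spread one data bit (+1 or -1) over the user's code chips."""
    return [bit * c for c in code]

def despread(signal, code):
    """Correlate received chips with a code; the result recovers the bit."""
    return sum(s * c for s, c in zip(signal, code)) / len(code)

# Two users transmit simultaneously; their chips add on the channel
tx = [a + b for a, b in zip(spread(+1, codes["user_a"]),
                            spread(-1, codes["user_b"]))]
print(despread(tx, codes["user_a"]))  # -> 1.0  (user A's bit)
print(despread(tx, codes["user_b"]))  # -> -1.0 (user B's bit)
```

The same correlation applied with a third, unused code yields 0, which is exactly the orthogonality property that lets the codes serve as addresses.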

3. H(X) Analysis: Source Coding

Source --X--> Encoder --Y-->

To be able to express the amount of information of a source, H(X), the encoder, with an average code length l_bar, should keep the relation below:

H(X) = - Σ_{n=1}^{N} p(x_n) log p(x_n) ≤ l_bar = Σ_{n=1}^{N} p(x_n) l_n

Otherwise, distortion will unavoidably be introduced. Thus, the optimal coding is the one whose average length l_bar comes closest to H(X).
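A small sketch of this bound in action (Huffman coding is used here as a concrete optimal code; it is not named on the slides). For a dyadic source the average length l_bar meets H(X) exactly.

```python
import heapq
import math

def huffman_lengths(probs):
    """Huffman code lengths: repeatedly merge the two lightest subtrees."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    depth = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:        # every symbol under the merge gets one bit deeper
            depth[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return depth

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
H = -sum(p * math.log2(p) for p in probs)            # source entropy H(X)
L = sum(p * l for p, l in zip(probs, lengths))       # average length l_bar
print(H, L)  # -> 1.75 1.75 (dyadic probabilities: l_bar meets H(X) exactly)
```

For non-dyadic probabilities the same code gives H(X) < l_bar < H(X) + 1, consistent with the bound above.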
