CHAPTER 1. INTRODUCTION

• Information theory in the narrow sense (Shannon theory): Building on the work of his predecessors, Shannon studied communication systems with the methods of probability and statistics. He revealed that what a communication system transmits is information, and that the central problem of system design is how to transmit information effectively and reliably in the presence of noise and interference. He pointed out that this goal can be achieved by coding, and he established theoretically the best performance limits a communication system can attain.

• Information theory in the general sense: besides Shannon theory, this also covers optimum-reception theory (signal detection, estimation, and modulation theory), noise theory, and so on.

• Information theory in the broad sense.

Information theory is the foundation of communication and information systems, and the driving force behind the development of modern communications:

I have often remarked that the transistor and information theory, two Bell Laboratories breakthroughs within months of each other, have launched and powered the vehicle of modern digital communications. Solid state electronics provided the engine while information theory gave us the steering wheel with which to guide it. — Viterbi, IT News Lett., 1998.

• Source coding theorem → data compression techniques → the evolution of wireless systems from 1G to 2G
• Channel coding theorem → error-control coding (Turbo, LDPC) → 3G
• Data processing theorem → soft-decision decoding
• Gaussian noise is the worst-case additive noise + multiuser information theory → CDMA, multiuser detection
• MIMO capacity theory → space-time coding, precoding → LTE, 4G
• Multiuser information theory → cooperative communication, network coding → next-generation wireless systems

Recent work on the information-theoretic aspects of communication has concentrated on: 1) network information theory, and 2) MIMO systems.

1.2 What is information? (Measure of information)

In Shannon theory, information is what we receive when uncertainty is reduced.

How to measure it:

• The amount of information should satisfy I ≥ 0.
• The amount of information should depend on the probability P(x).
• For independent events: P(X, Y) = P(X)P(Y) → I = I(X) + I(Y).

These requirements lead to the form I(x) = log(1/P_X(x)), the self-information of the event X = x. (A short numerical check is given after the Applications list below.)

1.3 Applications

• Data compression: voice coders, MPEG, the LZ algorithm (a minimal sketch is given below).
• Modem
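The following Python sketch is a quick numerical check of the requirements listed in Section 1.2; it is not part of the original notes, and the function name self_information is chosen purely for illustration. It evaluates I(x) = log2(1/P(x)) and verifies that independent events add their information.

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(x) = log(1/P(x)); measured in bits when base = 2."""
    return math.log(1.0 / p, base)

# Non-negativity: I >= 0 for any probability 0 < p <= 1,
# and less probable events carry more information.
print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits

# Additivity for independent events:
# P(X, Y) = P(X) P(Y)  =>  I(X, Y) = I(X) + I(Y)
px, py = 0.5, 0.25
assert math.isclose(self_information(px * py),
                    self_information(px) + self_information(py))
```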
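The Applications list mentions the LZ algorithm. As an illustration only, here is a minimal LZ78-style encoder/decoder sketch in Python; practical codecs (LZ77/LZW, and the formats used in MPEG or voice coders) differ considerably in detail, and the function names here are hypothetical.

```python
def lz78_encode(data: str):
    """Parse the input into (dictionary index, next symbol) pairs."""
    dictionary = {}   # phrase -> index (1-based); index 0 stands for the empty prefix
    phrase = ""
    output = []
    for ch in data:
        if phrase + ch in dictionary:
            phrase += ch                              # extend the longest known phrase
        else:
            output.append((dictionary.get(phrase, 0), ch))
            dictionary[phrase + ch] = len(dictionary) + 1
            phrase = ""
    if phrase:                                        # flush a trailing known phrase
        output.append((dictionary.get(phrase, 0), ""))
    return output

def lz78_decode(pairs):
    """Rebuild the text by replaying the dictionary construction."""
    dictionary = {0: ""}
    out = []
    for index, ch in pairs:
        phrase = dictionary[index] + ch
        out.append(phrase)
        dictionary[len(dictionary)] = phrase
    return "".join(out)

if __name__ == "__main__":
    msg = "abababababab"
    code = lz78_encode(msg)
    print(code)          # repeated structure collapses into a few (index, symbol) pairs
    assert lz78_decode(code) == msg
```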