3-1 Measures of Random Syntactic Information

Shannon Theory of Information

Key points:
1. Information is something that can be used to remove uncertainty.
2. The amount of information can then be measured by the amount of uncertainty it removes.
3. In the case of communication, only the waveform is considered; meaning and value are ignored.
4. Uncertainty, and thus information, is statistical in nature, so statistical mathematics is sufficient.
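The points above can be illustrated with Shannon's entropy formula, H(X) = -Σ p(x) log₂ p(x), which quantifies the uncertainty of a source in bits. A minimal sketch (the function name and example distributions are illustrative, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Measures the uncertainty of a discrete source; the information
    gained on observing an outcome equals the uncertainty removed.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: observing it yields 1 bit.
print(entropy([0.5, 0.5]))   # -> 1.0

# A biased coin is more predictable, so each outcome removes
# less uncertainty and carries less information on average.
print(entropy([0.9, 0.1]))   # -> ~0.469 bits
```

Note that the measure depends only on the probability distribution of outcomes, not on what the outcomes mean, which is exactly point 3: meaning and value are outside the scope of the syntactic measure.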
Problems to be concerned with:
1. How should we define the concept of information?
2. What are the typical features of information compared with matter and energy?
3. What are the relationships and differences between Shannon information and comprehensive information?
4. How can information be reasonably classified?
5. How can information be properly represented?
1.1 Information Science

Definition of Information Science: Information Science is a trans-disciplinary science with information as its object of study; the laws governing information processes as its content; information methodology as its approach; and the strengthening of human information functions, particularly intellectual ability, as its goal.