information theory

[xìn xī lùn]
An applied mathematics discipline studying information, information entropy, and related topics
Information theory is a branch of applied mathematics that uses the methods of probability theory and mathematical statistics to study information, information entropy, communication systems, data transmission, cryptography, data compression, and related topics. An information system is a generalized communication system, generally meaning the system composed of all the equipment needed to transmit certain information from one place to another.
Information theory is a theory about information, and as such it should have a clearly defined object of study and scope of application; since its birth, however, people have understood it in different ways. [1] Information theory is widely applied in many fields: coding, cryptography and cryptanalysis, data transmission, data compression, estimation theory, and so on. [4]
Chinese name
information theory
Foreign name
Information Theory
Expression
H = -\sum_i p_i \log p_i
Proposer
Claude Shannon
Time proposed
October 1948
Applied discipline
Statistics

Origin of name

Information theory treats the transmission of information as a statistical phenomenon and provides methods for estimating the capacity of communication channels. Information transmission and information compression are the two major fields of information theory research, interrelated through the information transmission theorems and the source-channel separation theorem.
Shannon is known as the "father of information theory". The publication of his paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in October 1948 is usually regarded as the beginning of modern information theory research. The paper built in part on earlier work by Harry Nyquist and Ralph Hartley. In it, Shannon gave the definition of information entropy (hereinafter "entropy"):

H = -\sum_i p_i \log p_i
This definition can be used to calculate the channel bandwidth required to transmit the original information in binary encoding. Entropy measures the information content of a message, excluding the part determined by the message's inherent structure, for example the redundancy of the language's structure and statistical characteristics such as the frequency of use of letters and words.
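As a concrete illustration of the formula above (not part of the original entry), the following Python sketch computes the entropy of a discrete distribution; the example probabilities are made up:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p_i * log p_i) of a discrete distribution.

    Terms with p_i == 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# which is exactly the redundancy that compression can remove.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```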
The concept of entropy in information theory is closely related to entropy in thermodynamics. Boltzmann and Gibbs did much work on entropy in statistical physics, and entropy in information theory drew inspiration from it.
Mutual information is another useful information measure; it refers to the degree of correlation between two sets of events. The mutual information of two events X and Y is defined as

I(X; Y) = H(X) + H(Y) - H(X, Y)

where H(X, Y) is the joint entropy, defined as

H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y)
Mutual information is closely related to the log-likelihood ratio test for multinomial distributions and to Pearson's χ² test.
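A minimal sketch of the definitions above, assuming a small hypothetical joint distribution p(x, y) for two binary variables (not from the original entry):

```python
import math

def H(probs):
    """Entropy, in bits, of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y); rows index x, columns index y.
p_xy = [[0.4, 0.1],
        [0.1, 0.4]]

p_x = [sum(row) for row in p_xy]        # marginal distribution of X
p_y = [sum(col) for col in zip(*p_xy)]  # marginal distribution of Y
joint = [p for row in p_xy for p in row]

# I(X; Y) = H(X) + H(Y) - H(X, Y)
print(H(p_x) + H(p_y) - H(joint))  # about 0.278 bits
```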

Development History

Information theory is a discipline that emerged in the late 1940s, distilled from long experience with communication practice; it is the science of the general laws of effective information processing and reliable information transmission.
E.C. Cherry once wrote an early history of information theory, tracing it from hieroglyphics, through the linguistics of the medieval and Enlightenment periods, down to the work of E.N. Gilbert and others on telegraphy.
In the 1920s, H. Nyquist and R.V.L. Hartley first studied the ability of communication systems to transmit information and attempted to measure channel capacity; modern information theory began to take shape.
In 1948, Claude Shannon published the paper "A Mathematical Theory of Communication", which for the first time established a mathematical model of the communication process. This paper, together with another he published in 1949, laid the foundation of modern information theory.
Because of the rapid development of modern communication technology and its cross-penetration with other disciplines, the study of information theory has expanded from the narrow scope of Shannon's mathematical theory, limited to communication systems, into the huge system now called information science. [2] In technological applications, information theory has made indelible contributions to modern information technology. Information science, together with materials science and energy science, has become a leader of modern science and technology, and the IT industry is the fastest-growing, most promising, most efficient, and most influential pillar industry of today's society. Without the guidance of information theory there would be no radio technology or television receiving systems, no network communications, remote control, or Bluetooth technology, no mobile communication or satellite navigation, let alone the Internet and wireless communication networks; these core technologies at the forefront of society are all guided by modern information theory. [4]

basic content

Conventional communication systems such as the telegraph, telephone, and post transmit text messages, voice information, and written information respectively. Broadcasting, telemetry, remote sensing, and remote control systems also transmit information, of different types, so they too are information systems. Sometimes information must be transmitted in both directions: telephone communication requires two-way conversation, and a remote control system must transmit both control information and reverse measurement information. Such a two-way information system is actually composed of two one-way information systems. All information systems can be summarized in the model shown in the figure in order to study their basic laws.
Source: the origin of the information, or the entity that generates the information to be transmitted, such as the speaker in a telephone system. In a telecommunication system the source should also include the microphone, whose output electrical signal carries the information.
Sink: the destination or recipient of the information. In a telephone system this is the listener together with the receiver, which converts the received electrical signal back into sound from which the listener extracts the required information.
Channel: the path over which information is transmitted, such as the repeaters and coaxial cable system in telephone communication, or the earth-station transceivers, antennas, and satellite transponders in satellite communication.
Encoder: in information theory, any device that transforms a signal; in practice, the transmitting part of the terminal. It includes all equipment from the source to the channel, such as quantizers, compression encoders, and modulators, and converts the signal output by the source into a signal suitable for transmission over the channel.
Decoder: equipment performing the inverse transformation of the encoder, converting the signal received from the channel into a signal acceptable to the sink. It includes demodulators, decoders, digital-to-analog converters, and so on.
When the source and sink are given and the channel has been selected, the encoder and decoder determine the performance of the information system, so the main work in designing an information system is designing the codec. In general, the main performance indices are effectiveness and reliability. Effectiveness means transmitting as much information as possible through the system; reliability requires that the information received at the sink agree as closely as possible with the information sent by the source, that is, that the distortion be as small as possible. The best codec makes the system most effective and most reliable. Reliability and effectiveness, however, are often contradictory: the more effective a system, the less reliable it tends to be. In quantitative terms, the system should transmit the maximum information rate under a reliability constraint, or achieve the minimum distortion at a specified information rate. The basic task of information theory is to calculate this maximum information rate and to prove that a codec reaching or approaching it exists. The theory confined to such questions may be called Shannon information theory. It is generally believed that information theory should be broader, including the theory of extracting information and of keeping information secure; the latter covers estimation theory, detection theory, and cryptography.
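To make the maximum information rate concrete, here is a small sketch for one standard case; the choice of a binary symmetric channel with crossover probability p is an illustrative assumption, not something specified above. Its capacity is C = 1 - H(p), where H is the binary entropy function:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1 - p) log2 (1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity, in bits per channel use, of a binary symmetric channel."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries a full bit per use; at p = 0.5 the output
# is independent of the input and nothing gets through.
for p in (0.0, 0.1, 0.5):
    print(p, bsc_capacity(p))
```

The tension described above shows up here directly: driving the code rate toward capacity generally requires longer, more complex codes to keep the error probability small.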
Information theory is built on probability theory, based on the probabilistic characteristics of source symbols and channel noise. Information of this kind is often called syntactic information. In fact, the basic laws of information systems should also cover semantic information and pragmatic information. Syntactic information concerns the structure of the symbols a source outputs, or their objective characteristics, independent of the subjective requirements of the sink; semantics must consider the meaning of each symbol. The same meaning can be expressed in different languages or scripts, and the syntactic information contained in the different expressions can differ. Generally, the semantic information rate can be lower than the syntactic information rate; for example, the message rate of a telegram can be lower than that of speech expressing the same meaning. Furthermore, the recipient of a message often needs only the information that is useful to him: a language he cannot understand is meaningful but useless to him. Pragmatic information, the information useful to the sink, is therefore generally less than semantic information. If an information system is only required to transmit semantic or pragmatic information, its efficiency will clearly be higher. At present a systematic theory of syntactic information has been established on the basis of probability theory, forming a discipline; the theories of semantic and pragmatic information are not yet mature, so discussion of them is usually assigned to information science or generalized information theory and does not belong to information theory proper. To sum up, the basic laws of information systems should include the measurement of information, source characteristics and source coding, channel characteristics and channel coding, detection theory, estimation theory, and cryptography.
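As a sketch of the source coding mentioned in the summary above, the following is a minimal binary Huffman coder; the implementation and the sample text are illustrative assumptions, not taken from the source:

```python
import heapq
from collections import Counter

def huffman_code(symbol_freqs):
    """Build a binary Huffman code from a {symbol: frequency} mapping.

    Returns a {symbol: bitstring} codebook; frequent symbols get
    shorter codewords, approaching the source entropy per symbol.
    """
    # Heap entries: (weight, tiebreaker, {symbol: code-so-far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(symbol_freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

print(huffman_code(Counter("abracadabra")))
```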

application

Range

Detection theory
Politics (political communication)

Overview of Information Theory

Information theory is a science that uses mathematical statistics to study the laws of the measurement, transmission, and transformation of information. It mainly studies the general laws of the universally existing transfer of information, and the basic theory of the best ways to solve problems of information measurement, transformation, storage, and transmission.

Scope of study

Information theory is a comprehensive applied mathematics discipline, covering the study of communication, information entropy, data compression and transmission, encryption and decryption techniques, and more. [4] Its research scope is extremely broad. Information theory is generally divided into three different types:
(1) Narrow information theory: the science that applies mathematical statistics to study information processing and information transmission. It studies the common laws of information transmission that exist universally in communication and control systems, and how to improve the effectiveness and reliability of each information transmission system.
(2) General information theory: mainly studies communication problems, but also includes noise theory, signal filtering and prediction, modulation, and information processing.
(3) Generalized information theory: includes not only narrow information theory and general information theory but all fields related to information, such as psychology, linguistics, neuropsychology, semantics, and so on.

Definition field of information

Information is an increase in certainty (the inverse Shannon information definition);
Information is the marking of matter, energy, and information (the inverse Wiener information definition);
Information is the collection of things and the identifiers of their attributes.

Information and communication

Information is a kind of message, closely related to problems of communication. In 1948, Shannon of Bell Laboratories systematically set out the study of information in his paper "A Mathematical Theory of Communication" and founded information theory. Wiener's mathematical formula for measuring the amount of information opened broad prospects for the application of information theory. In 1951, the Institute of Radio Engineers in the United States recognized information theory as a subject, and it developed rapidly from then on. The 1950s were a period in which information theory made an impact on various disciplines. The 1960s were a period not of major innovation but of digestion and understanding, of major construction on existing foundations, with research focused on problems of information and source coding. By the 1970s, with the development of digital computers, the capabilities of communication systems had also greatly improved, and how to use and process information became an increasingly urgent question. People became ever more aware of the importance of information, and that information can be fully used and shared as a resource in the same way as materials and energy. The concepts and methods of information came to be widely used in various scientific fields. It became urgent to break through the narrow scope of Shannon's information theory so that it could become the basic theory for the information problems encountered in all human activity, thereby promoting the further development of many emerging disciplines in other fields. The laws and theories of information established earlier have been widely applied in physics, chemistry, biology, and other disciplines. The study of the generation, acquisition, transformation, transmission, storage, processing, display, identification, and utilization of information is forming information science.

information science

Information science is a new frontier discipline, developed on the basis of computer science, artificial intelligence, systems engineering, automation technology, and other disciplines, as people expand their understanding and utilization of information. Its main task is to study the nature of information and the general laws by which machines, organisms, and humans acquire, transform, transmit, process, utilize, and control information, and to design and develop various information machines and control equipment that automate operations, so that humanity can be liberated from the shackles of natural forces and gain greater ability to transform the world. Information science also has important applications to security problems.
With the continuous development of information science, the study of information theory has become closely related to many modern fields, such as communication, radar, sonar, navigation, telemetry, remote control, remote sensing, computer and information processing technology, physics, biology, bionics, and so on. [4]

Information theory hypothesis

Detailed definition

Matter, energy, and information are the three major elements of the world. People already have a deep understanding of matter and energy, but the understanding of information has only just begun. So what is information? How does it exist? What role does it play? What follows is the author's conjecture, offered in the hope of helping humanity understand the world further.
1. Definition of information

Definition of information without the three elements of the world: information is the collection of things and the identifiers of their attributes.

Definitions of information containing the three elements of the world:

1. Information is an increase in certainty (the inverse Shannon information definition);
2. Information is information; information is the marking of matter, energy, information, and their attributes (the inverse Wiener information definition).
Information is a universal form of the state and motion characteristics of objective things; a vast amount of information expressed in these forms exists, arises, and is transmitted in the objective world. However, this is only information as seen from three-dimensional space; information has a deeper nature. Does information also exist in four-dimensional space (here meaning a four-dimensional state of space, not including time)? It does; but to be clear, on this hypothesis information exists only in four-dimensional space, and information in three-dimensional space is merely the shadow of the real information in four-dimensional space. Information exists in great quantity in four-dimensional space, and its essence is the regular arrangement of information particles (the basic units of information) existing there.

Information is the root cause of events; Section 3 analyzes this in detail.
2. Nature of information

Information has the following properties: objectivity, universality, integrity, and specificity. First, information exists objectively: it is not determined by will, although it has a necessary connection with human thought (Section 4 gives a concrete analysis). At the same time, information is widespread: four-dimensional space is filled with great numbers of information particles. An important property of information is integrity: no single information particle can determine any event; two or more information particles must be arranged according to a rule into complete information, and the energy released must suffice to make the event occur. Information also has specificity: each piece of information determines a particular event, although the information of similar events is itself similar. Explaining why requires further discovery of the types of information particles and of the code governing their arrangement.
3. Mechanism of the information theory hypothesis

In the normal state, information particles are randomly distributed in four-dimensional space. When molecules in three-dimensional space collide through friction, energy escapes into four-dimensional space and starts the regular arrangement of information particles; the arranged information particles then release energy back into three-dimensional space, causing the friction and collision of other molecules, and so on. If the molecules made to collide happen to be deciders (the factors that determine an event, such as the sodium and potassium ions of a nerve impulse, or the electric charge that causes lightning), and a sufficient amount of the substance is present, the event occurs. Of course, the energy generated by the friction and collision of different molecules differs, the types of information-particle arrangement they cause differ, and so the events determined also differ.

However, before the cosmic explosion only information existed. A decisive factor (as yet unknown) led to an accidental regular arrangement of information particles, some of which were converted into energy (there are certain conditions for this conversion, which could be met only before or at the beginning of the cosmic explosion); under certain conditions energy was then converted into matter, and through continued transfer and transformation our present universe finally formed. Therefore, the orderly arrangement of information particles is the root cause of events, the friction and collision of matter is the direct cause of events, and the transmission of energy is the necessary condition for events.
4. Examples under the information theory hypothesis

1. Thought and memory: Thought has always eluded us. According to the information theory hypothesis, thought is actually a kind of information. The friction and collision of certain molecules in the brain lead to the regular arrangement of certain information particles, which appears in three-dimensional space as the generation of electric currents that activate brain cells; this is the essence of thought. Different information, of course, manifests as different thoughts. But does this not mean that our thoughts are already predetermined? In fact it does; however, the number of molecules in the brain is enormous, the types of information-particle arrangement they can cause are many, and our thoughts have developed only a small part of them. In reality there is another factor in what we call thinking: it must be expressed through a complete and complex regulatory mechanism, namely the nervous system; that is why only we can express complex thoughts. Memory is a specialization of thought. When the molecules that information causes to collide happen to be the molecules that produced a previous thought (the deciders of memory), the previous thought is expressed again through the regular arrangement of the specific information particles. In this way our thoughts are continuous, the thought of one moment directly determining the thought of the next, though we have no way to discover this.

2. Life phenomena: Birth, aging, illness, and death can also be explained by the information theory hypothesis. Sickness is information caused by the friction of foreign molecules (bacteria or viruses) against molecules in the body. Growth is information caused by the friction of molecules of various external elements (such as calcium ions) entering the human body. Aging and death are information caused by molecular friction within cells, whose macroscopic manifestation is cell aging and apoptosis, which in turn affect the person.

3. Premonition and coincidence: Premonition is a very special form of thought. When certain molecules in the brain cause an arrangement of information particles, the information does not release all of its energy: part of the energy is released first, causing the friction of the premonition's decider, and the rest is released at another time; because it is the same as the earlier portion, it causes precisely the friction of the event's decider, and the premonition is confirmed. Coincidence is also a very special phenomenon. Its essence is that the energy released by information divides into two parts that enter different places in three-dimensional space, causing the friction of the same molecules and therefore the same event in different places. This usually occurs in identical twins, whose genetic similarity makes the probability that the same molecules are rubbed large.

4. Dreams and false impressions: Dreams are thoughts generated unconsciously; their essence is also information. We often have false impressions: we see a situation and it seems to have happened before, though it has not. In fact, molecular friction in the brain causes information that does not release its energy immediately but stores it temporarily. When the same molecular friction occurs at another moment, the stored energy is activated and double the energy is released: one half makes us think, and the other half gives us the impression of familiarity. This is the essence of the false impression.

5. Chemical reactions: The essence of all chemical reactions is information. The friction of several molecules causes specific information, which causes other molecules to collide, chemical bonds to break and form, and the chemical reaction to complete.

6. Destiny and soul: The ancients believed in fate, probably because they felt that we are individuals already arranged in another space; hence human reverie about souls and gods.
5. Significance of the information theory hypothesis

The information theory hypothesis unifies matter and thought. It is a necessary step in the development of materialism, explaining from a materialist standpoint problems that humanity has long been unable to understand. It is itself only a hypothesis, which will take long exploration and proof, and it has its own defects, which remain to be discovered. Perhaps it is mistaken, but it is a witness to human growth and part of humanity's great spiritual wealth.

Looking at problems from the perspective of the information theory hypothesis can reveal a new world and help in exploring the deeper nature of reality. It provides rich experience and is a model of looking at problems outside ingrained habits of thought. In short, whether or not it is correct, it is an immortal work of humanity.

Book Information I

summary

Title: Information Theory
Publisher: Harbin Engineering University Press
Authors: Tang Shiwei, Liu Xianmei
ISBN: 9787811332780
Format: 16mo
Pages: 217
Price: 25.00 yuan

Content summary

This book is divided into seven chapters. Chapter 1 is an introduction, presenting the basic concept and definition of information and the origin, development, and research content of information theory. Chapter 2 covers sources and source entropy, introducing the concepts, properties, and theorems of the various entropies. Chapter 3 covers lossless source coding, introducing the fixed-length and variable-length coding theorems and methods, along with several practical lossless source codes. Chapter 4 covers distortion-limited source coding, introducing the definition, properties, and calculation of the information rate-distortion function and the predictive coding of voice and image signals. Chapter 5 covers channels and channel capacity, introducing the channel models and channel capacity calculations of single-symbol discrete channels, multi-symbol discrete channels, and multi-user channels. Chapter 6 covers channel coding, introducing the basic concepts of channel coding, the channel coding theorem, linear block codes, and cyclic codes. Chapter 7 covers network information security and cryptography, introducing the basic concepts of cryptography, encryption algorithms, digital signatures, and other technologies.

Book Information II

Information Theory - Basic Theory and Application
Title: Information Theory
Author: Fu Zuyun
Published: May 2007
ISBN: 9787121042737
Format: 16mo
Price: 38.00 yuan

Content summary

Information Theory: Basic Theory and Application (2nd Edition) systematically discusses the basic theory of Shannon information theory and some application problems, essentially covering all aspects of information theory. The contents include: the definition and measurement of information; the information entropy of various discrete and continuous sources; the channel capacity of discrete and continuous channels, with and without memory; the three basic theorems of Shannon information theory; practical coding algorithms and methods for distortionless data compression (i.e., distortionless source coding); and the basic content and analysis methods of channel error-correcting coding. The book also briefly introduces applications combining information theory with thermodynamics, optics, statistics, biology, medicine, and other disciplines.

Contents

Chapter 1 Introduction
Chapter 3 Discrete Channels and Their Channel Capacity
Chapter 4 Waveform Sources and Waveform Channels
Chapter 5 Distortionless Source Coding Theorems
Chapter 6 Noisy Channel Coding Theorems
Chapter 7 Source Coding under a Fidelity Criterion
Chapter 8 Distortionless Source Coding
Chapter 9 Channel Error-Correction Coding
Chapter 12 Relationship and Applications of Information Theory with Other Disciplines
Appendix
References
……

Book Information III

Basic information

Fundamentals of Information Theory
Title: Fundamentals of Information Theory
Authors: Tian Baoyu, Yang Jie, He Zhiqiang, Wang Xiaoxiang
ISBN: 9787115177902
Published: August 2008 (1st edition)
Format: 16mo
Pages: 275
Price: 29.80 yuan
Answers to the Exercises in Fundamentals of Information Theory
Title: Answers to the Exercises in Fundamentals of Information Theory
Authors: Tian Baoyu, Yang Jie, He Zhiqiang, Xu Wenjun, Wang Xiaoxiang
ISBN: 9787115224552
Published: October 2010 (1st edition)
Format: 16mo
Pages: 242
Price: 29.00 yuan

Content summary

Fundamentals of Information Theory is the accumulation of the authors' many years of teaching and scientific research practice; it further optimizes and integrates course content on the basis of the strengths of excellent textbooks at home and abroad, with improvements and supplements. The book is divided into 12 chapters, covering: the basic concept of information; the content and progress of Shannon information theory research; the measurement of discrete information; discrete sources; continuous information and continuous sources; distortionless source coding; discrete channels and their capacity; noisy channel coding; waveform channels; the information rate-distortion function; constrained channels and their coding; a preliminary treatment of network information theory; and information-theoretic methods and applications.
Answers to the Exercises in Fundamentals of Information Theory is an auxiliary teaching material supporting the undergraduate textbook Fundamentals of Information Theory. Its main purpose is to provide students with more worked explanations and demonstrations of basic problems in information theory, broaden students' thinking about problem solving, improve their ability to solve basic and comprehensive problems in information theory, and further improve the quality of theoretical teaching in the information theory course. Many of the exercises come from traditional or classic textbooks at home and abroad, and the book also contains a considerable number of typical problems refined and verified through the years of teaching practice of front-line teachers. Its structure parallels the main textbook, likewise containing 12 chapters, with four kinds of content: key knowledge points, detailed example solutions, exercise solutions, and supplementary problems with solutions.

Contents

Chapter 1 Introduction
Chapter 2 Measurement of Discrete Information
Chapter 4 Continuous Information and Continuous Sources
Chapter 5 Distortionless Source Coding
Chapter 6 Discrete Channels and Their Capacity
Chapter 7 Noisy Channel Coding
Chapter 8 Waveform Channel
Chapter 9 Information Rate Distortion Function
Chapter 10 Constrained Channel and Its Coding
Chapter 11 Preliminary Network Information Theory
Chapter 12 Information-Theoretic Methods and Applications
References
……

Book Information IV

Basic Course of Information Theory (2nd Edition)
Authors: Li Mei, Li Yinong (eds.)
Publisher: Beijing University of Posts and Telecommunications Press
Published: October 1, 2008
Word count: 310,000
Edition: 2nd
Pages: 217
Printed: October 1, 2008
Format: 16mo
Printing: 1st
ISBN: 9787563518685
Binding: paperback
Category: Books >> Industrial Technology >> Electronic Communication >> Communication >> Communication Theory

Contents

Chapter 1 Introduction
1.1 Concept of information
1.2 Research object, purpose and content of information theory
Chapter 2 Measurement of Information
2.1.1 Self information
2.1.2 Mutual information
2.2 Average self information
2.2.1 Concept of average self information
2.2.2 Properties of the entropy function
2.3 Average mutual information
2.3.1 Concept of average mutual information
2.3.2 Properties of average mutual information
Exercise 2
3.1 Classification of information sources and their mathematical models
3.2 Discrete single-symbol sources
3.3 Discrete multi-symbol sources
3.3.1 Discrete stationary memoryless sources
3.3.2 Discrete stationary sources with memory
3.3.4 Source correlation and redundancy
3.4.1 Differential entropy of continuous sources
3.4.2 Maximum entropy of continuous sources
3.4.3 Entropy power of continuous sources
Exercise 3
Chapter 4 Channels and Channel Capacity
4.1 Channel classification
4.2 Discrete single symbol channel and its channel capacity
4.2.1 Mathematical model of discrete single symbol channel
4.2.2 Concept of channel capacity
4.2.3 Channel capacity of several special channels
4.2.4 Channel capacity of discrete symmetric channels
4.2.5 Channel capacity of general discrete channels
4.2.6 Channel capacity theorem
4.2.7 Channel capacity iterative algorithm
4.3 Discrete multi symbol channel and its channel capacity
4.4 Combined channel and its channel capacity
4.4.1 Independent parallel channels
4.4.2 Cascaded channels
4.5 Continuous channels and their channel capacity
4.5.1 Mutual information of continuous random variables
4.5.2 Channel capacity of Gaussian additive channel
4.5.3 Channel capacity of multidimensional Gaussian additive channel
4.6 Waveform channel and its capacity
Exercise 4
Chapter 5 Distortionless Source Coding
5.1 Relevant concepts of source coding
5.1.1 encoder
5.1.2 Classification of codes
5.3.1 Kraft inequality and McMillan inequality
5.3.3 Bound theorems on the average code length of compact codes
5.3.4 Distortionless variable length source coding theorem (Shannon's first theorem)
5.4 Variable length code coding method
5.4.1 Shannon codes
5.4.2 Shannon-Fano-Elias codes
5.4.3 Binary Huffman codes
5.4.4 r-ary Huffman codes
5.4.5 Fano codes
5.5 Practical lossless source coding method
5.5.1 Run-length coding
5.5.3 LZW coding
Exercise 5
Chapter 6 Noisy Channel Coding
6.1 Relevant concepts of channel coding
6.1.1 Error probability and decoding rules
6.1.2 Error probability and coding method
6.3 Upper bound of error probability
6.4.1 Classification of error-correcting codes
6.4.2 Basic concepts of error-correcting codes
Exercise 6
Chapter 7 Distortion limited source coding
7.1.2 Average distortion
7.2.1 D-distortion-permitted channels
7.2.2 Definition of the information rate-distortion function
7.2.3 Properties of the information rate-distortion function R(D)
7.3 Distortion limited source coding theorem
7.4 Calculation of information rate distortion function
7.4.1 Computing R(D) by parametric expression
7.4.2 Iterative algorithm of rate distortion function
7.5 Common distortion limited source coding methods
7.5.1 Quantization coding
Exercise 7
Appendix A Mathematical Preliminaries
A.1 Probability theory and stochastic processes
A.1.1 Basic concepts of probability theory
A.1.2 Random variables and their distributions
A.1.3 Multidimensional random variables and their distributions
A.1.4 Numerical characteristics of random variables
A.2 Convex functions and Jensen's inequality
A.3 Lemma for the channel capacity theorem
A.4 Asymptotic equipartition and ε-typical sequences
Appendix B Computer Exercises
B.1 Iterative algorithm for channel capacity
B.2 Unique decodability decision criterion
B.3 Shannon coding
B.4 Huffman coding
B.5 Fano coding
B.6 LZW coding
B.7 BSC simulator
B.8 Hamming (7,4) codec
References

Book Information V

Basic information

Chinese Title: Fundamentals of Information Theory
English title: Elements of Information Theory
Authors: Thomas M. Cover, Joy A. Thomas
Translators: Ruan Jishou, Zhang Hua
Edition: 1st (January 1, 2008)
Format: 16mo
Pages: 439
ISBN:9787111220404

Content summary

This book is a classic, concise textbook in the field of information theory. The main contents include entropy, sources, channel capacity, rate distortion, data compression, coding theory, and complexity theory. The book also focuses on network information theory and hypothesis testing, and investigates portfolio theory and investment in the stock market on the basis of the horse-race model. It is suitable as a textbook for an information theory course for senior undergraduates and graduate students in electronic engineering, statistics, and telecommunications, and can also serve as a reference for researchers and professionals.

Contents

Chapter 1 Introduction and Overview
Chapter 2 Entropy, Relative Entropy, and Mutual Information
Chapter 3 Asymptotic Equipartition Property
Chapter 5 Data Compression
Chapter 6 Gambling and Data Compression
Chapter 7 Channel Capacity
Chapter 8 Differential Entropy
Chapter 11 Information Theory and Statistics
Chapter 12 Maximum Entropy
Chapter 13 Universal Source Coding
Chapter 14 Kolmogorov Complexity
Chapter 15 Network Information Theory
Chapter 16 Information Theory and Portfolio Theory
Chapter 17 Inequalities in Information Theory
References [3]