Information theory is the theory of information, with its own well-defined research object and scope of application. Since its birth, however, it has been understood in different ways.[1] Information theory is widely applied in many fields: coding, cryptography and cryptanalysis, data transmission, data compression, estimation theory, and so on.[4]
Information theory treats the transmission of information as a statistical phenomenon and provides methods for estimating the capacity of communication channels. Information transmission and information compression are the two major fields of information theory research; the two are linked by the information transmission theorems and the source-channel separation theorem.
Shannon is known as the "father of information theory." The beginning of modern information theory research is usually dated to Shannon's paper "A Mathematical Theory of Communication," published in the Bell System Technical Journal in October 1948. The paper built in part on earlier work by Harry Nyquist and Ralph Hartley. In it, Shannon gave the definition of information entropy (hereafter "entropy"):
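The formula itself did not survive in this copy of the text; Shannon's definition, restated here for completeness, is

    H(X) = - Σ_i p(x_i) log2 p(x_i)

where p(x_i) is the probability that the source emits symbol x_i; with the logarithm taken to base 2, entropy is measured in bits.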
The concept of entropy in information theory is closely related to entropy in thermodynamics. Boltzmann and Gibbs did extensive work on entropy in statistical physics, and entropy in information theory drew inspiration from it.
Mutual information is another useful information measure; it quantifies the dependence between two random variables. The mutual information of two random variables X and Y is defined as:
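The definition referenced above is missing from this copy; the standard formula is

    I(X;Y) = Σ_{x,y} p(x,y) log2 [ p(x,y) / (p(x) p(y)) ]

Mutual information is symmetric and non-negative, equals zero exactly when X and Y are independent, and can also be written as I(X;Y) = H(X) + H(Y) - H(X,Y).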
Information theory is a discipline distilled from long-term communication practice in the late 1940s; it is the science of the general laws of efficient information processing and reliable information transmission.
E. C. Cherry once wrote an early history of information theory, tracing it from hieroglyphics, through the linguistics of the medieval Enlightenment, up to the work on telegraphy of E. N. Gilbert and others.
In the 1920s, H. Nyquist and R. V. L. Hartley first studied the ability of communication systems to transmit information and attempted to measure channel capacity; modern information theory began to take shape.
In 1948, Claude Shannon published "A Mathematical Theory of Communication," the first work to establish a mathematical model of the communication process. This paper, together with another published in 1949, laid the foundation of modern information theory.
With the rapid development of modern communication technology and its cross-penetration with other disciplines, the study of information theory has expanded from the narrow scope of Shannon's mathematical theory, originally limited to communication systems, into the vast system now called information science.[2] In technological applications, information theory has made indelible contributions to modern information technology. Information science, together with materials science and energy science, has become a leading area of modern science and technology, and the IT industry is among the fastest-growing, most promising, and most influential pillar industries of today's society. Without the guidance of information theory there would be no radio technology or television receiving systems, no network communications, remote control, or Bluetooth technology, no mobile communications or satellite navigation, to say nothing of the Internet and wireless communication networks; all of these core technologies at the forefront of society are guided by modern information theory.[4]
Basic content
Conventional communication systems such as the telegraph, telephone, and post transmit messages: voice information and text information. Broadcasting, telemetry, remote sensing, and remote control systems also transmit information of various kinds, so they too are information systems. Sometimes information must be transmitted in both directions: telephone communication requires two-way conversation, and remote control systems require the transmission of control information together with reverse measurement information. Such a two-way information system is in fact composed of two information systems. All information systems can be summarized in the model shown in the figure in order to study their basic laws.
Source: the origin of the information, or the entity that generates the information to be transmitted, such as the speaker in a telephone system. In a telecommunication system the source should also include the microphone, whose output electrical signal serves as the carrier of the information.
Destination: the recipient of the information. In the telephone system this is the listener together with the earpiece, which converts the received electrical signal back into sound so the listener can extract the desired information.
Encoder: in information theory, this generally refers to all devices that transform the signal; in practice it is the transmitting part of the terminal. It includes all equipment between the source and the channel, such as the quantizer, compression encoder, and modulator, which convert the signal output by the source into a signal suitable for channel transmission.
Once the source and destination are given and the channel is selected, the encoder and decoder determine the performance of the information system, so in designing an information system the main work is the design of the codec. In general, the main performance indices are effectiveness and reliability. Effectiveness means transmitting as much information as possible through the system; reliability requires that the information received at the destination be as consistent as possible with the information sent by the source, that is, that the distortion be as small as possible. The best codec makes the system as effective and reliable as possible. Reliability and effectiveness, however, are often in conflict: the more effective the system, the less reliable it tends to be. In quantitative terms, the system should transmit the maximum information rate at a specified reliability, or achieve the minimum distortion at a specified information rate. The basic task of information theory is to calculate this maximum information rate and to prove that a codec exists that reaches, or approaches, this value. A theory restricted to such questions may be called Shannon information theory; it is generally held that the content of information theory should be broader, including the theory of extracting information and of ensuring information security. The latter topics belong to estimation theory, detection theory, and cryptography.
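As an illustrative sketch (my addition, not part of the original text), the trade-off between rate and reliability described above is quantified by channel capacity: for a binary symmetric channel that flips each transmitted bit with probability p, the maximum reliable information rate is C = 1 - H(p) bits per channel use.

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy H(p) of a binary source with symbol probability p, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity C = 1 - H(p) of a binary symmetric channel with
    crossover probability p, in bits per channel use."""
    return 1.0 - binary_entropy(p)

# A noiseless channel (p = 0) carries a full bit per use; a channel
# whose output is independent of its input (p = 0.5) carries nothing.
print(bsc_capacity(0.0))  # 1.0
print(bsc_capacity(0.5))  # 0.0
```

Shannon's noisy-channel coding theorem guarantees that codes exist achieving any rate below C with arbitrarily small error probability, which is the sense in which a codec can "reach or approach" the maximum rate.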
Information theory is built on probability theory, taking as its basis the probabilistic characteristics of source symbols and channel noise. Information of this kind is often called syntactic information. In fact, the basic laws of information systems should also cover semantic information and pragmatic information. Syntactic information concerns the structure or objective characteristics of the source output symbols and has nothing to do with the subjective requirements of the destination, whereas semantics considers the meaning of each symbol. The same meaning can be expressed in different languages or scripts, and the syntactic information they contain can differ; generally, the semantic information rate can be less than the syntactic information rate. For example, the message rate of a telegram can be lower than that of speech expressing the same meaning. Furthermore, the recipient of a message often needs only the information that is useful to him: a language he cannot understand is meaningful but useless to him. Pragmatic information, that is, information useful to the destination, is therefore generally less than semantic information. If an information system is required to transmit only semantic or pragmatic information, its efficiency will obviously be higher. At present a systematic theory of syntactic information has been established on the basis of probability theory, forming a discipline of its own, while the study of semantic and pragmatic information is not yet mature; discussion of the latter is usually assigned to information science or generalized information theory and does not belong to information theory in the usual sense. To sum up, the basic laws of information systems should cover information measurement, source characteristics and source coding, channel characteristics and channel coding, detection theory, estimation theory, and cryptography.
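To make the source-coding side concrete, here is a minimal sketch (my illustration, not from the text) of Huffman coding, a classic lossless source code that assigns shorter codewords to more probable symbols so that the average codeword length approaches the source entropy:

```python
import heapq
from collections import Counter

def huffman_code(freqs: dict) -> dict:
    """Build a prefix-free binary code from a symbol-frequency table."""
    # Heap entries: (weight, tie-breaker, {symbol: partial codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:  # degenerate one-symbol source
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        # Merge the two least probable subtrees, prefixing their codewords.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code(Counter("abracadabra"))  # a:5, b:2, r:2, c:1, d:1
print(code["a"])     # the most frequent symbol gets the shortest codeword
print(sorted(code))  # ['a', 'b', 'c', 'd', 'r']
```

No codeword is a prefix of another, so the encoded bit stream can be decoded unambiguously without separators.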
Information theory is the science that uses mathematical statistics to study the laws of the measurement, transmission, and transformation of information. It mainly studies the universal phenomenon of information transfer and seeks the basic theory for solving, in the best way, the problems of limiting, measuring, transforming, storing, and transmitting information.
Scope of study
Information theory is a comprehensive applied mathematical discipline whose subject matter includes information entropy, data compression and transmission, and encryption and decryption techniques in communication.[4] The research scope of information theory is extremely broad; it is generally divided into three branches:
(1) Information theory in the narrow sense: the science that applies mathematical statistics to the study of information processing and information transmission. It studies the common laws of information transmission that exist universally in communication and control systems, and how to improve the effectiveness and reliability of each information transmission system.
(2) General information theory: mainly the study of communication problems, but also including noise theory, signal filtering and prediction, modulation, and information processing.
(3) Generalized information theory: the widest of the three branches, extending the concepts and methods of information beyond communication systems to information problems in all fields.
Alternative definitions of information have also been proposed:
Information is an increase in certainty (the inverse of Shannon's definition of information);
Information is a marker of matter, energy, and information (the inverse of Wiener's definition);
Information is the set of things and their attribute identifiers.
Information and communication
Information is a kind of message and is closely tied to problems of communication. In 1948, Shannon of Bell Laboratories systematically set out the treatment of information in his paper "A Mathematical Theory of Communication" and founded information theory. Wiener's proposed mathematical formula for measuring the amount of information opened up broad prospects for the application of information theory. In 1951 the Institute of Radio Engineers in the United States recognized information theory as a subject, and it developed rapidly thereafter. The 1950s were a period in which information theory made an impact on various disciplines; the 1960s were not a period of major innovation but one of digestion and understanding, of major construction on the existing foundations, with research focused on information and source coding. By the 1970s, with the wide use of digital computers, the capability of communication systems had also greatly improved, and how to use and process information became an increasingly urgent question. People became ever more aware of the importance of information: like materials and energy, information can be fully used and shared as a resource. The concepts and methods of information came to be widely applied across the sciences, and there was an urgent need to break through the narrow scope of Shannon's information theory so that it could become the basic theory of the information problems encountered in all human activities, thereby promoting the further development of many emerging disciplines. The laws and theories of information established earlier have been widely applied in physics, chemistry, biology, and other disciplines, and an information science studying the generation, acquisition, transformation, transmission, storage, processing, display, identification, and utilization of information is taking shape.
Information science
Information science is the new frontier discipline that has developed, in the course of expanding humanity's understanding and use of information, on the basis of computer science, artificial intelligence, systems engineering, automation technology, and other disciplines. Its main tasks are to study the nature of information and the general laws by which machines, organisms, and humans acquire, transform, transmit, process, utilize, and control information, and to design and develop information machines and control equipment that automate these operations, so as to free humanity from the shackles of natural forces and enhance its ability to transform the world. Information science also has important applications to problems of security.
With the continuing development of information science, the study of information theory has become closely connected with many modern fields: communication, radar, sonar, navigation, telemetry, remote control, remote sensing, computer and information processing technology, physics, biology, bionics, and so on.[4]
Information theory hypothesis
Detailed definition
Matter, energy, and information are the three major elements of the world. People already have a deep understanding of matter and energy, but the understanding of information has only just begun. What, then, is information? How does it exist? What role does it play? What follows is my own conjecture, offered in the hope that it will help humanity understand the world further.
1. Definition of information
Definition not in terms of the three elements of the world: information is the set of things and their attribute identifiers.
Definitions in terms of the three elements:
1. Information is an increase in certainty (the inverse of Shannon's definition of information);
2. Information is information; information is a marker of matter, energy, information, and their attributes (the inverse of Wiener's definition).
Information is a universal form of the states and motion characteristics of objective things; a vast amount of information expressed in these ways exists, arises, and propagates in the objective world. But this is only information as seen in three-dimensional space; information has a deeper nature. Does information also exist in four-dimensional space (here meaning a four-dimensional state of space, not including time)? Yes; to be precise, on this hypothesis information exists only in four-dimensional space, and information in three-dimensional space is merely the shadow of the real information in four-dimensional space. Information exists in great quantity in four-dimensional space, and its essence is the regular arrangement of information particles (the basic units of information) existing there.
Information is the root cause of events; this will be analyzed in detail in Section 3.
2. Nature of information
Information has the following properties: objectivity, universality, integrity, and specificity. First, information exists objectively; it is not determined by the will, yet it has a necessary connection with human thought (analyzed concretely in Section 4). Information is also widespread: four-dimensional space is filled with a vast number of information particles. An important property of information is integrity: no single information particle can determine any event; only two or more information particles arranged according to a rule constitute complete information, and the energy released must be sufficient to make the event occur. Information also has specificity: each piece of information determines a particular event, yet the information of similar events is itself similar; explaining why requires the further discovery of information-particle types and a theory of the code governing their arrangement.
3. Mechanism of the information theory hypothesis
In the normal state, information particles are randomly distributed in four-dimensional space. When molecules in three-dimensional space collide through friction, energy escapes into four-dimensional space and initiates the regular arrangement of information particles; the arranged information particles then release energy back into three-dimensional space, causing other molecules to collide through friction, and so on. If the molecules caused to collide happen to be "deciders," the factors that determine an event (such as the sodium and potassium ions of a nerve impulse, or the electric charge that causes lightning), and a sufficient amount of substance is present, the event occurs when friction brings the deciders into collision. Of course, different molecular collisions generate different energies, the types of information-particle arrangements they trigger differ, and so the events they determine also differ.
Before the cosmic explosion, however, only information existed. A decisive factor (what this factor is remains unknown) led to the accidental regular arrangement of information particles, some of which were converted into energy (the conversion of information particles into energy requires certain conditions and could occur only before or at the beginning of the cosmic explosion); energy was then converted into matter under certain conditions, and transfer and transformation continued until our present universe was formed. Thus the orderly arrangement of information particles is the root cause of events, the friction and collision of matter is their direct cause, and the transfer of energy is their necessary condition.
4. Examples of the information theory hypothesis
1. Thought and memory: Thought has always eluded us. According to the information theory hypothesis, thought is in fact a kind of information. The friction and collision of certain molecules in the brain lead to the regular arrangement of certain information particles, which appears in three-dimensional space as the generation of electric current that drives the activity of brain cells; this is the essence of thought, and different information manifests as different thoughts. Does this mean our thoughts are already predetermined? In fact it does; but the number of molecules in our brains is enormous, the types of information-particle arrangements they can trigger are many, and our thought has developed only a small part of them. In reality, what we call thinking involves another factor: it must be expressed through a complete and complex regulatory mechanism, the nervous system, which is why only we can express complex ideas. Memory is a specialization of thought: when the molecules brought into frictional collision by information happen to be the molecules that produced a thought before (the deciders of memory), the earlier thought is expressed again through the regular arrangement of the specific information particles. In this way our thoughts are continuous, the thought of one moment directly determining the thought of the next, though we have no way of detecting it.
2. Life phenomena: Birth, aging, illness, and death can also be explained by the information theory hypothesis. Illness is information triggered by the friction of foreign molecules (bacteria or viruses) against molecules in the body. Growth is information triggered by the friction of various external elements (such as calcium ions) entering the human body. Aging and death are information triggered by molecular friction within cells, whose macroscopic manifestation is cell senescence and apoptosis, which in turn affects the person.
3. Premonition and coincidence: Premonition is a very special form of thought. When certain molecules in the brain trigger an arrangement of information particles, the information does not release all of its energy at once: part is released first, causing the friction of the premonition's deciders, and the rest is released at another time; because it is the same as the earlier portion, it happens to cause the friction of the event's deciders, and the premonition is thereby confirmed. Coincidence is likewise a very special phenomenon. Its essence is that the energy released by information divides into two parts that enter different places in three-dimensional space, causing the friction of the same molecules and hence the same thing to happen in different places. This usually occurs with identical twins, whose genetic similarity makes the probability that the same molecules undergo friction large.
4. Dreams and false impressions: Dreams are thoughts generated unconsciously; their essence is also information. We often have false impressions: we see a scene and it seems to have happened before, though it has not. In fact, molecular friction in the brain triggers information which does not release its energy immediately but stores it temporarily; when the same molecular friction occurs at another moment, that energy is activated and double the energy is released, one half producing the thought and the other half producing the impression of familiarity. This is the essence of the false impression.
5. Chemical reactions: The essence of all chemical reactions is information. The friction of certain molecules triggers specific information, which causes other molecules to rub against one another, breaking and forming chemical bonds and completing the reaction.
6. Destiny and the soul: The ancients believed in fate, probably because they felt that we are individuals already arranged in another space; hence humanity's reveries about the soul and God.
5. Significance of the information theory hypothesis
The information theory hypothesis unifies matter and thought. It is a necessary step in the development of materialism, explaining with a materialist outlook problems that humanity has so far been unable to understand. It is itself only a hypothesis, which humanity must explore and prove over a long period, and it has its own defects, which humanity must continually uncover. Perhaps it is mistaken, but it is a witness to human growth and a great piece of humanity's spiritual wealth.
Looking at problems from the standpoint of this hypothesis can open up a new world and help explore the deeper nature of things. It offers humanity rich experience and is a model of looking at problems outside one's ingrained patterns of thought. In short, correct or not, it is an immortal work of humanity.
This book is divided into seven chapters. Chapter 1 is an introduction, covering the basic concept and definition of information and the origin, development, and research content of information theory. Chapter 2, sources and source entropy, introduces the concepts, properties, and theorems of the various entropies. Chapter 3, lossless source coding, presents the fixed-length and variable-length coding theorems and methods, together with several practical lossless source codes. Chapter 4, distortion-limited source coding, covers the definition, properties, and calculation of the information rate-distortion function and the predictive coding of speech and image signals. Chapter 5, channels and channel capacity, introduces channel models and capacity calculations for single-symbol discrete channels, multi-symbol discrete channels, and multi-user channels. Chapter 6, channel coding, covers the basic concepts of channel coding, the channel coding theorem, linear block codes, and cyclic codes. Chapter 7, network information security and cryptography, introduces the basic concepts of cryptography, encryption algorithms, digital signatures, and related techniques.
Publication date: first edition, October 2010
Format: 16mo
Pages: 242
Price: 29 yuan
Content overview
"Fundamentals of Information Theory" is the accumulation of the author's many years of teaching and research practice; it further optimizes and integrates the course content, with improvements and supplements, on the basis of absorbing the strengths of excellent textbooks at home and abroad. The book is divided into 12 chapters, covering: the basic concept of information; the content and progress of Shannon's information theory; the measurement of discrete information; discrete sources; continuous information and continuous sources; distortionless source coding; discrete channels and their capacity; noisy-channel coding; waveform channels; the information rate-distortion function; constrained channels and their coding; a preliminary treatment of network information theory; and the methods and applications of information theory.
"Answers to Basic Exercises in Information Theory" is an auxiliary teaching material supporting the undergraduate textbook "Fundamentals of Information Theory." Its main purpose is to provide students with more worked explanations and demonstrations of basic problems in information theory, to broaden students' approaches to problem solving, to improve their ability to solve basic and comprehensive problems in information theory, and thereby to further raise the quality of theoretical teaching in the information theory course. Many of the exercises come from traditional or classic textbooks at home and abroad, and a considerable number are typical problems refined and verified through the long teaching practice of front-line teachers. The book follows the same structure as the main textbook, with 12 chapters, each comprising four parts: key knowledge points, detailed solutions to examples, exercise solutions, and supplementary exercise solutions.
This book is a classic, concise textbook in the field of information theory. Its main contents include entropy, sources, channel capacity, rate distortion, data compression and coding theory, and complexity theory. The book also treats network information theory and hypothesis testing, among other topics, and, building on the horse-race model, studies of the stock market, portfolio investment, and investment philosophy and technique. It is suitable as a textbook for a basic course in information theory for senior undergraduates and graduate students in electronic engineering, statistics, and telecommunications, and as a reference for researchers and professionals.