Information Theory


Information theory defines the mathematical foundations of communication and information interchange. Its central terms are information and entropy. Information theory makes it possible to quantify the resources necessary for transferring information.
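As a minimal illustration of what such quantification means, the following Python sketch computes the entropy of a memoryless symbol source, i.e. the average number of bits per symbol an ideal code needs; the function name and the example probabilities are assumptions chosen for this sketch, not part of the original text.

    # Illustrative sketch: Shannon entropy H = -sum(p * log2(p)) gives the
    # average number of bits per symbol an ideal code needs for a
    # memoryless source. The probabilities below are assumed examples.
    from math import log2

    def entropy(probabilities):
        return -sum(p * log2(p) for p in probabilities if p > 0)

    # Source emitting 'a' with probability 0.5, 'b' and 'c' with 0.25 each:
    print(entropy([0.5, 0.25, 0.25]))  # prints 1.5 (bits per symbol)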


Information theory is important for communication technology, data compression, cryptography, and related fields, as well as for subjects beyond the technical sciences such as genetics, neurology, and information science.


In information theory, information is regarded as a quantitative, mathematical measure without taking its meaning into consideration. The evaluation and interpretation of information (semantics, pragmatics) are not part of this theory; the meaning of a piece of information is not relevant for system design on a purely technical level.


Claude Elwood Shannon is generally regarded as the originator of information theory. The initial publication was his article "A Mathematical Theory of Communication", published in the Bell System Technical Journal in 1948.


Shannon himself refers to the preceding work of Nyquist, Hartley, and Tukey:

H. Nyquist:

(Certain Factors Affecting Telegraph Speed; Bell System Technical Journal; April 1924)

(Certain Topics in Telegraph Transmission Theory; A.I.E.E. Trans.; April 1928)

R. V. L. Hartley:

(Transmission of Information; Bell System Technical Journal; July 1928)


Later on, information theory was continuously extended by further contributors (Kullback, Leibler, and others).

