
Entropy


Entropy is a measure defined in information theory that quantifies the information content of an information source (e.g. the contents of an ordinary file). An information source has high entropy if its contents are of a random nature. Entropy is low if the source contains regular structures, e.g. if some symbols appear more frequently than others.
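In Shannon's definition, a source that emits the symbols x_1, ..., x_n with probabilities p(x_1), ..., p(x_n) has the entropy

    H = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)

measured in bits per symbol.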


The entropy measure gives an indication of how successful a data compression process can be. Provided that the probability distribution is known, the entropy yields the minimum average code length (in bits per symbol) necessary to encode the information source.
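As an illustration (a minimal sketch, not taken from this site), the following Python function estimates the entropy of a byte sequence from its observed byte frequencies; the name byte_entropy is chosen here for the example:

    import math
    from collections import Counter

    def byte_entropy(data: bytes) -> float:
        """Entropy of `data` in bits per byte, from observed byte frequencies."""
        if not data:
            return 0.0
        counts = Counter(data)
        total = len(data)
        # H = -sum(p * log2(p)) over all byte values that actually occur
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # A purely repetitive source has entropy 0; a source using all 256 byte
    # values with equal frequency reaches the maximum of 8 bits per byte.
    print(byte_entropy(b"aaaaaaaa"))        # 0.0
    print(byte_entropy(bytes(range(256))))  # 8.0

Under this simple symbol model (bytes treated as independent symbols), the result is the lower bound on the average code length per byte that an entropy coder can achieve.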


Entropy is one of the most essential terms in information theory. It was introduced in 1948 by Claude Shannon, who founded the theory.

