Entropy Coding


The entropy specifies the minimum average code length necessary for coding the data of a source. A prerequisite is knowledge of the probability of occurrence of each symbol of this source.
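
The entropy can be computed directly from these symbol probabilities. The following is a minimal sketch in Python; the four-symbol distribution is a made-up example, not data from this page:

    from math import log2

    def entropy(probabilities):
        """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
        return -sum(p * log2(p) for p in probabilities if p > 0)

    # Hypothetical source with four symbols and known probabilities.
    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    print(entropy(probs.values()))  # 1.75 bits per symbol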


The term entropy coding covers all procedures that exploit the probability distribution, and thus the entropy, of a source in order to produce a code of ideal length.


Entropy encoders assign short code words to frequently appearing, high-probability symbols and, in contrast, longer code words to rare symbols. In total, compression can thereby be achieved.
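
A minimal sketch of this effect, reusing the hypothetical distribution from above together with an assumed prefix code in which the most frequent symbol receives the shortest code word:

    # Assumed prefix code: short code words for frequent symbols.
    code = {"a": "0", "b": "10", "c": "110", "d": "111"}
    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    # Average code length: sum over all symbols of p(s) * len(code(s)).
    avg = sum(probs[s] * len(code[s]) for s in probs)
    print(avg)  # 1.75 bits per symbol, versus 2 bits for a fixed-length code

For this (deliberately dyadic) distribution the average code length equals the entropy of 1.75 bits per symbol, so the code is ideal.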


Examples of procedures for entropy coding:


Shannon Fano Coding

Huffman Coding (see the sketch after this list)

Arithmetic Coding
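
As one concrete instance of these procedures, the following is a minimal sketch of Huffman coding using Python's heapq module; it repeatedly merges the two least probable entries into a new node. The frequencies are again the made-up example from above:

    import heapq
    from itertools import count

    def huffman_code(freqs):
        """Build a Huffman code by repeatedly merging the two least
        probable entries; a counter breaks ties so the heap never has
        to compare the code dictionaries themselves."""
        tiebreak = count()
        heap = [(f, next(tiebreak), {s: ""}) for s, f in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            f1, _, c1 = heapq.heappop(heap)
            f2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
        return heap[0][2]

    print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
    # -> {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

For this distribution the result is exactly the prefix code assumed earlier, and thus of ideal length.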



Depending on the procedure used, the theoretical optimum is approached more or less closely. While arithmetic coding can represent the optimum precisely, applying Shannon-Fano coding results in a clear deviation. Huffman coding achieves better results, but the ideal code length still cannot be reached in general.
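
A small made-up example of this deviation: for a two-symbol source with probabilities 0.9 and 0.1, any prefix code, including Huffman's, must spend at least one whole bit per symbol, while an arithmetic coder can approach the entropy bound:

    from math import log2

    p = [0.9, 0.1]
    print(-sum(x * log2(x) for x in p))  # entropy ~= 0.469 bits per symbol
    # A Huffman code for two symbols is "0" and "1": exactly 1 bit per
    # symbol, more than twice the entropy; arithmetic coding can come
    # close to the 0.469-bit bound.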


Although the selection of the procedure influences the compression result, the overall efficiency is determined to a considerable extent by the accuracy of the preceding steps. The model used to predict the probability of the next symbol is crucial for success.
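
The page does not prescribe a particular model; as a rough illustration, the following sketch is a generic adaptive order-0 model that estimates probabilities from the symbol frequencies observed so far (the add-one initialisation is an assumption, chosen so that no symbol ever gets probability zero):

    from collections import Counter

    class Order0Model:
        """Adaptive order-0 model: estimates symbol probabilities
        from the frequencies observed so far."""

        def __init__(self, alphabet):
            self.counts = Counter({s: 1 for s in alphabet})
            self.total = len(alphabet)

        def probability(self, symbol):
            return self.counts[symbol] / self.total

        def update(self, symbol):
            self.counts[symbol] += 1
            self.total += 1

    model = Order0Model("abcd")
    for s in "aababc":
        print(s, round(model.probability(s), 3))
        model.update(s)

An entropy coder driven by such a model would query the current probability of each symbol before coding it and update the model afterwards, so encoder and decoder stay synchronised.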

