
Information of a Message


In information theory, the information of a message is a value that depends only on the probability of the message's occurrence.


Definition: Information of a Message m


   I(m) = -log2 P(m)

According to this definition, information has the following characteristics:

  • Increasing the probability of a message's occurrence results in decreasing information.
  • Information is never negative, because the probability lies in the range of 0 to 1.
  • The information of a message whose probability is close to 0 is very large; for P(m) -> 0 it tends to infinity.
  • Messages whose probabilities differ only slightly also differ only slightly in information.
  • The information of two messages can be added if they are independent of each other, because the joint probability is then the product P(m1) * P(m2), which the logarithm turns into a sum.
  • Using the logarithm to base 2, the information gives the best possible code length in bits for the message, as illustrated by the sketch below.
  • If the probability of a message is 0.5, its information is 1. A proper code would assign a code length of 1 bit.
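
As a quick numeric illustration of the definition and of the additivity property, here is a minimal Python sketch; the function name info_bits is chosen for this example only and does not come from the source:

   from math import log2

   def info_bits(p):
       """Information of a message with probability p, in bits."""
       return -log2(p)

   # A probability of 0.5 yields exactly 1 bit of information.
   print(info_bits(0.5))                  # 1.0

   # Smaller probabilities yield larger information values.
   print(info_bits(0.125))                # 3.0

   # For independent messages, information adds up:
   # P(m1 and m2) = P(m1) * P(m2), and the logarithm turns
   # the product into a sum.
   p1, p2 = 0.5, 0.25
   print(info_bits(p1 * p2))              # 3.0
   print(info_bits(p1) + info_bits(p2))   # 3.0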
