Entropy
Entropy is a measure defined in information theory that quantifies the information content of an information source (e.g. the contents of an ordinary file). An information source has high entropy if its contents are of a random nature. Entropy is low if the source contains regular structure, e.g. if some parts appear more frequently than others.
Entropy indicates how well a data compression process can succeed. Provided that the probability distribution of the symbols is known, the entropy H = -Σ p(x) · log₂ p(x) gives the average code length (in bits per symbol) necessary to encode this information source.
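As an illustration, the following sketch computes the entropy of a byte sequence from its observed symbol frequencies (the function name `shannon_entropy` is chosen here for clarity, not taken from the text):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Average number of bits per symbol needed to encode `data`,
    estimated from the relative frequency of each byte value."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A regular, repetitive source yields low entropy ...
print(shannon_entropy(b"aaaaaaab"))        # ≈ 0.544 bits per symbol
# ... while uniformly distributed symbols maximise it.
print(shannon_entropy(bytes(range(256))))  # 8.0 bits per symbol
```

A source over 256 byte values can need at most 8 bits per symbol, which is reached exactly when all values are equally likely.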
In information theory, entropy is one of the most fundamental concepts. It was introduced in 1948 by Claude Shannon, who founded the theory.
