**Entropy** is a concept in thermodynamics (see thermodynamic entropy) and information theory. The two concepts do actually have something in common, although it takes a thorough understanding of both fields for this to become apparent.

Claude E. Shannon defined a measure of entropy, *H* = −Σ *p _{i}* log _{2} *p _{i}*, where *p _{i}* is the probability of the *i*-th possible value of a source symbol.
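Shannon's formula can be computed directly from a probability distribution. A minimal sketch (the function name `entropy` is ours, not from any standard library):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    Terms with p == 0 contribute nothing (lim p*log p = 0 as p -> 0).
    Adding 0.0 at the end normalizes IEEE -0.0 to 0.0 for a
    single-outcome source.
    """
    return -sum(p * log2(p) for p in probs if p > 0) + 0.0

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([1.0]))         # certain outcome: 0.0 bits
print(entropy([0.25] * 4))    # two fair coins: 2.0 bits
```

Note that the result is maximized when all outcomes are equally likely, and drops to zero when one outcome is certain.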

Shannon's definition of entropy is closely related to thermodynamic entropy as defined by physicists and many chemists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, which became the inspiration for adopting the term entropy in information theory. There are relationships between thermodynamic and informational entropy. For example, Maxwell's demon appears to reverse thermodynamic entropy by using information, but acquiring that information costs exactly as much as the thermodynamic gain the demon would otherwise achieve.

In information theory, entropy is conceptually the amount of (information-theoretic) information in a piece of data. Entirely random byte data has an entropy of about 8 bits per byte, the maximum possible, since the next byte cannot be predicted at all. A long string of A's has an entropy of 0, since the next character will always be an 'A'. The entropy of English text is about 1.5 bits per character (try compressing it with the PPM compression algorithm!). The entropy rate of a data source means the average number of bits per symbol needed to encode it.

- Many data bits may not convey information. For example, data structures often store information redundantly, or have identical sections regardless of the information in the data structure.
- The amount of entropy is not always an integer number of bits.