This article discusses the eight most common types of entropy measures used in various fields, including Shannon, Kolmogorov-Sinai, and von Neumann entropy.
Introduction
Entropy measures are widely used in fields such as information theory, physics, and thermodynamics. Entropy quantifies the degree of disorder or randomness in a system. In information theory, it measures the amount of information contained in a message, while in thermodynamics, it measures the amount of thermal energy unavailable for doing useful work.
Different fields favour different formulations. In this article, we discuss the eight most common entropy measures.
Shannon entropy
Shannon entropy, also known as information entropy, is the most well-known and commonly used entropy measure. It is named after Claude Shannon, who introduced it in 1948. Shannon entropy measures the average amount of information required to describe the outcome of a random variable.
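Concretely, for a discrete random variable with probability mass function p, the Shannon entropy is H = -Σ p(x) log₂ p(x), measured in bits. The Python sketch below is a minimal illustration; the function name and example distributions are our own choices.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```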
Kolmogorov-Sinai entropy
Kolmogorov-Sinai entropy, also known as metric entropy, measures the rate at which a dynamical system produces information. It was introduced by Andrey Kolmogorov in 1958 and refined by Yakov Sinai in 1959. It is used to study the dynamics of chaotic systems, where a positive value is a standard signature of chaos.
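Computing Kolmogorov-Sinai entropy exactly is hard in general; for one-dimensional chaotic maps, Pesin's identity (under suitable conditions) equates it with the positive Lyapunov exponent. The sketch below estimates that exponent for the logistic map at r = 4, where the exact value is ln 2 ≈ 0.693 nats per iteration; the map choice, starting point, and iteration counts are illustrative assumptions, not a general-purpose estimator.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.2, n_transient=1_000, n_iter=100_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x).

    For one-dimensional chaotic maps, Pesin's identity relates the
    Kolmogorov-Sinai entropy to the (positive) Lyapunov exponent.
    """
    x = x0
    for _ in range(n_transient):        # discard transient behaviour
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        deriv = abs(r * (1 - 2 * x))    # |f'(x)|
        acc += math.log(max(deriv, 1e-300))  # guard against log(0)
    return acc / n_iter

# For r = 4 the exact value is ln 2 ≈ 0.693 nats per iteration.
print(lyapunov_logistic())
```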
Rényi entropy
Rényi entropy is a generalization of Shannon entropy introduced by Alfréd Rényi in 1961. It quantifies the uncertainty in a system, and unlike Shannon entropy it carries an order parameter α ≥ 0 that yields a whole family of measures: Hartley (max-) entropy at α = 0, Shannon entropy in the limit α → 1, collision entropy at α = 2, and min-entropy as α → ∞.
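A short Python sketch of the family follows, using the definition H_α = log(Σ p^α) / (1 − α) and handling the α = 1 and α → ∞ limits explicitly; the function name and example distribution are ours.

```python
import math

def renyi_entropy(probs, alpha, base=2):
    """Rényi entropy H_alpha = log(sum(p^alpha)) / (1 - alpha)."""
    if alpha == 1:                        # limit alpha -> 1 is Shannon entropy
        return -sum(p * math.log(p, base) for p in probs if p > 0)
    if math.isinf(alpha):                 # limit alpha -> inf is min-entropy
        return -math.log(max(probs), base)
    return math.log(sum(p ** alpha for p in probs), base) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0))         # Hartley (max-) entropy: log2(3) ≈ 1.585
print(renyi_entropy(p, 1))         # Shannon entropy: 1.5
print(renyi_entropy(p, 2))         # collision entropy: ~1.415
print(renyi_entropy(p, math.inf))  # min-entropy: 1.0
```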
Tsallis entropy
Tsallis entropy is another generalization of Shannon entropy, introduced by Constantino Tsallis in 1988. It measures the degree of non-extensivity, or long-range correlations, in a system. Like Rényi entropy, it is controlled by a parameter (commonly written q) that yields a family of measures, with Shannon entropy recovered in the limit q → 1.
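For a discrete distribution the definition is S_q = (1 − Σ p^q) / (q − 1). The Python sketch below illustrates this; the function name and example values are our own.

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1)."""
    if q == 1:                            # limit q -> 1 is Shannon entropy (in nats)
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
print(tsallis_entropy(p, 1))    # Shannon entropy in nats: ~1.040
print(tsallis_entropy(p, 2))    # 1 - sum(p^2) = 0.625 (the Gini-Simpson index)
print(tsallis_entropy(p, 0.5))  # q < 1 gives more weight to rare outcomes
```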