Dear friends, today I want to talk to you about a fascinating concept that encompasses both thermodynamics and information theory: entropy. This term, often misunderstood, plays a crucial role in both disciplines, albeit with different nuances. Allow me to guide you through an explanation that I hope will clarify the connection between these two interpretations, while also providing a general and academic definition of entropy.
Let’s start with thermodynamics. Entropy is not a measure of energy density. Instead, it is a measure of the “degeneracy of the microstates of a system.” But what exactly does this mean? Let’s clarify with a simple example.
Imagine a thermodynamic system as a box full of particles. Each particle has microscopic variables, such as position and momentum, and the complete specification of these variables for every particle defines a “microstate” of the system. When there are many particles, the number of possible microstates is enormous. However, we don’t need to know every single microstate to understand how the system behaves as a whole. A few macroscopic variables, such as temperature and density, which represent the aggregated properties of the system, are enough, and these define the “macrostates.”
Each macrostate can correspond to many different microstates. Entropy, therefore, is a measure of how many microstates correspond to a given macrostate. The greater the number of microstates that correspond to a macrostate, the greater the entropy of that state. Mathematically, entropy $S$ can be expressed through Boltzmann’s formula:

$$S = k_B \ln \Omega$$

where $k_B$ is Boltzmann’s constant and $\Omega$ represents the number of microstates compatible with the macrostate.
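To see the formula in action, here is a minimal Python sketch (standard library only); the count of microstates used at the end is purely hypothetical, chosen for illustration.

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value).
K_B = 1.380649e-23

def boltzmann_entropy(omega: float) -> float:
    """Return S = k_B * ln(Omega) for a macrostate with omega compatible microstates."""
    return K_B * math.log(omega)

# Hypothetical number of microstates, chosen only for illustration.
print(boltzmann_entropy(1e20))  # ~6.36e-22 J/K
```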
Let’s consider an example with dice. If we roll two dice, the possible microstates are the ordered pairs of face values, for a total of 36 microstates (e.g., (1, 1), (1, 2), (2, 1), (3, 1), (3, 2), …, (6, 6)). However, in a game we are often only interested in the sum of the two dice, not the individual values; these sums define the macrostates, ranging from 2 to 12. Some macrostates have more corresponding microstates (e.g., a sum of 7 can be obtained in 6 different ways), while others have fewer (the sums of 2 and 12 can each be obtained in only one way).
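To make the counting concrete, here is a short Python sketch (standard library only) that enumerates the 36 microstates and tallies how many fall under each sum macrostate.

```python
from collections import Counter
from itertools import product

# All 36 ordered microstates (die 1, die 2) of two fair dice.
microstates = list(product(range(1, 7), repeat=2))

# Group them by the macrostate of interest: the sum of the two dice.
omega = Counter(d1 + d2 for d1, d2 in microstates)

for total in range(2, 13):
    print(f"sum = {total:2d}  ->  {omega[total]} microstates")
# sum = 7 has 6 microstates; sums 2 and 12 have only 1 each.
```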
Entropy in information theory, on the other hand, is a measure of uncertainty or unpredictability. In this context, it quantifies the amount of information required to describe a system. For example, in a communication system, entropy measures the average amount of information produced by a stochastic source of data. Shannon’s entropy formula, which is foundational in information theory, is given by:

$$H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)$$

where $X$ is a random variable with possible states $x_1, \dots, x_n$, and $p(x_i)$ is the probability of occurrence of state $x_i$. This formula measures the uncertainty associated with a probability distribution.
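As a direct, minimal transcription of the formula into Python (not a library implementation), one could write something like:

```python
import math

def shannon_entropy(probabilities):
    """Return H(X) = -sum(p * log2(p)) in bits, skipping zero-probability states."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair six-sided die: six equally likely outcomes.
print(shannon_entropy([1/6] * 6))   # ~2.585 bits

# A heavily biased coin is far more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits
```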
Returning to the example of the dice, if the sum is 2, we know that the values are (1, 1) and no information is missing. If the sum is 7, however, there are six possible microstates, and we would need approximately 2.58 bits of information to identify the exact microstate, since $\log_2 6 \approx 2.58$. This amount of missing information is what we mean by entropy in information theory.
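Building on the tally of microstates above, a small sketch can confirm these numbers; the counts are hard-coded here only to keep the snippet self-contained.

```python
import math

# Microstate counts per dice-sum macrostate (from the tally above).
omega = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}

for total, count in sorted(omega.items()):
    # With `count` equally likely microstates, the missing information is log2(count) bits.
    print(f"sum = {total:2d}: {math.log2(count):.3f} bits missing")
# sum = 2 -> 0.000 bits, sum = 7 -> 2.585 bits
```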
In summary, while thermodynamic entropy measures the degeneracy of the microstates underlying a macrostate, informational entropy measures the missing information needed to distinguish the microstates of a macrostate. Both concepts use a logarithmic relationship to quantify this complexity.
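In fact, if every microstate of a macrostate is assumed equally likely, the two notions differ only by a constant factor: $S = k_B \ln \Omega = (k_B \ln 2)\,H$, with $H = \log_2 \Omega$ in bits. A quick numerical check for the sum-of-7 macrostate:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

omega = 6                          # microstates of the sum-of-7 macrostate
s_thermo = K_B * math.log(omega)   # Boltzmann entropy, in J/K
h_bits = math.log2(omega)          # informational entropy in bits (uniform microstates)

# The two quantities differ only by the constant factor k_B * ln(2).
print(math.isclose(s_thermo, K_B * math.log(2) * h_bits))  # True
```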
General and academic definition of entropy:
Entropy is a measure of the amount of disorder or randomness in a system. In general terms, entropy quantifies the uncertainty associated with a physical system or a set of data. In the context of thermodynamics, it represents the number of microscopic configurations that a system can assume while corresponding to the same observable macroscopic state. In information theory, entropy measures the average amount of missing information needed to determine the exact outcome of a random variable, given its probability distribution.
I hope this explanation has helped you better understand the concept of entropy and its significance in both thermodynamics and information theory.