# Boltzmann's entropy formula

In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy ${\displaystyle S}$, also written as ${\displaystyle S_{\mathrm {B} }}$, of an ideal gas to the quantity ${\displaystyle W}$, the number of real microstates corresponding to the gas's macrostate:

${\displaystyle S=k_{\mathrm {B} }\ln W}$

(1)

Boltzmann's equation—carved on his gravestone.[1]

where ${\displaystyle k_{\mathrm {B} }}$ is the Boltzmann constant (also written simply as ${\displaystyle k}$), equal to 1.380649 × 10⁻²³ J/K.

In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or molecules of a certain kind of thermodynamic system can be arranged.

## History

Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.

The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but later put into its current form by Max Planck in about 1900.[2][3] To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".

A 'microstate' is a state specified in terms of the constituent particles of a body of matter or radiation, whereas a macrostate is specified in terms of such variables as internal energy and pressure. A macrostate is experimentally observable, with at least a finite extent in spacetime. A microstate can be instantaneous, or it can be a trajectory composed of a temporal progression of instantaneous microstates; in experimental practice, such trajectories are scarcely observable. The present account concerns instantaneous microstates.

The value of W was originally intended to be proportional to the Wahrscheinlichkeit (the German word for probability) of a macroscopic state for some probability distribution of possible microstates—the collection of (unobservable microscopic single particle) "ways" in which the (observable macroscopic) thermodynamic state of a system can be realized by assigning different positions and momenta to the respective molecules.

There are many instantaneous microstates that apply to a given macrostate. Boltzmann considered collections of such microstates. For a given macrostate, he called the collection of all possible instantaneous microstates of a certain kind by the name monode, for which Gibbs' term ensemble is used nowadays. For single particle instantaneous microstates, Boltzmann called the collection an ergode. Subsequently, Gibbs called it a microcanonical ensemble, and this name is widely used today, perhaps partly because Bohr was more interested in the writings of Gibbs than of Boltzmann.[4]

Interpreted in this way, Boltzmann's formula is the most basic formula for the thermodynamic entropy. Boltzmann's paradigm was an ideal gas of N identical particles, of which Ni are in the i-th microscopic condition (range) of position and momentum. For this case, the probability of each microstate of the system is equal, so it was equivalent for Boltzmann to calculate the number of microstates associated with a macrostate. W was historically misinterpreted as literally meaning the number of microstates, and that is what it usually means today. W can be counted using the formula for permutations

${\displaystyle W={\frac {N!}{\prod _{i}N_{i}!}}}$

(2)

where i ranges over all possible molecular conditions and "!" denotes factorial. The "correction" in the denominator is due to the fact that identical particles in the same condition are indistinguishable. W is sometimes called the "thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always numbers between zero and one.
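The counting in equation (2) can be checked directly. The sketch below uses hypothetical occupation numbers, not values from the text; `multiplicity` is an illustrative helper name:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(counts):
    """W = N! / prod(N_i!) for occupation numbers N_i, equation (2)."""
    n = sum(counts)
    w = math.factorial(n)
    for c in counts:
        w //= math.factorial(c)  # exact integer division: N! is divisible by each N_i!
    return w

# Toy example: N = 4 particles split evenly over two molecular conditions
W = multiplicity([2, 2])   # 4! / (2! * 2!) = 6 microstates
S = K_B * math.log(W)      # Boltzmann entropy, equation (1)
```

Note that putting all particles into one condition gives `multiplicity([4, 0]) == 1`, the unique (zero-entropy) arrangement, consistent with W being an integer no smaller than one.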

## Generalization

Boltzmann's formula applies to microstates of a system, each possible microstate of which is presumed to be equally probable.

But in thermodynamics, the universe is divided into a system of interest, plus its surroundings; then the entropy of Boltzmann's microscopically specified system can be identified with the system entropy in classical thermodynamics. The microstates of such a thermodynamic system are not equally probable—for example, high energy microstates are less probable than low energy microstates for a thermodynamic system kept at a fixed temperature by allowing contact with a heat bath. For thermodynamic systems where microstates of the system may not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

${\displaystyle S_{\mathrm {G} }=-k_{\mathrm {B} }\sum p_{i}\ln p_{i}}$

(3)

This reduces to equation (1) when the probabilities ${\displaystyle p_{i}}$ are all equal: setting ${\displaystyle p_{i}=1/W}$ for each of the ${\displaystyle W}$ microstates gives ${\displaystyle S_{\mathrm {G} }=-k_{\mathrm {B} }\sum _{i=1}^{W}{\frac {1}{W}}\ln {\frac {1}{W}}=k_{\mathrm {B} }\ln W}$.

Boltzmann used a ${\displaystyle \rho \ln \rho }$ formula as early as 1866.[5] He interpreted ρ as a density in phase space—without mentioning probability—but since this satisfies the axiomatic definition of a probability measure we can retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878.

Boltzmann himself used an expression equivalent to (3) in his later work[6] and recognized it as more general than equation (1). That is, equation (1) is a corollary of equation (3)—and not vice versa. In every situation where equation (1) is valid, equation (3) is valid also—and not vice versa.

## Boltzmann entropy excludes statistical dependencies

The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle—i.e., assuming each particle has an identical independent probability distribution, and ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles that move independently apart from instantaneous collisions, and is an approximation, possibly a poor one, for other systems.[7]

The Boltzmann entropy is obtained if one assumes one can treat all the component particles of a thermodynamic system as statistically independent. The probability distribution of the system as a whole then factorises into the product of N separate identical terms, one term for each particle; and when the summation is taken over each possible state in the 6-dimensional phase space of a single particle (rather than the 6N-dimensional phase space of the system as a whole), the Gibbs entropy

${\displaystyle S_{\mathrm {G} }=-Nk_{\mathrm {B} }\sum _{i}p_{i}\ln p_{i}}$

(4)

simplifies to the Boltzmann entropy ${\displaystyle S_{\mathrm {B} }}$.
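The factorization behind equation (4) can be illustrated for a small system. The sketch below uses a hypothetical three-state single-particle distribution (the numbers are not from the text) and checks that the Gibbs entropy of N statistically independent, identical particles equals N times the single-particle sum:

```python
import math
from itertools import product

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S_G = -k_B * sum_i p_i ln p_i, equation (3)."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical single-particle distribution over 3 states
p_single = [0.5, 0.3, 0.2]
N = 4

# With statistical independence, the joint N-particle distribution
# factorizes into a product of identical single-particle terms
p_joint = [math.prod(combo) for combo in product(p_single, repeat=N)]

# Entropy of the joint distribution equals N times the single-particle entropy,
# i.e. equation (4): S = -N k_B sum_i p_i ln p_i
assert math.isclose(gibbs_entropy(p_joint), N * gibbs_entropy(p_single))
```

This is exactly the assumption that fails once interactions correlate the particles: the joint distribution then no longer factorizes, and the true Gibbs entropy falls below the Boltzmann value.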

This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872. For the special case of an ideal gas it exactly corresponds to the proper thermodynamic entropy.

For anything but the most dilute of real gases, ${\displaystyle S_{\mathrm {B} }}$ leads to increasingly wrong predictions of entropies and physical behaviours, by ignoring the interactions and correlations between different molecules. Instead one must consider the ensemble of states of the system as a whole, called by Boltzmann a holode, rather than single particle states.[8] Gibbs considered several such kinds of ensembles; relevant here is the canonical one.[7]