
6.6 The Gibbs Paradox

Huang, Statistical Mechanics, 2nd ed., Section 6.6

The entropy formula we derived is broken

I need to tell you something uncomfortable. That entropy formula we carefully derived in section 6.2? The one we used to build temperature, pressure, and the whole thermodynamic framework?

It's wrong.

Not subtly wrong. Not "off by a small correction" wrong. It predicts that entropy can grow without bound just by imagining partitions inside a gas. It says entropy depends on the history of how you prepared the system, not on its current state. That violates everything thermodynamics stands for.

The fix turns out to be a single factor of \(N!\) in the denominator. But why that fix works will take us straight to one of the deepest ideas in physics: identical particles are not just similar. They are fundamentally, quantum-mechanically indistinguishable.

Why should you care?

If you run MD simulations, you deal with this every day without thinking about it.

Open any LAMMPS data file. Your atoms have IDs: atom 1, atom 2, atom 3, all the way up to atom 108 (or 10,000, or a million). Those IDs feel real. They feel like the atoms are individuals. But here's the thing: if you take two argon atoms and swap their positions and velocities, have you created a new microstate?

LAMMPS says yes (different atom IDs in different places). Physics says no (argon atom is argon atom). The Gibbs paradox is about getting this distinction right. And if you get it wrong, your entropy is garbage.

The bad intuition: "each atom is a unique snowflake"

In everyday life, objects are distinguishable. You can paint one ball red and another blue. You can scratch one with a knife. Even two "identical" tennis balls have different scuff marks.

Atoms aren't like that. Two argon atoms are identical in a way that no macroscopic objects ever are. Not "really similar." Not "identical for practical purposes." Fundamentally, in-principle, even-God-can't-tell-them-apart identical. Quantum mechanics demands this.

But in classical mechanics, we don't have that concept. We track each particle with its own coordinates \((q_i, p_i)\), and swapping two particles gives a different point in phase space. Classical mechanics treats atoms like they have name tags.

That's where the trouble starts.

The paradox: mixing identical gases creates entropy (it shouldn't)

Let me show you the problem. Start from the entropy of an ideal gas from section 6.5 (we skipped the derivation there, but here's the result):

\[S = Nk \log(V u^{3/2}) + Ns_0\]

where \(u = \frac{3}{2}kT\) is the energy per particle and \(s_0\) is a constant.

Now here's the thought experiment. You have two boxes of gas side by side, separated by a partition:

  • Box 1: \(N_1\) particles in volume \(V_1\)
  • Box 2: \(N_2\) particles in volume \(V_2\)
  • Same temperature, same density

Remove the partition. The gases mix into a combined volume \(V = V_1 + V_2\). What's the entropy change?

The temperature doesn't change (same \(T\) on both sides). So \(u\) stays the same. The only thing that changes is that each group of particles now has access to the full volume \(V\) instead of its original \(V_i\). Ready? Let's do this.

\[\Delta S = S_{\text{after}} - S_{\text{before}}\]
\[= \left[N_1 k \log(V u^{3/2}) + N_2 k \log(V u^{3/2})\right] - \left[N_1 k \log(V_1 u^{3/2}) + N_2 k \log(V_2 u^{3/2})\right]\]

The \(u^{3/2}\) and \(s_0\) terms cancel. What's left:

\[\Delta S = N_1 k \log \frac{V}{V_1} + N_2 k \log \frac{V}{V_2}\]

Since \(V > V_1\) and \(V > V_2\), both logarithms are positive. So \(\Delta S > 0\).
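A quick numeric check of that formula. The particle numbers and volumes below are illustrative choices, not anything from the text; for two equal boxes, each gas doubles its volume and the formula should give \(\Delta S = 2Nk\log 2\):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def delta_S_mix(N1, V1, N2, V2):
    """Uncorrected entropy change when two gases at the same T and
    density mix into the combined volume V = V1 + V2."""
    V = V1 + V2
    return N1 * k * math.log(V / V1) + N2 * k * math.log(V / V2)

# Two equal boxes, one mole each (illustrative numbers):
N = 6.022e23
dS = delta_S_mix(N, 1.0e-3, N, 1.0e-3)
print(dS)  # 2 N k log 2, about 11.5 J/K
```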

If the two gases are different (say argon and neon), this is correct. You've done something irreversible. The gases are mixed now, and you can't unmix them without doing work. Entropy should increase. Beautiful.

But what if both gases are argon? Same species, same temperature, same density. You pull out the partition and... nothing happens. The gas on the left can't tell that it's now "mixed" with the gas on the right. They're the same gas. Nothing physically changed.

Yet our formula says \(\Delta S > 0\). Entropy increased from doing nothing.

That's bad. But it gets worse.

You could have imagined any number of partitions in the original gas. Two partitions? Three? A hundred? Each time you "remove" one, the formula says entropy increases. Stack up enough imaginary partitions and the computed entropy change exceeds any bound you name. Effectively infinite.

That's not a small error. That's a catastrophe. The entropy isn't even a well-defined function of state anymore. It depends on how many imaginary partitions you dreamed up. The whole foundation of thermodynamics is crumbling.

Key Insight

The real problem isn't the mixing. It's that our entropy formula isn't extensive. A proper entropy should scale with the size of the system: double the system, double the entropy. Our formula has \(S \propto N \log V\) instead of \(S \propto N \log(V/N)\). That \(\log V\) vs \(\log(V/N)\) is exactly where the paradox hides.
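The failure of extensivity is easy to see numerically. A minimal sketch (the constants \(u\), \(N\), \(V\) are arbitrary illustrative values, and \(s_0\) is set to zero since it drops out of the comparison): doubling both \(N\) and \(V\) should double \(S\), but the uncorrected formula leaves an extra \(2Nk\log 2\) behind.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def S_uncorrected(N, V, u, s0=0.0):
    # S = N k log(V u^{3/2}) + N s0, with s0 set to zero for simplicity
    return N * k * math.log(V * u ** 1.5) + N * s0

u = 1.0e-21            # energy per particle, J (arbitrary illustrative value)
N, V = 1.0e23, 1.0e-3

# If S were extensive, this difference would be zero.
extra = S_uncorrected(2 * N, 2 * V, u) - 2 * S_uncorrected(N, V, u)
print(extra)  # = 2 N k log 2, not 0: doubling the system does NOT double S
```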

The fix: divide by \(N!\)

Gibbs figured out the fix, even though he couldn't fully explain why it works (that had to wait for quantum mechanics).

The idea: when we counted microstates, we overcounted. If you have \(N\) identical particles, then any permutation of those particles gives the same physical state. There are \(N!\) ways to permute \(N\) particles. So we've been overcounting by a factor of \(N!\).

The fix is simple. Divide \(\Sigma(E)\) by \(N!\). Since \(S = k \log \Sigma(E)\), this subtracts \(k \log N!\) from the entropy. Using Stirling's approximation (\(\log N! \approx N \log N - N\)):

\[S_{\text{corrected}} = S_{\text{old}} - k(N \log N - N)\]
\[= Nk \log(V u^{3/2}) + Ns_0 - Nk \log N + Nk\]
\[= Nk \log\!\left(\frac{V}{N} u^{3/2}\right) + Ns_0'\]

Done. Look at what changed. Instead of \(\log V\), we now have \(\log(V/N)\). The entropy depends on the specific volume \(V/N\), not the total volume \(V\).

This is called the Sackur-Tetrode equation (with specific constants filled in), and it matches experiment perfectly.
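Stirling's approximation did the heavy lifting in that algebra, so it's worth checking how good it is. A quick sketch in Python, using `math.lgamma` to compute \(\log N!\) without overflow; the relative error shrinks as \(N\) grows, and for the \(N \sim 10^{23}\) of a real gas it is utterly negligible:

```python
import math

def stirling(N):
    # log N! ~ N log N - N
    return N * math.log(N) - N

for N in (10, 100, 10**4, 10**6):
    exact = math.lgamma(N + 1)  # log(N!) computed without overflow
    rel_err = (exact - stirling(N)) / exact
    print(N, rel_err)  # relative error shrinks as N grows
```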

Now let's check: does the paradox go away?

For mixing identical gases at the same density, the specific volume \(V/N\) is the same before and after mixing. So \(\Delta S = 0\). No entropy change from removing a partition between identical gases. Exactly right.

For mixing different gases, the \(N_1\) and \(N_2\) are constants that don't change during mixing, so the \(N \log N\) correction cancels out of \(\Delta S\). You still get \(\Delta S = N_1 k \log(V/V_1) + N_2 k \log(V/V_2) > 0\). The real entropy of mixing is preserved.

The fix kills the paradox without breaking anything else. That's slick.
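Both cases can be verified numerically with the corrected formula. A minimal sketch (arbitrary illustrative constants; the additive constant \(s_0'\) is dropped since it cancels in every difference):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def S_corr(N, V, u):
    # Corrected entropy: S = N k log((V/N) u^{3/2}), additive constant dropped
    return N * k * math.log((V / N) * u ** 1.5)

u = 1.0e-21                # energy per particle, J (illustrative)
N1, V1 = 1.0e23, 1.0e-3
N2, V2 = 2.0e23, 2.0e-3    # same density as box 1
V = V1 + V2

# Identical gases: after mixing there is ONE gas of N1+N2 particles in V
dS_same = S_corr(N1 + N2, V, u) - (S_corr(N1, V1, u) + S_corr(N2, V2, u))

# Different gases: each species keeps its own N but now occupies all of V
dS_diff = (S_corr(N1, V, u) + S_corr(N2, V, u)) \
          - (S_corr(N1, V1, u) + S_corr(N2, V2, u))

print(dS_same)  # ~0: no entropy change for identical gases
print(dS_diff)  # > 0: the genuine entropy of mixing survives
```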

But why divide by \(N!\)? (This is the deep part)

Gibbs knew the fix worked, but he couldn't explain why from within classical mechanics. And honestly, there is no classical explanation. Here's why.

In quantum mechanics, a system of \(N\) identical particles is described by a wave function that's either symmetric or antisymmetric under particle exchange. Swap two particles, and the wave function either stays the same (bosons) or flips sign (fermions). Either way, the physical state is unchanged. A permutation does not produce a new state. Period.

So when we counted microstates classically, treating every permutation as a distinct state, we were counting each physical state \(N!\) times. The fix is to divide by \(N!\) to undo the overcounting. This is called correct Boltzmann counting, and it's something we have to bolt onto classical mechanics from the outside. Classical mechanics by itself has no concept of identical particles.

I know that sounds like a hack. But it's actually a preview of something deep. We'll see in chapter 9 that when you take the high-temperature limit of quantum statistical mechanics, you recover classical statistical mechanics with the \(N!\) correction built in automatically. The fix isn't a hack. It's classical mechanics borrowing an answer from quantum mechanics that it can't derive on its own.

Common Mistake

Don't think of the \(N!\) as "correcting for double-counting due to identical particles" in a vague hand-wavy sense. It's specifically: \(N\) identical quantum particles have wave functions that are symmetric or antisymmetric under exchange, so permutations don't produce new states. There is no classical derivation of this fact. You either accept it as a postulate or derive it from quantum mechanics.

What this means for your simulation

Alright, let's bring this back to earth.

Atom IDs are bookkeeping, not physics

In LAMMPS, every atom gets an integer ID. Atom 1, atom 2, atom 42, atom 107. When you dump a trajectory, each atom carries its ID through time. That's useful for tracking diffusion, computing velocity autocorrelations, and debugging.

But those IDs are not physical. If you magically swapped the positions and velocities of atom 42 and atom 87 (both argon), the system is in exactly the same physical microstate. The forces are the same. The energy is the same. Every measurable property is the same. LAMMPS would report different per-ID trajectories going forward (the labels ride along with the swap), but the physical evolution, and the thermodynamics, wouldn't change at all.

The Gibbs paradox is the formal version of this observation. Atom IDs are for your convenience. The physics doesn't know about them.
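This is easy to verify outside of LAMMPS. Below is a toy sketch (a hypothetical minimal Lennard-Jones routine in plain Python, not anything from LAMMPS): the total energy is a function of the set of positions only, so exchanging two identical atoms, i.e. relabeling, cannot change it.

```python
import math

def lj_energy(positions, eps=1.0, sigma=1.0):
    """Total Lennard-Jones energy of a configuration (no cutoff, no PBC).
    A toy stand-in for what any MD code computes from positions alone."""
    E = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            r2 = sum((a - b) ** 2 for a, b in zip(positions[i], positions[j]))
            sr6 = (sigma * sigma / r2) ** 3
            E += 4.0 * eps * (sr6 * sr6 - sr6)
    return E

# A small lattice configuration (spacing chosen to avoid overlaps)
pos = [(1.2 * i, 1.2 * j, 1.2 * m)
       for i in range(3) for j in range(3) for m in range(2)]
E1 = lj_energy(pos)

# "Swap atom 4 and atom 8": pure relabeling, same physical microstate
pos[4], pos[8] = pos[8], pos[4]
E2 = lj_energy(pos)
print(E1, E2)  # identical up to floating-point roundoff
```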

When it actually matters

For most MD simulations, you never need to worry about the \(N!\) factor explicitly. It cancels out of every thermodynamic derivative (pressure, temperature, heat capacity). It shows up only in the absolute entropy, and LAMMPS doesn't compute absolute entropy anyway.

Where it does matter is when you compare entropies of systems with different \(N\) (chemical potentials, free energies of mixing, alchemical transformations). If you're computing the chemical potential \(\mu = (\partial G / \partial N)_{T,P}\), the \(N!\) factor contributes a term \(kT \log N\) to \(\mu\). Leave it out and your chemical potential depends on the total volume \(V\) instead of the density \(N/V\), and comparisons between systems of different size are garbage.

MD Connection

If you've ever used thermodynamic integration or free energy perturbation to compute chemical potentials, the ideal gas reference state includes the \(N!\) correction. That's why the ideal gas chemical potential has a \(\log(V/N)\) term instead of \(\log V\). The Gibbs paradox isn't just a philosophical curiosity. It lives inside your free energy calculations.
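A minimal sketch of where that correction term comes from (illustrative temperature and particle number, nothing from a real free energy code): dividing the microstate count by \(N!\) adds \(kT \log N! \approx kT(N\log N - N)\) to the free energy, and differentiating that with respect to \(N\) gives a \(kT\log N\) shift in \(\mu\).

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # K (illustrative)

def F_shift(N):
    # Free-energy shift from dividing the microstate count by N!:
    # F -> F + kT log N!, with Stirling: ~ kT (N log N - N)
    return k * T * (N * math.log(N) - N)

# Chemical potential shift = dF/dN; finite difference vs. analytic kT log N
N = 1.0e6
dmu = F_shift(N + 1) - F_shift(N)
print(dmu, k * T * math.log(N))  # the two agree: the shift is kT log N
```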

Takeaway

The naive microstate count overcounts by \(N!\) because it treats identical particles as distinguishable. Dividing by \(N!\) fixes the entropy, makes it extensive, and kills the Gibbs paradox. Classical mechanics can't explain why this works. Quantum mechanics can: swapping identical particles doesn't create a new state. Your atom IDs in LAMMPS are labels for your convenience, not a feature of the physics.

Check Your Understanding
  1. The corrected entropy has \(S \propto N\log(V/N)\). Quick sanity check: double \(N\) and \(V\) (same density). Does \(S\) double? Try it.
  2. You sneak into LAMMPS and swap the positions and velocities of atom 42 and atom 87 (both argon). Does the energy change? Does the pressure change? Does anything in thermo output change? What about the dump file?
  3. The \(N!\) fix doesn't touch \(P\) or \(T\), but it does change the chemical potential \(\mu\). Why does \(\mu\) care about the correction when \(P\) and \(T\) don't?
  4. Gibbs proposed dividing by \(N!\) decades before quantum mechanics. He had no wave functions, no Pauli exclusion, nothing. Could he have justified the fix, or did he just know it worked?