Heat flows from hot objects to cold ones, never spontaneously in reverse. Gases expand to fill their containers but never spontaneously contract. Broken eggs don't reassemble themselves. These everyday observations all reflect a single deep principle: the Second Law of Thermodynamics, which states that entropy—a measure of disorder—never decreases in an isolated system. Entropy gives time its direction, sets limits on engines, and connects to information theory in unexpected ways.
What Is Entropy?
Entropy S measures the number of microscopic configurations (microstates) consistent with a system's observable macrostate. Boltzmann's formula makes this precise: S = k_B ln(Ω), where k_B is Boltzmann's constant and Ω is the number of microstates. A hot coffee cup and cool room air have many more combined microstates when their temperatures are equalized than when one is hot and the other cold, so equalization is overwhelmingly likely. The 'disorder' interpretation is intuitive but imprecise; the rigorous meaning is always 'number of accessible microstates.'
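To make the microstate counting concrete, here is a minimal Python sketch. It uses a toy model of N two-state 'coins' (an assumption standing in for the coffee-cup example, not a model of it): the evenly split macrostate is compatible with vastly more microstates than a lopsided one, so its Boltzmann entropy is higher.

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K


def boltzmann_entropy(omega):
    """S = k_B * ln(Omega) for a macrostate with omega microstates."""
    return K_B * log(omega)


# Toy model: N independent two-state particles ("coins"). The macrostate is
# the number n of heads; the number of microstates is the binomial C(N, n).
N = 100
for n in (0, 10, 50):
    omega = comb(N, n)
    print(f"n = {n:3d}: Omega = {omega:.3e}, S = {boltzmann_entropy(omega):.3e} J/K")
```

The n = 50 macrostate has about 10^29 microstates versus exactly one for n = 0, which is why the balanced outcome is the one we observe.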
The Second Law
The Second Law states: in any spontaneous process in an isolated system, total entropy cannot decrease. For reversible processes, ΔS = 0; for irreversible processes, ΔS > 0. This gives time a direction—physical laws at the microscopic level are time-reversible, but the overwhelming improbability of entropy decreasing macroscopically creates an arrow of time. The Second Law is statistical rather than absolute: a gas could spontaneously contract into a corner, but the probability is so absurdly small (∼e^{-10^{23}}) that it never occurs in practice.
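The statistical character of the law is easy to see with a back-of-the-envelope calculation. The sketch below uses a simplified assumption that each molecule is independently on the left or right side of the container, and estimates the probability that all N molecules spontaneously gather in one half:

```python
from math import log10


def log10_prob_all_in_one_half(n_molecules):
    """log10 of the probability that all n molecules occupy the same half of
    the container at the same instant, assuming independent positions: (1/2)^n."""
    return n_molecules * log10(0.5)


for n in (10, 1_000, 6.022e23):  # last value: roughly one mole of gas
    print(f"N = {n:.3g}: P ~ 10^({log10_prob_all_in_one_half(n):.3g})")
```

For ten molecules the fluctuation is merely rare; for a mole of gas the exponent is about −1.8 × 10^23, consistent with the estimate above.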
Heat Engines and Efficiency
The Second Law limits how efficiently heat engines convert heat to work. The Carnot efficiency—the theoretical maximum efficiency of any heat engine operating between temperatures T_H (hot) and T_C (cold)—is η = 1 − T_C/T_H. A steam turbine with steam at 500°C and cooling water at 20°C can convert at most 1 − 293/773 ≈ 62% of heat to work; the rest must be expelled as waste heat. No real engine achieves Carnot efficiency due to irreversibilities, but it provides the absolute upper bound for any heat engine operating between those temperatures.
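The bound itself is a one-line calculation. A small sketch (temperatures entered in degrees Celsius and converted to kelvin) reproduces the steam-turbine figure:

```python
def carnot_efficiency(t_hot_celsius, t_cold_celsius):
    """Carnot limit eta = 1 - T_C/T_H, with reservoir temperatures given in
    Celsius and converted to kelvin before taking the ratio."""
    t_hot = t_hot_celsius + 273.15
    t_cold = t_cold_celsius + 273.15
    return 1.0 - t_cold / t_hot


# Steam at 500 C, cooling water at 20 C, as in the example above.
print(f"Carnot limit: {carnot_efficiency(500, 20):.1%}")  # ~62%
```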
Entropy and Information
In 1948, Claude Shannon defined information entropy by analogy with thermodynamic entropy: H = −Σ p_i log p_i, where the p_i are the probabilities of the possible messages; H is measured in bits when the logarithm is base 2. This quantity measures information content: a message from a highly predictable source carries little information, while a message from an unpredictable source carries a great deal. The connection is deep. Maxwell's Demon, a hypothetical creature that sorts molecules by speed to decrease entropy, was shown to require erasing information, which generates entropy. Landauer's principle formalizes this: erasing one bit of information dissipates at least k_B T ln(2) of heat.
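Both quantities are short calculations. The sketch below computes the Shannon entropy of two simple sources and the Landauer bound at an assumed room temperature of 300 K:

```python
from math import log, log2

K_B = 1.380649e-23  # Boltzmann constant, J/K


def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits; p = 0 terms contribute 0."""
    return -sum(p * log2(p) for p in probs if p > 0)


# A fair coin is maximally unpredictable (1 bit per toss);
# a heavily biased source carries far less information per symbol.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # ~0.08

# Landauer's principle: erasing one bit at temperature T dissipates at least k_B * T * ln(2).
T = 300.0  # assumed room temperature, kelvin
print(f"Landauer limit at {T:.0f} K: {K_B * T * log(2):.2e} J")  # ~2.87e-21 J
```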
Arrow of Time
The Second Law is responsible for our perception that time has a direction. Fundamental physics—classical mechanics, quantum mechanics, electromagnetism—is essentially time-reversible. Yet we remember the past and not the future; processes happen in one temporal direction. The resolution is statistical: the universe started in an extraordinarily low-entropy initial state (the Big Bang), and entropy has been increasing ever since. The psychological and thermodynamic arrows of time are aligned because memory formation requires entropy increase—a remarkable connection between the physics of the early universe and the experience of consciousness.
Conclusion
Entropy and the Second Law encode one of physics' deepest truths: irreversibility emerges from reversible microscopic laws through statistics. The enormous number of particles in macroscopic systems makes entropy decrease so improbable as to be effectively impossible. This single principle limits engine efficiency, explains why heat flows downhill, gives time its direction, and connects thermal physics to information theory. Few concepts span as many domains—from engineering thermodynamics to cosmology to computation—as elegantly as entropy.