In his Physical Chemistry book (6th Edition, p. 99), Peter Atkins wrote that entropy “is a measure of the molecular disorder of a system.” Other references contain similar statements that connect entropy with disorder. I see things differently.
The appearance of structure in large systems of colliding atoms
Assume a system of a trillion atoms that behave as an ideal gas (no interactions other than collisions). The atoms follow the cause-and-effect world of Newton's laws of motion. Cause and effect operate in the background during each and every collision, beyond our limits of detection and modeling. But while all of these collisions are going on, creating a seemingly muddy pool of kinetic confusion, one that could fairly be characterized as disorder, something quite remarkable happens.
Even if you were small enough to see all of these collisions, you'd still be one step removed from the remarkable. Why? Because the remarkable occurs in the instantaneous location and velocity of every atom, which you can't see. You'd have to calculate these and then graph them as population distributions, one for location and one for velocity. If you did, here's what you'd see: two beautifully symmetric and smooth population distributions. Given enough time for many collisions to occur, the distribution of the atoms becomes uniform with respect to location and Gaussian, forming the famous shape of the bell curve, with respect to velocity. [1]
This fascinating behavior always emerges for such a large number of atoms. It is the basis for the highest-level statement of the 2nd Law of Thermodynamics, and it explains the lower-level manifestations of that law, such as the facts that heat always flows from hot to cold and that perpetual motion is impossible. Given enough time, atoms spread evenly with regard to location and Gaussian with regard to velocity.
So why do these two specific distributions result? Probability. There's no driving force at work other than probability. They arise as a natural consequence of the probability associated with a large number of events, such as collisions between atoms, in Avogadro-sized systems.
In short, nature moves toward its most probable state. If you flip a coin a thousand times, the most probable result is that you'll end up with 500 heads and 500 tails, even though each individual flip is random. There's no driving force involved. This is simply how probability works for large numbers. [2] Is it possible to observe a thousand heads in a thousand flips? Yes. Probable? No. You'll never see it in your lifetime. After many flips, the 50:50 distribution (with some very small error bars) is a near certainty. The same basic probability logic applies to colliding atoms. Are other distributions possible? Yes. Probable? No. The most probable distributions for a system of colliding atoms are uniform in location and Gaussian in velocity. The probability that each occurs is (just shy of) 100%.
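The coin-flip arithmetic is easy to check with a short simulation (a minimal sketch; the flip count matches the example above, while the seed is an arbitrary choice of mine for repeatability):

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

n_flips = 1000
heads = sum(1 for _ in range(n_flips) if random.random() < 0.5)

# The expected count is 500 heads, with a standard deviation of
# sqrt(n * p * (1 - p)) = sqrt(1000 * 0.25) ~ 15.8, so nearly every
# run lands within a few dozen flips of the 50:50 split.
print(heads, n_flips - heads)

# By contrast, a thousand heads in a row has probability
# 0.5**1000 ~ 1e-301: possible in principle, never observed in practice.
```

No driving force appears anywhere in this code; the near-50:50 outcome falls out of counting alone, which is the point of the analogy to colliding atoms.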
I go into much greater technical detail on this topic in my book, Block by Block – The Historical and Theoretical Foundations of Thermodynamics. But for now, my view remains that entropy is a beautiful form of the Gaussian distribution, together with the Maxwell-Boltzmann and Boltzmann distributions that result from it.
[1] Technically the Gaussian distribution describes the population distribution of one component of velocity, such as vx, as opposed to speed, which is √(vx² + vy² + vz²), or energy. Because these properties are all tied together, you'll often see population distributions based on one or the other or both. When I speak of a Gaussian-velocity distribution, I am referring to vx, which has both positive and negative values centered around zero for a non-moving system.
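The distinction in this footnote shows up clearly in a quick numerical sketch: sample each Cartesian velocity component from a Gaussian centered on zero, and the speed built from three such components is no longer Gaussian but Maxwell-Boltzmann distributed (unit variance is my arbitrary choice of scale):

```python
import math
import random

random.seed(0)
n = 100_000

# Each Cartesian component is Gaussian, centered on zero.
vx = [random.gauss(0.0, 1.0) for _ in range(n)]
vy = [random.gauss(0.0, 1.0) for _ in range(n)]
vz = [random.gauss(0.0, 1.0) for _ in range(n)]

# Speed is always positive, so its distribution cannot be Gaussian
# around zero; it follows the Maxwell-Boltzmann shape instead.
speed = [math.sqrt(x*x + y*y + z*z) for x, y, z in zip(vx, vy, vz)]

mean_vx = sum(vx) / n        # ~ 0: positive and negative values cancel
mean_speed = sum(speed) / n  # ~ sqrt(8/pi) ~ 1.60 for unit variance
print(mean_vx, mean_speed)
```

The mean component velocity sits at zero while the mean speed does not, which is why "Gaussian-velocity" here must refer to a single component like vx rather than to speed.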
[2] As another example, have you ever visited a museum with a large-scale machine called a Galton box? Balls cascade down a vertical board containing interleaved rows of pins, bouncing either left or right as they hit each pin, and as a large number of balls collect in the one-ball-wide bins at the bottom, the pile grows into the Normal, or Gaussian, distribution. Such demonstrations are pretty cool to watch unfold and show how many sequential 50:50 decisions result in a Gaussian distribution.
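The Galton box is simple to mimic in code: each ball makes one 50:50 left-or-right decision per row of pins, and the number of rightward bounces picks its bin (a minimal sketch; the row count, ball count, and seed are my own choices):

```python
import random

random.seed(7)

rows = 12        # rows of pins; each row is one 50:50 decision
balls = 10_000

bins = [0] * (rows + 1)
for _ in range(balls):
    # The number of rightward bounces determines the final bin.
    bin_index = sum(1 for _ in range(rows) if random.random() < 0.5)
    bins[bin_index] += 1

# The counts pile up in a bell shape, peaking near the center bin.
print(bins)
```

Mathematically the bins follow a binomial distribution, which for many rows is well approximated by the Gaussian; that approximation is exactly what the museum machine makes visible.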
END



