“I never understood entropy” – anonymous engineer
Part IV of my book delves into the discovery and science of entropy. In this post, I explore what we know about the science of entropy and what remains open to question, acknowledging my own unresolved questions. I’m open to learning, so if you have insights, references, or know someone who does, please share.
I – Classical Thermodynamics
- Rudolf Clausius discovered entropy (S) in 1865 after using James Joule’s experimental demonstration of work-heat equivalence to correct Sadi Carnot’s 1824 theoretical analysis of the steam engine, which was based on the false caloric theory of heat.
- Clausius defined entropy through two properties:
- dS = δQ/T for reversible thermal energy exchange (Q)
- dS = 0 for a reversible adiabatic volume change.
- While the first bullet has been explained to a large extent by Ludwig Boltzmann (see II below), the second has not. It amazes me that the entropy decrease caused by the temperature drop during a reversible adiabatic expansion (which produces work, W) is exactly canceled by the entropy increase caused by the larger volume; a numerical sketch at the end of this section illustrates the cancellation for an ideal gas. I have yet to see an explanation for why this is so.
- Declaring entropy to be a newly discovered property of state, Clausius transformed the 1st Law of Thermodynamics from dU = δQ – δW to dU = TdS – PdV (for fluids). Clausius defined U, later named “energy” by William Thomson, as another new property of the system.
- Entropy (S) is a property of state, just as are, for example, energy (U), enthalpy (H), temperature (T), and volume (V). J. Willard Gibbs based some of his earlier groundbreaking work on this fact by stating:
- U = f (S,V).
- The 3rd Law of Thermodynamics (S = 0 at T = 0 for a pure, perfect crystalline substance) enabled absolute quantification of entropy via integration of the thermal energy (Q) added reversibly to the substance, divided by temperature, to raise its temperature from absolute zero to the reference temperature.
- S (absolute) = ∫ δQ/T = ∫ (Cp/T) dT wherein Cp = heat capacity at constant pressure
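To make the adiabatic cancellation noted above concrete, here is a minimal numerical sketch, assuming one mole of a monatomic ideal gas with illustrative initial conditions (the specific numbers are arbitrary). Along a reversible adiabat, the thermal contribution Cv ln(T2/T1) exactly offsets the volume contribution R ln(V2/V1), so the total entropy change is zero.

```python
# Minimal sketch, assuming a monatomic ideal gas; the initial state is illustrative.
# Uses the ideal-gas entropy change  dS = Cv ln(T2/T1) + R ln(V2/V1)
# together with the reversible-adiabat relation  T * V**(gamma - 1) = constant.

import math

R = 8.314            # J/(mol K), gas constant
Cv = 1.5 * R         # monatomic ideal gas, constant-volume heat capacity
gamma = (Cv + R) / Cv

T1, V1 = 300.0, 0.0249    # K, m^3 (roughly 1 mol at 1 bar); illustrative values
V2 = 10.0 * V1            # a ten-fold reversible adiabatic expansion
T2 = T1 * (V1 / V2) ** (gamma - 1.0)

dS_thermal = Cv * math.log(T2 / T1)   # entropy decrease from the temperature drop
dS_volume = R * math.log(V2 / V1)     # entropy increase from the larger volume

print(f"T2 = {T2:.1f} K")
print(f"dS thermal = {dS_thermal:+.4f} J/K")
print(f"dS volume  = {dS_volume:+.4f} J/K")
print(f"sum        = {dS_thermal + dS_volume:+.2e} J/K (zero to rounding)")
```

The cancellation is not an accident of the chosen numbers: the adiabat condition T V^(gamma - 1) = constant is precisely the statement that the two logarithmic terms are equal and opposite. What remains to be explained physically is why nature arranges the cancellation this way.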
II – Statistical Thermodynamics
- The rise of statistical mechanics (based on probability theory) through the work of Clausius, James Clerk Maxwell, and Ludwig Boltzmann brought physical meaning to entropy.
- The entropy of a system of particles corresponds to the most probable distribution of those particles based on momentum and location. More specifically,
- S = kB log (W) wherein kB = Boltzmann’s constant and W = the number of ways (the multiplicity, not to be confused with work) that the particles can be arranged (momentum, location) to yield the macro-properties of the system, e.g., U, V, etc.
- With these findings, Boltzmann provided a mathematical proof to support the first physical interpretation of the equation: dS = δQ/T
- T defines the number of accessible energy states in a given system
- δQ defines the increment in the number of accessible energy states
- Hugo Tetrode and Otto Sackur independently developed an exact solution of Boltzmann’s statistical-mechanics definition of entropy for a monatomic ideal gas. Their equation, now known as the Sackur-Tetrode equation, was validated by comparison against experimentally quantified entropies (S = ∫ δQ/T).
- This is another source of fascination for me. By definition, the equation S = ∫ δQ/T depends on the historical path of heating a substance from absolute zero to the reference temperature, while the Sackur-Tetrode equation depends solely on the end state itself, with no regard to the history of getting there (a numerical comparison for argon follows this list). How can this be?
- Given the above discussion, I would like to see a physical explanation of why exactly entropy varies as it does across the full periodic table. I have not yet seen this done in the detail I desire.
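As a concrete check of this path-versus-state puzzle, below is a minimal sketch, assuming argon behaves as a monatomic ideal gas at 298.15 K and 1 atm. It evaluates the Sackur-Tetrode equation, which knows nothing about the heating history, and compares the result with the commonly tabulated third-law (calorimetric) value of roughly 154.8 J/(mol K) obtained from S = ∫ δQ/T.

```python
# Minimal sketch: Sackur-Tetrode entropy of argon at 298.15 K and 1 atm,
#   S / (N k) = ln( V / (N * lambda**3) ) + 5/2,  lambda = h / sqrt(2 pi m k T),
# compared against the tabulated calorimetric value (about 154.8 J/(mol K)).

import math

h = 6.62607015e-34     # J s, Planck constant
k = 1.380649e-23       # J/K, Boltzmann constant
NA = 6.02214076e23     # 1/mol, Avogadro constant
R = NA * k             # J/(mol K), gas constant

T = 298.15                    # K
P = 101325.0                  # Pa (1 atm)
m = 39.948 * 1.66054e-27      # kg, mass of one argon atom

lam = h / math.sqrt(2.0 * math.pi * m * k * T)   # thermal de Broglie wavelength
v = k * T / P                                    # volume per particle, ideal gas

S_molar = R * (math.log(v / lam**3) + 2.5)       # J/(mol K)

print(f"Sackur-Tetrode: S(Ar, 298 K, 1 atm) = {S_molar:.1f} J/(mol K)")
print("Calorimetric (third-law) value       = about 154.8 J/(mol K)")
```

The two numbers agree to within a fraction of a percent, which is exactly the puzzle: a formula written entirely in terms of the end state reproduces a value measured by integrating heat along a path from absolute zero.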
III – Gibbs’ Use of Entropy
- While entropy finds use in many calculations, e.g., exergy analysis (lost work) and isentropic turbine expansion (∆S = 0), its use in the Gibbs equation for free energy is the one that most attracts my attention (click here), as I simply do not understand the implications that result. Per Gibbs:
- Maximum work = -∆Grxn = -(∆Hrxn – T∆Srxn) (constant T,P)
- I hypothesize that T∆Srxn corresponds to the change in what I call the “structural energy” of a system. I define structural energy as the energy required (based on ∫ δQ/T) to establish the momentum and location of each particle in the system. During a chemical reaction, the system’s structure changes, which results in a temperature effect; to maintain constant T, this translates into a need for heating or cooling.
- In his thermodynamic analysis of the electrochemical cell (click here), J. Willard Gibbs suggested that ∆Grxn corresponds to the voltage generated by the cell while T∆Srxn corresponds to the heating/cooling required to keep the cell at constant T (see the numerical sketch after this list). Hermann von Helmholtz later named ∆Grxn “free energy” (click here) and T∆Srxn “bound energy.”
- I am currently working to test my hypothesis that connects T∆Srxn to the change in structural energy; I have not seen this hypothesized before.
- My “structural energy” hypothesis in (III.1) now challenges me to similarly explain the physical underpinning of the Gibbs-Helmholtz equation:
- d(∆Grxn)/dT = -∆Srxn (constant P)
- While this equation shows a direct relationship between ∆Grxn and ∆Srxn, I don’t understand how this can be, especially since the orbital electron energies that I hypothesize are responsible for ∆Grxn (which corresponds to voltage in an electrochemical cell) are not directly involved in determining the entropies of the atoms and molecules responsible for ∆Srxn.
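To put numbers on both the free/bound split and the Gibbs-Helmholtz relation, here is a minimal sketch using commonly tabulated standard values for the hydrogen-oxygen cell reaction H2 + 1/2 O2 → H2O(l) at 298.15 K; the inputs are standard handbook values, not results from my own analysis. ∆Grxn sets the reversible cell voltage, T∆Srxn sets the reversible heat exchanged to hold T constant, and d(∆Grxn)/dT = -∆Srxn shows up as the temperature coefficient of that voltage.

```python
# Minimal sketch for the hydrogen-oxygen cell, H2 + 1/2 O2 -> H2O(l), at 298.15 K.
# Inputs are standard tabulated values for the reaction enthalpy and Gibbs energy.

F = 96485.0      # C/mol, Faraday constant
n = 2            # electrons transferred per mole of H2

dH = -285.8e3    # J/mol, standard reaction enthalpy (liquid water product)
dG = -237.1e3    # J/mol, standard reaction Gibbs energy
T = 298.15       # K

TdS = dH - dG            # "bound energy": heat exchanged reversibly at constant T
dS = TdS / T             # standard reaction entropy, about -163 J/(mol K)

E_rev = -dG / (n * F)    # reversible cell voltage, set by the "free energy"
dE_dT = dS / (n * F)     # Gibbs-Helmholtz: d(dG)/dT = -dS, expressed as dE/dT

print(f"T*dS (bound energy)     = {TdS / 1000:+.1f} kJ/mol")
print(f"Reversible cell voltage = {E_rev:.3f} V")
print(f"dE/dT                   = {dE_dT * 1000:+.3f} mV/K")
```

The sketch gives about 1.23 V, roughly 48.7 kJ/mol of heat released at constant T, and a temperature coefficient near -0.85 mV/K. This is the operational sense in which ∆Grxn (voltage) and ∆Srxn (heating/cooling) are tied together, even though, as noted above, they seem to arise from different physical origins.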
So there we have it. My (our?) unanswered questions. As I have yet to find the answers I’m seeking in a textbook or journal article, I am in search of a physical chemist who can help me. If you know of such a person, especially someone who could answer the questions I pose in III, please let me know.
Thank you for reading my post. I go into much greater detail about these concepts in my book Block by Block – The Historical and Theoretical Foundations of Thermodynamics.