
Modern Thermodynamics by John Denker

From Mathematics Is A Science

This is an online book.

I have found it quite readable so far.

Introduction

  • "The first law of thermodynamics states that energy obeys a local conservation law.
  • The second law of thermodynamics states that entropy obeys a local law of paraconservation."

"You must appreciate the fact that not every vector field is the gradient of some potential. Many things that non-experts wish were gradients are not gradients. You must get your head around this before proceeding. Study Escher’s “Waterfall” ... until you understand that the water there has no well-defined height."

The terminology is different sometimes... "The term “inexact differential” is sometimes used in this connection, but that term is a misnomer, or at best a highly misleading idiom. We prefer the term ungrady one-form."

I agree that "differential" is out of place. "Inexact" might be an idiom, but there is a meaningful mathematical term behind it. I understand that "exact/inexact" may lead to confusion, but all it takes is explaining where the word comes from: this k-form is (or is not) exactly the exterior derivative of some (k-1)-form.

I don't find the 'grady' term that good. The name suggests that it is about the gradient, which won't work for differential forms of higher degree, where we deal with the exterior derivative rather than the gradient. Why not simply call it a "1-form that isn't exact"?
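A minimal example (my own, not from the book) of a 1-form that is not exact: on the plane, take $$\omega = x\,dy.$$ Then $$d\omega = dx \wedge dy \neq 0,$$ while $d(dF)=0$ for every scalar $F$. So $\omega$ cannot be $dF$ for any potential $F$, and its integral between two points depends on the path, just like the "height" of the water in Escher's "Waterfall".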

Energy

"..the simple phrase “energy conservation” is practically begging to be misunderstood. You can easily have two profound misconceptions in a simple two-word phrase. For example, you may have seen a placard that says “Please Conserve Energy by turning off the lights when you leave...”

Entropy

"One configuration of the card deck corresponds to one microstate... The macrostate is the ensemble of microstates consistent with what we know about the situation... the macrostate is the ensemble, in the sense that specifying the ensemble specifies the macrostate and vice versa. Equivalently, we can say that the macrostate is a probability distribution over microstates."

Sounds like outcome vs event in probability.

"we can define its entropy as: $$S[P]= \sum_i P_i \log(1/P_i),$$ where the sum runs over all possible outcomes and $P_i$ is the probability of the i-th outcome." OK, it is about outcomes after all.
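The formula is easy to compute directly; here is a minimal sketch (my own, not from the book), with the usual convention that outcomes of probability zero contribute nothing:

```python
import math

def entropy(P):
    # S[P] = sum_i P_i * log(1/P_i); terms with P_i = 0 contribute 0.
    return sum(p * math.log(1 / p) for p in P if p > 0)

# A fair coin carries log(2) ≈ 0.693 nats of entropy;
# a certain outcome carries none.
print(entropy([0.5, 0.5]))   # ≈ 0.6931
print(entropy([1.0]))        # 0.0
```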

"..the entropy is additive whenever the probabilities are multiplicative, i.e. whenever the probabilities are independent."
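The additivity claim can be checked numerically (a sketch of mine, using a coin and a die as the two independent subsystems):

```python
import math

def entropy(P):
    # S[P] = sum_i P_i * log(1/P_i), skipping zero-probability outcomes.
    return sum(p * math.log(1 / p) for p in P if p > 0)

P = [0.5, 0.5]          # a coin
Q = [1 / 6] * 6         # a die, independent of the coin
# Independence means the joint probabilities are products:
joint = [p * q for p in P for q in Q]

# Entropy is additive over independent subsystems:
print(abs(entropy(joint) - (entropy(P) + entropy(Q))) < 1e-12)  # True
```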


Basic Concepts (Zeroth Law)

  • disjoint regions - partition of the world
  • thermal equilibrium
  • temperature
  • equilibrium => same temperature

Low-Temperature Entropy (Alleged Third Law)

"the entropy of a system must go to zero as the temperature goes to zero."

The Rest of Physics, Chemistry, etc.

Functions of State

  • state = thermodynamic state = the macrostate
  • state function = state potential = a function of state

"In an ordinary chunk of metal at equilibrium, state functions include energy ($E$), entropy ($S$), temperature ($T$), molar volume ($V/N$), total mass, speed of sound, et cetera."

a function of state => independent of history => independent of path

"It is a fairly common mistake for people to say that $ΔE$ is a function of state. It’s not a function of state; it’s a function of two states, namely the initial state and the final state... $dE$ is a function of state "

sigma-delta = a sum of changes

"The sigma-delta of any function of state is independent of path."
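This is easy to see in a toy computation (my own sketch; the state $(T,V)$ and the function below are made up for illustration): summing the changes of a state function along any path between the same endpoints gives the same total.

```python
def E(state):
    # Toy state function of a macrostate (T, V):
    # ideal-gas energy in units where N*k = 1.
    T, V = state
    return 1.5 * T

path1 = [(300, 1.0), (400, 1.0), (400, 2.0)]   # heat first, then expand
path2 = [(300, 1.0), (300, 2.0), (400, 2.0)]   # expand first, then heat

def sigma_delta(f, path):
    # Sum of changes of f along consecutive steps of the path.
    return sum(f(b) - f(a) for a, b in zip(path, path[1:]))

# Same endpoints => same sigma-delta, regardless of the path taken:
print(sigma_delta(E, path1) == sigma_delta(E, path2))  # True
```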

"we assume that the system has a well-defined thermodynamic state, i.e. macrostate. This macrostate can be represented as a point in some abstract state-space. At each point in macrostate-space, the macroscopic quantities we are interested in (energy, entropy, pressure, volume, temperature, etc.) take on well-defined values. We further assume that this macrostate-space has dimensionality M, and that M is not very large."

Examples:

  • "ideal gas in equilibrium in a table-top cylinder, where the macrostate is determined by a few variables such as volume, temperature, and number of particles."
  • "a metal bar that is kept hot at one end and cold at the other end... at each point in the bar, there is a well-defined local temperature, a well-defined local energy density, et cetera."

State functions are smooth. This doesn't mean that the quantities are smooth functions of each other, so it may be impossible to use them to parametrize the state space.

The parameters of the state space:

  • $S$, the entropy.
  • $V$, the remaining variables = a vector with $M−1$ dimensions.

"In particular, we choose the macroscopic variable V in such a way that the microscopic energy $Ê_i$ of the ith microstate is determined by $V$. (For an ideal gas in a box, $V$ is just the volume of the box.)"

It follows: $$dE = \frac{∂E}{∂ V} dV + \frac{∂E}{∂S}dS.$$ Define: $$P=-\frac{∂E}{∂ V},$$ $$k\beta=\frac{∂S}{∂E},$$ $$T=1/(k\beta)=\frac{∂E}{∂S}.$$ Something related to temperature... It follows: $$dE = −P dV + T dS.$$ Define: $$w=-PdV,$$ $$q=TdS.$$ It follows: $$dE = w + q.$$ Work and heat?
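A numerical sanity check (my own sketch, not from the book): for a monatomic ideal gas in units where $Nk=1$ and with additive constants absorbed, inverting $S = \ln(V E^{3/2})$ gives an explicit $E(S,V)$; finite differences then recover $T$ and $P$, and they satisfy the ideal gas law $PV = NkT$.

```python
import math

def E(S, V):
    # Monatomic ideal gas, units with N*k = 1, constants absorbed:
    # inverting S = ln(V * E**1.5) gives E = V**(-2/3) * exp(2*S/3).
    return V ** (-2 / 3) * math.exp(2 * S / 3)

def partial(f, x, y, which, h=1e-6):
    # Central finite difference for a partial derivative.
    if which == 0:
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

S0, V0 = 1.0, 2.0
T = partial(E, S0, V0, which=0)    # T = ∂E/∂S at constant V
P = -partial(E, S0, V0, which=1)   # P = -∂E/∂V at constant S

# The ideal gas law P*V = N*k*T (here N*k = 1) follows from these definitions:
print(abs(P * V0 - T) < 1e-6)  # True
```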

For the constant-volume situation, treat $E$ as a function of $V$ and $T$: $$dE = \frac{∂E}{∂ V} dV + \frac{∂E}{∂T}dT.$$ With $dV=0$, only the second term survives. Here $$C_V= \frac{∂E}{∂T} = T\frac{∂S}{∂T}$$ is the energy capacity, aka “heat capacity at constant volume”.

For the constant-pressure situation, treat the enthalpy $H=E+PV$ as a function of $P$ and $T$: $$dH = \frac{∂H}{∂ P} dP + \frac{∂H}{∂T}dT.$$ With $dP=0$, $$C_P= \frac{∂H}{∂T}$$ is the enthalpy capacity, aka "heat capacity at constant pressure".

Collecting results for comparison, we have

  • $dE = C_V\, dT$ at constant $V$,
  • $dH = C_P\, dT$ at constant $P$,
  • $dS = (C_V/T)\, dT$ at constant $V$,
  • $dS = (C_P/T)\, dT$ at constant $P$.
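These heat capacities are easy to check numerically for the same toy monatomic ideal gas (my own sketch, units with $Nk=1$, so $E = \tfrac{3}{2}T$ and $PV = T$): finite differences give $C_V = 3/2$ and $C_P = 5/2$.

```python
def E(T):
    # Monatomic ideal gas, units with N*k = 1: E = (3/2) N k T.
    return 1.5 * T

def H(T):
    # Enthalpy H = E + P*V; the ideal gas law gives P*V = N*k*T = T here.
    return E(T) + T

def deriv(f, x, h=1e-6):
    # Central finite difference.
    return (f(x + h) - f(x - h)) / (2 * h)

C_V = deriv(E, 300.0)   # heat capacity at constant volume
C_P = deriv(H, 300.0)   # heat capacity at constant pressure
print(C_V, C_P)         # 1.5 and 2.5; note C_P - C_V = N*k = 1
```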

"...the concept of “heat” is a confusing chimera. It’s part energy and part entropy. It is neither necessary nor possible to have an unambiguous understanding of “heat”. If you understand energy and entropy, you don’t need to worry about heat."

Note: It is annoying when formulas from future chapters are used.

From the partial derivatives, it follows: $$dE = µ dN − P dV + T dS, $$ where $N$ is the number of particles and $$µ=\frac{ ∂ E }{∂ N } $$ is the chemical potential.

"So we see that dN and dV are two different directions in parameter space... all that matters is that they be different i.e. linearly independent. At the end of the day, we need a sufficient number of linearly independent variables, sufficient to span the parameter space."

"I cannot imagine why anyone would want to use [w+q] equation.. as “the” first law of thermodynamics. Instead, I recommend using the local law of conservation of energy … which is simpler, clearer, more fundamental, more powerful, and more general."

The W + Q Equation

Not $$dE = dW + dQ $$ because $W$ and $Q$ are not functions of state: the would-be one-forms $dW$ and $dQ$ are not exact, and their integrals are path-dependent.

"People heretofore have interpreted d in several ways: as a differential operator (with the power, among other things, to produce one-forms from scalars), as an infinitesimal step in some direction, and as the marker for the weighting function in an integral. The more I think about it, the more convinced I am that the differential operator interpretation is far and away the most advantageous. "

Connecting Entropy with Energy

9 Additional Fundamental Notions

10 Experimental Basis

11 More About Entropy

12 Entropy versus “Irreversibility” in Chemistry

13 The “Big Four” Energy-Like State Functions

14 Adiabatic Processes

15 Boundary versus Interior

16 Heat

17 Work

18 Cramped versus Uncramped Thermodynamics

19 Ambiguous Terminology

20 Thermodynamics, Restricted or Not

21 The Relevance of Entropy

22 Equilibrium, Equiprobability, Boltzmann Factors, and Temperature

23 Partition Function

24 Partition Function for Particle(s) in a Box

25 Density Matrices