Gibbs' Representation of Entropy
Let $P_i$ denote the probability that the macrostate of a system is the $i$th state. The measured entropy $S$ of the system can then be expressed as follows:

$$
S = -k_B \sum_i P_i \ln P_i
$$

Here, $k_B$ is the Boltzmann constant.
Explanation
Shannon Entropy: When the probability mass function of a discrete random variable $X$ is $p(x)$, the entropy of $X$ is expressed as follows.

$$
H(X) := -\sum p(x) \log_2 p(x)
$$
Gibbs’ expression for entropy takes essentially the same form as the Shannon entropy defined in information theory.
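As a quick numerical illustration, with a made-up three-state distribution (the probabilities below are purely illustrative), the two quantities differ only by the constant factor $k_B \ln 2$ coming from the choice of logarithm base and the physical constant:

```python
import math

# A hypothetical probability distribution over three macrostates.
P = [0.5, 0.3, 0.2]

k_B = 1.380649e-23  # Boltzmann constant in J/K

# Shannon entropy in bits: base-2 logarithm, no physical constant.
H = -sum(p * math.log2(p) for p in P)

# Gibbs entropy in J/K: natural logarithm, scaled by k_B.
S = -k_B * sum(p * math.log(p) for p in P)

# Converting S back to bits recovers H.
print(math.isclose(H, S / (k_B * math.log(2))))  # True
```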
Derivation
Part 1.
First law of thermodynamics:
$$
dU = \delta Q + \delta W
$$
From the definition of entropy, $\delta Q = T dS$, and for pressure–volume work $\delta W = -p dV$. Substituting these gives the following equation, which is another form of the first law of thermodynamics.

$$
dU = T dS - p dV
$$
Regarding $U$ as a function of $S$ and $V$ and taking its total differential, we have:

$$
dU = \frac{\partial U}{\partial S} dS + \frac{\partial U}{\partial V} dV
$$
Comparing the coefficients of $dS$ in the two equations, we see that the following holds.

$$
T = \frac{\partial U}{\partial S}
$$
Definition of temperature:
$$
\frac{1}{k_B T} := \frac{d \ln \Omega}{dE}
$$
By the definition of temperature, since $T = \partial U / \partial S$, we have $\dfrac{1}{k_B} \dfrac{\partial S}{\partial U} = \dfrac{d \ln \Omega}{dE}$. Setting the energy $U = E$ and solving this differential equation yields:

$$
S = k_B \ln \Omega
$$
Here, $\Omega$ is the number of distinguishable states of the system at energy $U$.
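When all $\Omega$ states are equally likely, the Gibbs formula from the statement reduces to this Boltzmann form. A minimal numerical check, with a hypothetical count $\Omega = 1000$:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K
Omega = 1000        # hypothetical number of accessible states

# Equal probabilities P_i = 1/Omega turn the Gibbs sum
# -k_B * sum(P_i ln P_i) into k_B * ln(Omega).
S_gibbs = -k_B * sum((1 / Omega) * math.log(1 / Omega) for _ in range(Omega))
S_boltzmann = k_B * math.log(Omega)

print(math.isclose(S_gibbs, S_boltzmann))  # True
```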
Part 2. $S_{\text{total}} = S + S_{\text{micro}}$
According to Part 1, if the number of states we can easily observe and distinguish is $X$, then $S$ can be expressed as:

$$
S = k_B \ln X
$$
Suppose each of these $X$ states consists of the same number $Y$ of microstates. Although the microstates cannot be observed directly, their entropy can still be written as $S_{\text{micro}} = k_B \ln Y$. Meanwhile, the entire system actually has $X \times Y = XY$ states, so $S_{\text{total}} = k_B \ln XY$. Written out as a formula, the total entropy is the sum of the observed entropy and the entropy corresponding to the microstates.
$$
\begin{aligned}
S_{\text{total}} &= k_B \ln XY
\\ &= k_B \ln X + k_B \ln Y
\\ &= S + S_{\text{micro}}
\end{aligned}
$$
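The additivity above is just $\ln XY = \ln X + \ln Y$. A small sketch with hypothetical counts $X$ and $Y$:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

# Hypothetical counts: X observable states, each comprising Y microstates.
X, Y = 50, 200

S       = k_B * math.log(X)      # observed entropy
S_micro = k_B * math.log(Y)      # entropy of the hidden microstates
S_total = k_B * math.log(X * Y)  # entropy over all X*Y states

print(math.isclose(S_total, S + S_micro))  # True
```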
Part 3. $P_i$
Let the total number of microstates of the system be $N$, and let $n_i$ be the number of microstates corresponding to the $i$th macrostate, so that $\sum_i n_i = N$. Therefore, the probability that the macrostate is the $i$th state is as follows:

$$
P_i := \frac{n_i}{N}
$$
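For concreteness, with hypothetical microstate counts $n_i$ this definition gives:

```python
import math

# Hypothetical microstate counts n_i for three macrostates.
n = [10, 30, 60]
N = sum(n)  # total number of microstates

# Probability of each macrostate: P_i = n_i / N.
P = [n_i / N for n_i in n]
print(P)  # [0.1, 0.3, 0.6]

assert math.isclose(sum(P), 1.0)  # probabilities sum to 1
```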
Part 4.
However, since we cannot tell which microstate the system actually occupies, we compute the entropy of the microstates as a probabilistic expected value:

$$
S_{\text{micro}} = \langle S_i \rangle = \sum_i P_i S_i = \sum_i P_i k_B \ln n_i
$$
Meanwhile, the entropy of the entire system is simply $S_{\text{total}} = k_B \ln N$, and since $S_{\text{total}} = S + S_{\text{micro}}$ by Part 2,

$$
S = S_{\text{total}} - S_{\text{micro}} = k_B \left( \ln N - \sum_i P_i \ln n_i \right)
$$
At this point, because $\sum_i P_i = 1$ implies $\ln N = \sum_i P_i \ln N$, the following holds true:

$$
\ln N - \sum_i P_i \ln n_i = \sum_i P_i \left( \ln N - \ln n_i \right) = -\sum_i P_i \ln \frac{n_i}{N} = -\sum_i P_i \ln P_i
$$
To summarize, we obtain the following formula:

$$
S = -k_B \sum_i P_i \ln P_i
$$
■
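The whole derivation can also be checked numerically: for any hypothetical set of microstate counts $n_i$, the intermediate expression $k_B \left( \ln N - \sum_i P_i \ln n_i \right)$ from Part 4 agrees with the final Gibbs formula.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

# Hypothetical microstate counts n_i for four macrostates.
n = [5, 15, 30, 50]
N = sum(n)
P = [n_i / N for n_i in n]  # P_i = n_i / N, as in Part 3

# Intermediate form from Part 4: S = k_B (ln N - sum_i P_i ln n_i)
lhs = k_B * (math.log(N) - sum(p * math.log(n_i) for p, n_i in zip(P, n)))

# Final Gibbs form: S = -k_B sum_i P_i ln P_i
rhs = -k_B * sum(p * math.log(p) for p in P)

print(math.isclose(lhs, rhs))  # True
```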