
Heat, Energy, Motion and Absolute Zero

Note that R(N) is ambiguous. We can replace R(N) with R(N) + N*K for constant K.

Another composition rule: when the component systems are statistically independent (their variations are uncoupled), the combined system's entropy must be the sum of the component systems' entropies.

This is easiest for equal probabilities, 1/N each. In that case, S = R(1) - R(N)/N; note that this combination is unaffected by the ambiguity above, since replacing R(N) with R(N) + N*K adds K to both terms. Redefining R with a suitable value of K to make R(1) = 0, we get S(N) = - R(N)/N.

For two equal-probability systems with state probabilities 1/M and 1/N, the combined system has M*N states, each with probability 1/(M*N). So,

S(M*N) = S(M) + S(N)

and the solution of this functional equation is S(N) = S0*log(N) for some constant S0. This means that R(N) = - S0*N*log(N) and that

\( \displaystyle{ R(N_1, \dots, N_n) = - S_0 \left( \sum_{i=1}^n N_i \log N_i - N \log N \right) = - S_0 \left( \sum_{i=1}^n N_i \log \frac{N_i}{N} \right) = - S_0 N \left( \sum_{i=1}^n P_i \log P_i \right) } \)

Thus,

\( \displaystyle{ S = - S_0 \sum_s P_s \log P_s } \)
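The additivity rule can be checked directly: for two independent systems the joint probabilities are the products of the component probabilities, and the entropy of the joint distribution equals the sum of the component entropies. A minimal Python sketch (the two distributions are arbitrary illustrative choices, with S0 = 1):

```python
import math

def entropy(ps):
    """Shannon entropy S = -sum_s P_s log P_s, taking S0 = 1."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Two arbitrary normalized example distributions
P = [0.5, 0.3, 0.2]
Q = [0.6, 0.4]

# Joint distribution of the uncoupled combined system: products P_a * Q_b
joint = [p * q for p in P for q in Q]

# Entropy of the combination equals the sum of the entropies
assert abs(entropy(joint) - (entropy(P) + entropy(Q))) < 1e-12
```

The same check with equal probabilities 1/M and 1/N reproduces S(M*N) = S(M) + S(N) above.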
 
21: The Boltzmann Distribution Function - Chemistry LibreTexts - also contains a derivation of entropy.

Now I derive the Boltzmann distribution. I try to maximize entropy while keeping the total energy fixed. One can do that with the method of Lagrange multipliers: extremize the entropy with each fixed quantity (here, the total probability and the total energy) multiplied by an additional variable and subtracted off.

\( \displaystyle{ S' = S - q_0 \sum_s P_s - q_1 E } \)
which, using \( E = \sum_s P_s E_s \), is
\( \displaystyle{ S' = \sum_s ( - P_s \log P_s - q_0 P_s - q_1 P_s E_s ) } \)

Now take the derivative with respect to \( P_s \):

\( \displaystyle{ \frac{\partial S'}{\partial P_s} = - \log P_s - 1 - q_0 - q_1 E_s } \)

Setting this to zero gives

\( \displaystyle{ P_s = P_0 e^{- q_1 E_s } } \)

where \( P_0 = e^{-1-q_0} \) is fixed by normalization.

So the Lagrange multiplier q1 is identified as β = 1/T for temperature T, ignoring Boltzmann's constant, which is just a units-conversion factor.
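One can also verify numerically that the Boltzmann distribution maximizes entropy at fixed total probability and fixed mean energy. A minimal Python sketch (the three energy levels, the value of β, and the perturbation direction are arbitrary illustrative choices):

```python
import math

def entropy(ps):
    """Shannon entropy S = -sum_s P_s log P_s, taking S0 = 1."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Hypothetical three-level system with energies E_s, and beta = q1 = 1
E = [0.0, 1.0, 2.0]
beta = 1.0

# Boltzmann distribution: P_s = P0 * exp(-beta * E_s), P0 from normalization
weights = [math.exp(-beta * e) for e in E]
P0 = 1.0 / sum(weights)
boltz = [P0 * w for w in weights]

# Perturbation direction v chosen so that sum(v) = 0 and sum(v_i * E_i) = 0:
# boltz + t*v keeps both the normalization and the mean energy unchanged
v = [1.0, -2.0, 1.0]
S_max = entropy(boltz)
for t in [-0.05, -0.01, 0.01, 0.05]:
    other = [p + t * vi for p, vi in zip(boltz, v)]
    assert all(p > 0 for p in other)
    assert entropy(other) < S_max  # any other distribution has lower entropy
```

Every distribution along the constraint direction has strictly lower entropy than the Boltzmann one, as the Lagrange-multiplier argument above predicts.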
 