lpetrich
Contributor
Note that R(N) is ambiguous: we can replace R(N) with R(N) + N*K for any constant K without changing the entropy.
Another composition rule is that a combined system where the component systems' variations are uncoupled must have an entropy that is the sum of the systems' entropies.
This is easiest for equal probabilities. In that case, S = R(1) - R(N)/N. Redefining R with a suitable value of K to make R(1) = 0, we get S(N) = - R(N)/N.
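To see that this redefinition is harmless, note that the equal-probability expression is invariant under the replacement R(N) → R(N) + N*K: the constant K cancels between the two terms,
\( \displaystyle{ \left( R(1) + K \right) - \frac{R(N) + N K}{N} = R(1) - \frac{R(N)}{N} } \)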
For two independent equal-probability systems with M and N states (state probabilities 1/M and 1/N), each state of the combined system has probability 1/(M*N). So,
S(M*N) = S(M) + S(N)
and this functional equation (for monotonic S) means that S(N) = S0*log(N) for some constant S0. It follows that R(N) = - S0*N*log(N) and that
\( \displaystyle{ R(N_1, \dots, N_n) = - S_0 \left( \sum_{i=1}^n N_i \log N_i - N \log N \right) = - S_0 \left( \sum_{i=1}^n N_i \log \frac{N_i}{N} \right) = - S_0 N \left( \sum_{i=1}^n P_i \log P_i \right) } \)
Thus,
\( \displaystyle{ S = - S_0 \sum_s P_s \log P_s } \)
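As a quick numerical sanity check (a sketch with arbitrary example counts and S0 = 1, not part of the derivation), the final probability formula agrees with the count-based expression above, and the equal-probability entropy S0*log(N) is additive over independent systems:

```python
import math

S0 = 1.0                 # arbitrary positive constant
counts = [3, 5, 2, 10]   # example occupation numbers N_i (made-up data)
N = sum(counts)
probs = [c / N for c in counts]

# Final formula: S = -S0 * sum_s P_s log P_s
S_from_probs = -S0 * sum(p * math.log(p) for p in probs)

# Count form: -(S0/N) * (sum_i N_i log N_i - N log N)
S_from_counts = -(S0 / N) * (sum(c * math.log(c) for c in counts)
                             - N * math.log(N))

assert math.isclose(S_from_probs, S_from_counts)

# Equal-probability case: S(N) = S0*log(N) satisfies S(M*N) = S(M) + S(N)
def S_equal(n):
    return S0 * math.log(n)

assert math.isclose(S_equal(6 * 7), S_equal(6) + S_equal(7))
```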