Keith&Co:
Just a tangent:
As far as piles of cards vs houses of cards go, there is also the concept of "entropy" which seems to be measured in joules per kelvin...? (though that measurement doesn't have a clear relationship with the complexity of the behaviour required to create the arrangement)
The entropy of a system in Physical Chemistry is proportional to the (natural) log of the number of 'microstates' in the system, and is a measure of how 'disordered' it is - microstates are defined by the fact that new information is needed to describe them (rather than saying 'just like another one we already looked at'). The number of microstates is obviously a dimensionless number, but the constant of proportionality (Boltzmann's Constant*) in this case has units J.K⁻¹, and this allows us to see that statistical mechanical entropy (i.e. as defined by the number of microstates, S = k_B·ln W) is equivalent to thermodynamic entropy - the change in entropy of a system is the heat reversibly transferred to it divided by the absolute temperature - hence J.K⁻¹.
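To make the proportionality concrete in code (the two-state 'cards' example below is my own invention, not anything from the thread), here's a minimal Python sketch of S = k_B·ln W:

from math import log

k_B = 1.38064852e-23  # Boltzmann's constant, J.K⁻¹ (see the footnote below)

def boltzmann_entropy(W):
    """S = k_B * ln(W): statistical entropy for W equally likely microstates."""
    return k_B * log(W)

# Hypothetical example: N independent two-state cards (face-up / face-down)
# give W = 2**N equally likely microstates.
for n in (4, 100):
    print(f"N = {n}: S = {boltzmann_entropy(2 ** n):.3e} J.K⁻¹")

Even 100 two-state cards give an entropy of only ~10⁻²¹ J.K⁻¹ - macroscopic entropies are appreciable only because W is astronomically huge.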
Entropy has nothing to do with 'behaviour'; it's a measure of disorder, and can be thought of as the amount of information needed to completely describe the system. In your picture, once you describe the position and orientation of the first card, you can describe group A without doing that separately for all four cards - card 2 is 'the same orientation as card 1', and the positions of cards 3 & 4 are a function of the positional difference between cards 1 & 2. Group B is more disordered, and has a higher entropy, but cards 1 and 4 still share a vertical orientation (unlike in group C), so group B has lower entropy than group C does.
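As a very loose numerical stand-in (the orientations below are invented for illustration - I haven't read them off your actual picture), you can count the bits needed per card and recover the same ordering A < B < C:

from collections import Counter
from math import log2

def shannon_entropy(observations):
    """Average bits needed per observation: H = -sum(p * log2(p))."""
    total = len(observations)
    return -sum((c / total) * log2(c / total)
                for c in Counter(observations).values())

# Hypothetical card orientations (degrees) for the three groups:
group_a = [0, 0, 0, 0]        # all aligned: nothing new after card 1
group_b = [90, 30, 60, 90]    # cards 1 and 4 share an orientation
group_c = [10, 45, 80, 120]   # every card needs its own description

for name, group in (("A", group_a), ("B", group_b), ("C", group_c)):
    print(f"group {name}: {shannon_entropy(group):.2f} bits per card")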
In most cases, one cannot determine the absolute entropy of a system, but only the change in entropy - to know the absolute entropy, one would need to know all of the possible variables, including any (as yet undiscovered) sub-quark structures that may or may not even exist. So you can say that a particular change to a system has increased the entropy by ΔS J.K⁻¹, but not what the absolute starting (or ending) entropy was. By convention, entropy is denoted by 'S', so a change in entropy is given as 'ΔS', where Δ ('delta') is the conventional symbol for the change in a quantity.
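A textbook example of measuring only the change (my choice of scenario, not something from the thread): when one mole of ideal gas expands isothermally into double the volume, ΔS = nR·ln(V₂/V₁), and we can compute that without ever knowing the absolute entropy:

from math import log

R = 8.314462618  # Ideal Gas Constant, J.K⁻¹.mol⁻¹
n = 1.0          # moles of ideal gas (hypothetical)

# Doubling the volume: ΔS = n * R * ln(V2 / V1) with V2 / V1 = 2.
delta_S = n * R * log(2)
print(f"ΔS = {delta_S:.3f} J.K⁻¹")  # ≈ 5.763 J.K⁻¹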
So we can precisely determine how 'disordered' a system is relative to another system.
Disorder is all about unpredictability. The less easily you can predict (or describe) the next part of the system, based on the parts you have already examined, the higher the entropy of the system is.
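A crude but hands-on way to see that (compressed size is only a rough stand-in for entropy, and the strings here are made up): feed an ordered and an unpredictable sequence to a compressor and compare the description lengths:

import random
import zlib

random.seed(0)
ordered = b"AB" * 500                                        # perfectly predictable
shuffled = bytes(random.choice(b"AB") for _ in range(1000))  # unpredictable

# A compressor exploits predictability, so the ordered string shrinks far more.
print("ordered: ", len(zlib.compress(ordered)), "bytes")
print("shuffled:", len(zlib.compress(shuffled)), "bytes")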
*k_B = 1.38064852(79) × 10⁻²³ J.K⁻¹; it is equal to the Ideal Gas Constant divided by Avogadro's constant, and relates temperature to energy per particle in an ideal gas (the Ideal Gas Constant relates temperature to energy per mole).
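That relationship is easy to sanity-check numerically (using the CODATA values bundled with scipy):

from scipy.constants import R, N_A, k  # Ideal Gas Constant, Avogadro, Boltzmann

# k_B should equal the Ideal Gas Constant divided by Avogadro's constant.
print(R / N_A)  # ≈ 1.380649e-23 J.K⁻¹
print(k)        # the same value, as stored in scipy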