In statistical mechanics, entropy (as per Boltzmann) is defined to be, in dimensionless form,

\[\sigma = \ln \Omega\]

where \(\Omega\) is the number of accessible microstates. For a complex system composed of subsystems in interaction with each other, we define the total entropy as

\[\sigma_{\text{total}} = \sum_i \ln \Omega_i = \ln \prod_i \Omega_i\]
A subsystem can be the phase space of an independent variable, such as position or velocity; it doesn't necessarily refer to physical systems like bottles of gas in contact. For example, for an ideal gas composed of independently moving point-like atoms, one phase space plots the current positions of all the individual atoms, while another plots all their current velocities.
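To make this concrete, here is a minimal Python sketch of what one microstate of such a gas looks like: a single point in the position phase space together with a single point in the velocity phase space. The unit box, unit thermal speed, and atom count below are arbitrary illustrative choices, not values from the discussion above.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5  # number of point-like atoms (illustrative)

# One microstate of the ideal gas: one point in each of the two phase spaces.
positions  = rng.uniform(0.0, 1.0, size=(N, 3))  # position phase space: 3N coordinates
velocities = rng.normal(0.0, 1.0, size=(N, 3))   # velocity phase space: 3N coordinates

print(positions.shape, velocities.shape)  # (5, 3) (5, 3)
```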
One property of entropy is that if there are any changes among the subsystems, the total entropy will always rise, reaching a maximum when equilibrium is attained. This is known as the 2nd Law of Thermodynamics [which is yet to be rigorously proven]. In statistical mechanics, this property rests on a number of assumptions, some of which are
1) \(\Omega\) is the number of accessible microstates, each of which has [at least roughly] equal probability of being accessed
2) Each subsystem circulates freely within its phase space of microstates
3) There is some conserved quantity, constraint, or interdependence between the subsystems, so that there isn't runaway expansion in the number of microstates; otherwise there could be no state of equilibrium at which entropy is at a maximum
4) Probability favors a subsystem being in a larger phase space over a smaller one
The simplest example of such a system is where

\[\Omega_1 + \Omega_2 = C\]

for some constant \(C\), so that the total entropy \(\ln \Omega_1 + \ln \Omega_2\) would have a maximum at

\[\Omega_1 = \Omega_2 = \frac{C}{2}\]
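As a quick numerical check of this maximum, here is a brute-force sketch in Python (the total \(C = 1000\) is an arbitrary illustrative choice):

```python
import numpy as np

# Two subsystems share a fixed total of C microstates: Omega_1 + Omega_2 = C
# (the conserved constraint of assumption 3 above).
C = 1000
omega1 = np.arange(1, C)               # Omega_1 = 1 .. C-1
omega2 = C - omega1                    # Omega_2 fixed by the constraint
sigma_total = np.log(omega1) + np.log(omega2)

# The total entropy peaks when the subsystems split the total evenly.
print(omega1[np.argmax(sigma_total)])  # 500, i.e. Omega_1 = Omega_2 = C/2
```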
For an ideal thermodynamic gas, we assume that the governing variables are

\[m, \; v, \; L, \; N\]

or mass, root-mean-square velocity, length, and number of atoms.
For two subsystems of phase volumes \(\Omega_x\) and \(\Omega_v\), and for some dimensional proportionality constant \(c\), we have (with \(m\) having a phase space size of \(1\) because it remains constant)

\[\sigma = \ln\left(\Omega_x \Omega_v\right) = \ln\left(\frac{L\,v}{c}\right)^{3N}\]

Defining physical volume to be \(V = L^3\), we have

\[\sigma = N \ln\left(\frac{V v^3}{c^3}\right)\]
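As a quick illustration of this formula: the constant \(c\) cancels in any difference of entropies, so isothermally doubling the volume gives the familiar \(\Delta\sigma = N \ln 2\). A minimal sketch (the values of \(N\), \(v\), and \(c\) below are arbitrary):

```python
import numpy as np

# Dimensionless entropy from sigma = N ln(V v^3 / c^3); c cancels in differences.
N, v, c = 100.0, 1.0, 1.0
sigma = lambda V: N * np.log(V * v**3 / c**3)

# Isothermal doubling of the volume:
print(sigma(2.0) - sigma(1.0), N * np.log(2.0))  # both ~69.3147
```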
From statistical mechanics of an ideal gas, we make use of the following definitions and relations

\[U = \frac{3}{2} N k T = \frac{1}{2} N m v^2, \qquad P V = N k T\]

where \(T\) is temperature, \(P\) is pressure, and \(U\) is internal energy.
From these, and the expression for \(\sigma\), we have (using \(v^2 = \frac{3kT}{m}\), so that \(\sigma = N \ln V + \frac{3}{2} N \ln T + \text{const}\), and defining classical entropy \(S = k\,\sigma\))

\[T\,dS = \frac{NkT}{V}\,dV + \frac{3}{2}Nk\,dT = P\,dV + dU\]
If we define \(dQ = T\,dS\), where \(Q\) is heat, then we have the classic thermodynamic expression

\[dQ = dU + P\,dV\]
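Here is a numerical sanity check of this identity, in units where \(k = 1\) (so \(S = \sigma\)); the state values and step sizes below are arbitrary illustrative choices:

```python
import numpy as np

# Check dQ = T dS = dU + P dV for an ideal gas, assuming the relations above:
# sigma(V, T) = N ln V + (3/2) N ln T + const,  U = (3/2) N T,  P = N T / V.
N = 1.0e3
V, T = 2.0, 300.0
dV, dT = 1.0e-6, 1.0e-6

sigma = lambda V, T: N * np.log(V) + 1.5 * N * np.log(T)

dS = sigma(V + dV, T + dT) - sigma(V, T)
dU = 1.5 * N * dT           # dU = (3/2) N dT
P  = N * T / V              # ideal gas law with k = 1

print(T * dS, dU + P * dV)  # agree to first order in dV, dT: ~0.1515 each
```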
Here \(k\) is the Boltzmann constant, with the value \(k = 1.380649 \times 10^{-23}\ \text{J/K}\), so that classical entropy \(S = k\,\sigma\) has the physical dimension of energy divided by temperature.