
Biological Physics/Probability, Entropy, & the Second Law


Entropy of an Ideal Gas


When a reaction releases heat, it is said to be exothermic, with enthalpy change $\Delta H < 0$. If a reaction absorbs heat from its surroundings, it is said to be endothermic, with $\Delta H > 0$. Many endothermic reactions obtain their energy by rearranging into more entropic final states. Entropy, in its most basic definition, is the amount of disorder a system contains. By the Second Law, total entropy always increases: a reaction will not proceed spontaneously unless the combined entropy of the system and its environment increases.

When considering the entropy of an ideal gas, one can ask how the gas molecules can be arranged in a volume. The entropy of an ideal gas involves its multiplicity (represented by the symbol $\Omega$), defined as the number of microstates corresponding to each macrostate. A microstate is one specific configuration of the system (the exact position and momentum of every particle), whereas a macrostate is specified by bulk quantities such as total energy and volume; many different microstates can realize the same macrostate.

Take, for example, a gas confined to a sphere. Dividing the volume of the sphere by the smallest resolvable cell of volume $(\Delta x)^3$ gives the number of distinct places a particle can occupy: $\Omega_{vol} = V/(\Delta x)^3$. To derive the total multiplicity, we also need the multiplicity of the momentum (how many momenta a particle can have). For this we use the Heisenberg Uncertainty Principle, which sets the smallest cell into which position and momentum can be divided: $\Delta x \, \Delta p \approx h$, where $h$ is Planck's constant and $p$ is momentum. The uncertainty principle also implies that the particles cannot be packed too closely, or their wavefunctions overlap and they exhibit quantum mechanical behavior; this treatment therefore assumes the gas is not too dense. The kinetic energy of a particle is $E = \frac{1}{2}mv^2 = \frac{p^2}{2m}$. In three dimensions, $E = \frac{1}{2m}(p_x^2 + p_y^2 + p_z^2)$, so a fixed energy defines a sphere of radius $p = \sqrt{2mE}$ in momentum space. The multiplicity associated with momentum is then $\Omega_p = V_p/(\Delta p)^3$, where $V_p$ is the accessible volume in momentum space. The total multiplicity of the ideal gas is the number of ways the gas can be arranged spatially multiplied by the number of ways the momenta can be divided among the gas particles, or, more succinctly, $\Omega = \Omega_{vol} \, \Omega_p$. By substitution, $\Omega = \frac{V}{(\Delta x)^3} \cdot \frac{V_p}{(\Delta p)^3}$. We consider both position and momentum because each particle can independently have a different position and a different momentum. Using $\Delta x \, \Delta p \approx h$, this simplifies to $\Omega = \frac{V \, V_p}{h^3}$ for a single particle.
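As a rough numerical illustration, here is a minimal Python sketch of the single-particle result $\Omega = V V_p / h^3$. The specific numbers (one helium atom in a one-litre volume at room temperature) are assumptions of mine, not from the text:

```python
import math

# Assumed example values (not from the text): one helium atom in 1 L at 300 K
h = 6.626e-34        # Planck's constant, J s
kB = 1.381e-23       # Boltzmann's constant, J/K
m = 6.646e-27        # mass of a helium atom, kg
V = 1.0e-3           # spatial volume, m^3 (1 litre)
T = 300.0            # temperature, K

E = 1.5 * kB * T               # typical kinetic energy, (3/2) k_B T
p = math.sqrt(2 * m * E)       # radius of the momentum-space sphere
Vp = (4 / 3) * math.pi * p**3  # accessible momentum-space volume

Omega = V * Vp / h**3          # position cells times momentum cells
print(f"p     = {p:.3e} kg m/s")
print(f"Omega = {Omega:.3e}")  # an astronomically large number, ~1e28
```

Even for a single atom the multiplicity is enormous, which is why we will soon work with its logarithm instead.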

The formula for entropy is $S = k_B \ln \Omega$. For an ideal gas, the Sackur-Tetrode Equation applies: $S = N k_B \left[ \ln\!\left( \frac{V}{N} \left( \frac{4 \pi m U}{3 N h^2} \right)^{3/2} \right) + \frac{5}{2} \right]$. When an ideal gas goes from a smaller volume $V_i$ to a larger volume $V_f$ at fixed $N$ and $U$, working through the derivation gives $\Delta S = N k_B \ln\!\left(\frac{V_f}{V_i}\right)$. For an isothermal process, the work done on the gas is $W = -\int p \, dV = -N k_B T \ln\!\left(\frac{V_f}{V_i}\right)$. The change in internal energy is the sum of work and heat, $\Delta U = Q + W$; for an isothermal ideal gas $\Delta U = 0$, so $Q = -W = N k_B T \ln\!\left(\frac{V_f}{V_i}\right)$. If we remember that a property of natural logs is that flipping the argument negates it, $\ln\!\left(\frac{V_i}{V_f}\right) = -\ln\!\left(\frac{V_f}{V_i}\right)$, then comparing with $\Delta S$ above yields the general formula $\Delta S = \frac{Q}{T}$. But remember! This derivation only holds for an isothermal process.
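A quick numerical check that the two expressions agree, as a sketch with assumed values (one mole of gas doubling its volume at 300 K):

```python
import math

kB = 1.381e-23       # Boltzmann's constant, J/K
N = 6.022e23         # number of particles (one mole, assumed for illustration)
T = 300.0            # temperature, K (held constant: isothermal)
Vi, Vf = 1.0, 2.0    # initial and final volumes (only the ratio matters)

dS = N * kB * math.log(Vf / Vi)     # entropy change from the Sackur-Tetrode result
Q = N * kB * T * math.log(Vf / Vi)  # heat absorbed in the isothermal expansion

print(f"dS    = {dS:.3f} J/K")      # about 5.76 J/K
print(f"Q / T = {Q / T:.3f} J/K")   # identical, confirming dS = Q/T
```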

What happens when temperature is not held constant? By the definition of heat capacity, $dQ = C \, dT$. Since $dS = \frac{dQ}{T}$, it follows that $\Delta S = \int_{T_i}^{T_f} \frac{C}{T} \, dT = C \ln\!\left(\frac{T_f}{T_i}\right)$ for a temperature-independent heat capacity $C$.

Example


Take 1 g of water at 293 K and another 1 g of water at 323 K and mix them together. In thermal equilibrium, the 2 g of water are at temperature 308 K. What are the entropy changes involved?

Begin with $\Delta S = m c \ln\!\left(\frac{T_f}{T_i}\right)$, where $c = 4.186\ \mathrm{J\,g^{-1}\,K^{-1}}$ is the specific heat of water. For the cooler system, $\Delta S_{cool} = (1\ \mathrm{g})\, c \ln\!\left(\frac{308}{293}\right) \approx +0.209\ \mathrm{J/K}$. For the warmer system, $\Delta S_{warm} = (1\ \mathrm{g})\, c \ln\!\left(\frac{308}{323}\right) \approx -0.199\ \mathrm{J/K}$. Now, to determine how the total entropy of the system behaves, add the entropy changes of the cooler and the warmer system together: $\Delta S_{total} \approx +0.010\ \mathrm{J/K} > 0$, so the mixing is allowed by the Second Law.
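The arithmetic is easy to verify; here is a short Python check (the specific heat of water is a standard value, and the temperatures come from the example):

```python
import math

c = 4.186            # specific heat of water, J/(g K)
m = 1.0              # grams of water in each sample
Ti_cool, Ti_warm, Tf = 293.0, 323.0, 308.0   # temperatures in kelvin

dS_cool = m * c * math.log(Tf / Ti_cool)     # entropy gained by the cool water
dS_warm = m * c * math.log(Tf / Ti_warm)     # entropy lost by the warm water

print(f"dS_cool  = {dS_cool:+.4f} J/K")              # about +0.209 J/K
print(f"dS_warm  = {dS_warm:+.4f} J/K")              # about -0.199 J/K
print(f"dS_total = {dS_cool + dS_warm:+.4f} J/K")    # about +0.010 J/K > 0
```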

Two State Systems


Two-state systems are systems whose elements can each exist in only one of two states at a time. Examples are monomers that can be linked into a polymer either in a straight line or with a 180° turn, the spin-up or spin-down state of an electron, or a coin flip. Let's consider the coin flip for a moment. The following chart displays the relationship between the number of coins flipped, the macrostates, the probability, the microstates, and the multiplicity. The macrostate is the total number of heads (or tails) that turn up for a given system. A microstate is any particular ordered sequence of heads and tails that produces a given macrostate. The probability of a single microstate is the reciprocal of the number of states per coin raised to the power of the number of coins; for fair coins it is $(1/2)^N$. The multiplicity $\Omega$ is the number of microstates belonging to a given macrostate. This will make more sense when looking at the table below.

# Coins | Macrostate | Probability of each microstate | Microstates | Ω
0 | 0H 0T | 1 | (none) | 1
1 | 1H 0T | ½ | H | 1
1 | 0H 1T | ½ | T | 1
2 | 2H 0T | ½ × ½ = ¼ | HH | 1
2 | 1H 1T | ½ × ½ = ¼ | HT, TH | 2
2 | 0H 2T | ½ × ½ = ¼ | TT | 1
3 | 3H 0T | ½ × ½ × ½ = ⅛ | HHH | 1
3 | 2H 1T | ½ × ½ × ½ = ⅛ | HHT, HTH, THH | 3
3 | 1H 2T | ½ × ½ × ½ = ⅛ | TTH, THT, HTT | 3
3 | 0H 3T | ½ × ½ × ½ = ⅛ | TTT | 1

You might be able to see a pattern forming in the Ω column: it is following Pascal's Triangle.

              1
            1   1
          1   2   1
        1   3   3   1
      1   4   6   4   1
   1   5   10   10   5   1

etc.

The total probability of each macrostate is the probability of a single microstate multiplied by the multiplicity. For example, with two coins the 1H 1T macrostate has probability $2 \times \frac{1}{4} = \frac{1}{2}$. The probability of any single microstate shrinks with the number of coins as follows:

# Coins | Probability of each microstate
0 | 1
1 | 1/2
2 | 1/4
3 | 1/8
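The tables above can be reproduced by brute-force enumeration. A minimal Python sketch (my own illustration, not part of the original text):

```python
from itertools import product
from collections import Counter

def coin_macrostates(n_coins):
    """Group all 2**n_coins microstates by their number of heads."""
    counts = Counter(seq.count("H") for seq in product("HT", repeat=n_coins))
    for heads in sorted(counts):
        multiplicity = counts[heads]
        prob = multiplicity / 2**n_coins   # multiplicity times (1/2)^N
        print(f"{heads}H {n_coins - heads}T: omega = {multiplicity}, P = {prob}")

coin_macrostates(3)   # reproduces the 1, 3, 3, 1 row of Pascal's Triangle
```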

We will start moving to more general systems now. Say you have $N$ coins. How many ways can you get $n$ heads? You can use the choose function (binomial coefficient) to determine this number.

$\Omega(N, n) = \binom{N}{n} = \frac{N!}{n!\,(N - n)!}$

and

$S = k_B \ln \Omega(N, n)$

By taking the logarithm, we convert a quantity that multiplies (multiplicity) into one that adds, just as we know entropy is additive. For a mole of coin flips ($N \approx 6 \times 10^{23}$), the distribution of the number of heads is essentially a single sharp spike at $n = N/2 \approx 3 \times 10^{23}$.
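We cannot enumerate a mole of coins, but Python's math.comb handles exact binomial coefficients for large $N$, enough to watch the spike sharpen. A sketch, with the $N$ values chosen arbitrarily:

```python
import math

# The number of heads for N fair coins has P(n) = C(N, n) / 2**N.
# The relative width of this distribution shrinks like 1/sqrt(N), so for a
# mole of coins it is effectively a single spike at n = N/2.
for N in (10, 100, 10000, 100000):
    p_peak = math.comb(N, N // 2) / 2**N   # probability of exactly N/2 heads
    rel_width = 0.5 / math.sqrt(N)         # (std. dev.)/N = 1/(2 sqrt(N))
    print(f"N={N:>7}: P(n=N/2) = {p_peak:.2e}, relative width ~ {rel_width:.1e}")
```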

The Einstein Ideal Solid


Einstein came up with a simple way to model "ideal solids". Imagine that the particles in a solid are arranged in a cubic lattice and connected to their neighbors by springs. If a spring is compressed or stretched, the energy stored in it is that of a simple harmonic oscillator: $E = \frac{1}{2} k x^2$. Each particle can oscillate in three dimensions, so its total energy is $E = \frac{1}{2}m(v_x^2 + v_y^2 + v_z^2) + \frac{1}{2}k(x^2 + y^2 + z^2)$, giving $f = 6$ degrees of freedom. So, by the Equipartition Principle, $U = \frac{f}{2} N k_B T = 3 N k_B T$. From Quantum Mechanics (just take this one on faith), the allowed energies of an oscillator come in steps $E_n = n h f$, where $h$ is Planck's constant, $f$ is the oscillation frequency, and $n = 0, 1, 2, \ldots$; essentially, the units of energy that can be used are not continuous but quantized.

[Figure: Einstein's ideal solid]

Now, let's look at the multiplicity of the Einstein solid. How many atoms are in the solid? $N$. How many units of energy does the solid have? $n$. How many ways can the energy be arranged? $\Omega(N, n)$! The multiplicity tells us how many different ways the energy can be shared among the different particles. However, calculating the multiplicity of this system is more involved than for 2-state systems: instead of having only 2 different states (heads and tails), each particle can now hold any number of energy units.
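The count has a closed form that is derived below; as a preview, here is a minimal Python helper (my own sketch), which we can check against the example table that follows:

```python
from math import comb

def omega(N, n):
    """Ways to distribute n identical energy units among N atoms
    ("stars and bars" counting): (n + N - 1)! / (n! (N - 1)!)."""
    return comb(n + N - 1, n)

# Check against the N = 3 example below: expects [1, 3, 6, 10]
print([omega(3, n) for n in range(4)])
```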

Example


Let's look at a system of $N = 3$ particles holding up to $n = 3$ energy packets in total.

Example: N = 3

n | Microstates (atom 1, atom 2, atom 3) | Ω
0 | (0,0,0) | 1
1 | (1,0,0), (0,1,0), (0,0,1) | 3
2 | (2,0,0), (0,2,0), (0,0,2), (1,1,0), (1,0,1), (0,1,1) | 6
3 | (3,0,0), (0,3,0), (0,0,3), (2,1,0), (2,0,1), (1,2,0), (1,0,2), (0,2,1), (0,1,2), (1,1,1) | 10

As you can see, these calculations are much more involved and complex than the 2-state system of coin flips. There is, however, a closed-form expression for the multiplicity, analogous to the Binomial Theorem and following from the Multinomial Theorem: $\Omega(N, n) = \binom{n + N - 1}{n} = \frac{(n + N - 1)!}{n!\,(N - 1)!}$. In fact, this formula generalizes the binomial count to any number $n$ of energy units shared among the $N$ particles.
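A brute-force check of this formula, enumerating every way to split $n$ units among $N$ atoms (a sketch using itertools; the enumeration is exponential in $N$, so it only works for tiny systems):

```python
from itertools import product
from math import comb

def omega_brute(N, n):
    """Count microstates of an Einstein solid by direct enumeration."""
    return sum(1 for state in product(range(n + 1), repeat=N)
               if sum(state) == n)

for n in range(4):
    assert omega_brute(3, n) == comb(n + 3 - 1, n)
print("brute-force enumeration matches (n + N - 1 choose n) for N = 3")
```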

Example


Let's look at the case where we have 10 particles ($N = 10$) with 4 units of energy ($n = 4$). From the formula above, we can calculate the multiplicity: $\Omega = \binom{4 + 10 - 1}{4} = \binom{13}{4} = 715$.

How many ways can we give the first atom ($a_1$) 0 units of energy? How the rest of the energy is divided up doesn't matter, so we are left with $N = 9$ (9 atoms left over) and $n = 4$ (4 units of energy to split among themselves). So $\Omega(a_1 = 0) = \binom{4 + 9 - 1}{4} = \binom{12}{4} = 495$.

Now, let the first atom have 1 unit of energy. Then the other 9 atoms share the remaining 3 units: $\Omega(a_1 = 1) = \binom{3 + 9 - 1}{3} = \binom{11}{3} = 165$.

A similar argument shows that $\Omega(a_1 = 2) = \binom{10}{2} = 45$, $\Omega(a_1 = 3) = \binom{9}{1} = 9$, and $\Omega(a_1 = 4) = \binom{8}{0} = 1$. If we plot the multiplicity versus the number of energy packets held by the first atom, the graph falls off like the Boltzmann Distribution of the system. A Boltzmann Distribution says that when a system has energy, the energy gets passed around, and you are most likely to measure an atom with an energy of 0 in this particular system: because the energy is constantly exchanged, most atoms hold none of it at any given moment.

[Figure: Dispersion of energy]

Let's now find the probability of each state: $P(0) = \frac{495}{715} \approx 0.692$, $P(1) = \frac{165}{715} \approx 0.231$, $P(2) = \frac{45}{715} \approx 0.063$, $P(3) = \frac{9}{715} \approx 0.013$, and $P(4) = \frac{1}{715} \approx 0.001$.

As a sanity check, the sum of the probabilities should equal 1, and it does.

Let's calculate the average energy of each particle. You may think of this as a simple calculation: $\langle E \rangle = \frac{n}{N} = \frac{4}{10} = 0.4$ units. However, let's look at the average energy from a more statistical approach. If we take the probabilities calculated above, multiply each by the corresponding energy of the first atom, and add them all together: $\sum_k k \, P(k) = 0(0.692) + 1(0.231) + 2(0.063) + 3(0.013) + 4(0.001) = 0.4$! This gives us the same answer as the previous calculation.

Thus, the average energy may be 0.4 units, but the most probable energy is actually zero! This means that if you randomly choose a particle and measure its energy, it will most likely have an energy of zero. However, if you somehow add up all of the energies and divide by the total number of particles, you get an average energy of 0.4 units.
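All of the numbers in this example follow from the multiplicity formula; a short sketch reproducing them:

```python
from math import comb

N, n = 10, 4
total = comb(n + N - 1, n)                       # 715 microstates in all

# If atom 1 holds k units, the other N-1 atoms share the remaining n-k units.
omegas = [comb((n - k) + (N - 1) - 1, n - k) for k in range(n + 1)]
probs = [w / total for w in omegas]

print(omegas)                                    # [495, 165, 45, 9, 1]
print(f"sum of probabilities = {sum(probs)}")    # sanity check: 1.0
avg = sum(k * p for k, p in zip(range(n + 1), probs))
print(f"average energy of atom 1 = {avg} units") # 0.4, matching n/N
```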

A 2-Body System


Imagine having 2 solids (A and B) that are right next to each other and can freely exchange energy. Both A and B have the same number of particles: 10. A is hotter with 4 units of energy and B is cooler with only 2 units of energy.


Intuition tells us that A will cool and B will warm until each holds 3 units of energy. But why does that work? Well, entropy will show us the answer. The next table shows what the multiplicity looks like when A holds a given number $n_A$ of the 6 energy packets. To find the total multiplicity, we must multiply the multiplicities from A and B together.

n_A | Ω_A | Ω_B | Ω_A × Ω_B
0 | 1 | 5005 | 5005
1 | 10 | 2002 | 20020
2 | 55 | 715 | 39325
3 | 220 | 220 | 48400
4 | 715 | 55 | 39325
5 | 2002 | 10 | 20020
6 | 5005 | 1 | 5005

As you can see, the highest multiplicity is when the energy packets are evenly divided between A and B. Thus, we're most likely to see both groups sharing energy equally.
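Both two-body tables in this section can be generated the same way; here is a sketch that reproduces the table above (rerun the commented-out line for the unequal system discussed next):

```python
from math import comb

def omega(N, n):
    """Multiplicity of an Einstein solid: (n + N - 1) choose n."""
    return comb(n + N - 1, n)

def two_body_table(NA, NB, n_total):
    """Print multiplicities for every way to split n_total packets between A and B."""
    for nA in range(n_total + 1):
        wA, wB = omega(NA, nA), omega(NB, n_total - nA)
        print(f"nA={nA}: omega_A={wA:>6}, omega_B={wB:>6}, total={wA * wB}")

two_body_table(10, 10, 6)    # table above: peak at the even 3-3 split
# two_body_table(10, 6, 8)   # next example: peak at nA = 5, nB = 3
```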

Now, let's look at another 2-body system where the numbers of particles are not equal; this one has a less intuitive answer than the previous. Let A have 10 particles with 4 packets of energy and B have 6 particles with 4 packets, for a total of 8 packets.

n_A | Ω_A | Ω_B | Ω_A × Ω_B
0 | 1 | 1287 | 1287
1 | 10 | 792 | 7920
2 | 55 | 462 | 25410
3 | 220 | 252 | 55440
4 | 715 | 126 | 90090
5 | 2002 | 56 | 112112
6 | 5005 | 21 | 105105
7 | 11440 | 6 | 68640
8 | 24310 | 1 | 24310

So, the one we'll most likely observe is the state where A has 5 packets of energy and B has 3.

Energy, Entropy, and Temperature


Recall from the Equipartition Theorem that $U = \frac{f}{2} N k_B T$, and let $f = 6$ for the Einstein solid. So $T = \frac{U}{3 N k_B}$, where $k_B$ and the packet size $\epsilon$ are constant. Then, with $U = n \epsilon$, temperature is proportional to the energy per particle: $T \propto \frac{n}{N}$. In the previous example, $N_A = 10$ and $N_B = 6$. So initially, $T_A \propto \frac{4}{10} = 0.4$ and $T_B \propto \frac{4}{6} \approx 0.67$: B starts out hotter.

And the final conditions are $T_A \propto \frac{5}{10} = 0.5$ and $T_B \propto \frac{3}{6} = 0.5$.

In the final conditions, the temperatures are equal! Thus, the system is at equilibrium not only when the entropy has maximized but also when the temperatures are equal. Now, knowing $\Omega$, let's determine $S = k_B \ln \Omega$. From this relationship:

n_A | S_A/k_B | S_B/k_B | S_total/k_B
0 | 0 | 7.16 | 7.16
1 | 2.30 | 6.67 | 8.97
2 | 4.01 | 6.14 | 10.15
3 | 5.39 | 5.53 | 10.92
4 | 6.57 | 4.84 | 11.41
5 | 7.60 | 4.03 | 11.63
6 | 8.52 | 3.04 | 11.56
7 | 9.34 | 1.79 | 11.14
8 | 10.10 | 0 | 10.10

Now, let's plot these entropy values versus the number of energy packets in A, $n_A$.

[Figure: $S/k_B$ versus $n_A$ for solid A, solid B, and their total]

The slope of the entropy of A and the slope of the entropy of B are equal and opposite exactly where the total entropy is at its maximum. Or written more succinctly, $\frac{dS_A}{dn_A} = -\frac{dS_B}{dn_A}$. Since energy leaving A enters B ($dU_B = -dU_A$), the total change in entropy when a little energy moves is $dS_{total} = \frac{dS_A}{dU_A} dU_A + \frac{dS_B}{dU_B} dU_B$, and setting this to zero implies that $\frac{dS_A}{dU_A} = \frac{dS_B}{dU_B}$ at equilibrium.
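A sketch that reproduces the entropy table and compares the discrete slopes of $S_A$ and $S_B$, which become equal and opposite where the total entropy peaks (at $n_A = 5$):

```python
from math import comb, log

def omega(N, n):
    return comb(n + N - 1, n)

NA, NB, n_total = 10, 6, 8
SA = [log(omega(NA, n)) for n in range(n_total + 1)]             # S_A / k_B
SB = [log(omega(NB, n_total - n)) for n in range(n_total + 1)]   # S_B / k_B

for n in range(n_total + 1):
    print(f"nA={n}: S_A/k={SA[n]:5.2f}, S_B/k={SB[n]:5.2f}, "
          f"total={SA[n] + SB[n]:5.2f}")

# Discrete slopes with respect to n_A; equilibrium is where moving one unit
# of energy raises S_A by about as much as it lowers S_B.
for n in range(n_total):
    print(f"nA={n}->{n + 1}: dS_A/k={SA[n + 1] - SA[n]:+.2f}, "
          f"dS_B/k={SB[n + 1] - SB[n]:+.2f}")
```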

Is it true that $T = \frac{dS}{dU}$? No. If you do some dimensional analysis, you find that $\frac{dS}{dU}$ has units of $\mathrm{K^{-1}}$ (joules per kelvin divided by joules), so actually $\frac{1}{T} = \frac{dS}{dU}$, or more precisely $\frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_{N,V}$ when $N$ and $V$ are held constant (i.e., no work is done on the system).

Next Page: Stirling's Approximation | Previous Page: Charles' Law
Home: Biological Physics