
The second law of thermodynamics (and a little statistics!)
10/3/10

The First Law of Thermodynamics tells us energy is conserved but does not provide any restrictions about how energy may be converted from one type to another. The Second Law of Thermodynamics restricts the kinds of energy conversions that are possible in a cyclic process. For example, it says we cannot simply convert heat into work; some other change will have to occur at the same time.

There are several different ways of stating the second law. Although it might not be obvious at first glance, each different statement of the second law can be used to prove the others.

1. Version one of the second law: Entropy remains constant or increases in a closed system.

2. Version two of the second law: In a closed system heat flows from hot to cold; it takes input energy to make it flow the other way.

3. Version three of the second law: The Carnot cycle is the most efficient cycle possible for a reversible heat engine operating in a cyclic process:

η = 1 − T_C/T_H (Carnot cycle) (1)

A thermodynamically closed system is one where energy neither leaves nor is added to the system. A cyclic process is one where energy may flow into or out of the system but the system returns to the exact same state (pressure, volume, energy, temperature, etc.) after each cycle. Each of the statements of the second law and the connections between them will be examined below.

1. Entropy remains constant or increases in an isolated system

Suppose we have four coins and want to know how many different results we could get from tossing them. We assume the coins are fair, meaning there is a 50% chance each will come up as either heads or tails. We also assume we can track which coin is which at all times or, in thermodynamic terms, that they are distinguishable. All 16 of the possible outcomes are shown in Table 1, where H indicates heads and T indicates tails.

Coin  Toss: 1  2  3  4  5  6  7  8  9  10 11 12 13 14 15 16
  1         H  H  H  H  T  H  H  H  T  T  T  H  T  T  T  T
  2         H  H  H  T  H  H  T  T  H  H  T  T  H  T  T  T
  3         H  H  T  H  H  T  H  T  T  H  H  T  T  H  T  T
  4         H  T  H  H  H  T  T  H  H  T  H  T  T  T  H  T

(Tosses 2-5 show three heads, tosses 6-11 two heads, and tosses 12-15 one head.)
Table 1. All possible outcomes of tossing four coins.

From the results we can see that there is only one way to have all heads but six ways to have two heads and two tails. Our conclusion is that there is a six out of 16 chance of getting a result of two heads and two tails but only a one out of 16 chance of getting all heads.
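This counting is easy to check by brute force. The short Python fragment below is a minimal sketch (not part of the original text) that enumerates all 2^4 = 16 outcomes for four distinguishable fair coins and tallies how many give each number of heads:

```python
from itertools import product
from collections import Counter

# Enumerate all 2^4 = 16 outcomes for four distinguishable fair coins.
outcomes = list(product("HT", repeat=4))

# Tally the outcomes by their number of heads.
counts = Counter(outcome.count("H") for outcome in outcomes)

for heads in sorted(counts, reverse=True):
    print(heads, "heads:", counts[heads], "of", len(outcomes), "outcomes")
# Prints 1, 4, 6, 4, 1 outcomes for 4, 3, 2, 1, 0 heads respectively.
```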

What does coin tossing have to do with thermodynamics and the second law? Suppose we have four molecules which can move randomly between two regions of a container. Let's call the left half of the container T and the right half H. Using the same reasoning we can see that it will be six times more likely to find two molecules in the H side and two in the T side than it is to find all four in the H side. In other words there is a tendency for molecules to spread out in roughly equal numbers between the two halves of the container simply because there are more ways for that to happen.

The number of different ways the same event can happen is called the number of microstates or the multiplicity, Ω, which has no units. So for the case of four coins, Ω = 1 for all heads (or all tails); there is only one way for this to occur, so only one microstate is available. For two heads and two tails we have Ω = 6 because there are six distinct ways for this to occur, so the multiplicity is six. We are more likely to see two heads and two tails in a random coin toss because there are more microstates available (the multiplicity is higher) for this to happen.

Entropy is defined to be

S = k_B ln Ω (2)

where k_B is Boltzmann's constant (k_B = 1.38×10^−23 J/K), Ω is the number of available microstates (the multiplicity) and ln is the natural logarithm. Notice the units for entropy will be joules per kelvin. The entropy for having half heads (S_half heads = k_B ln 6 = 2.47×10^−23 J/K) is larger than the entropy of having all heads (S_all heads = k_B ln 1 = 0 J/K). So from an entropy standpoint, final states with higher entropy (two molecules in one half of the container and two in the other) are more likely. From this we see that the first statement of the second law, that entropy remains constant or increases in a closed system, is simply the most likely outcome from a probabilistic standpoint*. High entropy states are more probable because there are more possible ways for those states to occur. If we continue to toss the coins we will see the highest entropy state more often.
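As a quick numerical check (a sketch of my own, using only the definitions above), Equation (2) gives the entropies just quoted directly from the multiplicities:

```python
import math

K_B = 1.38e-23  # Boltzmann's constant in J/K, as quoted in the text

def entropy(multiplicity):
    # Equation (2): S = k_B ln(multiplicity)
    return K_B * math.log(multiplicity)

print(entropy(6))  # half heads: ~2.47e-23 J/K
print(entropy(1))  # all heads: 0 J/K
```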

* Fluctuations from the highest entropy state will obviously occur but, as shown in the following, deviations from this maximum entropy state become smaller as the number of atoms increases. The topic of fluctuations is a discussion beyond the scope of this brief presentation.


But how much more likely is a higher entropy state? This turns out to be very dependent on the number of objects (atoms or coins). For larger numbers of coins (or atoms) it is more convenient to use equations from probability. The number of possible outcomes for N coins when there are two states (heads and tails) is given by 2^N, and the most probable number of heads for a given throw is N_H = N/2 (the most likely outcome is half heads and half tails). Probability theory also tells us that the number of ways of arranging N molecules (or coins) between two possible states (H and T) with N_H molecules in state H is given by the binomial distribution,

Ω_NH^N = N!/(N_H!(N − N_H)!) (3)

where x! = x(x − 1)(x − 2)...1 is called the factorial (so for example 4! = 4(3)(2)(1) = 24). For the case of four atoms divided between the two states with two on one side we have Ω_2^4 = 6, a result we got above by simple counting. We also get Ω_0^4 = 1 for no molecules in the H state (here we have used the fact that 0! = 1). For larger numbers of molecules counting by hand becomes difficult if not impossible, but Equation (3) allows us to calculate the multiplicity†. Suppose we wish to look at multiplicities for 40 molecules. We have Ω_0^40 = 1, or one way to put them all in one state, but now we have Ω_20^40 = 1.38×10^11 different ways to split the molecules so that half are on one side. The total number of possible outcomes is 2^40 = 1.10×10^12, so there is a one in 1.10×10^12 chance of finding all 40 molecules in one state but a 1.38×10^11/1.10×10^12 = 0.125 or 12.5% chance of finding the molecules equally split between the two states. There is only a 1/1.10×10^12 = 9.09×10^−13 or 9.09×10^−11 % chance of finding them all on one side. From this we see that some things are very, very, very, very unlikely to happen while other events are common.
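The same numbers follow directly from Equation (3); the fragment below (an illustrative sketch using Python's exact-integer math.comb) reproduces the N = 40 figures:

```python
from math import comb

N = 40
total = 2 ** N            # 1.10e12 equally likely outcomes
half = comb(N, N // 2)    # multiplicity of a 20/20 split: 1.38e11

print(1 / total)          # chance of all 40 on one side: ~9.09e-13
print(half / total)       # chance of an even split: ~0.125
```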

Figure (1) shows the distribution of multiplicities for two possible outcomes (H or T) for two cases. On the left is the case of four coins and on the right is the case for 40 coins. Notice that the heights of the two graphs are not equivalent; as the number of coins increases, the distribution of multiplicities gets very narrow compared to its height. This is another way of saying that the likelihood of not finding the coins more or less equally distributed between the two states becomes very small for large numbers of coins or molecules.

†Most calculators and computers are limited to calculating factorials of numbers less than 500 because the values become very large. For higher numbers of entities the Stirling approximation and other mathematical tricks must be used to evaluate Equation (3) (see [1]).
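For very large numbers one works with ln N! rather than N! itself. As a hedged illustration of the Stirling approximation mentioned in the footnote (in its simplest form, ln N! ≈ N ln N − N, my statement of it, not spelled out in the text), the sketch below compares it with the exact value obtained from math.lgamma:

```python
import math

def stirling_ln_factorial(n):
    # Simplest form of Stirling's approximation: ln(n!) ~ n ln(n) - n
    return n * math.log(n) - n

for n in (10, 100, 10_000):
    exact = math.lgamma(n + 1)  # lgamma(n + 1) equals ln(n!) exactly
    print(n, exact, stirling_ln_factorial(n))  # relative error shrinks with n
```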


Figure 1. For a small number of objects (four) the distribution of multiplicities is broad (left) but for a larger number of objects (40) the distribution is very narrow (right) relative to the height of the curve.

In probability theory a measure of the spread of a particular data set is given by the standard deviation, σ. The width of the curves shown in Figure (1) at half the maximum value is approximately 2.35σ (the reader is referred to texts on probability theory for more detail [2]). For a large sample of independent, unbiased measurements of an outcome (a normal distribution), approximately 68% of the values should fall within ±σ of the average (or mean) value. Around 95% of the values will be within ±2σ of the mean and 99.7% of the values will be within ±3σ of the mean.

For the binomial distribution (Equation (3)), σ = √(Np(1 − p)) where p is the probability of the event. In the case of coin tosses p = 0.5 or 50%, assuming the coins are equally likely to come up heads as tails. Notice that for N = 4 we have σ = 1 and the height of the plot is Ω_N/2^N = 6, as shown on the left of Figure (1). For N = 40 we have σ = √10 ≈ 3.16 and the height of the plot is Ω_N/2^N = 1.38×10^11, as shown in the right graph of Figure 1. Clearly the distribution curve becomes very narrow relative to its height for large numbers of objects, and it becomes extremely unlikely that the objects are not more or less distributed equally between the two states.

Given the large numbers of molecules involved, anytime we work with realistic problems in thermodynamics we can expect the system to quickly move towards a state of highest entropy with very little likelihood of returning to a lower entropy state. We often work with a mole of a substance, which contains 6.02×10^23 entities (atoms or molecules) and is defined to be the number of atoms in 12 g of carbon-12. The first law of thermodynamics, conservation of energy, says there is no reason why all of the perfume molecules escaping from a bottle can't collide in such a way that they exactly reverse their paths at some point and migrate back into the bottle. But the second law says that this is so improbable that it will never occur (or possibly occur once in a time larger than the age of the universe).

[Figure 1: two panels plotting multiplicity versus number of tails, "Distribution of Multiplicities (4 objects)" on the left and "Distribution of Multiplicities (40 objects)" on the right, each with the full width at half maximum marked as 2.35σ.]


2. Heat flows from hot to cold in isolated systems

Let's look at a different example that involves heat flow between two objects rather than the number of molecules in two halves of a container [1]. Here the meaning of a closed system of two objects is that energy may be exchanged between them but not with the outside world. We will assume that the exchange between the two objects occurs slowly, much slower than the energy exchanges internal to the two objects. For this example there will be more than two states to choose from, so the binomial distribution (Equation (3)) does not apply, but similar laws of probability hold. Again, as in the previous case, we assume that all microstates are equally probable and equally accessible, a property sometimes referred to as the fundamental assumption of statistical mechanics.

For real solids the energies available to the atoms are quantized; in other words only certain vibrational energies are allowed. To make things simple, suppose we have a solid with identical quantized energy levels so you can add (or subtract) exactly one unit of energy, or two or three units, to an atom but never, say, 1.5 units of energy (this simplified model is called an Einstein solid). Let's start with a solid which has only three atoms (N = 3) and count the number of microstates, Ω, available for a given quantity of added energy to this solid. If no energy is added there is only one available microstate; all three atoms have zero energy and the multiplicity is one. For one unit of added energy the multiplicity is three; we can give the one unit to one atom and there are three possible atoms which might have it. For two units of energy added there are six microstates. Table 2 below summarizes these possibilities.

Energy and Multiplicity      Atom 1   Atom 2   Atom 3
No energy, Ω = 1                0        0        0
One energy unit, Ω = 3          1        0        0
                                0        1        0
                                0        0        1
Two energy units, Ω = 6         2        0        0
                                0        2        0
                                0        0        2
                                1        1        0
                                1        0        1
                                0        1        1
Table 2. Multiplicity chart for distributing energy units among three atoms in an Einstein solid.

The general formula for finding the multiplicity for a given number of oscillators, N, with a given number of energy units, q, is given by probability theory and is

Ω(N, q) = (q + N − 1)!/(q!(N − 1)!). (4)


From this formula we see that Ω(3,0) = 1, Ω(3,1) = 3, Ω(3,2) = 6, as in the table above, and Ω(3,3) = 10, etc.
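Equation (4) is simple to code; the sketch below (the helper name einstein_multiplicity is my own) reproduces the values just listed:

```python
from math import factorial

def einstein_multiplicity(N, q):
    # Equation (4): ways to share q energy units among N oscillators
    return factorial(q + N - 1) // (factorial(q) * factorial(N - 1))

print([einstein_multiplicity(3, q) for q in range(4)])  # [1, 3, 6, 10]
```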

Now let’s look at what happens when we have two Einstein solids which have different temperatures [1]. Let’s call the solids A and B and suppose there are NA atoms in solid A with qA units of energy and NB atoms in solid B with qB units of energy. As a simple example let’s suppose each solid has only three atoms or NA = NB = 3 and there are six energy units to be divided between them: qA + qB = 6. Initially the solids will be isolated; for now we are just considering the possible ways to split the energy. If A gets no energy, B has all six units; if A gets one energy unit, B gets five, and so forth. Using formula (4) above for multiplicity we can fill in Table (3) for the microstates of each solid. For example in the second row we see solid A has no energy and only one microstate but solid B has all six energy units and 28 different ways (microstates) to spread those six units among the three atoms in solid B.

Energy units q_A   Ω_A   Energy units q_B   Ω_B   Ω_total = Ω_A × Ω_B
      0              1          6            28          28
      1              3          5            21          63
      2              6          4            15          90
      3             10          3            10         100
      4             15          2             6          90
      5             21          1             3          63
      6             28          0             1          28
Table 3. Multiplicity of two Einstein solids sharing six units of energy.

Now put the two solids in thermal contact with each other and assume the energy will move around between them, with equal probability for each microstate. For the seven cases in the table above we see that the combined numbers of microstates (Ω_total = Ω_A × Ω_B) for the two solids in contact are 1(28) = 28, 3(21) = 63, 6(15) = 90, 10(10) = 100, 15(6) = 90, 21(3) = 63 and 28(1) = 28, respectively. The multiplicity for having half the energy in each solid is much larger (Ω_total = 100) than having all the energy in one solid (Ω_total = 28). In other words, assuming the energy can move between the two solids, it is a lot more likely that energy will tend to equalize between them because there are more microstates available for this outcome. Since entropy is related to the number of microstates (Equation (2)) we may also say that the entropy is higher if the energy is distributed equally between the two solids.

For our Einstein solid the only available energy states are vibrational, so two samples with the same average energy per molecule will also have the same average kinetic energy per molecule. From (3/2)k_B T = (1/2)m v̄² (derived below) we know that two objects having molecules with the same average kinetic energy also have the same temperature. So equalization in energy also means equalization in temperature (at least for Einstein solids‡). From the above discussion we conclude that it is more probable for energy to move from hot objects to cold objects as compared to other redistributions of energy. We have thus shown that the first two statements of the second law given above are equivalent; heat flow from a hot object to a cool object increases the overall entropy of the combined system. This is also the most probable outcome, other things being equal. Although we have used a specific example (two Einstein solids in thermal contact), it is true for any two objects (solids, liquids or gases) that can exchange thermal energy with each other but are isolated from their surroundings.

The above concepts can be extended to larger numbers of atoms in each solid and to the case where the number of atoms is different in each solid. When we do this we see something similar to the above example of coin tosses for very large numbers of coins. Equal division of energy between the two solids becomes extremely likely compared to the probability of all, or even most, of the energy being in only one solid. For example, for two solids with 300 atoms each and 100 units of energy split between them, the calculation above gives the number of microstates with all the energy in one solid as 1.7×10^96, a large number, but the number of microstates for an equal distribution of energy is 1.3×10^122, which is about 10^26 times larger§. For practical applications the mechanisms of heat flow (conduction, convection, radiation) tell us under what circumstances and how quickly this redistribution of energy will occur, but the laws of probability for large numbers of objects tell us it is extremely likely for the system to move towards equally distributed energy, compared to some other configuration of energy sharing.
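Python's arbitrary-precision integers can evaluate these enormous multiplicities exactly; the sketch below (reusing the einstein_multiplicity helper defined in the earlier fragment) compares the two distributions for 300-atom solids sharing 100 energy units:

```python
import math
from math import factorial

def einstein_multiplicity(N, q):
    # Equation (4), evaluated with exact integer arithmetic
    return factorial(q + N - 1) // (factorial(q) * factorial(N - 1))

# Two solids of 300 atoms sharing 100 units: all in one solid vs. a 50/50 split.
one_sided = einstein_multiplicity(300, 100) * einstein_multiplicity(300, 0)
even_split = einstein_multiplicity(300, 50) ** 2

print(math.log(one_sided))        # ~221.6, as in the footnote
print(math.log(even_split))       # ~281
print(even_split // one_sided)    # roughly 10^26
```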

Although there are examples where an association between entropy and disorder does not hold, often this connection is a useful way to think about entropy. In our example of molecules located in two halves of a container, if all of the molecules are in one half, we have more information about where they are than if they are spread equally between the two halves. They are less disordered if they have a specific arrangement (all on one side) than if half are on each side. So in this case the second law says closed systems tend towards an increase in disorder. In contrast, systems that are not isolated, such as growing organisms, can increase in order because they use energy (they are not closed systems). There is an overall increase in entropy of the organism plus environment; the organism by itself lowers its entropy as it grows at the expense of the environment. If there is no energy flow into the system the organism will die and decay into disorder.

‡ For real solids there are other available energy states (rotational, bending, etc.) and these introduce additional multiplicities. In the end the same result holds; equalization in energy between microstates of the two solids (the most probable outcome) is equivalent to equalization of temperature. This result is known as the equipartition theorem [1].

§ It should be clear now why it is useful to use the natural logarithm in the definition of entropy. For the first case (all energy in one solid) we have ln(1.7×10^96) = 221.6 and for evenly split energy ln(1.3×10^122) = 281.2. For multiplicities involving moles of atoms the entropy will be a manageable number after taking natural logarithms and multiplying by Boltzmann's constant.


3. Heat engine efficiency is limited by the Carnot efficiency

A heat engine is a cyclic device that converts heat into mechanical work. Gasoline and diesel engines are important examples. Typical car engines vary between 10% and 35% efficiency, depending on their size and how they are driven. This means at best, 65 cents of every dollar spent on gasoline for your car is wasted! Why can’t we make a more efficient gasoline engine? The short answer is we can make engines more efficient than they are currently but the second law limits the maximum efficiency of all heat engines. For cars and other real applications it is also possible to capture lost energy to make a combination of processes more efficient than each individual stage of the combination.

First let's look at a schematic diagram for an ideal heat engine, in Figure (2) below. Energy, Q_H, leaves the hot reservoir, does work, W (the useful work output), and waste energy in the form of heat, Q_C, is expelled to the cold reservoir. For a gasoline engine the hot reservoir is the exploding gasoline that pushes a piston connected to the drive shaft. The work done is the mechanical rotation of the drive shaft. The cold reservoir is the atmosphere, with the radiator and muffler as intermediaries. Because the piston returns to its original location the process is cyclic and the change in internal energy, ΔU, is zero for the cycle. Thus for a heat engine, conservation of energy (the first law of thermodynamics) gives Q_H = Q_C + W_cycle.

Figure 2. Schematic operation of an ideal heat engine.

Recall that efficiency is defined to be η = (energy benefit)/(energy cost), which becomes η = W_cycle/Q_H for our heat engine. Using Q_H = Q_C + W_cycle from the first law, efficiency can be written as


η = W_cycle/Q_H = (Q_H − Q_C)/Q_H = 1 − Q_C/Q_H. (5)

Notice from this equation that efficiency can never be 100% unless either the hot reservoir expels infinite heat or the cold reservoir takes in no heat. We have accounted for all the energy in this perfect engine; no energy is lost to friction in this hypothetical case. An infinite heat transfer from the hot reservoir is impossible, but why couldn't the heat expelled to the cold reservoir, Q_C, be zero so that all the input energy is converted to useful work and efficiency is 100%? As we show below, this is exactly what the second law says cannot happen, because a change in entropy in the wrong direction would be required and, as we have seen, decreases in entropy for isolated systems of large numbers of molecules are, for all practical purposes, impossible from a probabilistic standpoint. Intuitively we should suspect this already: heat energy has no reason to flow out of the hot reservoir unless it is moving to a cooler reservoir.

To apply the second law to our heat engine, first notice in the example of two Einstein solids at different temperatures in the previous section that energy will flow from the hot to the cold solid until the entropy is at a maximum. The solids may then exchange equal amounts of energy but the net energy flow has ceased and total entropy is constant**. Fluctuations of entropy as the result of fluctuations in either heat flow, Q, or internal energy, U, average out to zero. In other words, the change in entropy per change in heat flow or change in internal energy equals zero. So for solid A at equilibrium we may write

∂S_total/∂Q_A = 0 = ∂S_total/∂U_A

where U_A is the internal energy and S is entropy. Here we have used partial derivatives to indicate that we are only interested in how entropy changes with heat flow or internal energy (we aren't going to change the volume or pressure or other conditions; the system is isolated). The total entropy is the sum of the entropies of the two solids, so fluctuations of the total are also zero and

∂S_A/∂U_A + ∂S_B/∂U_B = 0 at equilibrium.

We have assumed the system of two solids is isolated, so the energy lost by solid A was gained by solid B and we have dU_A = −dU_B, which allows us to write

∂S_A/∂U_A = ∂S_B/∂U_B at equilibrium.

We also know that when two solids are in thermal equilibrium the temperatures are the same: T_A = T_B. This in fact is the definition of temperature and allows us to formulate a connection between the change in entropy with respect to internal energy and temperature (in kelvin):

T_A = (∂S_A/∂U_A)^−1.

The inverse of the entropy change is chosen so that increasing entropy results in lower temperatures if solid A loses heat to solid B.

** This is sometimes known as the principle of detailed balance. Basically it says that small exchanges of identical molecules and/or energy cannot be detected. While it cannot be proved in a strict sense, it seems reasonable for large numbers of molecules.


This choice also makes the units for temperature correct. This is, in fact, a more fundamental definition of temperature than the one relating temperature to average kinetic energy; temperature is defined to be the inverse of the change in entropy as a result of heat flow. The proper definition of temperature therefore is

T = (∂U/∂S)_N,V (6)

where the N and V indicate that the number of particles and the volume of the system should be held constant (the system is closed).

One very important thing to notice from the above discussion is that any time there is an energy flow, Q, there is also a change in entropy (assuming the number of particles and the volume stay constant). The first law of thermodynamics says that the change in internal energy equals the heat flow plus the work done: ΔU = Q + W. To see how much entropy change there is for a given heat flow when no work is done, we can use the first law and rearrange Equation (6) as

dS = dU/T = Q/T. (7)

Now we are in a position to show why the heat flow to the cold reservoir for our heat engine cannot be zero, preventing us (or anyone else!) from making a heat engine with 100% efficiency. When heat flows out of the hot reservoir, the entropy of the hot reservoir decreases by dS_H = Q_H/T_H. There is no entropy change of the engine since it returns to exactly the same state from which it started; the process is cyclic. So the second law requires that the entropy of the cold reservoir, dS_C = Q_C/T_C, has to increase by at least the same amount as the hot reservoir decreased. In other words the heat flow Q_C cannot be zero, in order to maintain constant (or increasing) entropy for the closed system. The second law requires the total entropy change to be equal to or greater than zero: dS_C − dS_H ≥ 0. The best possible case would be an equal exchange of entropy from hot to cold reservoir, which leads to the maximum efficiency possible.
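To make the bookkeeping concrete, here is a small sketch (with illustrative numbers of my own choosing, not from the text) applying Equation (7) to a chunk of heat leaving a hot reservoir and entering a cold one:

```python
# Illustrative numbers, not from the text.
T_HOT, T_COLD = 600.0, 300.0   # reservoir temperatures, K
Q = 1000.0                     # heat moved from hot to cold, J

dS_hot = -Q / T_HOT            # Equation (7): hot reservoir loses entropy
dS_cold = Q / T_COLD           # cold reservoir gains more than was lost
print(dS_hot + dS_cold)        # +1.67 J/K, positive as the second law requires
```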

Substituting the heat flow expressions (7) into the inequality dS_C − dS_H ≥ 0 gives

Q_C/T_C ≥ Q_H/T_H or Q_C/Q_H ≥ T_C/T_H.

The equal signs apply to cases where the process is gradual enough to be reversible; at any step of the way we could return to the previous step. For irreversible processes (for example if there is friction) the prior state cannot be returned to and the inequality applies. Using the previous definition of efficiency gives the efficiency of any heat engine undergoing a cyclic process as

η = 1 − Q_C/Q_H ≤ 1 − T_C/T_H (8)

where it is understood that the inequality applies to irreversible processes.

Notice that we arrived at these expressions without referring to any particular kind of heat engine; they are applicable to all heat engines. These expressions can never yield 100% efficiency unless we assume infinite temperatures or a reservoir at absolute zero. It is very important to notice that the second law expresses a fundamental limitation of nature that cannot be thwarted by technological innovation. Even if we eliminate all friction and all other energy losses, we cannot have any better efficiency than η_max = 1 − T_C/T_H for a heat engine. A physical process that has this efficiency is the Carnot cycle. Most real engines have less than the theoretical maximum efficiency because they are not Carnot cycles and they lose at least some energy to irreversible processes such as friction. The Carnot cycle represents the maximum theoretical efficiency of a perfect (no friction, etc.) heat engine.
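As a numerical illustration (the temperatures below are assumed, roughly representative of hot combustion gases and ambient air, not values from the text), the Carnot limit works out to:

```python
def carnot_efficiency(t_hot, t_cold):
    # Maximum possible efficiency for any heat engine (temperatures in kelvin)
    return 1.0 - t_cold / t_hot

# Assumed, roughly representative temperatures for a combustion engine.
print(carnot_efficiency(1000.0, 300.0))  # 0.7, i.e. 70% at absolute best
```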

One very important point: The second law applies to all processes where one type of energy is converted to another (e.g. mechanical to electrical, thermal to mechanical, electrical to radiant, etc.). HOWEVER: There are different limitations imposed on processes other than heat engines. So for example, if we apply the second law to electric motors we find the limitations are very slight; electric motors in theory can be close to 99% efficient and in fact some real electric motors are nearly this efficient. Fuel cells have theoretical efficiencies of 85% or more. This is because, although some heat transfer is usually involved, they are not heat engines and the second law applies in a different way. Calculating entropy losses for these processes requires finding changes in the Gibbs free energy of the system. It is beyond the scope of this outline to go into the details of this important point; however, much more can be found in the references below.

4. Three Different Statistics

As we have seen, the second law is fundamentally a statistical law. We made one fundamental assumption about the atoms involved in that discussion: although the particles (or energy units) were all the same, much like billiard balls with numbers painted on them, we could track where each one went. In other words we could distinguish the particles and energy units. This assumption leads to a type of statistics called Boltzmann statistics, which we investigate further in the next few paragraphs. Until the advent of quantum mechanics it was thought that atoms and energy obeyed Boltzmann statistics, and in fact in many cases Boltzmann statistics does give the correct answer. We now know, however, that in many important cases sub-atomic particles do not behave like classical billiard balls, which leads to a different statistical behavior for large numbers of particles.

The object of the following is just to indicate why there have to be different types of statistics. We will not try to explain all of the applications of these ideas.

In general we are interested in calculating the number of particles, n(E), which have a particular energy E. We will write that in the form n(E) = g(E)f(E). Here g(E) is called the density of states function and is the number of states available at energy E; this is also the statistical weight corresponding to energy E. The density of states may represent discrete (quantized) energy states or a continuous distribution of energies. The distribution function, f(E), is the average number of particles that have energy E, or the probability that an energy state is occupied. These two functions will be different for different kinds of particles, and for different circumstances, as shown below.

Distinguishable Particles

Let's start out assuming we can label individual atoms and keep track of where each goes. Suppose we have three units of energy which we want to distribute between five atoms of an Einstein solid. From Equation (4) we know that the total number of ways to distribute the energy is Ω(N, q) = Ω(5, 3) = 35, but we will look at the distribution a little differently now. Here we want to count the total number of microstates which have zero, one, two and three units of energy out of all the possible distributions. In this case we first find the number of ways of distributing all three energy units as a single block among the five atoms. We use Equation (3) and get

Ω_q^N = N!/(q!(N − q)!) = 5!/(1!(5 − 1)!) = 5

since we only have one block of energy to distribute among five atoms. The number of atoms with zero energy is four in each case and there are five different ways to have that result. So there are 5(4) = 20 possible distinct microstates with zero energy and five with all three energy units.

Next, if we break the energy into two blocks (one block with two energy units and one with one energy unit) we have Ω_2^5 = 5!/(2!(5 − 2)!) = 10 ways to distribute either block. But we don't know which two units are in the block of two, so we should include Ω_3^5 = 5!/(3!(5 − 3)!) = 10 more ways to distribute the energy, for a total of 20 ways to distribute the three units of energy in two portions (if you doubt this, as any good science student should, get a piece of paper, make five columns and start distributing the three units of energy to see if you get the same results). This also means there are 20 different ways for the other three atoms to have zero energy, or 3(20) = 60 distinct states with zero energy.


Finally we can break the energy into three individual portions. There are Ω_3^5 = 5!/(3!(5 − 3)!) = 10 different ways to have one energy unit in each of three atoms. This means there are also 2(10) = 20 microstates with zero energy for this case. Table 4 summarizes these results and shows the proportion of microstates out of the total possible combinations (35 as mentioned above).

Energy   Microstates with this energy     Proportion (out of 35)
  0      4(5) + 3(20) + 2(10) = 100       100/35 = 2.86
  1      1(20) + 3(10) = 50                50/35 = 1.43
  2      1(20) = 20                        20/35 = 0.57
  3      1(5) = 5                           5/35 = 0.14
Table 4. Proportion of available states for a given distribution of energy.
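Table 4 can be checked by enumerating every microstate directly. The sketch below (not in the original; it brute-forces all Ω(5,3) = 35 states) counts the average number of atoms holding each energy:

```python
from itertools import product

N_ATOMS, Q = 5, 3

# A microstate assigns a non-negative energy to each atom, totalling Q units.
states = [s for s in product(range(Q + 1), repeat=N_ATOMS) if sum(s) == Q]
print(len(states))  # 35, in agreement with Equation (4)

for energy in range(Q + 1):
    instances = sum(state.count(energy) for state in states)
    print(energy, instances, round(instances / len(states), 2))  # Table 4's columns
```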

A plot of the proportion of microstates with each possible energy is shown in Figure (4). A more careful (mathematical) analysis shows this curve to be an exponential given by

f_MB = 1/(A e^(E/k_B T))

where k_B is Boltzmann's constant, T is the temperature and E is the energy. Here A is a normalization constant that depends on the number of particles and can be determined by requiring that N = ∫₀^∞ n(E) dE. This distribution, f_MB, the Maxwell-Boltzmann distribution, can be used for classical particles, for example atoms of an ideal gas.

Figure 4. A plot of the data in Table 4 and the Boltzmann distribution, fMB .

Similar information is presented a different way in Table (5) for the case of four particles with two units of energy. Here we can see that there are four ways to distribute two units of energy to one atom with the remaining atoms having no energy. There are six different ways to split the energy equally between two atoms. As we saw above, this way of counting results in the Maxwell-Boltzmann distribution.

Energy   Distinguishable particles                       Number of microstates
  2      a b c d                                           4
  1      ab ac ad bc bd cd                                12
  0      bcd acd abd abc and cd bd bc ad ac ab            24
Table 5. Distribution of two energy units between four distinguishable particles: Maxwell-Boltzmann statistics. There are a total of 40 microstates.

To give you the flavor of how the density of states is calculated, imagine an ideal gas molecule with momentum p = √(2mE) = √(p_x² + p_y² + p_z²). We can imagine p to be a vector in momentum space with coordinates p_x, p_y and p_z. The number of available momentum states will be proportional to the volume g(p)dp = 4πp²dp of a thin shell of thickness dp in momentum space. (This is like adding up the charge of a thin shell or the moment of inertia of a thin shell; we are interested in how much of something is found in a given region, in this case how many momentum states are available.) We can convert the density of states in momentum to the energy variable using p = √(2mE) and dp = m dE/√(2mE). So

g(E)dE = 4πp²dp = 4√2 π m^(3/2) √E dE.

We can now write n(E)dE = g(E)f_MB(E)dE for the number of particles having energy E. To find the normalization constant, A, we use N = ∫₀^∞ n(E)dE††. The final expression for the number of particles in a gas at temperature T with energy in a range dE around the energy E is given by

n(E)dE = g(E)f_MB(E)dE = (2πN/(πk_B T)^(3/2)) √E e^(−E/k_B T) dE (5)

where N is the number of atoms or molecules. Think of Equation (5) as giving the probability that a particle has energy in the region dE around E.

Equation (5) can be used to find several interesting quantities (try these as exercises for practice):

†† You will need the integral ∫₀^∞ √x e^(−ax) dx = (1/(2a))√(π/a), where a = 1/k_B T.


1) The average energy per molecule can be calculated: E = ∫₀^∞ E n(E) dE = (3/2)Nk_B T.‡‡ Recall that for an ideal gas the only energy the molecules can have is kinetic (E = N(1/2)m v̄²), so dividing by the total number of atoms we get the average kinetic energy per molecule as (3/2)k_B T = (1/2)m v̄² (should look familiar from first semester physics!). This is an example of the equipartition theorem of classical statistical mechanics; each degree of freedom gets (1/2)k_B T of energy. For an ideal gas there are three degrees of freedom because the molecule can move in three directions.

2) Changing variables to velocity (E = (1/2)mv² and dE = mv dv) gives the Maxwell-Boltzmann velocity distribution in your book:

n(v)dv = 4πN (m/(2πk_B T))^(3/2) v² e^(−mv²/2k_B T) dv.

3) Using (3/2)k_B T = (1/2)m v̄² we can get the root mean square speed (the square root of the average value of v²), v_rms = √(3k_B T/m), for ideal gas molecules at temperature T with mass m.

4) The average speed can be found from v̄ = (1/N)∫₀^∞ v n(v) dv = √(8k_B T/πm).

5) The peak speed is v_peak = √(2k_B T/m) and can be found by setting ∂n(v)/∂v = 0 and solving for v = v_peak.
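The three characteristic speeds in items 3-5 are easy to evaluate numerically. The sketch below uses nitrogen at room temperature (my choice of example gas and temperature, not from the text) and confirms the ordering v_peak < v̄ < v_rms:

```python
import math

K_B = 1.380649e-23   # Boltzmann's constant, J/K
M = 4.65e-26         # mass of an N2 molecule, kg (about 28 u); assumed example gas
T = 300.0            # assumed room temperature, K

v_peak = math.sqrt(2 * K_B * T / M)             # item 5
v_avg = math.sqrt(8 * K_B * T / (math.pi * M))  # item 4
v_rms = math.sqrt(3 * K_B * T / M)              # item 3

print(v_peak, v_avg, v_rms)  # ~422, ~476, ~517 m/s; note v_peak < v_avg < v_rms
```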

6) The specific heat of an ideal gas at constant volume should be C_V = ∂E/∂T|_V = (3/2)k_B N_A = (3/2)R, where R = 1.987 cal/(mol·K) is the ideal gas constant, N_A is Avogadro's number of atoms and E is the average energy (found above). This expression for specific heat works well for monatomic ideal gases, which have specific heats around 2.98 cal/(mol·K), i.e. about 1.50R.

7) For the case of more complicated molecules the equipartition theorem predicts more than three degrees of freedom. A diatomic molecule should have two additional rotational degrees of freedom (rotation around the axis joining the two atoms is too small to contribute much), so the specific heat could be C_V = (5/2)R. Some diatomic gases (CO, H2, HCl, N2, NO, O2) have this specific heat; some do not. Molecules can also vibrate, adding two more degrees of freedom, for a specific heat of C_V = (7/2)R, which is right for CO2 and N2O. But Cl2, H2S and SO2 do not fit these patterns.

‡‡ Use a table of integrals, ∫₀^∞ x^(3/2) e^(−ax) dx = (3/(4a²))√(π/a), and a change of variables.


8) For a solid the atoms can be considered to be harmonic oscillators with energy k_B T per degree of freedom. So we expect the specific heat of a solid with N atoms to be C_V = ∂E/∂T|_V = 3k_B N_A = 3R. This law (the Dulong-Petit law) is correct for many solids at room temperature but not for lighter elements (e.g. diamond). And specific heat is measured to be temperature dependent for solids at very low temperatures, not constant as the equipartition theorem predicts. This problem was a piece of the experimental data that eventually supported the idea of quantized energies and requires a different set of statistics.

Indistinguishable Particles

Particles don't have tags on them, and one important property atoms have is that they are in fact all identical. Real atoms are indistinguishable, which means that Boltzmann statistics isn't strictly correct in the real world. This is akin to the difference between having a drawer full of socks that are all the same color versus a drawer with many differently colored pairs. You only need to pick two socks to get a match if they are all the same, but statistically a match is unlikely if there are many different pairs and you only pull out two. In the first case the socks are indistinguishable and in the second they are distinguishable, so the statistics are different. It turns out that for many cases, particularly in the case of ideal gases, Boltzmann statistics gives an answer nearly the same as the correct answer. It is only when the quantum nature of atoms becomes important that the statistics for indistinguishable particles must be used.

Quantum particles can be divided into two types. Sub-atomic entities that have half-integer spin (electrons, protons, neutrons, some atoms) are called fermions and obey one type of statistics, while entities with integer spin (photons, some atoms) are called bosons and obey a different type of statistics. In the following, only a brief sketch of why the statistics are different for these two types of particles is given; you will have to consult your text for more details.

Bosons

Quantum particles are indistinguishable; we cannot tell them apart. In this case we see that there is only one way to put both units of energy in one atom with the other three having zero energy, because we cannot keep track of which atom has the energy. Likewise there is only one way to split the energy equally, again because we cannot know which atom is which.

Energy   Bosons     Number of microstates
  2      x           1
  1      xx          2
  0      xxx xx      5
Table 6. Distribution of two energy units between four indistinguishable bosons: Bose-Einstein statistics. There are a total of eight microstates. Compare to Table (5).


The statistics for the case of the indistinguishable atoms in Table 6 are called Bose-Einstein statistics. The distribution function for this type of statistics looks like

f_BE = 1/(B e^(E/k_B T) − 1)

where again k_B is Boltzmann's constant, T is the temperature and E is the energy. Here B is a different normalization constant.

The most frequent use of Bose-Einstein statistics is for photons, which travel at the speed of light, c, and have energy E = hf, where f is the frequency of the photon and h is Planck's constant. The energy density, u(f), of photons with frequency f in a cavity at temperature T emitting blackbody radiation is given by the Bose-Einstein distribution:

u(f)df = (8πhf³/c³) 1/(e^(hf/k_B T) − 1) df = hf g(f) f_BE(f) df (6)

where g(f) = 8πf²/c³ is the density of states for light waves in a cavity (the number of available frequencies for waves trapped in a cavity).

Try the following as exercises:

1) A plot of Equation (6) gives the blackbody curve found in discussions of heat transfer by radiation and verified experimentally. Using the Maxwell-Boltzmann distribution, f_MB, in this application gives the wrong shape for the blackbody curve. (The average energy per oscillator state of k_B T times the density of states for light gives the Rayleigh-Jeans curve, u(f)df = (8πf²k_B T/c³) df, sometimes known as the "ultraviolet catastrophe" because although it matches experiment for small frequencies it is very wrong for ultraviolet (higher) frequencies.)

2) Wien's law, λ_max T = 2.898×10^−3 m·K, gives the wavelength at which a blackbody (e.g. the sun, the earth's surface, any object above absolute zero) at temperature T emits most strongly. This law can be derived from the Bose-Einstein distribution by setting ∂u(λ)/∂λ = 0 and solving for λ (you have to change variables to wavelength using fλ = c).

3) The Stefan-Boltzmann law,

U = ∫₀^∞ u(f) df = (8π⁵k_B⁴/(15c³h³)) T⁴ = σT⁴,

gives the total energy, U, given off by a blackbody, where σ is Stefan's constant. This law can be derived by integrating Equation (6) from zero to infinity.
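The key step in that integration can be checked numerically. Substituting x = hf/(k_B T) turns the integral of Equation (6) into a pure number times T⁴, namely ∫₀^∞ x³/(eˣ − 1) dx = π⁴/15; the self-contained sketch below verifies that value:

```python
import math

# With x = hf/(k_B T), Equation (6) integrates to a pure number times T^4:
# the integral of x^3/(e^x - 1) from 0 to infinity, which equals pi^4/15.

def integrand(x):
    return x**3 / math.expm1(x)

a, b, n = 1e-8, 50.0, 100_000   # the integrand is negligible beyond x = 50
h = (b - a) / n
total = h * (0.5 * (integrand(a) + integrand(b))
             + sum(integrand(a + i * h) for i in range(1, n)))

print(total, math.pi**4 / 15)   # both ~6.4939
```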

4) If the object is not a perfect blackbody the Stefan-Boltzmann law becomes U = εσT⁴, where ε is the emissivity (a number from zero to one which measures how effectively the object radiates compared to a perfect blackbody). Balancing the energy given off by the sun (at a temperature of 5800 K) at a distance equal to the earth's orbit with the energy given off by the earth in the infrared gives the surface temperature of the earth. (Note: The answer is 33 C lower than we measure because of trapping of heat by the atmosphere.)

5) The specific heat of a solid insulator is predicted by the Maxwell-Boltzmann distribution to be C_V = 3R, as shown above. Conductors have free electrons that add another one and a half degrees of freedom, so we should have C_V = 4.5R for conductors. Both of these classical predictions are wrong. Einstein assumed the vibrations in a solid had quantized energies of E = hf and were indistinguishable (just like photons), which requires Bose-Einstein statistics. These quantized lattice vibrations can be thought of as particles and are given the name phonons. For N_A atoms the energy of these phonons would be given by

E = 3N_A hf f_BE = 3N_A hf/(e^(hf/k_B T) − 1).

This leads to a prediction of specific heat that is much closer to the measured specific heat of real solids.§§ We have

C_V = ∂E/∂T|_V = 3R (hf/k_B T)² e^(hf/k_B T)/(e^(hf/k_B T) − 1)²

which does have nearly the correct temperature dependence. At high temperature (hf << k_B T) this reduces to 3R, or k_B T per degree of freedom, which is what the equipartition theorem predicts***. The failure of Maxwell-Boltzmann statistics to correctly predict the specific heat of a non-ideal gas and of solids was in fact one of the problems that led to the discovery of quantum mechanics.
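The temperature dependence in exercise 5 is easy to tabulate. The sketch below (with an assumed Einstein temperature hf/k_B = 300 K, an illustrative value of my choosing) shows C_V climbing toward the Dulong-Petit value 3R at high temperature:

```python
import math

R = 8.314          # gas constant, J/(mol K)
THETA = 300.0      # assumed Einstein temperature hf/k_B, K (illustrative)

def einstein_cv(T):
    # Exercise 5: C_V = 3R x^2 e^x / (e^x - 1)^2 with x = hf/(k_B T)
    x = THETA / T
    return 3 * R * x**2 * math.exp(x) / math.expm1(x)**2

for T in (30.0, 100.0, 300.0, 1000.0):
    print(T, einstein_cv(T))   # climbs toward 3R = 24.9 J/(mol K) at high T
```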

Fermions

There is another possibility for indistinguishable particles. Suppose the particles are fermions, in other words we can put a maximum of two in the same state, one with spin up, the other with spin down. Table 7 shows the result.

Energy   Fermions   Number of microstates
  2                  0
  1      xx          2
  0      xx          2
Table 7. Distribution of two energy units between four indistinguishable fermions: Fermi-Dirac statistics. There are only four available microstates. Compare to Tables (6) and (5).

The statistics for the case of the indistinguishable atoms in Table 7 are called Fermi-Dirac statistics. The distribution function for this type of statistics looks like

f_FD = 1/(C e^(E/k_B T) + 1)

where again k_B is Boltzmann's constant, T is the temperature and E is the energy. Here C is yet another normalization constant. This distribution is often put in a slightly different form:

f_FD = 1/(e^((E − E_F)/k_B T) + 1)

where E_F is called the Fermi energy.

§§ As we will see below, including electrons requires using the Fermi-Dirac distribution, which changes the specific heat prediction by a little bit.

*** Use e^x ≈ 1 + x + … for x << 1.


At zero temperature all the electrons of a system would fill the lowest available energy states. Notice that for E < E_F we have f_FD = 1/(e^(−∞) + 1) = 1, and for E > E_F we have f_FD = 1/(e^(+∞) + 1) = 0. So the Fermi energy is the energy of the highest filled energy state at absolute zero temperature.

How can we find the density of states, g(E)? Suppose we are talking about a free electron in a 3D box. The number of states dN with energy in the range between E and E + dE is defined by dN = g(E)dE, where g(E) is what we want to find. Recall for a 3D box we have quantized energies

E_n1,n2,n3 = (ħ²π²/2mL²)(n1² + n2² + n3²) = (ħ²π²/2mL²)R².

Inverting this gives R = √(2mL²E/(ħ²π²)). The total number of waves possible in a sphere with radius R would be N_s = (4/3)πR³, because the values of n count the number of waves possible. To count energy states, however, we only want positive values of n. So really the volume we are interested in is 1/8 of the total volume (the octant with all positive values of n1, n2, n3), so the number of energy states available is N = (1/8)(4/3)πR³. But electrons have two spin states, so really it is twice this number, or N = (1/3)πR³. Using the expression for R we have N = (2mE)^(3/2)L³/(3ħ³π²). Taking a derivative with respect to E gives dN = ((2m)^(3/2)L³/(2ħ³π²))√E dE. So

g(E) = ((2m)^(3/2)L³/(2ħ³π²))√E.

This shows that the density of states for waves in a cavity (electrons or photons or other particles) is proportional to the volume of the cavity and to √E.

The Fermi-Dirac distribution is often used in solid-state physics. For example, the number of free electrons in a metal with energy E is given by the distribution

n(E)dE = (3N/(2E_F^(3/2))) √E/(e^((E − E_F)/k_B T) + 1) dE = g(E) f_FD dE (7)

where g(E) is the density of states found above (free electrons in a metal behave like electrons in a 3D box). For this case the Fermi energy (the highest filled electron energy level if the solid is at zero kelvin) is

E_F = (h²/2m)(3N/8πV)^(2/3)

where h is Planck's constant, m is the electron mass, N is the total number of electrons and V is the volume of the solid containing the electrons. Note that this will be a small number, on the order of a few eV. Electrons are fermions, so the number of microstates is very different than if they were bosons.
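Plugging numbers into the Fermi energy expression confirms the "few eV" claim. In the sketch below the electron density is the commonly quoted free-electron density of copper, used here only as an illustrative input:

```python
import math

H = 6.626e-34     # Planck's constant, J s
M_E = 9.109e-31   # electron mass, kg
EV = 1.602e-19    # joules per electron volt

# Commonly quoted free-electron density for copper; an illustrative input.
n_over_v = 8.5e28  # electrons per m^3, i.e. N/V

e_fermi = (H**2 / (2 * M_E)) * (3 * n_over_v / (8 * math.pi)) ** (2 / 3)
print(e_fermi / EV)  # ~7 eV, "a few eV" as claimed
```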

Here are some applications and other points:

1) The average energy of the electrons in a metal at T = 0 is found by E = ∫₀^EF E n(E) dE = (3/5)N E_F. So the average energy per electron is E/N = (3/5)E_F.


2) The Fermi energy tells us whether a solid is an insulator, conductor or semiconductor. The quantized energy levels of a large number of atoms merge when a solid is formed and become large energy bands holding many electrons. If a band is full, no electrons can change energy (i.e. move) so the solid is an insulator. If the highest filled energy level (the Fermi level) occurs in the middle of a band so that the band is not filled, electrons can move and the solid is a conductor. Semiconductors occur when the Fermi level falls in the gap between a full and an empty band and the band gap is small enough that electrons can jump the gap, given enough energy.

3) A way to think about the difference between bosons and fermions is that many bosons can be put into the same energy level because there are many available microstates. This is a statistical feature that makes lasers possible; a laser is a situation where there are many photons, all in exactly the same state. Fermions, on the other hand, do not like to be in the same state, statistically speaking. In effect the statistical unlikelihood of finding fermions in the same state acts as a kind of pressure preventing them from getting too close. The fact that, even at T = 0, the electron energy levels are filled up to E_F means there are electrons at nonzero energy. This makes white dwarf and neutron stars possible. The intense pressure at the center of a star causes fusion that releases energy that keeps the star from collapsing due to gravity. Depending on the size of the star, there may be enough pressure to form heavy elements, but at some point there is insufficient pressure for further fusion. At this point gravity wins and the star starts to collapse. One possible outcome is that the star collapses so rapidly that the outer layer "bounces" off the inner layer and the star explodes in a supernova. For smaller stars (m < 1.4 m_sun) the statistical repulsion between electrons may prevent collapse and the star becomes a white dwarf that gradually cools off.

4) For intermediate sized stars, 1.4 m_sun < m < 3 m_sun, there is enough gravitational attraction (more than 0.8 MeV per particle) to form neutrons out of protons and electrons, but the statistical pressure between neutrons (which are also fermions) prevents further collapse. This final state is a neutron star. If the star is massive enough (m > 3 m_sun), gravity overcomes the statistics and the star collapses into a black hole.

Graphs of the three distributions:

[Figure: three panels compare the Maxwell-Boltzmann (MB), Bose-Einstein (BE) and Fermi-Dirac (FD) distributions as a function of energy at high, medium and low temperature.]


References:
[1] D.V. Schroeder, Thermal Physics (Addison Wesley Longman, 2000).
[2] M. Glazer and J. Wark, Statistical Mechanics: A Survival Guide (Oxford University Press, 2001).
