CHAPTER 17

ENTROPY

For the present I will limit myself to quoting the following result: if we
imagine the same quantity, which in the case of a single body I have called its
entropy, formed in a consistent manner for the whole universe (taking into
account all the conditions), and if at the same time we use the other notion,
energy, with its simpler meaning, we can formulate the fundamental laws of the
universe corresponding to the laws of the mechanical theory of heat in the
following simple form:

1. The energy of the universe is constant.
2. The entropy of the universe tends to a maximum.

Rudolph Clausius, in Annalen der Physik, Vol. 125 (1865)

17.1 TOWARD AN UNDERSTANDING OF ENTROPY

In this chapter we turn our attention to the entropy principle, a concept which, like
Newton’s second law, is an organizing principle for understanding the world. The principle
is relatively simple to state, but understanding its meaning is more challenging.
Through theoretical studies of Carnot's work in 1865, the German physicist Rudolph
Clausius introduced a new physical quantity closely linked to energy. He called it entropy, a word which sounds like energy and comes from the Greek word for transformation.
The use of entropy provides a way to analyze the behavior of energy in transformation.

To obtain an intuitive feeling for the concept of entropy, let's start with a familiar
mechanical system we’ve used many times: Galileo’s experiment with a ball rolling down
and up two inclined planes. If friction is ignored, the ball rolls down one plane and back
up the other, conserving mechanical energy. But if friction is not ignored, as time goes
on the ball loses energy. This energy is transformed to heat which warms the ball and
the material on the inclined planes. Without friction, all of the energy in the system can
be accounted for by describing the motion of a single object, the ball itself. But with
friction, the energy is shared among all the atoms and molecules that have been warmed
by friction. As time goes on, even remote parts of the apparatus are warmed through
conduction of heat. As a result, the number of objects sharing the energy continues to
grow. When the ball finally comes to rest at the lowest point of its travels, a huge number
of atoms have gained roughly the same fraction of the energy originally available. The
number of objects sharing this energy has increased dramatically, from one object (the
ball) to a number on the order of 10^26 (the number of atoms and molecules in the ball,
the material of the planes, their supports, and everything else in the system). Consequently,
everything in the system is slightly warmer. Presently (in Example 2) we shall show that
everything warms up by about 0.002°C, not enough for anyone to notice.

When all the energy resides in the ball, the ball can perform a positive amount of
work. But at the end of the experiment, when the energy has been distributed among
10^26 atoms and molecules in the form of heat, no useful work can be extracted. The
entropy of the system is a measure of the amount of energy unavailable for work. For
this particular experiment the amount of energy transformed to heat keeps increasing as
more and more subunits share the energy. Very crudely, the entropy tends to increase in
proportion to the number of subunits sharing the energy.

The example of the rolling ball illustrates two important points. First, the transfer
of energy in this process is irreversible. We cannot make the ball gain energy and roll
back up the inclined planes by cooling the system. Second, the entropy S increases as
more and more subunits come to share the available energy. But once all the atoms have
roughly equal shares of the energy, the entropy can increase no further.

These two points are contained in the following statement of the entropy principle:
In an irreversible process, the total entropy of a system always increases until it reaches
a maximum value. After that, nothing else happens.

Example 1
If the experiment just described is carried out with a 0.50-kg ball which is initially lifted
to a height of 1.0 m, estimate the final change in the average energy per atom in the
system.

Initially the system has potential energy

E = mgh = (0.50 kg)(9.8 m/s^2)(1.0 m) = 4.9 J.
The average energy per atom is roughly E/N ≈ 4.9 × 10^-26 J.

Example 2
By the end of the foregoing experiment, everything in the system is slightly warmer. How much warmer?

For this estimation, we treat the system as though it were an ideal gas consisting of
N = 10^26 atoms, and we assume that all the initial potential energy E (calculated in
Example 1 to be 4.9 J) is equally distributed as heat among the N atoms. Then the increase
in energy of each atom is E/N. Now in Chapter 15 we found that the average kinetic
energy K of an atom in an ideal gas at absolute temperature T is given by

K = (3/2)kT,  (15.12)

so a change ΔK in average kinetic energy corresponds to a temperature change

ΔT = ΔK/(3k/2).

In this example ΔK = E/N, so

ΔT = (E/N)/(3k/2).

Substituting the values E/N = 4.9 × 10^-26 J and k = 1.38 × 10^-23 J/K, we find the
temperature of the system rises by a meager 2.4 × 10^-3 K, or about 0.002°C. (Most
solid materials have more energy per atom than an ideal gas at the same temperature so
the temperature change is actually smaller — perhaps half this size.)
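The arithmetic in Examples 1 and 2 is easy to check in a few lines (a sketch; the mass, height, atom count, and Boltzmann constant are the examples' own numbers):

```python
# Examples 1 and 2: a 0.50-kg ball dropped from 1.0 m shares its energy
# among roughly N = 10^26 atoms, warming everything only slightly.

m = 0.50      # mass of the ball, kg
g = 9.8       # gravitational acceleration, m/s^2
h = 1.0       # initial height, m
N = 1e26      # rough number of atoms sharing the energy
k = 1.38e-23  # Boltzmann constant, J/K

E = m * g * h                          # initial potential energy, J
energy_per_atom = E / N                # average energy gained per atom, J
delta_T = energy_per_atom / (1.5 * k)  # from K = (3/2)kT

print(E)                # 4.9 J
print(energy_per_atom)  # ~4.9e-26 J
print(delta_T)          # ~2.4e-3 K, about 0.002 degrees C
```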
Example 3
For the system in Example 1, what is the average speed of the center of mass of the ball at the end of the experiment?

The ball doesn't actually come to rest; it continues to jiggle around with the same
fraction of the original energy of the system that each atom ends up with, namely E/N.
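The size of this residual jiggle can be estimated directly (a sketch reusing the numbers from Example 1; the kinetic-energy relation is the one worked out in this example):

```python
import math

E = 4.9   # initial energy from Example 1, J
N = 1e26  # number of atoms sharing the energy
M = 0.50  # mass of the ball, kg

# The ball's center of mass, like each atom, retains kinetic energy E/N,
# so (1/2) M v^2 = E/N.
v = math.sqrt(2 * E / (N * M))
print(v)  # ~4.4e-13 m/s: utterly unobservable
```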
The average speed v of any particle of mass M in the system at the end of the experiment satisfies

(1/2)Mv^2 = E/N.

Hence for the center of mass of the ball we have

v = √(2E/(NM)) = √(2(4.9 J)/((10^26)(0.50 kg))) = 4.4 × 10^-13 m/s.

Now that's splitting hairs!

Questions

1. Is there any change in the entropy of a ball that is strictly obeying the law of inertia?

2. Can you think of a process for which the entropy decreases?

17.2 ENGINES AND ENTROPY

There is a quantitative meaning for entropy that arises out of an analysis of reversible
processes. In studying the efficiency of a Carnot engine we learned that if an ideal engine
absorbs an amount of heat Qi at an absolute temperature Ti and then discards heat Qo at
absolute temperature To, the quantities are related by Eq. (16.15), Qo/Qi = To/Ti, or

Qo/To = Qi/Ti.  (17.1)

In other words, in a reversible process, the quantity Q/T is conserved, and therefore has
physical significance. The ratio

ΔS = Q/T  (17.2)

is also called the change in entropy and has units of joules per kelvin (J/K).

In a reversible process Q/T stays constant, but in an irreversible process the ratio
Q/T increases. Increasing Q/T has the same effect as friction: it increases the entropy of a
system, as with the rolling ball described earlier.

To better understand the relation between
this meaning of entropy and the qualitative description in Sec. 17.1 we compare an ideal
Carnot engine with a nonideal engine.

At a high temperature Ti, a Carnot engine has entropy change ΔSi = Qi/Ti, and at
a low temperature To its entropy change is ΔSo = Qo/To. For the ideal Carnot engine,
Eq. (17.1) states that

ΔSo = ΔSi  (ideal).

But a less efficient (nonideal) engine starting with the same Qi and Ti produces less work
and therefore deposits more heat Qo' at temperature To. Hence its entropy change at the
low temperature To is

ΔSo' = Qo'/To,

a quantity larger than the ratio ΔSo = Qo/To. In other words,

ΔSo' > ΔSi  (nonideal).

So ideal engines keep Q/T constant, and nonideal engines increase it.

Now we can see the analogy to friction. The ideal Carnot engine doesn't create any
entropy. That’s why a Carnot engine connected to a Carnot refrigerator could operate
forever, but do nothing besides run itself. It would be exactly analogous to a frictionless
ball in Galileo's experiment that rolls up and down the inclined planes forever. In the
real world there is always friction and rolling balls always come to rest. Likewise, in the
real world there are no ideal Carnot engines and entropy is not conserved.

What property of the Carnot cycle makes the change in entropy zero? The key is
found in the reversibility of the cycle. In order for a process to be reversible, friction
must be eliminated and the process must be quasistatic. Most processes in nature, however,
are irreversible. To prove that the entropy increases in irreversible processes, let’s develop
a theoretical method to handle such processes.

If we now adopt the convention that heat extracted is negative and heat input is positive, so that Qo = −|Qo| and Qi = |Qi|, then the relation for the Carnot engine,

|Qi|/Ti = |Qo|/To,  (17.1)

becomes

Qi/Ti + Qo/To = 0.  (17.3)

This was the starting point of Clausius's derivation of the entropy principle: In a reversible
Carnot cycle, the entropy of the system does not change. What about any reversible
engine? Clausius realized that any reversible engine operating between two heat reservoirs
can be approximated as accurately as desired by a series of alternating quasistatic adiabatic
and isothermal paths. Here’s how: first a number of adiabats are drawn, then adjacent
adiabats are connected with two isotherms. The temperatures of the isotherms correspond
to the temperatures at the top and bottom of the strip. This idea is illustrated in Fig. 17.1.

Figure 17.1 Any reversible cycle on a PV diagram can be approximated as well as desired by a series of alternating adiabats and isotherms.

The lines A, B, C, D, for example, constitute a Carnot cycle in Fig. 17.1. In other
words, an arbitrary cycle can be approximated by a series of Carnot cycles. Since no
heat is absorbed or rejected in the adiabatic parts of the cycle, all the isothermal parts
may be paired off into Carnot cycles. And we know how to analyze Carnot cycles. The
approximation can be made as close to the actual cycle as we wish by making the mesh of adiabatic and isothermal lines still finer. For each of the small Carnot cycles, we know
that

ΔQi/Ti + ΔQo/To = 0.  (17.3)

Summing over all the isothermal paths along which heat ΔQj is either absorbed or rejected
at temperature Tj, we obtain

Σj (ΔQj/Tj) = 0.  (17.4)

By letting the heat ΔQj become arbitrarily small and the number of such subcycles become
arbitrarily large, the sum approaches an integral:

∮ dQ/T = 0,  (17.5)

where the symbol ∮ is that for a line integral taken around a complete cycle. In other
words, the change in entropy for any reversible system is zero. This implies that for a
reversible cycle, the entropy difference between two states depends only on those states
and not on the particular path connecting them — a result of Clausius first published in
1854.

To understand why the entropy is independent of the path, consider the system moving
reversibly along path 1 from A to B as shown in Fig. 17.2, then back to A along a
different reversible path. Equation (17.5) implies that

∫(A→B, path 1) dQ/T + ∫(B→A, path 2) dQ/T = 0.

Because each path is reversible,

∫(B→A, path 2) dQ/T = −∫(A→B, path 2) dQ/T,

so that

∫(A→B, path 1) dQ/T = ∫(A→B, path 2) dQ/T.

Since this is true of an arbitrary cycle through points A and B, the change in entropy is
independent of the path taken, provided that path is reversible. This result is surprising
because the heat entering or leaving the substance does depend on the path, yet the entropy
change doesn’t. From this analysis it seems reasonable to define the change in entropy between any
two states A and B connected by a reversible process as

ΔS = ∫(A→B) dQ/T.  (17.6)

Figure 17.2 A reversible process from a state A to a state B made along two different paths in the PV diagram.

The change in entropy depends only on the initial and final equilibrium states of a system
and not on the path joining them.

What is the change in entropy for an irreversible process? An irreversible process
cannot be represented by a continuous path on a PV diagram because it doesn’t move
through a series of equilibrium states. However, we can join the ends of an irreversible
process on a PV diagram by a reversible process, as shown in Fig. 17.3. Then we can
approximate the entire cycle by a sequence of small Carnot cycles. Consider a portion
of the cycle operating between Tij and Toj in which heat ΔQij is extracted from the hot
reservoir and an amount ΔQoj is discarded at low temperature.

Figure 17.3 The final equilibrium states of an irreversible process can be connected by a reversible process between the same points.

The irreversible cycle is less efficient than a reversible cycle operating between the same two temperatures, so we have

1 − ΔQoj/ΔQij < 1 − Toj/Tij,

or by rearranging terms,

ΔQij/Tij + ΔQoj/Toj < 0.

Summing over all such cycles, and passing to the limit, we obtain

∮ dQ/T < 0,  (17.7)

for an irreversible cycle. What has happened is that the irreversible part of the cycle has
generated more heat than it should have, and that extra heat has been extracted from the
system to bring it back to its starting point. That's why the integral ∮ dQ/T is negative.
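A wire that conducts heat Q from a hot reservoir to a cold one while doing no work is about the crudest irreversible "cycle" there is, and it illustrates the inequality numerically (a sketch with made-up numbers):

```python
# One pass of an irreversible "cycle": absorb Q at T_hot, reject all of it
# at T_cold, extract no work. The cycle integral of dQ/T collapses to two terms.
Q = 40.0       # heat carried around the cycle, J (made-up)
T_hot = 400.0  # K (made-up)
T_cold = 350.0 # K (made-up)

# Heat absorbed counts as positive, heat rejected as negative.
cycle_integral = Q / T_hot - Q / T_cold
print(cycle_integral)  # negative, as inequality (17.7) requires
assert cycle_integral < 0
```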
The result (17.7) is known as the Clausius inequality and it allows us to prove that in
any irreversible process the entropy of an isolated system always increases.

Here's how: Imagine that the irreversible process shown in Fig. 17.3 proceeds from
the equilibrium state A to B. For the complete cycle, first going from A to B in an
irreversible process, then returning from B to A by a reversible path, the Clausius inequality
implies

∫(A→B, irreversible) dQ/T + ∫(B→A, reversible) dQ/T < 0,

or

∫(A→B, irreversible) dQ/T < −∫(B→A, reversible) dQ/T,

or

∫(A→B, irreversible) dQ/T < ∫(A→B, reversible) dQ/T.

From the definition of entropy for the reversible path, we know

ΔS = S_B − S_A = ∫(A→B, reversible) dQ/T,

so

∫(A→B, irreversible) dQ/T < S_B − S_A.

This last inequality implies that as a result of the irreversible process taking the isolated
system from equilibrium state A to B, the entropy of the final state is greater than that of the
initial state. This is true even for an adiabatic process occurring in an isolated system.
For an adiabatic process, no heat is exchanged, so the left-hand side of the inequality is
zero, implying that S_B > S_A. In other words, even for an isolated system, the entropy
of the system increases after any irreversible process.

This result makes a profound statement about the behavior of the universe. Since
the universe is an isolated system, the entropy of the universe must increase in time. As
a consequence, the universal usefulness of energy decreases.

Example 4

An ideal gas is allowed to expand freely and adiabatically from a volume V1 to a volume
V2. What is the change in entropy for this irreversible process? (A free expansion is one
in which no work is done, as when a balloon bursts in a vacuum.)

Since no heat is exchanged with the surroundings, we might think that the change
in entropy is zero. However, something irreversible does happen. We never observe the
reverse process in which the air in a room freely contracts. On the other hand, since
neither work nor heat is exchanged the temperature of the gas does remain constant,
because the temperature of an ideal gas depends only on its energy, U. We compute the
change in entropy by considering a reversible process connecting the two states: a quasistatic
isothermal expansion from V1 to V2 at constant T. In this case, the change in
entropy will be given by Q/T, where Q is the heat absorbed by the gas. Since the internal
energy is constant in this process, the work done by the gas arises from heat absorbed
by it. From Eq. (16.2) we calculate the work done by the gas as

Q = W = ∫(V1→V2) P dV = NkT ∫(V1→V2) dV/V = NkT ln(V2/V1).

Therefore, the change in entropy for the irreversible process is

ΔS = Q/T = Nk ln(V2/V1).

Questions
3. Find the increase in the entropy of the system described in Examples 1 and 2.

4. Construct an argument explaining why a gas by itself never freely contracts.

5. Suppose a vessel consists of two chambers, each of the same volume. In one chamber there is helium gas, and in the other there is argon gas at the same temperature and pressure. The partition between the chambers is suddenly removed. Does the entropy of the system increase? Explain your answer.

6. Calculate the change in entropy for the case of Question 5.

7. One mole of an ideal gas expands reversibly and isothermally from a volume of
15 L to a volume of 45 L. (a) Find the change in entropy of the gas. (b) Determine
the change in entropy of the universe for this process.

8. A system absorbs 200 J from a heat bath at 300 K, does 50 J of work and rejects
150 J of heat at a temperature T, then returns to its initial state. (a) Calculate the change in entropy of the system for a complete cycle.
(b) If the cycle is reversible, what is the temperature T?

9. A 2.0-kg block is dropped from a height of 3.0 m above the ground, strikes the
ground, and remains at rest. If the block, air, and ground are all initially at a temperature of 300 K, what is the change in entropy for the universe in this process?

17.3 ENTROPY AND THE SECOND LAW OF THERMODYNAMICS

Clausius's indelible contribution to thermodynamics was his analysis of irreversible processes. He derived his inequality (17.7) from the observation that heat never flows by
itself from low temperature to high temperature. For example, an ice cube placed in water
never causes the water to boil. He concluded that a negative value of the change in
entropy for an isolated system would correspond to heat ﬂowing from cold to hot.
Consequently, the change in entropy for any cycle must be zero if it is reversible, and
positive if it is irreversible. Later he capsulized his ideas in the lines quoted at the beginning
of this chapter: “The energy of the universe is constant. The entropy of the universe
tends to a maximum."

Let's look into another simple, but crucial, example of how the entropy principle
works when there is no friction involved. Suppose a body consists of two parts that are
not at the same temperature. One part is at temperature Ti and the other is at To. Now
imagine momentarily connecting the two parts by a piece of copper wire, as in Fig. 17.4.

Figure 17.4 Heat conduction from a hot body to a cool body.

Since copper is a good conductor of heat, heat flows from the hot piece at Ti to the cold
piece at To. The entropy change of the hot piece is

ΔSi = −Q/Ti.  (17.2)

No work is done, and all the heat flows into the cool body at To. The entropy of the
cooler body is increased by

ΔSo = Q/To.

The process has thereby increased the entropy of the two pieces combined by

ΔS = ΔSo + ΔSi = Q/To − Q/Ti,

which is positive because To < Ti.

The flow of heat warmed the cooler piece and cooled the warmer piece. The two
temperatures are now closer together, but so long as the temperatures are not equal, we
can reconnect the copper wire and repeat the process. Each time we do so, the entropy
of the combined body will increase slightly, and the two temperatures will become closer
to each other. We can repeatedly reconnect the two bodies until the two temperatures
become equal. Then no heat will flow, and the entropy will no longer increase. At that
point, without changing the mass, volume, or energy of the system, we have made the
entropy as large as it can possibly be. And at that point the combined body has reached
thermal equilibrium. This situation is closely analogous to the ball at the end of Galileo’s
experiment, which is in a state of mechanical as well as thermal equilibrium.

Example 5
A copper wire connected between two large pieces of metal conducts 40 J of heat from one piece at 400 K to the other at 350 K. What is the change in entropy for this process?

The hot piece of metal loses an amount of energy Q = 40 J. Since this occurs at a
The hot piece of metal loses an amount of energy Q = 40 J. Since this occurs at a
constant temperature Ti = 400 K, the change in entropy of this body is

ΔSi = −Q/Ti = −(40 J)/(400 K) = −0.10 J/K.

Now the other metal piece gains 40 J at a temperature of 350 K, so its change in entropy is

ΔSo = Q/To = (40 J)/(350 K) = 0.11 J/K.

Therefore, the overall change in entropy is

ΔS = ΔSo + ΔSi = 0.01 J/K.

This simple and obvious argument has breathtaking consequences. As we've seen,
the entropy of a body can decrease: in our example, the hot body kept losing entropy.
However, the combined entropy of both pieces increased. If energy can ﬂow in or out
of a body, its entropy can increase or decrease. But if the total energy of a system is
fixed, then the entropy can only increase until it reaches a maximum value. At that point
the system has reached thermal equilibrium, and nothing more will happen.

What we've seen now is that a rolling ball eventually coming to rest at its lowest
point and two pieces of matter placed in contact eventually reaching the same temperature
are both consequences of the law of increase of entropy. In both cases, of course, energy
is conserved, but in neither instance does simple conservation of energy help predict the
result that the entropy principle gives us. That is why Clausius chose the name entropy.
Entropy is something that tags along with energy, keeping track of how useful or well
organized the energy is. As time goes on, entropy tends to increase, meaning that energy
tends toward more random, disorganized, useless forms.

The association of entropy with usefulness of energy is obvious in our analysis of
Galileo’s experiment, and we can also see the connection in the example we just discussed.
Before the combined body reached thermal equilibrium, there were still two pieces at
temperatures Ti and To; instead of connecting a copper wire between them, we could
have connected an engine, and thereby extracted work. In other words, a copper wire is
just an example of the most inefficient possible “engine,” since it merely transforms
heat from high to low temperature, doing no work at all. Once the process is ﬁnished,
entropy has increased to the maximum value, everything is at the same temperature, and
no work can be obtained. An increase in entropy means a loss of the ability to do work.
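The bookkeeping of Example 5 can be checked numerically, and pushed one step further: a reversible engine run between the same two temperatures could have extracted work W = Q(1 − To/Ti) from the same 40 J instead of creating entropy (a sketch; the efficiency formula is the familiar Carnot result):

```python
Q = 40.0       # heat conducted, J
T_hot = 400.0  # K
T_cold = 350.0 # K

dS_hot = -Q / T_hot    # entropy lost by the hot piece
dS_cold = Q / T_cold   # entropy gained by the cold piece
dS_total = dS_hot + dS_cold

print(round(dS_hot, 2))    # -0.1 J/K
print(round(dS_cold, 2))   # 0.11 J/K
print(round(dS_total, 2))  # 0.01 J/K

# What a reversible engine could have delivered instead:
W = Q * (1 - T_cold / T_hot)
print(W)  # 5.0 J of work that the wire simply wasted
```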
Although energy is conserved, its usefulness is destroyed.

From these examples, we also see that the entropy of a system can change in two distinct
ways: Another form of energy may turn into heat, as in Galileo’s experiment, or heat
may ﬂow from higher temperature to lower, as in the case of the two bodies connected
by a copper wire. But whatever the cause, the entropy of the universe, that is, of a system
and its surroundings, always increases up to a maximum value. The energy of a system
becomes less organized, less able to do useful work.

The entropy of the universe increases with time as the energy of the universe becomes
less useful. Ordered mechanical energy eventually and inexorably is converted into the
unordered, random motion of atoms. This conclusion is a consequence of the second law
of thermodynamics and allows the second law to be cast into another form:

The entropy of the universe always increases toward a maximum.
As we shall see, this law makes a profound statement about the fate of the universe.

Questions

10. A 1200-kg car traveling at 80 km/h crashes into a brick wall. If the temperature of
the air is 25°C, calculate the entropy change of the universe.

11. Suppose 300 J of heat is conducted from one reservoir at a temperature of 500 K
to another reservoir at a temperature T. Calculate the change in entropy of the
system if T equals (a) 100 K, (b) 200 K, (c) 400 K, (d) 490 K. What can you
conclude about the change in entropy as the reservoirs are closer in temperature?

12. One mole of an ideal gas undergoes a free, adiabatic expansion from V1 = 12 L,
T1 = 400 K to V2 = 24 L, T2 = 400 K. Afterward it is compressed isothermally
back to its original state. (a) Compute the change in entropy of the universe in this
process. (b) Show that the work made useless is given by TΔS.

13. Which process is more wasteful: (a) a 1.0-kg ball starting from a height of 1.5 m
and rolling down and up inclines until it comes to rest at its lowest point, where
the temperature of everything is 300 K; or (b) the conduction of 150 J of heat
from a reservoir at 350 K to one at 300 K?

17.4 AN IMPLICATION OF THE ENTROPY PRINCIPLE

We know that entropy depends in some way on how many parts of a system have a share
of the energy, and that when heat ﬂows, an amount of entropy Q/T accompanies the
flow. Entropy has one more very important quality: it is associated with the internal order
or configuration of a body. For example, under otherwise identical conditions (total
volume, temperature, etc.) we might imagine organizing a certain large number of molecules
into either a liquid or a solid. The difference between these two states is that in
the solid state, the molecules are arranged in a neat crystal lattice. If we form both states,
the liquid will turn out to have more entropy than the solid.

We can see this clearly by noting what happens when an ice cube melts. At constant
temperature (0°C) as heat flows into the ice from the warm drink it is immersed in,
instead of warming up, the ice melts. An amount of heat Q ﬂows in at temperature T,
meaning that the entropy increases by Q/T. That entropy indicates the change in the
internal structure of the ice from solid to liquid. In other words, the liquid has a higher
entropy than the solid at the same temperature. So entropy is a measure not only of
uselessness of energy, but also of disorder.

The question of why ice melts in the first place raises a paradox. If equilibrium is
always a state of maximum entropy, and liquid is a state of higher entropy than solid,
why do solids, like ice, ever exist in equilibrium at all? The crux of the problem is this:
a body of a given energy reaches equilibrium when it has maximized its entropy. We
don’t know yet what constitutes the condition for equilibrium of a body of a given
temperature. Let's now find the answer to that problem.

Consider something we'll call our system, which is divided into two parts. One part,
which is very small compared to the system (but still large enough to be macroscopic)
is called the sample. Everything else in the system is called the bath. The total energy
of the system will always remain constant. We can therefore apply the entropy principle
to the system: the entropy of the system tends to a maximum value. Moreover, the bath
is always assumed to be in equilibrium; its entropy is always as large as it can be for the
amount of energy it has. Being in equilibrium, the bath has a definite temperature T.
Finally, there is the sample that is not necessarily in equilibrium. In fact the sample could
be in any state at all. Heat is free to ﬂow either way between the bath and the sample.
However, the sample is so small compared to the bath that whatever heat ﬂows in or out
doesn’t appreciably affect the temperature of the bath. Therefore we know that when the
sample finally reaches equilibrium, its temperature will be T. But we don’t know whether
it will be a solid or a liquid, or more importantly what decides its fate when it reaches
equilibrium.

When we put our sample in contact with the bath, everything that happens tends to
increase the entropy of the entire system. Now suppose some heat Q ﬂows into the sample
and increases the energy of the sample by ΔE = Q. That action also decreases the
entropy of the bath by

ΔSb = −Q/T = −ΔE/T,

where the minus sign reflects the decrease. Since the sample is not in equilibrium, we
don’t know what happens to its entropy, but the event must increase the entropy of the
system, or at least leave it constant — that’s the second law of thermodynamics. If we
call Ss the entropy of the system and S the entropy of the sample,

ΔSs = ΔSb + ΔS ≥ 0.

It follows that

ΔS − ΔE/T ≥ 0.

In other words, the entropy of the sample must increase by at least as much as the entropy of the bath decreased. Thus, as time goes on, the quantity ΔS − ΔE/T continually increases, until it reaches its maximum value. These quantities, the entropy, energy, and final equilibrium temperature, are properties of the sample only. Therefore we can now ignore the bath and make a statement about the sample only.
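A concrete illustration of this inequality (a sketch; the molar heat of fusion of ice, roughly 6.0 kJ/mol, and the 273-K melting temperature are outside numbers, not taken from this chapter): melting ice gains entropy Q/273, so in a bath warmer than 273 K the sample gains more entropy than the bath loses, and melting is allowed; in a colder bath it is not.

```python
# Test dS - dE/T >= 0 for melting one mole of ice in baths at two temperatures.
Q = 6.0e3       # heat absorbed on melting, J/mol (rough latent heat of fusion)
T_melt = 273.0  # ice melts at 273 K, so its entropy gain is Q / 273
dS_sample = Q / T_melt
dE_sample = Q   # the sample's energy goes up by the heat absorbed

for T_bath in (300.0, 250.0):
    allowed = dS_sample - dE_sample / T_bath >= 0
    print(T_bath, allowed)
# 300.0 True  -> above the melting point, melting raises the total entropy
# 250.0 False -> below it, melting would lower the total entropy: forbidden
```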
Let's write the above result in a slightly different way by multiplying by T:

TΔS − ΔE ≥ 0,

where all the changes refer to those that occurred when heat flowed into the sample.
Since the temperature doesn't change, we can write the left side as Δ(TS − E), and
because it increases, its negative obeys

Δ(E − TS) ≤ 0.  (17.8)

In other words, whereas the entropy of the system increases to a maximum, the quantity
E — TS of the sample tends to a minimum when the sample is kept in a bath at constant
temperature T. The quantity E — TS is called the Helmholtz free energy, or simply the free energy,
and is written

F = E − TS.  (17.9)

The quantity F plays a role in thermodynamics somewhat analogous to that of potential
energy in mechanics: the system is in a state of stable equilibrium whenever F is at a
minimum. The crucial idea is that whenever a sample can lower its free energy, it does.
Although that process might decrease the entropy of the sample, the action always
increases the entropy of the universe.

Now we can understand why H2O is sometimes solid and sometimes liquid. We need
to compare the energies and entropies of the two states. And the state of the H2O will
be that for which the free energy is lower.

We already know that water, being a more disorganized state than ice, has more
entropy at the same temperature. But what about its energy? The energy consists of the
kinetic plus potential energies of all the atoms and molecules. At a given temperature,
both states have the same kinetic energy. In a gas, liquid, or solid, each atom generally
has (3/2)kT of kinetic energy in equilibrium. The difference lies in the potential energy. In
solids, unlike liquids, the molecules arrange themselves into stable configurations of the
lowest possible potential energy. Of course, the arrangement that minimizes the potential
energy of a bunch of molecules, fitting one together with another in just the right way,
is the same for any small set of molecules, which is exactly why solids are made up of
identical building blocks that repeat endlessly in a rigid lattice. The liquid, on the other
hand, is disordered; it does not have all of its molecules in this optimal configuration.
Therefore, the liquid has higher potential energy than the solid.

Now, the H2O molecules must find the arrangement that will minimize the free
energy. For this purpose, the solid has smaller energy but also smaller entropy. The
liquid, on the other hand, has larger E and also larger S. What combination wins? The
decision is made by the temperature. When T is very small, the negative term TS is small
and unimportant, and the lower E state wins out: at low temperature H2O is solid. But
when the temperature is high, the term TS becomes more important, so the high S state
wins and the H2O is liquid. At some temperature in between, the winner switches from
solid to liquid. That’s the melting point. At that point the ice cube would rather melt
completely than just warm up a little bit.

Questions
14. Develop an argument as to why different metals have different melting points.

15. In Chapter 14 we saw that nature tends to seek states of lowest possible potential
energy. Does the behavior of ice at temperatures greater than the melting point
contradict that idea? Explain.

16. Extend the discussion presented in the text to explain why H2O is a gas and not a
liquid above the vaporization temperature.

17. A tree takes unorganized molecules and organizes them into branches and leaves.
Do you think living organisms violate the second law of thermodynamics? Explain
your reasoning.

17.5 A FINAL WORD

Entropy is a measure not only of uselessness, but also of disorder. As time goes on,
entropy increases. Energy is degraded to more useless forms, and matter into less ordered
states. Of course, a given bit of matter might temporarily decrease its entropy, but that
always means that something else nearby is increasing its entropy by at least as much,
and usually more. Strictly speaking, the principle of increasing entropy only applies to
systems of conserved total energy. The universe itself is such a system, so the universe
appears to be headed for a state of thermal equilibrium, after which nothing else will
happen. This cheerfully optimistic view of the future is generally referred to as the heat
death of the universe.

There is another equally extravagant extrapolation of the entropy principle which
turns around the sentence, “As time goes on, entropy increases” to read “As entropy
increases, time goes on.” In other words, the increase in entropy is the very arrow of
time; all other physical laws would work equally well if time ran backwards instead of
forwards.

We have seen that all systems, including the universe, tend to evolve in an irreversible
way. Despite the action of men and women, the inexorable law of nature is for energy
to become less useful and the universe more disorganized. This tendency of the ﬂow of
time was pointed out by an eleventh-century Persian poet-mathematician, Omar Khayyam,
who wrote

The Moving Finger writes; and having writ,
Moves on: nor all thy Piety nor Wit
Shall lure it back to cancel half a Line,
Nor all thy Tears wash out a Word of it.

There have been many other statements of the second law of thermodynamics, but none
so elegant.