handout10

4 – The Second Law of Thermodynamics

4.1 Introduction

• In 1867, Rudolf Clausius published "Abhandlungen über die mechanische Wärmetheorie, Zweite Abteilung", in which he introduced the term entropy (from the Greek for 'transformation').

REMINDER: Entropy is an extensive state function.

• The difference in entropy is a way of measuring the effects of irreversibility in a thermodynamic process.

• We use the symbol S; its units are Energy / Temperature.

NOTE: There are different definitions and ways of understanding entropy:

◦ Thermodynamics (Clausius)

• If a system moves from state A to state B, then the difference in entropy is

    ΔS = S_B − S_A = ∫_A^B dQ / T(Q).     (4.1)

• We can think of entropy as a 'measure of waste' in a heat engine: (at least) TΔS will be given up by the engine to the surroundings as unusable heat.

• This is also true, e.g., for chemical reactions: we will see that the Gibbs free energy change of a thermodynamic process is

    ΔG = ΔH − TΔS.     (4.2)

  ΔG is a measure of the actual useful energy that can be extracted from an isothermal, isobaric system.

• Here, we have to subtract TΔS from ΔH to account for the 'internal losses' due to entropy.

NOTE: We use the term 'free energy' because this is energy we can extract; the entropic part is 'not free' (bound) in this sense.

NOTE: The new IUPAC name for this is simply 'Gibbs energy'.

◦ Statistical Mechanics (Boltzmann)

The entropy of a (macroscopic) state I is defined as

    S_I = k_B log_e(Ω_I),     (4.3)

where Ω_I is the number of microscopic states in the macrostate.

NOTE: We can now see why entropy and the Boltzmann constant have the same units: log_e(Ω_I) is dimensionless.

EXAMPLE: We have seen Ω_I before. When we derived the Boltzmann distribution in 2.18, we used the example of flipping a coin four times and asked in how many ways, m_4(k_H), we could observe a game with k_H heads:

Table 2.B
    k_H    m_4(k_H)    p_4(k_H)
    0      1           1/16
    1      4           4/16
    2      6           6/16
    3      4           4/16
    4      1           1/16

• Here, the macroscopic state I is k_H, and Ω_I = m_4(k_H).

• Then it is easy to calculate log_e(Ω_I) for each macroscopic state:

Table 4.A
    k_H = I    m_4(k_H) = Ω_I    log_e(Ω_I)
    0          1                 0
    1          4                 1.3863
    2          6                 1.7918
    3          4                 1.3863
    4          1                 0

NOTE: When we derived the Boltzmann distribution, we said that, at equilibrium, the most probable macrostate I_max is the one with the most microstates Ω_{I_max}.

[Figure: the equilibrium macrostate as it appears macroscopically ('S looks blue')]

⇒ This also means that the entropy of a system is maximal when it is at equilibrium.

NOTE: This is where the notion of 'entropy as a measure of disorder' comes from. A macroscopic state with

◦ a small number of microstates ⇒ low entropy
◦ a large number of microstates ⇒ high entropy

QUESTION: So entropy is a measure of the number of microstates in a macroscopic state. Why do we take the logarithm when measuring entropy, instead of just using Ω_I directly? ...
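Looking back at Eq. (4.2) above, a minimal numerical sketch may help; it is not part of the original handout, and the ΔH, ΔS, and T values are hypothetical, chosen only to show how the TΔS term changes the useful energy relative to the enthalpy change.

```python
# Hypothetical illustration of Eq. (4.2): Delta_G = Delta_H - T * Delta_S.
# All numbers are made up for illustration; they are not from the handout.
T = 298.15          # temperature, K
delta_H = -100.0e3  # enthalpy change, J/mol (hypothetical)
delta_S = -120.0    # entropy change, J/(mol K) (hypothetical)

delta_G = delta_H - T * delta_S  # useful ('free') energy change, J/mol
print(f"Delta_G = {delta_G / 1e3:.1f} kJ/mol")  # -64.2 kJ/mol, so |Delta_G| < |Delta_H|
```

With these numbers, part of the enthalpy released is 'bound' by the entropy term, which is exactly the 'internal losses' reading of Eq. (4.2) given in the notes.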
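The bookkeeping in Tables 2.B and 4.A can be reproduced in a few lines of Python. This is a sketch added to these notes, not part of the original handout; it counts the microstates of the 4-flip coin game and applies Eq. (4.3) to each macrostate k_H.

```python
# Sketch: reproduce Tables 2.B and 4.A for the 4-flip coin game and apply
# Eq. (4.3), S_I = k_B * log_e(Omega_I), to each macrostate k_H.
from math import comb, log

k_B = 1.380649e-23   # Boltzmann constant, J/K
n = 4                # number of coin flips
n_micro = 2 ** n     # 16 equally likely head/tail sequences (microstates)

for k_H in range(n + 1):
    omega = comb(n, k_H)   # m_4(k_H): number of sequences with k_H heads
    p = omega / n_micro    # p_4(k_H): probability of the macrostate
    S = k_B * log(omega)   # entropy of the macrostate, J/K
    print(f"k_H={k_H}: Omega={omega}, p={omega}/{n_micro}, "
          f"log_e(Omega)={log(omega):.4f}, S={S:.3e} J/K")

# The k_H = 2 macrostate has the most microstates (6) and hence the highest
# entropy, matching the 'equilibrium = most probable macrostate' note above.
```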
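The handout presumably answers this question in the pages cut off by the preview. As a hedged sketch of one standard argument (an assumption here, not taken from the handout): the logarithm makes entropy extensive, because microstate counts of independent subsystems multiply, while their entropies should add.

```python
# Sketch of the standard additivity argument (assumed, not from the handout):
# for two independent subsystems the combined microstate count is the product
# Omega_1 * Omega_2, so S = k_B * log_e(Omega) adds over the subsystems.
from math import comb, log

omega_1 = comb(4, 2)                 # game 1: 6 microstates with 2 heads
omega_2 = comb(4, 1)                 # game 2: 4 microstates with 1 head
omega_combined = omega_1 * omega_2   # independent games: 24 joint microstates

lhs = log(omega_combined)
rhs = log(omega_1) + log(omega_2)
print(lhs, rhs)                      # both 3.1781...: log_e(24) = log_e(6) + log_e(4)
assert abs(lhs - rhs) < 1e-12
```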