Chemistry 260
Lecture 24, November 2, 2011

Today in Chemistry 260
• spontaneity
• entropy: a measure of disorder
• statistical and thermodynamic definitions of entropy

This Week in Chemistry 260
• reading: (Fri.) 13.3, 13.4, 13.6
• 2nd and 3rd laws of thermodynamics, the Gibbs energy
• …more on spontaneity
• OWL Homework due on Thursday (3 pm)
• MS 8 due Friday, Nov. 11

Demonstration ConcepTest

2 NH4SCN(s) + Ba(OH)2·8H2O(s) → 2 NH3(g) + Ba(SCN)2(s) + 10 H2O(l)

Based on your observations, we can infer that for this reaction:
A. q < 0   B. q = 0   C. q > 0
Answer: C (heat flows IN).
http://www.youtube.com/watch?v=yTzcoyzPQE0

Spontaneity

(Heat-flow figure: heat flows spontaneously from block A to block B when T1 > T2; at T1 = T2 there is no net flow.)

Non-spontaneous: can only be brought about by doing work. Non-spontaneous changes can be MADE to occur by doing work.
How can we account for the distinction between these two types of change?

Spontaneity is not determined by the First Law

Common observation leads to two classes of chemical and physical processes:
• Spontaneous: has the tendency to occur without work being done.
• Non-spontaneous: occurs only if work is done on the system.
Some things happen… some things don't.

What Spontaneity is NOT
• A reaction happening fast (rate is due to barriers, not thermodynamic tendency).
• "Spontaneous combustion" (all combustion is spontaneous).
Thermodynamics deals with the tendency to change; without more information it is silent on the rate at which that tendency is realized.

Example: NaCl(s) → Na+(aq) + Cl−(aq)
Standard molar enthalpies of formation (kJ/mol): NaCl(s) −411.15; Na+(aq) −240.12; Cl−(aq) −167.16.
Calculate ΔH in kJ/mol:
  ΔH = −167.16 − 240.12 − (−411.15) = +3.87 kJ/mol
This is an endothermic reaction, but clearly spontaneous. So is the demonstration reaction:
  2 NH4SCN(s) + Ba(OH)2·8H2O(s) → 2 NH3(g) + Ba(SCN)2(s) + 10 H2O(l)
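The enthalpy bookkeeping above can be checked numerically. A minimal sketch (Python; the dictionary name and layout are illustrative, but the ΔHf° values are the ones quoted in the lecture):

```python
# Standard molar enthalpies of formation (kJ/mol), as quoted above.
dHf = {"NaCl(s)": -411.15, "Na+(aq)": -240.12, "Cl-(aq)": -167.16}

# Hess's law: dH_rxn = sum over products - sum over reactants,
# for NaCl(s) -> Na+(aq) + Cl-(aq)
dH_rxn = dHf["Na+(aq)"] + dHf["Cl-(aq)"] - dHf["NaCl(s)"]
print(round(dH_rxn, 2))  # +3.87 kJ/mol: endothermic, yet dissolution is spontaneous
```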
The reverse reactions are not spontaneous: clearly there is more to this than conservation of energy.

Qualitatively: "Nature prefers disorder."
Better: "A situation with more available states is more commonly observed."

NaCl(s) "ordered", "confined" → Na+(aq) + Cl−(aq) "disordered", "less confined"
Few states → Many states

A bedroom/desk getting messy: "clean" and "messy" are just two different configurations. On the other hand, there is one way to be clean and many ways to be messy…

Entropy is the quantitative thermodynamic measure of disorder

Consider a system with 3 balls and 3 bowls. A macrostate specifies how many balls (N1, N2, N3) are in each bowl; the number of ways to realize it is

  # = N! / (N1! N2! N3!)

• 1 ball in each bowl: # = 3!/(1! 1! 1!) = 6 possibilities.
• 2 balls in bowl 1, 1 ball in bowl 2: # = 3!/(2! 1! 0!) = 3 possibilities (more ordered, fewer possibilities).
• 3 balls in bowl 1: # = 3!/(3! 0! 0!) = 1 possibility (the ordered state).
• 2 balls in any bowl, 1 in another: 6 ways to pick the bowls, with 3 possibilities each, so 3 × 6 = 18 possibilities.
• 3 balls all in any one bowl (bowl 1, 2, or 3): 3 × 1 = 3 possibilities.
Total: 6 + 18 + 3 = 27 possible states.

Two gas bulbs of equal size: gas molecules behave like coin flips.
The probability of any one molecule being in A is 1/2.
Any two molecules in A: W = (1/2)(1/2) = 1/4.
Any three molecules in A: W = (1/2)(1/2)(1/2) = 1/8.
Probability of N molecules in A (like N consecutive heads):

  W = (1/2)^N
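The bowl counting above can be reproduced directly. A minimal sketch (Python; `multinomial` is a hypothetical helper name, not something from the lecture):

```python
from math import factorial

def multinomial(*counts):
    """Ways to place sum(counts) distinguishable balls with counts[i] in bowl i."""
    n = factorial(sum(counts))
    for c in counts:
        n //= factorial(c)
    return n

one_in_each = multinomial(1, 1, 1)       # 3!/(1!1!1!) = 6
two_and_one = 6 * multinomial(2, 1, 0)   # 6 bowl choices x 3 = 18
all_in_one  = 3 * multinomial(3, 0, 0)   # 3 bowl choices x 1 = 3
total = one_in_each + two_and_one + all_in_one
print(one_in_each, two_and_one, all_in_one, total)  # 6 18 3 27
```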
Statistical Definition of Entropy: the Boltzmann Equation

  S = k ln W

k = Boltzmann constant = 1.3805 × 10⁻²³ J K⁻¹, so ΔS has units of J K⁻¹.
W ∝ probability of a state, and/or the number of "isoenergetic" microstates (the degeneracy of the state).

S is a state function: the value of S depends on the state, not on how the state was reached. S is also an extensive function.

Examples, with W taken as a probability:
• 10 molecules all in one bulb (10 consecutive heads): W = (1/2)^10 = 1/1024, S = k ln(1/1024) = −6.93 k
• 5 molecules all in one bulb: W = (1/2)^5 = 1/32, S = k ln(1/32) = −3.47 k
• 2 balls in bowl 1, 1 in bowl 2: W = [3!/(2! 1! 0!)]/27 = 3/27 = 1/9
• 3 balls in bowl 1: W = [3!/(3! 0! 0!)]/27 = 1/27

Flip three coins: 4 possible "states" (macrostates) built from 8 possible outcomes (microstates).

  macrostate        A (3H)    B (2H, 1T)   C (1H, 2T)   D (3T)
  microstates       1         3            3            1
  W = probability   1/8       3/8          3/8          1/8
  S = k ln W        −2.08 k   −0.98 k      −0.98 k      −2.08 k

By this definition (using probabilities), entropy = 0 when probability = 1, i.e. when all possible states are accessible; more ordered (lower-probability) states have negative entropy.

Let's start using "microstates": take W = number of isoenergetic microstates instead.

  macrostate        A         B            C            D
  W = # of states   1         3            3            1
  S = k ln W        0 k       1.10 k       1.10 k       0 k

We've only shifted the scale, but doing so allows us to define a conceptual zero point for entropy: if there is only one accessible microstate, S = k ln W = k ln(1) = 0. The total number of accessible microstates here is 8, so S = k ln(8) = 2.1 k.
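The S/k values quoted above can be checked numerically. A minimal sketch (Python):

```python
from math import comb, log

# S in units of k: S/k = ln W
S_10 = log((1 / 2) ** 10)   # all 10 molecules in one bulb -> -6.93
S_5  = log((1 / 2) ** 5)    # all 5 molecules in one bulb  -> -3.47

# Three coins: W = number of microstates per macrostate (1, 3, 3, 1)
for heads in (3, 2, 1, 0):
    W = comb(3, heads)
    print(heads, "heads:", round(log(W), 2), "k")  # 0.0, 1.1, 1.1, 0.0
```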
Thermodynamic Definition of Entropy

  dS = dq_rev / T
  ΔS = S_B − S_A = ∫_A^B dq_rev / T

The defining quantity is the heat that would be transferred in a reversible process. Entropy is a state function; heat is not. Adding heat (q > 0) increases the entropy of the system.

The entropy of the system increases in an isothermal expansion. Ideal gas, T = 298 K, expanding from P1 = 15 atm, V1 = 1 L to P2 = 1 atm, V2 = 15 L. (P-V diagram: irreversible path against constant external pressure P2 vs. reversible path along the isotherm.)

Irreversible expansion:
  w = −P2 ΔV = −1.42 kJ
  q = −w = 1.42 kJ

Reversible expansion:
  w_rev = −nRT ln(V2/V1) = −4.12 kJ
  q_rev = −w_rev = 4.12 kJ

  ΔS_sys = q_rev / T = 4.12 kJ / 298 K = 13.8 J K⁻¹

Entropy change in a system is calculated from reversible heat.
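The numbers in the expansion example can be reproduced. A minimal sketch (Python; n is obtained from the ideal-gas law, an assumption consistent with the stated P, V, and T):

```python
from math import log

R = 8.314            # J/(mol K)
T = 298.0            # K
V1, V2 = 1.0, 15.0   # L
P1, P2 = 15.0, 1.0   # atm
J_PER_L_ATM = 101.325

n = P1 * V1 * J_PER_L_ATM / (R * T)       # ideal-gas law, about 0.61 mol

w_irrev = -P2 * (V2 - V1) * J_PER_L_ATM   # J, about -1.42 kJ
w_rev = -n * R * T * log(V2 / V1)         # J, about -4.12 kJ
dS = -w_rev / T                           # q_rev / T, about 13.8 J/K
print(round(w_irrev), round(w_rev), round(dS, 1))
```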
Thermodynamics meets statistics…

Thermodynamic definition:

  ΔS = S_B − S_A = ∫_A^B dq_rev / T = (1/T) ∫_A^B dq_rev = q_rev / T

Isothermal ⇒ ΔU = q + w = 0, and for the reversible path

  w_rev = −nRT ln(V2/V1)
  q_rev = nRT ln(V2/V1)

  ΔS = nR ln(V2/V1)

the entropy change in the isothermal expansion of an ideal gas.

Statistical definition (counting states): S = k ln W. For N molecules of gas, the number of available microstates scales as the volume raised to the Nth power:

  W1 ∝ (V1)^N    W2 ∝ (V2)^N

  ΔS = k ln(V2)^N − k ln(V1)^N = k ln(V2/V1)^N = N k ln(V2/V1)
     = (N/N_A)(N_A k) ln(V2/V1) = nR ln(V2/V1)

The two definitions agree, using n = N/N_A and R = N_A k.
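The equivalence can be sanity-checked numerically. A minimal sketch (Python; n = 0.613 mol is the amount implied by the earlier expansion example, an assumption):

```python
from math import log

NA = 6.022e23     # Avogadro's number, 1/mol
kB = 1.3805e-23   # Boltzmann constant, J/K (value quoted in lecture)
R = NA * kB       # the gas constant emerges as NA * k, about 8.31 J/(mol K)

n = 0.613         # mol, from the expansion example (assumed)
V1, V2 = 1.0, 15.0

N = n * NA                                # number of molecules
dS_statistical = N * kB * log(V2 / V1)    # N k ln(V2/V1)
dS_thermodynamic = n * R * log(V2 / V1)   # n R ln(V2/V1) = q_rev / T
print(round(dS_statistical, 1), round(dS_thermodynamic, 1))  # both about 13.8 J/K
```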
This note was uploaded on 01/19/2012 for the course CHEM 260 taught by Professor Staff during the Fall '08 term at University of Michigan.