Lecture-31 - Chem111 Fall, 2005 Lecture 31: Thermodynamics...

Lecture 31: Thermodynamics
  Entropy changes
  Gibbs Free Energy

Boltzmann and Entropy: the connection between entropy and probability

    S = k_B ln W

where S is the entropy of a given (macro)state, W is the number of microstates that can describe that macrostate, and k_B is Boltzmann's constant (k_B = R/N_A, the gas constant divided by Avogadro's number).

The greater the number of microstates that can describe a given macrostate, the more probable that macrostate is, and the higher its entropy (disorder).

Recall that we only see the macrostates. The more ways there are to arrange the particles (molecules, atoms, socks, whatever) to give the same macroscopic state, the more disorder there is "under the surface" of what we perceive as the macroscopically visible state (an ideal gas at a given temperature, pressure, and volume; unpaired socks in a drawer; whatever).

Analogy: rolling two six-sided dice. See if you can describe, in terms of microstates and macrostates, why the (macroscopic) state of a roll of 7 has the highest entropy. A short counting sketch follows below.
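The following is not from the original notes: a minimal Python sketch of the dice analogy. It enumerates every microstate of two dice, groups them into macrostates by the only thing we "see" (the sum), and evaluates S = k_B ln W for each macrostate.

```python
import math
from collections import Counter
from itertools import product

K_B = 1.380649e-23  # Boltzmann's constant, J/K

# Every ordered pair (die1, die2) is one microstate: 36 in total.
microstates = list(product(range(1, 7), repeat=2))

# Group microstates into macrostates by the observable quantity, the sum.
W = Counter(d1 + d2 for d1, d2 in microstates)

for total in sorted(W):
    w = W[total]              # number of microstates for this macrostate
    s = K_B * math.log(w)     # S = k_B ln W
    print(f"sum {total:2d}: W = {w}, S = {s:.3e} J/K")
```

Running this shows that a sum of 7 has the most microstates (W = 6), so it is the most probable macrostate and has the highest entropy, while sums of 2 and 12 each have W = 1 and therefore S = 0.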
