Lecture 13 handout: What is Entropy?

What is Entropy? by John Colton, Feb 2011 version
a.k.a. "Derivation of the Boltzmann factor and the Maxwell-Boltzmann speed distribution"

The "Boltzmann factor" is a vitally important expression which tells you how likely states are to be occupied due to thermal energy. It results from the Second Law of Thermodynamics. One formulation of the Second Law states: Isolated physical systems will tend to be in the macrostate that has the largest number of microstates.

For example, if you roll two six-sided dice, you will most likely get a 7 as the sum of the two, because there are more ways of getting a 7 than any other number (1+6, 2+5, 3+4, 4+3, 5+2, and 6+1). So, if you jumble two dice in a box and want to know (without peeking) what state they are in, you can say "They probably add up to 7." In thermodynamic terms, picture 10^23 molecules instead of two dice: the microstates are the positions/velocities of the molecules, and the macrostates are the macroscopic variables (P, V, T) that a given microscopic configuration produces.

When you combine two systems, the number of microstates of the combined system is the product of the individual microstate counts. For example, when you roll one die, there are 6 microstates. When you roll two dice, there are 36 microstates (six of which result in a sum of 7, as enumerated above). When you roll three dice, there are 6 × 6 × 6 microstates. And so forth.

For large systems, the phrase "will tend to be in" becomes "will be extremely close to." For example, if you roll 10^23 dice, their total will be VERY close to 3.5 × 10^23. There is still a HUGE number of ways (# of microstates) that that total could be achieved, though.

So, when we have large systems, let's deal with the logarithm of the number of microstates instead of the number of microstates itself. Log functions are very efficient at reducing huge numbers to much more manageable ones. (log10(10^23) = 23, for example.) By convention we will use log base e. Also, for reasons you will see shortly, let's multiply the log by a constant which has units of joules/kelvin. I will call the multiplicative constant simply "constant" for now. We'll define this new quantity as S, called "entropy":

    S = constant × ln(# microstates)    [units of J/K]

Since S increases and decreases as the # of microstates increases and decreases, we can rephrase the Second Law as follows: Large isolated physical systems will be extremely close to the state which has the largest S.*

Using the logarithm instead of the # of microstates has this added benefit: when you combine two systems, the entropies of each system ADD. This is because the # of microstates multiply, so

    S_tot = constant × ln(#microstates1 × #microstates2)
          = constant × ln(#microstates1) + constant × ln(#microstates2)
          = S_1 + S_2

Now, let's think about two systems which can exchange thermal energy. We'll suppose that we have a small system which is really our system of interest, which comes to thermal equilibrium with a much
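
As a quick numerical check of the dice counting above, here is a minimal Python sketch (my own illustration, not part of the handout) that enumerates all 36 microstates of two dice and tallies how many of them produce each sum:

    from collections import Counter
    from itertools import product

    # Each ordered pair (d1, d2) is one microstate; the sum is the macrostate.
    microstates = list(product(range(1, 7), repeat=2))
    counts = Counter(d1 + d2 for d1, d2 in microstates)

    print(len(microstates))              # 36 microstates in total
    print(counts[7])                     # 6 of them give a sum of 7, more than any other sum
    print(max(counts, key=counts.get))   # 7 is the most likely macrostate

Changing repeat=2 to repeat=3 reproduces the 6 × 6 × 6 count for three dice.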
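
The "will be extremely close to" claim can also be seen at a computer-sized scale. The sketch below (my own illustration, using an arbitrary choice of 10^6 dice in place of 10^23) rolls many dice and checks how close the average is to 3.5:

    import random

    N = 10**6   # far fewer than 10**23, but enough to show the trend
    total = sum(random.randint(1, 6) for _ in range(N))

    print(total / N)   # comes out very close to 3.5
    # The relative fluctuation shrinks like 1/sqrt(N), so for 10**23 dice the
    # total is essentially guaranteed to be extremely close to 3.5e23.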
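
Finally, a small numerical illustration of the additivity argument above. The handout leaves the multiplicative constant unnamed at this point; the sketch assumes the conventional choice, Boltzmann's constant k_B ≈ 1.38 × 10^-23 J/K:

    import math

    k = 1.380649e-23          # J/K; assuming the "constant" is Boltzmann's constant
    omega1, omega2 = 6, 36    # microstate counts of two subsystems (one die, two dice)

    S1 = k * math.log(omega1)
    S2 = k * math.log(omega2)
    S_tot = k * math.log(omega1 * omega2)   # microstates multiply when systems combine

    print(math.isclose(S_tot, S1 + S2))     # True: ln(ab) = ln(a) + ln(b), so entropies add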