Lecture 4, Ch161, Feb 4

Key Concepts and Lessons:
a) Entropy
b) Constraints
c) Density of States for an Ideal Gas
d) Temperature

Reading: Reif, Ch. 2-3
Entropy and Information

Now we come to the key concept of statistical mechanics: the concept of Entropy. The entropy of a macrostate equals the logarithm of the number of accessible microstates (multiplied by the Boltzmann constant for consistency of units, as we will see later):

S = k ln Ω

Why the log? Because we want entropy to be extensive, i.e. proportional to the size of the system. We will see later that the number of microstates of a system scales exponentially with its size, meaning that entropy scales linearly with system size. As you will see later, entropy is "on equal footing" with energy in thermodynamic relations, so both have to be extensive quantities.

Entropy directly reflects the degree of uncertainty in our prediction of the microstate of a system. Indeed, the probability of finding the system in state j on an energy surface E is

p_j = 1/Ω = e^(-S/k)

In other words, greater entropy means less information; in the limiting case S = 0 the microstate is fully specified (we will encounter this later when we discuss the theory of protein folding).
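As a quick numerical illustration (my own sketch, not from the lecture), assume a toy model in which each of N independent particles can occupy ω single-particle states, so Ω = ω^N and S = k ln Ω = N k ln ω grows linearly with N:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_particles: int, states_per_particle: int) -> float:
    """S = k ln(Omega) for Omega = omega**N independent particles.

    Computed as N * k * ln(omega) to avoid overflowing omega**N.
    """
    return n_particles * k_B * math.log(states_per_particle)

# Entropy is extensive: doubling N doubles S.
for n in (10, 20, 40):
    print(n, entropy(n, states_per_particle=2))
```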
Now we can evaluate the probabilities and averages of any parameter of interest of the system (e.g. volume, pressure, density, density distribution, etc.):

P(y_k) = Ω(y_k; E) / Ω(E)

where Ω(y_k; E) is the number of states with energy in the range (E, E + δE) having value y_k of the parameter. The average value of y_k is straightforward:

⟨y_k⟩ = Σ_{y_k} y_k P(y_k) = Σ_{y_k} y_k Ω(y_k; E) / Ω(E)
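A minimal sketch of these formulas (an illustration, with a hypothetical toy system assumed): take the accessible microstates on the energy surface to be all N-bit configurations with equal a priori probability, and let the parameter y be the number of 1-bits; then P(y) = Ω(y; E)/Ω(E) and ⟨y⟩ follow by direct enumeration:

```python
from collections import Counter
from itertools import product

N = 4  # toy system: each microstate is an N-bit configuration

# Enumerate all accessible microstates (equal a priori probabilities).
microstates = list(product((0, 1), repeat=N))
omega_total = len(microstates)  # Omega(E)

# Omega(y; E): number of microstates with parameter value y (# of 1-bits).
omega_y = Counter(sum(state) for state in microstates)

# P(y) = Omega(y; E) / Omega(E)
P = {y: count / omega_total for y, count in omega_y.items()}

# <y> = sum_y y P(y)
y_avg = sum(y * p for y, p in P.items())
print(P)      # {0: 0.0625, 1: 0.25, 2: 0.375, 3: 0.25, 4: 0.0625}
print(y_avg)  # 2.0
```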
Calculation of Probabilities: Why molecules spread evenly in a volume

Consider an ideal gas in a cubic volume V: what is the probability that all molecules assemble in its left half? Particles are indistinguishable. Now by y_k we mean a pair of numbers (N_l, N_r) telling us how many molecules are in the left half of the volume and how many are in the right half. Again our old friend combinatorics helps:

Ω(y_k; E) = Ω({N_l, N_r}; E) = [N! / (N_l! N_r!)] ω(E)

Ω(E) = Σ_{N_r=0}^{N} Ω({N_l, N_r}; E) = 2^N ω(E)

P(y_k) = Ω(y_k; E) / Ω(E) = (1/2^N) N! / (N_l! N_r!)

In particular,

P(y_k = {N, 0}) = 1/2^N

i.e. the spontaneous gathering of all molecules in the left half of the container is an extremely unlikely event for the customary N ~ 10^23.
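To put numbers on this, here is a short sketch (an illustration, not part of the notes) that evaluates P(N_l, N_r) = (1/2^N) N!/(N_l! N_r!) in log space, so it works both for small N and for the astronomically small probabilities of a real gas:

```python
import math

def p_partition(N: int, N_left: int) -> float:
    """P(N_l, N_r) = (1/2^N) * N! / (N_l! * N_r!), computed in log space."""
    N_right = N - N_left
    log_p = (math.lgamma(N + 1) - math.lgamma(N_left + 1)
             - math.lgamma(N_right + 1) - N * math.log(2.0))
    return math.exp(log_p)

# All molecules in the left half: P = 1 / 2^N.
print(p_partition(10, 10))    # ~9.8e-4
print(p_partition(100, 100))  # ~7.9e-31

# For N ~ 10^23 the log-probability says it all:
N = 10**23
print(-N * math.log10(2.0))   # log10 P({N, 0}) ~ -3e22
```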
Large N guarantees a sharp distribution of any parameter y around its peak value
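A hedged numerical illustration of this sharpness, using the left/right partition above: N_l is binomially distributed with mean N/2 and standard deviation √N/2, so the relative width of the distribution shrinks as 1/√N:

```python
import math

# For the binomial distribution of N_l (p = 1/2):
#   <N_l> = N/2,  std(N_l) = sqrt(N)/2,
# so the relative width std(N_l)/<N_l> = 1/sqrt(N).
for N in (10**2, 10**6, 10**23):
    rel_width = 1.0 / math.sqrt(N)
    print(f"N = {N:.0e}: relative width ~ {rel_width:.1e}")
```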
Cumulative density of states

A related useful concept is the cumulative number of states Φ(E), i.e. the number of states whose energy is less than or equal to E. Evidently Ω, the number of states in the range (E, E + δE), is related to Φ by a simple relationship:

Ω(E) = (dΦ(E)/dE) δE

(Q: prove that - 5 pts)
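Without spoiling the 5-point question, here is a small numerical sanity check (my own sketch, assuming an ideal-gas-like cumulative count Φ(E) ∝ E^(3N/2); the constants are arbitrary) that counts states in (E, E + δE) directly and compares with (dΦ/dE) δE:

```python
# Sanity check of Omega(E) ~ (dPhi/dE) * dE for Phi(E) = C * E**(3N/2)
# (ideal-gas-like form; C and N are arbitrary illustrative values).
C, N = 1.0, 10
exponent = 3 * N / 2

def Phi(E: float) -> float:
    """Cumulative number of states with energy <= E."""
    return C * E**exponent

E, dE = 2.0, 1e-4
omega_exact = Phi(E + dE) - Phi(E)                    # states in (E, E + dE)
omega_approx = exponent * C * E**(exponent - 1) * dE  # (dPhi/dE) * dE
print(omega_exact, omega_approx)  # agree up to O(dE/E) corrections
```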