Chemistry 2000
Lecture 11: Entropy and the second law of thermodynamics
Marc R. Roussel

The thermodynamic description of matter
- In classical thermodynamics, we describe the state of a system by macroscopic variables, which can be measured using ordinary lab equipment. Macroscopic variables include
  - the number of moles of each chemical component in the system
  - the temperature
  - the total pressure
  - the volume
- We typically only need to know a few of the macroscopic variables, since they are connected by equations of state. Examples:
  - PV = nRT for an ideal gas.
  - V = V0[1 + α(T − T0) − κ(P − P0)] for solids or liquids with (T, P) near a reference state (T0, P0), where α is the thermal expansion coefficient and κ is the isothermal compressibility.

The mechanical description of matter
- We can also describe matter by its microscopic state. The microscopic state includes
  - the positions of all particles
  - the momenta of all particles (p = mv)
  - the occupation of all energy levels of the atoms or molecules
- The microscopic state (or just microstate) represents an extraordinarily large number of variables.

The statistical approach
- These two very different ways of describing the same piece of matter (microscopic and macroscopic) can be related using a statistical approach.
- This works because of the very large number of molecules in a typical macroscopic system.

Statistical entropy
- Entropy is a key quantity in thermodynamics.
- The statistical entropy is calculated by S = k_B ln Ω, where k_B is Boltzmann's constant (again) and Ω is the total number of microscopic states consistent with a given macroscopic state.
- S is a measure of our ignorance of the microscopic state at any given time.

Example: Entropy of $12
- Suppose that I tell you that I have $12 in my pocket.
- Your ignorance of how this $12 is composed could be considered a form of entropy.
- Possible microstates of $12:
  - 12 × $1
  - 7 × $1 + 1 × $5
  - 2 × $1 + 2 × $5
  - 2 × $1 + 1 × $10
- S_$12 = k_B ln 4
- In information theory, we use the base-2 logarithm and set k_B = 1 (this corresponds to a change of units for the entropy).
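The counting in this example can be checked with a short script. This is not part of the lecture, just a sketch that enumerates the ways of making $12 from $1, $5, and $10 bills (the denominations used on the slide), then evaluates both the thermodynamic and the information-theoretic entropy:

```python
import math

# Enumerate all microstates (ones, fives, tens) with total value $12,
# using the denominations from the slide's example.
microstates = [
    (ones, fives, tens)
    for tens in range(2)      # at most one $10 bill fits in $12
    for fives in range(3)     # at most two $5 bills
    for ones in range(13)     # at most twelve $1 bills
    if ones + 5 * fives + 10 * tens == 12
]

omega = len(microstates)          # Omega: number of microstates (4 here)
k_B = 1.380649e-23                # Boltzmann's constant in J/K
S = k_B * math.log(omega)         # statistical entropy, S = k_B ln(Omega)
S_bits = math.log2(omega)         # information entropy: base-2 log, k_B = 1

print(microstates)
print(omega, S_bits)              # 4 microstates -> 2 bits
```

Running this recovers the four microstates listed above, so S = k_B ln 4, or exactly 2 bits in information-theory units.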
Fall '06, Roussel, Chemistry