# PS1: ChE 210A Problem Set No. 1 (UCSB, Fall 2011)


Department of Chemical Engineering
ChE 210A, University of California, Santa Barbara
Fall 2011

**Problem Set No. 1** — Due: Monday, 10/03/11

**Objective:** To become familiar with the thermodynamic entropy, its derivatives, and its connection to microscopic, molecular properties.

**Helpful reminders:** Take heed of these statistical counting formulas and approximations:

- ways to pick $g$ objects from $G$, order matters: $\dfrac{G!}{(G-g)!}$
- ways to pick $g$ objects from $G$, order doesn't matter: $\dbinom{G}{g} = \dfrac{G!}{g!\,(G-g)!}$
- ways to pick $g$ objects from $G$, with replacement: $G^g$
- Stirling's approximation for factorials: $\ln G! \approx G \ln G - G$

Also note that the following approximate expression for the combinations formula, valid for large $g$ and $G$, will often greatly simplify your work (you should be able to derive this):

$$\frac{G!}{g!\,(G-g)!} \approx \left[\phi^{\phi}\,(1-\phi)^{1-\phi}\right]^{-G}, \qquad \phi \equiv \frac{g}{G}$$

1. **Statistical antics.** After you put the finishing touches on your first perfectly worked problem set, what's the first thing you're looking forward to exploring in Santa Barbara: (1) outdoorsy activities (hiking, biking, etc.), (2) the beach, (3) restaurants / nightlife, (4) the arts (concerts, theater, visual arts, etc.), (5) shopping, or (6) historic sites?

2. **Fundamentals problem (3 points).** Simple functional forms are often used to correlate thermodynamic properties. In one set of experiments, it is found that a pure substance obeys the following heat capacity and equation-of-state relations:

$$\frac{E}{N} = c\,T + e_0, \qquad P = a\,T\rho^2$$

where $\rho \equiv N/V$ and $c$, $e_0$, and $a$ are $N$-, $T$-, and $V$-independent constants. The first of these expressions invokes the so-called constant heat capacity approximation. Find the underlying entropy function $S(E, V, N)$, up to an $E$- and $V$-independent constant. Be sure to consider that $S$ must have proper extensive behavior.
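As an aside, the large-$G$ approximation to the combinations formula quoted in the helpful reminders can be sanity-checked numerically. This is a minimal sketch, not part of the assignment, and the function names are ours; it compares exact log-factorials (via the log-gamma function) against the two approximations above.

```python
import math

def ln_factorial(n):
    """Exact ln(n!) via the log-gamma function: ln n! = lgamma(n + 1)."""
    return math.lgamma(n + 1)

def stirling(n):
    """Stirling's approximation: ln n! ~ n ln n - n."""
    return n * math.log(n) - n

def ln_binom_exact(G, g):
    """Exact ln[ G! / (g! (G - g)!) ]."""
    return ln_factorial(G) - ln_factorial(g) - ln_factorial(G - g)

def ln_binom_approx(G, g):
    """ln of the large-G approximation [phi^phi (1-phi)^(1-phi)]^(-G), phi = g/G."""
    phi = g / G
    return -G * (phi * math.log(phi) + (1 - phi) * math.log(1 - phi))

# Relative errors shrink as G grows, as expected for a large-G approximation.
for G in (10**3, 10**6, 10**9):
    g = G // 3
    print(G, ln_binom_exact(G, g), ln_binom_approx(G, g))
```

The approximation follows from applying Stirling's formula to each factorial, which is exactly the derivation the reminder invites you to work out.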
3. **Conceptual problem (1 point).** In class, we discussed several properties of the entropy function. One of them was that the entropy is extensive, i.e., $S(\lambda E, \lambda V, \lambda N) = \lambda S(E, V, N)$ for a single-component system. Why is this always the case? You may want to proceed by showing that $\ln \Omega$ is extensive. A rough way to do this is to consider a large, macroscopic system of $\mathcal{O}(10^{23})$ molecules. For conceptual purposes, we will consider this system to be a cube of sugar. You then "scale" the system by copying it $\lambda$ times over into a much larger cube. What can be said about the interfacial interactions between the different copies, relative to the total energy? Write down

