


…d to M. The entropy on the left in case 1 is therefore given by:

$$S_{1,\text{left}} = -Mk_B \left[\, p_{A1,\text{left}} \ln p_{A1,\text{left}} + p_{B1,\text{left}} \ln p_{B1,\text{left}} + p_{\text{empty},\text{left}} \ln p_{\text{empty},\text{left}} \,\right]$$

Because $p_{\text{empty},\text{left}} \approx 1$, the last term is approximately zero ($\ln 1 = 0$). Hence:

$$S_{1,\text{left}} = -Mk_B \left[\, p_{A1,\text{left}} \ln p_{A1,\text{left}} + p_{B1,\text{left}} \ln p_{B1,\text{left}} \,\right]$$

Likewise, the entropy of the right side in state 1 is given by:

$$S_{1,\text{right}} = -Mk_B \left[\, p_{A1,\text{right}} \ln p_{A1,\text{right}} \,\right]$$

(note that there are no B molecules on the right). So, the total entropy in case 1 is:

$$S_1 = S_{1,\text{left}} + S_{1,\text{right}} = -Mk_B \left[\, p_{A1,\text{left}} \ln p_{A1,\text{left}} + p_{B1,\text{left}} \ln p_{B1,\text{left}} + p_{A1,\text{right}} \ln p_{A1,\text{right}} \,\right]$$

By similar reasoning, the entropy for case 2 is:

$$S_2 = -Mk_B \left[\, p_{A2,\text{left}} \ln p_{A2,\text{left}} + p_{B2,\text{left}} \ln p_{B2,\text{left}} + p_{A2,\text{right}} \ln p_{A2,\text{right}} \,\right]$$

But the number of B molecules is the same in state 1 and state 2. Therefore, when we calculate the difference in entropy between the two states, the term involving B…
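The cancellation argument can be checked numerically. The sketch below (the function name `lattice_entropy`, the site count `M`, the choice $k_B = 1$, and all probability values are hypothetical, not from the original problem) evaluates $S = -Mk_B \sum_i p_i \ln p_i$ for two states that share the same B occupancy, and confirms that the B term drops out of $\Delta S = S_2 - S_1$:

```python
import math

def lattice_entropy(M, k_B, probs):
    """S = -M * k_B * sum(p ln p) over the occupation probabilities
    of each species on a lattice of M sites. Terms with p == 0 are
    skipped (p ln p -> 0), matching the treatment of p_empty ~ 1."""
    return -M * k_B * sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical numbers: M sites per side, k_B set to 1 for simplicity.
M, k_B = 1000, 1.0

# Case 1: A and B on the left, only A on the right.
pA1_left, pB1_left, pA1_right = 0.02, 0.01, 0.03
S1 = (lattice_entropy(M, k_B, [pA1_left, pB1_left])
      + lattice_entropy(M, k_B, [pA1_right]))

# Case 2: A has redistributed; B is unchanged (pB2_left == pB1_left).
pA2_left, pB2_left, pA2_right = 0.025, 0.01, 0.025
S2 = (lattice_entropy(M, k_B, [pA2_left, pB2_left])
      + lattice_entropy(M, k_B, [pA2_right]))

dS = S2 - S1

# The B term is identical in S1 and S2, so it cancels in the difference:
dS_A_only = (lattice_entropy(M, k_B, [pA2_left])
             + lattice_entropy(M, k_B, [pA2_right])
             - lattice_entropy(M, k_B, [pA1_left])
             - lattice_entropy(M, k_B, [pA1_right]))

print(abs(dS - dS_A_only) < 1e-9)  # the B term cancels
print(dS > 0)                      # spreading A more evenly raises S
```

Both printed checks come out `True` for these numbers: the difference depends only on the A terms, and moving A toward an even split across the two sides increases the total entropy, since $-p \ln p$ sums are maximized at equal occupancies.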

This note was uploaded on 01/27/2011 for the course MCB 100A taught by Professor Kuriyan during the Spring '09 term at University of California, Berkeley.
