Lecture 21. Independence and Conditioning

The following material on independence is from Chapter 8, Section 2.

Definition. Let X and Y be discrete random variables on the same probability space. Suppose the joint pmf is p(x, y) and the individual pmfs (i.e., the marginals) are p_X(x) and p_Y(y). We say X and Y are independent if

    p(x, y) = p_X(x) p_Y(y).

Comment. This definition should remind you of what we said about recognizing independence in a table.

Problem. Suppose X and Y take the values 1, 2, 3. Make up and depict in a table a joint pmf for X and Y such that X and Y are: (i) independent, (ii) not independent.

Definition. Let X and Y be continuous random variables on the same probability space. Suppose the joint pdf is f(x, y) and the individual pdfs (i.e., the marginals) are f_X(x) and f_Y(y). We say X and Y are independent if

    f(x, y) = f_X(x) f_Y(y).

...
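As a minimal sketch of the Problem above (not part of the lecture; the pmf tables chosen here are my own illustrative assumptions), the following code builds two 3x3 joint pmfs for X and Y taking values 1, 2, 3 and checks the discrete criterion p(x, y) = p_X(x) p_Y(y) cell by cell:

```python
def marginals(joint):
    """Marginal pmfs p_X and p_Y from a joint pmf given as {(x, y): prob}."""
    xs = sorted({x for x, _ in joint})
    ys = sorted({y for _, y in joint})
    p_X = {x: sum(joint[(x, y)] for y in ys) for x in xs}
    p_Y = {y: sum(joint[(x, y)] for x in xs) for y in ys}
    return p_X, p_Y

def is_independent(joint, tol=1e-12):
    """True iff p(x, y) = p_X(x) * p_Y(y) holds in every cell (up to tol)."""
    p_X, p_Y = marginals(joint)
    return all(abs(p - p_X[x] * p_Y[y]) < tol for (x, y), p in joint.items())

vals = (1, 2, 3)

# (i) Independent: every cell is the product of its marginals
#     (here p_X(x) = x/6 and p_Y(y) = y/6, since 1 + 2 + 3 = 6).
indep = {(x, y): (x / 6) * (y / 6) for x in vals for y in vals}

# (ii) Not independent: all mass on the diagonal, so knowing X fixes Y.
dep = {(x, y): (1 / 3 if x == y else 0.0) for x in vals for y in vals}

print(is_independent(indep))  # True
print(is_independent(dep))    # False
```

In case (ii) the marginals are uniform (p_X(x) = p_Y(y) = 1/3), so the product p_X(x) p_Y(y) = 1/9 in every cell, which never matches the diagonal values of 1/3 and 0; a single failing cell is enough to rule out independence.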
This note was uploaded on 11/29/2011 for the course MATH 3355 taught by Professor Britt during the Spring '08 term at LSU.