Exercises Chapter 2

Source: P.D. Hoff, A First Course in Bayesian Statistical Methods, Springer Texts in Statistics, DOI 10.1007/978-0-387-92407-6, © Springer Science+Business Media, LLC 2009.

2.1 Marginal and conditional probability: The social mobility data from Section 2.5 gives a joint probability distribution on (Y1, Y2) = (father's occupation, son's occupation). Using this joint distribution, calculate the following distributions:
a) the marginal probability distribution of a father's occupation;
b) the marginal probability distribution of a son's occupation;
c) the conditional distribution of a son's occupation, given that the father is a farmer;
d) the conditional distribution of a father's occupation, given that the son is a farmer.

2.2 Expectations and variances: Let Y1 and Y2 be two independent random variables, such that E[Yi] = μi and Var[Yi] = σi². Using the definitions of expectation and variance, compute the following quantities, where a1 and a2 are given constants:
a) E[a1 Y1 + a2 Y2], Var[a1 Y1 + a2 Y2];
b) E[a1 Y1 − a2 Y2], Var[a1 Y1 − a2 Y2].

2.3 Full conditionals: Let X, Y, Z be random variables with joint density (discrete or continuous) p(x, y, z) ∝ f(x, z) g(y, z) h(z). Show that
a) p(x | y, z) ∝ f(x, z), i.e. p(x | y, z) is a function of x and z;
b) p(y | x, z) ∝ g(y, z), i.e. p(y | x, z) is a function of y and z;
c) X and Y are conditionally independent given Z.

2.4 Symbolic manipulation: Prove the following form of Bayes' rule:

    Pr(H_j | E) = Pr(E | H_j) Pr(H_j) / Σ_{k=1}^{K} Pr(E | H_k) Pr(H_k),

where E is any event and {H_1, ..., H_K} form a partition. Prove this using only axioms P1–P3 from this chapter, by following steps a)–d) below:
a) Show that Pr(H_j | E) Pr(E) = Pr(E | H_j) Pr(H_j).
b) Show that Pr(E) = Pr(E ∩ H_1) + Pr(E ∩ {∪_{k=2}^{K} H_k}).
c) Show that Pr(E) = Σ_{k=1}^{K} Pr(E ∩ H_k).
d) Put it all together to show Bayes' rule, as described above.

2.5 Urns: Suppose urn H is filled with 40% green balls and 60% red balls, and urn T is filled with 60% green balls and 40% red balls. Someone will flip a coin and then select a ball from urn H or urn T depending on whether the coin lands heads or tails, respectively. Let X be 1 or 0 if the coin lands heads or tails, and let Y be 1 or 0 if the ball is green or red.
a) Write out the joint distribution of X and Y in a table.
b) Find E[Y]. What is the probability that the ball is green?
c) Find Var[Y | X = 0], Var[Y | X = 1] and Var[Y]. Thinking of variance as measuring uncertainty, explain intuitively why one of these variances is larger than the others.
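After deriving the closed forms in Exercise 2.2 by hand, they can be sanity-checked with a quick Monte Carlo sketch. The parameter values below (μ1, σ1, μ2, σ2, a1, a2) are made up for illustration only, and normal distributions are chosen purely for convenience; the identities hold for any independent Y1, Y2 with the stated moments.

```python
import random
import statistics

# Illustrative parameters (not from the exercise text):
mu1, sig1 = 2.0, 1.0
mu2, sig2 = -1.0, 0.5
a1, a2 = 3.0, 2.0

random.seed(1)
n = 100_000
# Draw independent Y1, Y2 and form the linear combination a1*Y1 + a2*Y2
samples = [a1 * random.gauss(mu1, sig1) + a2 * random.gauss(mu2, sig2)
           for _ in range(n)]

est_mean = statistics.fmean(samples)
est_var = statistics.pvariance(samples)

# Closed forms the exercise asks you to derive:
#   E[a1*Y1 + a2*Y2]   = a1*mu1 + a2*mu2                    (= 4.0 here)
#   Var[a1*Y1 + a2*Y2] = a1^2*sig1^2 + a2^2*sig2^2          (= 10.0 here;
#                        the cross term vanishes by independence)
print(est_mean, a1 * mu1 + a2 * mu2)
print(est_var, a1**2 * sig1**2 + a2**2 * sig2**2)
```

The Monte Carlo estimates should agree with the closed forms up to sampling error of order 1/√n; the same sketch with a minus sign checks part b), where the variance formula is unchanged because (−a2)² = a2².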
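The form of Bayes' rule in Exercise 2.4 can also be checked numerically on a small partition. The priors Pr(H_k) and likelihoods Pr(E | H_k) below are arbitrary illustration values, not part of the exercise:

```python
# Hypothetical three-element partition (illustration values only):
prior = [0.5, 0.3, 0.2]   # Pr(H_k); must sum to 1
like  = [0.9, 0.5, 0.1]   # Pr(E | H_k)

# Step c) of the exercise: Pr(E) = sum_k Pr(E | H_k) Pr(H_k)
pr_e = sum(l * p for l, p in zip(like, prior))

# Bayes' rule for each hypothesis in the partition
post = [l * p / pr_e for l, p in zip(like, prior)]

print(pr_e)        # total probability of E
print(sum(post))   # posteriors over a partition sum to 1
```

That the posteriors sum to exactly 1 is just the denominator doing its job as a normalizing constant, which is the content of steps b)–d).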
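For Exercise 2.5, the joint distribution follows directly from the stated percentages via Pr(X = x, Y = y) = Pr(Y = y | X = x) Pr(X = x). The sketch below tabulates it with exact fractions; note it effectively answers parts a) and b), so treat it as a check after attempting the exercise by hand.

```python
from fractions import Fraction

# From the problem statement: fair coin, urn H (X=1) is 40% green,
# urn T (X=0) is 60% green.
p_green_given_x = {1: Fraction(2, 5), 0: Fraction(3, 5)}

# Joint table: Pr(X=x, Y=y) = Pr(Y=y | X=x) * Pr(X=x)
joint = {}
for x in (0, 1):
    for y in (0, 1):
        p_y = p_green_given_x[x] if y == 1 else 1 - p_green_given_x[x]
        joint[(x, y)] = Fraction(1, 2) * p_y

# Marginal Pr(Y=1): probability the ball is green (equals E[Y],
# since Y is an indicator variable)
p_green = joint[(0, 1)] + joint[(1, 1)]
print(joint)
print(p_green)   # 1/2
```

Exact fractions make the table easy to verify by eye, and the marginal Pr(Y = 1) = 1/2 is a useful baseline when comparing Var[Y] against the conditional variances in part c).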
This note was uploaded on 11/24/2010 for the course STAT 201a taught by Professor Wu during the Spring '10 term at Pasadena City College.