slides21 - Lecture Stat 302 Introduction to Probability -...


Lecture Stat 302 Introduction to Probability - Slides 21
AD, April 2010

Conditional Distributions

Given two random variables X, Y:

                   Discrete                                  Continuous
Joint:             p(x, y)                                   f(x, y)
Conditional:       p_{X|Y}(x|y) = p(x, y) / p_Y(y)           f_{X|Y}(x|y) = f(x, y) / f_Y(y)
Expectation:       E(g(X)|y) = sum_x g(x) p_{X|Y}(x|y)       E(g(X)|y) = ∫ g(x) f_{X|Y}(x|y) dx

Example: A Toy Example

Let X and Y be two random variables with joint p.d.f.

    f(x, y) = 1/x   for 0 < y < x < 1,   and f(x, y) = 0 otherwise.

(a) Establish the expression of f_X(x).
(b) Establish the expression of f_Y(y).
(c) Establish the expressions of f_{X|Y}(x|y) and E(X|Y = y).
(d) Establish the expressions of f_{Y|X}(y|x) and E(Y|X = x).

(a) For 0 < x < 1,

    f_X(x) = ∫_{-∞}^{∞} f(x, y) dy = (1/x) ∫_0^x dy = 1,

and f_X(x) = 0 elsewhere; that is, X is uniform on (0, 1).

(b) For 0 < y < 1,

    f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_y^1 (1/x) dx = -log y,

and f_Y(y) = 0 elsewhere.

(c) For 0 < y < x < 1,

    f_{X|Y}(x|y) = f(x, y) / f_Y(y) = -1 / (x log y),

and f_{X|Y}(x|y) = 0 elsewhere. Hence

    E(X|Y = y) = ∫_y^1 x f_{X|Y}(x|y) dx = -(1 - y) / log y.

(d) For 0 < y < x < 1,

    f_{Y|X}(y|x) = f(x, y) / f_X(x) = 1/x,

and f_{Y|X}(y|x) = 0 elsewhere; that is, given X = x, Y is uniform on (0, x). Hence

    E(Y|X = x) = ∫_0^x y f_{Y|X}(y|x) dy = x/2.

Example: Bayesian Signal Estimation

Let X be a random signal. We do not observe X directly but have access to a noisy measurement Y = X + N, where N is a random noise. Assume that X is a normal r.v. with parameters (m, σ_x²) and N is normal with mean 0 and variance...
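The closed-form answers in the toy example can be checked numerically. The sketch below (not part of the slides; the function names and the midpoint-rule integrator are my own) integrates the joint density f(x, y) = 1/x over 0 < y < x < 1 and compares the results against the derived formulas f_Y(y) = -log y and E(X|Y = y) = -(1 - y)/log y.

```python
import math

# Joint density from the toy example: f(x, y) = 1/x on 0 < y < x < 1, else 0.
def f_joint(x, y):
    return 1.0 / x if 0 < y < x < 1 else 0.0

def f_Y(y, n=100_000):
    """Marginal of Y: integrate f(x, y) over x in (y, 1) by the midpoint rule."""
    h = (1.0 - y) / n
    total = 0.0
    for i in range(n):
        x = y + (i + 0.5) * h
        total += f_joint(x, y)
    return total * h

def E_X_given_Y(y, n=100_000):
    """E(X | Y = y) = integral over (y, 1) of x * f(x, y) / f_Y(y) dx."""
    fy = f_Y(y, n)
    h = (1.0 - y) / n
    total = 0.0
    for i in range(n):
        x = y + (i + 0.5) * h
        total += x * f_joint(x, y) / fy
    return total * h

y = 0.2
print(f_Y(y))          # ≈ -log(0.2) ≈ 1.6094
print(E_X_given_Y(y))  # ≈ -(1 - 0.2)/log(0.2) ≈ 0.4971
```

The numerical integrals should agree with the closed forms to several decimal places, which is a quick sanity check on the sign bookkeeping (recall log y < 0 for 0 < y < 1, so -log y and -(1 - y)/log y are both positive).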

This note was uploaded on 10/21/2010 for the course STAT Stat302 taught by Professor 222 during the Spring '10 term at UBC.

