
# Review notes: conditional probability and conditional expectation


## 1 Review: conditional probability and conditional expectation

### 1.1 Conditional probability and distributions

We consider two random variables (rv's) $X$ and $Y$ with joint probability mass function (pmf) or joint probability density function (pdf) $f_{X,Y}(x,y)$. It is known that

- the marginal pmf or pdf of $X$ is
$$f_X(x) = \int_{\text{all } y} f_{X,Y}(x,y)\,dy, \quad \text{or} \quad f_X(x) = \sum_{\text{all } y} f_{X,Y}(x,y);$$
- the marginal pmf or pdf of $Y$ is
$$f_Y(y) = \int_{\text{all } x} f_{X,Y}(x,y)\,dx, \quad \text{or} \quad f_Y(y) = \sum_{\text{all } x} f_{X,Y}(x,y);$$
- the conditional pmf or pdf of $X \mid Y = y$ is
$$f_{X\mid Y}(x\mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}.$$

**Remark 1** If the rv's $X$ and $Y$ are independent, $f_{X,Y}(x,y) = f_X(x)\,f_Y(y)$. As an immediate consequence, we have
$$f_{X\mid Y}(x\mid y) = \frac{f_X(x)\,f_Y(y)}{f_Y(y)} = f_X(x).$$

Using the conditional distribution of $X \mid Y$, the marginal pmf or pdf of $X$ can be expressed as
$$f_X(x) = \int_{\text{all } y} f_{X\mid Y}(x\mid y)\,f_Y(y)\,dy, \quad \text{or} \quad f_X(x) = \sum_{\text{all } y} f_{X\mid Y}(x\mid y)\,f_Y(y). \tag{1}$$

Often $f_{X\mid Y}(x\mid y)$ is a known parametric probability distribution with a given set of parameters. Under the representation (1), the marginal distribution of $X$ is called a *mixed distribution* (or *mixture*).

**Example 2 (two-point mixture)** Assume that $Y$ is a discrete rv with two possible values, namely $y_1$ and $y_2$, with $f_Y(y_1) = p = 1 - f_Y(y_2)$. Then,
$$f_X(x) = f_{X\mid Y}(x\mid y_1)\,f_Y(y_1) + f_{X\mid Y}(x\mid y_2)\,f_Y(y_2) = p\,f_{X\mid Y}(x\mid y_1) + (1-p)\,f_{X\mid Y}(x\mid y_2).$$

**Example 3 (negative binomial)** Suppose that

- $X \mid \Lambda = \lambda$ is Poisson distributed with mean $\lambda$;
- $\Lambda$ is gamma distributed with pdf
$$f_\Lambda(\lambda) = \frac{\beta^\alpha \lambda^{\alpha-1} e^{-\beta\lambda}}{\Gamma(\alpha)}, \quad \lambda > 0,$$
where $\alpha, \beta > 0$ and
$$\Gamma(\alpha) = \int_0^\infty \lambda^{\alpha-1} e^{-\lambda}\,d\lambda.$$

*Aside:* Using integration by parts, $\Gamma(\alpha + 1) = \alpha\,\Gamma(\alpha)$ for $\alpha > 0$.
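The gamma-function recursion in the aside is easy to confirm numerically; a minimal sketch using Python's standard-library `math.gamma` (the particular $\alpha$ values are illustrative choices, not from the notes):

```python
import math

# Check the aside Gamma(alpha + 1) = alpha * Gamma(alpha) for a few alpha > 0.
# The alpha values below are illustrative.
for alpha in (0.5, 1.7, 3.2):
    assert abs(math.gamma(alpha + 1) - alpha * math.gamma(alpha)) < 1e-9

# For integer alpha the recursion telescopes to Gamma(n + 1) = n!.
for n in range(1, 8):
    assert abs(math.gamma(n + 1) - math.factorial(n)) < 1e-6 * math.factorial(n)

print("gamma recursion verified")
```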
It follows that
$$\Gamma(\alpha + n) = (\alpha + n - 1)\,\Gamma(\alpha + n - 1) = (\alpha + n - 1)(\alpha + n - 2)\,\Gamma(\alpha + n - 2) = \cdots$$
For $\alpha$ a positive integer,
$$\Gamma(\alpha + 1) = \alpha\,\Gamma(\alpha) = \alpha(\alpha - 1)\,\Gamma(\alpha - 1) = \alpha(\alpha - 1)\cdots(1)\,\Gamma(1) = \alpha!$$

Find the marginal distribution of $X$:
$$
\begin{aligned}
f_X(x) &= \int_0^\infty \frac{e^{-\lambda}\lambda^x}{x!} \cdot \frac{\beta^\alpha \lambda^{\alpha-1} e^{-\beta\lambda}}{\Gamma(\alpha)}\,d\lambda
= \frac{\beta^\alpha}{\Gamma(\alpha)}\,\frac{1}{x!}\int_0^\infty \lambda^{x+\alpha-1} e^{-(\beta+1)\lambda}\,d\lambda \\
&= \frac{\beta^\alpha}{\Gamma(\alpha)}\,\frac{1}{x!}\,\frac{\Gamma(x+\alpha)}{(\beta+1)^{x+\alpha}}
\underbrace{\int_0^\infty \frac{(\beta+1)^{x+\alpha}\,\lambda^{x+\alpha-1}\,e^{-(\beta+1)\lambda}}{\Gamma(x+\alpha)}\,d\lambda}_{=1} \\
&= \frac{\Gamma(x+\alpha)}{\Gamma(\alpha)}\,\frac{1}{x!}\,\frac{\beta^\alpha}{(\beta+1)^{x+\alpha}}
= \frac{\Gamma(x+\alpha)}{\Gamma(\alpha)}\,\frac{1}{x!}\left(\frac{\beta}{\beta+1}\right)^{\alpha}\left(\frac{1}{\beta+1}\right)^{x} \\
&= \binom{x+\alpha-1}{x}\left(\frac{\beta}{\beta+1}\right)^{\alpha}\left(\frac{1}{\beta+1}\right)^{x}.
\end{aligned}
$$
With this pmf, $X$ is known to be a negative binomial rv.

### 1.2 Conditional expectation

In this section, we assume that $X \mid Y = y$ is a continuous rv with a valid pdf $f_{X\mid Y}(\cdot\mid y)$ (if $X \mid Y$ is discrete, replace all the integral signs by summation signs). The conditional expectation (first moment) of $X \mid Y$ is given by
$$E[X \mid Y = y] = \int_{\text{all } x} x\,f_{X\mid Y}(x\mid y)\,dx.$$
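The Poisson-gamma derivation of Example 3 can be checked numerically against representation (1); a minimal sketch assuming SciPy is available, with illustrative parameter values ($\alpha = 3$, $\beta = 1.5$ are my choices, not from the notes). SciPy's `nbinom` with $n = \alpha$ and $p = \beta/(\beta+1)$ uses the same pmf as the one derived above:

```python
import math
from scipy import stats, integrate

# Illustrative parameters for the gamma mixing distribution (not from the notes).
alpha, beta = 3.0, 1.5

def mixture_pmf(x):
    """f_X(x) via representation (1): integrate the Poisson pmf against the
    gamma mixing density Gamma(alpha, rate beta) over all lambda > 0."""
    integrand = lambda lam: stats.poisson.pmf(x, lam) * stats.gamma.pdf(lam, a=alpha, scale=1.0 / beta)
    val, _ = integrate.quad(integrand, 0.0, math.inf)
    return val

# Closed form derived above: negative binomial with n = alpha, p = beta / (beta + 1).
p = beta / (beta + 1.0)
for x in range(8):
    assert abs(mixture_pmf(x) - stats.nbinom.pmf(x, alpha, p)) < 1e-8

# Sanity check on the mean: E[X] = E[E[X | Lambda]] = E[Lambda] = alpha / beta.
assert abs(stats.nbinom.mean(alpha, p) - alpha / beta) < 1e-9
print("Poisson-gamma mixture matches the negative binomial pmf")
```

The mean check is a preview of the tower property of conditional expectation, which the truncated Section 1.2 develops.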

*This note was uploaded on 02/02/2010 for the course ACTSC 432, taught by Professor David Landriault, during the Spring '09 term at Waterloo.*

