M. Chen (IE@CUHK), ENGG2430C Lecture 6

(tail of the example on the previous slide)
P((X, Y) ∈ A) = ∑_{x=1} p_{X,Y}(x, 7 − x)

Expectation and Variance of Functions of R.V.s

Let X and Y be two random variables, and g(X, Y) be an arbitrary function of X and Y:

  E[g(X, Y)] = ∑_x ∑_y g(x, y) p_{X,Y}(x, y)
  Var(g(X, Y)) = E[g²(X, Y)] − E²[g(X, Y)]

In general, E[g(X, Y)] ≠ g(E[X], E[Y]) and Var(g(X, Y)) ≠ g(Var(X), Var(Y)) (examples).

Let X and Y be two r.v.s with finite E[X] and E[Y].
- Suppose that X and Y are independent. Is E[X + Y] = E[X] + E[Y]?
- Show that for any two random variables X and Y (not necessarily independent), E[X + Y] = E[X] + E[Y].

Right or Wrong?

If X and Y are independent, then:
- E[XY] = E[X] E[Y]
- E[X/Y] = E[X] / E[Y]
- E[g(X) h(Y)] = E[g(X)] E[h(Y)]

PMF and Probability Law

- p_X(x) ⇔ P(A)
- p_{X,Y}(x, y) ⇔ P(A ∩ B)
- p_{X|Y}(x|y) ⇔ P(A|B)

p_{X,Y,Z}(x, y, z) = p_X(x) p_{Y|X}(y|x) p_{Z|X,Y}(z|x, y)

Bayes' law: p_{X|Y}(x|y) = p_{Y|X}(y|x) p_X(x) / p_Y(y)

Total probability law: p_X(x) = ∑ ...
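As a numeric check of the formulas in "Expectation and Variance of Functions of R.V.s", the sketch below evaluates E[g(X, Y)] and Var(g(X, Y)) for a small made-up joint PMF and g(x, y) = xy. The PMF values are purely illustrative (X and Y are deliberately dependent) so the example can also show that E[g(X, Y)] ≠ g(E[X], E[Y]) in general, while linearity E[X + Y] = E[X] + E[Y] still holds.

```python
# Hypothetical joint PMF p_{X,Y}(x, y) over {1,2} x {1,2}; X, Y are dependent.
pmf = {(1, 1): 0.1, (1, 2): 0.4, (2, 1): 0.3, (2, 2): 0.2}

def g(x, y):
    return x * y

# E[g(X,Y)] = sum_x sum_y g(x,y) p_{X,Y}(x,y)
E_g = sum(g(x, y) * p for (x, y), p in pmf.items())

# Var(g(X,Y)) = E[g^2(X,Y)] - E^2[g(X,Y)]
E_g2 = sum(g(x, y) ** 2 * p for (x, y), p in pmf.items())
Var_g = E_g2 - E_g ** 2

# Marginal expectations, to compare E[g(X,Y)] against g(E[X], E[Y])
E_X = sum(x * p for (x, y), p in pmf.items())
E_Y = sum(y * p for (x, y), p in pmf.items())

# Linearity of expectation holds even though X and Y are dependent
E_sum = sum((x + y) * p for (x, y), p in pmf.items())

print(E_g, Var_g)           # E[g] differs from g(E[X], E[Y]) below
print(g(E_X, E_Y))
print(E_sum, E_X + E_Y)     # these two agree
```

For this PMF, E[XY] = 2.3 while E[X]·E[Y] = 2.4, illustrating that pushing an expectation through a nonlinear g is not valid, whereas E[X + Y] matches E[X] + E[Y] exactly.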
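The three "Right or Wrong?" identities can be tested numerically (a spoiler for the quiz). The sketch below builds an independent pair from two made-up marginal PMFs; the first and third identities then hold exactly, while E[X/Y] generally differs from E[X]/E[Y] even under independence.

```python
# Made-up marginal PMFs for an independent pair (values are illustrative).
pX = {1: 0.5, 2: 0.5}
pY = {1: 0.25, 2: 0.75}

# Independence: p_{X,Y}(x, y) = p_X(x) p_Y(y)
joint = {(x, y): pX[x] * pY[y] for x in pX for y in pY}

def expect(f):
    """E[f(X, Y)] under the joint PMF."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

E_X, E_Y = expect(lambda x, y: x), expect(lambda x, y: y)

# Right: E[XY] = E[X]E[Y] when X and Y are independent
E_XY = expect(lambda x, y: x * y)

# Wrong in general: E[X/Y] != E[X]/E[Y], even for independent X and Y
E_ratio = expect(lambda x, y: x / y)

# Right: E[g(X)h(Y)] = E[g(X)]E[h(Y)] for independent X and Y
g, h = (lambda x: x ** 2), (lambda y: y + 1)
E_gh = expect(lambda x, y: g(x) * h(y))
E_g_times_E_h = expect(lambda x, y: g(x)) * expect(lambda x, y: h(y))

print(E_XY, E_X * E_Y)       # equal
print(E_ratio, E_X / E_Y)    # not equal
print(E_gh, E_g_times_E_h)   # equal
```

The middle statement fails because 1/Y is a nonlinear function of Y: under independence E[X/Y] = E[X]·E[1/Y], and E[1/Y] ≠ 1/E[Y] in general (Jensen's inequality).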
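The correspondence in "PMF and Probability Law" means every identity for events carries over to PMFs. The sketch below verifies Bayes' law, p_{X|Y}(x|y) = p_{Y|X}(y|x) p_X(x) / p_Y(y), on an arbitrary made-up joint PMF by computing both sides from the definitions of marginal and conditional PMFs.

```python
# Arbitrary made-up joint PMF p_{X,Y}(x, y)
pmf = {(1, 1): 0.1, (1, 2): 0.4, (2, 1): 0.3, (2, 2): 0.2}

def marginal_X(x):
    return sum(p for (a, b), p in pmf.items() if a == x)

def marginal_Y(y):
    return sum(p for (a, b), p in pmf.items() if b == y)

def p_X_given_Y(x, y):
    # Definition of the conditional PMF: p_{X,Y}(x, y) / p_Y(y)
    return pmf[(x, y)] / marginal_Y(y)

def p_Y_given_X(y, x):
    return pmf[(x, y)] / marginal_X(x)

# Bayes' law: both sides agree at every (x, y)
for (x, y) in pmf:
    lhs = p_X_given_Y(x, y)
    rhs = p_Y_given_X(y, x) * marginal_X(x) / marginal_Y(y)
    assert abs(lhs - rhs) < 1e-12
```

The same functions also confirm the multiplication rule, since p_{X,Y}(x, y) = p_X(x) p_{Y|X}(y|x) is just the conditional-PMF definition rearranged.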