EE 278 Lecture Notes 4: Expectation

- Expectation: Definition and Properties
- Mean and Variance
- Markov and Chebyshev Inequalities
- Covariance and Correlation
- Conditional Expectation

Expectation

Let X be a discrete r.v. with pmf p_X(x) and let g(x) be a function of x. The expectation (or expected value, or mean) of g(X) is defined as

    E(g(X)) = \sum_{x \in \mathcal{X}} g(x) \, p_X(x)

For a continuous r.v. X ~ f_X(x), the expected value of g(X) is defined as

    E(g(X)) = \int_{-\infty}^{\infty} g(x) \, f_X(x) \, dx

Properties of expectation:
- If c is a constant, then E(c) = c.
- Expectation is linear, i.e., for any constant a,

    E[a g_1(X) + g_2(X)] = a E(g_1(X)) + E(g_2(X))

Fundamental Theorem of Expectation

If Y = g(X) ~ p_Y(y), then

    E(Y) = \sum_{y \in \mathcal{Y}} y \, p_Y(y) = \sum_{x \in \mathcal{X}} g(x) \, p_X(x) = E(g(X))

The corresponding formula for a continuous r.v. Y ~ f_Y(y) uses integrals instead of sums:

    E(Y) = \int_{-\infty}^{\infty} y \, f_Y(y) \, dy

Proof: We prove the theorem for discrete r.v.s. Consider

    E(Y) = \sum_y y \, p_Y(y)
         = \sum_y y \sum_{\{x : g(x) = y\}} p_X(x)
         = \sum_y \sum_{\{x : g(x) = y\}} y \, p_X(x)
         = \sum_y \sum_{\{x : g(x) = y\}} g(x) \, p_X(x)
         = \sum_x g(x) \, p_X(x)

Thus E(Y) = E(g(X)) can be found using either p_X(x) or p_Y(y) (either f_X(x) or f_Y(y) in the continuous case). It is often much easier to use f_X(x) than to first find f_Y(y) and then compute E(Y).

Remark: A r.v. is completely specified by its cdf (pdf, pmf), so why do we need expectation?
- Expectation provides a summary (or an estimate) of the r.v.: a single number instead of the entire distribution.
- It is far easier to estimate the expectation of a r.v. from data than to estimate its distribution.
- Expectation can be used to bound or estimate probabilities of interesting events (as we shall see).

Mean and Variance

The first moment (or mean) of X ~ f_X(x) is the expectation

    E(X) = \int_{-\infty}^{\infty} x \, f_X(x) \, dx

The second moment (or mean square, or average power) of X is

    E(X^2) = \int_{-\infty}^{\infty} x^2 \, f_X(x) \, dx

The variance of X is

    Var(X) = E[(X - E(X))^2]
           = E[X^2 - 2 X E(X) + (E(X))^2]
           = E(X^2) - 2 (E(X))^2 + (E(X))^2
           = E(X^2) - (E(X))^2

The standard deviation of X is defined as \sigma_X = \sqrt{Var(X)}, i.e., Var(X) = \sigma_X^2.

Mean and Variance for Famous R.V.s

    Random Variable     Mean          Variance
    Bern(p)             p             p(1 - p)
    Geom(p)             1/p           (1 - p)/p^2
    Binom(n, p)         np            np(1 - p)
    Poisson(\lambda)    \lambda       \lambda
    U[a, b]             (a + b)/2     (b - a)^2 / 12
    Exp(\lambda)        1/\lambda     1/\lambda^2
    N(\mu, \sigma^2)    \mu           \sigma^2

Expectation Can Be Infinite or May Not Exist

Expectation can be infinite. For example, for

    f_X(x) = 1/x^2 for 1 \le x < \infty, and 0 otherwise,

we have

    E(X) = \int_1^{\infty} x \cdot (1/x^2) \, dx = \int_1^{\infty} (1/x) \, dx = \infty

Expectation may not exist. To find conditions for the expectation to exist, consider

    E(X) = \int_{-\infty}^{\infty} x \, f_X(x) \, dx = -\int_{-\infty}^{0} |x| \, f_X(x) \, dx + \int_{0}^{\infty} |x| \, f_X(x) \, dx ...
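The fundamental theorem of expectation is easy to check numerically. Below is a minimal Python sketch (the pmf and the function g are made up for illustration): it computes E(g(X)) directly from p_X and, separately, by first constructing p_Y for Y = g(X) and computing E(Y); the two answers agree.

```python
from collections import defaultdict

# Illustrative pmf for a small discrete r.v. X (values chosen arbitrarily).
p_X = {-2: 0.2, -1: 0.1, 0: 0.3, 1: 0.1, 2: 0.3}
g = lambda x: x * x  # Y = g(X) = X^2

# Method 1: E(g(X)) directly from p_X -- no need to find p_Y at all.
E_gX = sum(g(x) * p for x, p in p_X.items())

# Method 2: first derive p_Y by collecting mass over {x : g(x) = y},
# then compute E(Y) = sum_y y * p_Y(y).
p_Y = defaultdict(float)
for x, p in p_X.items():
    p_Y[g(x)] += p
E_Y = sum(y * p for y, p in p_Y.items())

print(E_gX, E_Y)  # the two methods agree (both equal 2.2 here)
assert abs(E_gX - E_Y) < 1e-12
```

As the notes say, method 1 is usually the easier route, since finding the distribution of Y = g(X) is an extra (and often harder) step.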
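The mean/variance table can be spot-checked by simulation. A small sketch using only the standard library (the parameter values p = 0.3 and lam = 2.0 are arbitrary choices for the demo): sample means and variances for Geom(p) and Exp(lambda) land near the table's formulas.

```python
import random

random.seed(0)
N = 200_000

# Geom(p): number of Bernoulli(p) trials up to and including the first
# success.  Table: mean = 1/p, variance = (1 - p)/p^2.
p = 0.3

def geom(p):
    k = 1
    while random.random() >= p:
        k += 1
    return k

gs = [geom(p) for _ in range(N)]
g_mean = sum(gs) / N
g_var = sum((s - g_mean) ** 2 for s in gs) / N
print(g_mean, 1 / p)            # sample mean vs 1/p
print(g_var, (1 - p) / p ** 2)  # sample variance vs (1 - p)/p^2

# Exp(lam).  Table: mean = 1/lam, variance = 1/lam^2.
lam = 2.0
es = [random.expovariate(lam) for _ in range(N)]
e_mean = sum(es) / N
e_var = sum((s - e_mean) ** 2 for s in es) / N
print(e_mean, 1 / lam)          # sample mean vs 1/lam
print(e_var, 1 / lam ** 2)      # sample variance vs 1/lam^2
```

With N = 200,000 samples the estimates typically agree with the table to a couple of decimal places, which is consistent with the remark that expectations are easy to estimate from data.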
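The infinite-expectation example can also be seen empirically. A sketch using inverse-cdf sampling: for f_X(x) = 1/x^2 on [1, infinity), the cdf is F(x) = 1 - 1/x, so X = 1/(1 - U) with U ~ Uniform[0, 1) has this density. Because E(X) = infinity, the law of large numbers does not apply, and the running sample mean drifts upward (roughly like log n) instead of converging.

```python
import random

# X = 1/(1 - U) has density f_X(x) = 1/x^2 on [1, inf), whose mean
# diverges; the running sample mean therefore never settles down.
random.seed(1)

running = {}
total = 0.0
for n in range(1, 1_000_001):
    total += 1.0 / (1.0 - random.random())  # one draw of X = 1/(1 - U)
    if n in (100, 10_000, 1_000_000):
        running[n] = total / n

for n, m in sorted(running.items()):
    print(f"n = {n:>9}: running sample mean = {m:.2f}")
```

Contrast this with the simulations above for Geom and Exp, where the sample mean stabilizes near the true mean as n grows.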