Let g(X, Y) be a function of the discrete random variables X and Y. The expected value of this function is defined as:

\[ E[g(X,Y)] = \sum_x \sum_y g(x,y)\, P_{X,Y}(x,y) \]

A property of expectation is:

\[ E(X+Y) = E(X) + E(Y) \]

This result can be shown as follows, using in the second-to-last step the marginal probability functions \(P_X(x) = \sum_y P_{X,Y}(x,y)\) and \(P_Y(y) = \sum_x P_{X,Y}(x,y)\):

\[
\begin{aligned}
E(X+Y) &= \sum_x \sum_y (x+y)\, P_{X,Y}(x,y) \\
&= \sum_x \sum_y \left[ x\, P_{X,Y}(x,y) + y\, P_{X,Y}(x,y) \right] \\
&= \sum_x x \sum_y P_{X,Y}(x,y) + \sum_y y \sum_x P_{X,Y}(x,y) \\
&= \sum_x x\, P_X(x) + \sum_y y\, P_Y(y) \\
&= E(X) + E(Y)
\end{aligned}
\]
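The derivation above can be checked numerically. The sketch below uses a small hypothetical joint probability function (the dictionary `p_xy` is an illustrative assumption, not from the notes) and verifies that E(X + Y) computed directly from the joint distribution equals E(X) + E(Y) computed from the marginals.

```python
# Hypothetical joint PMF for discrete X and Y, stored as {(x, y): probability}.
# Probabilities sum to 1.
p_xy = {
    (0, 0): 0.2, (0, 1): 0.3,
    (1, 0): 0.1, (1, 1): 0.4,
}

# E(X + Y) computed directly from the joint PMF:
# sum over all (x, y) of (x + y) * P_XY(x, y)
e_sum = sum((x + y) * p for (x, y), p in p_xy.items())

# E(X) and E(Y) via the marginal sums, as in the derivation
e_x = sum(x * p for (x, y), p in p_xy.items())
e_y = sum(y * p for (x, y), p in p_xy.items())

# The two quantities agree (up to floating-point rounding)
print(e_sum, e_x + e_y)
```

Note that this equality holds for any joint distribution; no independence of X and Y is required.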
For fixed constants a and b, a further rule is:

\[ E(aX + bY) = a\,E(X) + b\,E(Y) \]

A general result is that for K random variables \(X_1, X_2, \ldots, X_K\) with means \(\mu_1, \mu_2, \ldots, \mu_K\), the expected value of their sum is:

\[ E(X_1 + X_2 + \cdots + X_K) = \mu_1 + \mu_2 + \cdots + \mu_K \]

Chapter 5

A measure of the linear relationship between two random variables is of interest. For random variables X and Y with means \(\mu_X\) and \(\mu_Y\), the covariance between X and Y is defined as:

\[ \mathrm{Cov}(X,Y) = E[(X-\mu_X)(Y-\mu_Y)] = \sum_x \sum_y (x-\mu_X)(y-\mu_Y)\, P_{X,Y}(x,y) \]

An equiva...
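The covariance definition can likewise be computed from a joint probability function. The sketch below reuses a small hypothetical joint PMF (the dictionary `p_xy` is an assumption for illustration) and evaluates \(\mathrm{Cov}(X,Y) = E[(X-\mu_X)(Y-\mu_Y)]\) directly from the double sum in the definition.

```python
# Hypothetical joint PMF for discrete X and Y, stored as {(x, y): probability}.
p_xy = {
    (0, 0): 0.2, (0, 1): 0.3,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Means mu_X and mu_Y from the marginal distributions
mu_x = sum(x * p for (x, y), p in p_xy.items())
mu_y = sum(y * p for (x, y), p in p_xy.items())

# Covariance from the definition:
# sum over all (x, y) of (x - mu_X)(y - mu_Y) * P_XY(x, y)
cov_xy = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in p_xy.items())
print(cov_xy)
```

A positive value indicates that X and Y tend to move in the same direction; for this particular PMF the covariance works out to 0.05.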
This note was uploaded on 02/06/2014 for the course ECON 325 taught by Professor Whistler during the Spring '10 term at The University of British Columbia.