
The first and second moments, or equivalently, the mean and variance, of a single random variable convey important information about the distribution of the variable, and the moments are often simpler to deal with than pmfs, pdfs, or CDFs. Use of moments is even more important when considering more than one random variable at a time. That is because joint distributions are much more complex than distributions for individual random variables.

Let X and Y be random variables with finite second moments. Three important related quantities are:

the correlation: E[XY]
the covariance: Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
the correlation coefficient: ρ_{X,Y} = Cov(X, Y)/(σ_X σ_Y) = Cov(X, Y)/√(Var(X) Var(Y)).

Covariance generalizes variance, in the sense that Var(X) = Cov(X, X). Recall that there are useful shortcuts for computing variance:

Var(X) = E[X(X − E[X])] = E[X²] − E[X]².

Similar shortcuts exist for computing covariances:

Cov(X, Y) = E[X(Y − E[Y])] = E[(X − E[X])Y] = E[XY] − E[X]E[Y].

In particular, if either X or Y has zero mean, then Cov(X, Y) = E[XY].
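A minimal numerical sketch (not part of the original notes) illustrating these formulas: it estimates the covariance two ways, by the definition and by the shortcut E[XY] − E[X]E[Y], and then the correlation coefficient, using sample averages as stand-ins for the expectations. The particular model Y = 2X + noise is just an assumed example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated samples: Y depends linearly on X plus independent noise.
x = rng.normal(size=10_000)
y = 2.0 * x + rng.normal(size=10_000)

# Covariance via the definition E[(X - E[X])(Y - E[Y])] ...
cov_def = np.mean((x - x.mean()) * (y - y.mean()))
# ... and via the shortcut E[XY] - E[X]E[Y].
cov_short = np.mean(x * y) - x.mean() * y.mean()

# Correlation coefficient rho = Cov(X, Y) / (sigma_X * sigma_Y).
rho = cov_def / (x.std() * y.std())

print(cov_def, cov_short)  # the two covariance estimates agree
print(rho)                 # close to 2/sqrt(5) ≈ 0.894 for this model
```

For this model, Cov(X, Y) = 2 and Var(Y) = 5, so ρ_{X,Y} = 2/√5 ≈ 0.894, which the sample estimate should approximate.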

