PS-5-2009

EE 261 The Fourier Transform and its Applications
Fall 2009
Problem Set Five
Due Wednesday, October 28

1. (10 points) Expected values of random variables, orthogonality, and approximation

Let $X$ be a random variable with probability distribution function $p(x)$. Recall that the mean, or expected value, of $X$ is the number
$$E(X) = \int x\, p(x)\, dx.$$
Some important (and not trivial) properties of the expected value are:

Linearity: $E(X_1 + X_2) = E(X_1) + E(X_2)$. [Note: there is no assumption that $X_1$ and $X_2$ are independent. The additivity of expected values when the random variables are not independent has been called the First Fundamental Mystery of Probability.]

Functional dependence: If $Y = f(X)$ for a function $f$, then
$$E(Y) = \int f(x)\, p(x)\, dx,$$
where $p(x)$ is the probability distribution function of $X$.

Let $R$ be the collection of random variables with expected value 0 and finite variance. There is a close analogy between the inner product of two vectors (including functions) and the expected value of the product of two random variables in $R$; namely, one sets
$$(X_1, X_2) = E(X_1 X_2),$$
the expected value of the product of $X_1, X_2 \in R$. Two random variables in $R$ are orthogonal if $(X_1, X_2) = 0$. If $X_1$ and $X_2$ are independent then...
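As a quick numerical illustration (not part of the original problem set), here is a minimal Monte Carlo sketch of the inner product $(X_1, X_2) = E(X_1 X_2)$ for two independent, mean-zero random variables. The specific distributions (a standard normal and a uniform on $[-1, 1]$) and the use of NumPy are assumptions made only for this example.

```python
import numpy as np

# Sketch only: estimate E(X1 + X2) and the "inner product" (X1, X2) = E(X1 X2)
# by Monte Carlo for two independent, mean-zero random variables.
# Distribution choices are illustrative assumptions, not from the problem set.

rng = np.random.default_rng(0)
n = 1_000_000

x1 = rng.standard_normal(n)          # mean 0, variance 1
x2 = rng.uniform(-1.0, 1.0, n)       # mean 0, variance 1/3

# Linearity of expectation: E(X1 + X2) = E(X1) + E(X2) = 0 here.
print("sample E(X1 + X2):", (x1 + x2).mean())

# Inner product (X1, X2) = E(X1 X2); for independent mean-zero variables
# the sample estimate should be close to 0, i.e. X1 and X2 are orthogonal.
print("sample E(X1 X2):  ", (x1 * x2).mean())
```

With a large sample, both printed estimates come out close to 0, consistent with linearity of expectation and with independent mean-zero random variables being orthogonal under the inner product defined above.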