PS-5-2009-solutions

EE 261 The Fourier Transform and its Applications
Fall 2009

Problem Set Five
Due Wednesday, October 28

1. (10 points) Expected values of random variables, orthogonality, and approximation

Let X be a random variable with probability distribution function p(x). Recall that the mean, or expected value, of X is the number

    E(X) = \int x \, p(x) \, dx.

Some important properties of the expected value (not trivial) are:

Linearity: E(X_1 + X_2) = E(X_1) + E(X_2). [Note: there is no assumption that X_1 and X_2 are independent. The additivity of expected values when the random variables are not independent has been called the First Fundamental Mystery of Probability.]

Functional dependence: If Y = f(X) for a function f, then

    E(Y) = \int f(x) \, p(x) \, dx,

where p(x) is the probability distribution function of X.

Let R be the collection of random variables with expected value 0 and finite variance. There is a close analogy between the inner product of two vectors (including functions) and the expected value of the product of two random variables in R; namely, one sets

    (X_1, X_2) = E(X_1 X_2),

the expected value of the product of X_1, X_2 \in R. Two random variables in R are orthogonal if (X_1, X_2) = 0. If X_1 and X_2 are independent, then E(X_1 X_2) = E(X_1) E(X_2) = 0, so independence implies orthogonality in R, which is another reason that both ideas are natural and important.

The norm of X \in R is defined to be

    \|X\| = E(X^2)^{1/2},

which, by the functional dependence property and the assumption E(X) = 0, is the standard deviation of X; its square \|X\|^2 = E(X^2) is the variance of X, and hence is finite (by the definition of R). If X_1 and X_2 in R are orthogonal, the cross term 2 E(X_1 X_2) vanishes, so

    E((X_1 + X_2)^2) = E(X_1^2) + E(X_2^2), that is, \|X_1 + X_2\|^2 = \|X_1\|^2 + \|X_2\|^2,

which is the Pythagorean theorem for random variables. In (other) words: for orthogonal random variables, the variance of the sum is the sum of the variances.

Just as with Fourier series, there is a natural (geometric) application of these ideas. We know that the shortest distance between a plane A and a point p not on A is the length of the line segment through p perpendicular to A. More generally, and stated differently: if you wish to approximate a vector p by a vector \tilde{p} from a lower-dimensional subspace A, the error of this approximation, namely \|p - \tilde{p}\|, will be smallest when p - \tilde{p} is orthogonal to A.

Suppose Y_1, Y_2, \ldots, Y_N are known, orthogonal random variables in R and that X is an unknown random variable in R. Find the c_n so that

    \sum_{n=1}^{N} c_n Y_n

is the best approximation to X.
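[The posted preview ends here. For completeness, a sketch of the standard orthogonal-projection argument the problem calls for; this is an editorial reconstruction of the usual calculation, not the original solution text. Expanding the mean-square error, the cross terms E(Y_n Y_m) for n \neq m vanish by the assumed orthogonality of the Y_n:

    E\Bigl(\bigl(X - \sum_{n=1}^{N} c_n Y_n\bigr)^2\Bigr)
        = E(X^2) - 2 \sum_{n=1}^{N} c_n E(X Y_n) + \sum_{n=1}^{N} c_n^2 E(Y_n^2).

Each c_n now sits in its own quadratic, so minimizing term by term (complete the square, or set the derivative with respect to c_n to zero) gives

    c_n = \frac{E(X Y_n)}{E(Y_n^2)} = \frac{(X, Y_n)}{\|Y_n\|^2}, \qquad n = 1, \ldots, N,

exactly the projection formula for Fourier coefficients; the resulting error X - \sum_n c_n Y_n is then orthogonal to each Y_n.]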
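[A quick numerical sanity check of that formula, again illustrative only and not part of the problem set: the Monte Carlo sketch below builds two orthogonal, mean-zero random variables Y_1, Y_2, forms an X from them plus noise, and confirms that c_n = E(X Y_n)/E(Y_n^2) recovers the combination and leaves an error orthogonal to each Y_n. The variable names and choice of distributions are assumptions made for the example.

    import numpy as np

    # Monte Carlo sanity check of the projection formula (illustrative only).
    rng = np.random.default_rng(0)
    M = 1_000_000                       # number of samples per random variable

    # Independent standard normals have mean 0 and E(Y1*Y2) = 0, so they
    # are orthogonal members of R.
    Y1 = rng.standard_normal(M)
    Y2 = rng.standard_normal(M)

    # An "unknown" X in R: a combination of Y1, Y2 plus independent noise.
    X = 2.0 * Y1 - 0.5 * Y2 + rng.standard_normal(M)

    # Projection coefficients c_n = E(X Y_n) / E(Y_n^2).
    c1 = np.mean(X * Y1) / np.mean(Y1 ** 2)
    c2 = np.mean(X * Y2) / np.mean(Y2 ** 2)
    print(c1, c2)                       # approximately 2.0 and -0.5

    # The approximation error is orthogonal to each Y_n, as the geometry predicts.
    err = X - (c1 * Y1 + c2 * Y2)
    print(np.mean(err * Y1), np.mean(err * Y2))   # both approximately 0
]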