
Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis (Spring 2006)

Problem Set 8 Solutions

1. Let $A_t$ (respectively, $B_t$) be a Bernoulli random variable that is equal to 1 if and only if the $t$th toss resulted in 1 (respectively, 2). We have $E[A_t B_t] = 0$ (since $A_t = 1$ implies $B_t = 0$) and
$$E[A_t B_s] = E[A_t]\,E[B_s] = \frac{1}{k}\cdot\frac{1}{k} \qquad \text{for } s \neq t.$$
Thus,
$$E[X_1 X_2] = E[(A_1 + \cdots + A_n)(B_1 + \cdots + B_n)] = n\,E[A_1(B_1 + \cdots + B_n)] = n(n-1)\,\frac{1}{k}\cdot\frac{1}{k},$$
and
$$\mathrm{cov}(X_1, X_2) = E[X_1 X_2] - E[X_1]\,E[X_2] = \frac{n(n-1)}{k^2} - \frac{n^2}{k^2} = -\frac{n}{k^2}.$$
(A short simulation that checks this value numerically is sketched after the solutions below.)

2. (a) The minimum mean squared error estimator $g(Y)$ is known to be $g(Y) = E[X \mid Y]$. Let us first find $f_{X,Y}(x,y)$. Since $Y = X + W$, we can write
$$f_{Y\mid X}(y \mid x) = \begin{cases} \frac{1}{2}, & \text{if } x-1 \le y \le x+1; \\ 0, & \text{otherwise,} \end{cases}$$
and, therefore,
$$f_{X,Y}(x,y) = f_{Y\mid X}(y \mid x)\,f_X(x) = \begin{cases} \frac{1}{10}, & \text{if } x-1 \le y \le x+1 \text{ and } 5 \le x \le 10; \\ 0, & \text{otherwise,} \end{cases}$$
as shown in the plot below.

[Figure: the joint PDF $f_{X,Y}(x,y) = 1/10$ on the parallelogram $5 \le x \le 10$, $x-1 \le y \le x+1$ in the $(x,y)$ plane.]

We now compute $E[X \mid Y]$ by first determining $f_{X\mid Y}(x \mid y)$. This can be done by looking at the horizontal line crossing the joint PDF at height $y$. Since $f_{X,Y}(x,y)$ is uniform over the defined region, $f_{X\mid Y}(x \mid y)$ is uniform as well, over the corresponding segment. Therefore,
$$g(y) = E[X \mid Y = y] = \begin{cases} \dfrac{5 + (y+1)}{2}, & \text{if } 4 \le y < 6; \\[4pt] y, & \text{if } 6 \le y \le 9; \\[4pt] \dfrac{10 + (y-1)}{2}, & \text{if } 9 < y \le 11. \end{cases}$$
The plot of $g(y)$ is shown here. (A numerical check of $g(y)$ against simulated data is also sketched after the solutions below.)

[Figure: $g(y)$ for $4 \le y \le 11$, piecewise linear with breakpoints at $y = 6$ and $y = 9$.]

(b) The linear least squares estimator has the form
$$g_L(Y) = E[X] + \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}\,(Y - E[Y]),$$
where $\mathrm{cov}(X,Y) = E[(X - E[X])(Y - E[Y])]$. We compute $E[X] = 7.5$, $E[W] = 0$, and $E[Y] = E[X] + E[W] = 7.5$. Also, $\sigma_X^2 = (10-5)^2/12 = 25/12$, $\sigma_W^2 = (1-(-1))^2/12 = 4/12$, and, using the fact that $X$ and $W$ are independent, $\sigma_Y^2 = \sigma_X^2 + \sigma_W^2 = 29/12$. Furthermore,
$$\mathrm{cov}(X,Y) = E[(X - E[X])(Y - E[Y])] = E[(X - E[X])(X - E[X] + W - E[W])] = E\big[(X - E[X])^2\big] + E[(X - E[X])(W - E[W])] = \sigma_X^2 + E[X - E[X]]\,E[W - E[W]] = \sigma_X^2 = \frac{25}{12}.$$
Note that we use the fact that $(X - E[X])$ and $(W - E[W])$ are independent and $E[X - E[X]] = 0 = E[W - E[W]]$. Therefore,
$$g_L(Y) = 7.5 + \frac{25}{29}\,(Y - 7.5).$$
The linear estimator $g_L(Y)$ is compared with $g(Y)$ in the following figure. Note that $g(Y)$ is piecewise linear in this problem. (A simulation comparing the two estimators is sketched at the end of these solutions.)
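Numerical sketch for Problem 1: a minimal Monte Carlo check, in Python, that the sample covariance of $X_1$ (number of 1s) and $X_2$ (number of 2s) over $n$ tosses of a fair $k$-sided die is close to $-n/k^2$. The particular values of n, k, the number of trials, and the random seed are illustrative choices, not values taken from the problem statement.

import numpy as np

# Monte Carlo check of cov(X1, X2) = -n/k^2 for n tosses of a fair k-sided die.
rng = np.random.default_rng(0)
n, k, trials = 20, 6, 200_000          # illustrative parameter choices

tosses = rng.integers(1, k + 1, size=(trials, n))   # each row: one experiment of n tosses
x1 = (tosses == 1).sum(axis=1)                       # number of tosses equal to 1
x2 = (tosses == 2).sum(axis=1)                       # number of tosses equal to 2

print("empirical cov  :", np.cov(x1, x2)[0, 1])
print("theoretical    :", -n / k**2)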
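Numerical sketch for Problem 2(a): a check of the piecewise formula for $g(y) = E[X \mid Y = y]$ against simulated data. It assumes, as the densities above indicate, that $X$ is uniform on $[5, 10]$, $W$ is uniform on $[-1, 1]$, and $X$ and $W$ are independent with $Y = X + W$; the sample size, window width, seed, and test points are illustrative.

import numpy as np

def g(y):
    """Piecewise-linear conditional expectation E[X | Y = y] derived in part (a)."""
    if 4 <= y < 6:
        return (5 + (y + 1)) / 2
    if 6 <= y <= 9:
        return y
    if 9 < y <= 11:
        return (10 + (y - 1)) / 2
    raise ValueError("y outside the support [4, 11]")

rng = np.random.default_rng(1)
x = rng.uniform(5, 10, size=1_000_000)   # assumed X ~ Uniform[5, 10]
w = rng.uniform(-1, 1, size=x.size)      # assumed W ~ Uniform[-1, 1], independent of X
y = x + w

for y0 in (4.5, 5.5, 7.0, 9.5, 10.5):
    mask = np.abs(y - y0) < 0.05         # samples with Y in a small window around y0
    print(f"y0={y0}: empirical E[X | Y ~ y0] = {x[mask].mean():.3f}, g(y0) = {g(y0):.3f}")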

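Numerical sketch for Problem 2(b): a comparison of the linear least squares estimator $g_L(Y) = 7.5 + \frac{25}{29}(Y - 7.5)$ with the MMSE estimator $g(Y)$ of part (a), under the same assumed model ($X$ uniform on $[5, 10]$, $W$ uniform on $[-1, 1]$, independent). It also prints the sample $\mathrm{cov}(X, Y)$ next to the theoretical value $25/12$; as expected, $g(Y)$ attains a mean squared error no larger than that of $g_L(Y)$. Sample size and seed are illustrative.

import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(5, 10, size=1_000_000)
w = rng.uniform(-1, 1, size=x.size)
y = x + w

# Linear least squares estimate from part (b).
g_lin = 7.5 + (25 / 29) * (y - 7.5)

# Piecewise MMSE estimate from part (a), vectorized over the sample.
g_mmse = np.where(y < 6, (5 + (y + 1)) / 2,
          np.where(y <= 9, y, (10 + (y - 1)) / 2))

print("sample cov(X, Y):", np.cov(x, y)[0, 1], " (theory: 25/12 =", 25 / 12, ")")
print("MSE of g_L(Y)   :", np.mean((x - g_lin) ** 2))
print("MSE of g(Y)     :", np.mean((x - g_mmse) ** 2))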
