

Thus we have

\[
M_X(s) \;=\; M_N(s)\Big|_{e^s = M_{T_i}(s)} \;=\; \left(\frac{1}{3} + \frac{2}{3}\cdot\frac{3e^{s/6}}{3-s}\right)^{12}
\]

(c) By the law of iterated expectations, $E[N] = E[E[N \mid P]]$. We can compute $E[N \mid P]$ from the fact that $N$ is binomial with parameter $P$, where $P$ is a random variable uniformly distributed between 0 and 1. Thus

\[
E[N] = E[12P] = 12\,E[P] = 12\cdot\frac{1}{2} = 6.
\]

We compute $\mathrm{var}(N)$ using the law of conditional variance:

\[
\mathrm{var}(N) = E[\mathrm{var}(N \mid P)] + \mathrm{var}(E[N \mid P]) = E[12P(1-P)] + \mathrm{var}(12P)
\]
Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis (Spring 2006)

Continuing,

\[
\mathrm{var}(N) = 12\,E[P(1-P)] + 144\,\mathrm{var}(P) = 12\left(\frac{1}{2} - \frac{1}{3}\right) + 12 = 14.
\]

(d)

\[
M_N(s) = E\big[E[e^{sN} \mid P]\big] = E\big[M_{N\mid P}(s)\big] = E[1 - P + Pe^s] = 1 - E[P] + e^s\,E[P] = 1 - \frac{1}{2} + \frac{1}{2}e^s = \frac{1}{2} + \frac{1}{2}e^s
\]

10. Let $A_t$ (respectively, $B_t$) be a Bernoulli random variable which is equal to 1 if and only if the $t$th toss resulted in 1 (respectively, 2). We have $E[A_tB_t] = 0$ and $E[A_tB_s] = E[A_t]E[B_s] = p_1p_2$ for $s \neq t$. We have

\[
E[X_1X_2] = E[(A_1 + \cdots + A_n)(B_1 + \cdots + B_n)] = n\,E[A_1(B_1 + \cdots + B_n)] = n(n-1)p_1p_2,
\]

and

\[
\mathrm{cov}(X_1, X_2) = E[X_1X_2] - E[X_1]E[X_2] = n(n-1)p_1p_2 - np_1 \cdot np_2 = -np_1p_2.
\]

11. (a) Here it is easier to find the PDF of $Y$. Since $Y$ is the sum of independent Gaussian random variables, $Y$ is Gaussian with mean $2\mu$ and variance $2\sigma_X^2 + \sigma_Z^2$.

(b) i. The transform of $N$ is

\[
M_N(s) = \frac{1}{11}\left(1 + e^s + e^{2s} + \cdots + e^{10s}\right) = \frac{1}{11}\sum_{k=0}^{10} e^{ks}.
\]

Since $Y$ is the sum of a random sum of Gaussian random variables and an independent Gaussian random variable,

\[
M_Y(s) = \Big(M_N(s)\Big|_{e^s = M_X(s)}\Big)\,M_Z(s)
= \left(\frac{1}{11}\sum_{k=0}^{10}\left(e^{s\mu + s^2\sigma_X^2/2}\right)^k\right) e^{s^2\sigma_Z^2/2}
= \left(\frac{1}{11}\sum_{k=0}^{10} e^{sk\mu + s^2 k\sigma_X^2/2}\right) e^{s^2\sigma_Z^2/2}
= \frac{1}{11}\sum_{k=0}^{10} e^{sk\mu + s^2\left(k\sigma_X^2 + \sigma_Z^2\right)/2}.
\]

In general, this is not the transform of a Gaussian random variable.

ii. One can differentiate the transform to get the moments, but it is easier to use the laws of iterated expectation and conditional variance:

\[
E[Y] = E[X]\,E[N] + E[Z] = 5\mu
\]
\[
\mathrm{var}(Y) = E[N]\,\mathrm{var}(X) + (E[X])^2\,\mathrm{var}(N) + \mathrm{var}(Z) = 5\sigma_X^2 + 10\mu^2 + \sigma_Z^2
\]

iii. Now, the new transform for $N$ is

\[
M_N(s) = \frac{1}{9}\left(e^{2s} + \cdots + e^{10s}\right) = \frac{1}{9}\sum_{k=2}^{10} e^{ks}.
\]
Therefore,

\[
M_Y(s) = \Big(M_N(s)\Big|_{e^s = M_X(s)}\Big)\,M_Z(s)
= \left(\frac{1}{9}\sum_{k=2}^{10}\left(e^{s\mu + s^2\sigma_X^2/2}\right)^k\right) e^{s^2\sigma_Z^2/2}
= \left(\frac{1}{9}\sum_{k=2}^{10} e^{sk\mu + s^2 k\sigma_X^2/2}\right) e^{s^2\sigma_Z^2/2}
= \frac{1}{9}\sum_{k=2}^{10} e^{sk\mu + s^2\left(k\sigma_X^2 + \sigma_Z^2\right)/2}.
\]

(c) Given $Y$, the linear least-squares estimator of $X_k$ is given by

\[
\hat{X}_k = E[X_k] + \frac{\mathrm{cov}(X_k, Y)}{\mathrm{var}(Y)}\,(Y - E[Y]) = \mu + \frac{\mathrm{cov}(X_k, Y)}{\mathrm{var}(Y)}\,(Y - E[Y]).
\]

To determine the mean and variance of $Y$ we first determine those of $N$:

\[
E[N] = \frac{1}{4}\cdot 10 + \frac{3}{4}\cdot 5 = \frac{25}{4}
\]
\[
\mathrm{var}(N) = E[\mathrm{var}(N \mid \text{time of day})] + \mathrm{var}(E[N \mid \text{time of day}]) = 10 + \frac{75}{16} = \frac{235}{16}
\]

Now

\[
E[Y] = E\big[E[Y \mid N]\big] = E[N]\,E[X] + E[Z] = E[N]\,E[X] = \frac{25}{4}\mu
\]
\[
\mathrm{var}(Y) = E[N]\,\mathrm{var}(X) + (E[X])^2\,\mathrm{var}(N) + \mathrm{var}(Z) = \frac{25}{4}\sigma_X^2 + \frac{235}{16}\mu^2 + \sigma_Z^2
\]
\[
\mathrm{cov}(X_k, Y) = E\big[(X_k - \mu)(Y - 25\mu/4)\big] = E\Big[E\big[(X_k - \mu)(Y - 25\mu/4) \mid N\big]\Big]
\]

Since

\[
E\big[(X_k - \mu)(Y - 25\mu/4) \mid N\big] =
\begin{cases}
\sigma_X^2 & \text{if } N \geq k, \\
0 & \text{otherwise},
\end{cases}
\]

then

\[
\mathrm{cov}(X_k, Y) = \sigma_X^2\,P(N \geq k) = \sigma_X^2\left(0.25\,\frac{10^k e^{-10}}{k!} + \cdots\right)
\]
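The values $E[N] = 6$ and $\mathrm{var}(N) = 14$ obtained in part (c) from the laws of iterated expectations and conditional variance can be reproduced with exact rational arithmetic. The sketch below is a numerical check, not part of the original solution; it uses only the uniform moments $E[P] = 1/2$ and $E[P^2] = 1/3$:

```python
from fractions import Fraction

# N | P ~ Binomial(12, P), with P ~ Uniform(0, 1).
EP = Fraction(1, 2)        # E[P]
EP2 = Fraction(1, 3)       # E[P^2]
varP = EP2 - EP**2         # var(P) = 1/12

EN = 12 * EP                          # E[N] = E[12P] = 6
E_var_given_P = 12 * (EP - EP2)       # E[var(N|P)] = E[12 P (1-P)] = 2
var_E_given_P = 144 * varP            # var(E[N|P]) = var(12P) = 12
varN = E_var_given_P + var_E_given_P  # law of conditional variance: 14

print(EN, varN)  # 6 14
```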
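The identity $\mathrm{cov}(X_1, X_2) = -np_1p_2$ derived in problem 10 can be confirmed by brute-force enumeration of all $3^n$ toss sequences. This is a sketch for verification only; the values $n = 4$, $p_1 = 0.2$, $p_2 = 0.3$ are illustrative choices, not taken from the problem:

```python
from itertools import product

n, p1, p2 = 4, 0.2, 0.3            # illustrative parameters
p = [p1, p2, 1.0 - p1 - p2]        # probabilities of outcomes 1, 2, 3

EX1 = EX2 = EX1X2 = 0.0
for seq in product(range(3), repeat=n):   # every possible toss sequence
    prob = 1.0
    for outcome in seq:
        prob *= p[outcome]
    x1, x2 = seq.count(0), seq.count(1)   # counts of outcomes 1 and 2
    EX1 += prob * x1
    EX2 += prob * x2
    EX1X2 += prob * x1 * x2

cov = EX1X2 - EX1 * EX2
print(cov, -n * p1 * p2)  # both are -0.24 (up to float rounding)
```

The enumeration computes the expectations exactly (up to floating-point rounding), so the agreement with $-np_1p_2$ is not a sampling artifact.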
  • Spring '06
  • Munther Dahleh
