EE 562a Homework Solutions 2 — September 22, 2009

1. Solution: This problem is a straightforward moment calculation for the quantity \( y(u) = A^t x(u) + B \).
\[
m_y = E\{y(u)\} = A^t E\{x(u)\} + B = A^t \mu + B
\]
\[
\sigma_y^2 \triangleq E\left\{ (y(u) - m_y)(y(u) - m_y)^* \right\}
= E\left\{ A^t (x(u) - \mu) \left[ (x(u) - \mu)^t A \right]^* \right\}
= E\left\{ A^t (x(u) - \mu)(x(u) - \mu)^\dagger A^* \right\}
= A^t E\left\{ (x(u) - \mu)(x(u) - \mu)^\dagger \right\} A^*
= A^t K_x A^*
\]
We have carried conjugates along in the above calculation in case \( A \) is complex, and we have assumed that \( (\cdot)^t \) denotes transpose alone. With \( A^t = [2 \;\, 1 \;\, 2] \) and \( B = 5 \),
\[
m_y = \begin{bmatrix} 2 & 1 & 2 \end{bmatrix}
\begin{bmatrix} 5 \\ 5 \\ 6 \end{bmatrix} + 5 = 32,
\qquad
\sigma_y^2 = \begin{bmatrix} 2 & 1 & 2 \end{bmatrix}
\begin{bmatrix} 5 & -2 & -1 \\ -2 & 5 & 0 \\ -1 & 0 & 4 \end{bmatrix}
\begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} = 25.
\]

2. Solution: The intent of this problem is to give the student practice with partitioned matrices, so we will use them to the utmost in this solution. There are other (less organized) ways to solve this problem. Since part (a) is a special case of part (b), we'll solve (b) first and then note the simplifications that arise when the assumptions of (a) are made.

Define the partitioned column vector as
\[
s(u) = \begin{bmatrix} w(u) \\ x(u) \\ y(u) \\ z(u) \end{bmatrix}
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
\begin{bmatrix} v(u) \\ w(u) \end{bmatrix}.
\]
Here \( I \) is an identity matrix and \( O \) is an all-zeros matrix. The mean of \( s(u) \) is
\[
m_s \triangleq E\{s(u)\}
= E\left\{ \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
\begin{bmatrix} v(u) \\ w(u) \end{bmatrix} \right\}
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
E\begin{bmatrix} v(u) \\ w(u) \end{bmatrix}
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
\begin{bmatrix} m_v \\ m_w \end{bmatrix}
= \begin{bmatrix} m_w \\ F m_v \\ G m_w \\ H m_v + J m_w \end{bmatrix}.
\]
Either of these last two forms is a perfectly good answer.

Centering a random vector by subtracting its mean gives a useful form for covariance computations. It follows immediately that
\[
s_o(u) = s(u) - m_s
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
\begin{bmatrix} v(u) - m_v \\ w(u) - m_w \end{bmatrix} \ldots
\]
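The arithmetic in Problem 1 can be checked numerically. Note one caveat: the covariance matrix \( K_x \) used below is an assumption — the off-diagonal minus signs are easily lost in text extraction, and the values here were chosen to be a symmetric matrix consistent with the stated answers 32 and 25.

```python
import numpy as np

# Data from Problem 1. K_x is a reconstruction (assumed minus signs),
# chosen to be symmetric and to reproduce the stated results.
A = np.array([2, 1, 2])            # A^t = [2 1 2]
B = 5
mu = np.array([5, 5, 6])           # E{x(u)}
K_x = np.array([[ 5, -2, -1],
                [-2,  5,  0],
                [-1,  0,  4]])

m_y = A @ mu + B                   # m_y = A^t mu + B
var_y = A @ K_x @ A                # sigma_y^2 = A^t K_x A* (A real, so A* = A)
print(m_y, var_y)                  # 32 25
```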
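The block formula for \( m_s \) in Problem 2 can also be verified numerically. The dimensions and the matrices \( F, G, H, J \) below are hypothetical (the problem leaves them symbolic); the point is only that the stacked block matrix applied to \( [m_v; m_w] \) reproduces \( [m_w; F m_v; G m_w; H m_v + J m_w] \).

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sizes for illustration: v(u) in R^3, w(u) in R^2.
F = rng.standard_normal((2, 3))
G = rng.standard_normal((2, 2))
H = rng.standard_normal((3, 3))
J = rng.standard_normal((3, 2))
m_v = rng.standard_normal(3)
m_w = rng.standard_normal(2)

# Stacked block matrix [[O, I], [F, O], [O, G], [H, J]].
O = np.zeros
M = np.block([[O((2, 3)), np.eye(2)],
              [F,         O((2, 2))],
              [O((2, 3)), G        ],
              [H,         J        ]])

m_s = M @ np.concatenate([m_v, m_w])
expected = np.concatenate([m_w, F @ m_v, G @ m_w, H @ m_v + J @ m_w])
assert np.allclose(m_s, expected)   # block-by-block mean formula checks out
```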