EE 562a Homework Solutions 2 — September 22, 2009

1. Solution: This problem is a straightforward exercise in moment calculations for the quantity $y(u) = A^t x(u) + B$.
$$
m_y = E\{y(u)\} = A^t E\{x(u)\} + B = A^t m_x + B
$$
$$
\sigma_y^2 \triangleq E\{(y(u) - m_y)(y(u) - m_y)^*\}
= E\left\{ A^t (x(u) - m_x)\,(x(u) - m_x)^{*t} A^* \right\}
= A^t\, E\left\{ (x(u) - m_x)(x(u) - m_x)^{*t} \right\} A^*
= A^t K_x A^* .
$$
We have carried conjugates along in the above calculation in case $A$ is complex, and we have assumed that $(\cdot)^t$ represents transpose alone. With $A^t = [2\ 1\ 2]$ and $B = 5$,
$$
m_y = \begin{bmatrix} 2 & 1 & 2 \end{bmatrix}
\begin{bmatrix} 5 \\ 5 \\ 6 \end{bmatrix} + 5 = 32,
\qquad
\sigma_y^2 = \begin{bmatrix} 2 & 1 & 2 \end{bmatrix}
\begin{bmatrix} 5 & -2 & -1 \\ -2 & 5 & 0 \\ -1 & 0 & 4 \end{bmatrix}
\begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} = 25 .
$$

2. Solution: The intent of this problem is to give the student practice with partitioned matrices, so we will use them to the utmost in this solution. There are other (less organized) ways to solve this problem. Since part (a) is a special case of part (b), we'll solve (b) first and then note the simplifications that arise when the assumptions of (a) are made.

Define the partitioned column vector
$$
s(u) = \begin{bmatrix} w(u) \\ x(u) \\ y(u) \\ z(u) \end{bmatrix}
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
\begin{bmatrix} v(u) \\ w(u) \end{bmatrix}.
$$
Here $I$ is an identity matrix and $O$ is an all-zeros matrix. The mean of $s(u)$ is
$$
m_s \triangleq E\{s(u)\}
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
E\left\{ \begin{bmatrix} v(u) \\ w(u) \end{bmatrix} \right\}
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
\begin{bmatrix} m_v \\ m_w \end{bmatrix}
= \begin{bmatrix} m_w \\ F m_v \\ G m_w \\ H m_v + J m_w \end{bmatrix}.
$$
Either of the last two forms is a perfectly good answer.

Centering a random vector by subtracting its mean gives a useful form for covariance computations. It follows immediately that
$$
s_o(u) = s(u) - m_s
= \begin{bmatrix} O & I \\ F & O \\ O & G \\ H & J \end{bmatrix}
\begin{bmatrix} v(u) - m_v \\ w(u) - m_w \end{bmatrix} \ldots
$$
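As a quick numerical sanity check (a sketch, not part of the original solution), both results can be verified with NumPy. The mean vector $m_x = [5\ 5\ 6]^t$ and the entries of $K_x$ below are taken as read from the solution's arithmetic; the block dimensions and the random matrices $F$, $G$, $H$, $J$ in the Problem 2 check are arbitrary illustrative choices, not values from the assignment.

```python
import numpy as np

# Problem 1: y(u) = A^t x(u) + B with A^t = [2 1 2], B = 5.
A_t = np.array([2, 1, 2])
B = 5
m_x = np.array([5, 5, 6])           # mean of x(u), as read from the solution
K_x = np.array([[ 5, -2, -1],       # covariance of x(u), entries as read
                [-2,  5,  0],       # from the solution's arithmetic
                [-1,  0,  4]])

m_y = A_t @ m_x + B                 # A^t m_x + B
var_y = A_t @ K_x @ A_t             # A^t K_x A  (A is real here, so A* = A)
print(m_y, var_y)                   # 32 25

# Problem 2: s(u) = M [v(u); w(u)]  implies  m_s = M [m_v; m_w].
# Block sizes and the matrices F, G, H, J are illustrative choices.
rng = np.random.default_rng(0)
n_v, n_w = 3, 2
F = rng.standard_normal((2, n_v))
G = rng.standard_normal((2, n_w))
H = rng.standard_normal((2, n_v))
J = rng.standard_normal((2, n_w))
M = np.block([
    [np.zeros((n_w, n_v)), np.eye(n_w)],        # picks out w(u)
    [F,                    np.zeros((2, n_w))],  # x(u) = F v(u)
    [np.zeros((2, n_v)),   G],                   # y(u) = G w(u)
    [H,                    J],                   # z(u) = H v(u) + J w(u)
])
m_v = rng.standard_normal(n_v)
m_w = rng.standard_normal(n_w)
m_s = M @ np.concatenate([m_v, m_w])
expected = np.concatenate([m_w, F @ m_v, G @ m_w, H @ m_v + J @ m_w])
assert np.allclose(m_s, expected)   # m_s = [m_w; F m_v; G m_w; H m_v + J m_w]
```

The final assertion confirms that multiplying the block matrix by the stacked mean vector reproduces the partitioned form of $m_s$ derived above.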
This note was uploaded on 04/03/2010 for the course EE 562a taught by Professor Toddbrun during the Fall '07 term at USC.