CHAPTER 8

8.1 The Hermitian transpose of the data matrix A is defined by

$$
\mathbf{A}^H =
\begin{bmatrix}
u(1,1) & u(1,2) & \cdots & u(1,n) \\
u(2,1) & u(2,2) & \cdots & u(2,n) \\
\vdots & \vdots & & \vdots \\
u(M,1) & u(M,2) & \cdots & u(M,n)
\end{bmatrix}
$$

where u(k,i) is the output of sensor k in the linear array at time i, with k = 1, 2, ..., M and i = 1, 2, ..., n.

(a) The matrix product $\mathbf{A}^H\mathbf{A}$ equals

$$
\mathbf{A}^H\mathbf{A} =
\begin{bmatrix}
\sum_{i=1}^{n} u(1,i)u^*(1,i) & \sum_{i=1}^{n} u(1,i)u^*(2,i) & \cdots & \sum_{i=1}^{n} u(1,i)u^*(M,i) \\
\sum_{i=1}^{n} u(2,i)u^*(1,i) & \sum_{i=1}^{n} u(2,i)u^*(2,i) & \cdots & \sum_{i=1}^{n} u(2,i)u^*(M,i) \\
\vdots & \vdots & & \vdots \\
\sum_{i=1}^{n} u(M,i)u^*(1,i) & \sum_{i=1}^{n} u(M,i)u^*(2,i) & \cdots & \sum_{i=1}^{n} u(M,i)u^*(M,i)
\end{bmatrix}
$$

This represents the M-by-M spatial (deterministic) correlation matrix of the array with temporal averaging applied to each element of the matrix. This form of averaging assumes that the environment in which the array operates is temporally stationary.

(b) The matrix product $\mathbf{A}\mathbf{A}^H$ equals

$$
\mathbf{A}\mathbf{A}^H =
\begin{bmatrix}
\sum_{k=1}^{M} u^*(k,1)u(k,1) & \sum_{k=1}^{M} u^*(k,1)u(k,2) & \cdots & \sum_{k=1}^{M} u^*(k,1)u(k,n) \\
\sum_{k=1}^{M} u^*(k,2)u(k,1) & \sum_{k=1}^{M} u^*(k,2)u(k,2) & \cdots & \sum_{k=1}^{M} u^*(k,2)u(k,n) \\
\vdots & \vdots & & \vdots \\
\sum_{k=1}^{M} u^*(k,n)u(k,1) & \sum_{k=1}^{M} u^*(k,n)u(k,2) & \cdots & \sum_{k=1}^{M} u^*(k,n)u(k,n)
\end{bmatrix}
$$
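The structure of the two products can be checked numerically. A minimal NumPy sketch, where the sizes M = 4 and n = 10 and the random data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

M, n = 4, 10                      # M sensors, n time samples (arbitrary sizes)
# A^H holds u(k, i): row k = sensor k, column i = time index i
AH = rng.standard_normal((M, n)) + 1j * rng.standard_normal((M, n))
A = AH.conj().T                   # the data matrix A itself is n-by-M

S = AH @ A    # M-by-M spatial correlation matrix (temporal averaging)
T = A @ AH    # n-by-n temporal correlation matrix (spatial averaging)

# Both products are Hermitian
assert S.shape == (M, M) and T.shape == (n, n)
assert np.allclose(S, S.conj().T)
assert np.allclose(T, T.conj().T)

# Entry check: (A^H A)_{kl} = sum over i of u(k,i) u*(l,i)
k, l = 0, 2
assert np.isclose(S[k, l], np.sum(AH[k] * AH[l].conj()))
```

The element-wise assertion confirms that each entry of $\mathbf{A}^H\mathbf{A}$ is a time-averaged product of two sensor outputs, as written out above.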
This second matrix represents the n-by-n temporal (deterministic) correlation matrix of the array with spatial averaging applied to each element of the matrix. This form of averaging assumes that the environment is spatially stationary.

8.2 We say that the least-squares estimate $\hat{\mathbf{w}}$ is consistent if

$$\lim_{N\to\infty} E\left[\|\hat{\mathbf{w}} - \mathbf{w}_o\|^2\right] = 0 \qquad (1)$$

We note that

$$
\begin{aligned}
E\left[\|\hat{\mathbf{w}} - \mathbf{w}_o\|^2\right]
&= E\left[(\hat{\mathbf{w}} - \mathbf{w}_o)^H(\hat{\mathbf{w}} - \mathbf{w}_o)\right] \\
&= E\left\{\mathrm{tr}\left[(\hat{\mathbf{w}} - \mathbf{w}_o)(\hat{\mathbf{w}} - \mathbf{w}_o)^H\right]\right\} \\
&= \mathrm{tr}\left\{E\left[(\hat{\mathbf{w}} - \mathbf{w}_o)(\hat{\mathbf{w}} - \mathbf{w}_o)^H\right]\right\} \\
&= \mathrm{tr}(\mathbf{K}) \\
&= \sigma^2\,\mathrm{tr}(\boldsymbol{\Phi}^{-1}) \qquad (2)
\end{aligned}
$$

where $\mathbf{K} = \sigma^2\boldsymbol{\Phi}^{-1}$ is the covariance matrix of the estimate and the correlation matrix $\boldsymbol{\Phi}$ depends on N. Substituting Eq. (2) into (1):

$$\lim_{N\to\infty} E\left[\|\hat{\mathbf{w}} - \mathbf{w}_o\|^2\right] = \sigma^2 \lim_{N\to\infty} \mathrm{tr}\left(\boldsymbol{\Phi}^{-1}\right)$$

This result shows that $\hat{\mathbf{w}}$ is consistent if

$$\lim_{N\to\infty} \boldsymbol{\Phi}^{-1} = \mathbf{0}$$
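Consistency can be illustrated by simulation: the mean-square deviation of the least-squares estimate from the true weight vector shrinks as the record length N grows. A sketch, where the true vector w_o, the noise level, and the trial counts are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

M = 3                                   # number of taps (illustrative)
w_o = np.array([1.0, -0.5, 0.25])       # assumed true weight vector
sigma = 0.5                             # assumed noise standard deviation

def mean_sq_error(N, trials=50):
    """Average ||w_hat - w_o||^2 over independent trials of length N."""
    err = 0.0
    for _ in range(trials):
        A = rng.standard_normal((N, M))             # data matrix
        d = A @ w_o + sigma * rng.standard_normal(N)  # noisy desired response
        w_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
        err += np.sum((w_hat - w_o) ** 2)
    return err / trials

# E||w_hat - w_o||^2 = sigma^2 tr(Phi^{-1}) decays as N grows
e_small, e_large = mean_sq_error(100), mean_sq_error(10_000)
assert e_large < e_small
```

Here tr(Φ⁻¹) scales like M/N for this white-input model, so increasing N by a factor of 100 should shrink the mean-square error by roughly the same factor.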
8.3 The data matrix is

$$\mathbf{A} = \begin{bmatrix} 2 & 3 \\ 1 & 2 \\ 1 & -1 \end{bmatrix}$$

The desired response vector is

$$\mathbf{d} = \begin{bmatrix} 2 \\ 1 \\ 13/4 \end{bmatrix}$$

The tap-weight vector of the linear least-squares filter is

$$\hat{\mathbf{w}} = \left(\mathbf{A}^T\mathbf{A}\right)^{-1}\mathbf{A}^T\mathbf{d} \qquad (1)$$

We first note that

$$\mathbf{A}^T\mathbf{A} = \begin{bmatrix} 2 & 1 & 1 \\ 3 & 2 & -1 \end{bmatrix}\begin{bmatrix} 2 & 3 \\ 1 & 2 \\ 1 & -1 \end{bmatrix} = \begin{bmatrix} 6 & 7 \\ 7 & 14 \end{bmatrix}$$

$$\det\left(\mathbf{A}^T\mathbf{A}\right) = 6 \cdot 14 - 7 \cdot 7 = 35$$

$$\left(\mathbf{A}^T\mathbf{A}\right)^{-1} = \frac{1}{35}\begin{bmatrix} 14 & -7 \\ -7 & 6 \end{bmatrix} = \begin{bmatrix} 2/5 & -1/5 \\ -1/5 & 6/35 \end{bmatrix}$$
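The matrix arithmetic so far can be verified numerically; a minimal NumPy check of the normal-equation matrix, its determinant, and its inverse:

```python
import numpy as np

# Data matrix from the problem statement
A = np.array([[2.0, 3.0],
              [1.0, 2.0],
              [1.0, -1.0]])

AtA = A.T @ A   # normal-equation matrix A^T A

# A^T A = [[6, 7], [7, 14]] with determinant 35
assert np.array_equal(AtA, np.array([[6.0, 7.0], [7.0, 14.0]]))
assert np.isclose(np.linalg.det(AtA), 35.0)

# (A^T A)^{-1} = (1/35) [[14, -7], [-7, 6]]
inv = np.linalg.inv(AtA)
assert np.allclose(inv, np.array([[14.0, -7.0], [-7.0, 6.0]]) / 35.0)
```

The small integer entries make these checks exact up to floating-point tolerance.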
Hence, using Eq. (1), we get

$$
\hat{\mathbf{w}} = \begin{bmatrix} 2/5 & -1/5 \\ -1/5 & 6/35 \end{bmatrix}
\begin{bmatrix} 2 & 1 & 1 \\ 3 & 2 & -1 \end{bmatrix}
\begin{bmatrix} 2 \\ 1 \\ 13/4 \end{bmatrix}
= \begin{bmatrix} 2/5 & -1/5 \\ -1/5 & 6/35 \end{bmatrix}
\begin{bmatrix} 33/4 \\ 19/4 \end{bmatrix}
= \begin{bmatrix} 47/20 \\ -117/140 \end{bmatrix}
\approx \begin{bmatrix} 2.35 \\ -0.836 \end{bmatrix}
$$

8.4 Express the transfer function of the (forward) prediction-error filter as follows (see Fig. 1):

$$H_f(z) = \left(1 - z_i z^{-1}\right)H_f'(z)$$

where

$$H_f(z) = \sum_{k=0}^{M} a^*_{M,k}\, z^{-k}, \qquad a_{M,0} = 1$$

and

$$z_i = \rho_i e^{j\theta_i}$$

is a zero of $H_f(z)$, with $H_f'(z)$ denoting the remaining factor.

Fig. 1: u(n) → H_f'(z) → g(n) → (1 − z_i z⁻¹) → f_M(n)

From this figure we note that

$$Z[f_M(n)] = \left(1 - z_i z^{-1}\right) Z[g(n)]$$

Hence

$$f_M(n) = g(n) - z_i\, g(n-1)$$
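The time-domain relation f_M(n) = g(n) − z_i g(n−1) is simply convolution of g(n) with the factor (1 − z_i z⁻¹); a quick numerical check, where the zero location (ρ_i, θ_i) and the sequence standing in for g(n) are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=2)

rho, theta = 0.8, 0.3             # assumed zero radius and angle (illustrative)
z_i = rho * np.exp(1j * theta)

# Arbitrary complex sequence standing in for g(n), n >= 0
g = rng.standard_normal(16) + 1j * rng.standard_normal(16)

# Direct difference equation: f_M(n) = g(n) - z_i g(n-1), with g(-1) = 0
g_prev = np.concatenate(([0.0], g[:-1]))
f_direct = g - z_i * g_prev

# Same output via convolution with the factor (1 - z_i z^{-1})
f_conv = np.convolve(g, [1.0, -z_i])[: len(g)]

assert np.allclose(f_direct, f_conv)
```

Both computations produce the same prediction-error sequence, confirming that factoring out one zero of H_f(z) corresponds to a one-tap difference applied to g(n).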
The prediction-error energy equals (according to the autocorrelation method)

$$\varepsilon_f = \sum_{n=1}^{\infty} |f_M(n)|^2 = \sum_{n=1}^{\infty} \left[g(n) - z_i\, g(n-1)\right]\left[g^*(n) - z_i^*\, g^*(n-1)\right]$$

With $z_i = \rho_i e^{j\theta_i}$, we may expand the expression for the prediction-error energy as

$$\varepsilon_f = \sum_{n=1}^{\infty} |g(n)|^2 + \rho_i^2 \sum_{n=1}^{\infty} |g(n-1)|^2 - 2\rho_i\,\mathrm{Re}\left[e^{j\theta_i} \sum_{n=1}^{\infty} g(n-1)\,g^*(n)\right]$$

Differentiate $\varepsilon_f$ with respect to $\rho_i$:

$$\frac{\partial \varepsilon_f}{\partial \rho_i} = 2\rho_i \sum_{n=1}^{\infty} |g(n-1)|^2 - 2\,\mathrm{Re}\left[e^{j\theta_i} \sum_{n=1}^{\infty} g(n-1)\,g^*(n)\right] \qquad (1)$$

From the Cauchy–Schwarz inequality, we have

$$\mathrm{Re}\left[e^{j\theta_i} \sum_{n=1}^{\infty} g(n-1)\,g^*(n)\right] \le \left(\sum_{n=1}^{\infty} |g(n-1)|^2\right)^{1/2}\left(\sum_{n=1}^{\infty} |g(n)|^2\right)^{1/2}$$

Here we note that

$$\sum_{n=1}^{\infty} |g(n-1)|^2 = \sum_{n=0}^{\infty} |g(n)|^2 \qquad (2)$$

where it is recognized that in the autocorrelation method g(n) is zero for n < 0. Accordingly, we may rewrite the Cauchy–Schwarz inequality as

$$\mathrm{Re}\left[e^{j\theta_i} \sum_{n=1}^{\infty} g(n-1)\,g^*(n)\right] \le \sum_{n=0}^{\infty} |g(n)|^2$$
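The closed-form derivative in Eq. (1) can be sanity-checked against a numerical derivative of the energy. A sketch, where the angle θ_i and the finite sequence standing in for g(n) are arbitrary assumptions (a finite sequence is consistent with the autocorrelation method, since g(n) vanishes outside a finite window):

```python
import numpy as np

rng = np.random.default_rng(seed=3)

theta = 0.7                                    # assumed zero angle (illustrative)
g = rng.standard_normal(32) + 1j * rng.standard_normal(32)  # stand-in for g(n)
g_prev = np.concatenate(([0.0], g[:-1]))       # g(n-1), with g(n) = 0 for n < 0

def energy(rho):
    """Prediction-error energy as a function of the zero radius rho."""
    z_i = rho * np.exp(1j * theta)
    return np.sum(np.abs(g - z_i * g_prev) ** 2)

def grad(rho):
    """Closed-form derivative from Eq. (1)."""
    return (2 * rho * np.sum(np.abs(g_prev) ** 2)
            - 2 * np.real(np.exp(1j * theta) * np.sum(g_prev * g.conj())))

# Central finite difference agrees with Eq. (1)
rho0, h = 0.9, 1e-6
numerical = (energy(rho0 + h) - energy(rho0 - h)) / (2 * h)
assert np.isclose(grad(rho0), numerical, rtol=1e-4)
```

Because ε_f is quadratic in ρ_i, the central difference matches the analytic derivative essentially to rounding error, which makes this a tight check of the signs and factors in Eq. (1).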

This note was uploaded on 09/11/2010 for the course EE EE254 taught by Professor Ujin during the Spring '10 term at YTI Career Institute.
