SolutionsManual-Statistical and Adaptive Signal Processing.pdf



366 Statistical and Adaptive Signal Processing - Solution Manual

        % Lattice
        Ef(m) = lambda*Efold(m) + alphaold(m)*conj(ef(m))*ef(m);
        cf(m) = lambda*Efold(m)/Ef(m);
        sf(m) = alphaold(m)*ef(m)/Ef(m);
        Eb(m) = lambda*Ebold(m) + alpha(m)*conj(eb(m))*eb(m);
        cb(m) = lambda*Ebold(m)/Eb(m);
        sb(m) = alpha(m)*eb(m)/Eb(m);
        ef(m+1) = ef(m) + conj(kfold(m))*ebold(m);
        kf(m) = cbold(m)*kfold(m) - sbold(m)*conj(ef(m));
        eb(m+1) = ebold(m) + conj(kbold(m))*ef(m);
        kb(m) = cf(m)*kbold(m) - sf(m)*conj(ebold(m));
        alpha(m+1) = alpha(m) - abs(alpha(m)*eb(m))*abs(alpha(m)*eb(m))/Eb(m);
        % Ladder
        e(m+1) = e(m) - conj(kcold(m))*eb(m);
        kc(m) = cb(m)*kcold(m) + sb(m)*conj(e(m));
    end
    er(n,i) = e(M+1); erf(n,i) = ef(M); erb(n,i) = eb(M);
    Erf(n,i) = Eb(1); Erb(n,i) = Eb(M); al(n,i) = alpha(M);
    % Time Delay
    cbold = cb; cfold = cf; sbold = sb; sfold = sf;
    Ebold = Eb; Efold = Ef; ebold = eb; efold = ef;
    kfold = kf; kbold = kb; kcold = kc; alphaold = alpha;
  end
  i
end
er2 = er.^2; er2m = mean(er2,2);
t = (1:N)';
%plot(t,Erf,'r',t,Erb,'b');
%text(250,300,'\leftarrow\fontsize{9pt} E^f');
%text(100,200,'E^b \rightarrow\fontsize{9pt} ');
Jmin1 = 1;
%subplot('position',[0.1,0.55,0.85,0.4]);
semilogy(t,er2m,'r');
axis([0,N,10^(-4),10^0]);
title(['MSE Learning Curve (\lambda=.99, W=' num2str(W) ' )'],'fontsize',10);
xlabel('Number of iterations (n)','fontsize',8);
ylabel('Mean squared error','fontsize',8);
[Figure 10.43: MSE learning curve in P10.43 — semilog plot of mean squared error versus number of iterations ($\lambda = .99$, $W = 2.9$)]

set(gca,'xtick',[0:100:N],'fontsize',8);
print -deps2 P1043.eps

10.44 In this problem we discuss the derivation of the normalized lattice-ladder RLS algorithm, which uses a smaller number of time- and order-updating recursions and has better numerical behavior due to the normalization of its variables.

(a) Define the energy- and angle-normalized variables

$$\bar e^f_m(n) = \frac{\varepsilon^f_m(n)}{\sqrt{\alpha_m(n)\,E^f_m(n)}}, \qquad
  \bar e^b_m(n) = \frac{\varepsilon^b_m(n)}{\sqrt{\alpha_m(n)\,E^b_m(n)}}, \qquad
  \bar e_m(n) = \frac{\varepsilon_m(n)}{\sqrt{\alpha_m(n)\,E_m(n)}}$$

$$\bar k_m(n) = \frac{\beta_m(n)}{\sqrt{E^f_m(n)\,E^b_m(n-1)}}, \qquad
  \bar k^c_m(n) = \frac{\beta^c_m(n)}{\sqrt{E_m(n)\,E^b_m(n)}}$$

and show that the normalized errors and the partial correlation coefficients $\bar k_m(n)$ and $\bar k^c_m(n)$ have magnitude less than 1.
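The magnitude bound in part (a) follows in one line once the energy update is written in a posteriori form. This is a sketch under the conventional assumptions $\varepsilon = \alpha e$ and $E^f_m(n) = \lambda E^f_m(n-1) + |\varepsilon^f_m(n)|^2/\alpha_m(n)$ (the same update the error-feedback code above implements with a priori errors):

```latex
% Why |\bar e^f_m(n)| < 1; the backward and joint-process errors are analogous.
|\bar e^f_m(n)|^2
    = \frac{|\varepsilon^f_m(n)|^2}{\alpha_m(n)\,E^f_m(n)}
    = \frac{E^f_m(n) - \lambda E^f_m(n-1)}{E^f_m(n)}
    = 1 - \lambda\,\frac{E^f_m(n-1)}{E^f_m(n)} \;<\; 1.
% For \bar k_m(n), the Cauchy--Schwarz inequality applied to the
% exponentially weighted inner product \beta_m(n) gives
% |\beta_m(n)|^2 < E^f_m(n)\,E^b_m(n-1), hence |\bar k_m(n)| < 1;
% the same argument bounds |\bar k^c_m(n)|.
```

The bound is strict whenever the past energy $E^f_m(n-1)$ is positive, which the initialization $\delta > 0$ guarantees.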
(b) Derive the following normalized lattice-ladder RLS algorithm:

Initialization: $E^f_0(-1) = E_0(-1) = \delta > 0$

For $n = 0, 1, 2, \ldots$

$$E^f_0(n) = \lambda E^f_0(n-1) + |x(n)|^2, \qquad E_0(n) = \lambda E_0(n-1) + |y(n)|^2$$

$$\bar e^f_0(n) = \bar e^b_0(n) = \frac{x(n)}{\sqrt{E^f_0(n)}}, \qquad \bar e_0(n) = \frac{y(n)}{\sqrt{E_0(n)}}$$

For $m = 0$ to $M - 1$

$$\bar k_m(n) = \sqrt{1-|\bar e^f_m(n)|^2}\,\sqrt{1-|\bar e^b_m(n-1)|^2}\;\bar k_m(n-1) + \bar e^f_m(n)\,\bar e^{b*}_m(n-1)$$

$$\bar e^f_{m+1}(n) = \left[\sqrt{1-|\bar e^b_m(n-1)|^2}\,\sqrt{1-|\bar k_m(n)|^2}\right]^{-1}\left[\bar e^f_m(n) - \bar k_m(n)\,\bar e^b_m(n-1)\right]$$

$$\bar e^b_{m+1}(n) = \left[\sqrt{1-|\bar e^f_m(n)|^2}\,\sqrt{1-|\bar k_m(n)|^2}\right]^{-1}\left[\bar e^b_m(n-1) - \bar k^{*}_m(n)\,\bar e^f_m(n)\right]$$

$$\bar k^c_m(n) = \sqrt{1-|\bar e_m(n)|^2}\,\sqrt{1-|\bar e^b_m(n)|^2}\;\bar k^c_m(n-1) + \bar e_m(n)\,\bar e^{b*}_m(n)$$

$$\bar e_{m+1}(n) = \left[\sqrt{1-|\bar e^b_m(n)|^2}\,\sqrt{1-|\bar k^c_m(n)|^2}\right]^{-1}\left[\bar e_m(n) - \bar k^c_m(n)\,\bar e^b_m(n)\right]$$

(c) Write a Matlab function to implement the derived algorithm, and test its validity by using the equalization experiment in Example 10.5.2.
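Part (c) asks for a Matlab function; as an illustration only, here is a sketch of the same recursions in NumPy for real-valued signals (so all conjugates drop out). The function name `nlrls_ladder`, the small clipping floor `eps`, and the default `delta` are my own choices, not from the book:

```python
import numpy as np

def nlrls_ladder(x, y, M, lam=0.99, delta=1e-3, eps=1e-12):
    """Normalized lattice-ladder RLS (real-valued sketch of the
    Problem 10.44 recursions).  Returns the order-M normalized
    joint-process errors e_bar_M(n) and the final coefficients."""
    N = len(x)
    k = np.zeros(M)            # lattice PARCOR coefficients k_bar_m
    kc = np.zeros(M)           # ladder coefficients kc_bar_m
    eb_old = np.zeros(M + 1)   # backward errors e_bar^b_m(n-1)
    Ef0, E0 = delta, delta
    e_out = np.empty(N)
    for n in range(N):
        # Order-0 time updates
        Ef0 = lam * Ef0 + x[n] ** 2
        E0 = lam * E0 + y[n] ** 2
        ef = np.empty(M + 1); eb = np.empty(M + 1); e = np.empty(M + 1)
        ef[0] = eb[0] = x[n] / np.sqrt(Ef0)
        e[0] = y[n] / np.sqrt(E0)
        for m in range(M):
            # Lattice (prediction) part; k[m] on the RHS is k_bar_m(n-1)
            cf = np.sqrt(max(1.0 - ef[m] ** 2, eps))
            cb = np.sqrt(max(1.0 - eb_old[m] ** 2, eps))
            k[m] = cf * cb * k[m] + ef[m] * eb_old[m]
            ck = np.sqrt(max(1.0 - k[m] ** 2, eps))
            ef[m + 1] = (ef[m] - k[m] * eb_old[m]) / (cb * ck)
            eb[m + 1] = (eb_old[m] - k[m] * ef[m]) / (cf * ck)
            # Ladder (joint-process) part
            ce = np.sqrt(max(1.0 - e[m] ** 2, eps))
            cbn = np.sqrt(max(1.0 - eb[m] ** 2, eps))
            kc[m] = ce * cbn * kc[m] + e[m] * eb[m]
            ckc = np.sqrt(max(1.0 - kc[m] ** 2, eps))
            e[m + 1] = (e[m] - kc[m] * eb[m]) / (cbn * ckc)
        eb_old = eb               # time delay: current errors become old
        e_out[n] = e[M]
    return e_out, k, kc
```

Because every variable is an angle- and energy-normalized quantity, all of `ef`, `eb`, `e`, `k`, and `kc` should stay inside $(-1, 1)$ up to round-off, which is the numerical advantage the problem statement refers to. A full validation would rerun the equalization experiment of Example 10.5.2, as part (c) requires.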
