
# WienerAsAHammerstein: Wiener identification as a Hammerstein model


```matlab
%% Wiener identification: treating the system as a Hammerstein model.
%% Using small-signal analysis, a filter and a gain are obtained; then an
%% SVM is trained to model the overall nonlinearity.
% Next we will construct a closed-loop system with the inverse model of
% the nonlinearity in the feedback path, and we may add a controller so
% that the control design specifications are satisfied.
clear all
u = .25*normrnd(0,2,1,700);  % white Gaussian input sequence u, length 700,
                             % zero mean, standard deviation 2 (then scaled)
%u = 8*rand(1,700)-4;
e = .1*normrnd(0,.2,1,700);  % white Gaussian error term e, zero mean,
                             % standard deviation .2, length 700
e = zeros(1,1189);           % this was added after the fact; it should
                             % actually have been done before
ic = i;                      % imaginary unit
rts = [.98*exp(1*ic)    .98*exp(-1*ic)   .98*exp(1.6*ic) ...
       .98*exp(-1.6*ic) .95*exp(2.5*ic)  .95*exp(-2.5*ic)];
a = poly(rts);
b = [1 -.7 .5 -.4];          % the b_i's
% The roots below will be used for the closed-loop system; a controller
% pole is added to the roots rts:
rts_c = [1 .98*exp(ic)      .98*exp(-ic)    .98*exp(1.6*ic) ...
           .98*exp(-1.6*ic) .95*exp(2.5*ic) .95*exp(-2.5*ic)];
a_c = poly(rts_c);
% Now we collect the input-output data of the original Wiener system;
% small signals are used.
[h,tt] = impz(b,a);          % filter impulse response
us = [0 u(1:end-1)];         % past values of u
v = conv(h,u);
v2 = .25*u;
y2 = conv(h,v2);
y = 3*(-.5 + 1./(1 + exp(-.5*v))) + e;  % alternatives: 2*v; (sin(v).*v)+4*v;
figure(1);
subplot(4,1,1); plot(u(1:700));  title('input to the system');
subplot(4,1,2); plot(v(1:700));  title('output of the filter: before nonlinearity');
subplot(4,1,3); plot(y(1:700));  title('output of the whole system'); hold off
subplot(4,1,4); plot(y2(1:700)); title('output of the whole system; v2 = .25*u (Hammerstein-ish)');
N = 200; r = 7; m = 3; n = sum(size(a)) - 2; sg = 1;
%% Solve the linear equation
% Construct the kernel matrix. The last two hundred data points are used.
```
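The data-generation step above can be sketched in NumPy (a hedged translation, not the author's code: the RNG seed and the impulse-response length `L = 490`, chosen so that `len(v) = 1189` matches the script's `e = zeros(1,1189)`, are my assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)  # assumed seed, for reproducibility only

# Same complex pole pairs as the script's rts vector.
radii  = [0.98, 0.98, 0.95]
angles = [1.0, 1.6, 2.5]
rts = []
for rad, ang in zip(radii, angles):
    rts += [rad*np.exp(1j*ang), rad*np.exp(-1j*ang)]
a = np.real(np.poly(rts))            # conjugate pairs -> real coefficients
b = np.array([1.0, -0.7, 0.5, -0.4])

# Impulse response of the filter b/a via its difference equation
# (a stand-in for MATLAB's impz; L = 490 is an assumed truncation length).
L = 490
h = np.zeros(L)
for t in range(L):
    acc = b[t] if t < len(b) else 0.0
    for k in range(1, min(t, len(a) - 1) + 1):
        acc -= a[k]*h[t - k]
    h[t] = acc / a[0]

u = 0.25*rng.normal(0.0, 2.0, 700)   # small-signal white Gaussian input
v = np.convolve(h, u)                # linear block output, length 1189
y = 3.0*(-0.5 + 1.0/(1.0 + np.exp(-0.5*v)))  # static saturation nonlinearity
```

Because the static nonlinearity is a scaled, centered sigmoid, `y` is confined to the open interval (-1.5, 1.5) regardless of the input, which is the saturation behavior the script's third subplot visualizes.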
```matlab
xtrain = u(201:400);
for i = 1:N                  % K is the Omega (kernel) matrix
    for j = 1:N
        K(i,j) = exp(-((u(1,i+200) - u(1,j+200))^2)*1);
    end
end
% Construct Yf. Again the last two hundred data points are used.
Yf = y(1,200+r:200+N);
% Construct Yp, size n x (N-r+1). Again the last two hundred points are used.
for i = 1:n
    Yp(i,1:N-r+1) = y(1,200+r-i:N+200-i);
end
% Construct Ko, size (N-r+1) x (m+1) = 194 x 4.
for p = 1:N-r+1
    for q = 1:m+1
        sumk = 0;
        for t = 1:N
            sumk = sumk + K(t,r+p-q);
        end
        Ko(p,q) = sumk;
    end
end
% Construct Ksus, size 194 x 194 (expected, not sure); sanity check: Ksus2 = Ksus.
```
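The kernel and regressor construction can likewise be sketched in NumPy (again a hedged translation: the placeholder `y`, the RNG seed, and the 0-based index shifts are mine; the RBF width of 1 and the matrix sizes follow the script):

```python
import numpy as np

rng = np.random.default_rng(1)
N, r, m, n = 200, 7, 3, 6            # n = number of a-roots, as in the script
u = 0.25*rng.normal(0.0, 2.0, 700)
y = rng.normal(0.0, 1.0, 1200)       # placeholder output sequence (assumption)

# RBF kernel (Omega) matrix over the last N = 200 inputs, width 1.
x = u[200:200 + N]
K = np.exp(-(x[:, None] - x[None, :])**2)

# Yf: "future" outputs, length N-r+1 = 194 (MATLAB y(200+r : 200+N)).
Yf = y[200 + r - 1 : 200 + N]

# Yp: n rows of delayed outputs, delays 1..n, each of length 194.
Yp = np.array([y[200 + r - 1 - i : 200 + N - i] for i in range(1, n + 1)])

# Ko: column sums of K at shifted indices, size 194 x (m+1) = 194 x 4.
# MATLAB column r+p-q maps to 0-based column r+p-q-1.
colsum = K.sum(axis=0)
Ko = np.zeros((N - r + 1, m + 1))
for p in range(N - r + 1):
    for q in range(m + 1):
        Ko[p, q] = colsum[r + p - q - 1]
```

Precomputing the column sums of `K` once replaces the script's innermost `for t = 1:N` loop, which otherwise recomputes the same sum for every `(p, q)` pair.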

## This note was uploaded on 07/04/2011 for the course ECE 501 taught by Professor Deniz during the Spring '11 term at Istanbul Universitesi.

