WienAsHammerValContEx - % Wiener identification : thsing...

%% Wiener identification: small-signal analysis of a function with SVM.
% Then we will construct a closed-loop system where the inverse model of
% the nonlinearity is present at the feedback. And we may add a controller
% such that the control design specifications are satisfied.
%
%% This file is very similar to WienerAsAHammerstein. Here the example of
%% valve control is examined, as examined in a paper.
%
clear all
u = .25*normrnd(0,1,1,700);  % a white Gaussian input sequence u of length 700,
                             % zero mean and standard deviation .25
%u = 8*rand(1,700) - 4;
e = .1*normrnd(0,.2,1,700);  % a white Gaussian error term e of length 700,
                             % zero mean and standard deviation .2
e = zeros(1,700);            % this is added after all; actually it should have
ic = 1i;                     % imaginary unit
a = [1 -1.4138 0.6065];      % a_i's
b = [0 0.1044 0.0883];       % b_i's
% The roots below will be used for the closed-loop system below. A
% controller is added to the roots rts_c:
rts_c = [1 .98*exp(ic) .98*exp(-ic) .98*exp(1.6*ic) .98*exp(-1.6*ic) ...
         .95*exp(2.5*ic) .95*exp(-2.5*ic)];
a_c = poly(rts_c);
% Now we will get the input-output data of the original Wiener system;
% small signals are used.
[h,tt] = impz(b,a);          % filter impulse response
v = conv(h,u);
v2 = .95*u;
y2 = conv(h,v2);
y = v./sqrt(0.1 + 0.9*v.^2); % static nonlinearity
%y = 3*(-.5 + 1./(1 + exp(-.5*v)));  %y = 2*v;  %y = (sin(v).*v) + 4*v;
figure(1);
subplot(4,1,1); plot(u(1:700)); title('input to the system');
subplot(4,1,2); plot(v(1:700)); title('output of the filter: before nonlinearity');
subplot(4,1,3); plot(y(1:700)); title('output of the whole system'); hold off
subplot(4,1,4); plot(y2(1:700)); title('output of the whole system; v2 = .95*u, Hammerstein-ish');
N = 200; m = 2; n = sum(size(a)) - 2; r = n + 1; sg = 1;
%% Solve the linear equation.
% Construct the kernel matrix K (the Omega matrix). The last two hundred
% data points will be used.
xtrain = u(201:400);
for i = 1:N
    for j = 1:N
        K(i,j) = exp(-((u(1,i+200) - u(1,j+200))^2)*1);
    end
end
% Construct Yf.
% Again the last two hundred data points will be used.
Yf = y(1,200+r:200+N);
% Construct Yp (n x N-r+1). Again the last two hundred data points will be used.
for i = 1:n
    Yp(i,1:N-r+1) = y(1,200+r-i:N+200-i);
end
% Construct Ko. Ko: 194 x 4 (expected); it is okay.
for p = 1:N-r+1
    for q = 1:m
        sumk = 0;
        for t = 1:N
            sumk = sumk + K(t,r+p-q-1);  % -1 is added compared to the original case
        end

        Ko(p,q) = sumk;
    end
end
% Construct Ksus. Ksus: 194 x 194 (expected, not sure): well, Ksus2 = Ksus.
% It is great.
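The simulation above (an LTI filter followed by a static saturating nonlinearity, i.e. a Wiener structure) can be sketched outside MATLAB for illustration. A minimal Python/NumPy re-implementation, assuming SciPy is available: the coefficients a, b and the nonlinearity y = v./sqrt(0.1 + 0.9*v.^2) come from the script, while the function name `wiener_output` is my own.

```python
import numpy as np
from scipy.signal import lfilter

def wiener_output(b, a, u):
    """Simulate a Wiener system: the LTI filter B(q)/A(q) followed by
    the static saturating nonlinearity f(v) = v / sqrt(0.1 + 0.9*v^2)."""
    v = lfilter(b, a, u)                # intermediate (unmeasured) signal
    y = v / np.sqrt(0.1 + 0.9 * v**2)  # static nonlinearity
    return v, y

rng = np.random.default_rng(0)
u = 0.25 * rng.standard_normal(700)    # small-signal white Gaussian input
b = [0, 0.1044, 0.0883]
a = [1, -1.4138, 0.6065]
v, y = wiener_output(b, a, u)
```

Note that `lfilter(b, a, u)` filters directly, which for this purpose replaces the script's `conv(impz(b,a), u)`; the nonlinearity is bounded by 1/sqrt(0.9), which is why small signals keep the system close to linear.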
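The triple loops building K and Yp can likewise be vectorized. A sketch in Python/NumPy, assuming the same RBF kernel K(i,j) = exp(-(u_i - u_j)^2) and the same past-output (Hankel-style) matrix Yp; the helper names `rbf_kernel_matrix` and `past_output_matrix` are mine, not from the script.

```python
import numpy as np

def rbf_kernel_matrix(x, gamma=1.0):
    """K[i, j] = exp(-gamma * (x[i] - x[j])^2), as in the K loops above."""
    d = x[:, None] - x[None, :]        # pairwise differences via broadcasting
    return np.exp(-gamma * d**2)

def past_output_matrix(y, r, n, N):
    """Row i (1-based) holds y(r-i : N-i), matching the MATLAB assignment
    Yp(i, 1:N-r+1) = y(r-i : N-i) over a 0-based segment y of length >= N."""
    return np.vstack([y[r - i - 1 : N - i] for i in range(1, n + 1)])

x = np.linspace(-1.0, 1.0, 200)        # stand-in for the training inputs
K = rbf_kernel_matrix(x)
```

The broadcasting form computes the same matrix as the double loop but in one shot; the kernel matrix is symmetric with a unit diagonal, which is a cheap sanity check on the construction.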

This note was uploaded on 07/04/2011 for the course ECE 501 taught by Professor Deniz during the Spring '11 term at Istanbul Universitesi.
