CS545 — Lecture 6: Control Theory II

CS545 — Contents VI: Control Theory II
- Linear Stability Analysis
- Linearization of Nonlinear Systems
- Discretization

Reading Assignment for Next Class
- See http://www-clmc.usc.edu/~cs545
Stability Analysis

Given the control system

  ẋ = f(x, u)   or   ẋ = Ax + Bu

how can we show that a particular choice of controller generates a stable control system? To get started, consider whether the generic dynamical system

  ẋ = f(x)   or   ẋ = Ax

is stable.
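For the linear case ẋ = Ax, stability can be checked numerically: the origin is asymptotically stable exactly when every eigenvalue of A has negative real part. A minimal sketch using NumPy (the matrices below are illustrative examples of my own choosing, not systems from the lecture):

```python
import numpy as np

def is_asymptotically_stable(A):
    """Return True if all eigenvalues of A have negative real part,
    i.e. the origin of x_dot = A x is asymptotically stable."""
    eigvals = np.linalg.eigvals(A)
    return bool(np.all(eigvals.real < 0))

# A damped second-order system: eigenvalues -1 and -2, hence stable.
A_stable = np.array([[0.0, 1.0],
                     [-2.0, -3.0]])

# An undamped system with a positive eigenvalue, hence unstable.
A_unstable = np.array([[0.0, 1.0],
                       [2.0, 0.0]])

print(is_asymptotically_stable(A_stable))    # True
print(is_asymptotically_stable(A_unstable))  # False
```

This eigenvalue test is specific to linear systems; for the nonlinear case ẋ = f(x) it applies only to the linearization about an equilibrium, which is the topic of the linearization section listed above.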
Equilibrium Points and Stability

Definition of an Equilibrium Point
A state x* is an equilibrium state (or equilibrium point) of the system if, once x(t) is equal to x*, it remains equal to x* for all future time. Mathematically, this means:

  x(t₀) = x*  ⇒  x(t) = x* for all t ≥ t₀,   i.e.,  f(x*) = 0.

Definition of Stability
An equilibrium state x* is said to be stable if, for any R > 0, there exists r > 0 such that if ||x(0) − x*|| < r, then ||x(t) − x*|| < R for all t ≥ 0. Otherwise, the equilibrium point is unstable.
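Both definitions can be illustrated numerically. Assuming a damped pendulum ẋ₁ = x₂, ẋ₂ = −sin(x₁) − x₂ as the example system (my choice, not one from the slides), the sketch below verifies the equilibrium condition f(x*) = 0 at the hanging position and then runs the r–R test of the stability definition over a finite horizon with Euler integration:

```python
import numpy as np

def f(x):
    """Damped pendulum dynamics x_dot = f(x) (illustrative example)."""
    return np.array([x[1], -np.sin(x[0]) - x[1]])

# Equilibrium check: f(x*) = 0 at the hanging position x* = (0, 0).
x_eq = np.array([0.0, 0.0])
print(np.allclose(f(x_eq), 0.0))  # True

# Stability check in the sense of the definition: start within r of the
# equilibrium and verify the trajectory stays within R.
r, R = 0.1, 1.0
x = x_eq + np.array([r / 2, 0.0])  # ||x(0) - x*|| < r
dt = 0.01
bounded = True
for _ in range(10000):
    x = x + dt * f(x)              # forward-Euler step
    if np.linalg.norm(x - x_eq) >= R:
        bounded = False
        break
print(bounded)  # True: trajectory stays within R
```

Note that this is only a finite-horizon numerical sanity check for one initial condition, not a proof: the definition quantifies over all R > 0 and all starting points within r, which simulation alone cannot establish.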