
# 11Ma2aPrac Lectures 6-9 - Lecture 6 (Dinakar Ramakrishnan)


## Lecture 6 (Dinakar Ramakrishnan): Stability

Stability is a very important, and not so transparent, concept. It arises from the need to know whether the solutions one comes up with for differential equations are stable, especially the special solutions called equilibrium solutions. What one wants to know is this: if we start with a solution close to a desired solution, will we stay close to it as things evolve in time? This is of practical importance, as it is next to impossible in real situations to manufacture exact solutions; when one deviates a little initially, one needs to know that what evolves later will stick close to the plan.

Suppose we are given a first-order ODE:

$$\frac{dy}{dt} = f(t, y). \tag{$*$}$$

When we succeed in solving this equation, we find that there are infinitely many solutions involving some constant $c$, due to the ambiguity arising from indefinite integration. Very important solutions are those which satisfy $\frac{dy}{dt} = 0$; they are called the **equilibrium solutions**.

Often, but not always, we start with an initial condition, say $y = y_0$ at $t = 0$, which allows us to solve for $c$ and find one solution. At times, though, it is profitable to look at the structure of all the solutions without fixing an initial state. There are many solutions if we do not fix an initial condition!

What one wants to do:

i) look for stable solutions;
ii) check whether the equilibrium solutions are **asymptotically stable**.

**Intuitive definition.** Suppose $y_1(t)$ is a solution of $(*)$. We say that it is *stable* iff any other solution $y_2(t)$ which starts out close to $y_1(t)$ at $t = 0$ remains close to $y_1(t)$ for all $t > 0$. To make this precise, we need to quantify what it means to be close.

**A precise definition of stability.** Start with a fixed solution $y_1(t)$ of $(*)$.
It is a *stable solution* if for every $\epsilon > 0$ there exists $\delta > 0$ such that, for every other solution $y_2(t)$ satisfying $|y_1(0) - y_2(0)| < \delta$, one has $|y_2(t) - y_1(t)| < \epsilon$ for all $t > 0$.

**Example.**

$$\frac{dy}{dt} = ry \tag{$\#$}$$

When $r \neq 0$, the only equilibrium solution is $y = 0$. When $r = 0$, every solution, necessarily of the form $y = \text{constant}$, is an equilibrium solution. When $y \neq 0$,

$$\frac{1}{y}\frac{dy}{dt} = r \;\Longrightarrow\; \int \frac{d}{dt}\bigl(\log |y|\bigr)\, dt = \int r \, dt, \quad \text{so} \quad \log |y| = rt + c.$$

**General solution:** $y = Be^{rt}$, where $B$ is any constant, with $B = 0$ corresponding to the equilibrium solution $y = 0$.

First consider the case $r < 0$. The solution $Be^{rt}$, for any fixed $B$, approaches the equilibrium solution $y = 0$ as $t \to \infty$. We claim the following: every solution of $(\#)$ is stable for $r < 0$. Indeed, fix a solution $y_B(t) = Be^{rt}$, consider any other solution $y_C(t) = Ce^{rt}$, and let $\epsilon$ be an arbitrary positive number. We have to be able to choose a $\delta > 0$ such that whenever $|B - C| < \delta$, we have $|y_B(t) - y_C(t)| < \epsilon$ ...
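The stability claim for $r < 0$ can also be checked numerically. This is a minimal sketch (my own addition, not part of the lecture): the gap between two solutions of $(\#)$ is $|y_B(t) - y_C(t)| = |B - C|\,e^{rt}$, which for $r < 0$ never exceeds its initial value $|B - C|$, so $\delta = \epsilon$ works. The values $r = -1$, $B = 1$, $C = 1.0005$, and the Euler step count are arbitrary choices for illustration.

```python
import math

def euler(f, y0, t_end, n):
    """Integrate dy/dt = f(t, y) from t = 0 to t_end with the forward Euler method."""
    h = t_end / n
    t, y = 0.0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

r = -1.0                 # decay rate: r < 0, the stable case
f = lambda t, y: r * y   # the ODE dy/dt = r*y from the lecture

B, C = 1.0, 1.0005       # two nearby initial values, |B - C| = 5e-4

# Exact solutions: y_B(t) = B*e^{rt}, y_C(t) = C*e^{rt}.
# Their gap |B - C|*e^{rt} only shrinks as t grows when r < 0.
for t in [0.0, 1.0, 5.0, 10.0]:
    gap = abs(B - C) * math.exp(r * t)
    print(f"t = {t:5.1f}   |y_B - y_C| = {gap:.2e}")

# Approximate the same gap with the Euler integrator:
yB = euler(f, B, 10.0, 10_000)
yC = euler(f, C, 10.0, 10_000)
print("Euler gap at t = 10:", abs(yB - yC))
```

The printout shows the gap decaying monotonically from its initial value, which is exactly why choosing $\delta = \epsilon$ suffices in the $r < 0$ case.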