The Exponential Series

Section 1

We consider the initial value problem
\[ X' = AX, \qquad X(0) = [1, 1]^t \tag{1} \]
where
\[ A = \begin{bmatrix} 2 & 1 \\ -4 & -2 \end{bmatrix}. \]
Then (as you can check) $\det(A - \lambda I) = \lambda^2$, so the only eigenvalue is $\lambda = 0$. The equation $AX_o = 0 \cdot X_o$ is equivalent to the system
\[ 2x_o + y_o = 0, \qquad -4x_o - 2y_o = 0. \]
The corresponding eigenspace is spanned by $[1, -2]^t$ and the straight-line solution is
\[ Y_1(t) = e^{0t} \begin{bmatrix} 1 \\ -2 \end{bmatrix} = \begin{bmatrix} 1 \\ -2 \end{bmatrix}. \]
To solve our initial value problem (1), we attempt to find a constant $C$ such that
\[ \begin{bmatrix} 1 \\ 1 \end{bmatrix} = C\,Y_1(0) = \begin{bmatrix} C \\ -2C \end{bmatrix}. \]
No such $C$ exists! The problem is that it takes two linearly independent vectors to span $\mathbb{R}^2$; we cannot hope to solve for a general initial value using only $Y_1(0)$.

We will solve our initial value problem by another, quite clever, technique. To explain it, consider for a moment the following initial value problem:
\[ y' = 2y, \qquad y(0) = y_o. \]
This is a separable equation and the solution is $y(t) = y_o e^{2t}$. Now recall that $e^x$ is given by the power series
\[ e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots + \frac{x^n}{n!} + \cdots \]
Hence,
\begin{align*}
y(t) = y_o e^{2t} &= y_o + 2t\,y_o + \frac{2^2 t^2}{2!}\,y_o + \frac{2^3 t^3}{3!}\,y_o + \cdots + \frac{2^n t^n}{n!}\,y_o + \cdots \\
&= y_o + t\,(2 y_o) + \frac{t^2}{2!}\,(2^2 y_o) + \frac{t^3}{3!}\,(2^3 y_o) + \cdots + \frac{t^n}{n!}\,(2^n y_o) + \cdots
\end{align*}
Is it conceivable that the solution to our initial value problem (1) can be expressed in exactly the same manner? That is,
\[ X(t) = e^{At} X_o = X_o + tAX_o + \frac{t^2}{2!} A^2 X_o + \frac{t^3}{3!} A^3 X_o + \cdots + \frac{t^n}{n!} A^n X_o + \cdots \tag{2} \]
(We interpret quantities such as $A^3 X_o$ as $A(A(AX_o))$.) To check this, we compute
\[ AX_o = \begin{bmatrix} 2 & 1 \\ -4 & -2 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 3 \\ -6 \end{bmatrix}, \qquad
A^2 X_o = A(AX_o) = \begin{bmatrix} 2 & 1 \\ -4 & -2 \end{bmatrix} \begin{bmatrix} 3 \\ -6 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]
Thus $A^n X_o = 0$ for all $n \ge 2$, so our reasoning suggests that
\[ X(t) = X_o + tAX_o = \begin{bmatrix} 1 \\ 1 \end{bmatrix} + t \begin{bmatrix} 3 \\ -6 \end{bmatrix} = \begin{bmatrix} 1 + 3t \\ 1 - 6t \end{bmatrix}. \]
As a check, we compute that
\[ AX(t) = \begin{bmatrix} 2 & 1 \\ -4 & -2 \end{bmatrix} \begin{bmatrix} 1 + 3t \\ 1 - 6t \end{bmatrix} = \begin{bmatrix} 3 \\ -6 \end{bmatrix} = X'(t). \]
It is also clear that $X(0) = [1, 1]^t$, so we have indeed solved our initial value problem.
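The finite-series computation above is easy to confirm numerically. The following sketch (using NumPy, which is my addition and not part of the original notes) recomputes $AX_o$ and $A^2X_o$, builds $X(t) = X_o + tAX_o$, and verifies that it satisfies both the differential equation and the initial condition:

```python
import numpy as np

# The matrix and initial condition from the worked example.
A = np.array([[2.0, 1.0],
              [-4.0, -2.0]])
X0 = np.array([1.0, 1.0])

AX0 = A @ X0      # first series term after X0: [3, -6]
A2X0 = A @ AX0    # [0, 0] -- the series (2) terminates here

# Since A^n X0 = 0 for n >= 2, the series collapses to X0 + t*A*X0.
def X(t):
    return X0 + t * AX0

# Check the ODE: X'(t) is the constant vector AX0, which must
# equal A @ X(t) for every t.
for t in [0.0, 0.5, 2.0]:
    assert np.allclose(A @ X(t), AX0)

# Check the initial condition X(0) = [1, 1]^t.
assert np.allclose(X(0.0), [1.0, 1.0])
```

The assertions pass because $A^2X_o = 0$ makes $X'(t) = AX_o$ a constant, exactly as in the hand computation.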
In general, it turns out that the series represented by formula (2) converges for any $n \times n$ matrix $A$ and $n \times 1$ column vector $X_o$. Granted this, and granted that we may differentiate the series term by term, we can show that it always represents a solution to the initial value problem $X' = AX$, $X(0) = X_o$. Specifically, from formula (2) we compute that $X(0) = X_o$ and
\begin{align*}
X'(t) &= AX_o + \frac{2t}{2!} A^2 X_o + \frac{3t^2}{3!} A^3 X_o + \cdots + \frac{n t^{n-1}}{n!} A^n X_o + \cdots \\
&= A\left( X_o + tAX_o + \frac{t^2}{2!} A^2 X_o + \frac{t^3}{3!} A^3 X_o + \cdots + \frac{t^{n-1}}{(n-1)!} A^{n-1} X_o + \cdots \right) \\
&= AX(t).
\end{align*}
Usually, this infinite series of vectors is difficult to sum explicitly. In our example we were aided by the fact that the seemingly infinite series (2) actually turned out to be a finite series, because $A(AX_o) = 0$. Was this due to our choice of initial data? Suppose instead we had set $X_o = [a, b]^t$. Then
\[ AX_o = \begin{bmatrix} 2 & 1 \\ -4 & -2 \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = a \begin{bmatrix} 2 \\ -4 \end{bmatrix} + b \begin{bmatrix} 1 \\ -2 \end{bmatrix}. \tag{3} \]
Hence
\[ A(AX_o) = aA \begin{bmatrix} 2 \\ -4 \end{bmatrix} + bA \begin{bmatrix} 1 \\ -2 \end{bmatrix} = a \begin{bmatrix} 0 \\ 0 \end{bmatrix} + b \begin{bmatrix} 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]
Thus, for the given system, the same technique would work for any initial data.
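Both closing claims can be illustrated numerically: the series terminates for any initial data because $A^2 = 0$, and for a general matrix the partial sums of formula (2) converge to the matrix exponential. A short sketch, with NumPy and SciPy as my additions (the helper name `expm_series` is hypothetical, introduced here for illustration):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 1.0],
              [-4.0, -2.0]])

# A is nilpotent: A^2 = 0, so A(A X_o) = 0 for ANY X_o = [a, b]^t,
# and the series (2) reduces to X_o + t A X_o.
assert np.allclose(A @ A, np.zeros((2, 2)))

def expm_series(A, t, terms=25):
    """Partial sum of formula (2): I + tA + (tA)^2/2! + ..."""
    total = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ (t * A) / k   # running term (tA)^k / k!
        total = total + term
    return total

# For our nilpotent A, the sum is exactly I + tA.
assert np.allclose(expm_series(A, 1.5), np.eye(2) + 1.5 * A)

# For a generic (non-nilpotent) matrix, the partial sums converge
# to the true matrix exponential, here checked against SciPy's expm.
B = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
assert np.allclose(expm_series(B, 2.0), expm(2.0 * B))
```

Truncating after 25 terms is more than enough here; for matrices with large norm, more terms (or a scaling-and-squaring method, as `expm` itself uses) would be needed.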
 Spring '08
 CHO