Eigen Methods
Math 246, Fall 2009, Professor David Levermore

Eigenpairs. Let A be a real n × n matrix. A number λ (possibly complex) is an eigenvalue of A if there exists a nonzero vector v (possibly complex) such that

(1)    A v = λ v.

Each such vector is an eigenvector associated with λ, and (λ, v) is an eigenpair of A.

Fact 1: If (λ, v) is an eigenpair of A then so is (λ, α v) for every complex α ≠ 0. In other words, if v is an eigenvector associated with an eigenvalue λ of A then so is α v for every complex α ≠ 0. In particular, eigenvectors are not unique.

Reason. Because (λ, v) is an eigenpair of A, you know that (1) holds. It follows that

A (α v) = α A v = α λ v = λ (α v).

Because the scalar α and the vector v are both nonzero, the vector α v is also nonzero. Therefore (λ, α v) is also an eigenpair of A. □

Finding Eigenvalues. Recall that the characteristic polynomial of A is defined by

(2)    p_A(z) = det(z I − A).

It has the form

p_A(z) = z^n + π_1 z^(n-1) + π_2 z^(n-2) + ··· + π_(n-1) z + π_n,

where the coefficients π_1, π_2, ···, π_n are real. In other words, it is a real monic polynomial of degree n. One can show that in general

π_1 = −tr(A),    π_n = (−1)^n det(A).

In particular, when n = 2 one has

p_A(z) = z^2 − tr(A) z + det(A).

Because det(z I − A) = (−1)^n det(A − z I), this definition of p_A(z) coincides with the book's definition when n is even, and is its negative when n is odd. Both conventions are common. We have chosen the convention that makes p_A(z) monic. What matters most about p_A(z) are its roots and their multiplicities, which are the same for both conventions.

Fact 2: A number λ is an eigenvalue of A if and only if p_A(λ) = 0. In other words, the eigenvalues of A are the roots of p_A(z).

Reason.
If λ is an eigenvalue of A then by (1) there exists a nonzero vector v such that

(λ I − A) v = λ v − A v = 0.

Because v is nonzero, the matrix λ I − A is singular, so p_A(λ) = det(λ I − A) = 0. Conversely, if p_A(λ) = det(λ I − A) = 0 then λ I − A is singular, so there exists a nonzero vector v such that (λ I − A) v = 0. It follows that

λ v − A v = (λ I − A) v = 0,

whereby λ and v satisfy (1), which implies λ is an eigenvalue of A. □

Fact 2 shows that the eigenvalues of an n × n matrix A can be found by finding all the roots of the characteristic polynomial of A. Because the degree of this characteristic polynomial is n, and because every polynomial of degree n has exactly n roots counting multiplicity, the n × n matrix A must have at least one eigenvalue and at most n distinct eigenvalues.

Example. Find the eigenvalues of

A = [ 3  2 ]
    [ 2  3 ].

Solution. Because n = 2, p_A(z) = z^2 − tr(A) z + det(A) = z^2 − 6 z + 5 = (z − 1)(z − 5), so the eigenvalues of A are λ = 1 and λ = 5.
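Both facts can be checked numerically for the example matrix. The sketch below (plain Python; the helper name eig_2x2 is my own, not from the notes) computes the eigenvalues of A = [[3, 2], [2, 3]] as the roots of p_A(z) = z^2 − tr(A) z + det(A) via the quadratic formula, then verifies Fact 2 by substitution and Fact 1 by rescaling an eigenvector:

```python
import math

def eig_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] as roots of
    p_A(z) = z^2 - tr(A) z + det(A), via the quadratic formula.
    Assumes a nonnegative discriminant, i.e. real eigenvalues."""
    tr, det = a + d, a * d - b * c
    s = math.sqrt(tr * tr - 4.0 * det)
    return (tr - s) / 2.0, (tr + s) / 2.0

# Example matrix A = [[3, 2], [2, 3]]: tr(A) = 6, det(A) = 5,
# so p_A(z) = z^2 - 6z + 5 = (z - 1)(z - 5).
lam1, lam2 = eig_2x2(3.0, 2.0, 2.0, 3.0)
print(lam1, lam2)  # -> 1.0 5.0

# Fact 2: each eigenvalue is a root of the characteristic polynomial.
for lam in (lam1, lam2):
    assert lam * lam - 6.0 * lam + 5.0 == 0.0

# Fact 1: if (lam, v) is an eigenpair, so is (lam, alpha * v) for alpha != 0.
v = (1.0, -1.0)  # an eigenvector for lam1 = 1, since A v = v
for alpha in (1.0, -2.5):
    w = (alpha * v[0], alpha * v[1])
    Aw = (3.0 * w[0] + 2.0 * w[1], 2.0 * w[0] + 3.0 * w[1])
    assert Aw == (lam1 * w[0], lam1 * w[1])
```

Note that solving p_A(z) = 0 exactly is only practical for small n; this sketch exploits the n = 2 formula from the notes rather than a general root-finding routine.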