Eigenvalue problems
Ax = y maps a vector x to a vector y in the column space of A (a square n×n matrix). If y is parallel to x, then Ax = λx. What are the solutions for λ (the eigenvalue) and x (the eigenvector)? The homogeneous equation (A − λI)x = 0 has nontrivial eigenvector solutions iff A − λI is singular.
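As a quick numerical illustration of the definition Ax = λx (a sketch using NumPy; the 2×2 matrix is a hypothetical example, not from the slides):

```python
import numpy as np

# A hypothetical 2x2 matrix (illustrative only)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Solve A x = lambda x numerically
eigvals, eigvecs = np.linalg.eig(A)

# Each eigenpair satisfies A x = lambda x
for i, lam in enumerate(eigvals):
    x = eigvecs[:, i]
    assert np.allclose(A @ x, lam * x)

# A - lambda*I is singular exactly at the eigenvalues
for lam in eigvals:
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
```

The second loop checks the singularity condition directly: det(A − λI) vanishes at each eigenvalue.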
Finding the eigenvalues
The determinant and the characteristic equation: det(A − λI) = 0. The order of the characteristic polynomial = n = the number of roots; matrix properties (e.g., trace and determinant) appear in the polynomial's coefficients. Nature of the roots:
• Real or complex
• Distinct, repeated, or multiple (algebraic multiplicity)

Eigenvector and eigenspace associated with an eigenvalue
Eigenvectors span the nullspace of (A − λI):
• Eigenspace = Nullspace(A − λI)
• Dimension of the eigenspace = n − r(A − λI)
• What is the geometric multiplicity? The dimension of the eigenspace
• Algebraic multiplicity ≥ geometric multiplicity? Examples of "defective" (degenerate) matrices
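The geometric-multiplicity computation can be sketched as follows (the defective 2×2 matrix is a standard hypothetical example, not taken from the slides):

```python
import numpy as np

# A "defective" matrix: eigenvalue 1 with algebraic multiplicity 2
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0
n = A.shape[0]

# Geometric multiplicity = dim Nullspace(A - lam*I) = n - r(A - lam*I)
r = np.linalg.matrix_rank(A - lam * np.eye(n))
geometric_multiplicity = n - r
print(geometric_multiplicity)  # 1, smaller than the algebraic multiplicity 2
```

Here the geometric multiplicity (1) is strictly smaller than the algebraic multiplicity (2), which is exactly what makes the matrix defective.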
Diagonal and triangular matrices
The diagonal elements are the eigenvalues. Why? Evaluate the characteristic equation.
• Eigenvectors for distinct eigenvalues
• Eigenvectors for a repeated (multiple) eigenvalue
• Algebraic multiplicity = geometric multiplicity? Different cases
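A small check of the diagonal-entries claim (a sketch with a hypothetical upper-triangular matrix; for a triangular A, det(A − λI) is the product of the terms (aᵢᵢ − λ), so the roots are the diagonal entries):

```python
import numpy as np

# Hypothetical upper-triangular matrix: eigenvalues should be 2, 3, 4
T = np.array([[2.0, 5.0, 7.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])

# det(T - lam*I) = (2 - lam)(3 - lam)(4 - lam), so the
# eigenvalues are exactly the diagonal entries
eigvals = np.linalg.eigvals(T)
assert np.allclose(sorted(eigvals.real), [2.0, 3.0, 4.0])
```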
Idempotent matrix
PP…P = P (from P² = P)
• Examples related to projections (where we also have Pᵀ = P)
• Eigenvalues are either 0 or 1, but why?
• Eigenvectors and eigenspace
• The case of the zero eigenvalue λ = 0

Rotation matrix
Example: what is the matrix that rotates by 90 degrees? Can Ax be parallel to x? It looks like something impossible, and indeed the eigenvalues are complex conjugates.

Markov matrices
Probability matrix or stochastic matrix? Fixed column sums or fixed row sums. Is this fixed value an eigenvalue?
• The case of column sum = 1
• The case of row sum = 1
A Markov matrix takes the current state to the next state: x(t+1) = Ax(t)
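The column-sum-1 case can be illustrated numerically (a sketch; the 2×2 column-stochastic matrix is hypothetical, not from the slides):

```python
import numpy as np

# Hypothetical Markov (column-stochastic) matrix: columns sum to 1
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
assert np.allclose(A.sum(axis=0), 1.0)

# The fixed column sum 1 is an eigenvalue: the all-ones row vector
# satisfies 1^T A = 1^T, so 1 is an eigenvalue of A^T, and A shares
# its eigenvalues with A^T.
eigvals, eigvecs = np.linalg.eig(A)
assert np.isclose(max(eigvals.real), 1.0)

# Iterating x(t+1) = A x(t) converges to the eigenvector for lambda = 1
x = np.array([1.0, 0.0])
for _ in range(200):
    x = A @ x
# x is now approximately the steady state [2/3, 1/3]
```

The iteration converges because the other eigenvalue (0.7 here) has magnitude below 1, so its component dies out under repeated multiplication.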
Eigenvectors and LI
Distinct eigenvalues (all with algebraic multiplicity = 1) give distinct eigenvectors (all with geometric multiplicity = 1). k distinct eigenvalues ⇒ k linearly independent (LI) eigenvectors. n LI eigenvectors form a basis for R^n.
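A sketch of the independence claim (the 3×3 matrix is a hypothetical example with three distinct eigenvalues, not from the slides):

```python
import numpy as np

# Hypothetical symmetric matrix with three distinct eigenvalues
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
eigvals, P = np.linalg.eig(A)

# The eigenvalues are distinct ...
assert len(set(np.round(eigvals.real, 8))) == 3

# ... so the n eigenvectors (the columns of P) are linearly
# independent: P has full rank and its columns form a basis of R^3
assert np.linalg.matrix_rank(P) == 3
```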
Similar matrices
A is similar to B if there is an invertible matrix P such that P⁻¹AP = B. The mapping from A to P⁻¹AP is also called a similarity transformation. Similar matrices have the same characteristic equation and hence the same eigenvalues. Why?

Diagonalization of A
Why diagonalization? Decoupling, factorization, natural coordinates, powers of matrices, … A is diagonalizable iff A has n LI eigenvectors (which then form an eigenvector basis of R^n). A = PDP⁻¹; what are D and P?
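The factorization A = PDP⁻¹ and its payoff for matrix powers can be sketched as follows (a NumPy illustration with a hypothetical matrix; D is the diagonal matrix of eigenvalues and P has the eigenvectors as columns):

```python
import numpy as np

# Hypothetical diagonalizable matrix
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# P holds the eigenvectors as columns, D the eigenvalues on its diagonal
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Verify the factorization A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Decoupling pays off for powers: A^k = P D^k P^{-1},
# and D^k just raises each diagonal entry to the k-th power
k = 5
assert np.allclose(np.linalg.matrix_power(A, k),
                   P @ np.diag(eigvals**k) @ np.linalg.inv(P))
```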
This note was uploaded on 11/13/2010 for the course SEEM2018 taught by Professor Chan during the Spring '10 term at CUHK.