# Unit VI: Eigenvectors, Eigenvalues, and Diagonalization

## 1. Matrix of a Linear Transformation with Respect to a Basis

Recall that a linear transformation on $\mathbb{R}^n$ is a function $T$ from $\mathbb{R}^n$ to $\mathbb{R}^n$ that satisfies

$$T(a\mathbf{x} + b\mathbf{y}) = aT(\mathbf{x}) + bT(\mathbf{y}), \qquad \mathbf{x}, \mathbf{y} \in \mathbb{R}^n, \quad a, b \text{ scalars}.$$

Associated with $T$ is the $n \times n$ matrix $A$ defined so that $T(\mathbf{x}) = A\mathbf{x}$ for $\mathbf{x} \in \mathbb{R}^n$. The $k$th column of $A$ contains the coordinates of $T(\mathbf{e}_k)$ with respect to the standard basis:

$$T(\mathbf{e}_k) = \sum_{j=1}^{n} A_{jk}\,\mathbf{e}_j, \qquad 1 \le k \le n.$$

We extend this definition to an arbitrary basis $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ for $\mathbb{R}^n$ by defining the **matrix of $T$ with respect to the basis** $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ to be the $n \times n$ matrix $A$ satisfying

$$T(\mathbf{v}_k) = \sum_{j=1}^{n} A_{jk}\,\mathbf{v}_j, \qquad 1 \le k \le n.$$

Note that the summation is over the first index, not the second. Thus the $k$th column of $A$ contains the coordinates of $T(\mathbf{v}_k)$ with respect to the basis $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$. This definition depends on the specific order of the vectors $\mathbf{v}_1, \dots, \mathbf{v}_n$ in the basis. With this in mind, we occasionally refer to $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ as an **ordered basis** for $\mathbb{R}^n$.

**Example.** Let $T : \mathbb{R}^2 \to \mathbb{R}^2$ be the reflection in the line $\{y = x\}$. Since $T(\mathbf{e}_1) = \mathbf{e}_2$ and $T(\mathbf{e}_2) = \mathbf{e}_1$, the matrix of $T$ with respect to the standard basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ is

$$A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$

The vectors $\mathbf{w}_1 = \mathbf{e}_1 + \mathbf{e}_2$ and $\mathbf{w}_2 = -\mathbf{e}_1 + \mathbf{e}_2$ also form a basis for $\mathbb{R}^2$. Since $T(\mathbf{w}_1) = \mathbf{w}_1$ and $T(\mathbf{w}_2) = -\mathbf{w}_2$, the matrix of $T$ with respect to the basis $\{\mathbf{w}_1, \mathbf{w}_2\}$ is the diagonal matrix

$$B = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.$$

Let $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ be a basis for $\mathbb{R}^n$. Each $\mathbf{v} \in \mathbb{R}^n$ can be expressed uniquely as a linear combination $\mathbf{v} = \sum_j c_j \mathbf{v}_j$, where the scalars $c_1, \dots, c_n$ are the coordinates of $\mathbf{v}$ with respect to the basis. Then

$$T(\mathbf{v}) = T\Big(\sum_k c_k \mathbf{v}_k\Big) = \sum_k c_k\, T(\mathbf{v}_k) = \sum_k c_k \Big(\sum_j A_{jk}\,\mathbf{v}_j\Big).$$

If we interchange the order of summation, this latter sum becomes $\sum_j \big(\sum_k A_{jk} c_k\big)\mathbf{v}_j$. This shows that $T(\mathbf{v}) = \sum_j d_j \mathbf{v}_j$, where $d_j = \sum_k A_{jk} c_k$.
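The reflection example can be checked numerically. The sketch below (plain Python; the names `T`, `w1`, `w2`, and the helper `coords` are mine, not from the notes) computes the matrix of $T$ with respect to $\{\mathbf{w}_1, \mathbf{w}_2\}$ column by column: express each $T(\mathbf{w}_k)$ in $\mathbf{w}$-coordinates, and place those coordinates in column $k$.

```python
# Reflection in the line y = x: swaps the two coordinates.
def T(x):
    return [x[1], x[0]]

# Ordered basis: w1 = e1 + e2, w2 = -e1 + e2.
w1, w2 = [1.0, 1.0], [-1.0, 1.0]

def coords(v, b1, b2):
    """Coordinates (c1, c2) of v in the ordered basis {b1, b2},
    i.e. the solution of c1*b1 + c2*b2 = v, via Cramer's rule."""
    det = b1[0] * b2[1] - b2[0] * b1[1]
    c1 = (v[0] * b2[1] - b2[0] * v[1]) / det
    c2 = (b1[0] * v[1] - v[0] * b1[1]) / det
    return [c1, c2]

# Column k of B holds the w-coordinates of T(w_k)
# (summation over the FIRST index, as in the definition).
col1 = coords(T(w1), w1, w2)
col2 = coords(T(w2), w1, w2)
B = [[col1[0], col2[0]],
     [col1[1], col2[1]]]
print(B)  # [[1.0, 0.0], [0.0, -1.0]] -- the diagonal matrix diag(1, -1)
```

The same recipe works for any linear transformation and any ordered basis: the only change is solving a larger linear system for each column.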
Thus the coordinates $d_1, \dots, d_n$ of $T(\mathbf{v})$ are related to the coordinates $c_1, \dots, c_n$ of $\mathbf{v}$ by the matrix equation $\mathbf{d} = A\mathbf{c}$. We state our result formally.

**Theorem 1.** Let $T$ be a linear transformation on $\mathbb{R}^n$, let $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ be a basis for $\mathbb{R}^n$, and let $A$ be the matrix of $T$ with respect to $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$. If $\mathbf{v} = \sum_{k=1}^{n} c_k \mathbf{v}_k$, then $T(\mathbf{v}) = \sum_{j=1}^{n} d_j \mathbf{v}_j$, where

$$d_j = \sum_{k=1}^{n} A_{jk}\, c_k, \qquad 1 \le j \le n.$$

Note that the summation is over the second index, not the first. This apparent inconsistency with the formula for the images of the basis vectors has been the bane of linear algebra students for generations. As the well-known mathematician Paul Halmos once pointed out, this inconsistency is not a perversity of mathematicians but rather a perversity of nature.
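Theorem 1 can be sanity-checked numerically. The sketch below (plain Python; the variable names are mine) uses the reflection example, whose matrix with respect to $\{\mathbf{w}_1, \mathbf{w}_2\}$ is the diagonal matrix $\mathrm{diag}(1, -1)$. It picks arbitrary coordinates $\mathbf{c}$, forms $\mathbf{d} = B\mathbf{c}$, and checks that $\sum_j d_j \mathbf{w}_j$ equals $T(\mathbf{v})$ computed directly.

```python
# Reflection in the line y = x.
def T(x):
    return [x[1], x[0]]

w = [[1.0, 1.0], [-1.0, 1.0]]   # ordered basis {w1, w2}
B = [[1.0, 0.0], [0.0, -1.0]]   # matrix of T w.r.t. {w1, w2}, from the example

c = [3.0, 2.0]                   # arbitrary coordinates of v in the w-basis
v = [c[0] * w[0][i] + c[1] * w[1][i] for i in range(2)]   # v = c1 w1 + c2 w2

# Theorem 1: d_j = sum_k B[j][k] * c[k] -- summation over the SECOND index.
d = [sum(B[j][k] * c[k] for k in range(2)) for j in range(2)]

lhs = T(v)                                                 # T(v) directly
rhs = [d[0] * w[0][i] + d[1] * w[1][i] for i in range(2)]  # sum_j d_j w_j
print(lhs, rhs)  # both [5.0, 1.0]
```

Here $\mathbf{v} = 3\mathbf{w}_1 + 2\mathbf{w}_2 = (1, 5)$, so $T(\mathbf{v}) = (5, 1)$; and $\mathbf{d} = B\mathbf{c} = (3, -2)$ gives $3\mathbf{w}_1 - 2\mathbf{w}_2 = (5, 1)$ as well.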

*This note was uploaded on 06/25/2008 for the course MATH 33a taught by Professor Lee during the Spring '08 term at UCLA.*
