Unit VII: Symmetric Matrices

1. Spectral Theorem

A linear transformation on a subspace $V$ of $\mathbb{R}^n$ is a function $T$ from $V$ to $V$ that satisfies
$$T(a\mathbf{x} + b\mathbf{y}) = aT(\mathbf{x}) + bT(\mathbf{y}), \qquad \mathbf{x}, \mathbf{y} \in V, \quad a, b \text{ scalars}.$$
The matrix of $T$ with respect to the basis $\{\mathbf{v}_1, \dots, \mathbf{v}_m\}$ of $V$ is the $m \times m$ matrix $A$ satisfying
$$T(\mathbf{v}_k) = \sum_{j=1}^{m} A_{jk}\,\mathbf{v}_j, \qquad 1 \le k \le m.$$
These definitions are virtually the same as those given earlier for a linear transformation on $\mathbb{R}^n$. Again note that the $k$th column of $A$ contains the coordinates of $T(\mathbf{v}_k)$ with respect to the basis $\{\mathbf{v}_1, \dots, \mathbf{v}_m\}$. Also note that if $c_1, \dots, c_m$ are the components of $\mathbf{u} \in V$ with respect to the basis $\{\mathbf{v}_1, \dots, \mathbf{v}_m\}$, and $d_1, \dots, d_m$ are the components of $T(\mathbf{u})$, then $d_j = \sum_k A_{jk} c_k$.

Lemma. If $\{\mathbf{u}_1, \dots, \mathbf{u}_m\}$ is an orthonormal basis for $V$, then the matrix $A$ of $T$ with respect to the basis is given by
$$A_{jk} = T(\mathbf{u}_k) \cdot \mathbf{u}_j, \qquad 1 \le j, k \le m.$$
Proof. This follows from $T(\mathbf{u}_k) \cdot \mathbf{u}_j = \left(\sum_r A_{rk}\,\mathbf{u}_r\right) \cdot \mathbf{u}_j = A_{jk}$, since $\mathbf{u}_r \cdot \mathbf{u}_j = 1$ if $r = j$ and $0$ otherwise.

Definition. A symmetric linear transformation on a subspace $V$ of $\mathbb{R}^n$ is a linear transformation $T$ on $V$ that satisfies
$$T(\mathbf{x}) \cdot \mathbf{y} = \mathbf{x} \cdot T(\mathbf{y}), \qquad \mathbf{x}, \mathbf{y} \in V.$$

Lemma. Let $T$ be a linear transformation on a subspace $V$ of $\mathbb{R}^n$, and let $\{\mathbf{u}_1, \dots, \mathbf{u}_m\}$ be an orthonormal basis for $V$. Then $T$ is a symmetric linear transformation if and only if the matrix of $T$ with respect to the basis $\{\mathbf{u}_1, \dots, \mathbf{u}_m\}$ is a symmetric matrix.

Proof. This follows from $A_{jk} = T(\mathbf{u}_k) \cdot \mathbf{u}_j = \mathbf{u}_k \cdot T(\mathbf{u}_j) = T(\mathbf{u}_j) \cdot \mathbf{u}_k = A_{kj}$.

Theorem 1. Let $A$ be a (real) symmetric $n \times n$ matrix. Then the roots of the characteristic polynomial $p_A(\lambda) = \det(\lambda I - A)$ of $A$ are real. Thus $A$ has $n$ real eigenvalues (counting multiplicity).

Proof sketch. Let $\lambda$ be a root of the characteristic polynomial of $A$. Then there is a nonzero vector $\mathbf{z} = (z_1, \dots, z_n) \in \mathbb{C}^n$ that satisfies $(\lambda I - A)\mathbf{z} = \mathbf{0}$. We express each $z_j = x_j + i y_j$ as the sum of its real and imaginary parts. Then $\mathbf{z} = \mathbf{x} + i\mathbf{y}$, where $\mathbf{x} = (x_1, \dots, x_n)$ and $\mathbf{y} = (y_1, \dots, y_n)$ are vectors in $\mathbb{R}^n$.
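The lemma above gives a concrete recipe for computing the matrix of a transformation from an orthonormal basis. As a sketch (the matrix $S$ and the random seed are hypothetical example data, not from the notes), the following verifies numerically that for a symmetric $T(\mathbf{x}) = S\mathbf{x}$ the formula $A_{jk} = T(\mathbf{u}_k)\cdot\mathbf{u}_j$ produces a symmetric matrix:

```python
import numpy as np

# A symmetric linear transformation T(x) = S x on V = R^3,
# where S is a real symmetric matrix (hypothetical example data).
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# An orthonormal basis {u_1, u_2, u_3} of R^3: the columns of Q,
# obtained here from a QR factorization of a random matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Matrix of T with respect to the basis, entry by entry:
# A_jk = T(u_k) . u_j  (the lemma's formula).
m = Q.shape[1]
A = np.array([[(S @ Q[:, k]) @ Q[:, j] for k in range(m)]
              for j in range(m)])

# Since T is symmetric and the basis is orthonormal, A is symmetric;
# assembling all entries at once, A is exactly Q^T S Q.
print(np.allclose(A, A.T))           # True
print(np.allclose(A, Q.T @ S @ Q))   # True
```

Note that the entrywise formula and the matrix product $Q^{\mathsf T} S Q$ agree; the dot products $T(\mathbf{u}_k)\cdot\mathbf{u}_j$ are just the entries of that product.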
We denote $\bar{\mathbf{z}} = \mathbf{x} - i\mathbf{y}$. Then from $z_j \bar{z}_j = |z_j|^2$, we obtain $\mathbf{z} \cdot \bar{\mathbf{z}} = \sum_j |z_j|^2 \ne 0$. Since $A\mathbf{z} = \lambda\mathbf{z}$, we have
$$\lambda \sum_j |z_j|^2 = \sum_j \lambda z_j \bar{z}_j = \sum_j (A\mathbf{z})_j \bar{z}_j = \sum_{j,k} A_{jk} z_k \bar{z}_j.$$
Since the matrix $A$ is symmetric and has real entries, $A_{jk} = A_{kj} = \overline{A_{kj}}$, so this is equal to
$$\sum_{j,k} z_k\, \overline{A_{kj} z_j} = \sum_k z_k \overline{(A\mathbf{z})_k} = \sum_k z_k \overline{\lambda z_k} = \bar{\lambda} \sum_k |z_k|^2.$$
Dividing by $\sum_j |z_j|^2 \ne 0$, we conclude that $\lambda = \bar{\lambda}$, and $\lambda$ is real.

Theorem 2. If $T$ is a symmetric linear transformation, then eigenvectors of $T$ corresponding to different eigenvalues are orthogonal.

Proof. If $T(\mathbf{u}) = \lambda\mathbf{u}$ and $T(\mathbf{v}) = \mu\mathbf{v}$, then
$$\lambda\, \mathbf{u} \cdot \mathbf{v} = T(\mathbf{u}) \cdot \mathbf{v} = \mathbf{u} \cdot T(\mathbf{v}) = \mu\, \mathbf{u} \cdot \mathbf{v}.$$
Thus either $\lambda = \mu$ or $\mathbf{u} \cdot \mathbf{v} = 0$.
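Theorems 1 and 2 can be checked numerically. The sketch below (the matrix $S$ is hypothetical example data) confirms that a real symmetric matrix has real eigenvalues, and that the eigenvector matrix returned by a symmetric eigensolver is orthogonal:

```python
import numpy as np

# A real symmetric matrix (hypothetical example data).
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Theorem 1: even the general eigensolver, which would report
# complex eigenvalues if there were any, finds only real ones.
evals, _ = np.linalg.eig(S)
print(np.allclose(np.asarray(evals).imag, 0.0))   # True

# Theorem 2: eigenvectors for distinct eigenvalues are orthogonal.
# np.linalg.eigh exploits symmetry and returns eigenvectors as the
# columns of V; V^T V is (numerically) the identity.
evals_h, V = np.linalg.eigh(S)
print(np.allclose(V.T @ V, np.eye(3)))            # True
```

For eigenvalues with multiplicity greater than one, Theorem 2 says nothing about eigenvectors within the same eigenspace, but `eigh` orthonormalizes those as well, which is why $V^{\mathsf T}V = I$ holds in general for its output.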
This note was uploaded on 06/25/2008 for the course MATH 33a taught by Professor Lee during the Spring '08 term at UCLA.