Orthogonal Projections and Reflections (with exercises)
by D. Klain
Version 2010.01.23
Corrections and comments are welcome!

Orthogonal Projections

Let $X_1, \ldots, X_k$ be a family of linearly independent (column) vectors in $\mathbb{R}^n$, and let
$$W = \mathrm{Span}(X_1, \ldots, X_k).$$
In other words, the vectors $X_1, \ldots, X_k$ form a basis for the $k$-dimensional subspace $W$ of $\mathbb{R}^n$.

Suppose we are given another vector $Y \in \mathbb{R}^n$. How can we project $Y$ onto $W$ orthogonally? In other words, can we find a vector $\hat{Y} \in W$ so that $Y - \hat{Y}$ is orthogonal (perpendicular) to all of $W$? See Figure 1.

To begin, translate this question into the language of matrices and dot products. We need to find a vector $\hat{Y} \in W$ such that
$$(Y - \hat{Y}) \perp Z, \quad \text{for all vectors } Z \in W. \tag{1}$$
Actually, it is enough to know that $Y - \hat{Y}$ is perpendicular to the vectors $X_1, \ldots, X_k$ that span $W$. This would imply that (1) holds. (Why?) Expressing this using dot products, we need to find $\hat{Y} \in W$ so that
$$X_i^T (Y - \hat{Y}) = 0, \quad \text{for all } i = 1, 2, \ldots, k. \tag{2}$$
This condition involves taking $k$ dot products, one for each $X_i$. We can do them all at once by setting up a matrix $A$ using the $X_i$ as the columns of $A$; that is, let
$$A = \begin{bmatrix} X_1 & X_2 & \cdots & X_k \end{bmatrix}.$$
Note that each vector $X_i \in \mathbb{R}^n$ has $n$ coordinates, so $A$ is an $n \times k$ matrix. The set of conditions listed in (2) can now be rewritten:
$$A^T (Y - \hat{Y}) = 0, \quad \text{which is equivalent to} \quad A^T Y = A^T \hat{Y}. \tag{3}$$

Figure 1: Projection of a vector onto a subspace.

Meanwhile, we need the projected vector $\hat{Y}$ to be a vector in $W$, since we are projecting onto $W$. This means that $\hat{Y}$ lies in the span of the vectors $X_1, \ldots, X_k$. In other words,
$$\hat{Y} = c_1 X_1 + c_2 X_2 + \cdots + c_k X_k = A \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_k \end{bmatrix} = AC,$$
where $C$ is a $k$-dimensional column vector. Combining this with the matrix equation (3), we have
$$A^T Y = A^T A C.$$
If we knew what $C$ was, then we would also know $\hat{Y}$, since we were given the columns $X_i$ of $A$, and $\hat{Y} = AC$.
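As a concrete numerical check on the normal equations $A^T Y = A^T A C$, here is a short sketch using NumPy. The basis vectors $X_1, X_2$ and the vector $Y$ are made-up examples, not taken from the notes:

```python
import numpy as np

# Made-up basis X1, X2 for a 2-dimensional subspace W of R^3.
X1 = np.array([1.0, 0.0, 1.0])
X2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([X1, X2])   # A is n x k = 3 x 2

Y = np.array([1.0, 2.0, 4.0])   # vector to project onto W

# Solve the normal equations  A^T A C = A^T Y  for C.
C = np.linalg.solve(A.T @ A, A.T @ Y)
Y_hat = A @ C                   # the orthogonal projection of Y onto W

# The residual Y - Y_hat should be orthogonal to every column of A,
# i.e. A^T (Y - Y_hat) should be (numerically) the zero vector.
print(A.T @ (Y - Y_hat))
```

Solving the $k \times k$ system directly, as above, is generally preferred in practice over forming an explicit inverse of $A^T A$.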
To solve for $C$, just invert the $k \times k$ matrix $A^T A$ to get
$$(A^T A)^{-1} A^T Y = C. \tag{4}$$
How do we know that $(A^T A)^{-1}$ exists? Let us assume it does for now, and address this question later on.

Now, finally, we can find our projected vector $\hat{Y}$. Since $\hat{Y} = AC$, multiply both sides of (4) by $A$ to obtain
$$A (A^T A)^{-1} A^T Y = AC = \hat{Y}.$$
The matrix
$$Q = A (A^T A)^{-1} A^T$$
is called the projection matrix for the subspace $W$. According to our derivation above, the projection matrix $Q$ maps a vector $Y \in \mathbb{R}^n$ to its orthogonal projection (i.e., its shadow) $QY = \hat{Y}$ in the subspace $W$.
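The projection matrix $Q = A(A^T A)^{-1} A^T$ can also be checked numerically. The sketch below (NumPy, with the same made-up basis as the illustration above) verifies the two characteristic properties of an orthogonal projection, $Q^2 = Q$ and $Q^T = Q$, and that the residual $Y - QY$ is orthogonal to $W$:

```python
import numpy as np

# Made-up basis for W: columns X1 = (1,0,1), X2 = (0,1,1).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Projection matrix Q = A (A^T A)^{-1} A^T.
Q = A @ np.linalg.inv(A.T @ A) @ A.T

# Q is idempotent: projecting twice is the same as projecting once.
print(np.allclose(Q @ Q, Q))   # True

# Q is symmetric.
print(np.allclose(Q.T, Q))     # True

# QY lies in W, and Y - QY is orthogonal to W.
Y = np.array([1.0, 2.0, 4.0])
Y_hat = Q @ Y
print(np.allclose(A.T @ (Y - Y_hat), 0))   # True
```

Idempotence reflects the geometry: once a vector has been dropped into $W$, projecting it again leaves it fixed.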
This note was uploaded on 01/14/2011 for the course ECE 210a taught by Professor Chandrasekara during the Fall '08 term at UCSB.