Projection matrix
Orthogonal projection
Let E be a vector space, and S a subspace of E. Project a vector x orthogonally on S, and denote by u the projection of x. How can we calculate u?
This problem should not be ignored by the statistician, as it appears on several important occasions:
* Principal Component Analysis (PCA) is all about orthogonal projections.
* Multiple Linear Regression is fundamentally a problem in orthogonal projection.
* The distributional and independence properties of quadratic forms in multivariate normal vectors are also fundamental in problems of variance decomposition (ANOVA and Multiple Linear Regression), and call on the concept of orthogonal projection.
Projection matrices
Orthogonal projection problems can be nicely represented and treated within the framework of Linear Algebra.
Projection of a vector
"Orthogonal projection on S" is a linear operator, and can therefore be conveniently represented by a matrix P_S. We'll show that if Z_S is a matrix whose columns form an orthonormal basis of the subspace S, then the orthogonal projection u of any vector x is given by:
u = P_S x = (Z_S Z'_S) x
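As a quick numerical check of this formula, here is a minimal NumPy sketch; the particular subspace and vector are illustrative choices, not taken from the text:

```python
import numpy as np

# Illustrative example: project x in R^3 onto the plane S spanned by e1 and e2.
# The columns of Z_S form an orthonormal basis of S.
Z_S = np.array([[1.0, 0.0],
                [0.0, 1.0],
                [0.0, 0.0]])

P_S = Z_S @ Z_S.T           # projection matrix P_S = Z_S Z'_S
x = np.array([2.0, 3.0, 5.0])
u = P_S @ x                 # orthogonal projection of x on S
# The component of x orthogonal to S (here, the third coordinate) is dropped,
# so u = [2, 3, 0].
```

Note that P_S is idempotent (P_S P_S = P_S), as any projection matrix must be: projecting a vector that already lies in S leaves it unchanged.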
Uniqueness of the projection matrix
From the above result, it would seem that the projection matrix P_S depends on the particular orthonormal basis chosen for spanning S. In fact, we'll show that P_S does not depend on the choice of this basis. In other words, let
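This uniqueness claim can also be illustrated numerically: two different orthonormal bases of the same subspace yield the same projection matrix. A minimal NumPy sketch, where the random subspace and the rotation R (an arbitrary orthogonal change of basis within S) are illustrative assumptions, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 2-dimensional subspace S of R^4, spanned by the columns of a random matrix.
A = rng.standard_normal((4, 2))
Z1, _ = np.linalg.qr(A)            # first orthonormal basis of S = col(A)

# Rotate the basis vectors within S by an arbitrary 2x2 orthogonal matrix R.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Z2 = Z1 @ R                        # second orthonormal basis of the same S

P1 = Z1 @ Z1.T
P2 = Z2 @ Z2.T
# P2 = Z1 R R' Z1' = Z1 Z1' = P1, since R R' = I:
# both bases produce the same projection matrix.
```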
Fall '08, Chandrasekara. Linear Algebra, projection matrix.