Example
Consider
\[
A = \begin{pmatrix} 0.5 & -0.6 \\ 0.75 & 1.1 \end{pmatrix}.
\]
We have found that $\lambda = 0.8 - 0.6i$ and a corresponding eigenvector is
\[
v = \begin{pmatrix} -2 - 4i \\ 5 \end{pmatrix}.
\]
Let $P = [\operatorname{Re} v \;\; \operatorname{Im} v]$. We shall discover that $C = P^{-1}AP$ is precisely a rotation: there is a rotation hidden in $A$, and $P$ can be considered as a change-of-variables matrix.
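A quick numerical check of this example (a sketch using NumPy, which is assumed to be available):

```python
import numpy as np

# Matrix, eigenvalue, and eigenvector from the example above.
A = np.array([[0.5, -0.6],
              [0.75, 1.1]])
lam = 0.8 - 0.6j
v = np.array([-2 - 4j, 5 + 0j])

# Confirm that (lam, v) really is an eigenpair of A.
assert np.allclose(A @ v, lam * v)

# P = [Re v  Im v] and C = P^{-1} A P.
P = np.column_stack([v.real, v.imag])
C = np.linalg.inv(P) @ A @ P

# C = [[0.8, -0.6], [0.6, 0.8]]; since 0.8^2 + 0.6^2 = 1, this is exactly
# the rotation matrix with cos(theta) = 0.8 and sin(theta) = 0.6.
print(np.round(C, 10))
```

Here $|\lambda| = \sqrt{0.8^2 + 0.6^2} = 1$, which is why $C$ is exactly a rotation rather than a rotation followed by a scaling.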
Theorem 5.4. Let $A$ be a real $2 \times 2$ matrix with a complex eigenvalue $\lambda = a - bi$ with $b \neq 0$ and an associated eigenvector $v$ in $\mathbb{C}^2$. Then
\[
A = PCP^{-1}, \quad \text{where } P = [\operatorname{Re} v \;\; \operatorname{Im} v] \text{ and } C = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}.
\]
5. Discrete Dynamical Systems
CHAPTER 6
Orthogonality and Least Squares
1. Inner Product, Length and Orthogonality
Given two vectors $u$ and $v$ in $\mathbb{R}^n$, we can define the inner product $u \cdot v$ of them as $u^T v$.
Example
Compute $u \cdot v$ and $v \cdot v$ for
\[
u = \begin{pmatrix} 2 \\ -5 \\ -1 \end{pmatrix}, \quad v = \begin{pmatrix} 3 \\ 2 \\ -3 \end{pmatrix}.
\]
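A direct computation of this example (a sketch, assuming NumPy):

```python
import numpy as np

u = np.array([2, -5, -1])
v = np.array([3, 2, -3])

# u . v = u^T v = 2*3 + (-5)*2 + (-1)*(-3) = 6 - 10 + 3 = -1
# v . v = 3*3 + 2*2 + (-3)*(-3) = 9 + 4 + 9 = 22
print(u @ v)   # -1
print(v @ v)   # 22
```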
Theorem 6.1. Let $u$, $v$ and $w$ be vectors in $\mathbb{R}^n$ and let $c$ be a scalar. Then
(1) $u \cdot v = v \cdot u$.
(2) $(u + v) \cdot w = u \cdot w + v \cdot w$.
(3) $(cu) \cdot v = c(u \cdot v) = u \cdot (cv)$.
(4) $u \cdot u \geq 0$, and $u \cdot u = 0$ if and only if $u = 0$.
The length (or norm) of a vector $v$ is the nonnegative scalar $\|v\|$ defined by
\[
\|v\| = \sqrt{v \cdot v}.
\]
Example
Let
\[
v = \begin{pmatrix} 1 \\ -2 \\ 2 \\ 0 \end{pmatrix}.
\]
Find a unit vector in the same direction as $v$.
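The computation for this example, sketched in NumPy:

```python
import numpy as np

v = np.array([1.0, -2.0, 2.0, 0.0])

# ||v|| = sqrt(1 + 4 + 4 + 0) = 3, so the unit vector is v / 3.
norm = np.sqrt(v @ v)
u = v / norm

print(norm)   # 3.0
print(u)      # (1/3, -2/3, 2/3, 0)
assert np.isclose(u @ u, 1.0)  # u indeed has length 1
```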
Let $u$ and $v$ be two vectors in $\mathbb{R}^n$. The distance between $u$ and $v$, written as $\operatorname{dist}(u, v)$, is the length of the vector $u - v$, that is,
\[
\operatorname{dist}(u, v) = \|u - v\|.
\]
In $\mathbb{R}^2$ or $\mathbb{R}^3$, we realize that two vectors $u$ and $v$ are perpendicular if and only if the distance between $u$ and $v$ is the same as that from $u$ to $-v$. Computing the squares of these two distances and setting them equal, we get
\[
\|u\|^2 + \|v\|^2 + 2u \cdot v = \|u\|^2 + \|v\|^2 - 2u \cdot v.
\]
Now we must have $u \cdot v = 0$.
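A small numerical illustration of this equivalence (the vectors here are chosen for illustration, not taken from the text):

```python
import numpy as np

# A perpendicular pair: u . v = 3*4 + 4*(-3) = 0.
u = np.array([3.0, 4.0])
v = np.array([4.0, -3.0])
assert u @ v == 0

dist_uv = np.linalg.norm(u - v)      # dist(u, v)
dist_u_negv = np.linalg.norm(u + v)  # dist(u, -v) = ||u + v||

# Perpendicularity forces the two distances to agree.
assert np.isclose(dist_uv, dist_u_negv)
```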
Two vectors $u$ and $v$ in $\mathbb{R}^n$ are said to be orthogonal if $u \cdot v = 0$.
So we have that $u$ and $v$ are orthogonal if and only if
\[
\|u + v\|^2 = \|u\|^2 + \|v\|^2.
\]
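This Pythagorean identity can be checked numerically (again with an illustrative orthogonal pair of my own choosing):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([4.0, -3.0])   # orthogonal to u, since u . v = 0

lhs = np.linalg.norm(u + v) ** 2                       # ||u + v||^2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2  # ||u||^2 + ||v||^2
assert np.isclose(lhs, rhs)   # both equal 50 = 25 + 25
```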
Let $W$ be a subspace of $\mathbb{R}^n$. If a vector $z$ is orthogonal to every vector in $W$, then $z$ is said to be orthogonal to $W$. The set of all vectors $z$ that are orthogonal to $W$ is called the orthogonal complement of $W$ and is denoted by $W^\perp$. We have $(W^\perp)^\perp = W$. $W^\perp$ is also a subspace of $\mathbb{R}^n$. A vector $v$ is in $W^\perp$ if and only if $v$ is perpendicular to every vector in a set that spans $W$.
Theorem 6.2. Let $A$ be an $m \times n$ matrix. The orthogonal complement of the row space of $A$ is the null space of $A$, and the orthogonal complement of the column space of $A$ is the null space of $A^T$.
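A concrete check of the first statement, using a small matrix chosen here for illustration:

```python
import numpy as np

# Illustrative 2 x 3 matrix (not from the text).
A = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0, 2.0]])

# z = (1, -2, 1) satisfies A z = 0, so z is in Nul A.
z = np.array([1.0, -2.0, 1.0])
assert np.allclose(A @ z, 0)

# By Theorem 6.2, z is orthogonal to Row A: it suffices to check
# that z is perpendicular to each row of A, since the rows span Row A.
for row in A:
    assert row @ z == 0
```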
If $u$ and $v$ are two vectors in $\mathbb{R}^2$ or $\mathbb{R}^3$, then we have
\[
u \cdot v = \|u\| \, \|v\| \cos\theta,
\]
where $\theta$ is the angle between the two vectors. So we can easily compute the angle between two vectors.
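For instance, solving the formula for $\theta$ gives $\theta = \arccos\bigl(u \cdot v / (\|u\| \|v\|)\bigr)$; a quick check with an illustrative pair whose angle is known:

```python
import numpy as np

# The angle between (1, 0) and (1, 1) should be 45 degrees = pi/4.
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)
assert np.isclose(theta, np.pi / 4)
```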
2. Orthogonal Sets
A set of vectors $\{v_1, v_2, \cdots, v_p\}$ in $\mathbb{R}^n$ is said to be an orthogonal set if $v_i \cdot v_j = 0$ for all $i \neq j$.
Theorem 6.3. If $S$ is an orthogonal set of nonzero vectors in $\mathbb{R}^n$, then $S$ is linearly independent and hence is a basis for the subspace spanned by $S$.
Proof.
Give a proof.
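The theorem can also be verified numerically on a sample orthogonal set (the three vectors below are chosen here for illustration):

```python
import numpy as np

# An orthogonal set of nonzero vectors in R^3.
v1 = np.array([3.0, 1.0, 1.0])
v2 = np.array([-1.0, 2.0, 1.0])
v3 = np.array([-0.5, -2.0, 3.5])

# Pairwise dot products vanish, so {v1, v2, v3} is an orthogonal set.
for a, b in [(v1, v2), (v1, v3), (v2, v3)]:
    assert a @ b == 0

# As the theorem predicts, the set is linearly independent: the matrix
# with these vectors as columns has full rank 3.
S = np.column_stack([v1, v2, v3])
assert np.linalg.matrix_rank(S) == 3
```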