(c) If A′ is produced from A by interchanging a pair of columns or a pair of rows,
then det A′ = − det A.
(d) If A′ is produced from A by adding a multiple of a row to another row, then
det A′ = det A.
3. If A is upper-triangular:
\[
\det A =
\begin{vmatrix}
a_{11} & * & * & \cdots & * \\
0 & a_{22} & * & \cdots & * \\
0 & 0 & a_{33} & \cdots & * \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
0 & 0 & 0 & \cdots & a_{nn}
\end{vmatrix}
= a_{11} \cdot a_{22} \cdot a_{33} \cdots a_{nn}
\]
Proof by induction on the size n:
• n = 1: For a 1 × 1 matrix A = [a], det A = a.
• Assume the statement is true for [k − 1, k − 1] matrices.
• If A is a [k, k] matrix, then M_kk is the determinant of a [k − 1, k − 1] matrix,
and by the inductive assumption, expanding det A along the last row,
\[
\det A = 0 \cdot C_{k1} + 0 \cdot C_{k2} + \cdots + a_{kk} \cdot C_{kk}
= M_{kk}\, a_{kk} = (a_{11} \cdot a_{22} \cdots a_{k-1,k-1}) \cdot a_{kk},
\]
since C_kk = (−1)^{k+k} M_kk = M_kk.
4. det AB = det A · det B.
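A quick numerical check of (4), with arbitrarily chosen 2 × 2 matrices (not from the notes):
\[
A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \quad
B = \begin{pmatrix} 2 & 0 \\ 1 & 1 \end{pmatrix}, \quad
AB = \begin{pmatrix} 4 & 2 \\ 10 & 4 \end{pmatrix},
\]
so det AB = 16 − 20 = −4, while det A · det B = (−2)(2) = −4, as claimed.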
An important consequence is that determinants can be calculated by first row reducing
using the properties in (2), and then applying property (3) to the resulting upper-triangular
row echelon matrix. Just remember to keep track of the multiplicative factors which
accumulate when constants are pulled out using the properties in (2).
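For illustration, here is a small worked example of this procedure (the matrix is chosen
arbitrarily, and the first step assumes the scaling rule from the earlier parts of (2): a
common factor of a row can be pulled out of the determinant):
\[
\det\begin{pmatrix} 2 & 4 & 2 \\ 1 & 3 & 4 \\ 3 & 7 & 5 \end{pmatrix}
= 2 \det\begin{pmatrix} 1 & 2 & 1 \\ 1 & 3 & 4 \\ 3 & 7 & 5 \end{pmatrix}
= 2 \det\begin{pmatrix} 1 & 2 & 1 \\ 0 & 1 & 3 \\ 0 & 1 & 2 \end{pmatrix}
= 2 \det\begin{pmatrix} 1 & 2 & 1 \\ 0 & 1 & 3 \\ 0 & 0 & -1 \end{pmatrix}
= 2 \cdot (1 \cdot 1 \cdot (-1)) = -2.
\]
The second equality subtracts the first row from the second and three times the first row
from the third (property (d)), the third subtracts the new second row from the third, and
the last step is property (3).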
Another consequence is verifying one important aspect of Theorem 2, namely:
A is an invertible matrix ⇐⇒ det A ≠ 0.
Proof. A is not invertible
⇐⇒ A′ has a zero row, where A′ is a row echelon matrix derived from A
⇐⇒ det A′ = 0
⇐⇒ det A = 0,
since by the properties in (2) row operations change the determinant only by a sign or a
non-zero multiplicative factor.
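As a quick illustration (the matrix is chosen arbitrarily):
\[
\det\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} = 1 \cdot 4 - 2 \cdot 2 = 0,
\]
and indeed subtracting twice the first row from the second produces a zero row, so this
matrix is not invertible.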

1.7 Eigenvalues and Eigenspaces
The concept of eigenvalues and their associated eigenspaces (eigenvectors) leads to an
understanding of the possible behaviour of the general linear operator A : V → V.
Definition 10. Given the linear operator A, a vector x ∈ V is called an eigenvector of A
with eigenvalue λ, a scalar, if x is non-zero and the eigenvalue equation is satisfied:
\[
A x = \lambda x \tag{1.12}
\]
or equivalently
\[
(\lambda I - A)\, x = 0.
\]
For each eigenvalue λ, the subspace spanned by the λ-eigenvectors is called the λ-eigenspace.
Example 3. If R is a rotation about an axis ω, then Rω = ω, so ω is an eigenvector
with eigenvalue 1. Any non-zero scalar multiple of ω is also a 1-eigenvector.
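As a concrete instance (an illustration, not from the notes): for a rotation by angle θ
about the z-axis in R³,
\[
R = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix},
\qquad
R\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix},
\]
so ω = (0, 0, 1) is an eigenvector with eigenvalue 1.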
Theorem 4. If V is a complex vector space of dimension N, then any linear operator
A : V → V has at least one and at most N eigenvalues.
Proof. (For matrices A : C^N → C^N.) It is clear that x = 0 is always a solution of
Ax = λx.
Parts 3 and 12 of our big theorem on invertibility, Theorem 2, imply that a non-zero
solution exists if and only if λI − A is non-invertible, if and only if det(λI − A) = 0.
This equation is called the characteristic equation, and is satisfied by any eigenvalue λ.
A bit of thought convinces you that det(λI − A) is a (complex) polynomial in λ of degree
N, some of whose coefficients can be identified:
\[
\det(\lambda I - A) = \lambda^N - \operatorname{Tr}(A)\, \lambda^{N-1} + \cdots + (-1)^N \det(A)
\]
where the trace is
\[
\operatorname{Tr}(A) = \sum_{i=1}^{N} a_{ii}.
\]
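For example (a 2 × 2 illustration not taken from the notes), for N = 2 the formula reads
det(λI − A) = λ² − Tr(A) λ + det(A); with
\[
A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad
\det(\lambda I - A) = \lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3),
\]
so the eigenvalues are λ = 1 and λ = 3.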
Finally, the fundamental theorem of algebra says that this polynomial must have at least
one and no more than N complex roots: these roots are the eigenvalues of A, and the set
of all of them is called the spectrum of A.
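Note that there can be fewer than N distinct eigenvalues, since roots may repeat; for
instance (an illustration not from the notes),
\[
A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad
\det(\lambda I - A) = (\lambda - 1)^2,
\]
so A has the single eigenvalue λ = 1.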

