LINEAR ALGEBRA
W W L CHEN
c W W L Chen, 1997, 2005. This chapter is available free to all individuals, on the understanding that it is not to be used for ﬁnancial gain,
and may be downloaded and/or photocopied, with or without permission from the author.
However, this document may not be kept on any information storage and retrieval system without permission
from the author, unless such system is not accessible to any individuals other than its owners.

Chapter 12
COMPLEX VECTOR SPACES

12.1. Complex Inner Products
Our task in this section is to deﬁne a suitable complex inner product. We begin by giving a reminder of
the basics of complex vector spaces or vector spaces over C.
Definition. A complex vector space V is a set of objects, known as vectors, together with vector
addition + and multiplication of vectors by elements of C, and satisfying the following properties:
(VA1) For every u, v ∈ V , we have u + v ∈ V .
(VA2) For every u, v, w ∈ V , we have u + (v + w) = (u + v) + w.
(VA3) There exists an element 0 ∈ V such that for every u ∈ V , we have u + 0 = 0 + u = u.
(VA4) For every u ∈ V , there exists −u ∈ V such that u + (−u) = 0.
(VA5) For every u, v ∈ V , we have u + v = v + u.
(SM1) For every c ∈ C and u ∈ V , we have cu ∈ V .
(SM2) For every c ∈ C and u, v ∈ V , we have c(u + v) = cu + cv.
(SM3) For every a, b ∈ C and u ∈ V , we have (a + b)u = au + bu.
(SM4) For every a, b ∈ C and u ∈ V , we have (ab)u = a(bu).
(SM5) For every u ∈ V , we have 1u = u.
Remark. Subspaces of complex vector spaces can be deﬁned in a similar way as for real vector spaces.
An example of a complex vector space is the euclidean space Cn consisting of all vectors of the form
u = (u1 , . . . , un ), where u1 , . . . , un ∈ C. We shall first generalize the concepts of dot product, norm and
distance developed for Rn in Chapter 9.
Definition. Suppose that u = (u1 , . . . , un ) and v = (v1 , . . . , vn ) are vectors in Cn . The complex
euclidean inner product of u and v is deﬁned by
$u \cdot v = u_1\overline{v_1} + \cdots + u_n\overline{v_n},$

the complex euclidean norm of u is defined by

$\|u\| = (u \cdot u)^{1/2} = (|u_1|^2 + \cdots + |u_n|^2)^{1/2},$

and the complex euclidean distance between u and v is defined by

$d(u, v) = \|u - v\| = (|u_1 - v_1|^2 + \cdots + |u_n - v_n|^2)^{1/2}.$
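As a concrete illustration, the three definitions above can be checked numerically. The following is a minimal numpy sketch; the vectors themselves are arbitrary choices made only for this example:

```python
import numpy as np

# Arbitrary illustrative vectors in C^3.
u = np.array([1 + 2j, -1j, 3 + 0j])
v = np.array([2 - 1j, 1 + 1j, 2j])

# Complex euclidean inner product: u . v = u_1 conj(v_1) + ... + u_n conj(v_n).
inner = np.sum(u * np.conj(v))

# Complex euclidean norm: ||u|| = (u . u)^(1/2); note u . u = |u_1|^2 + ... + |u_n|^2.
norm_u = np.sqrt(np.sum(u * np.conj(u)).real)

# Complex euclidean distance: d(u, v) = ||u - v||.
dist = np.sqrt(np.sum(np.abs(u - v) ** 2))
```

Note that numpy's `np.vdot(a, b)` conjugates its *first* argument, the opposite convention to the text, which is why the sum is written out explicitly here.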
Corresponding to Proposition 9A, we have the following result.
PROPOSITION 12A. Suppose that u, v, w ∈ Cn and c ∈ C. Then
(a) $u \cdot v = \overline{v \cdot u}$;
(b) u · (v + w) = (u · v) + (u · w);
(c) c(u · v) = (cu) · v; and
(d) u · u ≥ 0, and u · u = 0 if and only if u = 0.
The following deﬁnition is motivated by Proposition 12A.
Definition. Suppose that V is a complex vector space. By a complex inner product on V , we mean a
function $\langle \cdot , \cdot \rangle : V \times V \to C$ which satisfies the following conditions:
(IP1) For every u, v ∈ V , we have $\langle u, v\rangle = \overline{\langle v, u\rangle}$.
(IP2) For every u, v, w ∈ V , we have $\langle u, v + w\rangle = \langle u, v\rangle + \langle u, w\rangle$.
(IP3) For every u, v ∈ V and c ∈ C, we have $c\langle u, v\rangle = \langle cu, v\rangle$.
(IP4) For every u ∈ V , we have $\langle u, u\rangle \geq 0$, and $\langle u, u\rangle = 0$ if and only if u = 0.
Definition. A complex vector space with an inner product is called a complex inner product space or
a unitary space.
Definition. Suppose that u and v are vectors in a complex inner product space V . Then the norm of
u is deﬁned by
$\|u\| = \langle u, u\rangle^{1/2}$, and the distance between u and v is defined by
$d(u, v) = \|u - v\|$.
Using this inner product, we can discuss orthogonality, orthogonal and orthonormal bases, the Gram-Schmidt orthogonalization process, as well as orthogonal projections, in a similar way as for real inner product spaces. In particular, the results in Sections 9.4 and 9.5 can be generalized to the case of complex inner product spaces.

12.2. Unitary Matrices
For matrices with real entries, orthogonal matrices and symmetric matrices play an important role in the
orthogonal diagonalization problem. For matrices with complex entries, the analogous roles are played
by unitary matrices and hermitian matrices respectively.
Definition. Suppose that A is a matrix with complex entries. Suppose further that the matrix $\overline{A}$ is obtained from the matrix A by replacing each entry of A by its complex conjugate. Then the matrix $A^* = \overline{A}^{\,t}$ is called the conjugate transpose of the matrix A.

PROPOSITION 12B. Suppose that A and B are matrices with complex entries, and that c ∈ C. Then
(a) $(A^*)^* = A$;
(b) $(A + B)^* = A^* + B^*$;
(c) $(cA)^* = \overline{c}A^*$; and
(d) $(AB)^* = B^*A^*$.
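Proposition 12B is easy to verify numerically. Below is a small sketch using numpy; the matrices and the scalar are arbitrary illustrative values, and `conj_transpose` is a helper name of our own, not from the text:

```python
import numpy as np

def conj_transpose(M):
    """The conjugate transpose A* = (entrywise conjugate of A), transposed."""
    return np.conj(M).T

# Arbitrary matrices with complex entries, for illustration only.
A = np.array([[1 + 1j, 2 - 1j],
              [3j, 4 + 0j]])
B = np.array([[2 + 0j, 1 + 1j],
              [1 - 2j, 1j]])
c = 2 - 3j

# Proposition 12B, parts (a)-(d):
assert np.allclose(conj_transpose(conj_transpose(A)), A)                          # (a)
assert np.allclose(conj_transpose(A + B), conj_transpose(A) + conj_transpose(B))  # (b)
assert np.allclose(conj_transpose(c * A), np.conj(c) * conj_transpose(A))         # (c)
assert np.allclose(conj_transpose(A @ B), conj_transpose(B) @ conj_transpose(A))  # (d)
```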
Definition. A square matrix A with complex entries and satisfying the condition A−1 = A∗ is said to
be a unitary matrix.
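For instance, the matrix below (an assumed example, not taken from the text) is unitary, and this can be checked directly from the definition:

```python
import numpy as np

# An illustrative 2x2 unitary matrix.
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)

Ustar = np.conj(U).T

# By definition, U is unitary when U^{-1} = U*, i.e. U* U = I.
assert np.allclose(Ustar @ U, np.eye(2))
assert np.allclose(np.linalg.inv(U), Ustar)

# Equivalently (Proposition 12C below), the rows of U are orthonormal
# under the complex euclidean inner product.
r0, r1 = U
assert np.isclose(np.sum(r0 * np.conj(r0)), 1)
assert np.isclose(np.sum(r0 * np.conj(r1)), 0)
```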
Corresponding to Proposition 10B, we have the following result.
PROPOSITION 12C. Suppose that A is an n × n matrix with complex entries. Then
(a) A is unitary if and only if the row vectors of A form an orthonormal basis of Cn under the complex
euclidean inner product; and
(b) A is unitary if and only if the column vectors of A form an orthonormal basis of Cn under the
complex euclidean inner product.

12.3. Unitary Diagonalization
Corresponding to the orthogonal diagonalization problem in Section 10.3, we now discuss the following
unitary diagonalization problem.
Definition. A square matrix A with complex entries is said to be unitarily diagonalizable if there exists
a unitary matrix P with complex entries such that P −1 AP = P ∗ AP is a diagonal matrix with complex
entries.
First of all, we would like to determine which matrices are unitarily diagonalizable. For those that
are, we then need to discuss how we may ﬁnd a unitary matrix P to carry out the diagonalization. As
before, we study the question of eigenvalues and eigenvectors of a given matrix; these are deﬁned as for
the real case without any change.
In Section 10.3, we have indicated that a square matrix with real entries is orthogonally diagonalizable
if and only if it is symmetric. The most natural extension to the complex case is the following.
Definition. A square matrix A with complex entries is said to be hermitian if A = A∗ .
Unfortunately, it is not true that a square matrix with complex entries is unitarily diagonalizable
if and only if it is hermitian. While it is true that every hermitian matrix is unitarily diagonalizable,
there are unitarily diagonalizable matrices that are not hermitian. The explanation is provided by the
following.
Definition. A square matrix A with complex entries is said to be normal if AA∗ = A∗ A.
Remark. Note that every hermitian matrix is normal and every unitary matrix is normal.
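The remark can be checked numerically. In the sketch below the specific matrices are our own illustrative choices:

```python
import numpy as np

def is_normal(M):
    """Check the normality condition A A* = A* A."""
    Mstar = np.conj(M).T
    return np.allclose(M @ Mstar, Mstar @ M)

# A hermitian matrix (A = A*) is normal.
H = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
assert is_normal(H)

# A unitary matrix is normal, since A A* = A* A = I.
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)
assert is_normal(U)

# A normal matrix need not be hermitian: here N N^t = N^t N = 2I, but N != N^t.
N = np.array([[1.0, 1.0],
              [-1.0, 1.0]])
assert is_normal(N) and not np.allclose(N, np.conj(N).T)

# A non-normal matrix, which by Proposition 12D below is not unitarily diagonalizable.
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])
assert not is_normal(M)
```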
Corresponding to Propositions 10E and 10G, we have the following results.
PROPOSITION 12D. Suppose that A is an n × n matrix with complex entries. Then it is unitarily
diagonalizable if and only if it is normal.
PROPOSITION 12E. Suppose that u1 and u2 are eigenvectors of a normal matrix A with complex
entries, corresponding to distinct eigenvalues λ1 and λ2 respectively. Then u1 · u2 = 0. In other words,
eigenvectors of a normal matrix corresponding to distinct eigenvalues are orthogonal.
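Proposition 12E can be observed numerically. The matrix below is a hermitian (hence normal) matrix of our own choosing, with distinct eigenvalues 1 and 4:

```python
import numpy as np

# Hermitian, hence normal; trace 5 and determinant 4 give eigenvalues 1 and 4.
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])

eigvals, eigvecs = np.linalg.eig(A)
u1, u2 = eigvecs[:, 0], eigvecs[:, 1]

# Distinct eigenvalues ...
assert not np.isclose(eigvals[0], eigvals[1])
# ... give orthogonal eigenvectors: u1 . u2 = 0.
assert np.isclose(np.sum(u1 * np.conj(u2)), 0)
```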
We can now follow the procedure below.
UNITARY DIAGONALIZATION PROCESS. Suppose that A is a normal n × n matrix with
complex entries.
(1) Determine the n complex roots λ1 , . . . , λn of the characteristic polynomial det(A − λI ), and ﬁnd
n linearly independent eigenvectors u1 , . . . , un of A corresponding to these eigenvalues as in the
Diagonalization process.
(2) Apply the Gram-Schmidt orthogonalization process to the eigenvectors u1 , . . . , un to obtain orthogonal eigenvectors v1 , . . . , vn of A, noting that eigenvectors corresponding to distinct eigenvalues are already orthogonal.
(3) Normalize the orthogonal eigenvectors v1 , . . . , vn to obtain orthonormal eigenvectors w1 , . . . , wn of
A. These form an orthonormal basis of Cn . Furthermore, write

$P = \begin{pmatrix} w_1 & \cdots & w_n \end{pmatrix}$ and $D = \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}$,

where λ1 , . . . , λn ∈ C are the eigenvalues of A and where w1 , . . . , wn ∈ Cn are respectively their orthogonalized and normalized eigenvectors. Then $P^*AP = D$.
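In the special case where A is hermitian (hence normal), numpy's `eigh` carries out steps (1)-(3) in a single call, returning real eigenvalues and an already orthonormal set of eigenvector columns. A minimal sketch, with the matrix again an illustrative choice:

```python
import numpy as np

# An illustrative hermitian matrix.
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])

# eigh is specific to hermitian matrices: it returns real eigenvalues and
# orthonormal eigenvector columns, i.e. exactly the matrix P of the process.
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)
Pstar = np.conj(P).T

assert np.allclose(Pstar @ P, np.eye(2))  # P is unitary
assert np.allclose(Pstar @ A @ P, D)      # P* A P = D
```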
We conclude this chapter by discussing the following important result which implies Proposition 10F,
that all the eigenvalues of a symmetric real matrix are real.
PROPOSITION 12F. Suppose that A is a hermitian matrix. Then all the eigenvalues of A are real.
Sketch of Proof. Suppose that A is a hermitian matrix. Suppose further that λ is an eigenvalue of
A, with corresponding eigenvector v. Then
Av = λv.
Multiplying on the left by the conjugate transpose v∗ of v, we obtain
v∗ Av = v∗ λv = λv∗ v.
To show that λ is real, it suﬃces to show that the 1 × 1 matrices v∗ Av and v∗ v both have real entries.
Now
(v∗ Av)∗ = v∗ A∗ (v∗ )∗ = v∗ Av
and
(v∗ v)∗ = v∗ (v∗ )∗ = v∗ v.
It follows that both v∗ Av and v∗ v are hermitian. It is easy to prove that hermitian matrices must have real entries on the main diagonal. Since v∗ Av and v∗ v are 1 × 1, it follows that they are real.

Problems for Chapter 12
1. Consider the set V of all matrices of the form

$\begin{pmatrix} z & 0 \\ 0 & \overline{z} \end{pmatrix}$, where z ∈ C,

with matrix addition and scalar multiplication. Determine whether V forms a complex vector space.
2. Is Rn a subspace of Cn ? Justify your assertion.
3. Prove Proposition 12A.
4. Suppose that u, v, w are elements of a complex inner product space, and that c ∈ C.
a) Show that $\langle u + v, w\rangle = \langle u, w\rangle + \langle v, w\rangle$.
b) Show that $\langle u, cv\rangle = \overline{c}\,\langle u, v\rangle$.
5. Let V be the vector space of all continuous functions f : [0, 1] → C. Show that

$\langle f, g\rangle = \int_0^1 f(x)\,\overline{g(x)}\,dx$

defines a complex inner product on V .
6. Suppose that u, v are elements of a complex inner product space, and that c ∈ C.
a) Show that $\langle u - cv, u - cv\rangle = \langle u, u\rangle - \overline{c}\,\langle u, v\rangle - c\,\overline{\langle u, v\rangle} + c\overline{c}\,\langle v, v\rangle$.
b) Deduce that $\langle u, u\rangle - \overline{c}\,\langle u, v\rangle - c\,\overline{\langle u, v\rangle} + c\overline{c}\,\langle v, v\rangle \geq 0$.
c) Prove the Cauchy-Schwarz inequality, that $|\langle u, v\rangle|^2 \leq \langle u, u\rangle\,\langle v, v\rangle$.
7. Generalize the results in Sections 9.4 and 9.5 to the case of complex inner product spaces. Try to
prove as many results as possible.
8. Prove Proposition 12B.
9. Prove Proposition 12C.
10. Prove that the diagonal entries of every hermitian matrix are all real.
11. Suppose that A is a square matrix with complex entries.
a) Prove that $\det(\overline{A}) = \overline{\det A}$.
b) Deduce that $\det(A^*) = \overline{\det A}$.
c) Prove that if A is hermitian, then det A is real.
d) Prove that if A is unitary, then $|\det A| = 1$.
12. Apply the Unitary diagonalization process to each of the following matrices:

a) $A = \begin{pmatrix} 4 & 1-i \\ 1+i & 5 \end{pmatrix}$

b) $A = \begin{pmatrix} 3 & -i \\ i & 3 \end{pmatrix}$

c) $A = \begin{pmatrix} 5 & 0 & 0 \\ 0 & -1 & -1+i \\ 0 & -1-i & 0 \end{pmatrix}$
13. Suppose that λ1 and λ2 are distinct eigenvalues of a hermitian matrix A, with eigenvectors u1 and u2 respectively.
a) Show that $u_1^* A u_2 = \lambda_1 u_1^* u_2$ and $u_1^* A u_2 = \lambda_2 u_1^* u_2$.
b) Complete the proof of Proposition 12E.
14. Suppose that A is a square matrix with complex entries, and that A∗ = −A.
a) Show that iA is a hermitian matrix.
b) Show that A is unitarily diagonalizable but has purely imaginary eigenvalues.