Chapter 1
Introduction and Review
1. If A ∈ ℝⁿˣⁿ and α is a scalar, what is det(αA)? What is det(−A)?

Answer 1.1 Suppose A ∈ ℝⁿˣⁿ and α ∈ ℝ. Denote the identity
of ℝⁿˣⁿ by Iₙ. Then det(αA) = det((αIₙ)A) =
det(αIₙ) det A = αⁿ det A.
Thus det(−A) = det((−1)A) = (−1)ⁿ det A.
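Both identities are easy to confirm numerically; the matrix and scalar below are arbitrary illustrative choices, not from the text:

```python
import numpy as np

# Check det(alpha*A) = alpha^n * det(A) and det(-A) = (-1)^n * det(A)
# on an arbitrary 3 x 3 example.
n = 3
alpha = 2.5
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 5.0]])

print(abs(np.linalg.det(alpha * A) - alpha**n * np.linalg.det(A)) < 1e-8)  # True
print(abs(np.linalg.det(-A) - (-1)**n * np.linalg.det(A)) < 1e-8)          # True
```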
EE 205A
Matrix Analysis
Instructor: Lara Dolecek
Homework 6
Monday, Nov. 17th, 2014
Due: Monday, Nov. 24th, 2014
1. Provide an example of a 3 × 3 matrix with two identical eigenvalues (but not the identity
matrix).
2. What are the eigenvalues of an upper triangular matrix?
Chapter 13
Kronecker Products
1. For any two matrices A and B for which the indicated matrix product
is defined, show that vec(A)ᵀ vec(B) = Tr(AᵀB). In particular, if
B ∈ ℝⁿˣⁿ then Tr(B) = vec(Iₙ)ᵀ vec(B).

Answer 13.1 Suppose A is m × n. For the product AᵀB to be defined and have a trace, B must also be m × n.
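The identity is easy to check numerically; the shapes and random entries below are illustrative choices. NumPy's `ravel(order="F")` stacks columns, matching the usual definition of vec:

```python
import numpy as np

# Check vec(A)^T vec(B) = Tr(A^T B) for same-shaped A, B.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((4, 3))

lhs = A.ravel(order="F") @ B.ravel(order="F")   # vec(A)^T vec(B)
rhs = np.trace(A.T @ B)                          # Tr(A^T B)
print(abs(lhs - rhs) < 1e-12)                    # True

# Special case: Tr(B) = vec(I_n)^T vec(B) for square B.
Bsq = rng.standard_normal((3, 3))
print(abs(np.eye(3).ravel(order="F") @ Bsq.ravel(order="F") - np.trace(Bsq)) < 1e-12)  # True
```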
Chapter 2
An Introduction to Vector Spaces
1. Suppose {v₁, . . . , vₖ} is a linearly dependent set. Then show that one of the vectors must be a linear combination of the others.

Answer 2.1 Since {v₁, . . . , vₖ} is a linearly dependent set of vectors, there exist k scalars α₁, . . . , αₖ, not all zero, such that α₁v₁ + · · · + αₖvₖ = 0.
Chapter 8
Linear Least Squares Problems
1. For A ∈ ℝᵐˣⁿ, b ∈ ℝᵐ, and any y ∈ ℝⁿ, check directly that (I − A⁺A)y and A⁺b are
orthogonal vectors.

Answer 8.1
((I − A⁺A)y)ᵀ A⁺b = yᵀ(I − A⁺A)ᵀ A⁺b
                  = yᵀ(I − A⁺A)A⁺b
                  = yᵀ(A⁺ − A⁺AA⁺)b
                  = 0,
where the second step uses the symmetry of I − A⁺A and the last uses the Penrose condition A⁺AA⁺ = A⁺.
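A quick numerical check of the orthogonality; A is deliberately made rank-deficient here (sizes and entries are illustrative) so that I − A⁺A is nonzero:

```python
import numpy as np

# Check that (I - A^+ A) y is orthogonal to A^+ b.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))  # rank 2, so I - A^+ A != 0
b = rng.standard_normal(5)
y = rng.standard_normal(3)

Ap = np.linalg.pinv(A)               # Moore-Penrose pseudoinverse A^+
u = (np.eye(3) - Ap @ A) @ y         # lies in N(A) = R(A^T)^perp
v = Ap @ b                           # lies in R(A^+) = R(A^T)
print(abs(u @ v) < 1e-10)            # True: the vectors are orthogonal
```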
2. Consider the following se
Prof. Alan J. Laub
November 6, 2014
EE 205A MIDTERM EXAMINATION
Fall 2014
Instructions:
(a) The exam is closed-book (except for one sheet [8.5 x 11 or A4, both sides] of notes)
and will last 90 minutes. Use of a calculator is encouraged but no cell phones
Chapter 3
Linear Transformations
1. Let A = [2 3 4; 8 5 1] and consider A as a linear transformation mapping ℝ³ to ℝ². Find the matrix representation of A with respect to the bases {(1,1,0)ᵀ, (0,1,1)ᵀ, (1,0,1)ᵀ} of ℝ³ and {(3,1)ᵀ, (2,1)ᵀ} of ℝ². The matrix M = Mat A satisfies
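The representation M = Q⁻¹AP can be computed directly. The basis vectors in the extraction are partly garbled, so the P and Q below (columns = basis vectors) should be treated as an assumption; substitute the correct bases if they differ:

```python
import numpy as np

# M = Q^{-1} A P, where the columns of P are the assumed R^3 basis and the
# columns of Q are the assumed R^2 basis.
A = np.array([[2.0, 3.0, 4.0],
              [8.0, 5.0, 1.0]])
P = np.column_stack(([1, 1, 0], [0, 1, 1], [1, 0, 1])).astype(float)
Q = np.column_stack(([3, 1], [2, 1])).astype(float)

M = np.linalg.solve(Q, A @ P)     # solves Q M = A P, i.e. M = Q^{-1} A P
assert np.allclose(Q @ M, A @ P)  # A maps basis vector p_j to Q @ (column j of M)
print(M)
```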
1. Provide an example of a 3 × 3 matrix with two identical eigenvalues (but not the identity
matrix).
Sol: A = [3 0 0; 0 3 0; …]. The characteristic polynomial then has λ = 3 as a double root, so the eigenvalue 3 is repeated while A is not the identity.
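The truncated solution shows only the first two rows [3 0 0] and [0 3 0]; taking 0 as the unspecified third diagonal entry is an assumption for illustration:

```python
import numpy as np

# diag(3, 3, 0): the third diagonal entry (0) is an assumed completion of
# the truncated solution matrix.
A = np.diag([3.0, 3.0, 0.0])
w = np.sort(np.linalg.eigvals(A).real)
print(w)  # [0. 3. 3.]: eigenvalue 3 is repeated, and A is not the identity
```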
Chapter 11
Linear Differential and Difference
Equations
1. Let P ∈ ℝⁿˣⁿ be a projection. Show that eᴾ ≈ I + 1.718P.

Answer 11.1 Since P is a projection, Pᵏ = P for all k ≥ 1. Hence

eᴾ = Σ_{k=0}^{+∞} (1/k!)Pᵏ = I + Σ_{k=1}^{+∞} (1/k!)Pᵏ = I + (Σ_{k=1}^{+∞} 1/k!)P = I + (e − 1)P ≈ I + 1.718 P.
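Since the series only uses Pᵏ = P, the identity eᴾ = I + (e − 1)P is easy to confirm numerically. The projection below (onto span{v}) and the truncated-Taylor exponential are illustrative choices:

```python
import numpy as np

def expm_taylor(M, terms=30):
    # Matrix exponential via a truncated Taylor series; accurate here since
    # ||P|| = 1 (illustrative helper, not a library routine).
    out = np.zeros_like(M)
    term = np.eye(M.shape[0])
    for k in range(terms):
        out = out + term
        term = term @ M / (k + 1)
    return out

v = np.array([1.0, 2.0, 2.0])
P = np.outer(v, v) / (v @ v)        # orthogonal projection onto span{v}
assert np.allclose(P @ P, P)        # P is idempotent

E = expm_taylor(P)
print(np.allclose(E, np.eye(3) + (np.e - 1.0) * P))  # True: e^P = I + (e-1)P
```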
2. Suppose x, y ∈ ℝⁿ and let A = xyᵀ. Further, let α = xᵀy.
Chapter 5
An Introduction to Singular
Value Decomposition
1. Let X ∈ ℝᵐˣⁿ. If XᵀX = 0, show that X = 0.

Answer 5.1 This is easily seen directly. Let the n columns of X be
denoted by xᵢ. Then the (i, j)th element of the matrix XᵀX is xᵢᵀxⱼ, and
each of the diagonal elements xᵢᵀxᵢ = ‖xᵢ‖² must be 0. Hence every column of X is 0, i.e., X = 0.
EE 205A
Matrix Analysis
Instructor: Lara Dolecek
Homework 2
Monday, Oct. 13th, 2014
Due: Monday, Oct. 20th, 2014
1. Prove statements 5 and 6 of Theorem 3.11 in the book.
(a) Sol: Let v ∈ (R + S)⊥. Since vᵀ(au + bw) = 0 for any scalars a, b and any u ∈ R, w ∈ S,
EE 205A
Matrix Analysis
Instructor: Lara Dolecek
Homework 7
Monday, Nov. 24th, 2014
Due: Monday, Dec. 1st, 2014
1. Show that the minimum value in the least squares problem (see Section 8.4) does not
depend on the particular SVD of A, i.e., it holds true for
Chapter 10
Canonical Forms
1. Show that if a triangular matrix is normal, then it must be diagonal.

Answer 10.1 We prove this by induction on n. Let T ∈ ℂⁿˣⁿ be normal and, without loss of generality, assume it is upper triangular. For n = 1, the matrix T is simply a complex scalar and is trivially diagonal.
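The contrapositive is easy to see numerically: a triangular matrix with any nonzero off-diagonal entry fails TT* = T*T, while a diagonal matrix satisfies it. The 2 × 2 matrices below are illustrative choices:

```python
import numpy as np

def is_normal(M):
    # M is normal iff it commutes with its conjugate transpose.
    return np.allclose(M @ M.conj().T, M.conj().T @ M)

T = np.array([[1.0, 2.0],
              [0.0, 3.0]])   # upper triangular, off-diagonal entry nonzero
D = np.diag([1.0, 3.0])      # diagonal

print(is_normal(T))  # False: the (1,2) entry spoils T T* = T* T
print(is_normal(D))  # True: diagonal matrices are always normal
```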
Chapter 7
Projections, Inner Product
Spaces, and Norms
1. If P is an orthogonal projection, prove that P⁺ = P.

Answer 7.1 Straightforward verification of the four Penrose conditions.
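The verification can be mirrored numerically by checking the four Penrose conditions with X = P itself; the particular P below (an orthogonal projection onto a 2-D subspace) is an illustrative choice:

```python
import numpy as np

# Build an orthogonal projection P = Q Q^T onto a random 2-D subspace.
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))
P = Q @ Q.T

X = P  # candidate for P^+
checks = [
    np.allclose(P @ X @ P, P),       # (1) P X P = P
    np.allclose(X @ P @ X, X),       # (2) X P X = X
    np.allclose((P @ X).T, P @ X),   # (3) P X symmetric
    np.allclose((X @ P).T, X @ P),   # (4) X P symmetric
]
print(all(checks))  # True, so P^+ = P
```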
2. Suppose P and Q are orthogonal projections and P + Q = I. Prove
that
Prof. Alan J. Laub
December 11, 2014
EE 205A FINAL EXAMINATION
Fall 2014
Instructions:
(a) The exam is closed-book (except for one two-sided page of notes) and will last 2 hours.
Write your answers on your own paper; you may keep the exam itself.
(b) You
Chapter 9
Eigenvalues and Eigenvectors
1. Let A ∈ ℂⁿˣⁿ have distinct eigenvalues λ₁, . . . , λₙ with corresponding right eigenvectors
x₁, . . . , xₙ and left eigenvectors y₁, . . . , yₙ, respectively. Let v ∈ ℂⁿ be an arbitrary vector.
Show that v can be expressed as a linear combination of the right eigenvectors x₁, . . . , xₙ.
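With distinct eigenvalues A is diagonalizable, A = X diag(λᵢ) X⁻¹, and the rows of X⁻¹ are left eigenvectors yᵢᴴ normalized so that yᵢᴴxⱼ = δᵢⱼ; the expansion v = Σᵢ (yᵢᴴv) xᵢ then follows. A sketch with an illustrative matrix:

```python
import numpy as np

# A has distinct eigenvalues 2, 3, 5 (illustrative choice).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])
w, X = np.linalg.eig(A)      # columns of X: right eigenvectors x_i
Yh = np.linalg.inv(X)        # row i of Yh: normalized left eigenvector y_i^H

v = np.array([1.0, -2.0, 4.0])
coeffs = Yh @ v              # alpha_i = y_i^H v
v_rebuilt = X @ coeffs       # sum_i alpha_i x_i
print(np.allclose(v_rebuilt, v))  # True: v expands in the x_i
```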
Chapter 6
Solution of Linear Equations
1. As in Example 6.8, characterize all left inverses of a matrix A ∈ ℝᵐˣⁿ.

Answer 6.1 A has a left inverse ⟺ Aᵀ has a right inverse ⟺ R(Iₙ) ⊆ R(Aᵀ) ⟺ Aᵀ(Aᵀ)⁺Iₙ = Iₙ ⟺ rank(Aᵀ) = r = n (since r ≤ n) ⟺ Aᵀ is onto. (Aᵀ)⁺ is then
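In the full-column-rank case the pseudoinverse itself furnishes one such left inverse, since then A⁺A = Iₙ; the particular A below is an illustrative full-column-rank choice:

```python
import numpy as np

# A is 3 x 2 with rank 2, so it has left inverses; A^+ is one of them.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
Ap = np.linalg.pinv(A)
print(np.allclose(Ap @ A, np.eye(2)))  # True: A^+ A = I_n, a left inverse
```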
Preface

Mathematica or Mathcad is also excellent. Since this text is not intended for a course in
numerical linear algebra per se, the details of most of the numerical aspects of linear algebra
are deferred to such a cours
Preface

This book is intended to be used as a text for beginning graduate-level (or even senior-level)
students in engineering, the sciences, mathematics, computer science, or computational
science who wish to be familiar with en